WorldWideScience

Sample records for analyzing complex problems

  1. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  2. Qubit Complexity of Continuous Problems

    National Research Council Canada - National Science Library

    Papageorgiou, A; Traub, J. F

    2005-01-01

    .... The authors show how to obtain the classical query complexity for continuous problems. They then establish a simple formula for a lower bound on the qubit complexity in terms of the classical query complexity...

  3. Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.

    Science.gov (United States)

    Lepley, C J

    1998-12-01

    The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.

  4. Complex multiplication and lifting problems

    CERN Document Server

    Chai, Ching-Li; Oort, Frans

    2013-01-01

    Abelian varieties with complex multiplication lie at the origins of class field theory, and they play a central role in the contemporary theory of Shimura varieties. They are special in characteristic 0 and ubiquitous over finite fields. This book explores the relationship between such abelian varieties over finite fields and over arithmetically interesting fields of characteristic 0 via the study of several natural CM lifting problems which had previously been solved only in special cases. In addition to giving complete solutions to such questions, the authors provide numerous examples to illustrate the general theory and present a detailed treatment of many fundamental results and concepts in the arithmetic of abelian varieties, such as the Main Theorem of Complex Multiplication and its generalizations, the finer aspects of Tate's work on abelian varieties over finite fields, and deformation theory. This book provides an ideal illustration of how modern techniques in arithmetic geometry (such as descent the...

  5. SCHOOL VIOLENCE: A COMPLEX PROBLEM

    Directory of Open Access Journals (Sweden)

    María del Rosario Ayala-Carrillo

    2015-07-01

    Full Text Available School violence is one type of violence that reflects the breakdown of current society. It is impossible to speak of school violence as an isolated phenomenon without establishing nexuses between public and private life, between collective and individual behaviors, between family and community aspects, without making reference to differences in gender and the life stories of those who are the aggressors or the victims, and without considering the patriarchal culture and interpersonal relationships. When all these factors are interrelated, they make the problem of violence a very complex one that requires us to know the different factors in order to understand it and deal with it.

  6. Solving complex problems a handbook

    CERN Document Server

    Schönwandt, Walter; Grunau, Jens; Utz, Jürgen; Voermanek, Katrin

    2014-01-01

    When you're planning something big, problems appear rather quickly. We hear of them on a daily basis. The bigger or more complex a task, the more we have to deal with complicated, multidisciplinary task formulations. In many cases it is architecture, including urban and spatial planning, but also politics and all types of organizational forms, irrespective of whether they are public authorities or private enterprises, which are expected to deliver functional solutions for such challenges. This is precisely where this book is helpful. It introduces a methodology for developing target-specific,

  7. Complex analogues of real problems

    DEFF Research Database (Denmark)

    Esdahl-Schou, Rune

    This thesis is a mix of different problems in number theory. As such it is split into two natural parts. The first part focuses on normal numbers and the construction of numbers that are normal to a given complex base. It is written in the style of a thorough and introductory paper on that subject... Certain classical theorems are stated without proof but with a reference instead, though usually a proof is given. This part of the thesis represents the pinnacle of the author's work during the first two years of his PhD study. The work presented is greatly inspired by the work of Madritsch, Thuswaldner and Tichy in [Madritsch et al., 2008] and [Madritsch, 2008], and contains a generalisation of the main theorem in [Madritsch, 2008]. The second part of the thesis focuses on Diophantine approximation, mainly on a famous conjecture by Schmidt from the 1980s. This conjecture was solved by Badziahin, Pollington...

  8. Advice Complexity of the Online Search Problem

    DEFF Research Database (Denmark)

    Clemente, Jhoirene; Hromkovič, Juraj; Komm, Dennis

    2016-01-01

    The online search problem is a fundamental problem in finance. The numerous direct applications include searching for optimal prices for commodity trading and trading foreign currencies. In this paper, we analyze the advice complexity of this problem. In particular, we are interested in identifying the minimum amount of information needed in order to achieve a certain competitive ratio. We design an algorithm that reads $b$ bits of advice and achieves a competitive ratio of $(M/m)^{1/(2^b+1)}$, where $M$ and $m$ are the maximum and minimum price in the input. We also give a matching lower bound. Furthermore, we compare the power of advice and randomization for this problem.
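
    The competitive-ratio bound quoted in this record is concrete enough to evaluate directly. The short Python sketch below (not taken from the paper; the price range M, m and the range of advice sizes are hypothetical) tabulates (M/m)^{1/(2^b+1)} as the number of advice bits b grows, showing how quickly a few bits of advice shrink the achievable ratio.

```python
# Illustrative evaluation (not from the paper) of the advice-complexity bound
# (M/m)^(1/(2^b + 1)) for the online search problem. M, m and the range of b
# are hypothetical.

def competitive_ratio_bound(M: float, m: float, b: int) -> float:
    """Competitive ratio achievable with b bits of advice for prices in [m, M]."""
    assert M >= m > 0 and b >= 0
    return (M / m) ** (1.0 / (2 ** b + 1))

if __name__ == "__main__":
    M, m = 100.0, 1.0  # hypothetical maximum and minimum price
    for b in range(6):
        print(f"b = {b} advice bits -> competitive ratio <= "
              f"{competitive_ratio_bound(M, m, b):.3f}")
```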

  9. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  10. Solving complex fisheries management problems

    DEFF Research Database (Denmark)

    Petter Johnsen, Jahn; Eliasen, Søren Qvist

    2011-01-01

    A crucial issue for the new EU common fisheries policy is how to solve the discard problem. Through a study of the institutional set up and the arrangements for solving the discard problem in Denmark, the Faroe Islands, Iceland and Norway, the article identifies the discard problem as related...

  11. Complex Sequencing Problems and Local Search Heuristics

    NARCIS (Netherlands)

    Brucker, P.; Hurink, Johann L.; Osman, I.H.; Kelly, J.P.

    1996-01-01

    Many problems can be formulated as complex sequencing problems. We will present problems in flexible manufacturing that have such a formulation and apply local search methods like iterative improvement, simulated annealing and tabu search to solve these problems. Computational results are reported.

  12. Common ground, complex problems and decision making

    NARCIS (Netherlands)

    Beers, P.J.; Boshuizen, H.P.A.; Kirschner, P.A.; Gijselaers, W.H.

    2006-01-01

    Organisations increasingly have to deal with complex problems. They often use multidisciplinary teams to cope with such problems where different team members have different perspectives on the problem, different individual knowledge and skills, and different approaches on how to solve the problem.

  13. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  14. Education for complex problem solving

    DEFF Research Database (Denmark)

    Kjær-Rasmussen, Lone Krogh

    The Problem-Based Learning model as it is practiced at Aalborg University grew out of expectations for future graduates in the 1970s. Many changes and developments have taken place since then in the ways the principles and methodologies are practiced, due to changes in society and governmental regulations. However, the basic educational principles and methodologies are still the same and seem to meet expectations from society and academic workplaces today. This is what surveys and research, done regularly, document (see for instance Krogh, 2013).

  15. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  16. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.au [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.br [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.au [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.br [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
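
    As a rough illustration of the dissimilarity quantifier named in these two records, the sketch below computes the square root of the Jensen-Shannon divergence between two discrete probability distributions. Treating the distributions as degree distributions of a network at two stages of its evolution is an assumption made here for illustration; the authors' MPR Statistical Complexity is not reproduced.

```python
# A minimal sketch of the square root of the Jensen-Shannon divergence as a
# dissimilarity quantifier between two probability distributions. The example
# distributions are hypothetical.
import numpy as np

def jensen_shannon_distance(p, q) -> float:
    """Square root of the Jensen-Shannon divergence (base-2 logarithms)."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0  # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

# Hypothetical degree distributions of a network at two stages of its evolution.
p = [0.50, 0.30, 0.15, 0.05]
q = [0.40, 0.30, 0.20, 0.10]
print(f"JS distance between the two states: {jensen_shannon_distance(p, q):.4f}")
```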

  17. Using Model Checking for Analyzing Distributed Power Control Problems

    Directory of Open Access Journals (Sweden)

    Thomas Brihaye

    2010-01-01

    Full Text Available Model checking (MC) is a formal verification technique that has long enjoyed, and continues to enjoy, resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors wanted to know whether this approach can be applied to distributed PC. It turns out that it can be applied successfully and allows one to analyze realistic scenarios, including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state some objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae and MC is exploited to know whether the desired properties are verified and to determine a winning strategy.

  18. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation and understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug...

  19. Environmental problems in the nuclear weapons complex

    International Nuclear Information System (INIS)

    Fultz, K.O.

    1989-04-01

    This paper provides the authors' views on the environmental problems facing the Department of Energy. Testimony is based on a large body of work, over 50 reports and testimonies since 1981, on environmental, safety, and health aspects of DOE's nuclear weapons complex. This work has shown that the complex faces a wide variety of serious problem areas, including aging facilities, safety concerns which have shut down DOE's production reactors, and environmental cleanup.

  20. The Process of Solving Complex Problems

    Science.gov (United States)

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  1. Program for Analyzing Flows in a Complex Network

    Science.gov (United States)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
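
    As a minimal illustration of the Newton-Raphson iteration mentioned in this record (this is not GFSSP code; the branch flow law, coefficients, and boundary pressures are invented for the example), the sketch below balances mass flow at a single internal node of a two-branch network.

```python
# Minimal sketch (not GFSSP itself) of a Newton-Raphson iteration applied to a
# discretized conservation equation: find the internal node pressure P at which
# mass flow into the node balances mass flow out, assuming a square-root branch
# law Q = C * sqrt(dP). All numbers are hypothetical.
import math

P_SUPPLY, P_EXIT = 500.0e3, 100.0e3   # boundary pressures, Pa
C_IN, C_OUT = 2.0e-4, 3.0e-4          # branch flow coefficients, kg/s per sqrt(Pa)

def branch_flow(c, p_up, p_down):
    dp = p_up - p_down
    return c * math.copysign(math.sqrt(abs(dp)), dp)

def residual(p):
    # Net mass flow into the internal node; zero at the converged solution.
    return branch_flow(C_IN, P_SUPPLY, p) - branch_flow(C_OUT, p, P_EXIT)

p = 0.5 * (P_SUPPLY + P_EXIT)              # initial guess
for _ in range(50):
    r = residual(p)
    drdp = (residual(p + 1.0) - r) / 1.0   # finite-difference Jacobian
    step = -r / drdp
    p += step
    if abs(step) < 1e-6:
        break

print(f"node pressure = {p / 1e3:.2f} kPa, imbalance = {residual(p):.3e} kg/s")
```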

  2. Quantum complexity of graph and algebraic problems

    International Nuclear Information System (INIS)

    Doern, Sebastian

    2008-01-01

    This thesis is organized as follows: In Chapter 2 we give some basic notations, definitions and facts from linear algebra, graph theory, group theory and quantum computation. In Chapter 3 we describe three important methods for the construction of quantum algorithms. We present the quantum search algorithm by Grover, quantum amplitude amplification and the quantum walk search technique by Magniez et al. These three tools are the basis for the development of our new quantum algorithms for graph and algebra problems. In Chapter 4 we present two tools for proving quantum query lower bounds. We present the quantum adversary method by Ambainis and the polynomial method introduced by Beals et al. The quantum adversary tool is very useful for proving good lower bounds for many graph and algebra problems. The part of the thesis containing the original results is organized in two parts. In the first part we consider graph problems. In Chapter 5 we give a short summary of known quantum graph algorithms. In Chapters 6 to 8 we study the complexity of our new algorithms for matching problems, graph traversal and independent set problems on quantum computers. In the second part of the thesis we present new quantum algorithms for algebraic problems. In Chapters 9 and 10 we consider group testing problems and prove quantum complexity bounds for important problems from linear algebra. (orig.)

  3. Quantum complexity of graph and algebraic problems

    Energy Technology Data Exchange (ETDEWEB)

    Doern, Sebastian

    2008-02-04

    This thesis is organized as follows: In Chapter 2 we give some basic notations, definitions and facts from linear algebra, graph theory, group theory and quantum computation. In Chapter 3 we describe three important methods for the construction of quantum algorithms. We present the quantum search algorithm by Grover, quantum amplitude amplification and the quantum walk search technique by Magniez et al. These three tools are the basis for the development of our new quantum algorithms for graph and algebra problems. In Chapter 4 we present two tools for proving quantum query lower bounds. We present the quantum adversary method by Ambainis and the polynomial method introduced by Beals et al. The quantum adversary tool is very useful for proving good lower bounds for many graph and algebra problems. The part of the thesis containing the original results is organized in two parts. In the first part we consider graph problems. In Chapter 5 we give a short summary of known quantum graph algorithms. In Chapters 6 to 8 we study the complexity of our new algorithms for matching problems, graph traversal and independent set problems on quantum computers. In the second part of the thesis we present new quantum algorithms for algebraic problems. In Chapters 9 and 10 we consider group testing problems and prove quantum complexity bounds for important problems from linear algebra. (orig.)

  4. Addressing complex design problems through inductive learning

    OpenAIRE

    Hanna, S.

    2012-01-01

    Optimisation and related techniques are well suited to clearly defined problems involving systems that can be accurately simulated, but not to tasks in which the phenomena in question are highly complex or the problem ill-defined. These latter are typical of architecture and particularly creative design tasks, which therefore currently lack viable computational tools. It is argued that as design teams and construction projects of unprecedented scale are increasingly frequent, this is just whe...

  5. Complex Problems in Entrepreneurship Education: Examining Complex Problem-Solving in the Application of Opportunity Identification

    Directory of Open Access Journals (Sweden)

    Yvette Baggen

    2017-01-01

    Full Text Available In opening up the black box of what entrepreneurship education (EE should be about, this study focuses on the exploration of relationships between two constructs: opportunity identification (OI and complex problem-solving (CPS. OI, as a domain-specific capability, is at the core of entrepreneurship research, whereas CPS is a more domain-general skill. On a conceptual level, there are reasons to believe that CPS skills can help individuals to identify potential opportunities in dynamic and nontransparent environments. Therefore, we empirically investigated whether CPS relates to OI among 113 masters students. Data is analyzed using multiple regressions. The results show that CPS predicts the number of concrete ideas that students generate, suggesting that having CPS skills supports the generation of detailed, potential business ideas of good quality. The results of the current study suggest that training CPS, as a more domain-general skill, could be a valuable part of what should be taught in EE.

  6. Using Model Checking for Analyzing Distributed Power Control Problems

    DEFF Research Database (Denmark)

    Brihaye, Thomas; Jungers, Marc; Lasaulce, Samson

    2010-01-01

    Model checking (MC) is a formal verification technique that has long enjoyed, and continues to enjoy, resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors... objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae and MC is exploited to know whether the desired...

  7. ANALYZING ALGEBRAIC THINKING USING “GUESS MY NUMBER” PROBLEMS

    Directory of Open Access Journals (Sweden)

    Estella De Los Santos

    2012-01-01

    Full Text Available The purpose of this study was to assess student knowledge of numeric, visual and algebraic representations. A definite gap between arithmetic and algebra has been documented in the research. The researchers’ goal was to identify a link between the two. Using four “Guess My Number” problems, seventh and tenth grade students were asked to write numeric, visual, and algebraic representations. Seventh-grade students had significantly higher scores than tenth-grade students on visual representation responses. There were no significant differences between the seventh and tenth grade students’ responses on the numeric and algebraic representations. The researchers believed that the semi-concrete and visual models, such as used in this study, may provide the link between numeric and algebraic concepts for many students.

  8. Collecting and Analyzing Stakeholder Feedback for Signing at Complex Interchanges

    Science.gov (United States)

    2014-10-01

    The purpose of this project was to identify design constraints related to signing, markings, and geometry for complex interchanges, and then to identify useful topics for future research that will yield findings that can address those design issues. ...

  9. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and

  10. Analyzing the Implicit Computational Complexity of object-oriented programs

    OpenAIRE

    Marion , Jean-Yves; Péchoux , Romain

    2008-01-01

    International audience; A sup-interpretation is a tool which provides upper bounds on the size of the values computed by the function symbols of a program. Sup-interpretations have shown their interest to deal with the complexity of first order functional programs. This paper is an attempt to adapt the framework of sup-interpretations to a fragment of object-oriented programs, including loop and while constructs and methods with side effects. We give a criterion, called brotherly criterion, w...

  11. Dependability problems of complex information systems

    CERN Document Server

    Zamojski, Wojciech

    2014-01-01

    This monograph presents original research results on selected problems of dependability in contemporary Complex Information Systems (CIS). The ten chapters are concentrated around the following three aspects: methods for modelling of the system and its components; tasks (or, in a more generic and more adequate interpretation, functionalities) accomplished by the system; and conditions for their correct realization in the dynamic operational environment. While the main focus is on theoretical advances and roadmaps for implementations of new technologies, a much needed forum for sharing of the bes

  12. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  13. Towards a theoretical framework for analyzing complex linguistic networks

    CERN Document Server

    Lücking, Andy; Banisch, Sven; Blanchard, Philippe; Job, Barbara

    2016-01-01

    The aim of this book is to advocate and promote network models of linguistic systems that are both based on thorough mathematical models and substantiated in terms of linguistics. In this way, the book contributes first steps towards establishing a statistical network theory as a theoretical basis of linguistic network analysis at the border of the natural sciences and the humanities. This book addresses researchers who want to get familiar with theoretical developments, computational models and their empirical evaluation in the field of complex linguistic networks. It is intended for all those who are interested in statistical models of linguistic systems from the point of view of network research. This includes all relevant areas of linguistics, ranging from phonological, morphological and lexical networks on the one hand to syntactic, semantic and pragmatic networks on the other. In this sense, the volume concerns readers from many disciplines such as physics, linguistics, computer science and information scien...

  14. Analyzing complex networks through correlations in centrality measurements

    International Nuclear Information System (INIS)

    Ricardo Furlan Ronqui, José; Travieso, Gonzalo

    2015-01-01

    Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example we show also that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network. (paper)
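
    A minimal sketch of the idea of a centrality correlation profile follows. The choice of network (a Watts-Strogatz graph) and of the four centrality measures is an assumption for illustration, not the authors' exact setup; the profile is simply the set of Pearson correlation coefficients over all pairs of centralities.

```python
# Sketch of a "centrality correlation profile": Pearson correlations between all
# pairs of a chosen set of centrality measures on one network. Network model and
# centrality choices are illustrative assumptions.
from itertools import combinations
import networkx as nx
import numpy as np

G = nx.watts_strogatz_graph(n=500, k=6, p=0.1, seed=42)

centralities = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
}

nodes = list(G.nodes())
profile = {}
for (name_a, ca), (name_b, cb) in combinations(centralities.items(), 2):
    a = np.array([ca[v] for v in nodes])
    b = np.array([cb[v] for v in nodes])
    profile[(name_a, name_b)] = np.corrcoef(a, b)[0, 1]

for (name_a, name_b), r in profile.items():
    print(f"{name_a:>11s} vs {name_b:<11s}: r = {r:+.2f}")
```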

  15. New Approach to Analyzing Physics Problems: A Taxonomy of Introductory Physics Problems

    Science.gov (United States)

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-01-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop…

  16. Ordinal optimization and its application to complex deterministic problems

    Science.gov (United States)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective for approaching a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
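
    A toy sketch of the Ordinal Comparison and Goal Softening ideas described in this record is given below. The objective values and noise level are invented; the point is only that ranking designs with a cheap, noisy model and keeping a softened "selected set" reliably retains some of the truly good designs.

```python
# Toy sketch of Ordinal Optimization: rank many designs with a cheap,
# noise-corrupted evaluation and keep a selected set of the apparently best
# ones (goal softening), rather than pinpointing the single true optimum.
# Objective values and noise level are hypothetical.
import random

random.seed(0)
N, TOP_G, SELECT_S = 1000, 20, 50   # designs, true "good enough" set, selected set
true_cost = {i: random.uniform(0.0, 100.0) for i in range(N)}

def noisy_estimate(design):
    # Crude model = truth + large, white-noise-like error.
    return true_cost[design] + random.gauss(0.0, 15.0)

# Ordinal comparison: order designs by the crude estimates only.
selected = set(sorted(range(N), key=noisy_estimate)[:SELECT_S])
good_enough = set(sorted(range(N), key=true_cost.get)[:TOP_G])

overlap = len(good_enough & selected)
print(f"{overlap} of the true top-{TOP_G} designs survive in the selected set of {SELECT_S}")
```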

  17. ANALYZING THE RELATIONSHIP BETWEEN PROBLEM SOLVING SKILLS AND PERSONALITY CHARACTERISTICS OF UNIVERSITY STUDENTS

    OpenAIRE

    SÜLEYMAN DÜNDAR

    2013-01-01

    The aim of this study is to analyze problem solving skills of university students according to their personal characteristics. We try to find out if there is a difference in problem solving skills considering sex, class and personality harmony characteristics. Personal data form, Problem Solving Scale and Hacettepe Personality Scale are used as measurement tools. The results of the study indicate that there is no difference between male and female students in problem solving skills. Problem s...

  18. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    Science.gov (United States)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of Nastran Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  19. Complex network problems in physics, computer science and biology

    Science.gov (United States)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics in order to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science to biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases or the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that there are some problems which are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  20. Modeling the Structure and Complexity of Engineering Routine Design Problems

    NARCIS (Netherlands)

    Jauregui Becker, Juan Manuel; Wits, Wessel Willems; van Houten, Frederikus J.A.M.

    2011-01-01

    This paper proposes a model to structure routine design problems as well as a model of its design complexity. The idea is that having a proper model of the structure of such problems enables understanding its complexity, and likewise, a proper understanding of its complexity enables the development

  1. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    Science.gov (United States)

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  2. Problems of development of Kuzbass fuel power-engineering complex

    International Nuclear Information System (INIS)

    Mazikin, V.P.; Razumnyak, N.L.; Shatirov, S.V.; Gladyshev, G.P.

    2000-01-01

    Problems of Kuzbass fuel and energy complex development, bituminous and brown coal being its main resource, are discussed. Balance reserves of bituminous coal in Kuzbass are estimated at 59 bln. tons, which makes up 29% of the world's and nearly 60% of Russia's bituminous coal reserves. The dynamics of price rises for energy-grade Kuzbass coal is analyzed. The structure of the Kuzbass energy system is considered and characteristics of its major state district electric power plants and heat and power generating plants are provided. Water-coal and water-black oil fuels are of interest for Kuzbass energy production as alternative sources of energy. Special attention is paid to environmental problems of coal concentration [ru

  3. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    Science.gov (United States)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

    The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influence the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposure) in the derivatives market between 202 financial intermediaries based in USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital) but is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.

  4. Explicitly solvable complex Chebyshev approximation problems related to sine polynomials

    Science.gov (United States)

    Freund, Roland

    1989-01-01

    Explicitly solvable real Chebyshev approximation problems on the unit interval are typically characterized by simple error curves. A similar principle is presented for complex approximation problems with error curves induced by sine polynomials. As an application, some new explicit formulae for complex best approximations are derived.

  5. Finding practical solutions to complex problems: IDRC's fifth annual ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-15

    Apr 15, 2016 ... “IDRC staff share a common goal with the researchers they work with – to find low-cost, down-to-earth solutions to complex problems ...

  6. The ESTER particle and plasma analyzer complex for the Phobos mission

    Energy Technology Data Exchange (ETDEWEB)

    Afonin, V.V.; Shutte, N.M. (AN SSSR, Moscow (USSR). Inst. Kosmicheskikh Issledovanij); McKenna-Lawlor, S.; Rusznyak, P. (Space Technology Ireland Ltd., Maynooth (Ireland)); Kiraly, P.; Szabo, L.; Szalai, S.; Szucs, I.T.; Varhalmi, L. (Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics); Marsden, R. (European Space Agency, Noordwijk (Netherlands). Space Science Dept.); Richter, A.; Witte, M. (Max-Planck-Institut fuer Aeronomie, Katlenburg-Lindau (Germany, F.R.))

    1990-05-01

    The ESTER particle and plasma analyzer system for the Phobos Mission comprised a complex of three instruments (LET, SLED and HARP) serviced by a common Data Processing Unit. An account is provided of this complex, its objectives and excellent performance in space. (orig.).

  7. Modeling Complex Chemical Systems: Problems and Solutions

    Science.gov (United States)

    van Dijk, Jan

    2016-09-01

    Non-equilibrium plasmas in complex gas mixtures are at the heart of numerous contemporary technologies. They typically contain dozens to hundreds of species, involved in hundreds to thousands of reactions. Chemists and physicists have always been interested in what are now called chemical reduction techniques (CRT's). The idea of such CRT's is that they reduce the number of species that need to be considered explicitly without compromising the validity of the model. This is usually achieved on the basis of an analysis of the reaction time scales of the system under study, which identifies species that are in partial equilibrium after a given time span. The first such CRT that has been widely used in plasma physics was developed in the 1960's and resulted in the concept of effective ionization and recombination rates. It was later generalized to systems in which multiple levels are effected by transport. In recent years there has been a renewed interest in tools for chemical reduction and reaction pathway analysis. An example of the latter is the PumpKin tool. Another trend is that techniques that have previously been developed in other fields of science are adapted as to be able to handle the plasma state of matter. Examples are the Intrinsic Low Dimension Manifold (ILDM) method and its derivatives, which originate from combustion engineering, and the general-purpose Principle Component Analysis (PCA) technique. In this contribution we will provide an overview of the most common reduction techniques, then critically assess the pros and cons of the methods that have gained most popularity in recent years. Examples will be provided for plasmas in argon and carbon dioxide.

  8. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    Science.gov (United States)

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  9. Solving Complex Problems to Create Charter Extension Options

    DEFF Research Database (Denmark)

    Tippmann, Esther; Nell, Phillip Christopher

    undertaken by 29 subsidiary units supports our hypotheses, demonstrating that these activities are a means to systematically reduce inherent problem solving biases. This study contributes to problem solving theory, the literature on headquarters’ roles in complex organizations, as well as the literature......This study examines subsidiary-driven problem solving processes and their potential to create advanced solutions for charter extension options. Problem solving theory suggests that biases in problem formulation and solution search can confine problem solving potential. We thus argue that balanced...... solution search, or activities to reconcile the need for some solution features to be locally-tailored while others can be internationally standardized, mediates the relationships between problem complexity/headquarters involvement and the capacity to create advanced solutions. An analysis of 67 projects...

  10. Analyzing the Problems of Ayandeh Bank Branches across the Country Using Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Shabnam Mohammadi

    2014-06-01

    Full Text Available In order to manage problems and complaints of customers and branches, many banks in the country outsource parts of their customer relationship management to companies such as call centers. Since this important unit is managed outside the banks, analyzing the data and evaluating the performance of call centers are very important. On the other hand, many banks are not able to analyze, and do not know how to use, the hidden patterns in the data. Hence, by presenting the RFS model in this paper, we have tried to cluster bank branches based on the factors R (recency of the announced problem), F (frequency or number of difficulties) and S (branch satisfaction with the call center), and to find the relationship between these factors and the mentioned problems. Moreover, the call center's ability to resolve the problems of the branches in each cluster can be assessed using the S factor. Branches were distributed into four optimized clusters based on their behavior pattern. Finally, the results were analyzed and recommendations were presented to improve the performance of call centers.
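
    For illustration only, the sketch below clusters synthetic branch data on the three RFS features into four clusters with k-means; the data, feature scales, and the use of scikit-learn are assumptions, not the paper's dataset or tooling.

```python
# Small sketch (synthetic data, not the paper's dataset) of clustering branches
# on R (recency of last reported problem), F (frequency of problems) and
# S (satisfaction with the call center) into four clusters with k-means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_branches = 300
rfs = np.column_stack([
    rng.integers(1, 90, n_branches),   # R: days since last reported problem
    rng.poisson(5, n_branches),        # F: number of problems this quarter
    rng.uniform(1, 5, n_branches),     # S: satisfaction score (1-5)
])

X = StandardScaler().fit_transform(rfs)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

for c in range(4):
    members = rfs[labels == c]
    print(f"cluster {c}: {len(members):3d} branches, "
          f"mean R={members[:, 0].mean():5.1f}, "
          f"mean F={members[:, 1].mean():4.1f}, "
          f"mean S={members[:, 2].mean():3.1f}")
```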

  11. Knowledge based method for solving complexity in design problems

    NARCIS (Netherlands)

    Vermeulen, B.

    2007-01-01

    The process of designing aircraft systems is becoming more and more complex, due to an increasing number of requirements. Moreover, the knowledge on how to solve these complex design problems becomes less readily available, because of a decrease in availability of intellectual resources and reduced

  12. Solution of a Complex Least Squares Problem with Constrained Phase.

    Science.gov (United States)

    Bydder, Mark

    2010-12-30

    The least squares solution of a complex linear equation is in general a complex vector with independent real and imaginary parts. In certain applications in magnetic resonance imaging, a solution is desired such that each element has the same phase. A direct method for obtaining the least squares solution to the phase constrained problem is described.
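
    A naive sketch of the phase-constrained problem described in this record follows: find a real vector r and a single common phase phi such that x = r * exp(i*phi) minimizes ||Ax - b||. For a fixed phi this reduces to an ordinary real least-squares problem, so a grid search over phi suffices for illustration; this is not the direct method of the paper, and the matrix and data are random placeholders.

```python
# Naive sketch of phase-constrained complex least squares: x = r * exp(i*phi)
# with r real and phi a single common phase. Grid search over phi plus a real
# least-squares solve at each phi; illustrative only, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(8, 3)) + 1j * rng.normal(size=(8, 3))
b = rng.normal(size=8) + 1j * rng.normal(size=8)

def solve_fixed_phase(phi):
    # With phi fixed, stacking real and imaginary parts gives a real LS problem.
    Aphi = A * np.exp(1j * phi)
    Ar = np.vstack([np.real(Aphi), np.imag(Aphi)])
    br = np.concatenate([np.real(b), np.imag(b)])
    r, *_ = np.linalg.lstsq(Ar, br, rcond=None)
    resid = np.linalg.norm(A @ (r * np.exp(1j * phi)) - b)
    return resid, r

# Phi in [0, pi) suffices: a sign flip of r absorbs a shift of pi.
best_resid, best_r = min(
    (solve_fixed_phase(phi) for phi in np.linspace(0.0, np.pi, 3601)),
    key=lambda t: t[0],
)
unconstrained = np.linalg.lstsq(A, b, rcond=None)[0]
print("common-phase residual :", best_resid)
print("unconstrained residual:", np.linalg.norm(A @ unconstrained - b))
```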

  13. Structuring and assessing large and complex decision problems using MCDA

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    This paper presents an approach for the structuring and assessing of large and complex decision problems using multi-criteria decision analysis (MCDA). The MCDA problem is structured in a decision tree and assessed using the REMBRANDT technique featuring a procedure for limiting the number of pair...

  14. On the complexity of container stowage planning problems

    DEFF Research Database (Denmark)

    Tierney, Kevin; Pacino, Dario; Jensen, Rune Møller

    2014-01-01

    The optimization of container ship and depot operations embeds the k-shift problem, in which containers must be stowed in stacks such that at most k containers must be removed in order to reach containers below them. We first solve an open problem introduced by Avriel et al. (2000) by showing that changing from uncapacitated to capacitated stacks reduces the complexity of this problem from NP-complete to polynomial. We then examine the complexity of the current state-of-the-art abstraction of container ship stowage planning, wherein containers and slots are grouped together. To do this, we define the hatch overstow problem, in which a set of containers are placed on top of the hatches of a container ship such that the number of containers that are stowed on hatches that must be accessed is minimized. We show that this problem is NP-complete by a reduction from the set-covering problem, which means...

  15. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    Science.gov (United States)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  16. Fluid leadership: inviting diverse inputs to address complex problems

    OpenAIRE

    Moir, Sylvia

    2016-01-01

    Approved for public release; distribution is unlimited. History is replete with examples of misapplied leadership strategies. When singular methods are used to solve multifaceted problems, negative results are often the consequence. Complex issues in a complex environment require complex perspectives; the homeland security enterprise (HSE) needs leaders who can adapt their leadership styles according to emerging environments. Furthermore, the diverse agencies within the HSE must work togeth...

  17. Solving complex band structure problems with the FEAST eigenvalue algorithm

    Science.gov (United States)

    Laux, S. E.

    2012-08-01

    With straightforward extension, the FEAST eigenvalue algorithm [Polizzi, Phys. Rev. B 79, 115112 (2009)] is capable of solving the generalized eigenvalue problems representing traveling-wave problems—as exemplified by the complex band-structure problem—even though the matrices involved are complex, non-Hermitian, and singular, and hence outside the originally stated range of applicability of the algorithm. The obtained eigenvalues/eigenvectors, however, contain spurious solutions which must be detected and removed. The efficiency and parallel structure of the original algorithm are unaltered. The complex band structures of Si layers of varying thicknesses and InAs nanowires of varying radii are computed as test problems.

  18. Correcting environmental problems facing the nuclear weapons complex

    International Nuclear Information System (INIS)

    Rezendes, V.S.

    1990-06-01

    This report discusses DOE's efforts to correct the environmental problems facing the nuclear weapons complex. It focuses on three main points. First, the weapons complex faces a variety of serious and costly environmental problems. Second, during the past year, DOE has made some important changes to its organization that should help change its management focus from one that emphasizes materials production to one that more clearly focuses on environmental concerns. Third, because resolution of DOE's environmental problems will require considerable resources during a period of budgetary constraints, it is imperative that DOE have internal controls in place to ensure that resources are spent efficiently

  19. New complex variable meshless method for advection—diffusion problems

    International Nuclear Information System (INIS)

    Wang Jian-Fei; Cheng Yu-Min

    2013-01-01

    In this paper, an improved complex variable meshless method (ICVMM) for two-dimensional advection-diffusion problems is developed based on the improved complex variable moving least-square (ICVMLS) approximation. The equivalent functional of two-dimensional advection-diffusion problems is formed, the variation method is used to obtain the equation system, and the penalty method is employed to impose the essential boundary conditions. The difference method for two-point boundary value problems is used to obtain the discrete equations. Then the corresponding formulas of the ICVMM for advection-diffusion problems are presented. Two numerical examples with different node distributions are used to validate and investigate the accuracy and efficiency of the new method in this paper. It is shown that the ICVMM is very effective for advection-diffusion problems, and has good convergence, accuracy, and computational efficiency

  20. Analyzing Integrated Cost-Schedule Risk for Complex Product Systems R&D Projects

    Directory of Open Access Journals (Sweden)

    Zhe Xu

    2014-01-01

    Full Text Available The vast majority of the research efforts in project risk management tend to assess cost risk and schedule risk independently. However, project cost and time are related in reality and the relationship between them should be analyzed directly. We propose an integrated cost and schedule risk assessment model for complex product systems R&D projects. Graphical evaluation review technique (GERT), Monte Carlo simulation, and probability distribution theory are utilized to establish the model. In addition, statistical analysis and regression analysis techniques are employed to analyze simulation outputs. Finally, a complex product systems R&D project is modeled as an example by the proposed approach and the simulation outputs are analyzed to illustrate the effectiveness of the risk assessment model. It seems that integrating cost and schedule risk assessment can provide more reliable risk estimation results.
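
    The sketch below is a deliberately small Monte Carlo illustration of why cost and schedule risk should be assessed jointly, in the spirit of this record: a hypothetical three-activity project whose cost and duration are driven by the same uncertain activity durations. The network, distributions, cost rates, and thresholds are assumptions, and no GERT logic is modeled.

```python
# Illustrative Monte Carlo sketch of integrated cost-schedule risk for a tiny,
# hypothetical project: activity A, then B and C in parallel. Both cost and
# duration depend on the same sampled activity durations, so their risks are
# correlated and should be analyzed jointly.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Triangular duration distributions (low, mode, high) in weeks -- assumptions.
dur_A = rng.triangular(4, 6, 12, N)
dur_B = rng.triangular(8, 10, 20, N)
dur_C = rng.triangular(6, 9, 15, N)

duration = dur_A + np.maximum(dur_B, dur_C)                          # weeks
cost = 50 * dur_A + 80 * dur_B + 60 * dur_C + rng.normal(0, 50, N)   # k$, plus noise

print(f"P(duration > 25 weeks)     = {np.mean(duration > 25):.2%}")
print(f"P(cost > 2200 k$)          = {np.mean(cost > 2200):.2%}")
print(f"P(both overruns together)  = {np.mean((duration > 25) & (cost > 2200)):.2%}")
print(f"cost-schedule correlation  = {np.corrcoef(duration, cost)[0, 1]:+.2f}")
```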

  1. Particle swarm as optimization tool in complex nuclear engineering problems

    International Nuclear Information System (INIS)

    Medeiros, Jose Antonio Carlos Canedo

    2005-06-01

    Due to their low computational cost, gradient-based search techniques associated with linear programming techniques are being used as optimization tools. These techniques, however, when applied to multimodal search spaces, can lead to local optima. When finding solutions for complex multimodal domains, random search techniques are being used with great efficacy. In this work we exploit the search power of the particle swarm optimization algorithm as a tool for the solution of complex, high-dimension and multimodal search spaces of nuclear problems. Due to its easy and natural representation of high-dimension domains, particle swarm optimization was applied with success to the solution of complex nuclear problems, showing its efficacy in the search for solutions in high-dimension and complex multimodal spaces. In one of these applications it enabled a natural and trivial solution in a way not obtained with other methods, confirming the validity of its application. (author)

  2. Ability to analyze the statement of a problem as a metasubject result of learning

    Directory of Open Access Journals (Sweden)

    V.A. Guruzhapov

    2014-08-01

    Full Text Available We present the results of experimental research on younger school students' ability to analyze and understand the missing terms of a mathematical problem as one of the components of metasubject educational outcomes. The pupils were offered tasks from the diagnostic technique developed by V.A. Guruzhapov, aimed at assessing the relationships between varying quantities of items. The sample of subjects was 168 students of forms I-III of two Moscow schools. It was found that this technique can assess the metasubject component of the educational process in the traditional system of education in terms of analyzing how adequately the properties of the object are displayed in its model. The validity of the methodology was tested in a training experiment conducted by L.N. Shilenkova. An analysis of tasks with subject content different from that of the diagnostic tasks was also performed with younger students. After training, the results of the experimental group students significantly improved. On this basis it is concluded that the proposed diagnostic tasks can be used to assess the ability of younger school students to analyze and understand the missing statements of a problem as one of the components of metasubject educational outcomes. The developmental educational situation designed here can be used in the practice of the modern elementary school to enhance learning.

  3. Modal and Mixed Specifications: Key Decision Problems and their Complexities

    DEFF Research Database (Denmark)

    Antonik, Adam; Huth, Michael; Larsen, Kim Guldstrand

    2010-01-01

    Modal and mixed transition systems are specification formalisms that allow mixing of over- and under-approximation. We discuss three fundamental decision problems for such specifications: whether a set of specifications has a common implementation, whether a sole specification has an implementation, and whether all implementations of one specification are implementations of another one. For each of these decision problems we investigate the worst-case computational complexity for the modal and mixed case. We show that the first decision problem is EXPTIME-complete for modal as well as for mixed specifications. We prove that the second decision problem is EXPTIME-complete for mixed specifications (while it is known to be trivial for modal ones). The third decision problem is furthermore demonstrated to be EXPTIME-complete for mixed specifications.

  4. What Do Employers Pay for Employees' Complex Problem Solving Skills?

    Science.gov (United States)

    Ederer, Peer; Nedelkoska, Ljubica; Patt, Alexander; Castellazzi, Silvia

    2015-01-01

    We estimate the market value that employers assign to the complex problem solving (CPS) skills of their employees, using individual-level Mincer-style wage regressions. For the purpose of the study, we collected new and unique data using psychometric measures of CPS and an extensive background questionnaire on employees' personal and work history.…

  5. Conceptual and Developmental Analysis of Mental Models: An Example with Complex Change Problems.

    Science.gov (United States)

    Poirier, Louise

    Defining better implicit models of children's actions in a series of situations is of paramount importance to understanding how knowledge is constructed. The objective of this study was to analyze the implicit mental models used by children in complex change problems to understand the stability of the models and their evolution with the child's…

  6. Application of NASA management approach to solve complex problems on earth

    Science.gov (United States)

    Potate, J. S.

    1972-01-01

    The application of the NASA management approach to solving complex problems on earth is discussed. The management of the Apollo program is presented as an example of effective management techniques. Four key elements of effective management are analyzed. Photographs of the Cape Kennedy launch sites and supporting equipment are included to support the discussions.

  7. Tourists' mental representations of complex travel decision problems

    NARCIS (Netherlands)

    Dellaert, B.G.C.; Arentze, T.A.; Horeni, O.

    2014-01-01

    Tourism research has long recognized the complexity of many decisions that tourists make and proposed models to describe and analyze tourist decision processes. This article complements this previous research by proposing a view that moves away from the process of making a decision and instead

  8. ATHENA [Advanced Thermal Hydraulic Energy Network Analyzer] solutions to developmental assessment problems

    International Nuclear Information System (INIS)

    Carlson, K.E.; Ransom, V.H.; Roth, P.A.

    1987-03-01

    The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code has been developed to perform transient simulation of the thermal hydraulic systems that may be found in fusion reactors, space reactors, and other advanced systems. As an assessment of current capability the code was applied to a number of physical problems, both conceptual and actual experiments. Results indicate that the numerical solution to the basic conservation equations is technically sound, and that generally good agreement can be obtained when modeling relevant hydrodynamic experiments. The assessment also demonstrates basic fusion system modeling capability and verifies compatibility of the code with both CDC and CRAY mainframes. Areas where improvements could be made include constitutive modeling, which describes the interfacial exchange term. 13 refs., 84 figs

  9. Advice Complexity of the Online Induced Subgraph Problem

    DEFF Research Database (Denmark)

    Komm, Dennis; Královič, Rastislav; Královič, Richard

    2016-01-01

    We study these problems by investigating a generalized problem: for an arbitrary but fixed hereditary property, find some maximal induced subgraph having the property. We investigate this problem from the point of view of advice complexity, i.e. we ask how some additional information about the yet unrevealed parts of the input can influence the solution quality. We evaluate the information in a quantitative way by considering the best possible advice of given size that describes the unknown input. Using a result from Boyar et al. we give a tight trade-off relationship stating that, for inputs of length n, roughly n… For the online induced subgraph problem, preemption does not significantly help: we give a lower bound of Omega(n/(c^2\log c)) on the bits of advice that are needed to obtain competitive ratio c, where c is any increasing function bounded from above by \sqrt{n/\log n}. We also give a linear lower bound for c close to 1.

  10. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    Science.gov (United States)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  11. Analyzing the causation of a railway accident based on a complex network

    Science.gov (United States)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents.
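
    The analysis described above can be sketched with a small directed network of causal factors whose statistical indicators (degree, betweenness centrality) point to influential causes. The factor names and edges below are hypothetical illustrations, not data from the "7.23" accident investigation, and networkx stands in for whatever tooling the authors used.

```python
# Causal factors as nodes of a directed graph; centrality indicators rank them.
import networkx as nx

edges = [  # hypothetical cause -> effect links, for illustration only
    ("lightning strike", "signal equipment failure"),
    ("signal equipment failure", "wrong signal display"),
    ("inadequate signal inspection", "wrong signal display"),
    ("inadequate line-condition check", "dispatcher unaware of stopped train"),
    ("wrong signal display", "rear-end collision"),
    ("dispatcher unaware of stopped train", "rear-end collision"),
]
G = nx.DiGraph(edges)

betweenness = nx.betweenness_centrality(G)
for factor, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
    print(f"{factor:40s} out-degree={G.out_degree(factor)}  betweenness={score:.3f}")
```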

  12. Analyzing the causation of a railway accident based on a complex network

    International Nuclear Information System (INIS)

    Ma Xin; Li Ke-Ping; Luo Zi-Yan; Zhou Jin

    2014-01-01

    In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents. (interdisciplinary physics and related areas of science and technology)

  13. Complex saddle points and the sign problem in complex Langevin simulation

    International Nuclear Information System (INIS)

    Hayata, Tomoya; Hidaka, Yoshimasa; Tanizaki, Yuya

    2016-01-01

    By relating the complex Langevin simulation to the Lefschetz-thimble path integral, we show within the semiclassical analysis that it converges to a wrong result when the path-integral weight has different phases among the dominant complex saddle points. The equilibrium solution of the complex Langevin equation forms local distributions around the complex saddle points. Its ensemble average approximately becomes a direct sum of the averages over each local distribution, in which the relative phases among them are dropped. We propose that by taking these phases into account through reweighting, the wrong-convergence problem can be solved. However, this prescription may lead to a recurrence of the sign problem in the complex Langevin method for quantum many-body systems.

  14. Data Mining and Complex Problems: Case Study in Composite Materials

    Science.gov (United States)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  15. Fluid Ability (Gf) and Complex Problem Solving (CPS)

    OpenAIRE

    Patrick Kyllonen; Cristina Anguiano Carrasco; Harrison J. Kell

    2017-01-01

    Complex problem solving (CPS) has emerged over the past several decades as an important construct in education and in the workforce. We examine the relationship between CPS and general fluid ability (Gf) both conceptually and empirically. A review of definitions of the two factors, prototypical tasks, and the information processing analyses of performance on those tasks suggest considerable conceptual overlap. We review three definitions of CPS: a general definition emerging from the human pr...

  16. Topographical memory analyzed in mice using the Hamlet test, a novel complex maze.

    Science.gov (United States)

    Crouzier, Lucie; Gilabert, Damien; Rossel, Mireille; Trousse, Françoise; Maurice, Tangui

    2018-03-01

    The Hamlet test is an innovative device providing a complex environment for testing topographic memory in mice. Animals were trained in groups for weeks in a small village with a central agora and streets expanding from it towards five functionalized houses, where they could drink, eat, hide, run, or interact with a stranger mouse. Memory was tested by depriving mice of water or food and analyzing their ability to locate the Drink/Eat house. Exploration and memory were analyzed in different strains and genders, and after different training periods and delays. After 2 weeks of training, differences in exploration patterns were observed between strains, but not between genders. Neuroanatomical structures activated by training, identified using FosB/ΔFosB immunolabelling, showed an involvement of the hippocampus-subiculum-parahippocampal gyrus axis and dopaminergic structures. Training increased hippocampal neurogenesis (cell proliferation and neuronal maturation) and modified the amnesic efficacy of muscarinic or nicotinic cholinergic antagonists. Moreover, topographical disorientation in Alzheimer's disease was addressed using intracerebroventricular injection of amyloid β 25-35 peptide in trained mice. When retested after 7 days, Aβ 25-35-treated mice showed memory impairment. The Hamlet test specifically allows analysis of topographical memory in mice, based on a complex environment. It offers an innovative tool for various ethological or pharmacological research needs; for instance, it allowed examination of topographical disorientation, a warning sign in Alzheimer's disease. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Complexity of hierarchically and 1-dimensional periodically specified problems

    Energy Technology Data Exchange (ETDEWEB)

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Radhakrishnan, V.

    1995-08-23

    We study the complexity of various combinatorial and satisfiability problems when instances are specified using one of the following specifications: (1) the 1-dimensional finite periodic narrow specifications of Wanke and Ford et al.; (2) the 1-dimensional finite periodic narrow specifications with explicit boundary conditions of Gale; (3) the 2-way infinite 1-dimensional narrow periodic specifications of Orlin et al.; and (4) the hierarchical specifications of Lengauer et al. We obtain three general types of results. First, we prove that there is a polynomial time algorithm that, given a 1-FPN- or 1-FPN(BC)-specification of a graph (or a CNF formula), constructs a level-restricted L-specification of an isomorphic graph (or formula). This theorem, along with the hardness results proved here, provides alternative and unified proofs of many hardness results proved in the past either by Lengauer and Wagner or by Orlin. Second, we study the complexity of the generalized CNF satisfiability problems of Schaefer. Assuming P ≠ PSPACE, we characterize completely the polynomial time solvability of these problems when instances are specified as in (1), (2), (3) or (4). As applications of our first two types of results, we obtain a number of new PSPACE-hardness results and polynomial time algorithms for problems specified as in (1), (2), (3) or (4). Many of our results also hold for O(log N) bandwidth bounded planar instances.

  18. Is Principled Pragmatism a Viable Framework for Addressing Complex Problems?

    Science.gov (United States)

    Islam, S.

    2017-12-01

    Complex water problems are connected with many competing and often conflicting values, interests, and tools. These problems can't be addressed simply by applying dogmatic principles or a deal-making pragmatic approach. Because these problems are interconnected and interdependent, a final solution can't be pre-specified. Any intervention in a complex problem requires attention to both principles and pragmatism. Strict adherence to principles without pragmatism is often not actionable; pure pragmatism exercised without guiding principles is not sustainable. In a colloquial sense, pragmatism is often taken to suggest practical, opportunistic, and expedient approaches at the expense of principles. This perception appears to be rooted in the dichotomy between "being pragmatic" and "being ideological". The notion of principled pragmatism attempts to get away from this duality by focusing on how to make ideas clear and actionable; in other words, how to connect our thoughts to action given the context, constraints, and capacity. The principled pragmatism approach, rooted in equity and sustainability as guiding principles for water management, attempts to synthesize symbolic aspirations with realistic assessment to chart a trajectory toward an actionable subset of implementable solutions. Case studies from the Ganges Basin will show the utility of principled pragmatism for water management in a changing world.

  19. Using the Van Hiele theory to analyze primary school teachers' written work on geometrical proof problems

    Science.gov (United States)

    Jupri, A.

    2018-05-01

    Primary school teachers' lack of ability in deductive thinking, such as constructing geometrical proofs, is an important issue that needs to be addressed. In this paper, we report the results of a three-step field document study. The study was part of a pilot study for improving the deductive thinking ability of primary school teachers. First, we designed geometrical proof problems adapted from the literature. Second, we administered an individual written test to nine master's students in a primary education program who have experience as primary school mathematics teachers. Finally, we analyzed the written work from the perspective of the Van Hiele theory. The results revealed that even though about half of the teachers showed the ability to construct formal proofs, the rest still produced inappropriate proofs. For further investigation, we wonder whether primary school teachers would show better deductive thinking if the teaching of geometry were designed in a systematic and appropriate manner according to the Van Hiele theory.

  20. Employing the Hilbert-Huang Transform to analyze observed natural complex signals: Calm wind meandering cases

    Science.gov (United States)

    Martins, Luis Gustavo Nogueira; Stefanello, Michel Baptistella; Degrazia, Gervásio Annes; Acevedo, Otávio Costa; Puhales, Franciano Scremin; Demarco, Giuliano; Mortarini, Luca; Anfossi, Domenico; Roberti, Débora Regina; Costa, Felipe Denardin; Maldaner, Silvana

    2016-11-01

    In this study we analyze natural complex signals employing Hilbert-Huang spectral analysis. Specifically, low-wind meandering meteorological data are decomposed into turbulent and non-turbulent components. These non-turbulent movements, responsible for the absence of a preferential direction of the horizontal wind, provoke negative lobes in the meandering autocorrelation functions. The meandering characteristic time scales (meandering periods) are determined from the spectral peak provided by the Hilbert-Huang marginal spectrum. The magnitudes of the temperature and horizontal wind meandering periods obtained agree with the results found from the best fit of the heuristic meandering autocorrelation functions. Therefore, the new method represents a new procedure for evaluating meandering periods that does not employ mathematical expressions to represent observed meandering autocorrelation functions.
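
    A hedged sketch of the Hilbert-Huang procedure outlined above: empirical mode decomposition splits a synthetic "meandering plus turbulence" wind record into intrinsic mode functions, and the Hilbert transform of a low-frequency mode yields an instantaneous frequency from which a meandering period can be read off. It assumes the PyEMD package (installed as EMD-signal); the synthetic signal and the choice of which mode represents the non-turbulent component are illustrative assumptions, not the authors' data or processing chain.

```python
# EMD + Hilbert transform on a synthetic low-wind record (assumes PyEMD / EMD-signal).
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

rng = np.random.default_rng(0)
dt = 1.0                                       # sampling interval, s
t = np.arange(0, 3600, dt)                     # one hour of synthetic data
meander_period = 600.0                         # imposed meandering period, s
u = np.sin(2 * np.pi * t / meander_period) + 0.3 * rng.standard_normal(t.size)

imfs = EMD().emd(u, t)                         # intrinsic mode functions
slow_mode = imfs[-2]                           # assumed non-turbulent (low-frequency) mode

analytic = hilbert(slow_mode)                  # analytic signal of the slow mode
phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(phase, dt) / (2 * np.pi)   # instantaneous frequency, Hz
print(f"estimated meandering period: {1.0 / np.median(np.abs(inst_freq)):.0f} s")
```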

  1. The Similar Structures and Control Problems of Complex Systems

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In this paper, naturally evolving complex systems, such as biotic and social ones, are considered. Focusing on their structures, a noteworthy feature is the similarity in structures. The relations between the functions and behaviors of these systems and their similar structures are studied. Since the management of social systems and the course of evolution of biotic systems may be regarded as control processes, the research falls within the scope of control problems. Moreover, since it is difficult to model biotic and social systems, we start with the control problems of engineering systems possessing similar structures. The obtained results show that, for either linear or nonlinear systems and for many control problems, similar structures lead to a series of simplifications. In general, the original system may be decomposed into a reduced number of subsystems with lower dimensions and simpler structures. By virtue of such subsystems, the control problems of the original system can be solved more simply. Finally, we return to the biotic and social systems and give some analyses.

  2. Internet of THings Area Coverage Analyzer (ITHACA) for Complex Topographical Scenarios

    Directory of Open Access Journals (Sweden)

    Raúl Parada

    2017-10-01

    Full Text Available The number of connected devices is increasing worldwide. Not only in contexts like the Smart City, but also in rural areas, to provide advanced features like smart farming or smart logistics. Thus, wireless network technologies to efficiently allocate Internet of Things (IoT) and Machine to Machine (M2M) communications are necessary. Traditional cellular networks like Global System for Mobile communications (GSM) are widely used worldwide for IoT environments. Nevertheless, Low Power Wide Area Networks (LP-WAN) are becoming widespread as infrastructure for present and future IoT and M2M applications. Based also on a subscription service, the LP-WAN technology SIGFOX™ may compete with cellular networks in the M2M and IoT communications market, for instance in those projects where deploying the whole communications infrastructure is too complex or expensive. For decision makers to decide the most suitable technology for each specific application, signal coverage is within the key features. Unfortunately, besides simulated coverage maps, decision-makers do not have real coverage maps for SIGFOX™, as they can be found for cellular networks. Thereby, we propose the Internet of THings Area Coverage Analyzer (ITHACA), a signal analyzer prototype to provide automated signal coverage maps and analytics for LP-WAN. Experiments performed in the Gran Canaria Island, Spain (with both urban and complex topographic rural environments), returned a real SIGFOX™ service availability above 97% and above 11% more coverage with respect to the company-provided simulated maps. We expect that ITHACA may help decision makers to deploy the most suitable technologies for future IoT and M2M projects.

  3. Analyzing Pre-Service Primary Teachers' Fraction Knowledge Structures through Problem Posing

    Science.gov (United States)

    Kilic, Cigdem

    2015-01-01

    This study aimed to determine pre-service primary teachers' knowledge structures of fractions through problem posing activities. A total of 90 pre-service primary teachers participated in the study. A problem posing test consisting of two questions was used, and the participants were asked to generate as many problems as possible based on the…

  4. Analyzing Interpersonal Problem Solving in Terms of Solution Focused Approach and Humor Styles of University Student

    Science.gov (United States)

    Koc, Hayri; Arslan, Coskun

    2017-01-01

    In this study, university students' interpersonal problem solving approaches were investigated in terms of solution-focused approach and humor styles. The participants were 773 university students (542 female and 231 male, aged 17-33). To determine the university students' problem solving approaches, the "Interpersonal Problem Solving…

  5. Comparisons of complex network based models and real train flow model to analyze Chinese railway vulnerability

    International Nuclear Information System (INIS)

    Ouyang, Min; Zhao, Lijing; Hong, Liu; Pan, Zhezhe

    2014-01-01

    Recently numerous studies have applied complex network based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards. But how effectively these models capture the real performance response is still a question worthy of research. Taking the Chinese railway system as an example, this paper selects three typical complex network based models, including the purely topological model (PTM), the purely shortest path model (PSPM), and the weight (link length) based shortest path model (WBSPM), to analyze railway accessibility and flow-based vulnerability and compare their results with those from the real train flow model (RTFM). The results show that the WBSPM can produce train routines with 83% of stations and 77% of railway links identical to the real routines and approaches the RTFM best for railway vulnerability under both single and multiple component failures. The correlation coefficient for accessibility vulnerability from the WBSPM and RTFM under single station failures is 0.96, while it is 0.92 for flow-based vulnerability; under multiple station failures, where each station has the same failure probability fp, the WBSPM produces almost identical vulnerability results to those from the RTFM under almost all failure scenarios when fp is larger than 0.62 for accessibility vulnerability and 0.86 for flow-based vulnerability.
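
    The weight-based shortest path idea can be sketched on a toy rail graph: accessibility is summed over shortest-path lengths and recomputed after a station failure. Station names and link lengths are hypothetical, and the accessibility index below (sum of inverse shortest-path distances) is one plausible choice, not necessarily the metric used in the paper.

```python
# Accessibility on a toy weighted rail graph, before and after a station failure.
import networkx as nx

def accessibility(G):
    """Sum of inverse weighted shortest-path lengths over all reachable station pairs."""
    total = 0.0
    for src, dists in nx.shortest_path_length(G, weight="length"):
        total += sum(1.0 / d for dst, d in dists.items() if dst != src)
    return total

G = nx.Graph()
G.add_weighted_edges_from(
    [("A", "B", 120), ("B", "C", 90), ("C", "D", 150), ("B", "D", 300), ("D", "E", 80)],
    weight="length",        # link lengths in km (hypothetical)
)

base = accessibility(G)
failed = G.copy()
failed.remove_node("B")     # single-station failure
print(f"relative accessibility after failure of B: {accessibility(failed) / base:.2f}")
```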

  6. A case study of analyzing 11th graders’ problem solving ability on heat and temperature topic

    Science.gov (United States)

    Yulianawati, D.; Muslim; Hasanah, L.; Samsudin, A.

    2018-05-01

    Problem solving ability must be possessed by students after the process of physics learning so that physics concepts become meaningful. Consequently, this research aims to describe their problem solving ability. Metacognition contributes to students' success in solving problems in physics learning. This research was implemented with 37 science students (30 women and 7 men) of the eleventh grade from one of the secondary schools in Bandung. The research method was a single case study with an embedded research design. The instrument is the Heat and Temperature Problem Solving Ability Test (HT-PSAT), which consists of twelve questions from three context problems. The result shows that the average value of the test is 8.27 out of the maximum total value of 36. In conclusion, eleventh graders' problem-solving ability is still below what is expected. The implication of the findings is the need to create learning situations that may develop students' problem solving ability.

  7. Redundant interferometric calibration as a complex optimization problem

    Science.gov (United States)

    Grobler, T. L.; Bernardi, G.; Kenyon, J. S.; Parsons, A. R.; Smirnov, O. M.

    2018-05-01

    Observations of the redshifted 21 cm line from the epoch of reionization have recently motivated the construction of low-frequency radio arrays with highly redundant configurations. These configurations provide an alternative calibration strategy - `redundant calibration' - and boost sensitivity on specific spatial scales. In this paper, we formulate calibration of redundant interferometric arrays as a complex optimization problem. We solve this optimization problem via the Levenberg-Marquardt algorithm. This calibration approach is more robust to initial conditions than current algorithms and, by leveraging an approximate matrix inversion, allows for further optimization and an efficient implementation (`redundant STEFCAL'). We also investigated using the preconditioned conjugate gradient method as an alternative to the approximate matrix inverse, but found that its computational performance is not competitive with respect to `redundant STEFCAL'. The efficient implementation of this new algorithm is made publicly available.
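
    A hedged sketch of redundant calibration posed as a least-squares problem: every measured visibility is modelled as g_i * conj(g_j) * y_b, where y_b is the true visibility shared by all baselines of redundant type b, and the complex residuals are split into real and imaginary parts for a real-valued solver. The array layout, gains, and noise-free data are synthetic, and scipy.optimize.least_squares (trust-region by default) stands in for the paper's Levenberg-Marquardt "redundant STEFCAL" implementation.

```python
# Per-antenna gains and per-type true visibilities fitted jointly by least squares.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
n_ant = 5                                       # antennas on a regular 1-D grid
pairs = [(i, j) for i in range(n_ant) for j in range(i + 1, n_ant)]
types = sorted({j - i for i, j in pairs})       # redundant baseline types (separations)
btype = {sep: k for k, sep in enumerate(types)}

true_g = 1 + 0.1 * (rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant))
true_y = rng.standard_normal(len(types)) + 1j * rng.standard_normal(len(types))
data = np.array([true_g[i] * np.conj(true_g[j]) * true_y[btype[j - i]] for i, j in pairs])

def unpack(x):
    g = x[:n_ant] + 1j * x[n_ant:2 * n_ant]
    y = x[2 * n_ant:2 * n_ant + len(types)] + 1j * x[2 * n_ant + len(types):]
    return g, y

def residuals(x):
    g, y = unpack(x)
    model = np.array([g[i] * np.conj(g[j]) * y[btype[j - i]] for i, j in pairs])
    r = data - model
    return np.concatenate([r.real, r.imag])     # real-valued residual vector

# Start gains at unity and each true visibility at its redundant-group average.
y0 = np.array([data[[k for k, (i, j) in enumerate(pairs) if j - i == sep]].mean()
               for sep in types])
x0 = np.concatenate([np.ones(n_ant), np.zeros(n_ant), y0.real, y0.imag])
fit = least_squares(residuals, x0)
print(f"final residual norm: {np.linalg.norm(fit.fun):.2e}")
```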

  8. Application of Artificial Neural Networks to Complex Groundwater Management Problems

    International Nuclear Information System (INIS)

    Coppola, Emery; Poulton, Mary; Charles, Emmanuel; Dustman, John; Szidarovszky, Ferenc

    2003-01-01

    As water quantity and quality problems become increasingly severe, accurate prediction and effective management of scarcer water resources will become critical. In this paper, the successful application of artificial neural network (ANN) technology is described for three types of groundwater prediction and management problems. In the first example, an ANN was trained with simulation data from a physically based numerical model to predict head (groundwater elevation) at locations of interest under variable pumping and climate conditions. The ANN achieved a high degree of predictive accuracy, and its derived state-transition equations were embedded into a multiobjective optimization formulation and solved to generate a trade-off curve depicting water supply in relation to contamination risk. In the second and third examples, ANNs were developed with real-world hydrologic and climate data for different hydrogeologic environments. For the second problem, an ANN was developed using data collected for a 5-year, 8-month period to predict heads in a multilayered surficial and limestone aquifer system under variable pumping, state, and climate conditions. Using weekly stress periods, the ANN substantially outperformed a well-calibrated numerical flow model for the 71-day validation period, and provided insights into the effects of climate and pumping on water levels. For the third problem, an ANN was developed with data collected automatically over a 6-week period to predict hourly heads in 11 high-capacity public supply wells tapping a semiconfined bedrock aquifer and subject to large well-interference effects. Using hourly stress periods, the ANN accurately predicted heads for 24-hour periods in all public supply wells. These test cases demonstrate that the ANN technology can solve a variety of complex groundwater management problems and overcome many of the problems and limitations associated with traditional physically based flow models
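
    A minimal sketch of the ANN-as-state-transition idea: a small multilayer perceptron learns to map pumping rates and a climate proxy to groundwater head. The data are synthetic (a made-up response surface plus noise) and the network architecture is arbitrary; neither reproduces the numerical-model or field data sets used in the paper.

```python
# A small ANN surrogate mapping stresses (pumping, recharge) to head, on synthetic data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2000
pumping = rng.uniform(0, 500, size=(n, 3))      # m^3/day at three wells (assumed)
recharge = rng.uniform(0, 10, size=(n, 1))      # mm/day climate proxy (assumed)
X = np.hstack([pumping, recharge])
head = (100 - 0.01 * pumping.sum(axis=1)        # synthetic head response, m
        + 1.5 * recharge[:, 0]
        + 0.5 * rng.standard_normal(n))

X_train, X_test, y_train, y_test = train_test_split(X, head, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0))
ann.fit(X_train, y_train)
print(f"R^2 on held-out data: {ann.score(X_test, y_test):.3f}")
```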

  9. Complex analysis and dynamical systems new trends and open problems

    CERN Document Server

    Golberg, Anatoly; Jacobzon, Fiana; Shoikhet, David; Zalcman, Lawrence

    2018-01-01

    This book focuses on developments in complex dynamical systems and geometric function theory over the past decade, showing strong links with other areas of mathematics and the natural sciences. Traditional methods and approaches surface in physics and in the life and engineering sciences with increasing frequency – the Schramm‐Loewner evolution, Laplacian growth, and quadratic differentials are just a few typical examples. This book provides a representative overview of these processes and collects open problems in the various areas, while at the same time showing where and how each particular topic evolves. This volume is dedicated to the memory of Alexander Vasiliev.

  10. Design patterns for instructional materials that foster proficiency at analyzing and interpreting complex geoscience data

    Science.gov (United States)

    Kastens, K. A.; Krumhansl, R.

    2016-12-01

    The Next Generation Science Standards incorporate a stronger emphasis on having students work with data than did prior standards. This emphasis is most obvious in Practice 4: Analyzing and Interpreting Data, but also permeates performance expectations built on Practice 2 when students test models, Practice 6 when students construct explanations, and Practice 7 when student test claims with evidence. To support curriculum developers who wish to guide high school students towards more sophisticated engagement with complex data, we analyzed a well-regarded body of instructional materials designed for use in introductory college courses (http://serc.carleton.edu/integrate/teaching_materials/). Our analysis sought design patterns that can be reused for a variety of topics at the high school or college level. We found five such patterns, each of which was used in at least half of the modules analyzed. We describe each pattern, provide an example, and hypothesize a theory of action that could explain how the sequence of activities leverages known perceptual, cognitive and/or social processes to foster learning from and about data. In order from most to least frequent, the observed design patterns are as follows: In Data Puzzles, students respond to guiding questions about high-value snippets of data pre-selected and sequenced by the curriculum developer to lead to an Aha! inference. In Pooling Data to See the Big Picture, small groups analyze different instances of analogous phenomenon (e.g. different hurricanes, or different divergent plate boundaries) and pool their insights to extract the commonalities that constitute the essence of that phenomenon. In Make a Decision or Recommendation, students combine geoscience data with other factors (such as economic or environmental justice concerns) to make a decision or recommendation about a human or societal action. In Predict-Observe-Explain, students make a prediction about what the Earth will look like under conditions

  11. Fluid Ability (Gf) and Complex Problem Solving (CPS)

    Directory of Open Access Journals (Sweden)

    Patrick Kyllonen

    2017-07-01

    Full Text Available Complex problem solving (CPS) has emerged over the past several decades as an important construct in education and in the workforce. We examine the relationship between CPS and general fluid ability (Gf) both conceptually and empirically. A review of definitions of the two factors, prototypical tasks, and the information processing analyses of performance on those tasks suggest considerable conceptual overlap. We review three definitions of CPS: a general definition emerging from the human problem solving literature; a more specialized definition from the “German School” emphasizing performance in many-variable microworlds, with high domain-knowledge requirements; and a third definition based on performance in Minimal Complex Systems (MCS), with fewer variables and reduced knowledge requirements. We find a correlation of 0.86 between expert ratings of the importance of CPS and Gf across 691 occupations in the O*NET database. We find evidence that employers value both Gf and CPS skills, but CPS skills more highly, even after controlling for the importance of domain knowledge. We suggest that this may be due to CPS requiring not just cognitive ability but additionally skill in applying that ability in domains. We suggest that a fruitful future direction is to explore the importance of domain knowledge in CPS.

  12. Setting up problems raised by construction of the EDF-Eurodif complex

    International Nuclear Information System (INIS)

    Fontaine, J.P.; Roux, J.P.

    1977-01-01

    After a presentation of the Tricastin site and the nuclear complex to be built there, the main problems of a social, economic or administrative nature arising from the establishment of the site are analyzed, and the solutions applied to overcome them are described. In conclusion, the authors note that the largest site in Europe should be carried through to completion in the best interests of the local communities, the engineers and the populations concerned. [fr]

  13. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS's complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  14. The problem of motivating teaching staff in a complex amalgamation.

    Science.gov (United States)

    Kenrick, M A

    1993-09-01

    This paper addresses some of the problems brought about by the merger of a number of schools of nursing into a new complex amalgamation. A very real concern in the new colleges of nursing and midwifery in the United Kingdom is the effect of amalgamation on management systems and staff morale. The main focus of this paper is the motivation of staff during this time of change. There is currently a lack of security amongst staff and in many instances the personal job satisfaction of nurse teachers and managers of nurse education has been reduced, which has made the task of motivating staff difficult. Hence, two major theories of motivation and the implications of these theories for managers of nurse education are discussed. The criteria used for the selection of managers within the new colleges, leadership styles and organizational structures are reviewed. The amalgamations have brought about affiliation with higher-education institutions. Some problems associated with these mergers and the effects on the motivation of staff both within the higher-education institutions and the nursing colleges are outlined. Strategies for overcoming some of the problems are proposed including job enlargement, job enrichment, potential achievement rewards and the use of individual performance reviews which may be useful for assessing the ability of all staff, including managers, in the new amalgamations.

  15. Analyze the optimal solutions of optimization problems by means of fractional gradient based system using VIM

    Directory of Open Access Journals (Sweden)

    Firat Evirgen

    2016-04-01

    Full Text Available In this paper, a class of nonlinear programming (NLP) problems is modeled with a gradient-based system of fractional order differential equations in Caputo's sense. To see the overlap between the equilibrium point of the fractional order dynamic system and the optimal solution of the NLP problem over a longer timespan, the multistage variational iteration method is applied. The comparisons among the multistage variational iteration method, the variational iteration method and the fourth order Runge-Kutta method in fractional and integer order show that the fractional order model and techniques can be seen as an effective and reliable tool for finding optimal solutions of nonlinear programming problems.
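
    A hedged illustration of the core idea in its simplest, integer-order form: a nonlinear programming objective is turned into a gradient-based dynamical system whose equilibrium point approximates the optimum. The example problem and penalty weight are invented, dx/dt = -grad f(x) is integrated with scipy's solve_ivp, and the Caputo fractional derivative and multistage variational iteration method of the paper are not reproduced.

```python
# Gradient-flow view of a penalized NLP: the ODE equilibrium approximates the optimum.
import numpy as np
from scipy.integrate import solve_ivp

MU = 50.0   # quadratic penalty weight (an assumption)

def neg_grad(t, x):
    # Minimize (x1 - 1)^2 + (x2 - 2)^2 subject to x1 + x2 <= 2 via a quadratic penalty.
    g = np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
    violation = max(x[0] + x[1] - 2.0, 0.0)
    g += 2.0 * MU * violation * np.array([1.0, 1.0])
    return -g                                  # gradient flow: dx/dt = -grad f(x)

sol = solve_ivp(neg_grad, (0.0, 20.0), [0.0, 0.0], rtol=1e-8)
print("equilibrium (approximate optimum):", sol.y[:, -1])   # close to (0.5, 1.5)
```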

  16. One Problem, Many Solutions: Simple Statistical Approaches Help Unravel the Complexity of the Immune System in an Ecological Context

    NARCIS (Netherlands)

    Buehler, Deborah M.; Versteegh, Maaike A.; Matson, Kevin D.; Tieleman, Irene

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many

  17. One problem, many solutions: simple statistical approaches help unravel the complexity of the immune system in an ecological context

    NARCIS (Netherlands)

    Buehler, D.M.; Versteegh, M.A.; Matson, K.D.; Tieleman, B.I.

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many

  18. A Hybrid DGTD-MNA Scheme for Analyzing Complex Electromagnetic Systems

    KAUST Repository

    Li, Peng

    2015-01-07

    A hybrid electromagnetics (EM)-circuit simulator for analyzing complex systems consisting of EM devices loaded with nonlinear multi-port lumped circuits is described. The proposed scheme splits the computational domain into two subsystems: EM and circuit subsystems, where field interactions are modeled using Maxwell and Kirchhoff equations, respectively. Maxwell equations are discretized using a discontinuous Galerkin time domain (DGTD) scheme while Kirchhoff equations are discretized using a modified nodal analysis (MNA)-based scheme. The coupling between the EM and circuit subsystems is realized at the lumped ports, where related EM fields and circuit voltages and currents are allowed to “interact” via numerical flux. To account for nonlinear lumped circuit elements, the standard Newton-Raphson method is applied at every time step. Additionally, a local time-stepping scheme is developed to improve the efficiency of the hybrid solver. Numerical examples consisting of EM systems loaded with single and multiport linear/nonlinear circuit networks are presented to demonstrate the accuracy, efficiency, and applicability of the proposed solver.

  19. Modeling and Analyzing Operational Decision-Making Synchronization of C2 Organization in Complex Environment

    Directory of Open Access Journals (Sweden)

    Zou Zhigang

    2013-01-01

    Full Text Available In order to improve the capability of operational decision-making synchronization (ODMS) in a command and control (C2) organization, the paper proposes that ODMS is a negotiation process of situation cognition with three phases, "situation cognition, situation interaction and decision-making synchronization", in a complex environment, and then gives the model and strategies of ODMS in quantitative terms. Firstly, measurement indexes for the three phases above are given based on the time consumed in negotiation, and three patterns are proposed for negotiating in a timely manner and with high quality during situation interaction. Secondly, the ODMS model with two stages in a continuously changing situation is put forward, and ODMS strategies are analyzed under environmental influence and time restrictions. Thirdly, simulation cases are given to validate the process of ODMS under different continuously changing situations; the results of this model fulfill the actual restrictions better than previous models, and the process of ODMS can be adjusted more reasonably to improve the capability of ODMS. We then discuss the case and summarize the influencing factors of ODMS in the C2 organization as organization structure, shared information resources, negotiation patterns, and allocation of decision rights.

  20. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each taking a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as

  1. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each taking a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as

  2. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Directory of Open Access Journals (Sweden)

    Vera Hagemann

    2017-09-01

    Full Text Available Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each taking a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships exist between trust and action processes or team performance. Likewise, no relationships were found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes

  3. Quantum trajectories in complex space: One-dimensional stationary scattering problems

    International Nuclear Information System (INIS)

    Chou, C.-C.; Wyatt, Robert E.

    2008-01-01

    One-dimensional time-independent scattering problems are investigated in the framework of the quantum Hamilton-Jacobi formalism. The equation for the local approximate quantum trajectories near the stagnation point of the quantum momentum function is derived, and the first derivative of the quantum momentum function is related to the local structure of quantum trajectories. Exact complex quantum trajectories are determined for two examples by numerically integrating the equations of motion. For the soft potential step, some particles penetrate into the nonclassical region, and then turn back to the reflection region. For the barrier scattering problem, quantum trajectories may spiral into the attractors or from the repellers in the barrier region. Although the classical potentials extended to complex space show different pole structures for each problem, the quantum potentials present the same second-order pole structure in the reflection region. This paper not only analyzes complex quantum trajectories and the total potentials for these examples but also demonstrates general properties and similar structures of the complex quantum trajectories and the quantum potentials for one-dimensional time-independent scattering problems
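
    A sketch of one complex quantum trajectory for the reflection region, under the quantum Hamilton-Jacobi relations used above: p(x) = (hbar/i) d(ln psi)/dx and m dx/dt = p(x), with x allowed to be complex. The wavefunction psi = exp(ikx) + R exp(-ikx), the units hbar = m = k = 1, the reflection amplitude R = 0.5, and the launch point are all illustrative choices, not the paper's worked examples.

```python
# One complex quantum trajectory, m*dx/dt = p(x), for psi = exp(ikx) + R*exp(-ikx).
import numpy as np
from scipy.integrate import solve_ivp

HBAR = M = K = 1.0
R = 0.5                                        # reflection amplitude (illustrative)

def p_of_x(x):
    # Quantum momentum function p(x) = (hbar/i) * psi'(x) / psi(x).
    num = 1j * K * (np.exp(1j * K * x) - R * np.exp(-1j * K * x))
    den = np.exp(1j * K * x) + R * np.exp(-1j * K * x)
    return (HBAR / 1j) * num / den

def rhs(t, y):
    x = y[0] + 1j * y[1]                       # complex position
    v = p_of_x(x) / M                          # complex velocity
    return [v.real, v.imag]

x0 = -0.5 + 0.6j                               # launch point near a stagnation point of p
sol = solve_ivp(rhs, (0.0, 20.0), [x0.real, x0.imag], max_step=0.05)
traj = sol.y[0] + 1j * sol.y[1]
print("trajectory start:", traj[0], "end:", traj[-1])   # circles the stagnation point
```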

  4. Efficient algorithms for analyzing the singularly perturbed boundary value problems of fractional order

    Science.gov (United States)

    Sayevand, K.; Pichaghchi, K.

    2018-04-01

    In this paper, we are concerned with the description of singularly perturbed boundary value problems in the scope of fractional calculus. We should mention that one of the main methods used to solve these problems in classical calculus is the so-called matched asymptotic expansion method. However, this is not achievable via the existing classical definitions of the fractional derivative, because they do not obey the chain rule, which is one of the key elements of the matched asymptotic expansion method. In order to accommodate this method to fractional derivatives, we employ a relatively new derivative, the so-called local fractional derivative. Using the properties of the local fractional derivative, we extend the matched asymptotic expansion method to the scope of fractional calculus and introduce a reliable new algorithm to develop approximate solutions of singularly perturbed boundary value problems of fractional order. In the new method, the original problem is partitioned into inner and outer solution equations. The reduced equation is solved with suitable boundary conditions which provide the terminal boundary conditions for the boundary layer correction. The inner solution problem is next solved as a solvable boundary value problem. The width of the boundary layer is approximated using an appropriate resemblance function. Some theoretical results are established and proved. Some illustrative examples are solved and the results are compared with those of the matched asymptotic expansion method and the homotopy analysis method to demonstrate the accuracy and efficiency of the method. It can be observed that the proposed method approximates the exact solution very well not only in the boundary layer, but also away from the layer.
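
    A hedged illustration of the classical (integer-order) matched asymptotic expansion that the paper generalizes, on the model problem eps*y'' + y' = 1 with y(0) = 0, y(1) = 2 (chosen for illustration only). The outer solution x + 1 plus the boundary-layer correction -exp(-x/eps) gives the composite approximation, which is checked against scipy's solve_bvp.

```python
# Composite matched-asymptotics solution of eps*y'' + y' = 1, y(0)=0, y(1)=2, vs. solve_bvp.
import numpy as np
from scipy.integrate import solve_bvp

EPS = 0.05

def ode(x, y):
    # First-order system: y[0] = y, y[1] = y';  eps*y'' + y' = 1  =>  y'' = (1 - y') / eps
    return np.vstack([y[1], (1.0 - y[1]) / EPS])

def bc(ya, yb):
    return np.array([ya[0] - 0.0, yb[0] - 2.0])

x = np.linspace(0.0, 1.0, 200)
numeric = solve_bvp(ode, bc, x, np.zeros((2, x.size)), max_nodes=10000)

# Outer solution (x + 1) plus inner boundary-layer correction (-exp(-x/eps)), already matched.
composite = x + 1.0 - np.exp(-x / EPS)
print(f"max |numeric - composite| = {np.max(np.abs(numeric.sol(x)[0] - composite)):.2e}")
```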

  5. Understanding the determinants of problem-solving behavior in a complex environment

    Science.gov (United States)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates of the probable effects of each environmental feature.

  6. Applications of systems thinking and soft operations research in managing complexity from problem framing to problem solving

    CERN Document Server

    2016-01-01

    This book captures current trends and developments in the field of systems thinking and soft operations research, which can be applied to solve today's problems of dynamic complexity and interdependency. Such ‘wicked problems’ and messes are seemingly intractable problems, characterized as value-laden, ambiguous, and unstable, that resist being tamed by classical problem solving. Actions and interventions associated with this complex problem space can have highly unpredictable and unintended consequences. Examples of such complex problems include health care reform, global climate change, transnational serious and organized crime, terrorism, homeland security, human security, disaster management, and humanitarian aid. Moving towards the development of solutions to these complex problem spaces depends on the lens we use to examine them and how we frame the problem. It will be shown that systems thinking and soft operations research have had great success in contributing to the management of complexity.

  7. Case Studies in Critical Ecoliteracy: A Curriculum for Analyzing the Social Foundations of Environmental Problems

    Science.gov (United States)

    Turner, Rita; Donnelly, Ryan

    2013-01-01

    This article outlines the features and application of a set of model curriculum materials that utilize eco-democratic principles and humanities-based content to cultivate critical analysis of the cultural foundations of socio-environmental problems. We first describe the goals and components of the materials, then discuss results of their use in…

  8. The application of game theory and cognitive economy to analyze the problem of undesired location

    International Nuclear Information System (INIS)

    Villani, S.

    2008-01-01

    Analysts of public decision-making processes have long been discussing how to establish proper strategies for managing environmental conflicts - above all the so-called problems of undesired location of public works and facilities - efficiently (i.e. on a short time scale, so as to ensure decision and agreement stability) and fairly (the parties' satisfaction being itself a further guarantee of decision and agreement stability). Every such strategy, however, is still a work in progress, like a universe to be created and explored. Therefore, in this paper we focus on the analysis of the problem and also provide some theoretical proposals for a new interpretive model of public decision-making processes based on the achievements of two new fields: evolutionary game theory and cognitive economy. Both sciences share their field of investigation with law and economics. [it]

  9. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    Science.gov (United States)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks and other intricacies. However, existing network representations still lack crucial features in order to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., network of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote by deep graph, constitutes a concise, yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of
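
    A minimal pandas sketch of the partitioning idea described above: nodes and edges are tables of properties, grouping the node table by one property yields supernodes, and aggregating the edges between the resulting groups yields superedges. The toy data and the choice of "region" as the partitioning property are assumptions; the authors' own framework and any accompanying software are not used here.

```python
# Nodes and edges as property tables; groupby yields supernodes and superedges.
import pandas as pd

nodes = pd.DataFrame({
    "id":     [0, 1, 2, 3, 4, 5],
    "region": ["A", "A", "B", "B", "B", "C"],   # partitioning property (assumed)
    "value":  [1.0, 2.0, 0.5, 1.5, 2.5, 3.0],
}).set_index("id")

edges = pd.DataFrame({
    "src":    [0, 0, 1, 2, 3, 4],
    "dst":    [1, 2, 3, 4, 5, 5],
    "weight": [0.9, 0.2, 0.4, 0.8, 0.3, 0.7],
})

# Supernodes: aggregate node properties within each group of the partition.
supernodes = nodes.groupby("region")["value"].agg(["count", "mean"])

# Superedges: label each edge with its (source group, target group) pair, then aggregate.
edges["src_region"] = nodes.loc[edges["src"], "region"].values
edges["dst_region"] = nodes.loc[edges["dst"], "region"].values
superedges = edges.groupby(["src_region", "dst_region"])["weight"].agg(["count", "sum"])

print(supernodes, superedges, sep="\n\n")
```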

  10. Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals

    KAUST Repository

    Martinez, Josue G.; Huang, Jianhua Z.; Burghardt, Robert C.; Barhoumi, Rola; Carroll, Raymond J.

    2009-01-01

    ) to extract the signals from such movies, in a way that is semi-automatic and tuned closely to the actual data and their many complexities. These complexities include the following. First, the images themselves are of no interest: all interest focuses

  11. Making mobility-related disability better: a complex response to a complex problem

    Directory of Open Access Journals (Sweden)

    Rockwood Kenneth

    2012-10-01

    Full Text Available Abstract Mobility disability in older adults can arise from single system problems, such as discrete musculoskeletal injury. In frail older adults, however, mobility disability is part of a complex web of problems. The approach to their rehabilitation must take that complexity into account, as is reported by Fairhall et al. First, their overall health state must be assessed, which is achieved by a comprehensive geriatric assessment. The assessment can show how a particular patient came to be disabled, so that an individualized care plan can be worked out. Whether this approach works in general can be evaluated by looking at group differences in mean mobility test scores. Knowing whether it has worked in the individual patient requires an individualized measure. This is because not every patient starts from the same point, and not every patient achieves success by aiming for the same goal. For one patient, walking unassisted for three metres would be a triumph; for another it would be a tragedy. Unless we understand the complexity of the needs of frail older adults, we will neither be able to treat them effectively nor evaluate our efforts sensibly. Please see related article http://www.biomedcentral.com/1741-7015/10/120

  12. Making mobility-related disability better: a complex response to a complex problem.

    Science.gov (United States)

    Rockwood, Kenneth

    2012-10-15

    Mobility disability in older adults can arise from single system problems, such as discrete musculoskeletal injury. In frail older adults, however, mobility disability is part of a complex web of problems. The approach to their rehabilitation must take that complexity into account, as is reported by Fairhall et al. First, their overall health state must be assessed, which is achieved by a comprehensive geriatric assessment. The assessment can show how a particular patient came to be disabled, so that an individualized care plan can be worked out. Whether this approach works in general can be evaluated by looking at group differences in mean mobility test scores. Knowing whether it has worked in the individual patient requires an individualized measure. This is because not every patient starts from the same point, and not every patient achieves success by aiming for the same goal. For one patient, walking unassisted for three metres would be a triumph; for another it would be a tragedy. Unless we understand the complexity of the needs of frail older adults, we will neither be able to treat them effectively nor evaluate our efforts sensibly. Please see related article http://www.biomedcentral.com/1741-7015/10/120.

  13. [Methamphetamine - just another stimulant or a more complex problem?].

    Science.gov (United States)

    Lecomte, Tania; Massé, Marjolaine

    2014-01-01

    Methamphetamine (MA) has recently become very popular in the media, due in part to its increasing use as well as its psychotropic effects and the negative consequences of its use. Is it a stimulant like any other, or does methamphetamine use lead to specific difficulties in its users? The aim of this article is to provide a brief review of the literature by explaining some of the reasons for its popularity in Canada as well as the physical, dental, psychiatric, cognitive and legal problems associated with its use. MA's popularity: Regarding its popularity, MA has benefitted from multiple factors, namely its low cost for users and manufacturers, its quick and intense psychotropic effects (increased energy, sexual arousal, rapid thinking, sleeplessness, lack of appetite), its easy access, as well as its various methods of ingestion (nasal, oral, injection). MA abuse also results in a multitude of negative effects, both physical and mental. MA's physical effects: In terms of negative physical effects, cardiac problems, skin infections, sexually transmitted (and injection-related) diseases as well as meth mouth are described. MA's mental effects: In terms of mental consequences, two recently published Canadian studies revealing high rates of depression symptoms and of sustained psychotic symptoms in a subgroup of MA users are presented. Studies reporting various cognitive deficits in MA users are also reviewed, including reports of a high prevalence of childhood attention deficit and hyperactivity disorder diagnoses among adult MA users. Furthermore, MA abusers are documented as having been highly exposed to trauma in their lives, with many presenting with post-traumatic stress disorder criteria. This manuscript also explores the reasons behind the forensic profiles of individuals using MA, particularly the increased tendency toward violent acts, the high incarceration rates of homeless users and the high percentage of individuals diagnosed with antisocial

  14. Beyond Psychometrics: The Difference between Difficult Problem Solving and Complex Problem Solving

    Directory of Open Access Journals (Sweden)

    Jens F. Beckmann

    2017-10-01

    Full Text Available In this paper we argue that a synthesis of findings across the various sub-areas of research in complex problem solving, and consequently progress in theory building, is hampered by an insufficient differentiation of complexity and difficulty. In the proposed framework of person, task, and situation (PTS), complexity is conceptualized as a quality that is determined by the cognitive demands that the characteristics of the task and the situation impose. Difficulty represents the quantifiable level of a person’s success in dealing with such demands. We use the well-documented “semantic effect” as an exemplar for testing some of the conceptual assumptions derived from the PTS framework. We demonstrate how a differentiation between complexity and difficulty can help take us beyond a potentially too narrowly defined psychometric perspective and subsequently gain a better understanding of the cognitive mechanisms behind this effect. In an empirical study a total of 240 university students were randomly allocated to one of four conditions. The four conditions resulted from contrasting the semanticity level of the variable labels used in the CPS system (high vs. low) and two instruction conditions for how to explore the CPS system’s causal structure (starting with the assumption that all relationships between variables existed vs. starting with the assumption that none of the relationships existed). The variation in the instruction aimed at inducing knowledge acquisition processes of either (1) systematic elimination of presumptions, or (2) systematic compilation of a mental representation of the causal structure underpinning the system. Results indicate that (a) it is more complex to adopt a “blank slate” perspective under high semanticity as it requires processes of inhibiting prior assumptions, and (b) it seems more difficult to employ a systematic heuristic when testing against presumptions. In combination, situational characteristics, such as the

  15. ReaderBench: A Multi-lingual Framework for Analyzing Text Complexity

    NARCIS (Netherlands)

    Dascalu, Mihai; Gutu, Gabriel; Ruseti, Stefan; Paraschiv, Ionut Cristian; Dessus, Philippe; McNamara, Danielle S.; Crossley, Scott; Trausan-Matu, Stefan

    2017-01-01

    Assessing textual complexity is a difficult, but important endeavor, especially for adapting learning materials to students’ and readers’ levels of understanding. With the continuous growth of information technologies spanning through various research fields, automated assessment tools have

  16. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    National Research Council Canada - National Science Library

    Edge, Kenneth S

    2007-01-01

    .... In addition to developing protection trees, this research improves the existing concept of attack trees and develops rule sets for the manipulation of metrics used in the security of complex systems...

  17. Cross-national comparisons of complex problem-solving strategies in two microworlds.

    Science.gov (United States)

    Güss, C Dominik; Tuason, Ma Teresa; Gerhard, Christiane

    2010-04-01

    Research in the fields of complex problem solving (CPS) and dynamic decision making using microworlds has been mainly conducted in Western industrialized countries. This study analyzes the CPS process by investigating thinking-aloud protocols in five countries. Participants were 511 students from Brazil, Germany, India, the Philippines, and the United States who worked on two microworlds. On the basis of cultural-psychological theories, specific cross-national differences in CPS strategies were hypothesized. Following theories of situatedness of cognition, hypotheses about the specific frequency of problem-solving strategies in the two microworlds were developed. Results of the verbal protocols showed (a) modification of the theoretical CPS model, (b) task dependence of CPS strategies, and (c) cross-national differences in CPS strategies. Participants' CPS processes were particularly influenced by country-specific problem-solving strategies. Copyright © 2009 Cognitive Science Society, Inc.

  18. Decomposition of overlapping protein complexes: A graph theoretical method for analyzing static and dynamic protein associations

    Directory of Open Access Journals (Sweden)

    Guimarães Katia S

    2006-04-01

    Full Text Available Abstract Background Most cellular processes are carried out by multi-protein complexes, groups of proteins that bind together to perform a specific task. Some proteins form stable complexes, while other proteins form transient associations and are part of several complexes at different stages of a cellular process. A better understanding of this higher-order organization of proteins into overlapping complexes is an important step towards unveiling functional and evolutionary mechanisms behind biological networks. Results We propose a new method for identifying and representing overlapping protein complexes (or larger units called functional groups) within a protein interaction network. We develop a graph-theoretical framework that enables automatic construction of such representation. We illustrate the effectiveness of our method by applying it to TNFα/NF-κB and pheromone signaling pathways. Conclusion The proposed representation helps in understanding the transitions between functional groups and allows for tracking a protein's path through a cascade of functional groups. Therefore, depending on the nature of the network, our representation is capable of elucidating temporal relations between functional groups. Our results show that the proposed method opens a new avenue for the analysis of protein interaction networks.
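
    The authors' decomposition algorithm is not reproduced here; as a generic illustration of how overlapping functional groups can be extracted from an interaction graph, the sketch below uses clique percolation from networkx on a toy network with hypothetical protein names.

```python
# Illustration only (not the authors' decomposition method): overlapping
# groups of proteins in an interaction graph found by clique percolation,
# so that one protein may belong to several functional groups.
import networkx as nx
from networkx.algorithms.community import k_clique_communities

# Toy protein-protein interaction network (hypothetical protein names).
G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),   # group 1
    ("C", "D"), ("C", "E"), ("D", "E"),   # group 2 overlaps group 1 at C
    ("E", "F"), ("E", "G"), ("F", "G"),   # group 3 overlaps group 2 at E
])

# k-clique communities (k = 3) may overlap: shared nodes appear in more
# than one community, mimicking proteins that take part in several complexes.
groups = [set(c) for c in k_clique_communities(G, 3)]
for i, g in enumerate(groups, 1):
    print(f"functional group {i}: {sorted(g)}")
```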

  19. A formulation to analyze system-of-systems problems: A case study of airport metroplex operations

    Science.gov (United States)

    Ayyalasomayajula, Sricharan Kishore

    A system-of-systems (SoS) can be described as a collection of multiple, heterogeneous, distributed, independent components interacting to achieve a range of objectives. A generic formulation was developed to model component interactions in an SoS to understand their influence on overall SoS performance. The formulation employs a lexicon to aggregate components into hierarchical interaction networks and understand how their topological properties affect the performance of the aggregations. Overall SoS performance is evaluated by monitoring the changes in stakeholder profitability due to changes in component interactions. The formulation was applied to a case study in air transportation focusing on operations at airport metroplexes. Metroplexes are geographical regions with two or more airports in close proximity to one another. The case study explored how metroplex airports interact with one another, what dependencies drive these interactions, and how these dependencies affect metroplex throughput and capacity. Metrics were developed to quantify runway dependencies at a metroplex and were correlated with its throughput and capacity. Operations at the New York/New Jersey metroplex (NYNJ) airports were simulated to explore the feasibility of operating very large aircraft (VLA), such as the Airbus A380, as a delay-mitigation strategy at these airports. The proposed formulation was employed to analyze the impact of this strategy on different stakeholders in the national air transportation system (ATS), such as airlines and airports. The analysis results and their implications were used to compare the pros and cons of operating VLAs at NYNJ from the perspectives of airline profitability, and flight delays at NYNJ and across the ATS.

  20. How Cognitive Style and Problem Complexity Affect Preservice Agricultural Education Teachers' Abilities to Solve Problems in Agricultural Mechanics

    Science.gov (United States)

    Blackburn, J. Joey; Robinson, J. Shane; Lamm, Alexa J.

    2014-01-01

    The purpose of this experimental study was to determine the effects of cognitive style and problem complexity on Oklahoma State University preservice agriculture teachers' (N = 56) ability to solve problems in small gasoline engines. Time to solution was operationalized as problem solving ability. Kirton's Adaption-Innovation Inventory was…

  1. Methodological issues in analyzing human communication – the complexities of multimodality

    DEFF Research Database (Denmark)

    Høegh, Tina

    2017-01-01

    This chapter develops a multimodal method for transcribing speech, communication, and performance. The chapter discusses the methodological solutions to the complex translation of speech, language rhythm and gesture in time and space into the two-dimensional format of a piece of paper. The focus...

  2. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  3. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-01-01

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  4. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces workload on the offshore organisation and, most importantly, reduces discharges to air and sea. There are a number of mechanisms and tools available in order to achieve high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, all of these tools will only be effective if quick and proper analysis of fluids and deposits are carried out. In fact, analytical backup is a powerful tool used to maintain optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required from production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems will be presented. 1) Deposition in a Complex Platform Processing System. 2) Contaminated Production Chemicals. 3) Improved Monitoring of Scale Inhibitor, Suspended Solids and Ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author)

  5. Eye-Tracking Study of Complexity in Gas Law Problems

    Science.gov (United States)

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  6. On the complexity of a bundle pricing problem

    NARCIS (Netherlands)

    Grigoriev, Alexander; van Lohn, Joyce; Uetz, Marc Jochen

    2010-01-01

    We consider the problem of pricing items in order to maximize the revenue obtainable from a set of single-minded customers. We relate the tractability of the problem to structural properties of customers' valuations: the problem admits an efficient approximation algorithm, parameterized along the

  7. The use of synthetic spectra to test the preparedness to evaluate and analyze complex gamma spectra

    International Nuclear Information System (INIS)

    Nikkinen, M

    2001-10-01

    This is the report of two exercises that were run under the NKS BOK-1.1 sub-project. In these exercises, synthetic gamma spectra were developed to practise the analysis of difficult spectra typically seen after a severe nuclear accident. The spectra were analyzed twice; first, participants were given a short time to report results, resembling an actual emergency preparedness situation, and then a longer period was allowed to tune the laboratory analysis results for quality assurance purposes. The exercise proved that it is possible to move measurement data from one laboratory to another if a second opinion on the analysis is needed. It was also felt that this kind of exercise would enhance the experience the laboratories have in analyzing accident data. Participants expressed the need for additional exercises of this type, as they are an inexpensive and easy way to rehearse quick emergency response situations not normally seen in daily laboratory routines. (au)

  8. Analyzing complex wake-terrain interactions and its implications on wind-farm performance.

    Science.gov (United States)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

    Rotating wind turbine blades generate complex wakes involving vortices (helical tip-vortex, root-vortex etc.). These wakes are regions of high velocity deficits and high turbulence intensities, and they tend to degrade the performance of down-stream turbines. Hence, a conservative inter-turbine distance of up to 10 times the turbine diameter (10D) is sometimes used in wind-farm layout (particularly in cases of flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In the case of complex terrain, within a short distance (say 10D) the nearby terrain can rise in altitude and be high enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially smaller eddies generated by small ridges within the terrain) or (b) directly, by obstructing the wake region partially or fully in its flow path. Hence, enhanced understanding of wake development due to wake-terrain interaction will help in wind farm design. To this end the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on the interaction of terrain and vortex have shown a much faster decay of the vortex for complex terrain compared to a flatter terrain. The potential reasons identified explaining the observation are (a) formation of secondary vortices in the flow and their interaction with the primary vortex and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence. The implications of

  9. Electromagnetic waves in complex systems selected theoretical and applied problems

    CERN Document Server

    Velychko, Lyudmyla

    2016-01-01

    This book gives guidance to solve problems in electromagnetics, providing both examples of solving serious research problems as well as the original results to encourage further investigations. The book contains seven chapters on various aspects of resonant wave scattering, each solving one original problem. All of them are unified by the authors’ desire to show advantages of rigorous approaches at all stages, from the formulation of a problem and the selection of a method to the interpretation of results. The book reveals a range of problems associated with wave propagation and scattering in natural and artificial environments or with the design of antennas elements. The authors invoke both theoretical (analytical and numerical) and experimental techniques for handling the problems. Attention is given to mathematical simulations, computational efficiency, and physical interpretation of the experimental results. The book is written for students, graduate students and young researchers. .

  10. Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals

    KAUST Repository

    Martinez, Josue G.

    2009-12-01

    We compare calcium ion signaling (Ca(2+)) between two exposures; the data are present as movies, or, more prosaically, time series of images. This paper describes novel uses of singular value decompositions (SVD) and weighted versions of them (WSVD) to extract the signals from such movies, in a way that is semi-automatic and tuned closely to the actual data and their many complexities. These complexities include the following. First, the images themselves are of no interest: all interest focuses on the behavior of individual cells across time, and thus, the cells need to be segmented in an automated manner. Second, the cells themselves have 100+ pixels, so that they form 100+ curves measured over time, so that data compression is required to extract the features of these curves. Third, some of the pixels in some of the cells are subject to image saturation due to bit depth limits, and this saturation needs to be accounted for if one is to normalize the images in a reasonably un-biased manner. Finally, the Ca(2+) signals have oscillations or waves that vary with time and these signals need to be extracted. Thus, our aim is to show how to use multiple weighted and standard singular value decompositions to detect, extract and clarify the Ca(2+) signals. Our signal extraction methods then lead to simple although finely focused statistical methods to compare Ca(2+) signals across experimental conditions.
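
    A minimal numerical sketch of the core idea (simulated data, not the authors' pipeline): each segmented cell yields a pixels-by-time matrix, saturated pixels are down-weighted, and the leading right singular vector of the weighted matrix serves as the extracted temporal Ca(2+) signal.

```python
# Sketch under stated assumptions: the leading right singular vector of a
# (pixels x time) matrix summarizes the common temporal Ca(2+) signal of a
# cell, with a crude per-pixel weight to account for saturated pixels.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_time = 120, 300
t = np.linspace(0, 30, n_time)

# Simulated cell: an oscillatory calcium signal shared by all pixels plus noise.
true_signal = np.sin(2 * np.pi * 0.2 * t) * np.exp(-t / 20)
cell = np.outer(rng.uniform(0.5, 1.5, n_pixels), true_signal)
cell += 0.3 * rng.standard_normal((n_pixels, n_time))

# Saturated pixels (bit-depth clipping) handled by simple down-weighting.
cell = np.clip(cell, None, 1.2)
weights = 1.0 / (1.0 + (cell >= 1.2).mean(axis=1))   # per-pixel weight

# Weighted SVD: scale rows, decompose, take the dominant temporal component.
U, s, Vt = np.linalg.svd(weights[:, None] * cell, full_matrices=False)
extracted = Vt[0]                                     # leading temporal signal

# Fix the sign so the extracted component correlates positively with the mean trace.
if np.corrcoef(extracted, cell.mean(axis=0))[0, 1] < 0:
    extracted = -extracted
print("correlation with true signal:", np.corrcoef(extracted, true_signal)[0, 1])
```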

  11. Taking advantage of local structure descriptors to analyze interresidue contacts in protein structures and protein complexes.

    Science.gov (United States)

    Martin, Juliette; Regad, Leslie; Etchebest, Catherine; Camproux, Anne-Claude

    2008-11-15

    Interresidue protein contacts in proteins structures and at protein-protein interface are classically described by the amino acid types of interacting residues and the local structural context of the contact, if any, is described using secondary structures. In this study, we present an alternate analysis of interresidue contact using local structures defined by the structural alphabet introduced by Camproux et al. This structural alphabet allows to describe a 3D structure as a sequence of prototype fragments called structural letters, of 27 different types. Each residue can then be assigned to a particular local structure, even in loop regions. The analysis of interresidue contacts within protein structures defined using Voronoï tessellations reveals that pairwise contact specificity is greater in terms of structural letters than amino acids. Using a simple heuristic based on specificity score comparison, we find that 74% of the long-range contacts within protein structures are better described using structural letters than amino acid types. The investigation is extended to a set of protein-protein complexes, showing that the similar global rules apply as for intraprotein contacts, with 64% of the interprotein contacts best described by local structures. We then present an evaluation of pairing functions integrating structural letters to decoy scoring and show that some complexes could benefit from the use of structural letter-based pairing functions.

  12. Framework based on communicability and flow to analyze complex network dynamics

    Science.gov (United States)

    Gilson, M.; Kouvaris, N. E.; Deco, G.; Zamora-López, G.

    2018-05-01

    Graph theory constitutes a widely used and established field providing powerful tools for the characterization of complex networks. The intricate topology of networks can also be investigated by means of the collective dynamics observed in the interactions of self-sustained oscillations (synchronization patterns) or propagationlike processes such as random walks. However, networks are often inferred from real-data-forming dynamic systems, which are different from those employed to reveal their topological characteristics. This stresses the necessity for a theoretical framework dedicated to the mutual relationship between the structure and dynamics in complex networks, as the two sides of the same coin. Here we propose a rigorous framework based on the network response over time (i.e., Green function) to study interactions between nodes across time. For this purpose we define the flow that describes the interplay between the network connectivity and external inputs. This multivariate measure relates to the concepts of graph communicability and the map equation. We illustrate our theory using the multivariate Ornstein-Uhlenbeck process, which describes stable and non-conservative dynamics, but the formalism can be adapted to other local dynamics for which the Green function is known. We provide applications to classical network examples, such as small-world ring and hierarchical networks. Our theory defines a comprehensive framework that is canonically related to directed and weighted networks, thus paving a way to revise the standards for network analysis, from the pairwise interactions between nodes to the global properties of networks including community detection.
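
    A small numerical sketch under stated assumptions: for a multivariate Ornstein-Uhlenbeck process with Jacobian J (leak plus connectivity), the network response over time is the Green function G(t) = expm(J t); the flow-like quantity illustrated here is the response of all nodes to an impulse applied at one node. The connectivity matrix is illustrative, not taken from the paper.

```python
# Minimal sketch: Green function of a stable multivariate Ornstein-Uhlenbeck
# process, G(t) = expm(J t), used to read off how an input at one node
# propagates to the others over time.
import numpy as np
from scipy.linalg import expm

# Weighted, directed connectivity of a small network (illustrative values).
A = np.array([[0.0, 0.8, 0.0],
              [0.0, 0.0, 0.6],
              [0.5, 0.0, 0.0]])

tau = 1.0                       # leakage time constant
J = -np.eye(3) / tau + A        # Jacobian: leak plus connectivity (stable here)

# Network response at several time lags to an impulse applied at node 0.
for t in (0.5, 1.0, 2.0):
    G = expm(J * t)
    response_to_node0 = G @ np.array([1.0, 0.0, 0.0])
    print(f"t={t}: response {np.round(response_to_node0, 3)}")
```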

  13. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    Science.gov (United States)

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…
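
    The following hedged sketch shows the kind of logistic regression the study combines with eye-tracking measures: success on an item is modeled as a function of binary complexity factors. The factor names, effect sizes, and data are hypothetical.

```python
# Sketch with simulated data: logistic regression of problem-solving success
# on randomly assigned binary complexity factors of a word problem.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400

# Hypothetical complexity factors (e.g. wording, unit conversion, extra info).
X = rng.integers(0, 2, size=(n, 3)).astype(float)
logit_p = 1.0 - 1.2 * X[:, 0] - 0.8 * X[:, 1] - 0.3 * X[:, 2]
success = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = sm.Logit(success, sm.add_constant(X)).fit(disp=0)
print(model.summary())          # coefficient signs show which factors hurt success
print(np.exp(model.params))     # exponentiated coefficients = odds ratios
```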

  14. A system to analyze the complex physiological states of coal solubilizing fungi

    Energy Technology Data Exchange (ETDEWEB)

    Hoelker, U.; Moenkemann, H.; Hoefer, M. [Universitaet Bonn, Bonn (Germany). Botanisches Institut

    1997-11-01

    The mechanism by which some microorganisms solubilize brown coal is still unknown. The paper discusses the deuteromycetes Fusarium oxysporum and Trichoderma atroviride as a suitable test system to analyse the complex fungal physiology relating to coal solubilization. The two fungi can occur in two different growth substrate-controlled physiological states: a coal-solubilizing one, when cells are grown on glutamate or gluconate as substrate, and a non-solubilizing one, when grown on carbohydrates. When grown on carbohydrates, F. oxysporum produces the pigment bikaverein. Purified bikaverein also inhibits coal solubilization by T. atroviride. The ability to solubilize coal is constitutive in F. oxysporum, while in T. atroviride it has to be induced. 10 refs., 3 figs., 3 tabs.

  15. A new theoretical approach to analyze complex processes in cytoskeleton proteins.

    Science.gov (United States)

    Li, Xin; Kolomeisky, Anatoly B

    2014-03-20

    Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because the mean-field theoretical models utilized neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on the analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.

  16. A Complex Network Model for Analyzing Railway Accidents Based on the Maximal Information Coefficient

    International Nuclear Information System (INIS)

    Shao Fu-Bo; Li Ke-Ping

    2016-01-01

    It is an important issue to identify important influencing factors in railway accident analysis. In this paper, a complex network model for railway accident analysis is designed by employing the maximal information coefficient (MIC), a good measure of dependence for two-variable relationships that can capture a wide range of associations; nodes denote factors of railway accidents, and an edge is generated between two factors whose MIC value is larger than or equal to the dependence criterion. The variation of the network structure is studied. As the dependence criterion increases, the network becomes approximately scale-free. Moreover, employing the proposed network, important influencing factors are identified, and we find that the annual track density-gross tonnage factor is an important factor, being a cut vertex when the dependence criterion is equal to 0.3. From the network, it is also found that railway development is unbalanced across different states, which is consistent with the facts. (paper)
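
    A sketch of the construction described above, under assumptions: pairwise MIC values are computed (here with the minepy package, used as a stand-in for whatever MIC implementation the authors employed) and an edge is added whenever MIC meets the dependence criterion; articulation points then flag factors analogous to the cut vertex mentioned in the abstract. The data are simulated.

```python
# Sketch: build a factor network by thresholding pairwise MIC values.
import numpy as np
import networkx as nx
from minepy import MINE          # assumed available; any MIC implementation works

rng = np.random.default_rng(2)
n_samples, n_factors = 200, 5
data = rng.standard_normal((n_samples, n_factors))
data[:, 1] = 0.7 * data[:, 0] + 0.3 * rng.standard_normal(n_samples)  # induce dependence

criterion = 0.3                  # dependence criterion, as in the paper's example
G = nx.Graph()
G.add_nodes_from(range(n_factors))

mine = MINE(alpha=0.6, c=15)
for i in range(n_factors):
    for j in range(i + 1, n_factors):
        mine.compute_score(data[:, i], data[:, j])
        if mine.mic() >= criterion:
            G.add_edge(i, j, mic=mine.mic())

# Cut vertices (articulation points) flag potentially critical factors.
print("edges:", list(G.edges(data=True)))
print("cut vertices:", list(nx.articulation_points(G)))
```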

  17. Asbestos quantification in track ballast, a complex analytical problem

    Science.gov (United States)

    Cavallo, Alessandro

    2016-04-01

    Track ballast forms the trackbed upon which railroad ties are laid. It is used to bear the load from the railroad ties, to facilitate water drainage, and also to keep down vegetation. It is typically made of angular crushed stone, with a grain size between 30 and 60 mm, with good mechanical properties (high compressive strength, freeze - thaw resistance, resistance to fragmentation). The most common rock types are represented by basalts, porphyries, orthogneisses, some carbonatic rocks and "green stones" (serpentinites, prasinites, amphibolites, metagabbros). Especially "green stones" may contain traces, and sometimes appreciable amounts, of asbestiform minerals (chrysotile and/or fibrous amphiboles, generally tremolite - actinolite). In Italy, the chrysotile asbestos mine in Balangero (Turin) produced over 5 Mt railroad ballast (crushed serpentinites), which was used for the railways in northern and central Italy, from 1930 up to 1990. In addition to Balangero, several other serpentinite and prasinite quarries (e.g. Emilia Romagna) provided railway ballast up to the year 2000. The legal threshold for asbestos content in track ballast is established at 1000 ppm: if the value is below this threshold, the material can be reused, otherwise it must be disposed of as hazardous waste, with very high costs. The quantitative asbestos determination in rocks is a very complex analytical issue: although techniques like TEM-SAED and micro-Raman are very effective in the identification of asbestos minerals, a quantitative determination on bulk materials is almost impossible or really expensive and time consuming. Another problem is represented by the discrimination of asbestiform minerals (e.g. chrysotile, asbestiform amphiboles) from the common acicular - pseudo-fibrous varieties (lamellar serpentine minerals, prismatic/acicular amphiboles). In this work, more than 200 samples from the main Italian rail yards were characterized by a combined use of XRD and a special SEM

  18. Computational Complexity of Some Problems on Generalized Cellular Automations

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2012-03-01

    Full Text Available We prove that the preimage problem of a generalized cellular automation is NP-hard. The results of this work are important for supporting the security of the ciphers based on the cellular automations.
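
    To make the preimage problem concrete, the sketch below enumerates preimages of a configuration by brute force for an elementary (rather than generalized) cellular automaton; the exponential search space illustrates why hardness results of this kind matter.

```python
# Brute-force preimage enumeration for a 1D binary cellular automaton on a
# cyclic lattice; exponential in the lattice width, so only tiny cases work.
from itertools import product

def step(config, rule=110, width=8):
    """One synchronous update of an elementary CA on a cyclic lattice."""
    out = []
    for i in range(width):
        neigh = (config[(i - 1) % width] << 2) | (config[i] << 1) | config[(i + 1) % width]
        out.append((rule >> neigh) & 1)
    return tuple(out)

def preimages(target, rule=110):
    """All configurations whose image under the rule equals `target`."""
    width = len(target)
    return [c for c in product((0, 1), repeat=width) if step(c, rule, width) == target]

seed = (0, 1, 1, 0, 1, 0, 0, 1)
target = step(seed)                      # guarantees at least one preimage exists
print(len(preimages(target)), "preimage(s) found for rule 110")
```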

  19. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  20. Noise problems in coal mining complex- a case discussion

    International Nuclear Information System (INIS)

    Mishra, Y.; Mitra, H.; Ghosh, S.; Pal, A.K.

    1996-01-01

    A noise monitoring study was conducted at the Moonidih mining complex of the Jharia coalfield. The study included monitoring and analysis of ambient as well as workplace noise levels. An attempt has been made to critically analyse the noise situation through octave band analysis, thereby identifying alarming noise frequencies for each noise-generating equipment item with an Leq level of more than 90 dBA. A noise model has also been developed to draw noise contours of the entire mining complex. Based on these studies, suitable control measures have been suggested. (author). 6 refs., 3 figs
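
    For reference, the Leq criterion mentioned above is an energy average of sound pressure levels; a worked example with hypothetical sampled levels:

```python
# Worked example (hypothetical data, not the study's measurements):
# Leq = 10*log10( mean(10^(L_i/10)) ) for sampled levels L_i in dBA.
import numpy as np

levels_dBA = np.array([88.0, 92.0, 95.0, 90.0, 85.0])   # sampled workplace levels
leq = 10 * np.log10(np.mean(10 ** (levels_dBA / 10)))
print(f"Leq = {leq:.1f} dBA")    # values above 90 dBA would flag equipment for control
```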

  1. Problem analysis of geotechnical well drilling in complex environment

    International Nuclear Information System (INIS)

    Kasenov, A K; Biletskiy, M T; Ratov, B T; Korotchenko, T V

    2015-01-01

    The article examines the primary causes of problems occurring during the drilling of geotechnical wells (injection, production and monitoring wells) for in-situ leaching to extract uranium in South Kazakhstan. The drilling problem of hole caving, which is basically caused by various chemical and physical factors (hydraulic, mechanical, etc.), has been thoroughly investigated. The analysis of packing causes has revealed that this problem usually occurs because of an insufficient amount of drilling mud, associated with a small cross-section downward flow and a relatively large cross-section upward flow. This is explained by the fact that when spear bores are used to drill clay rocks, the cutting size is usually rather big and there is a risk for clay particles to coagulate

  2. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements gained in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of function distribution between the operator and technical means are outlined. The processes subject to automation are enumerated. Development of optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shut-down or emergency situations, also becomes important.

  3. Status of the Monte Carlo library least-squares (MCLLS) approach for non-linear radiation analyzer problems

    Science.gov (United States)

    Gardner, Robin P.; Xu, Libai

    2009-10-01

    The Center for Engineering Applications of Radioisotopes (CEAR) has been working for over a decade on the Monte Carlo library least-squares (MCLLS) approach for treating non-linear radiation analyzer problems including: (1) prompt gamma-ray neutron activation analysis (PGNAA) for bulk analysis, (2) energy-dispersive X-ray fluorescence (EDXRF) analyzers, and (3) carbon/oxygen tool analysis in oil well logging. This approach essentially consists of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required background libraries. These libraries are then used in the linear library least-squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. Iterations of this are used until the LLS values agree with the composition used to generate the libraries. The current status of the methods (and topics) necessary to implement the MCLLS approach is reported. This includes: (1) the Monte Carlo codes such as CEARXRF, CEARCPG, and CEARCO for forward generation of the necessary elemental library spectra for the LLS calculation for X-ray fluorescence, neutron capture prompt gamma-ray analyzers, and carbon/oxygen tools; (2) the correction of spectral pulse pile-up (PPU) distortion by Monte Carlo simulation with the code CEARIPPU; (3) generation of detector response functions (DRF) for detectors with linear and non-linear responses for Monte Carlo simulation of pulse-height spectra; and (4) the use of the differential operator (DO) technique to make the necessary iterations for non-linear responses practical. In addition to commonly analyzed single spectra, coincidence spectra or even two-dimensional (2-D) coincidence spectra can also be used in the MCLLS approach and may provide more accurate results.
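
    A simplified sketch of the linear library least-squares step at the heart of MCLLS (not CEAR's code): the unknown spectrum is fitted as a non-negative combination of elemental library spectra plus a background library; in the full MCLLS approach the fitted amounts would be fed back into the Monte Carlo model and iterated until convergence. All spectra below are synthetic.

```python
# Sketch: library least-squares fit of a pulse-height spectrum with
# non-negativity constraints on the elemental amounts.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_channels = 256
channels = np.arange(n_channels)

def peak(center, width, area):
    """Gaussian peak on the channel axis."""
    return area * np.exp(-0.5 * ((channels - center) / width) ** 2)

# Hypothetical library spectra (one per element) and a smooth background.
libraries = np.column_stack([
    peak(60, 4, 1.0) + peak(180, 5, 0.4),    # element A
    peak(90, 4, 1.0),                        # element B
    peak(140, 6, 1.0) + peak(200, 5, 0.7),   # element C
    np.exp(-channels / 150.0),               # background
])

true_amounts = np.array([2.0, 0.5, 1.2, 3.0])
unknown = libraries @ true_amounts + 0.02 * rng.standard_normal(n_channels)

# Linear library least-squares with non-negativity.
amounts, residual = nnls(libraries, unknown)
print("fitted amounts:", np.round(amounts, 3))
```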

  4. Markov Renewal Methods in Restart Problems in Complex Systems

    DEFF Research Database (Denmark)

    Asmussen, Søren; Lipsky, Lester; Thompson, Stephen

    A task with ideal execution time L such as the execution of a computer program or the transmission of a file on a data link may fail, and the task then needs to be restarted. The task is handled by a complex system with features similar to the ones in classical reliability: failures may...
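
    A small simulation sketch of the restart model described above, assuming exponentially distributed failures: a task of ideal length L is rerun from scratch whenever a failure occurs before completion, and the quantity of interest is the total time until success.

```python
# Sketch: total completion time of a task of length L under restarts,
# with exponentially distributed times between failures.
import numpy as np

def total_time_with_restarts(L, failure_rate, rng):
    """Total time to finish a task of length L; restart on every failure."""
    total = 0.0
    while True:
        time_to_failure = rng.exponential(1.0 / failure_rate)
        if time_to_failure >= L:          # task completes before the next failure
            return total + L
        total += time_to_failure          # work lost, restart from the beginning

rng = np.random.default_rng(4)
samples = [total_time_with_restarts(L=1.0, failure_rate=0.5, rng=rng) for _ in range(10000)]
print("mean total time:", np.mean(samples))
# For exponential failures, E[T] = (exp(rate*L) - 1) / rate, about 1.297 here.
```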

  5. On the complexity of the balanced vertex ordering problem

    Directory of Open Access Journals (Sweden)

    Jan Kara

    2007-01-01

    Full Text Available We consider the problem of finding a balanced ordering of the vertices of a graph. More precisely, we want to minimise the sum, taken over all vertices v, of the difference between the number of neighbours to the left and right of v. This problem, which has applications in graph drawing, was recently introduced by Biedl et al. [Discrete Applied Math. 148:27--48, 2005]. They proved that the problem is solvable in polynomial time for graphs with maximum degree three, but NP-hard for graphs with maximum degree six. One of our main results is to close the gap in these results, by proving NP-hardness for graphs with maximum degree four. Furthermore, we prove that the problem remains NP-hard for planar graphs with maximum degree four and for 5-regular graphs. On the other hand, we introduce a polynomial time algorithm that determines whether there is a vertex ordering with total imbalance smaller than a fixed constant, and a polynomial time algorithm that determines whether a given multigraph with even degrees has an `almost balanced' ordering.
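
    The objective can be made concrete with a short sketch: compute the total imbalance of a vertex ordering and, for a tiny example graph, find the best ordering by exhaustive search (feasible only for very small graphs, consistent with the NP-hardness results above).

```python
# Sketch: total imbalance of an ordering, and brute-force search for a
# minimum-imbalance ordering of a tiny graph.
from itertools import permutations

def imbalance(order, adj):
    """Sum over vertices of |#neighbours to the left - #neighbours to the right|."""
    pos = {v: i for i, v in enumerate(order)}
    total = 0
    for v, neigh in adj.items():
        left = sum(1 for u in neigh if pos[u] < pos[v])
        right = len(neigh) - left
        total += abs(left - right)
    return total

# Small example graph given as adjacency lists.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}

best = min(permutations(adj), key=lambda order: imbalance(order, adj))
print("best ordering:", best, "imbalance:", imbalance(best, adj))
```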

  6. Zoonoses, One Health and complexity: wicked problems and constructive conflict.

    Science.gov (United States)

    Waltner-Toews, David

    2017-07-19

    Infectious zoonoses emerge from complex interactions among social and ecological systems. Understanding this complexity requires the accommodation of multiple, often conflicting, perspectives and narratives, rooted in different value systems and temporal-spatial scales. Therefore, to be adaptive, successful and sustainable, One Health approaches necessarily entail conflicts among observers, practitioners and scholars. Nevertheless, these integrative approaches have, both implicitly and explicitly, tended to marginalize some perspectives and prioritize others, resulting in a kind of technocratic tyranny. An important function of One Health approaches should be to facilitate and manage those conflicts, rather than to impose solutions. This article is part of the themed issue 'One Health for a changing world: zoonoses, ecosystems and human well-being'. © 2017 The Authors.

  7. Solution of problems of automation of elevator complex

    Directory of Open Access Journals (Sweden)

    V. S. Kudryashov

    2018-01-01

    Full Text Available The article is devoted to the solution of automation tasks in the development of an operator's workstation (AWP) for controlling an elevator with a capacity of 280 tons per hour, as part of the work of LLC "Intelligent Automation Complexes". In existing elevator complexes, only grain transportation is provided (there are no control systems for automatic grain drying with high-accuracy humidity measurement), and automatic generation of grain transportation routes is not provided (for each route, a technical specification is required and changes to the system's control program are needed). At the same time, more precise regulation of the grain flows is required (the automatic latches used have only "open/close" positions). The goals of elevator automation are: to reduce equipment downtime by tracking the operating time of the equipment and the number of accidents and by informing the operator about equipment that is susceptible to failure; to reduce the time for setting up and servicing the elevator; to improve product quality; and to decrease the percentage of rejects as well as the influence of the human factor on the process. The paper provides a brief description of the proposed gate valve control algorithms, the automatic building of the grain drying route, the filtering of the grain moisture readings and fragments of the operator's workstation program (in InduSoft Web Studio) for controlling the elevator complex. The proposed solutions make it possible to reduce equipment downtime by 20% and the total service time of the complex; to automatically separate insufficiently dried grain for repeated drying; to improve product quality through automatic control of grain overheating; to reduce production waste by 3%; and to reduce the influence of the human factor on the process of grain transportation and drying.

  8. A complex approach to the blue-loop problem

    Science.gov (United States)

    Ostrowski, Jakub; Daszynska-Daszkiewicz, Jadwiga

    2015-08-01

    The problem of the blue loops during core helium burning, outstanding for almost fifty years, is one of the most difficult and poorly understood problems in stellar astrophysics. Most of the work on the blue loops done so far has been performed with old stellar evolution codes and with limited computational resources. As a result, the conclusions obtained were based on a small sample of models and could not take into account more advanced effects and the interactions between them. The emergence of the blue loops depends on many details of the evolution calculations, in particular on chemical composition, opacity, mixing processes, etc. The non-linear interactions between these factors mean that in most cases it is hard to predict, without precise stellar modeling, whether a loop will emerge or not. The high sensitivity of the blue loops to even small changes in the internal structure of a star yields one more issue: a sensitivity to numerical problems, which are common in calculations of stellar models at advanced stages of evolution. To tackle this problem we used the modern stellar evolution code MESA. We calculated a large grid of evolutionary tracks (about 8000 models) with masses in the range of 3.0 - 25.0 solar masses, from the zero age main sequence to the depletion of helium in the core. In order to make a comparative analysis, we varied metallicity, helium abundance and different mixing parameters resulting from convective overshooting, rotation, etc. A better understanding of the properties of the blue loops is crucial for our knowledge of the population of blue supergiants and pulsating variables such as Cepheids, α-Cygni or Slowly Pulsating B-type supergiants. In the case of more massive models it is also of great importance for studies of the progenitors of supernovae.

  9. Analyzing a problem-solution pattern in the transcription of a conversation: suggestions for the ELF classroom

    Directory of Open Access Journals (Sweden)

    Navas Brenes, César A.

    2005-12-01

    Full Text Available This paper analyzes a problem-solution pattern shown in the transcription of a conversation. This analysis is based on a conversation that has been elicited from three native speakers of English. These speakers were given a topic that dealt with the problem of children being constantly exposed to violent video games. As a result, the writer recorded an oral text that contains several elements related to a problem-solution pattern such as the main issue, opinions, personal examples, possible solutions, and the evaluation of these solutions. The writer analyzed this pattern in terms of discourse analysis using idea units from the transcription. Moreover, the writer will point out how appropriate this transcription is for preparing teaching tasks for the EFL classroom.

  10. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% systems-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, the outcome of the Value Analysis is a design that performs the necessary functions at high quality and the lowest overall cost.

  11. Problem-solving with multiple interdependent criteria: better solution to complex problems

    International Nuclear Information System (INIS)

    Carlsson, C.; Fuller, R.

    1996-01-01

    We consider multiple objective programming (MOP) problems with additive interdependencies, that is, when the states of some chosen objective are attained through supportive or inhibitory feedbacks from several other objectives. MOP problems with independent objectives (when the cause-effect relations between the decision variables and the objectives are completely known) are treated as special cases of MOP problems with interdependent objectives. We illustrate our ideas with a simple three-objective real-life problem.

  12. Using iMCFA to Perform the CFA, Multilevel CFA, and Maximum Model for Analyzing Complex Survey Data.

    Science.gov (United States)

    Wu, Jiun-Yu; Lee, Yuan-Hsuan; Lin, John J H

    2018-01-01

    To construct CFA, MCFA, and maximum MCFA with LISREL v.8 and below, we provide iMCFA (integrated Multilevel Confirmatory Analysis) to examine the potential multilevel factorial structure in complex survey data. Modeling multilevel structure for complex survey data is complicated because building a multilevel model is not an infallible statistical strategy unless the hypothesized model is close to the real data structure. Methodologists have suggested using different modeling techniques to investigate the potential multilevel structure of survey data. Using iMCFA, researchers can visually set the between- and within-level factorial structure to fit MCFA, CFA and/or MAX MCFA models for complex survey data. iMCFA can then yield between- and within-level variance-covariance matrices, calculate intraclass correlations, perform the analyses and generate the outputs for the respective models. The summary of the analytical outputs from LISREL is gathered and tabulated for further model comparison and interpretation. iMCFA also provides LISREL syntax of different models for researchers' future use. An empirical and a simulated multilevel dataset with complex and simple structures in the within or between level were used to illustrate the usability and the effectiveness of the iMCFA procedure in analyzing complex survey data. The analytic results of iMCFA using Muthen's limited information estimator were compared with those of Mplus using Full Information Maximum Likelihood regarding the effectiveness of different estimation methods.
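
    A minimal sketch of the diagnostic that motivates moving from CFA to MCFA (hypothetical data, not the iMCFA tool itself): the intraclass correlation estimates how much of an indicator's variance lies between clusters rather than within them.

```python
# Sketch: one-way ANOVA estimate of the intraclass correlation (ICC) for a
# single observed indicator measured in clustered (multilevel) data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n_clusters, n_per = 50, 20
cluster_effect = rng.normal(0, 0.6, n_clusters)            # between-level sd 0.6
y = np.concatenate([mu + rng.normal(0, 1.0, n_per) for mu in cluster_effect])
df = pd.DataFrame({"cluster": np.repeat(np.arange(n_clusters), n_per), "y": y})

grand = df["y"].mean()
group_means = df.groupby("cluster")["y"].mean()
msb = n_per * ((group_means - grand) ** 2).sum() / (n_clusters - 1)   # between MS
msw = df.groupby("cluster")["y"].var(ddof=1).mean()                   # within MS
icc = (msb - msw) / (msb + (n_per - 1) * msw)
print(f"estimated ICC = {icc:.3f}  (true value is about 0.36 / 1.36 = 0.265)")
```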

  13. Addressing Complex Challenges through Adaptive Leadership: A Promising Approach to Collaborative Problem Solving

    Science.gov (United States)

    Nelson, Tenneisha; Squires, Vicki

    2017-01-01

    Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…

  14. Tracing the development of complex problems and the methods of its information support

    International Nuclear Information System (INIS)

    Belenki, A.; Ryjov, A.

    1999-01-01

    This article is dedicated to the development of a technology for information monitoring of complex problems such as IAEA safeguards tasks. The main purpose of this technology is to create human-machine systems for monitoring problems with complex subject areas such as political science, social science, business, ecology, etc. (author)

  15. Analyzing the problems with the current adoption of IFRS in the companies among India, China, Germany, Russia and Kenya

    Directory of Open Access Journals (Sweden)

    Robert Mosomi Ombati

    2018-01-01

    Full Text Available Accounting information provides past and current financial information about an economic unit to business managers, potential investors, and other interested parties. Internally generated accounting information helps business managers with planning, controlling, and decision making; this is referred to as managerial accounting information. However, if companies that have adopted International Financial Reporting Standards (IFRS) globally cannot generate the same information, then accounting practices need to be improved. For this purpose, the current study was performed with the objectives of measuring the relationship between profitability and market capitalization and analyzing the challenges faced by listed firms of various countries in association with the implementation of IFRS. To this end, 15 companies were selected from 5 countries: India, China, Germany, Russia and Kenya. The secondary data on profitability and market capitalization were analyzed to calculate the correlations between them. The primary data on managers' perceptions were analyzed with the multiple regression method using SPSS-19 software to identify the company-related, investor-related and government-agency-related variables responsible for problems in the current adoption of IFRS.

  16. Composing complex EXAFS problems with severe information constraints

    International Nuclear Information System (INIS)

    Ravel, B

    2009-01-01

    In recent work, a model for the structural environment of Hg bound to a catalytic DNA sensor was proposed on the basis of EXAFS data analysis. Although severely constrained by limited data quality and scant supporting structural data, a compelling structural model was found which agreed with a similar but less detailed model proposed on the basis of NMR data. I discuss in detail the successes and limitations of the analytical strategy that was implemented in the earlier work. I then speculate on future software requirements needed to make this and similarly complex analytical strategies more available to the wider audience of EXAFS practitioners.

  17. On the problem of constructing a modern, economic radiotelescope complex

    Science.gov (United States)

    Bogomolov, A. F.; Sokolov, A. G.; Poperechenko, B. A.; Polyak, V. S.

    1977-01-01

    Criteria for comparing and planning the technical and economic characteristics of large parabolic reflector antenna systems and other types used in radioastronomy and deep space communications are discussed. The experience gained in making and optimizing a series of highly efficient parabolic antennas in the USSR is reviewed. Several ways are indicated for further improving the complex characteristics of antennas similar to the original TNA-1500 64m radio telescope. The suggestions can be applied in planning the characteristics of radiotelescopes which are now being built, in particular, the TNA-8000 with a diameter of 128 m.

  18. Simulation Gaming as a Social Development Instrument : Dealing with Complex Problems

    NARCIS (Netherlands)

    Klievink, B.; Janssen, M.

    Improving public service delivery is a very complex domain, and its complexity is difficult to grasp for stakeholders with varying degrees of knowledge and involvement. An emergent and promising method for dealing with complex problems is simulation gaming, which can be used to capitalize the

  19. Bactrocera dorsalis complex and its problem in control

    International Nuclear Information System (INIS)

    Tan, Keng-Hong

    2003-01-01

    Eight of the fifty-two species in the Bactrocera dorsalis complex are serious pests in the Asia-Pacific region. Of these, all except one are attracted to methyl eugenol. Four of these pests, B. carambolae, B. dorsalis, B. papayae and B. philippinensis, are polyphagous species and infest 75, 117, 195 and 18 fruit host species respectively. Common names for B. carambolae and B. papayae (sympatric species) have caused confusion. Both species can interbreed and produce viable offspring, and their natural hybrids have been collected. Bactrocera dorsalis and B. papayae can interbreed readily and produce viable offspring in the laboratory, as males produce identical booster sex and aggregation pheromonal components after consuming methyl eugenol. The DNA sequences of one of their respective allelic introns of the actin gene are also identical, which suggests that they are not distinct genetic species. Protein bait application and male annihilation techniques have been successful in the management of fruit flies in many cases, but they have to compete with natural sources of lures. SIT is amenable for non-methyl eugenol species; for methyl eugenol-sensitive species, sterile males should be allowed to consume methyl eugenol before release so that they have equal mating competitiveness with wild males. (author)

  20. Algorithms and Complexity Results for Genome Mapping Problems.

    Science.gov (United States)

    Rajaraman, Ashok; Zanetti, Joao Paulo Pereira; Manuch, Jan; Chauve, Cedric

    2017-01-01

    Genome mapping algorithms aim at computing an ordering of a set of genomic markers based on local ordering information such as adjacencies and intervals of markers. In most genome mapping models, markers are assumed to occur uniquely in the resulting map. We introduce algorithmic questions that consider repeats, i.e., markers that can have several occurrences in the resulting map. We show that, provided with an upper bound on the copy number of repeated markers and with intervals that span full repeat copies, called repeat spanning intervals, the problem of deciding if a set of adjacencies and repeat spanning intervals admits a genome representation is tractable if the target genome can contain linear and/or circular chromosomal fragments. We also show that extracting a maximum cardinality or weight subset of repeat spanning intervals given a set of adjacencies that admits a genome realization is NP-hard but fixed-parameter tractable in the maximum copy number and the number of adjacent repeats, and tractable if intervals contain a single repeated marker.
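
    The hardness and tractability results above concern repeated markers and repeat spanning intervals; those algorithms are not reproduced here. The sketch below only illustrates the much simpler repeat-free special case, where a set of adjacencies admits a realization as linear and/or circular chromosomal fragments exactly when every marker occurs in at most two adjacencies (each connected component is then a path or a cycle). Function and marker names are hypothetical.

        from collections import defaultdict

        def realizable_without_repeats(adjacencies):
            """True if unique markers linked by the given adjacencies can be ordered
            into linear and/or circular fragments (at most two adjacencies per marker)."""
            degree = defaultdict(int)
            seen = set()
            for a, b in adjacencies:
                edge = frozenset((a, b))
                if edge in seen:          # ignore duplicate adjacencies
                    continue
                seen.add(edge)
                degree[a] += 1
                degree[b] += 1
            return all(d <= 2 for d in degree.values())

        print(realizable_without_repeats([("m1", "m2"), ("m2", "m3"), ("m3", "m1")]))  # True: one circular fragment
        print(realizable_without_repeats([("m1", "m2"), ("m1", "m3"), ("m1", "m4")]))  # False: m1 occurs three times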

  1. Granulomatous lobular mastitis: a complex diagnostic and therapeutic problem.

    Science.gov (United States)

    Akcan, Alper; Akyildiz, Hizir; Deneme, Mehmet Ali; Akgun, Hulya; Aritas, Yucel

    2006-08-01

    Granulomatous lobular mastitis is a rare chronic inflammatory disease of the breast. Clinical and radiological features may mimic breast carcinoma. Since this entity was first described, several clinical and pathologic features of the disease have been reported, but diagnostic features and treatment alternatives are still unclear. The purpose of this study is to evaluate diagnostic difficulties and discuss the outcome of surgical treatment in a series of 21 patients with granulomatous lobular mastitis. A retrospective review of 21 patients with histologically confirmed granulomatous lobular mastitis treated in our center between January 1995 and May 2005 was analyzed to identify issues in the diagnosis and treatment of this rare condition. The most common presenting symptoms were a mass in the breast and pain. Four patients had no significant findings on mammography (MMG), but on ultrasound (US) 2 had an irregular hypoechoic mass and 2 had hypoechoic nodular structures; magnetic resonance imaging (MRI) showed abnormalities (parenchymal distortion in one and mass formation in one) in 2 of these 4 patients. In recurrent cases, limited excision under local anesthesia was performed, as the clinical examination suggested carcinoma. Although some findings on MMG and US are suggestive of benign breast disease, these modalities do not rule out malignancy. MRI may be helpful in patients who do not have significant pathology at MMG or US. Fine-needle aspiration cytology may be useful in some cases, but diagnosis is potentially difficult because of its cytologic characteristics. Wide excision, particularly under general anesthesia, can be therapeutic as well as useful in providing an exact diagnosis.

  2. How Students Circumvent Problem-Solving Strategies that Require Greater Cognitive Complexity.

    Science.gov (United States)

    Niaz, Mansoor

    1996-01-01

    Analyzes the great diversity in problem-solving strategies used by students in solving a chemistry problem and discusses the relationship between these variables and different cognitive variables. Concludes that students try to circumvent certain problem-solving strategies by adapting flexible and stylistic innovations that render the cognitive…

  3. Analogy as a strategy for supporting complex problem solving under uncertainty.

    Science.gov (United States)

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  4. A modified probabilistic genetic algorithm for the solution of complex constrained optimization problems

    OpenAIRE

    Vorozheikin, A.; Gonchar, T.; Panfilov, I.; Sopov, E.; Sopov, S.

    2009-01-01

    A new algorithm for the solution of complex constrained optimization problems, based on the probabilistic genetic algorithm with optimal solution prediction, is proposed. Results of an efficiency investigation, in comparison with a standard genetic algorithm, are presented.
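
    The modified probabilistic genetic algorithm with optimal-solution prediction is not spelled out in this record. For orientation only, the sketch below shows the kind of baseline such a method is compared against: a plain genetic algorithm that handles constraints through a quadratic penalty. The operators, parameter values and toy problem are illustrative choices, not the authors' algorithm.

        import random

        def penalized_fitness(x, f, constraints, penalty=1e3):
            """Objective plus a quadratic penalty for each violated constraint g(x) <= 0."""
            return f(x) + penalty * sum(max(0.0, g(x)) ** 2 for g in constraints)

        def simple_ga(f, constraints, bounds, dim=2, pop_size=40, generations=200,
                      mutation_rate=0.1, seed=0):
            rng = random.Random(seed)
            lo, hi = bounds
            pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
            for _ in range(generations):
                scored = sorted(pop, key=lambda x: penalized_fitness(x, f, constraints))
                parents = scored[: pop_size // 2]                     # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = rng.sample(parents, 2)
                    child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # arithmetic crossover
                    if rng.random() < mutation_rate:
                        i = rng.randrange(len(child))
                        child[i] += rng.gauss(0, 0.1 * (hi - lo))     # Gaussian mutation
                    children.append(child)
                pop = parents + children
            return min(pop, key=lambda x: penalized_fitness(x, f, constraints))

        # minimize x^2 + y^2 subject to x + y >= 1, i.e. g(x) = 1 - x - y <= 0
        best = simple_ga(lambda x: x[0] ** 2 + x[1] ** 2,
                         [lambda x: 1 - x[0] - x[1]], bounds=(-5.0, 5.0))
        print([round(v, 3) for v in best])   # roughly [0.5, 0.5]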

  5. Evaluation of fine ceramics raw powders with particle size analyzers having different measuring principle and its problem

    International Nuclear Information System (INIS)

    Hayakawa, Osamu; Nakahira, Kenji; Tsubaki, Junichiro.

    1995-01-01

    Many kinds of analyzers based on various principles have been developed for measuring the particle size distribution of fine ceramics powders. However, the reproducibility of the results, the interchangeability of the models, and the reliability of the ends of the measured distribution have not been investigated for each principle. In this paper, these important points for particle size analysis were clarified by measuring raw material powders of fine ceramics. (1) In the case of the laser diffraction and scattering method, the reproducibility within the same model is good; however, the interchangeability of different models is not so good, especially at the ends of the distribution. Submicron powders with a high refractive index show this tendency markedly. (2) The photo sedimentation method has some problems to be overcome, especially in measuring submicron powders with a high refractive index or flaky-shaped particles. The reproducibility of the X-ray sedimentation method is much better than that of photo sedimentation. (3) The light obscuration and electrical sensing zone methods show good reproducibility; however, their interchangeability is sometimes poor, being affected by calibration and other factors. (author)

  6. Analyzing discourse and text complexity for learning and collaborating a cognitive approach based on natural language processing

    CERN Document Server

    Dascălu, Mihai

    2014-01-01

    With the advent and increasing popularity of Computer Supported Collaborative Learning (CSCL) and e-learning technologies, the need for automatic assessment and for teacher/tutor support for the two tightly intertwined activities of comprehension of reading materials and of collaboration among peers has grown significantly. In this context, a polyphonic model of discourse derived from Bakhtin’s work is used as a paradigm for analyzing both general texts and CSCL conversations in a unique framework focused on different facets of textual cohesion. As a specific feature of our analysis, the individual learning perspective is focused on the identification of reading strategies and on providing a multi-dimensional textual complexity model, whereas the collaborative learning dimension is centered on the evaluation of participants’ involvement, as well as on collaboration assessment. Our approach, based on advanced Natural Language Processing techniques, provides a qualitative estimation of the learning process and enhance...

  7. Individual Differences in Students' Complex Problem Solving Skills: How They Evolve and What They Imply

    Science.gov (United States)

    Wüstenberg, Sascha; Greiff, Samuel; Vainikainen, Mari-Pauliina; Murphy, Kevin

    2016-01-01

    Changes in the demands posed by increasingly complex workplaces in the 21st century have raised the importance of nonroutine skills such as complex problem solving (CPS). However, little is known about the antecedents and outcomes of CPS, especially with regard to malleable external factors such as classroom climate. To investigate the relations…

  8. The Streaming Complexity of Cycle Counting, Sorting by Reversals, and Other Problems

    DEFF Research Database (Denmark)

    Verbin, Elad; Yu, Wei

    2011-01-01

    -way. By designing reductions from BHH, we prove lower bounds for the streaming complexity of approximating the sorting by reversal distance, of approximately counting the number of cycles in a 2-regular graph, and of other problems. For example, here is one lower bound that we prove, for a cycle-counting problem...

  9. Upper estimates of complexity of algorithms for multi-peg Tower of Hanoi problem

    Directory of Open Access Journals (Sweden)

    Sergey Novikov

    2007-06-01

    Full Text Available Explicit upper estimates are proved for the complexity of algorithms for the multi-peg Tower of Hanoi problem with a limited number of disks, for Reve's puzzle, and for the 5-peg Tower of Hanoi problem with a free number of disks.
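
    The explicit estimates of the paper are not reproduced here, but the standard upper bound they relate to is the Frame-Stewart recursion, which is classical for 3 pegs, proven optimal for Reve's puzzle (4 pegs), and the presumed-optimal upper bound for more pegs. A short memoized Python sketch:

        from functools import lru_cache

        @lru_cache(maxsize=None)
        def frame_stewart(n, pegs):
            """Frame-Stewart number of moves for n disks on the given number of pegs (pegs >= 3)."""
            if n <= 1:
                return n
            if pegs == 3:
                return 2 ** n - 1
            # move t disks aside with all pegs, the remaining n - t with one peg fewer, then t back
            return min(2 * frame_stewart(t, pegs) + frame_stewart(n - t, pegs - 1)
                       for t in range(1, n))

        print(frame_stewart(8, 4))   # 33 moves for Reve's puzzle with 8 disks
        print(frame_stewart(8, 5))   # 5-peg variant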

  10. Identification of effective visual problem solving strategies in a complex visual domain

    NARCIS (Netherlands)

    Van Meeuwen, Ludo; Jarodzka, Halszka; Brand-Gruwel, Saskia; Kirschner, Paul A.; De Bock, Jeano; Van Merriënboer, Jeroen

    2018-01-01

    Students in complex visual domains must acquire visual problem solving strategies that allow them to make fast decisions and come up with good solutions to real-time problems. In this study, 31 air traffic controllers at different levels of expertise (novice, intermediate, expert) were confronted

  11. A Real-Life Case Study of Audit Interactions--Resolving Messy, Complex Problems

    Science.gov (United States)

    Beattie, Vivien; Fearnley, Stella; Hines, Tony

    2012-01-01

    Real-life accounting and auditing problems are often complex and messy, requiring the synthesis of technical knowledge in addition to the application of generic skills. To help students acquire the necessary skills to deal with these problems effectively, educators have called for the use of case-based methods. Cases based on real situations (such…

  12. Child outcomes of home-visiting for families with complex and multiple problems

    NARCIS (Netherlands)

    van Assen, Arend; Dickscheit, Jana; Post, Wendy; Grietens, Hans

    2016-01-01

    Introduction Families with complex and multiple problems are faced with an accumulation of problems across multiple areas of life. Furthermore, these families are often considered to be ‘difficult to treat’. Children and teenagers growing up in these families are exposed to an accumulation of risks

  13. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    Science.gov (United States)

    Wiswall, John D.

    -temporal characteristic scales of the flow on the resulting time-area-averaged concentration measurements. Two series of experiments were performed to verify the probe's design; the first used Schlieren photography and verified that the probe sampled from the supersonic flowfield isokinetically. The second series involved traversing the probe across a free mixing layer of air and helium, to obtain both mean concentration and high frequency measurements. High-frequency data was statistically analyzed and inspection of the Probability Density Function (PDF) of the hot-film response was instrumental to interpret how well the resulting average mixing measurements represent these types of complex flows. The probe is minimally intrusive, has accuracy comparable to its predecessors, has an improved frequency response for mean concentration measurements, and samples from a very small area in the flowfield.

  14. Non-commutative cryptography and complexity of group-theoretic problems

    CERN Document Server

    Myasnikov, Alexei; Ushakov, Alexander

    2011-01-01

    This book is about relations between three different areas of mathematics and theoretical computer science: combinatorial group theory, cryptography, and complexity theory. It explores how non-commutative (infinite) groups, which are typically studied in combinatorial group theory, can be used in public-key cryptography. It also shows that there is remarkable feedback from cryptography to combinatorial group theory because some of the problems motivated by cryptography appear to be new to group theory, and they open many interesting research avenues within group theory. In particular, a lot of emphasis in the book is put on studying search problems, as compared to decision problems traditionally studied in combinatorial group theory. Then, complexity theory, notably generic-case complexity of algorithms, is employed for cryptanalysis of various cryptographic protocols based on infinite groups, and the ideas and machinery from the theory of generic-case complexity are used to study asymptotically dominant prop...

  15. Implementation of exterior complex scaling in B-splines to solve atomic and molecular collision problems

    International Nuclear Information System (INIS)

    McCurdy, C William; MartIn, Fernando

    2004-01-01

    B-spline methods are now well established as widely applicable tools for the evaluation of atomic and molecular continuum states. The mathematical technique of exterior complex scaling has been shown, in a variety of other implementations, to be a powerful method with which to solve atomic and molecular scattering problems, because it allows the correct imposition of continuum boundary conditions without their explicit analytic application. In this paper, an implementation of exterior complex scaling in B-splines is described that can bring the well-developed technology of B-splines to bear on new problems, including multiple ionization and breakup problems, in a straightforward way. The approach is demonstrated for examples involving the continuum motion of nuclei in diatomic molecules as well as electronic continua. For problems involving electrons, a method based on Poisson's equation is presented for computing two-electron integrals over B-splines under exterior complex scaling
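
    For readers unfamiliar with the technique, the exterior complex scaling transformation referred to here is the standard one (generic background, not specific to this record): each radial coordinate is rotated into the complex plane only beyond a chosen radius R_0,

        r \mapsto
        \begin{cases}
          r, & r \le R_0,\\
          R_0 + (r - R_0)\,e^{i\eta}, & r > R_0,
        \end{cases}

    so that purely outgoing waves decay exponentially for r > R_0 and the continuum boundary conditions need not be imposed analytically.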

  16. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    Science.gov (United States)

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors with a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. DOE's efforts to correct environmental problems of the nuclear weapons complex

    International Nuclear Information System (INIS)

    Rezendes, V.S.

    1990-03-01

    This report focuses on four main issues: the environmental problems at DOE's nuclear weapons complex, recent changes in DOE's organizational structure, DOE's 1991 budget request, and the need for effective management systems. This report concludes that the environmental problems are enormous and will take decades to resolve. Widespread contamination can be found at many DOE sites, and the full extent of the environmental problems is unknown. DOE has taken several steps during the past year to better deal with these problems, including making organizational improvements and requesting additional funds for environmental restoration and waste management activities

  18. Sleep, Cognition, and Behavioral Problems in School-Age Children: A Century of Research Meta-Analyzed

    NARCIS (Netherlands)

    Astill, R.G.; van der Heijden, K.B.; van IJzendoorn, M.H.; van Someren, E.J.W.

    2012-01-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age

  19. Conceptual and procedural knowledge community college students use when solving a complex science problem

    Science.gov (United States)

    Steen-Eibensteiner, Janice Lee

    2006-07-01

    A strong science knowledge base and problem solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need for effectively teaching students to apply problem solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem solving skills for the workplace. Students' use of problem solving skills in the form of learned concepts and procedural knowledge was studied as students completed a problem that might come up in real life. Students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? From the whole class, the high academic level participants performed at a mean of 72% correct on chapter test questions, which corresponds to a low-average to fair grade of C-. The middle and low academic participants both failed (F) the test questions (37% and 30% respectively); 29% (9/31) of the students showed only a fair performance while 71% (22/31) failed. From the subset sample of 2 students each from the high, middle, and low academic levels selected from the whole class, 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly. Only 1 concept was used incorrectly by 3/6 of the students and identified as

  20. Developing an agent-based model on how different individuals solve complex problems

    Directory of Open Access Journals (Sweden)

    Ipek Bozkurt

    2015-01-01

    Full Text Available Purpose: Research that focuses on the emotional, mental, behavioral and cognitive capabilities of individuals has been abundant within disciplines such as psychology, sociology, and anthropology, among others. However, when facing complex problems, a new perspective for understanding individuals is necessary. The main purpose of this paper is to develop an agent-based model and simulation to gain understanding of the decision-making and problem-solving abilities of individuals. Design/Methodology/approach: The micro-level modeling and simulation paradigm of Agent-Based Modeling is used. Through the use of Agent-Based Modeling, insight is gained into how different individuals with different profiles deal with complex problems. Using previous literature from different bodies of knowledge, established theories and certain assumptions as input parameters, a model is built and executed through a computer simulation. Findings: The results indicate that individuals with certain profiles have better capabilities to deal with complex problems. Moderate profiles could solve the entire complex problem, whereas profiles within extreme conditions could not. This indicates that having a strong predisposition is not the ideal way when approaching complex problems, and there should always be a component from the other perspective. The probability that an individual may use the capabilities provided by the opposite predisposition proves to be a useful option. Originality/value: The originality of the present research stems from how individuals are profiled, and from the model and simulation that are built to understand how they solve complex problems. The development of the agent-based model adds value to the existing body of knowledge within both the social sciences and modeling and simulation.

  1. Preparing new nurses with complexity science and problem-based learning.

    Science.gov (United States)

    Hodges, Helen F

    2011-01-01

    Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can see emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in prevalent but dated nursing education models for rising graduates. The science of complexity coupled with problem-based learning and peer review contributes a feasible framework for a constructivist learning environment in which to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher-than-expected levels of student satisfaction, increased breadth of analysis of complex data, acknowledgment of the community as a complex adaptive system, and overall higher-level thinking skills than in previous years. 2011, SLACK Incorporated.

  2. Predictability problems of global change as seen through natural systems complexity description. 2. Approach

    Directory of Open Access Journals (Sweden)

    Vladimir V. Kozoderov

    1998-01-01

    Full Text Available Developing the general statements of the proposed global change theory, outlined in Part 1 of the publication, Kolmogorov's probability space is used to study properties of information measures (unconditional, joint and conditional entropies, information divergence, mutual information, etc.). Sets of elementary events, the specified algebra of their sub-sets and probability measures for the algebra are composite parts of the space. The information measures are analyzed using the mathematical expectation operator and the adequacy between an additive function of sets and their equivalents in the form of the measures. As a result, explanations are given for multispectral satellite imagery visualization procedures using Markov's chains of random variables represented by pixels of the imagery. The proposed formalism of the information measures application makes it possible to describe the complexity of natural targets by syntactically governing probabilities. Formulated as the problem of finding signal/noise ratios for anomalies of natural processes, the predictability problem is solved by analyses of temporal data sets of related measurements for key regions and their background within contextually coherent structures of natural targets and between particular boundaries of the structures.
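
    For reference, the information measures listed in the abstract have the standard Shannon forms (for discrete random variables X and Y with joint distribution p(x, y) and a reference distribution q):

        H(X) = -\sum_x p(x)\log p(x), \qquad H(X,Y) = -\sum_{x,y} p(x,y)\log p(x,y),
        H(X \mid Y) = H(X,Y) - H(Y), \qquad I(X;Y) = H(X) + H(Y) - H(X,Y),
        D(p\,\|\,q) = \sum_x p(x)\log\frac{p(x)}{q(x)}.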

  3. arXiv Spin models in complex magnetic fields: a hard sign problem

    CERN Document Server

    de Forcrand, Philippe

    2018-01-01

    Coupling spin models to complex external fields can give rise to interesting phenomena like zeroes of the partition function (Lee-Yang zeroes, edge singularities) or oscillating propagators. Unfortunately, it usually also leads to a severe sign problem that can be overcome only in special cases; if the partition function has zeroes, the sign problem is even representation-independent at these points. In this study, we couple the N-state Potts model in different ways to a complex external magnetic field and discuss the above mentioned phenomena and their relations based on analytic calculations (1D) and results obtained using a modified cluster algorithm (general D) that in many cases either cures or at least drastically reduces the sign-problem induced by the complex external field.
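
    As generic background (the paper's precise coupling conventions may differ), one common way to couple the N-state Potts model to an external field h acting on a reference state is

        Z(\beta, h) = \sum_{\{s_i\}} \exp\Big(\beta \sum_{\langle i,j \rangle} \delta_{s_i, s_j}
                      + h \sum_i \delta_{s_i, 1}\Big), \qquad s_i \in \{1, \dots, N\};

    taking h complex makes the Boltzmann weights complex, which is the source of the sign problem and of the Lee-Yang zeros of Z mentioned above.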

  4. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    Science.gov (United States)

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation between large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN) and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight in combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association
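
    Of the methods surveyed, logistic regression is the most basic. As a purely illustrative sketch (simulated genotypes, not data from any study), the following Python snippet fits a logistic model for one SNP coded additively (0/1/2 minor-allele copies) plus one environmental covariate:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        snp = rng.integers(0, 3, size=200)                  # simulated additive genotype
        env = rng.normal(size=200)                          # simulated environmental covariate
        logit = -0.5 + 0.8 * snp + 0.3 * env                # assumed true model for the simulation
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # case (1) / control (0) status

        X = np.column_stack([snp, env])
        model = LogisticRegression().fit(X, y)
        print(np.round(model.coef_, 2), round(float(model.intercept_[0]), 2))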

  5. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    Science.gov (United States)

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actors need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier; and whether substandard medicines are really the bigger problem. Solutions respect the perceived complexity of the supply chain challenges. The paper identifies the need to avoid technologically-driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access, may act as market signals to counterfeiters, since this creates a cash market in cheaper drugs.

  6. PREDOMINANCE AND SOCIAL DETERMINANTS IN OCCURRENCE OF PARASITOSIS IN CENTEREASTERN REGION OF PARANÁ: A SOCIOECONOMIC ANALYZES OF THE PROBLEM.

    Directory of Open Access Journals (Sweden)

    Patricia Regina Cenci Queiroz, Angela Patricia Motin, Cristiane Aparecida Verbaneck, Franciely Damaris de Cristo, Marcia de Souza Oliveira, Marcia Maria Veronese e Shirley Rak Mantovani

    2006-12-01

    Full Text Available Considering the high incidence of parasitic diseases in the center-eastern region of Paraná, an inquiry into their probable determinants was carried out. In particular, the occurrence of Ascaris lumbricoides was analyzed. Bibliographical research on this theme and data collection from the center-eastern region were carried out in an attempt to contextualize the epidemiological condition of this region. We conclude that the high occurrence of ascariasis in the analyzed region is linked to the living conditions of the general population, deficiencies in urban and sanitary planning, socioeconomic conditions and, essentially, the absence of investment in basic infrastructure. Thus, this study demonstrates the need for urgent prophylactic action.

  7. The Development of Complex Problem Solving in Adolescence: A Latent Growth Curve Analysis

    Science.gov (United States)

    Frischkorn, Gidon T.; Greiff, Samuel; Wüstenberg, Sascha

    2014-01-01

    Complex problem solving (CPS) as a cross-curricular competence has recently attracted more attention in educational psychology as indicated by its implementation in international educational large-scale assessments such as the Programme for International Student Assessment. However, research on the development of CPS is scarce, and the few…

  8. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    Science.gov (United States)

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  9. Learning about Complex Multi-Stakeholder Issues: Assessing the Visual Problem Appraisal

    NARCIS (Netherlands)

    Witteveen, L.M.; Put, M.; Leeuwis, C.

    2010-01-01

    This paper presents an evaluation of the visual problem appraisal (VPA) learning environment in higher education. The VPA has been designed for the training of competences that are required in complex stakeholder settings in relation to sustainability issues. The design of VPA incorporates a

  10. A method for evaluating the problem complex of choosing the ventilation system for a new building

    DEFF Research Database (Denmark)

    Hviid, Christian Anker; Svendsen, Svend

    2007-01-01

    The application of a ventilation system in a new building is a multidimensional complex problem that involves quantifiable and non-quantifiable data like energy consumption, indoor environment, building integration and architectural expression. This paper presents a structured method for evaluat...

  11. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories

    KAUST Repository

    Abdel-Azeim, Safwat

    2014-05-06

    Background: Molecular Dynamics (MD) simulations of protein complexes suffer from the lack of specific tools in the analysis step. Analyses of MD trajectories of protein complexes indeed generally rely on classical measures, such as the RMSD, RMSF and gyration radius, conceived and developed for single macromolecules. In fact, researchers engaged in simulating the dynamics of a protein complex are mainly interested in characterizing the conservation/variation of its biological interface. Results: On these bases, herein we propose a novel approach to the analysis of MD trajectories or other conformational ensembles of protein complexes, MDcons, which uses the conservation of inter-residue contacts at the interface as a measure of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility, we used this novel approach to study two protein-protein complexes with interfaces of comparable size and both dominated by hydrophilic interactions, but having binding affinities at the extremes of the experimental range. MDcons is demonstrated to be extremely useful to analyse the MD trajectories of the investigated complexes, adding important insight into the dynamic behavior of their biological interface. Conclusions: MDcons specifically allows the user to highlight and characterize the dynamics of the interface in protein complexes and can thus be used as a complementary tool for the analysis of MD simulations of both experimental and predicted structures of protein complexes.
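
    The exact contact definition used by MDcons is not given in this record; the sketch below only illustrates the general idea of an inter-chain contact map and of a grey-scale "consensus" map averaged over snapshots, using a simple distance cutoff on representative atoms (the cutoff value and the toy coordinates are arbitrary).

        import numpy as np

        def interface_contact_map(coords_a, coords_b, cutoff=5.0):
            """Binary inter-chain contact map: residue i of chain A and residue j of chain B
            are in contact if their representative atoms lie within `cutoff` angstroms."""
            diff = coords_a[:, None, :] - coords_b[None, :, :]
            dist = np.sqrt((diff ** 2).sum(axis=-1))
            return dist <= cutoff

        def contact_conservation(maps):
            """Fraction of snapshots in which each contact is present (grey-scale consensus map)."""
            return np.mean(np.stack(maps, axis=0), axis=0)

        # toy usage: two 3-residue chains over three fake snapshots
        rng = np.random.default_rng(1)
        maps = [interface_contact_map(rng.normal(scale=4.0, size=(3, 3)),
                                      rng.normal(scale=4.0, size=(3, 3))) for _ in range(3)]
        print(contact_conservation(maps))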

  12. Protein complexes in the archaeon Methanothermobacter thermautotrophicus analyzed by blue native/SDS-PAGE and mass spectrometry.

    NARCIS (Netherlands)

    Farhoud, M.H.; Wessels, H.C.T.; Steenbakkers, P.J.M.; Mattijssen, S.; Wevers, R.A.; Engelen, B.G.M. van; Jetten, M.S.M.; Smeitink, J.A.M.; Heuvel, L.P.W.J. van den; Keltjens, J.T.M.

    2005-01-01

    Methanothermobacter thermautotrophicus is a thermophilic archaeon that produces methane as the end product of its primary metabolism. The biochemistry of methane formation has been extensively studied and is catalyzed by individual enzymes and proteins that are organized in protein complexes.

  13. The Consensus String Problem and the Complexity of Comparing Hidden Markov Models

    DEFF Research Database (Denmark)

    Lyngsø, Rune Bang; Pedersen, Christian Nørgaard Storm

    2002-01-01

    The basic theory of hidden Markov models was developed and applied to problems in speech recognition in the late 1960s, and has since then been applied to numerous problems, e.g. biological sequence analysis. Most applications of hidden Markov models are based on efficient algorithms for computing the probability of generating a given string, or computing the most likely path generating a given string. In this paper we consider the problem of computing the most likely string, or consensus string, generated by a given model, and its implications on the complexity of comparing hidden Markov models. We show that computing the consensus string, and approximating its probability within any constant factor, is NP-hard, and that the same holds for the closely related labeling problem for class hidden Markov models. Furthermore, we establish the NP-hardness of comparing two hidden Markov models under the L∞- and L1...
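
    The efficient computation contrasted with the NP-hard consensus-string problem is the classical forward algorithm, sketched below in Python for a toy two-state, two-symbol model (the parameter values are made up):

        import numpy as np

        def forward_probability(obs, start, trans, emit):
            """Probability that the HMM (start, trans, emit) generates the symbol sequence obs."""
            alpha = start * emit[:, obs[0]]
            for o in obs[1:]:
                alpha = (alpha @ trans) * emit[:, o]
            return alpha.sum()

        start = np.array([0.6, 0.4])
        trans = np.array([[0.7, 0.3],
                          [0.4, 0.6]])
        emit = np.array([[0.9, 0.1],
                         [0.2, 0.8]])
        print(forward_probability([0, 1, 0], start, trans, emit))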

  14. Determining the Effects of Cognitive Style, Problem Complexity, and Hypothesis Generation on the Problem Solving Ability of School-Based Agricultural Education Students

    Science.gov (United States)

    Blackburn, J. Joey; Robinson, J. Shane

    2016-01-01

    The purpose of this experimental study was to assess the effects of cognitive style, problem complexity, and hypothesis generation on the problem solving ability of school-based agricultural education students. Problem solving ability was defined as time to solution. Kirton's Adaption-Innovation Inventory was employed to assess students' cognitive…

  15. Numerical nonlinear complex geometrical optics algorithm for the 3D Calderón problem

    DEFF Research Database (Denmark)

    Delbary, Fabrice; Knudsen, Kim

    2014-01-01

    to the generalized Laplace equation. The 3D problem was solved in theory in the late 1980s using complex geometrical optics solutions and a scattering transform. Several approximations to the reconstruction method have been suggested and implemented numerically in the literature, but here, for the first time, a complete computer implementation of the full nonlinear algorithm is given. First a boundary integral equation is solved by a Nystrom method for the traces of the complex geometrical optics solutions, second the scattering transform is computed and inverted using fast Fourier transform, and finally a boundary value...
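
    For orientation (standard background rather than material from this record): the Calderón problem asks for the conductivity \sigma in

        \nabla\cdot(\sigma\nabla u) = 0 \ \text{in } \Omega, \qquad u|_{\partial\Omega} = f,

    given the Dirichlet-to-Neumann map \Lambda_\sigma : f \mapsto \sigma\,\partial_\nu u|_{\partial\Omega}. The substitution v = \sqrt{\sigma}\,u reduces the equation to the generalized Laplace (Schrödinger-type) form (-\Delta + q)v = 0 with q = \Delta\sqrt{\sigma}/\sqrt{\sigma}, for which complex geometrical optics solutions of the form e^{i\zeta\cdot x}(1 + r(x,\zeta)), with \zeta \in \mathbb{C}^3 and \zeta\cdot\zeta = 0, are constructed.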

  16. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem.

    Science.gov (United States)

    Williams, Patricia Ah; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat.

  17. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    Science.gov (United States)

    Williams, Patricia AH; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  18. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories

    KAUST Repository

    Abdel-Azeim, Safwat; Chermak, Edrisse; Vangone, Anna; Oliva, Romina; Cavallo, Luigi

    2014-01-01

    of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility

  19. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases

    NARCIS (Netherlands)

    Heidema, A.G.; Boer, J.M.A.; Nagelkerke, N.; Mariman, E.C.M.; A, van der D.L.; Feskens, E.J.M.

    2006-01-01

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods

  20. Numerical sensitivity computation for discontinuous gradient-only optimization problems using the complex-step method

    CSIR Research Space (South Africa)

    Wilke, DN

    2012-07-01

    Full Text Available problems that utilise remeshing (i.e. the mesh topology is allowed to change) between design updates. Here, changes in mesh topology result in abrupt changes in the discretization error of the computed response. These abrupt changes in turn manifest... in shape optimization but may be present whenever (partial) differential equations are approximated numerically with non-constant discretization methods, e.g. remeshing of spatial domains or automatic time stepping in temporal domains. Keywords: Complex...
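
    The record builds on the complex-step method for computing sensitivities. The basic first-derivative formula, shown here on a smooth test function (the remeshing context of the paper is not reproduced), is f'(x) ≈ Im f(x + ih) / h, which involves no subtractive cancellation and therefore tolerates extremely small steps:

        import numpy as np

        def complex_step_derivative(f, x, h=1e-20):
            """First derivative of a real-analytic f at x via the complex-step formula."""
            return np.imag(f(x + 1j * h)) / h

        f = lambda x: np.exp(x) * np.sin(x)                    # test function
        x0 = 0.7
        exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))         # analytic derivative
        print(complex_step_derivative(f, x0), exact)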

  1. Solving the three-body Coulomb breakup problem using exterior complex scaling

    Energy Technology Data Exchange (ETDEWEB)

    McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.

    2004-05-17

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum has made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.

  2. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    Science.gov (United States)

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-06

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using less information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires less inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as for example lowering cognitive effort and working memory load. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  3. Putting the puzzle together: the role of 'problem definition' in complex clinical judgement.

    Science.gov (United States)

    Cristancho, Sayra; Lingard, Lorelei; Forbes, Thomas; Ott, Michael; Novick, Richard

    2017-02-01

    We teach judgement in pieces; that is, we talk about each aspect separately (patient, plan, resources, technique, etc.). We also let trainees figure out how to put the pieces together. In complex situations, this might be problematic. Using data from a drawing-based study on surgeons' experiences with complex situations, we explore the notion of 'problem definition' in real-world clinical judgement using the theoretical lens of systems engineering. 'Emergence', the sensitising concept for analysis, is rooted in two key systems premises: that person and context are inseparable and that what emerges is an act of choice. Via a 'gallery walk' we used these premises to perform analysis on individual drawings as well as cross-comparisons of multiple drawings. Our focus was to understand similarities and differences among the vantage points used by multiple surgeons. In this paper we challenge two assumptions from current models of clinical judgement: that experts hold a fixed and static definition of the problem and that consequently the focus of the expert's work is on solving the problem. Each situation described by our participants revealed different but complementary perspectives of what a surgical problem might come to be: from concerns about ensuring standard of care, to balancing personal emotions versus care choices, to coordinating resources, and to maintaining control while in the midst of personality clashes. We suggest that it is only at the situation and system level, not at the individual level, that we are able to appreciate the nuances of defining the problem when experts make judgements during real-world complex situations. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  4. Molecular computing towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main difference between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to the computational aspects of digital computers and molecular computing, data representation in molecular computing, molecular operations in molecular computing and number representation in molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single precision and double precision floating-point numbers, to implement addition and subtraction of unsigned integers, to construct logic operations...

  5. Analyzing the Risk of Fire in a Hospital Complex by “Fire Risk Assessment Method for Engineering”(FRAME

    Directory of Open Access Journals (Sweden)

    Sarsangi V.* MSc,

    2016-08-01

    Full Text Available Aims The occurrence of fire in residential buildings, commercial complexes and large and small industries causes physical, environmental and financial damage to many different communities. Fire safety in hospitals is particularly sensitive, and it is believed that society takes responsibility for the care of sick people. The goal of this study was to use the Fire Risk Assessment Method for Engineering (FRAME) in a hospital complex environment and to assess the level of fire risk. Materials & Methods This descriptive study was conducted in Kashan Shahid Beheshti hospital in 2013. FRAME is designed on the basis of empirical and scientific knowledge and experiment and has acceptable reliability for assessing building fire risk. Excel software was used to calculate the risk levels, and finally the fire risk (R) was calculated separately for the different units. Findings The calculated R values were less than 1 for the health, autoclave, office of nursing and infection control units. R1 values were greater than 1 for all units. R2 values were less than 1 for the office of nursing and infection control units. Conclusion FRAME is an acceptable tool for assessing the risk of fire in buildings; the fire risk is high in the Shahid Beheshti Hospital Complex of Kashan, and damages could be intolerable in the case of fire.

  6. Use of a field model to analyze probable fire environments encountered within the complex geometries of nuclear power plants

    International Nuclear Information System (INIS)

    Boccio, J.L.; Usher, J.L.; Singhal, A.K.; Tam, L.T.

    1985-08-01

    A fire in a nuclear power plant (NPP) can damage equipment needed to safely operate the plant and thereby either directly cause an accident or else reduce the plant's margin of safety. The development of a field-model fire code to analyze the probable fire environments encountered within NPP is discussed. A set of fire tests carried out under the aegis of the US Nuclear Regulatory Commission (NRC) is described. The results of these tests are then utilized to validate the field model

  7. Understanding and quantifying cognitive complexity level in mathematical problem solving items

    Directory of Open Access Journals (Sweden)

    SUSAN E. EMBRETSON

    2008-09-01

    Full Text Available The linear logistic test model (LLTM; Fischer, 1973) has been applied to a wide variety of new tests. When the LLTM application involves item complexity variables that are both theoretically interesting and empirically supported, several advantages can result. These advantages include elaborating construct validity at the item level, defining variables for test design, predicting parameters of new items, item banking by sources of complexity and providing a basis for item design and item generation. However, despite the many advantages of applying LLTM to test items, it has been applied less often to understand the sources of complexity for large-scale operational test items. Instead, previously calibrated item parameters are modeled using regression techniques because raw item response data often cannot be made available. In the current study, both LLTM and regression modeling are applied to mathematical problem solving items from a widely used test. The findings from the two methods are compared and contrasted for their implications for continued development of ability and achievement tests based on mathematical problem solving items.
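
    For reference, the LLTM decomposes each Rasch item difficulty into weighted complexity components (generic notation, not tied to the particular item design of this study):

        P(X_{vi} = 1 \mid \theta_v) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)},
        \qquad \beta_i = \sum_k q_{ik}\,\eta_k + c,

    where q_{ik} records how strongly complexity source k enters item i, \eta_k is the estimated weight of that source, and c is a normalization constant.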

  8. Inverse problems in complex material design: Applications to non-crystalline solids

    Science.gov (United States)

    Biswas, Parthapratim; Drabold, David; Elliott, Stephen

    The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While impressive developments of ab-initio simulation methods during the past several decades have brought tremendous success in understanding materials properties from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to design materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently-developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g. a-Si:H, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.

  9. FOCUS, Neutron Transport System for Complex Geometry Reactor Core and Shielding Problems by Monte-Carlo

    International Nuclear Information System (INIS)

    Hoogenboom, J.E.

    1980-01-01

    1 - Description of problem or function: FOCUS enables the calculation of any quantity related to neutron transport in reactor or shielding problems, but was especially designed to calculate differential quantities, such as point values at one or more of the space, energy, direction and time variables of quantities like neutron flux, detector response, reaction rate, etc. or averages of such quantities over a small volume of the phase space. Different types of problems can be treated: systems with a fixed neutron source which may be a mono-directional source located outside the system, and eigenfunction problems in which the neutron source distribution is given by the (unknown) fundamental mode eigenfunction distribution. Using Monte Carlo methods complex 3-dimensional geometries and detailed cross section information can be treated. Cross section data are derived from ENDF/B, with anisotropic scattering and discrete or continuous inelastic scattering taken into account. Energy is treated as a continuous variable and time dependence may also be included. 2 - Method of solution: A transformed form of the adjoint Boltzmann equation in integral representation is solved for the space, energy, direction and time variables by Monte Carlo methods. Adjoint particles are defined with properties in some respects contrary to those of neutrons. Adjoint particle histories are constructed from which estimates are obtained of the desired quantity. Adjoint cross sections are defined with which the nuclide and reaction type are selected in a collision. The energy after a collision is selected from adjoint energy distributions calculated together with the adjoint cross sections in advance of the actual Monte Carlo calculation. For multiplying systems successive generations of adjoint particles are obtained which will die out for subcritical systems with a fixed neutron source and will be kept approximately stationary for eigenfunction problems. Completely arbitrary problems can

  10. Building University Capacity to Visualize Solutions to Complex Problems in the Arctic

    Science.gov (United States)

    Broderson, D.; Veazey, P.; Raymond, V. L.; Kowalski, K.; Prakash, A.; Signor, B.

    2016-12-01

    Rapidly changing environments are creating complex problems across the globe, which are particularly magnified in the Arctic. These worldwide challenges can best be addressed through diverse and interdisciplinary research teams. It is incumbent on such teams to promote co-production of knowledge and data-driven decision-making by identifying effective methods to communicate their findings and to engage with the public. Decision Theater North (DTN) is a new semi-immersive visualization system that provides a space for teams to collaborate and develop solutions to complex problems, relying on diverse sets of skills and knowledge. It provides a venue to synthesize the talents of scientists, who gather information (data); modelers, who create models of complex systems; artists, who develop visualizations; communicators, who connect and bridge populations; and policymakers, who can use the visualizations to develop sustainable solutions to pressing problems. The mission of Decision Theater North is to provide a cutting-edge visual environment to facilitate dialogue and decision-making by stakeholders including government, industry, communities and academia. We achieve this mission by adopting a multi-faceted approach reflected in the theater's design, technology, networking capabilities, user support, community relationship building, and strategic partnerships. DTN is a joint project of Alaska's National Science Foundation Experimental Program to Stimulate Competitive Research (NSF EPSCoR) and the University of Alaska Fairbanks (UAF), which have brought the facility up to full operational status and are now expanding its development space to support larger team science efforts. Based in Fairbanks, Alaska, DTN is uniquely poised to address changes taking place in the Arctic and subarctic, and is connected with a larger network of decision theaters that include the Arizona State University Decision Theater Network and the McCain Institute in Washington, DC.

  11. Two-Level Solutions to Exponentially Complex Problems in Glass Science

    DEFF Research Database (Denmark)

    Mauro, John C.; Smedskjær, Morten Mattrup

    Glass poses an especially challenging problem for physicists. The key to making progress in theoretical glass science is to extract the key physics governing properties of practical interest. In this spirit, we discuss several two-level solutions to exponentially complex problems in glass science. … Topological constraint theory, originally developed by J.C. Phillips, is based on a two-level description of rigid and floppy modes in a glass network and can be used to derive quantitatively accurate and analytically solvable models for a variety of macroscopic properties. The temperature dependence … that captures both primary and secondary relaxation modes. Such a model also offers the ability to calculate the distinguishability of particles during glass transition and relaxation processes. Two-level models can also be used to capture the distribution of various network-forming species in mixed …

  12. An Experimental Analysis on Dispatching Rules for the Train Platforming Problem in Busy Complex Passenger Stations

    Directory of Open Access Journals (Sweden)

    Qiongfang Zeng

    2017-09-01

    platforming problem (TPP) by using mixed integer linear programming and job shop scheduling theory. First, the operation procedures and scheduled time adjustment costs of different train types specific to busy complex passenger stations are explicitly represented. Second, a multi-criteria scheduling model (MCS) for the TPP without an earliness and tardiness time window (ETTW) and a time window scheduling model (TWS) with an ETTW for the TPP are proposed. Third, various dispatching rules are designed by incorporating dispatcher experience into modern scheduling theory, and a rule-based metaheuristic to solve the above models is presented. With solution improvement strategies analogous to those used in practice by dispatchers, realistic-size problems can be solved in acceptable time.
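
    The paper's dispatching rules and metaheuristic are not reproduced here; the toy sketch below only illustrates the general idea of a rule-based dispatcher, using an assumed "earliest scheduled arrival first" rule and an "earliest-free platform" assignment. All train and platform data are invented.

      # toy greedy dispatcher: assign trains to platforms by a simple rule,
      # a stand-in for the richer rule set studied in the paper
      trains = [  # (train id, arrival minute, departure minute)
          ("T1", 0, 12), ("T2", 3, 9), ("T3", 5, 20), ("T4", 10, 18),
      ]
      platforms = {1: 0, 2: 0}          # platform -> time at which it becomes free

      assignment, delays = {}, {}
      for tid, arr, dep in sorted(trains, key=lambda t: t[1]):   # dispatching rule
          p = min(platforms, key=platforms.get)                  # earliest-free platform
          start = max(arr, platforms[p])                         # wait if still occupied
          delays[tid] = start - arr
          platforms[p] = start + (dep - arr)                     # occupy for the dwell time
          assignment[tid] = p

      print(assignment, delays)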

  13. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    International Nuclear Information System (INIS)

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-01-01

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance is, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
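
    As a rough illustration of coupling stochastic simulation with a search over maintenance policies, the sketch below evaluates an age-replacement policy (PM every T hours unless a failure triggers CM) by Monte Carlo and picks T with a crude grid search. The paper's framework combines discrete event simulation with genetic algorithms and fuzzy multi-criteria decision making, so this is only a stand-in; lifetimes, costs and the Weibull assumption are invented.

      import numpy as np

      rng = np.random.default_rng(42)

      def avg_cost_rate(T, scale=100.0, shape=2.0, c_pm=1.0, c_cm=10.0,
                        horizon=10_000.0, runs=50):
          """Monte Carlo estimate of the long-run cost per hour of an age-replacement
          policy: preventive maintenance (PM) after T hours unless the component fails
          first, which triggers corrective maintenance (CM). Lifetimes are Weibull with
          an increasing hazard (shape > 1), so PM can actually pay off."""
          rates = []
          for _ in range(runs):
              t, cost = 0.0, 0.0
              while t < horizon:
                  life = scale * rng.weibull(shape)
                  if life < T:           # failure before the planned PM -> CM
                      t += life
                      cost += c_cm
                  else:                  # survived until the planned PM
                      t += T
                      cost += c_pm
              rates.append(cost / t)
          return float(np.mean(rates))

      # crude grid search standing in for the GA / multi-criteria search of the paper
      candidates = [20, 40, 60, 80, 120, 200, 400]
      best_T = min(candidates, key=avg_cost_rate)
      print(best_T, round(avg_cost_rate(best_T), 4))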

  14. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Tahvili, Sahar [Mälardalen University (Sweden); Österberg, Jonas; Silvestrov, Sergei [Division of Applied Mathematics, Mälardalen University (Sweden); Biteus, Jonas [Scania CV (Sweden)

    2014-12-10

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance is, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  15. Nanotechnology for sustainability: what does nanotechnology offer to address complex sustainability problems?

    Energy Technology Data Exchange (ETDEWEB)

    Wiek, Arnim, E-mail: arnim.wiek@asu.edu; Foley, Rider W. [Arizona State University, School of Sustainability (United States); Guston, David H. [Arizona State University, Center for Nanotechnology in Society, Consortium for Science, Policy and Outcomes (United States)

    2012-09-15

    Nanotechnology is widely associated with the promise of positively contributing to sustainability. However, this view often focuses on end-of-pipe applications, for instance, for water purification or energy efficiency, and relies on a narrow concept of sustainability. Approaching sustainability problems and solution options from a comprehensive and systemic perspective instead may yield quite different conclusions about the contribution of nanotechnology to sustainability. This study conceptualizes sustainability problems as complex constellations with several potential intervention points and amenable to different solution options. The study presents results from interdisciplinary workshops and literature reviews that appraise the contribution of the selected nanotechnologies to mitigate such problems. The study focuses exemplarily on the urban context to make the appraisals tangible and relevant. The solution potential of nanotechnology is explored not only for well-known urban sustainability problems such as water contamination and energy use but also for less obvious ones such as childhood obesity. Results indicate not only potentials but also limitations of nanotechnology's contribution to sustainability and can inform anticipatory governance of nanotechnology in general, and in the urban context in particular.

  16. Nanotechnology for sustainability: what does nanotechnology offer to address complex sustainability problems?

    International Nuclear Information System (INIS)

    Wiek, Arnim; Foley, Rider W.; Guston, David H.

    2012-01-01

    Nanotechnology is widely associated with the promise of positively contributing to sustainability. However, this view often focuses on end-of-pipe applications, for instance, for water purification or energy efficiency, and relies on a narrow concept of sustainability. Approaching sustainability problems and solution options from a comprehensive and systemic perspective instead may yield quite different conclusions about the contribution of nanotechnology to sustainability. This study conceptualizes sustainability problems as complex constellations with several potential intervention points and amenable to different solution options. The study presents results from interdisciplinary workshops and literature reviews that appraise the contribution of the selected nanotechnologies to mitigate such problems. The study focuses exemplarily on the urban context to make the appraisals tangible and relevant. The solution potential of nanotechnology is explored not only for well-known urban sustainability problems such as water contamination and energy use but also for less obvious ones such as childhood obesity. Results indicate not only potentials but also limitations of nanotechnology’s contribution to sustainability and can inform anticipatory governance of nanotechnology in general, and in the urban context in particular.

  17. Analyzing Katana referral hospital as a complex adaptive system: agents, interactions and adaptation to a changing environment.

    Science.gov (United States)

    Karemere, Hermès; Ribesse, Nathalie; Marchal, Bruno; Macq, Jean

    2015-01-01

    This study deals with the adaptation of Katana referral hospital in the Eastern Democratic Republic of Congo to a changing environment that has been affected for more than a decade by intermittent armed conflicts. Its objective is to generate theoretical propositions for approaching the analysis of hospital governance differently, with the aim of assessing hospital performance and how to improve it. The methodology is a case study using mixed methods (qualitative and quantitative) for data collection. It uses (1) hospital data to measure the output of the hospital, (2) a literature review to identify, among other things, events and interventions recorded in the hospital's history during the study period, and (3) information from individual interviews to validate the interpretation of the results of the previous two data sources and to understand the responsiveness of the referral hospital's management team during times of change. The study yields four theoretical propositions: (1) interaction between key agents is a positive force driving adaptation if the actors share the same vision; (2) the strength of the interaction between agents is largely based on the nature of institutional arrangements, which in turn are shaped by the actors themselves; (3) the owner and the management team play a decisive role in the implementation of effective institutional arrangements and the establishment of positive interactions between agents; (4) analyzing the recipient population's perception of the health services provided allows the health services offer to be better tailored and adapted to the population's needs and expectations. The research shows that providing financial and technical support is not enough for a hospital to operate and adapt to a changing environment; it must also be actively steered, considering that it is a complex adaptive system and that such steering is nothing other than the induction of positive interactions between agents.

  18. Knowledge to action for solving complex problems: insights from a review of nine international cases.

    Science.gov (United States)

    Riley, B L; Robinson, K L; Gamble, J; Finegood, D T; Sheppard, D; Penney, T L; Best, A

    2015-05-01

    Solving complex problems such as preventing chronic diseases introduces unique challenges for the creation and application of knowledge, or knowledge to action (KTA). KTA approaches that apply principles of systems thinking are thought to hold promise, but practical strategies for their application are not well understood. In this paper we report the results of a scan of systems approaches to KTA with a goal to identify how to optimize their implementation and impact. A 5-person advisory group purposefully selected 9 initiatives to achieve diversity on issues addressed and organizational forms. Information on each case was gathered from documents and through telephone interviews with primary contacts within each organization. Following verification of case descriptions, an inductive analysis was conducted within and across cases. The cases revealed 5 guidelines for moving from conceiving KTA systems to implementing them: (1) establish and nurture relationships, (2) co-produce and curate knowledge, (3) create feedback loops, (4) frame as systems interventions rather than projects, and (5) consider variations across time and place. Results from the environmental scan are a modest start to translating systems concepts for KTA into practice. Use of the strategies revealed in the scan may improve KTA for solving complex public health problems. The strategies themselves will benefit from the development of a science that aims to understand adaptation and ongoing learning from policy and practice interventions, strengthens enduring relationships, and fills system gaps in addition to evidence gaps. Systems approaches to KTA will also benefit from robust evaluations.

  19. The problem of sustainability within the complexity of agricultural production systems

    International Nuclear Information System (INIS)

    Cotes Torres, Alejandro; Cotes Torres, Jose Miguel

    2005-01-01

    The problem of sustainability is a topic that, since the end of the 20th century, has increasingly concerned the different sectors of society, becoming one of the topics of greatest interest for the managers, consumers, academics and researchers that make up the different agricultural food chains of the world. From the point of view of general systems theory, this paper presents some elements of critical reflection, approaching the problem of sustainability from the complexity of agricultural production systems, beginning with the original philosophical conception of agriculture and ending by outlining some considerations that should be kept in mind for the development of scientific and technological advances consistent with the needs of the agricultural food chains of the 21st century. These considerations should orient not only the work of the professionals who lead animal and plant production processes, but also create a sense of belonging among all participants in the chain, highlighting the importance of studying agronomy and animal science through systemic thought, as disciplines that approach the complexities of agriculture, the cornerstone of civilization as we know it at present.

  20. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    Directory of Open Access Journals (Sweden)

    Williams PAH

    2015-07-01

    Full Text Available Patricia AH Williams, Andrew J Woodward eHealth Research Group and Security Research Institute, Edith Cowan University, Perth, WA, Australia Abstract: The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. Keywords: cybersecurity, security, safety, wireless, risk, medical devices

  1. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that, the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  2. Addressing Complex Societal Problems: Enabling Multiple Dimensions of Proximity to Sustain Partnerships for Collective Impact in Quebec

    Directory of Open Access Journals (Sweden)

    Nii A. Addy

    2018-03-01

    Full Text Available Sustainable solutions for complex societal problems, like poverty, require informing stakeholders about progress and changes needed as they collaborate. Yet, inter-organizational collaboration researchers highlight monumental challenges in measuring seemingly intangible factors during collective impact processes. We grapple with the question: How can decision-makers coherently conceptualize and measure seemingly intangible factors to sustain partnerships for the emergence of collective impact? We conducted an inductive process case study to address this question, analyzing data from documents, observations, and interviews of 24 philanthropy leaders and multiple stakeholders in a decades-long partnership involving Canada’s largest private family foundation and government and community networks, during which a “collective impact project” emerged in Quebec Province, Canada. The multidimensional proximity framework provided an analytical lens. During the first phase of the partnership studied, there was a lack of baseline measurement of largely qualitative factors—conceptualized as cognitive, social, and institutional proximity between stakeholders—which evaluations suggested were important for explaining which community networks successfully brought about desired outcomes. Non-measurement of these factors was a problem in providing evidence for sustained engagement of stakeholders, such as government and local businesses. We develop a multidimensional proximity model that coherently conceptualizes qualitative proximity factors, for measuring their change over time.

  3. Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?

    Science.gov (United States)

    McDonald, Ruth

    2014-10-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving 'leadership'. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts.

  4. Multi Criteria Decision Making (MCDM). Complex problems made easy; Multi Criteria Decision Making (MCDM). Complexe vraagstukken behapbaar maken

    Energy Technology Data Exchange (ETDEWEB)

    Van Oeffelen, E.C.M.; Van Zundert, K.; Westerlaekn, A.C. [TNO, Delft (Netherlands)

    2011-12-15

    The existing housing stock needs to become smarter and more sustainable in its energy use. From a technical viewpoint, renovations can usually be realized successfully, but the multitude of preconditions such as phasing and the degree of inconvenience for residents often turn renovation into a complex matter. The MCDM method can be a suitable instrument in handling complex renovation issues.

  5. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    Science.gov (United States)

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  6. Student Learning of Complex Earth Systems: A Model to Guide Development of Student Expertise in Problem-Solving

    Science.gov (United States)

    Holder, Lauren N.; Scherer, Hannah H.; Herbert, Bruce E.

    2017-01-01

    Engaging students in problem-solving concerning environmental issues in near-surface complex Earth systems involves developing student conceptualization of the Earth as a system and applying that scientific knowledge to the problems using practices that model those used by professionals. In this article, we review geoscience education research…

  7. Development of a Novel Cu(II) Complex Modified Electrode and a Portable Electrochemical Analyzer for the Determination of Dissolved Oxygen (DO) in Water

    Directory of Open Access Journals (Sweden)

    Salvatore Gianluca Leonardi

    2016-04-01

    Full Text Available The development of an electrochemical dissolved oxygen (DO) sensor based on a novel Cu(II) complex-modified screen-printed carbon electrode is reported. The voltammetric behavior of the modified electrode was investigated at different scan rates and oxygen concentrations in PBS (pH = 7). An increase of the cathodic current (at about −0.4 V vs. Ag/AgCl) with the addition of oxygen was observed. The modified Cu(II) complex electrode was demonstrated for the determination of DO in water using chronoamperometry. A small, low-power-consumption, home-made portable electrochemical analyzer based on custom electronics for sensor interfacing, operating in voltammetry and amperometry modes, has also been designed and fabricated. Its performance in monitoring DO in water was compared with that of a commercial instrument.
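
    Chronoamperometric determination of DO typically relies on calibrating the measured cathodic current against known oxygen concentrations; the sketch below shows such a linear calibration with invented readings (the paper's actual calibration data and sensitivity are not reproduced here).

      import numpy as np

      # hypothetical chronoamperometric calibration: cathodic current (uA) vs DO (mg/L)
      do_std = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
      current = np.array([0.10, 0.55, 1.02, 1.49, 1.95])   # toy readings

      slope, intercept = np.polyfit(do_std, current, 1)    # linear calibration line
      i_sample = 1.20                                       # current measured in a sample
      do_sample = (i_sample - intercept) / slope            # invert the calibration
      print(f"estimated DO ~ {do_sample:.2f} mg/L")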

  8. Level of satisfaction of older persons with their general practitioner and practice: role of complexity of health problems.

    Directory of Open Access Journals (Sweden)

    Antonius J Poot

    Full Text Available BACKGROUND: Satisfaction is widely used to evaluate and direct delivery of medical care; a complicated relationship exists between patient satisfaction, morbidity and age. This study investigates the relationships between complexity of health problems and level of patient satisfaction of older persons with their general practitioner (GP) and practice. METHODS AND FINDINGS: This study is embedded in the ISCOPE (Integrated Systematic Care for Older Persons) study. Enlisted patients aged ≥75 years from 59 practices received a written questionnaire to screen for complex health problems (somatic, functional, psychological and social). For 2664 randomly chosen respondents (median age 82 years; 68% female) information was collected on level of satisfaction (satisfied, neutral, dissatisfied) with their GP and general practice, and demographic and clinical characteristics including complexity of health problems. Of all participants, 4% was dissatisfied with their GP care, 59% neutral and 37% satisfied. Between these three categories no differences were observed in age, gender, country of birth or education level. The percentage of participants dissatisfied with their GP care increased from 0.4% in those with 0 problem domains to 8% in those with 4 domains, i.e. having complex health problems (p<0.001). Per additional health domain with problems, the risk of being dissatisfied increased 1.7 times (95% CI 1.4-2.14; p<0.001). This was independent of age, gender, and demographic and clinical parameters (adjusted OR 1.4, 95% CI 1.1-1.8; p = 0.021). CONCLUSION: In older persons, dissatisfaction with general practice is strongly correlated with rising complexity of health problems, independent of age, demographic and clinical parameters. It remains unclear whether complexity of health problems is a patient characteristic influencing the perception of care, or whether the care is unable to handle the demands of these patients. Prospective studies are needed to

  9. Exploring Corn-Ethanol As A Complex Problem To Teach Sustainability Concepts Across The Science-Business-Liberal Arts Curriculum

    Science.gov (United States)

    Oches, E. A.; Szymanski, D. W.; Snyder, B.; Gulati, G. J.; Davis, P. T.

    2012-12-01

    The highly interdisciplinary nature of sustainability presents pedagogic challenges when sustainability concepts are incorporated into traditional disciplinary courses. At Bentley University, where over 90 percent of students major in business disciplines, we have created a multidisciplinary course module centered on corn ethanol that explores a complex social, environmental, and economic problem and develops basic data analysis and analytical thinking skills in several courses spanning the natural, physical, and social sciences within the business curriculum. Through an NSF-CCLI grant, Bentley faculty from several disciplines participated in a summer workshop to define learning objectives, create course modules, and develop an assessment plan to enhance interdisciplinary sustainability teaching. The core instructional outcome was a data-rich exercise for all participating courses in which students plot and analyze multiple parameters of corn planted and harvested for various purposes including food (human), feed (animal), ethanol production, and commodities exchanged for the years 1960 to present. Students then evaluate patterns and trends in the data and hypothesize relationships among the plotted data and environmental, social, and economic drivers, responses, and unintended consequences. After the central data analysis activity, students explore corn ethanol production as it relates to core disciplinary concepts in their individual classes. For example, students in Environmental Chemistry produce ethanol using corn and sugar as feedstocks and compare the efficiency of each process, while learning about enzymes, fermentation, distillation, and other chemical principles. Principles of Geology students examine the effects of agricultural runoff on surface water quality associated with extracting greater agricultural yield from mid-continent croplands. The American Government course examines the role of political institutions, the political process, and various

  10. Accurate gradient approximation for complex interface problems in 3D by an improved coupling interface method

    Energy Technology Data Exchange (ETDEWEB)

    Shu, Yu-Chen, E-mail: ycshu@mail.ncku.edu.tw [Department of Mathematics, National Cheng Kung University, Tainan 701, Taiwan (China); Mathematics Division, National Center for Theoretical Sciences (South), Tainan 701, Taiwan (China); Chern, I-Liang, E-mail: chern@math.ntu.edu.tw [Department of Applied Mathematics, National Chiao Tung University, Hsin Chu 300, Taiwan (China); Department of Mathematics, National Taiwan University, Taipei 106, Taiwan (China); Mathematics Division, National Center for Theoretical Sciences (Taipei Office), Taipei 106, Taiwan (China); Chang, Chien C., E-mail: mechang@iam.ntu.edu.tw [Institute of Applied Mechanics, National Taiwan University, Taipei 106, Taiwan (China); Department of Mathematics, National Taiwan University, Taipei 106, Taiwan (China)

    2014-10-15

    Most elliptic interface solvers become complicated for complex interface problems at those “exceptional points” where there are not enough neighboring interior points for high order interpolation. Such complication increases especially in three dimensions. Usually, the solvers are thus reduced to low order accuracy. In this paper, we classify these exceptional points and propose two recipes to maintain order of accuracy there, aiming at improving the previous coupling interface method [26]. Yet the idea is also applicable to other interface solvers. The main idea is to have at least first order approximations for second order derivatives at those exceptional points. Recipe 1 is to use the finite difference approximation for the second order derivatives at a nearby interior grid point, whenever this is possible. Recipe 2 is to flip domain signatures and introduce a ghost state so that a second-order method can be applied. This ghost state is a smooth extension of the solution at the exceptional point from the other side of the interface. The original state is recovered by a post-processing using nearby states and jump conditions. The choice of recipes is determined by a classification scheme of the exceptional points. The method renders the solution and its gradient uniformly second-order accurate in the entire computed domain. Numerical examples are provided to illustrate the second order accuracy of the presently proposed method in approximating the gradients of the original states for some complex interfaces which we had previously tested in two and three dimensions, and a real molecule (1D63), which is double-helix shaped and composed of hundreds of atoms.
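
    The flavour of Recipe 1 can be illustrated in one dimension: when the usual central stencil for a second derivative would cross the interface, a one-sided stencil built from interior points still gives a first-order approximation, which is all that is required at isolated exceptional points. The sketch below is a generic finite-difference illustration, not the coupling interface method itself.

      import numpy as np

      # compare the O(h^2) central stencil with the O(h) one-sided stencil that
      # only uses points on one side of a hypothetical interface located at x - h
      f = np.cos
      x, h = 0.3, 1e-3

      central   = (f(x - h) - 2*f(x) + f(x + h)) / h**2        # second-order accurate
      one_sided = (f(x) - 2*f(x + h) + f(x + 2*h)) / h**2      # first-order accurate
      exact = -np.cos(x)
      print(abs(central - exact), abs(one_sided - exact))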

  11. Application of the random phase approximation to complex problems in materials science

    International Nuclear Information System (INIS)

    Schimka, L.

    2012-01-01

    This thesis is devoted to the assessment and application of the random phase approximation (RPA) in the adiabatic-connection fluctuation-dissipation (ACFD) framework in solid state physics. The first part presents a review of density functional theory (DFT) and the ACFD theorem in the RPA. This includes an introduction to the many-body problem as well as a description of the implementation of the RPA in the Vienna Ab-initio Simulation Package (VASP). In the results part, the quality of the RPA is assessed and its performance compared to three (beyond) DFT functionals. The experimental values are corrected for the effect of phonon zero-point vibrational energies, which were calculated ab initio at the DFT level. We find that the RPA describes all bonding situations very accurately, making it a promising candidate for more complex problems in solid state physics. In light of these findings, we investigate the carbon-water interaction in two specific cases: the adsorption of water on benzene and the adsorption of water on a graphene layer. We compare our results to a different correlated method: diffusion Monte Carlo (DMC). We find very good agreement and thus believe that our values can serve as a benchmark for the development of other DFT functionals to treat water-carbon interfaces. The highlight of this thesis is the successful application of the RPA to the long-standing and (at DFT level) unsolved CO adsorption puzzle. We show results for CO adsorption on Cu, late 4d metals and Pt. RPA is at present the only ab-initio method that describes adsorption and surface energies accurately at the same time and predicts the correct adsorption site in every single case. (author)

  12. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    Science.gov (United States)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step, and solve each local sub-problem through very fast linear network programming algorithms, and (c) the substantial
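
    Point (b) above, linearizing the combined water and energy allocation of a single time step and solving it as a small linear program, can be illustrated with a toy one-step problem. The reservoir size, conversion factor and costs below are invented, and scipy's general-purpose linprog stands in for the dedicated linear network programming solvers mentioned in the abstract.

      import numpy as np
      from scipy.optimize import linprog

      # toy one-step allocation (hypothetical numbers):
      # x = [water released through turbines (hm3), energy bought from the grid (GWh)]
      # each hm3 released is assumed to generate 0.4 GWh
      energy_demand = 10.0    # GWh to be covered this time step
      water_available = 20.0  # hm3 currently in storage
      c = np.array([0.1, 5.0])             # cost per hm3 released, per GWh bought
      A_ub = np.array([[-0.4, -1.0]])      # 0.4*release + bought >= demand
      b_ub = np.array([-energy_demand])
      bounds = [(0.0, water_available), (0.0, None)]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
      print(res.x)   # optimal release and grid purchase for this step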

  13. A numerical approach of thermal problems coupling fluid solid and radiation in complex geometries; Approche numerique de problemes thermiques couplant fluides, solides et rayonnement en geometries complexes

    Energy Technology Data Exchange (ETDEWEB)

    Peniguel, C; Rupp, I

    1995-11-01

    In many industrial problems, heat transfer plays an important part. Quite often, radiation, convection and conduction are present simultaneously. This paper presents a numerical tool that handles these phenomena simultaneously. The fluid is tackled by the finite element code N3S; radiation (restricted to a non-participating medium) and conduction are handled with SYRTHES, by a radiosity method and a finite element method respectively. The main originality of the product is that the meshes used to solve each phenomenon are completely independent. This allows users to choose the most appropriate spatial discretization for each part or phenomenon. This flexibility of course requires robust and fast procedures for exchanging data (temperature, convective flux, radiative flux) between the independent grids. This operation is done automatically by the code SYRTHES. One simple problem illustrating the interest of this development is presented at the end of the paper. (author). 6 refs., 8 figs.

  14. Using threshold regression to analyze survival data from complex surveys: With application to mortality linked NHANES III Phase II genetic data.

    Science.gov (United States)

    Li, Yan; Xiao, Tao; Liao, Dandan; Lee, Mei-Ling Ting

    2018-03-30

    The Cox proportional hazards (PH) model is a common statistical technique used for analyzing time-to-event data. The assumption of PH, however, is not always appropriate in real applications. In cases where the assumption is not tenable, threshold regression (TR) and other survival methods, which do not require the PH assumption, are available and widely used. These alternative methods generally assume that the study data constitute simple random samples. In particular, TR has not been studied in the setting of complex surveys that involve (1) differential selection probabilities of study subjects and (2) intracluster correlations induced by multistage cluster sampling. In this paper, we extend TR procedures to account for complex sampling designs. The pseudo-maximum likelihood estimation technique is applied to estimate the TR model parameters. Computationally efficient Taylor linearization variance estimators that consider both the intracluster correlation and the differential selection probabilities are developed. The proposed methods are evaluated by using simulation experiments with various complex designs and illustrated empirically by using mortality-linked Third National Health and Nutrition Examination Survey Phase II genetic data. Copyright © 2017 John Wiley & Sons, Ltd.
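
    A minimal sketch of the estimation idea, assuming a first-hitting-time threshold regression in which a latent Wiener process with initial level y0 = exp(X·b1) and drift mu = X·b2 (variance fixed at 1) hits zero, and each subject's log-likelihood contribution is weighted by its survey weight (the pseudo-maximum likelihood device). Clustering, the Taylor linearization variance estimators and the NHANES data themselves are not reproduced, and all data below are simulated placeholders.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def neg_pseudo_loglik(params, X, time, event, w):
          """Weighted (pseudo-) log-likelihood of a first-hitting-time threshold
          regression: ln(y0) = X @ b1, drift mu = X @ b2, Wiener variance fixed at 1."""
          p = X.shape[1]
          y0 = np.exp(X @ params[:p])
          mu = X @ params[p:]
          st = np.sqrt(time)
          # inverse-Gaussian first-hitting-time log-density (observed events)
          logf = (np.log(y0) - 0.5 * np.log(2 * np.pi * time**3)
                  - (y0 + mu * time)**2 / (2 * time))
          # survival function of the hitting time (censored observations)
          S = (norm.cdf((y0 + mu * time) / st)
               - np.exp(-2 * y0 * mu) * norm.cdf((mu * time - y0) / st))
          S = np.clip(S, 1e-300, 1.0)
          return -np.sum(w * (event * logf + (1 - event) * np.log(S)))

      # toy data standing in for a survey sample (weights = inverse selection probabilities)
      rng = np.random.default_rng(0)
      n = 300
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      w = rng.uniform(0.5, 2.0, size=n)
      t = rng.gamma(2.0, 2.0, size=n)
      d = rng.integers(0, 2, size=n)     # 1 = event observed, 0 = censored

      fit = minimize(neg_pseudo_loglik, x0=np.zeros(2 * X.shape[1]),
                     args=(X, t, d, w), method="Nelder-Mead")
      print(fit.x)   # [b1 (log initial level), b2 (drift)] coefficients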

  15. Managing the Complexity of Design Problems through Studio-Based Learning

    Science.gov (United States)

    Cennamo, Katherine; Brandt, Carol; Scott, Brigitte; Douglas, Sarah; McGrath, Margarita; Reimer, Yolanda; Vernon, Mitzi

    2011-01-01

    The ill-structured nature of design problems makes them particularly challenging for problem-based learning. Studio-based learning (SBL), however, has much in common with problem-based learning and indeed has a long history of use in teaching students to solve design problems. The purpose of this ethnographic study of an industrial design class,…

  16. The complexity of the matching-cut problem for planar graphs and other graph classes

    NARCIS (Netherlands)

    Bonsma, P.S.

    2009-01-01

    The Matching-Cut problem is the problem of deciding whether a graph has an edge cut that is also a matching. Previously this problem was studied under the name of the Decomposable Graph Recognition problem, and proved to be NP-complete when restricted to graphs with maximum degree four. In this paper it

  17. Case study method and problem-based learning: utilizing the pedagogical model of progressive complexity in nursing education.

    Science.gov (United States)

    McMahon, Michelle A; Christopher, Kimberly A

    2011-08-19

    As the complexity of health care delivery continues to increase, educators are challenged to determine educational best practices to prepare BSN students for the ambiguous clinical practice setting. Integrative, active, and student-centered curricular methods are encouraged to foster student ability to use clinical judgment for problem solving and informed clinical decision making. The proposed pedagogical model of progressive complexity in nursing education suggests gradually introducing students to complex and multi-contextual clinical scenarios through the utilization of case studies and problem-based learning activities, with the intention to transition nursing students into autonomous learners and well-prepared practitioners at the culmination of a nursing program. Exemplar curricular activities are suggested to potentiate student development of a transferable problem solving skill set and a flexible knowledge base to better prepare students for practice in future novel clinical experiences, which is a mutual goal for both educators and students.

  18. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS/129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts, "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short and, especially, long-term activities have a persistent need for simulation from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  19. How students process equations in solving quantitative synthesis problems? Role of mathematical complexity in students’ mathematical performance

    Directory of Open Access Journals (Sweden)

    Bashirah Ibrahim

    2017-10-01

    Full Text Available We examine students’ mathematical performance on quantitative “synthesis problems” with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students’ mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students’ simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students’ formulation and combination of equations. Several reasons may explain this difference, including the students’ different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.

  20. Level of satisfaction of older persons with their general practitioner and practice: role of complexity of health problems.

    Science.gov (United States)

    Poot, Antonius J; den Elzen, Wendy P J; Blom, Jeanet W; Gussekloo, Jacobijn

    2014-01-01

    Satisfaction is widely used to evaluate and direct delivery of medical care; a complicated relationship exists between patient satisfaction, morbidity and age. This study investigates the relationships between complexity of health problems and level of patient satisfaction of older persons with their general practitioner (GP) and practice. This study is embedded in the ISCOPE (Integrated Systematic Care for Older Persons) study. Enlisted patients aged ≥75 years from 59 practices received a written questionnaire to screen for complex health problems (somatic, functional, psychological and social). For 2664 randomly chosen respondents (median age 82 years; 68% female) information was collected on level of satisfaction (satisfied, neutral, dissatisfied) with their GP and general practice, and demographic and clinical characteristics including complexity of health problems. Of all participants, 4% was dissatisfied with their GP care, 59% neutral and 37% satisfied. Between these three categories no differences were observed in age, gender, country of birth or education level. The percentage of participants dissatisfied with their GP care increased from 0.4% in those with 0 problem domains to 8% in those with 4 domains, i.e. having complex health problems (p<0.001). Per additional health domain with problems, the risk of being dissatisfied increased 1.7 times (95% CI 1.4-2.14; p<0.001), independent of age, gender, and demographic and clinical parameters (adjusted OR 1.4, 95% CI 1.1-1.8; p = 0.021). In older persons, dissatisfaction with general practice is strongly correlated with rising complexity of health problems, independent of age, demographic and clinical parameters. It remains unclear whether complexity of health problems is a patient characteristic influencing the perception of care, or whether the care is unable to handle the demands of these patients. Prospective studies are needed to investigate the causal associations between care organization, patient characteristics, indicators of quality, and patient perceptions.

  1. The Assessment of 21st Century Skills in Industrial and Organizational Psychology: Complex and Collaborative Problem Solving

    OpenAIRE

    Neubert, Jonas; Mainert, Jakob; Kretzschmar, André; Greiff, Samuel

    2015-01-01

    In the current paper, we highlight why and how industrial and organizational psychology can take advantage of research on 21st century skills and their assessment. We present vital theoretical perspectives, a suitable framework for assessment, and exemplary instruments with a focus on advances in the assessment of Human Capital. Specifically, Complex Problem Solving (CPS) and Collaborative Problem Solving (ColPS) are two transversal skills (i.e., skills that span multiple domains) that are...

  2. Atrial fibrillation management in older heart failure patients: a complex clinical problem

    Directory of Open Access Journals (Sweden)

    Giovanni Pulignano

    2016-09-01

    Full Text Available Background: Atrial fibrillation (AF) and heart failure (HF), two problems of growing prevalence as a consequence of the ageing population, are associated with high morbidity, mortality, and healthcare costs. AF and HF also share common risk factors and pathophysiologic processes such as hypertension, diabetes mellitus, ischemic heart disease, and valvular heart disease, and often occur together. Although elderly patients with both HF and AF are affected by worse symptoms and poorer prognosis, there is a paucity of data on appropriate management of these patients. Methods: PubMed was searched for studies on AF and older patients using the terms atrial fibrillation, elderly, heart failure, cognitive impairment, frailty, stroke, and anticoagulants. Results: The clinical picture of HF patients with AF is complex and heterogeneous, with a higher prevalence of frailty, cognitive impairment, and disability. Because of the association of mental and physical impairment with non-administration of oral anticoagulants (OACs), screening for these simple variables in clinical practice may allow better strategies for intervention in this high-risk population. Since novel direct OACs (NOACs) have a more favorable risk-benefit profile, they may be preferable to vitamin K antagonists (VKAs) in many frail elderly patients, especially those at higher risk of falls. Moreover, NOACs are simple to administer and monitor and may be associated with better adherence and safety in patients with cognitive deficits and mobility impairments. Conclusions: Large multicenter longitudinal studies are needed to examine the effects of VKAs and NOACs on long-term cognitive function and frailty; future studies should include geriatric conditions.

  3. Evolving hard problems: Generating human genetics datasets with a complex etiology

    Directory of Open Access Journals (Sweden)

    Himmelstein Daniel S

    2011-07-01

    Full Text Available Abstract Background A goal of human genetics is to discover genetic factors that influence individuals' susceptibility to common diseases. Most common diseases are thought to result from the joint failure of two or more interacting components instead of single component failures. This greatly complicates both the task of selecting informative genetic variants and the task of modeling interactions between them. We and others have previously developed algorithms to detect and model the relationships between these genetic factors and disease. Previously these methods have been evaluated with datasets simulated according to pre-defined genetic models. Results Here we develop and evaluate a model free evolution strategy to generate datasets which display a complex relationship between individual genotype and disease susceptibility. We show that this model free approach is capable of generating a diverse array of datasets with distinct gene-disease relationships for an arbitrary interaction order and sample size. We specifically generate eight-hundred Pareto fronts; one for each independent run of our algorithm. In each run the predictiveness of single genetic variation and pairs of genetic variants have been minimized, while the predictiveness of third, fourth, or fifth-order combinations is maximized. Two hundred runs of the algorithm are further dedicated to creating datasets with predictive four or five order interactions and minimized lower-level effects. Conclusions This method and the resulting datasets will allow the capabilities of novel methods to be tested without pre-specified genetic models. This allows researchers to evaluate which methods will succeed on human genetics problems where the model is not known in advance. We further make freely available to the community the entire Pareto-optimal front of datasets from each run so that novel methods may be rigorously evaluated. These 76,600 datasets are available from http://discovery.dartmouth.edu/model_free_data/.
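
    One common way to score the "predictiveness" of a set of genetic variants in such simulated datasets is the best achievable accuracy of a majority-class rule within each multilocus genotype cell. The hedged sketch below uses random toy genotypes; the authors' exact fitness functions and evolution strategy are not reproduced here.

      import itertools
      import numpy as np

      def cell_predictiveness(genotypes, status, snp_idx):
          """Best achievable accuracy when predicting case/control status from the
          joint genotype of the SNPs in snp_idx (majority rule within each cell)."""
          cells = {}
          for g, y in zip(map(tuple, genotypes[:, snp_idx]), status):
              cells.setdefault(g, []).append(y)
          correct = sum(max(np.bincount(ys, minlength=2)) for ys in cells.values())
          return correct / len(status)

      # toy data: 200 individuals, 5 SNPs coded 0/1/2, binary case/control status
      rng = np.random.default_rng(1)
      G = rng.integers(0, 3, size=(200, 5))
      y = rng.integers(0, 2, size=200)

      # low-order (single-SNP and pairwise) vs. one third-order combination
      single = max(cell_predictiveness(G, y, [i]) for i in range(5))
      pairs  = max(cell_predictiveness(G, y, list(p))
                   for p in itertools.combinations(range(5), 2))
      triple = cell_predictiveness(G, y, [0, 1, 2])
      print(single, pairs, triple)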

  4. A framework to approach problems of forensic anthropology using complex networks

    Science.gov (United States)

    Caridi, Inés; Dorso, Claudio O.; Gallo, Pablo; Somigliana, Carlos

    2011-05-01

    We have developed a method to analyze and interpret emerging structures in a set of data which lacks some information. It has been conceived to be applied to the problem of getting information about people who disappeared in the Argentine province of Tucumán from 1974 to 1981. Even though the military dictatorship in Argentina formally began in 1976 and lasted until 1983, the disappearance and assassination of people began some months earlier. During this period several circuits of Illegal Detention Centres (IDC) were set up in different locations all over the country. In these secret centres, disappeared people were illegally kept without any sort of constitutional guarantees, and later assassinated. Even today, the final destination of most of the disappeared people’s remains is still unknown. The fundamental hypothesis in this work is that a group of people with the same political affiliation whose disappearances were closely related in time and space shared the same place of captivity (the same IDC or circuit of IDCs). This hypothesis makes sense when applied to the systematic method of repression and disappearances which was actually launched in Tucumán, Argentina (2007) [11]. In this work, the missing individuals are identified as nodes of a network and connections are established among them based on the individuals’ attributes while they were alive, by using rules to link them. In order to determine which rules are the most effective in defining the network, we use other kinds of knowledge available in this problem: previous results from the anthropological point of view (based on other sources of information, both oral and written, historical and anthropological data, etc.), and information about the place (one or more IDCs) where some people were kept during their captivity. For the best rules, a prediction about these people’s possible destination is assigned (one or more IDCs where they could have been kept), and the success of the
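
    A minimal sketch of the rule-based network construction described above, using networkx and an invented linking rule (same affiliation, same place, disappearances at most 30 days apart); all records, attributes and thresholds are hypothetical, and connected components stand in for candidate groups sharing a place of captivity.

      import networkx as nx
      from datetime import date

      # toy records: id, political affiliation, place of abduction, date of disappearance
      people = [
          (1, "groupA", "placeX", date(1975, 3, 2)),
          (2, "groupA", "placeX", date(1975, 3, 10)),
          (3, "groupB", "placeY", date(1976, 7, 1)),
          (4, "groupA", "placeY", date(1975, 3, 20)),
      ]

      def rule(p, q, max_days=30):
          """Hypothetical linking rule: same affiliation, same place,
          disappearances at most max_days apart."""
          return (p[1] == q[1] and p[2] == q[2]
                  and abs((p[3] - q[3]).days) <= max_days)

      G = nx.Graph()
      G.add_nodes_from(pid for pid, *_ in people)
      for i, p in enumerate(people):
          for q in people[i + 1:]:
              if rule(p, q):
                  G.add_edge(p[0], q[0])

      # each connected component is a candidate group sharing a place of captivity
      print([sorted(c) for c in nx.connected_components(G)])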

  5. Problem of quality assurance during metal constructions welding via robotic technological complexes

    Science.gov (United States)

    Fominykh, D. S.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.

    2018-05-01

    The problem of minimizing the probability of critical combinations of events that lead to a loss of welding quality in robotic process automation is examined. The problem is formulated, and models and algorithms for its solution are developed. The problem is solved by minimizing a criterion characterizing the losses caused by defective products. Solving the problem may enhance the quality and accuracy of the operations performed and reduce the losses caused by defective products.

  6. Untangling the Complex Needs of People Experiencing Gambling Problems and Homelessness

    Science.gov (United States)

    Holdsworth, Louise; Tiyce, Margaret

    2013-01-01

    People with gambling problems are now recognised among those at increased risk of homelessness, and the link between housing and gambling problems has been identified as an area requiring further research. This paper discusses the findings of a qualitative study that explored the relationship between gambling problems and homelessness. Interviews…

  7. Influence of metal loading and humic acid functional groups on the complexation behavior of trivalent lanthanides analyzed by CE-ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Kautenburger, Ralf, E-mail: r.kautenburger@mx.uni-saarland.de [Institute of Inorganic Solid State Chemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 3-5, D-66125 Saarbrücken (Germany); Hein, Christina; Sander, Jonas M. [Institute of Inorganic Solid State Chemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 3-5, D-66125 Saarbrücken (Germany); Beck, Horst P. [Institute of Inorganic and Analytical Chemistry and Radiochemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 5, D-66125 Saarbrücken (Germany)

    2014-03-01

    Highlights: • Free and complexed HA-Ln species are separated by CE-ICP-MS. • Weaker and stronger HA-binding sites for Ln-complexation can be detected. • Complexation by original and modified humic acid (HA) with blocked phenolic hydroxyl- and carboxyl-groups is compared. • Stronger HA-binding sites for Ln³⁺ can be assumed as chelating complexes. • Chelates consist of trivalent Ln and a combination of both OH- and COOH-groups. Abstract: The complexation behavior of Aldrich humic acid (AHA) and a modified humic acid (AHA-PB) with blocked phenolic hydroxyl groups for trivalent lanthanides (Ln) is compared, and their influence on the mobility of Ln(III) in an aquifer is analyzed. As speciation technique, capillary electrophoresis (CE) was hyphenated with inductively coupled plasma mass spectrometry (ICP-MS). For metal loading experiments 25 mg L⁻¹ of AHA and different lanthanide concentrations (c(Eu+Gd) = 100–6000 μg L⁻¹) of Eu(III) and Gd(III) in 10 mM NaClO₄ at pH 5 were applied. By CE-ICP-MS, three Ln-fractions, assumed to be uncomplexed, weakly and strongly AHA-complexed metal, can be detected. For the Ln/AHA-ratios used, conservative complex stability constants log β(LnAHA) decrease from 6.33 (100 μg L⁻¹ Ln³⁺) to 4.31 (6000 μg L⁻¹ Ln³⁺) with growing Ln-content. In order to verify the postulated weaker and stronger humic acid binding sites for trivalent Eu and Gd, a modified AHA with blocked functional groups was used. For these experiments 500 μg L⁻¹ Eu and 25 mg L⁻¹ AHA and AHA-PB in 10 mM NaClO₄ at pH-values ranging from 3 to 10 have been applied. With AHA-PB, where 84% of the phenolic OH-groups and 40% of the COOH-groups were blocked, Eu complexation was significantly lower, especially at the strong binding sites. The log β-values decrease from 6.11 (pH 10) to 5.61 at pH 3 (AHA) and for AHA-PB from 6.01 (pH 7) to 3.94 at pH 3. As a potential consequence, particularly humic acids with a high amount of
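
    For readers unfamiliar with how a conditional stability constant is obtained from speciation data, the toy calculation below shows the general form log β = log([LnHA] / ([Ln³⁺][HA])); the bound fraction and the proton exchange capacity used to convert the humic acid mass concentration into a molar site concentration are placeholder assumptions, not values from this study.

```python
from math import log10

# Hypothetical inputs (placeholders, not the study's data).
eu_total = 500e-6 / 151.96      # mol/L of Eu(III) from 500 µg/L (molar mass ~152 g/mol)
frac_complexed = 0.80           # assumed fraction of Eu bound to humic acid (from peak areas)
ha_mass = 25e-3                 # g/L of humic acid (25 mg/L)
pec = 5.4e-3                    # assumed proton exchange capacity, mol of sites per g

eu_bound = frac_complexed * eu_total
eu_free = eu_total - eu_bound
sites_free = ha_mass * pec - eu_bound   # free binding sites, mol/L (1:1 binding assumed)

log_beta = log10(eu_bound / (eu_free * sites_free))
print(f"conditional log beta = {log_beta:.2f}")
```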

  8. Generalist solutions to complex problems: generating practice-based evidence--the example of managing multi-morbidity.

    Science.gov (United States)

    Reeve, Joanne; Blakeman, Tom; Freeman, George K; Green, Larry A; James, Paul A; Lucassen, Peter; Martin, Carmel M; Sturmberg, Joachim P; van Weel, Chris

    2013-08-07

    A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of ≥2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a 'complex intervention' (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Answers to the complex problem of multi-morbidity won't come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity.

  9. Linking Complex Problem Solving and General Mental Ability to Career Advancement: Does a Transversal Skill Reveal Incremental Predictive Validity?

    Science.gov (United States)

    Mainert, Jakob; Kretzschmar, André; Neubert, Jonas C.; Greiff, Samuel

    2015-01-01

    Transversal skills, such as complex problem solving (CPS) are viewed as central twenty-first-century skills. Recent empirical findings have already supported the importance of CPS for early academic advancement. We wanted to determine whether CPS could also contribute to the understanding of career advancement later in life. Towards this end, we…

  10. Fifth Anniversary youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects. Theses of reports

    International Nuclear Information System (INIS)

    2009-01-01

    Theses of reports of the Fifth Anniversary youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects (21-23 April 2009, Ozersk) are presented. The book contains abstracts of papers from four thematic sections: SNF reprocessing: science and industry; Radioecology and radiobiology; Advanced science-intensive technologies and materials; Education and training for NFC plants

  11. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    Science.gov (United States)

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  12. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    NARCIS (Netherlands)

    Reeve, J.; Blakeman, T.; Freeman, G.K.; Green, L.A.; James, P.A.; Lucassen, P.L.; Martin, C.M.; Sturmberg, J.P.; Weel, C. van

    2013-01-01

    BACKGROUND: A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet

  13. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

    Science.gov (United States)

    Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

    2016-01-01

    We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

  14. Modelling of Octahedral Manganese II Complexes with Inorganic Ligands: A Problem with Spin-States

    Directory of Open Access Journals (Sweden)

    Ludwik Adamowicz

    2003-08-01

    Full Text Available Abstract: Quantum mechanical ab initio UHF, MP2, MC-SCF and DFT calculations with moderate Gaussian basis sets were performed for octahedral manganese complexes MnX₆, X = H₂O, F⁻, CN⁻. The correct spin-state of the complexes was obtained only when the counter ions neutralizing the entire complexes were used in the modelling at the B3LYP level of theory.

  15. Congruences of null strings in complex space-times and some Cauchy--Kovalevski-like problems

    International Nuclear Information System (INIS)

    Robinson, I.; Rozga, K.

    1984-01-01

    It is shown that the problem of constructing a local congruence of null strings is equivalent to a natural Cauchy--Kovalevski-like problem, related to an equation for a spinor field k_A defining the congruence. Initial data are specified on two-dimensional submanifolds. In left-conformally-flat spaces, the solution of this problem exists for arbitrary initial data

  16. “Robots in Space” Multiagent Problem: Complexity, Information and Cryptographic Aspects

    Directory of Open Access Journals (Sweden)

    A. Yu. Bernstein

    2013-01-01

    Full Text Available We study a multiagent algorithmic problem that we call Robots in Space (RinS): there are n ≥ 2 autonomous robots that need to agree, without outside interference, on a distribution of shelters, so that straight paths to the shelters will not intersect. The problem is closely related to the assignment problem in Graph Theory, to the convex hull problem in Combinatorial Geometry, and to the path-planning problem in Artificial Intelligence. Our algorithm grew out of a local search solution of the problem suggested by E.W. Dijkstra. We present a multiagent, anonymous and scalable algorithm (protocol) solving the problem, give an upper bound for the algorithm, prove (manually) its correctness, and examine two communication aspects of the RinS problem: the informational and the cryptographic. We prove that (1) there is no protocol solving RinS that transfers a bounded number of bits, and (2) we suggest a protocol that allows the robots to check whether their paths intersect without revealing additional information about their relative positions (with respect to shelters). The present paper continues the research presented in Mars Robot Puzzle (a Multiagent Approach to the Dijkstra Problem) by E.V. Bodin, N.O. Garanina, and N.V. Shilov, published in Modeling and Analysis of Information Systems, 18(2), 2011.

  17. A complex of optimization problems in planning for the development of mining operations in coal mines

    Energy Technology Data Exchange (ETDEWEB)

    Todorov, A K; Arnaudov, B K; Brankova, B A; Gyuleva, B I; Zakhariyev, G K

    1977-01-01

    The system for planning the development of coal mines is a complex of interrelated plan-optimization, plan-calculation and supporting (accounting-analytical and standards) tasks. An important place in this complex is held by the plan-optimization tasks. The questions of the synthesis and structural peculiarities of the system, and the essence and machine realization of the tasks, are examined.

  18. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system which allows hands off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post trigger operations with variable length blocks using normal, peak to peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general

  19. Complex problems require complex solutions: the utility of social quality theory for addressing the Social Determinants of Health

    Directory of Open Access Journals (Sweden)

    Ward Paul R

    2011-08-01

    Full Text Available Abstract Background In order to improve the health of the most vulnerable groups in society, the WHO Commission on Social Determinants of Health (CSDH) called for multi-sectoral action, which requires research and policy on the multiple and inter-linking factors shaping health outcomes. Most conceptual tools available to researchers tend to focus on singular and specific social determinants of health (SDH) (e.g. social capital, empowerment, social inclusion). However, a new and innovative conceptual framework, known as social quality theory, facilitates a more complex and complete understanding of the SDH, with its focus on four domains: social cohesion, social inclusion, social empowerment and socioeconomic security, all within the same conceptual framework. This paper provides both an overview of social quality theory in addition to findings from a national survey of social quality in Australia, as a means of demonstrating the operationalisation of the theory. Methods Data were collected using a national random postal survey of 1044 respondents in September, 2009. Multivariate logistic regression analysis was conducted. Results Statistical analysis revealed that people on lower incomes (less than $45,000) experience worse social quality across all four domains: lower socio-economic security, lower levels of membership of organisations (lower social cohesion), higher levels of discrimination and less political action (lower social inclusion), and lower social empowerment. The findings were mixed in terms of age, with people over 65 years experiencing lower socio-economic security, but having higher levels of social cohesion, experiencing lower levels of discrimination (higher social inclusion) and engaging in more political action (higher social empowerment). In terms of gender, women had higher social cohesion than men, although they also experienced more discrimination (lower social inclusion). Conclusions Applying social quality theory allows

  20. Análise de erros ortográficos em diferentes problemas de aprendizagem Analyzing typical orthographic mistakes related to different learning problems

    Directory of Open Access Journals (Sweden)

    Jaime Luiz Zorzi

    2009-09-01

    learning problems, check if the types of mistakes produced are those found in learning considered normal, and analyze whether problems of an orthographic or a phonological nature prevail in each disorder. METHODS: the writing of 64 subjects was evaluated by the Laboratory of Learning Disabilities of the Neurology Department of UNICAMP and diagnosed as showing some type of learning problem: Deficit of Attention / Hyperactivity Disorder (28); School Difficulties (13); Learning Disabilities (7); Dyslexia (3); Associated Disorders (5); and Inconclusive Diagnosis (9). The ages varied between 8;2 and 13;4 years, with an average of 10;6 years. Only subjects at the alphabetical writing level and without any type of intellectual deficit were included. The mistakes found were classified into eleven categories and quantified for statistical analysis. RESULTS: the spelling mistakes found in each type of problem correspond to those observed in children without learning complaints. Mistakes of Multiple Representations, Omission of letters and Orality are, respectively, the three most frequent types in the cases of Deficit of Attention and Hyperactivity Disorder, School Difficulties, Associated Disorders and Unknown Diagnosis. In the Learning Disturbance group the sequence is Multiple Representations, Omission, Other Mistakes and Voiced/Unvoiced mistakes. In dyslexia we note the sequence Multiple Representations, Orality, Omission and Other Mistakes. There is a trend, in each type of problem, towards the prevalence of mistakes of an orthographic nature, although with no statistically significant difference in relation to mistakes of a phonological nature. CONCLUSION: mistakes of an orthographic nature are the most frequent, although there is no significant difference, in each group, in relation to mistakes of a phonological nature. In a contrary trend, visual-spatial mistakes have a low occurrence in general, which shows that the difficulty concerning all groups has a fundamentally linguistic

  1. Complexity of the positive semidefinite matrix completion problem with a rank constraint

    NARCIS (Netherlands)

    Nagy, M.; Laurent, M.; Varvitsiotis, A.; Bezdek, K.; Deza, A.; Ye, Y.

    2013-01-01

    We consider the decision problem asking whether a partial rational symmetric matrix with an all-ones diagonal can be completed to a full positive semidefinite matrix of rank at most k. We show that this problem is NP-hard for any fixed integer k ≥ 2. In other words, for k ≥ 2, it is NP-hard to test

  2. Complexity of the positive semidefinite matrix completion problem with a rank constraint.

    NARCIS (Netherlands)

    M. Eisenberg-Nagy (Marianna); M. Laurent (Monique); A. Varvitsiotis (Antonios); K. Bezdek; A. Deza; Y. Ye

    2013-01-01

    We consider the decision problem asking whether a partial rational symmetric matrix with an all-ones diagonal can be completed to a full positive semidefinite matrix of rank at most k. We show that this problem is NP-hard for any fixed integer k ≥ 2. Equivalently, for k ≥ 2, it is

  3. Complexity of the positive semidefinite matrix completion problem with a rank constraint.

    NARCIS (Netherlands)

    M. Eisenberg-Nagy (Marianna); M. Laurent (Monique); A. Varvitsiotis (Antonios)

    2012-01-01

    We consider the decision problem asking whether a partial rational symmetric matrix with an all-ones diagonal can be completed to a full positive semidefinite matrix of rank at most k. We show that this problem is NP-hard for any fixed integer k ≥ 2. Equivalently, for k ≥ 2, it is
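
    The decision problem is NP-hard for every fixed k ≥ 2, but checking a proposed completion is straightforward; the small numpy sketch below (an illustration, not part of the cited work) verifies that a candidate full matrix is positive semidefinite with rank at most k.

```python
import numpy as np

def is_psd_with_rank_at_most(matrix, k, tol=1e-9):
    """Verify that a symmetric matrix is positive semidefinite with rank <= k."""
    eigenvalues = np.linalg.eigvalsh(matrix)      # real spectrum of a symmetric matrix
    return eigenvalues.min() >= -tol and int(np.sum(eigenvalues > tol)) <= k

# A rank-1 candidate completion with an all-ones diagonal (the off-diagonal signs
# are the "completed" entries).
v = np.array([1.0, -1.0, 1.0])
candidate = np.outer(v, v)
print(is_psd_with_rank_at_most(candidate, k=1))   # True
print(is_psd_with_rank_at_most(np.eye(3), k=1))   # False: the identity has rank 3
```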

  4. On the Combinatorics of SAT and the Complexity of Planar Problems

    DEFF Research Database (Denmark)

    Talebanfard, Navid

    In this thesis we study several problems arising in Boolean satisfiability ranging from lower bounds for SAT algorithms and proof systems to extremal properties of formulas. The first problem is about construction of hard instances for k-SAT algorithms. For PPSZ algorithm [40] we give the first...

  5. Complex Problem Solving in Radiologic Technology: Understanding the Roles of Experience, Reflective Judgment, and Workplace Culture

    Science.gov (United States)

    Yates, Jennifer L.

    2011-01-01

    The purpose of this research study was to explore the process of learning and development of problem solving skills in radiologic technologists. The researcher sought to understand the nature of difficult problems encountered in clinical practice, to identify specific learning practices leading to the development of professional expertise, and to…

  6. Complexity classifications for different equivalence and audit problems for Boolean circuits

    OpenAIRE

    Böhler, Elmar; Creignou, Nadia; Galota, Matthias; Reith, Steffen; Schnoor, Henning; Vollmer, Heribert

    2010-01-01

    We study Boolean circuits as a representation of Boolean functions and consider different equivalence, audit, and enumeration problems. For a number of restricted sets of gate types (bases) we obtain efficient algorithms, while for all other gate types we show these problems are at least NP-hard.

  7. RESEARCH OF PROBLEMS OF DESIGN OF COMPLEX TECHNICAL PROVIDING AND THE GENERALIZED MODEL OF THEIR DECISION

    Directory of Open Access Journals (Sweden)

    A. V. Skrypnikov

    2015-01-01

    Full Text Available Summary. In this work the general ideas of the method of V. I. Skurikhin are developed with allowance for the specified features, and questions of the analysis and synthesis of a complex of technical means are considered in more detail, bringing them to a level suitable for use in the engineering practice of designing information management systems. The work establishes a general system approach to selecting the technical means of an information management system and develops a general technique for the system analysis and synthesis of a complex of technical means and its subsystems that achieves an extreme value of the criterion of efficiency of functioning of the technical complex of the information management system. The main attention is paid to the applied side of system research into complex technical support, in particular to the definition of criteria of quality of functioning of a technical complex, the development of methods for analyzing the information base of the information management system and defining requirements for technical means, and also methods of structural synthesis of the main subsystems of complex technical support. Thus, the purpose is to study, on the basis of a system approach, the complex technical support of the information management system and to develop a number of methods of analysis and synthesis of complex technical support suitable for use in the engineering practice of system design. The well-known paradox in the development of management information systems is that the parameters of the system, and consequently the requirements for the hardware complex, cannot be rigorously justified before the development of algorithms and programs, and vice versa. A possible method of overcoming these difficulties is forecasting the structure and parameters of the hardware complex for certain management information at the early stages of development, with subsequent clarification and

  8. Problems of Ensuring Complex Business Security in the Conditions of Modern Globalization

    OpenAIRE

    Anatoliy Petrovich Sterkhov

    2015-01-01

    From the viewpoint of ensuring complex business security, the relevance of the present work lies in its rationale for a multilevel hierarchical approach to the classification of security threats in the age of globalization. The specificity of the threats at each level of the economy helps to better understand, and consequently to build, an effective system for ensuring complex business security. For each of the nine hierarchical levels of the economy the author identi...

  9. Robust control problems of vortex dynamics in superconducting films with Ginzburg-Landau complex systems

    OpenAIRE

    Belmiloudi, Aziz

    2006-01-01

    We formulate and study robust control problems for a two-dimensional time-dependent Ginzburg-Landau model with Robin boundary conditions on phase-field parameter, which describes the phase transitions taking place in superconductor films with variable thickness. The objective of such study is to control the motion of vortices in the superconductor films by taking into account the influence of noises in data. Firstly, we introduce the perturbation problem of the nonlinear ...

  10. Fourth youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects. Theses of reports

    International Nuclear Information System (INIS)

    2007-01-01

    Theses of reports of the Fourth youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects (18-20 April 2007, Ozersk) are presented. The book contains theses of reports from seven subject sections: NFC: science and industry; Ecological problems in NFC development: radiation safety, radioecology and radiobiology; Nuclear power engineering: economics, safety, field experience; Atomic branch: history, today and future; New technologies in education. Education and training for NFC plants, public opinion; Information technologies and telecommunications; Long-term science-intensive technologies and new materials

  11. MULTIENZYME COMPLEX APPLICATION WHEN RECEIVING OF ETHYL ALCOHOL FROM THE PROBLEM OF RAW MATERIAL

    Directory of Open Access Journals (Sweden)

    A. N. Yakovlev

    2012-01-01

    Full Text Available The influence of the complex enzyme preparation Bruzaime BGX on the viscosity of rye batter was studied. The dynamics of accumulation of the mass fraction of dry and reducing substances in the batter are shown, as well as the possibility of reducing the glucoamylase dosage when the multienzyme complex is used at the water-thermal processing stage. It was established that applying the multienzyme complex at the water-thermal processing stage increases the glucose content of the wort by 34.7% in comparison with the control, which raises the alcohol yield by 1.4 dal per tonne of conditional starch, shortens fermentation to 50-52 h and reduces the total screenings content of the fermented wash by 10%.

  12. Exponential-Time Algorithms and Complexity of NP-Hard Graph Problems

    DEFF Research Database (Denmark)

    Taslaman, Nina Sofia

    NP-hard problems are deemed highly unlikely to be solvable in polynomial time. Still, one can often find algorithms that are substantially faster than brute force solutions. This thesis concerns such algorithms for problems from graph theory: techniques for constructing and improving this type of algorithms, as well as investigations into how far such improvements can get under reasonable assumptions. The first part is concerned with detection of cycles in graphs, especially parameterized generalizations of Hamiltonian cycles. A remarkably simple Monte Carlo algorithm is presented …, and with high probability any found solution is shortest possible. Moreover, the algorithm can be used to find a cycle of given parity through the specified elements. The second part concerns the hardness of problems encoded as evaluations of the Tutte polynomial at some fixed point in the rational plane

  13. The Consensus String Problem and the Complexity of Comparing Hidden Markov Models

    DEFF Research Database (Denmark)

    Lyngsø, Rune Bang; Pedersen, Christian Nørgaard Storm

    2002-01-01

    The basic theory of hidden Markov models was developed and applied to problems in speech recognition in the late 1960s, and has since then been applied to numerous problems, e.g. biological sequence analysis. Most applications of hidden Markov models are based on efficient algorithms for computing … -norms. We discuss the applicability of the technique used for proving the hardness of comparing two hidden Markov models under the L1-norm to other measures of distance between probability distributions. In particular, we show that it cannot be used for proving NP-hardness of determining the Kullback

  14. Forecasting of Processes in Complex Systems for Real-World Problems

    Czech Academy of Sciences Publication Activity Database

    Pelikán, Emil

    2014-01-01

    Vol. 24, No. 6 (2014), pp. 567-589. ISSN 1210-0552. Institutional support: RVO:67985807. Keywords: complex systems * data assimilation * ensemble forecasting * forecasting * global solar radiation * judgmental forecasting * multimodel forecasting * pollution. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.479, year: 2014

  15. Foucault as Complexity Theorist: Overcoming the Problems of Classical Philosophical Analysis

    Science.gov (United States)

    Olssen, Mark

    2008-01-01

    This article explores the affinities and parallels between Foucault's Nietzschean view of history and models of complexity developed in the physical sciences in the twentieth century. It claims that Foucault's rejection of structuralism and Marxism can be explained as a consequence of his own approach which posits a radical ontology whereby the…

  16. Analogize This! The Politics of Scale and the Problem of Substance in Complexity-Based Composition

    Science.gov (United States)

    Roderick, Noah R.

    2012-01-01

    In light of recent enthusiasm in composition studies (and in the social sciences more broadly) for complexity theory and ecology, this article revisits the debate over how much composition studies can or should align itself with the natural sciences. For many in the discipline, the science debate--which was ignited in the 1970s, both by the…

  17. Problems of Ensuring Complex Business Security in the Conditions of Modern Globalization

    Directory of Open Access Journals (Sweden)

    Anatoliy Petrovich Sterkhov

    2015-12-01

    Full Text Available From the viewpoint of ensuring complex business security, the relevance of the present work lies in its rationale for a multilevel hierarchical approach to the classification of security threats in the age of globalization. The specificity of the threats at each level of the economy helps to better understand, and consequently to build, an effective system for ensuring complex business security. For each of the nine hierarchical levels of the economy the author identifies the main threats to business, as well as the objects and subjects of this study. It is noted that the performance of business as a complex hierarchical system depends on the principle of specification, and examples of the use of the basic principles of specification are given. It is noted that the decomposition of the economic system from the viewpoint of its hierarchical nature is of great importance not only for distributing the goals and objectives of business security across the levels of the system, but also for their subordination corresponding to each level. The result is the development of specific recommendations and the elaboration of the main directions for ensuring complex business security at the mega-, macro-, micro-, mini-, nano- and mesoeconomic levels. Although the priority of action in a multi-level hierarchical system is directed from the upper to the lower levels, the success of the system as a whole depends on the behavior of all system components. It is stated that interaction with the environment in business occurs mainly at the lower levels of the hierarchy. The quality of the system for ensuring complex business security that deals with hierarchical positions will depend not so much on the top-level elements as on the response to intervention from the lower levels, or more precisely on their total effect. In other words, the quality of the system of integrated business security management is ensured by organized feedback in the system.

  18. Aksoy Nigar Yildirim Variational problem with complex co-efficient of ...

    Indian Academy of Sciences (India)


  19. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Caroline Looms, Majken

    2013-01-01

    on the solution. The combined state of information (i.e. the solution to the inverse problem) is a probability density function typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods

  20. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Looms, Majken Caroline

    2013-01-01

    We present an application of the SIPPI Matlab toolbox, to obtain a sample from the a posteriori probability density function for the classical tomographic inversion problem. We consider a number of different forward models, linear and non-linear, such as ray based forward models that rely...
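
    SIPPI itself is a Matlab/Gnu Octave toolbox; as a rough, language-neutral illustration of the underlying idea of sampling an a posteriori distribution for a linear tomography-style problem, the Python sketch below combines a Gaussian prior with Gaussian data noise. The forward operator and all parameter values are made up, and the prior-preserving proposal is one possible choice, not SIPPI's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear problem d = G m + noise; G stands in for a ray geometry matrix.
n_model, n_data = 20, 10
G = rng.normal(size=(n_data, n_model))
m_true = rng.normal(size=n_model)
sigma = 0.1
d_obs = G @ m_true + sigma * rng.normal(size=n_data)

def log_likelihood(m):
    residual = d_obs - G @ m
    return -0.5 * float(np.sum((residual / sigma) ** 2))

# Extended-Metropolis-style sampler: propose small steps that preserve the N(0, I)
# prior exactly, and accept or reject with the likelihood ratio only.
step = 0.1
m = rng.normal(size=n_model)
loglik = log_likelihood(m)
samples = []
for _ in range(20000):
    proposal = np.sqrt(1.0 - step**2) * m + step * rng.normal(size=n_model)
    loglik_prop = log_likelihood(proposal)
    if rng.uniform() < np.exp(min(0.0, loglik_prop - loglik)):
        m, loglik = proposal, loglik_prop
    samples.append(m.copy())

posterior_mean = np.mean(samples[5000:], axis=0)
print("posterior mean of the first model parameter:", round(float(posterior_mean[0]), 3))
print("true value of the first model parameter:    ", round(float(m_true[0]), 3))
```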

  1. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problems of separating the antigen-antibody complex from the free antigen. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a space between the bottom of the tube and its lower end. The component has a large developed surface and is shaped so that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to the biological compound

  2. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having inverse-function characteristics of the calibration functions which correspond to the radiation measuring subsystems. A weighting adder forms a desired linear combination of the outputs of the linearizing circuits. Operators for operating between two or more different linear combinations are included

  3. Complexity of matrix organization and problems caused by its inadequate implementation

    Directory of Open Access Journals (Sweden)

    Janićijević Nebojša

    2007-01-01

    Full Text Available The matrix organization model is a sophisticated structure intended to combine both the efficiency and effectiveness of the functional and the product/service/customer/area dimensions. From the moment it was introduced in practice, this organizational architecture was accepted with enthusiasm, because it represented a complex organizational response adequate to the conditions which most companies in the world have been facing since the 1970s. Although matrix organization is not a novelty, it is still a controversial model of organization design. The aim of this paper is to provide a deeper insight into the causes and effects of organizational misfits which appear in the implementation phase of three-dimensional matrix organization, as well as to offer some practical recommendations for managers on how to improve their capacities for successful management of complex matrix organization architecture in their organizations.

  4. Interdisciplinary approach for bilateral maxillary canine: First premolar transposition with complex problems in an adult patient

    Directory of Open Access Journals (Sweden)

    Dhivakar Selvaraj

    2013-01-01

    Full Text Available The number of adult patients seeking orthodontic care has increased nowadays, driven not only by esthetic needs but also by functional demands. The problems seen in adult patients are not only malocclusions but also dental caries, pulpal pathology, missing teeth, muco-gingival problems and loss of supporting structures. We report here the case of a 35-year-old female with complete transposition, referred to as a positional interchange of two permanent teeth within the same quadrant of the dental arch, along with gingival recession of the lower anteriors and missing molars. Gingival health was improved by a free gingival graft in the lower anteriors, followed by a fixed orthodontic procedure to correct the transposition. Based on the transposition, crown recontouring and restoration were done, along with replacement of the missing molars with a fixed prosthesis. Thus, proper treatment planning with interdisciplinary management improves not only the esthetics and occlusal relationship but also gives stable results.

  5. A numerical approach of thermal problems coupling fluid solid and radiation in complex geometries

    International Nuclear Information System (INIS)

    Peniguel, C.; Rupp, I.

    1995-11-01

    In many industrial problems, heat transfer plays an important part. Quite often, radiation, convection and conduction are present simultaneously. This paper presents a numerical tool handling these phenomena simultaneously. The fluid is tackled by the finite element code N3S; radiation (restricted to a non-participating medium) and conduction are handled with SYRTHES, respectively by a radiosity method and a finite element method. The main originality of the product is that the meshes used to solve each phenomenon are completely independent. This allows users to choose the most appropriate spatial discretization for each part or phenomenon. This flexibility of course requires robust and fast data exchange procedures (temperature, convective flux, radiative flux) between the independent grids. This operation is done automatically by the code SYRTHES. One simple problem illustrating the interest of this development is presented at the end of the paper. (author). 6 refs., 8 figs

  6. An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems

    Directory of Open Access Journals (Sweden)

    Vivek Patel

    2012-08-01

    Full Text Available Nature-inspired population-based algorithms form a research field which simulates different natural phenomena to solve a wide range of problems. Researchers have proposed several algorithms based on different natural phenomena. Teaching-Learning-based optimization (TLBO) is one of the recently proposed population-based algorithms; it simulates the teaching-learning process of the classroom. This algorithm does not require any algorithm-specific control parameters. In this paper, the elitism concept is introduced into the TLBO algorithm and its effect on the performance of the algorithm is investigated. The effects of common controlling parameters such as the population size and the number of generations on the performance of the algorithm are also investigated. The proposed algorithm is tested on 35 constrained benchmark functions with different characteristics and its performance is compared with that of other well known optimization algorithms. The proposed algorithm can be applied to various optimization problems of the industrial environment.
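
    To make the teacher and learner phases concrete, here is a compact unconstrained sketch of basic TLBO on a sphere benchmark function; the elitism variant, constraint handling and parameter study reported in the paper are not reproduced, and every numerical choice below is an illustrative assumption.

```python
import numpy as np

def tlbo(objective, bounds, pop_size=20, generations=100, seed=0):
    rng = np.random.default_rng(seed)
    low, high = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = low.size
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fitness = np.apply_along_axis(objective, 1, pop)

    for _ in range(generations):
        # Teacher phase: move everyone towards the best solution, away from the mean.
        teacher = pop[np.argmin(fitness)]
        mean = pop.mean(axis=0)
        tf = rng.integers(1, 3, size=(pop_size, 1))        # teaching factor in {1, 2}
        candidate = np.clip(pop + rng.random((pop_size, dim)) * (teacher - tf * mean), low, high)
        cand_fit = np.apply_along_axis(objective, 1, candidate)
        better = cand_fit < fitness
        pop[better], fitness[better] = candidate[better], cand_fit[better]

        # Learner phase: each learner interacts with a random partner.
        partners = rng.permutation(pop_size)
        direction = np.where((fitness < fitness[partners])[:, None],
                             pop - pop[partners], pop[partners] - pop)
        candidate = np.clip(pop + rng.random((pop_size, dim)) * direction, low, high)
        cand_fit = np.apply_along_axis(objective, 1, candidate)
        better = cand_fit < fitness
        pop[better], fitness[better] = candidate[better], cand_fit[better]

    best = np.argmin(fitness)
    return pop[best], fitness[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    best_x, best_f = tlbo(sphere, bounds=([-5] * 10, [5] * 10))
    print("best objective value found:", best_f)
```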

  7. The Average Network Flow Problem: Shortest Path and Minimum Cost Flow Formulations, Algorithms, Heuristics, and Complexity

    Science.gov (United States)

    2012-09-13

    Report no. AFIT/DS/ENS/12-09. Only a fragment of the abstract is recoverable from this record: "… focused thinking (VFT) are used sparingly, as is the case across the entirety of the supply chain literature. We provide a VFT tutorial for supply chain"

  8. Methods of Optimization and Systems Analysis for Problems of Transcomputational Complexity

    CERN Document Server

    Sergienko, Ivan V

    2012-01-01

    This work presents lines of investigation and scientific achievements of the Ukrainian school of optimization theory and adjacent disciplines. These include the development of approaches to mathematical theories, methodologies, methods, and application systems for the solution of applied problems in economy, finances, energy saving, agriculture, biology, genetics, environmental protection, hardware and software engineering, information protection, decision making, pattern recognition, self-adapting control of complicated objects, personnel training, etc. The methods developed include sequentia

  9. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    Science.gov (United States)

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids together with a battery of related non-polar and low-molecular-mass compounds may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the different compositions of the analytical matrix and interfering compounds, and therefore proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that were recently applied for quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on the experimental papers that were published within the last two years, when a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning determination of hopanoids and related low-molecular-mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves

  10. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  11. Dynamic Modeling as a Cognitive Regulation Scaffold for Developing Complex Problem-Solving Skills in an Educational Massively Multiplayer Online Game Environment

    Science.gov (United States)

    Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor

    2011-01-01

    Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving…

  12. Model of geophysical fields representation in problems of complex correlation-extreme navigation

    Directory of Open Access Journals (Sweden)

    Volodymyr KHARCHENKO

    2015-09-01

    Full Text Available A model of the optimal representation of spatial data for the task of complex correlation-extreme navigation is developed, based on the criterion of minimum deviation between the correlation functions of the original and the resulting fields. Calculations are presented for the one-dimensional case using an approximation of the correlation function by a Fourier series. It is shown that, in the presence of different geophysical map data fields, their representation is possible by a single template with optimal sampling, without distorting the form of the correlation functions.
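
    As a rough one-dimensional illustration of the ingredients mentioned above (the synthetic field, the truncation lengths and the RMS deviation measure are all assumptions for the example), the sketch below approximates the autocorrelation function of a profile by a truncated Fourier series and reports how the deviation of the correlation function shrinks as more harmonics are kept.

```python
import numpy as np

rng = np.random.default_rng(3)

# A synthetic 1-D geophysical profile (smoothed noise stands in for real map data).
n = 512
field = np.convolve(rng.normal(size=n), np.ones(16) / 16, mode="same")

def autocorrelation(x):
    """Biased sample autocorrelation, normalised to 1 at zero lag."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf / acf[0]

def fourier_approximation(f, n_harmonics):
    """Keep only the first n_harmonics Fourier coefficients of f."""
    coeffs = np.fft.rfft(f)
    coeffs[n_harmonics:] = 0.0
    return np.fft.irfft(coeffs, n=f.size)

acf = autocorrelation(field)
for harmonics in (4, 8, 16, 32):
    approx = fourier_approximation(acf, harmonics)
    deviation = np.sqrt(np.mean((acf - approx) ** 2))
    print(f"{harmonics:3d} harmonics: RMS deviation of the correlation function = {deviation:.4f}")
```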

  13. Problems concerning the parenteral nutrition within the complex therapy of radiation injuries of the intestine

    International Nuclear Information System (INIS)

    Sloventantor, V.Yu.; Kurpesheva, A.K.; Kaplan, M.A.; Bardychev, M.S.; Khmelevskij, Ya.M.

    1982-01-01

    The treatment results of 52 patients with radiation enterocolitis and rectosygmoiditis are reported. The complex therapy included a partial or a complete parenteral nutrition according to the indication. The treatment caused an improvement in 86.7% of the cases, no changes in 5.7% and a deterioration of the condition in 7.6%. The additional nutritive therapy rendered it possible to hold the cell mass of the body constant and to decrease the protein losses of the gastrointestinal tract significantly. (author)

  14. Absenteeism- a complex problem: A study on absenteeism in Trondheim’s nursing homes

    OpenAIRE

    Evans, Josiane

    2011-01-01

    Absenteeism is a community problem when one looks at the amount of money spent because of it, an amount that could have been used on other more important matters. For this thesis I set out to study absenteeism in nursing homes here in my town, Trondheim. I wanted to study the nursing homes with relatively low absenteeism rate and the nursing homes with relatively high absenteeism rate to see if I could find differences that could explain the differences in their absenteeism rates. Interviews ...

  15. Class II malocclusion with complex problems treated with a novel combination of lingual orthodontic appliances and lingual arches.

    Science.gov (United States)

    Yanagita, Takeshi; Nakamura, Masahiro; Kawanabe, Noriaki; Yamashiro, Takashi

    2014-07-01

    This case report describes a novel method of combining lingual appliances and lingual arches to control horizontal problems. The patient, who was 25 years of age at her first visit to our hospital with a chief complaint of crooked anterior teeth, was diagnosed with skeletal Class II and Angle Class II malocclusion with anterior deep bite, lateral open bite, premolar crossbite, and severe crowding in both arches. She was treated with premolar extractions and temporary anchorage devices. Conventionally, it is ideal to use labial brackets simultaneously with appliances, such as a lingual arch, a quad-helix, or a rapid expansion appliance, in patients with complex problems requiring horizontal, anteroposterior, and vertical control; however, this patient strongly requested orthodontic treatment with lingual appliances. A limitation of lingual appliances is that they cannot be used with other conventional appliances. In this report, we present the successful orthodontic treatment of a complex problem using modified lingual appliances that enabled combined use of a conventional lingual arch. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  16. Traveling salesman problems with PageRank Distance on complex networks reveal community structure

    Science.gov (United States)

    Jiang, Zhongzhou; Liu, Jing; Wang, Shuai

    2016-12-01

    In this paper, we propose a new algorithm for community detection problems (CDPs) based on traveling salesman problems (TSPs), labeled as TSP-CDA. Since TSPs need to find a tour with minimum cost, cities close to each other are usually clustered in the tour. This inspired us to model CDPs as TSPs by taking each vertex as a city. Then, in the final tour, the vertices in the same community tend to cluster together, and the community structure can be obtained by cutting the tour into a couple of paths. There are two challenges. The first is to define a suitable distance between each pair of vertices which can reflect the probability that they belong to the same community. The second is to design a suitable strategy to cut the final tour into paths which can form communities. In TSP-CDA, we deal with these two challenges by defining a PageRank Distance and an automatic threshold-based cutting strategy. The PageRank Distance is designed with the intrinsic properties of CDPs in mind, and can be calculated efficiently. In the experiments, benchmark networks with 1000-10,000 nodes and varying structures are used to test the performance of TSP-CDA. A comparison is also made between TSP-CDA and two well-established community detection algorithms. The results show that TSP-CDA can find accurate community structure efficiently and outperforms the two existing algorithms.
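
    The sketch below illustrates the flavour of the approach on a tiny graph with two planted communities; the personalised-PageRank-based distance and the greedy nearest-neighbour tour are stand-ins chosen for brevity, and neither the paper's exact PageRank Distance definition nor its TSP solver is reproduced.

```python
import networkx as nx

# Two planted communities joined by a single bridge edge.
g = nx.Graph()
g.add_edges_from([(0, 1), (0, 2), (1, 2), (2, 3),          # community A
                  (3, 4),                                   # bridge
                  (4, 5), (4, 6), (5, 6), (6, 7), (5, 7)])  # community B

def pagerank_distance(graph, u, v):
    """Symmetric distance built from personalised PageRank (an assumed stand-in for
    the paper's PageRank Distance): nearby vertices give each other high scores."""
    pr_u = nx.pagerank(graph, personalization={u: 1.0})
    pr_v = nx.pagerank(graph, personalization={v: 1.0})
    return 1.0 / (pr_u[v] + pr_v[u])

nodes = list(g.nodes)
# Greedy nearest-neighbour tour as a cheap TSP surrogate.
tour, remaining = [nodes[0]], set(nodes[1:])
while remaining:
    last = tour[-1]
    nxt = min(remaining, key=lambda v: pagerank_distance(g, last, v))
    tour.append(nxt)
    remaining.remove(nxt)

# Cut the single most expensive step of the tour to split it into two communities.
gaps = [pagerank_distance(g, a, b) for a, b in zip(tour, tour[1:])]
cut = gaps.index(max(gaps)) + 1
print("tour:", tour)
print("communities:", sorted(tour[:cut]), sorted(tour[cut:]))
```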

  17. A low complexity based spectrum management algorithm for ‘Near–Far’ problem in VDSL environment

    Directory of Open Access Journals (Sweden)

    Sunil Sharma

    2015-10-01

    Full Text Available In digital subscriber line (DSL) systems, crosstalk created by electromagnetic interference among twisted pairs degrades system performance. Very-high-bit-rate DSL (VDSL) utilizes the higher bandwidth of the copper cable for data transmission. During upstream transmission, a 'Near–Far' problem occurs in VDSL systems: the far-end crosstalk (FEXT) produced by the near-end user degrades the data rate achieved by the far-end user. The effect of FEXT can be reduced by properly managing the power spectral densities (PSD) of the transmitters of the near and far users. This kind of power allocation is called dynamic spectrum management (DSM). In this paper, a new distributed DSM algorithm is proposed in which power is reduced only on those subchannels of the near-end user that create interference for the far-end user. This power back-off strategy is implemented with the help of power spectral density (PSD) masks on the interference-creating subchannels of the near-end user. The simulation results of the proposed algorithm show an improvement in data rate, approaching that of the optimal spectrum balancing (OSB) algorithm.
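
    A toy numerical sketch of the selective power back-off idea follows; the channel gains, noise floor, PSD levels and 10 dB back-off are invented values used only to show how masking the interference-creating tones of the near-end user raises the far-end rate.

```python
import numpy as np

rng = np.random.default_rng(7)
n_tones = 64
noise = 1e-12                                   # background noise power per tone (toy value)

# Toy upstream channel: the far-end user has the weaker direct channel (longer loop),
# and the near-end user couples into it through FEXT.
far_direct = 1e-4
fext_coupling = 10 ** rng.uniform(-6, -3, n_tones)

p_near = np.full(n_tones, 1e-6)                 # flat near-end transmit PSD (toy value)
p_far = np.full(n_tones, 1e-6)                  # flat far-end transmit PSD (toy value)

def far_end_rate(near_psd):
    """Far-end upstream rate (bits per DMT symbol) with FEXT from the near-end user."""
    interference = noise + fext_coupling * near_psd
    return float(np.sum(np.log2(1.0 + far_direct * p_far / interference)))

print("far-end rate before back-off:", round(far_end_rate(p_near), 1))

# Selective PSD mask: back off near-end power by 10 dB only on tones whose FEXT
# contribution dominates the far-end user's noise floor.
backoff_needed = fext_coupling * p_near > 10 * noise
masked_psd = np.where(backoff_needed, 0.1 * p_near, p_near)

print("far-end rate after back-off: ", round(far_end_rate(masked_psd), 1))
print("tones backed off:", int(backoff_needed.sum()), "of", n_tones)
```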

  18. Application of the decoupling scheme on complex neutron-gamma shielding problems

    Energy Technology Data Exchange (ETDEWEB)

    Feher, S. [Institute of Nuclear Technology, Technical University of Budapest, Budapest (Hungary); Leege, P.F.A. de; Hoogenboom, J.E.; Kloosterman, J.L. [Interfaculty Reactor Institute, Delft University of Technology, Delft (Netherlands)

    2000-03-01

    Coupled neutron-gamma shielding calculations using S_n transport theory can be time consuming, especially for two- and three-dimensional geometries. In general, the CPU time of these calculations increases more than linearly with an increasing number of neutron and gamma energy groups, and depends on the order of the Legendre expansion and the number of S_n directions used. This fact led to the idea of the decoupling method, which seems applicable for accelerating coupled neutron-gamma shielding calculations. The data included in a combined neutron-gamma library can be readily separated into a library containing neutron data only and another library containing gamma data only. Separate calculations for neutrons and gammas are performed on complex geometries, using different Legendre expansion orders for neutrons and gammas. CPU savings of 60 to 85% can be achieved for the two-dimensional DORT and three-dimensional TORT calculations, respectively. (author)

  19. Design problems of social-administrative complexes on the example of the Morcinek mine

    Energy Technology Data Exchange (ETDEWEB)

    Bielski, M.; Trojanowski, S.

    1987-01-01

    Buildings at the Morcinek mine head are designed as four complexes: the lamp room of 28,850 m/sup 3/ for 6,440 lamps; the washing room block of 92,491 m/sup 3/ for 7,623 miners (including mine operation offices, control room, mine rescue station, laundry and canteen); administration and social services block of 47,645 m/sup 3/ (mine management, telephone exchange, dispatcher room, health services, rooms for training and social organizations, snack bar); shaft landing and waiting room block of 12,080 m/sup 3/ (transportation, bus depot, parking). The buildings are built as frame type structures. Reinforced concrete is used for frames up to 6 m and steel for the larger ones. Prefabricated reinforced concrete floors and skirt type walls of glass panels, bricks or prefabricated materials are incorporated. The multi-storey buildings are built on 'Franki' type piles.

  20. Using the complex Langevin equation to solve the sign problem of QCD

    Energy Technology Data Exchange (ETDEWEB)

    Sexty, Denes [Bergische Univ. Wuppertal (Germany)

    2016-11-01

    Using the resources of SuperMUC we have been able to calculate the reweighting results and compare them to the CLE for lattice sizes up to Nt=8. This did not allow the exploration of the phase transition line. It is an open question whether increasing the lattice size will allow us to go to smaller temperatures. The cost of larger lattices is of course increasing; in particular, the reweighting becomes much more expensive at larger volumes, as its cost is proportional to the cube of the spatial volume. Another important open question is the question of the poles: the fermionic drift term has singularities on the complex manifold, which in some cases can lead to the breakdown of the method, but it is unknown what their effect is on QCD, especially at low temperatures.
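
    As background for readers unfamiliar with the method, the sketch below runs a complex Langevin update for a one-variable toy model with a complex Gaussian action (not lattice QCD); for this model the estimated ⟨z²⟩ should agree with the exact value 1/a despite the complex weight.

```python
import numpy as np

rng = np.random.default_rng(42)

a = 1.0 + 1.0j           # complex coefficient of the toy action S(z) = a * z**2 / 2
dt = 1e-3                # Langevin step size
n_steps = 400_000
thermalization = 50_000

z = 0.0 + 0.0j
acc = 0.0 + 0.0j
count = 0
for step in range(n_steps):
    drift = -a * z                                          # -dS/dz, complexified
    z = z + dt * drift + np.sqrt(2.0 * dt) * rng.normal()   # real Gaussian noise only
    if step >= thermalization:
        acc += z * z
        count += 1

print("complex Langevin <z^2> =", acc / count)
print("exact result     <z^2> =", 1.0 / a)
```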

  1. Integrating water and agricultural management: collaborative governance for a complex policy problem.

    Science.gov (United States)

    Fish, Rob D; Ioris, Antonio A R; Watson, Nigel M

    2010-11-01

    This paper examines governance requirements for integrating water and agricultural management (IWAM). The institutional arrangements for the agriculture and water sectors are complex and multi-dimensional, and integration cannot therefore be achieved through a simplistic 'additive' policy process. Effective integration requires the development of a new collaborative approach to governance that is designed to cope with scale dependencies and interactions, uncertainty and contested knowledge, and interdependency among diverse and unequal interests. When combined with interdisciplinary research, collaborative governance provides a viable normative model because of its emphasis on reciprocity, relationships, learning and creativity. Ultimately, such an approach could lead to the sorts of system adaptations and transformations that are required for IWAM. Copyright © 2009 Elsevier B.V. All rights reserved.

  2. QDPSO applied to the complex problem optimization of the nuclear engineering

    International Nuclear Information System (INIS)

    Nicolau, Andressa dos Santos; Schirru, Roberto

    2013-01-01

    The purpose of this article is to show the performance of different quantum-inspired algorithms as optimization tools for the diagnosis system of a Brazilian nuclear power plant operating at 100% of full power. The algorithms implemented in this study were Quantum Delta-Potential-Well-based Particle Swarm Optimization (QDPSO), Quantum Swarm Evolutionary (QSE) and the Quantum Evolutionary Algorithm (QEA). Both QDPSO and QSE are inspired by the philosophy of 'collective learning' of Particle Swarm Optimization (PSO) but use different theories of quantum mechanics to govern the motion of the particles. On the other hand, QEA is inspired by the philosophy of 'population evolution' of the Genetic Algorithm and uses the main concepts of Quantum Computation. The results show that only QDPSO and QEA achieved the best result for the problem. Moreover, QDPSO is faster than QEA in terms of convergence speed. (author)

  3. QDPSO applied to the complex problem optimization of the nuclear engineering

    Energy Technology Data Exchange (ETDEWEB)

    Nicolau, Andressa dos Santos; Schirru, Roberto, E-mail: andressa@lmp.ufrj.br, E-mail: schirru@lmp.ufrj.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2013-07-01

    The purpose of this article is to show the performance of different approaches of quantum-inspired algorithms as optimization tools for the diagnosis system of a Brazilian nuclear power plant operating at 100% of full power. The algorithms implemented in this study were Quantum Delta-Potential-Well-based Particle Swarm Optimization (QDPSO), Quantum Swarm Evolutionary (QSE) and Quantum Evolutionary Algorithm (QEA). Both QDPSO and QSE are inspired by the philosophy of 'collective learning' of Particle Swarm Optimization (PSO) but use different theories of quantum mechanics to govern the motion of the particles. On the other hand, QEA is inspired by the philosophy of 'population evolution' of the Genetic Algorithm and uses the main concepts of Quantum Computation. The results show that only QDPSO and QEA achieve the best result for the problem. Moreover, QDPSO converges faster than QEA. (author)

  4. Effective algorithm for solving complex problems of production control and of material flows control of industrial enterprise

    Science.gov (United States)

    Mezentsev, Yu A.; Baranova, N. V.

    2018-05-01

    A universal economic and mathematical model for determining optimal strategies for managing the production and logistics subsystems (and their components) of an enterprise is considered. The declared universality makes it possible to account, at the system level, both for production components, including limitations on the ways of converting raw materials and components into sold goods, and for resource and logical restrictions on input and output material flows. The model and the generated control problems are developed within a unified framework that allows logical conditions of any complexity to be implemented and the corresponding formal optimization tasks to be defined. The conceptual meaning of the criteria and constraints used is explained. The generated mixed-programming tasks are shown to belong to the class of NP problems. An approximate polynomial algorithm is proposed for solving the posed mixed-programming optimization tasks of realistic dimension and high computational complexity. Results of testing the algorithm on tasks over a wide range of dimensions are presented.

  5. A Different Trolley Problem: The Limits of Environmental Justice and the Promise of Complex Moral Assessments for Transportation Infrastructure.

    Science.gov (United States)

    Epting, Shane

    2016-12-01

    Transportation infrastructure tremendously affects the quality of life for urban residents, influences public and mental health, and shapes social relations. Historically, the topic is rich with social and political controversy and the resultant transit systems in the United States cause problems for minority residents and issues for the public. Environmental justice frameworks provide a means to identify and address harms that affect marginalized groups, but environmental justice has limits that cannot account for the mainstream population. To account for this condition, I employ a complex moral assessment measure that provides a way to talk about harms that affect the public.

  6. Approximate solutions for the two-dimensional integral transport equation. Solution of complex two-dimensional transport problems

    International Nuclear Information System (INIS)

    Sanchez, Richard.

    1980-11-01

    This work is divided into two parts: the first part deals with the solution of complex two-dimensional transport problems, the second one (note CEA-N-2166) treats the critically mixed methods of resolution. A set of approximate solutions for the isotropic two-dimensional neutron transport problem has been developed using the interface current formalism. The method has been applied to regular lattices of rectangular cells containing a fuel pin, cladding, and water, or homogenized structural material. The cells are divided into zones that are homogeneous. A zone-wise flux expansion is used to formulate a direct collision probability problem within a cell. The coupling of the cells is effected by making extra assumptions on the currents entering and leaving the interfaces. Two codes have been written: CALLIOPE uses a cylindrical cell model and one or three terms for the flux expansion, and NAUSICAA uses a two-dimensional flux representation and does a truly two-dimensional calculation inside each cell. In both codes, one or three terms can be used to make a space-independent expansion of the angular fluxes entering and leaving each side of the cell. The accuracies and computing times achieved with the different approximations are illustrated by numerical studies on two benchmark problems and by calculations performed in the APOLLO multigroup code [fr

  7. Age and sex effects on human mutation rates. An old problem with new complexities

    International Nuclear Information System (INIS)

    Crow, James F.

    2006-01-01

    Base substitution mutations are far more common in human males than in females, and the frequency increases with paternal age. Both can be accounted for by the greater number of pre-meiotic cell divisions in males, especially old ones. In contrast, small deletions do not show any important age effect and occur with approximately equal frequency in the two sexes. Mutations in most genes include both types, and the sex and paternal age effect depends on the proportion of the two types. A few traits, of which Apert Syndrome is best understood, are mutation hot spots with all the mutations occurring in one or two codons, usually at one nucleotide. They occur with very high frequency almost exclusively in males and the frequency increases rapidly with paternal age. It has been suggested that the mutant cells have a selective advantage in the male germ-line prior to meiosis. Evidence for this surprising, but important, hypothesis is discussed. A possible mechanism is the conversion of asymmetrical stem-cell divisions into symmetric ones. Some traits with complex etiology show a slight paternal age effect. There is also a short discussion of the high deleterious mutation rate and the role of sexual reproduction in reducing the consequent mutation load. (author)

  8. Detailed Simulation of Complex Hydraulic Problems with Macroscopic and Mesoscopic Mathematical Methods

    Directory of Open Access Journals (Sweden)

    Chiara Biscarini

    2013-01-01

    The numerical simulation of fast-moving fronts originating from dam or levee breaches is a challenging task for small scale engineering projects. In this work, the use of fully three-dimensional Navier-Stokes (NS) equations and lattice Boltzmann method (LBM) is proposed for testing the validity of, respectively, macroscopic and mesoscopic mathematical models. Macroscopic simulations are performed employing an open-source computational fluid dynamics (CFD) code that solves the NS combined with the volume of fluid (VOF) multiphase method to represent free-surface flows. The mesoscopic model is a front-tracking experimental variant of the LBM. In the proposed LBM the air-gas interface is represented as a surface with zero thickness that handles the passage of the density field from the light to the dense phase and vice versa. A single set of LBM equations represents the liquid phase, while the free surface is characterized by an additional variable, the liquid volume fraction. Case studies show advantages and disadvantages of the proposed LBM and NS with specific regard to the computational efficiency and accuracy in dealing with the simulation of flows through complex geometries. In particular, the validation of the model application is developed by simulating the flow propagating through a synthetic urban setting and comparing results with analytical and experimental laboratory measurements.
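
    For reference, the mesoscopic ingredient mentioned above is, in its most common single-relaxation-time (BGK) form, the textbook lattice Boltzmann update written below; this is the generic formulation, not the specific front-tracking free-surface variant proposed in the paper.

    ```latex
    % Generic lattice-BGK update (textbook form, stated here only for reference)
    f_i(\mathbf{x} + \mathbf{c}_i \,\Delta t,\; t + \Delta t)
      = f_i(\mathbf{x}, t)
      - \frac{\Delta t}{\tau}\,\bigl[f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t)\bigr],
    \qquad
    \rho = \sum_i f_i, \quad \rho\,\mathbf{u} = \sum_i f_i\,\mathbf{c}_i .
    ```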

  9. The manual of strategic economic decision making using Bayesian belief networks to solve complex problems

    CERN Document Server

    Grover, Jeff

    2016-01-01

    This book is an extension of the author’s first book and serves as a guide and manual on how to specify and compute 2-, 3-, & 4-Event Bayesian Belief Networks (BBN). It walks the learner through the steps of fitting and solving fifty BBN numerically, using mathematical proof. The author wrote this book primarily for naïve learners and professionals, with a proof-based academic rigor. The author's first book on this topic, a primer introducing learners to the basic complexities and nuances associated with learning Bayes’ theory and inverse probability for the first time, was meant for non-statisticians unfamiliar with the theorem - as is this book. This new book expands upon that approach and is meant to be a prescriptive guide for building BBN and executive decision-making for students and professionals; intended so that decision-makers can invest their time and start using this inductive reasoning principle in their decision-making processes. It highlights the utility of an algorithm that served as ...
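
    The computational core of such belief networks is the repeated application of Bayes' theorem. A minimal two-event example is sketched below; the prior and likelihood values are invented purely for illustration and are not taken from the book.

    ```python
    def bayes_posterior(prior_h, p_e_given_h, p_e_given_not_h):
        """Posterior P(H | E) for a two-event Bayesian belief network fragment."""
        # law of total probability for the evidence E
        p_e = p_e_given_h * prior_h + p_e_given_not_h * (1.0 - prior_h)
        return p_e_given_h * prior_h / p_e

    # Hypothetical numbers: P(H) = 0.30, P(E|H) = 0.80, P(E|~H) = 0.10
    posterior = bayes_posterior(0.30, 0.80, 0.10)
    print(f"P(H | E) = {posterior:.3f}")   # 0.24 / (0.24 + 0.07) ≈ 0.774
    ```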

  10. Reducing the Complexity of Genetic Fuzzy Classifiers in Highly-Dimensional Classification Problems

    Directory of Open Access Journals (Sweden)

    Dimitris G. Stavrakoudis

    2012-04-01

    This paper introduces the Fast Iterative Rule-based Linguistic Classifier (FaIRLiC), a Genetic Fuzzy Rule-Based Classification System (GFRBCS) which targets reducing the structural complexity of the resulting rule base, as well as its learning algorithm's computational requirements, especially when dealing with high-dimensional feature spaces. The proposed methodology follows the principles of the iterative rule learning (IRL) approach, whereby a rule extraction algorithm (REA) is invoked in an iterative fashion, producing one fuzzy rule at a time. The REA is performed in two successive steps: the first one selects the relevant features of the currently extracted rule, whereas the second one decides the antecedent part of the fuzzy rule, using the previously selected subset of features. The performance of the classifier is finally optimized through a genetic tuning post-processing stage. Comparative results in a hyperspectral remote sensing classification as well as in 12 real-world classification datasets indicate the effectiveness of the proposed methodology in generating high-performing and compact fuzzy rule-based classifiers, even for very high-dimensional feature spaces.

  11. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  12. Close contacts at the interface: Experimental-computational synergies for solving complexity problems

    Science.gov (United States)

    Torras, Juan; Zanuy, David; Bertran, Oscar; Alemán, Carlos; Puiggalí, Jordi; Turón, Pau; Revilla-López, Guillem

    2018-02-01

    The study of materials science has long been devoted to the disentanglement of bulk structures, which mainly entails finding the inner structure of materials. That structure accounts for a major portion of materials' properties. Yet, as our knowledge of these "backbones" grew, so did the interest in the properties of materials' boundaries, that is, the properties at the frontier with the surrounding environment, called the interface. The interface is thus to be understood as the sum of the material's surface plus the surrounding environment, be it in solid, liquid or gas phase. The study of phenomena at this interface requires both experimental and theoretical techniques and, above all, a wise combination of them in order to shed light on the most intimate details at the atomic, molecular and mesostructure levels. Here, we report several cases as proof of concept of the results achieved when studying interface phenomena by combining a myriad of experimental and theoretical tools to overcome the usual limitations regarding atomic detail, size and time scales, and systems of complex composition. Real-world examples of the combined experimental-theoretical work, and new tools (software), are offered to the reader.

  13. On Advice Complexity of the k-server Problem under Sparse Metrics

    DEFF Research Database (Denmark)

    Gupta, S.; Kamali, S.; López-Ortiz, A.

    2013-01-01

    We consider the k-Server problem under the advice model of computation when the underlying metric space is sparse. On one side, we introduce Θ(1)-competitive algorithms for a wide range of sparse graphs, which require advice of (almost) linear size. Namely, we show that for graphs of size N and treewidth α, there is an online algorithm which receives O(n(log α + log log N)) bits of advice and optimally serves a sequence of length n. With a different argument, we show that if a graph admits a system of μ collective tree (q, r)-spanners, then there is a (q + r)-competitive algorithm which receives O(n(log μ + log log N)) bits of advice. Among other results, this gives a 3-competitive algorithm for planar graphs, provided with O(n log log N) bits of advice. On the other side, we show that an advice of size Ω(n) is required to obtain a 1-competitive algorithm for sequences of size n, even …

  14. Stereoselectivity of Mucorales lipases toward triradylglycerols--a simple solution to a complex problem.

    Science.gov (United States)

    Scheib, H.; Pleiss, J.; Kovac, A.; Paltauf, F.; Schmid, R. D.

    1999-01-01

    The lipases from Rhizopus and Rhizomucor are members of the family of Mucorales lipases. Although they display high sequence homology, their stereoselectivity toward triradylglycerols (sn-2 substituted triacylglycerols) varies. Four different triradylglycerols were investigated, which were classified into two groups: flexible substrates with rotatable O'-C1' ether or ester bonds adjacent to C2 of glycerol and rigid substrates with a rigid N'-C1' amide bond or a phenyl ring in sn-2. Although Rhizopus lipase shows opposite stereopreference for flexible and rigid substrates (hydrolysis in sn-1 and sn-3, respectively), Rhizomucor lipase hydrolyzes both groups of triradylglycerols preferably in sn-1. To explain these experimental observations, computer-aided molecular modeling was applied to study the molecular basis of stereoselectivity. A generalized model for both lipases of the Mucorales family highlights the residues mediating stereoselectivity: (1) L258, the C-terminal neighbor of the catalytic histidine, and (2) G266, which is located in a loop contacting the glycerol backbone of a bound substrate. Interactions with triradylglycerol substrates are dominated by van der Waals contacts. Stereoselectivity can be predicted by analyzing the value of a single substrate torsion angle that discriminates between sn-1 and sn-3 stereopreference for all substrates and lipases investigated here. This simple model can be easily applied in enzyme and substrate engineering to predict Mucorales lipase variants and synthetic substrates with desired stereoselectivity. PMID:10210199

  15. Metallic materials for the hydrogen energy industry and main gas pipelines: complex physical problems of aging, embrittlement, and failure

    International Nuclear Information System (INIS)

    Nechaev, Yu S

    2008-01-01

    The possibilities of effective solutions of relevant technological problems are considered based on the analysis of fundamental physical aspects, elucidation of the micromechanisms and interrelations of aging and hydrogen embrittlement of materials in the hydrogen industry and gas-main industries. The adverse effects these mechanisms and processes have on the service properties and technological lifetime of materials are analyzed. The concomitant fundamental process of formation of carbohydride-like and other nanosegregation structures at dislocations (with the segregation capacity 1 to 1.5 orders of magnitude greater than in the widely used Cottrell 'atmosphere' model) and grain boundaries is discussed, as is the way in which these structures affect technological processes (aging, hydrogen embrittlement, stress corrosion damage, and failure) and the physicomechanical properties of the metallic materials (including the technological lifetimes of pipeline steels). (reviews of topical problems)

  16. Technological problems concerning the complex recovery of uranium and accompanying elements from sedimentary ores

    International Nuclear Information System (INIS)

    Pinkas, K.

    1977-01-01

    In Poland a deposit of carbonaceous clay shales has been discovered; it contains 1600 ppm V, 100 ppm U and 180 ppm Mo. Laboratory-scale experiments have shown that leaching of the shales with diluted solutions of sulphuric acid or sodium carbonates does not ensure high recovery of vanadium and uranium, because these elements occur in the shales in refractory forms. Treatment of the shales with concentrated sulphuric acid (250 g per 1 kg of shale) according to the 'acid cure' method, followed by baking at 250 °C, has permitted the recovery of 70% of the vanadium and 65% of the uranium. From the acid leaching residue, or from the shales directly, 70% of the molybdenum can be recovered using an alkaline pretreatment. The solutions after acid leaching contain large quantities of Al and Fe, which must be partly removed before the separation of U and V by solvent extraction. The tests performed have confirmed this, and by means of a crystallization process, aluminium and iron sulphates have been obtained as by-products. From the solutions remaining after crystallization, uranium and vanadium concentrates have been recovered by amine solvent extraction. The technological method developed so far is judged to be difficult and expensive. In order to utilize these low-grade shales, which are very refractory to pretreatment, more economically, it is necessary to continue intensive technological research on improving the present method and to explore new routes that could contribute to a successful solution of this complicated technological problem.

  17. Next generation of novel psychoactive substances on the horizon - A complex problem to face.

    Science.gov (United States)

    Zawilska, Jolanta B; Andrzejczak, Dariusz

    2015-12-01

    The last decade has seen a rapid and continuous growth in the availability and use of novel psychoactive substances (NPS) across the world. Although various products are labeled with warnings "not for human consumption", they are intended to mimic psychoactive effects of illicit drugs of abuse. Once some compounds become regulated, new analogues appear in order to satisfy consumers' demands and at the same time to avoid criminalization. This review presents updated information on the second generation of NPS, introduced as replacements of the already banned substances from this class, focusing on their pharmacological properties and metabolism, routes of administration, and effects in humans. Literature search, covering years 2013-2015, was performed using the following keywords alone or in combination: "novel psychoactive substances", "cathinones", "synthetic cannabinoids", "benzofurans", "phenethylamines", "2C-drugs", "NBOMe", "methoxetamine", "opioids", "toxicity", and "metabolism". More than 400 NPS have been reported in Europe, with 255 detected in 2012-2014. The most popular are synthetic cannabimimetics and psychostimulant cathinones; use of psychedelics and opioids is less common. Accumulating experimental and clinical data indicate that potential harms associated with the use of second generation NPS could be even more serious than those described for the already banned drugs. NPS are constantly emerging on the illicit drug market and represent an important health problem. A significant amount of research is needed in order to fully quantify both the short and long term effects of the second generation NPS, and their interaction with other drugs of abuse. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. A new differential calculus on a complex Banach space with application to variational problems of quantum theory

    International Nuclear Information System (INIS)

    Sharma, C.S.; Rebelo, I.

    1975-01-01

    It is proved that a semilinear function on a complex Banach space is not differentiable according to the usual definition of differentiability in the calculus on Banach spaces. It is shown that this result makes the calculus largely inapplicable to the solution of variational problems of quantum mechanics. A new concept of differentiability, called semidifferentiability, is defined. This generalizes the standard concept of differentiability in a Banach space, and the resulting calculus is particularly suitable for optimizing real-valued functions on a complex Banach space and is directly applicable to the solution of quantum mechanical variational problems. As an example of such an application, a rigorous proof of a generalized version of a result due to Sharma (J. Phys. A 2:413 (1969)) is given. In the course of this work a new concept of prelinearity is defined, and some standard results in the calculus on Banach spaces are extended and generalized into more powerful ones applicable directly to prelinear functions, yielding the standard results for linear functions as particular cases. (author)
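
    The obstruction can already be seen in one complex dimension: an antilinear (semilinear) map such as conjugation has no complex derivative, because the difference quotient depends on the direction of approach. This is a standard textbook illustration, not the construction used in the paper.

    ```latex
    % Conjugation f(z) = \bar{z} on \mathbb{C} is semilinear but not complex differentiable:
    \frac{f(z+h) - f(z)}{h} = \frac{\bar{h}}{h} =
    \begin{cases}
      \;\;1, & h \to 0 \text{ along the real axis},\\
      -1,    & h \to 0 \text{ along the imaginary axis},
    \end{cases}
    \qquad \text{so the limit does not exist.}
    ```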

  19. A systems view of the problem of operational efficiency of enterprises of the Russian chemical complex

    Directory of Open Access Journals (Sweden)

    Svyatoslav Arkadyevich Nikitin

    2011-06-01

    The chemical industry plays an important role in the development of the domestic economy as one of the basic sectors of Russia's economy, laying the foundation for its long-term and stable development. As a major supplier of raw materials, intermediates, and various products (plastics, chemical fibers, tires, paints and varnishes, dyes, fertilizers, feed additives, pharmaceuticals, medical equipment, etc.) to almost all sectors of industry, agriculture, health care, human services, commerce, science, culture and education, and the defense industry, the chemical complex has a direct impact on the efficiency of their operation and their development in new directions. Therefore, the condition and development of domestic chemistry determine the level of national competitiveness, economic growth and Russia's wealth. However, like most industries in Russia today, the chemical industry is going through a difficult period. A set of common economic problems (technological backwardness and high depreciation of assets, low innovation activity of domestic enterprises of the chemical complex, a lack of effectiveness of the investment process, infrastructure and resource constraints, etc.), as well as internal management problems, causes the rapid growth of the share of uncompetitive Russian chemical products on the world market. Under these conditions, not only a radical adjustment of the internal control systems of chemical plants, but also significant organizational and economic change is required. Thus, unless measures are taken to improve the domestic chemical industry in the coming years, almost all of it may fall behind and find itself struggling for survival.

  20. Developing Seventh Grade Students' Understanding of Complex Environmental Problems with Systems Tools and Representations: a Quasi-experimental Study

    Science.gov (United States)

    Doganca Kucuk, Zerrin; Saysel, Ali Kerem

    2017-03-01

    A systems-based classroom intervention on environmental education was designed for seventh grade students; the results were evaluated to see its impact on the development of systems thinking skills and standard science achievement and whether the systems approach is a more effective way to teach environmental issues that are dynamic and complex. A quasi-experimental methodology was used to compare performances of the participants in various dimensions, including systems thinking skills, competence in dynamic environmental problem solving and success in science achievement tests. The same pre-, post- and delayed tests were used with both the comparison and experimental groups in the same public middle school in Istanbul. Classroom activities designed for the comparison group (N = 20) followed the directives of the Science and Technology Curriculum, while the experimental group (N = 22) covered the same subject matter through activities benefiting from systems tools and representations such as behaviour over time graphs, causal loop diagrams, stock-flow structures and hands-on dynamic modelling. After a one-month systems-based instruction, the experimental group demonstrated significantly better systems thinking and dynamic environmental problem solving skills. Achievement in dynamic problem solving was found to be relatively stable over time. However, standard science achievement did not improve at all. This paper focuses on the quantitative analysis of the results, the weaknesses of the curriculum and educational implications.

  1. Ultraviolet radiation-induced mutability of uvrD3 strains of Escherichia coli B/r and K-12: a problem in analyzing mutagenesis data

    International Nuclear Information System (INIS)

    Smith, K.C.

    1976-01-01

    The involvement of the uvrD gene product in UV-induced mutagenesis in Escherichia coli was studied by comparing wild-type and uvrA or uvrB strains with their uvrD derivatives in B/r and K-12(W3110) backgrounds. Mutations per survivor (reversions to prototrophy) were compared as a function of surviving fraction and of UV fluence. While recognizing that both methods are not without problems, arguments are presented for favoring the former rather than the latter method of presenting the data when survival is less than 100%. When UV-induced mutation frequencies were plotted as a function of surviving fraction, the uvrD derivatives were less mutable than the corresponding parent strains. The B/r strains exhibited higher mutation frequencies than did the K-12(W3110) strains. A uvrB mutation increased the mutation frequency of its parental K-12 strain, but a uvrA mutation only increased the mutation frequency of its parental B/r strain at UV survivals greater than approximately 80%. Both the uvrA and uvrB mutations increased the mutation frequencies of the uvrD strains in the B/r and K-12 backgrounds, respectively. Rather different conclusions would be drawn if mutagenesis were considered as a function of UV fluence rather than of survival, a situation that calls for further work and discussion. Ideally mutation efficiencies should be compared as a function of the number of repair events per survivor, a number that is currently unobtainable. (author)

  2. A new million-channel analyzer for complex nuclear spectroscopy studies and its application in measurements of the β decay of 149Pr

    International Nuclear Information System (INIS)

    Tenten, W.

    1978-11-01

    A million-channel analyzer with CAMAC instrumentation and PDP-11 control computer was developed and tested using the case of the β decay of 149Pr and the γ decays of 149Nd. A level scheme for 149Nd was developed. (WL) [de

  3. What is science? Thinking about doctoral Business Administration students’ perceptions analyzed from the perspective of Edgar Morin and the paradigm of Complexity

    Directory of Open Access Journals (Sweden)

    Giancarlo Dal Bo

    2015-09-01

    Discussions about the paradigms that shape science are important in order to promote reflection among researchers as to their role in society. The Cartesian-Newtonian paradigm underlies both the natural and social sciences, both of which initially adopted it, before gradually bringing it into question due to the depletion of its explanatory power for current phenomena. Some authors propose the paradigm of complexity, which would, based on the features exposed in this paper, be better suited to providing a broad understanding of the process of knowledge construction. Through literature review and quantitative research conducted with students of the Doctoral Program in Business Administration at two Higher Education Institutions in Rio Grande do Sul, this paper attempts to identify the prevailing perceptions regarding the epistemological and paradigmatic positions adopted in the sciences and challenge them with the complexity paradigm proposed by Edgar Morin.

  4. Can complex networks help us solve the problem of power outages (blackouts) in Brazil?

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Paulo Alexandre de; Souza, Thaianne Lopes de [Universidade Federal de Goias (UFG), Catalao, GO (Brazil)

    2011-07-01

    What do the Brazilian soccer championship, Hollywood actors, the Internet, the spread of viruses and the electric distribution network have in common? Until less than two decades ago, the answer would have been 'nothing' or 'almost nothing'. Today, however, the answer to the same question is 'everything' or 'almost everything'. The answer to these questions and more can be found through a sub-area of statistical physics called the science of complex networks, which has been used to approach and study the most diverse natural and non-natural systems, be they social, informational, technological or biological. In this work we study the distribution network of electric power in Brazil (DEEB) from a complex-networks perspective, in which stations and/or substations are associated with the vertices of a network and the links between the vertices are associated with the transmission lines. We also carry out a comparative study with the best-known models of complex networks, such as Erdős-Rényi, the Configuration Model and Barabási-Albert, and then compare them with results obtained for real electrical distribution networks. Based on this information, we perform a comparative analysis using the following variables, frequently used in studies of complex networks: connectivity (degree) distribution, diameter and clustering coefficient. We emphasize that the main objective of this study is to analyze the robustness of the DEEB network and then propose alternatives for network connectivity, which may contribute to increasing robustness in maintenance and/or expansion projects; in other words, our goal is to make the network resistant to blackouts or to improve its endurance against them. For this purpose, we use information on the structural properties of networks, computer modeling and simulation. (author)
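
    A comparison of the kind described can be prototyped with standard graph libraries. The sketch below contrasts an Erdős–Rényi and a Barabási–Albert graph on the metrics named in the abstract (degree distribution, diameter, clustering coefficient); the graph sizes and parameters are arbitrary placeholders, not data from the Brazilian distribution network.

    ```python
    import networkx as nx
    from collections import Counter

    def summarize(name, g):
        """Print the metrics used to compare network models: degrees, diameter, clustering."""
        degrees = [d for _, d in g.degree()]
        # diameter is only defined on a connected graph; use the largest component
        giant = g.subgraph(max(nx.connected_components(g), key=len))
        print(f"{name}: n={g.number_of_nodes()}, m={g.number_of_edges()}, "
              f"diameter(giant)={nx.diameter(giant)}, "
              f"avg clustering={nx.average_clustering(g):.3f}")
        print("  degree histogram (degree: count):",
              dict(sorted(Counter(degrees).items())[:8]), "...")

    n = 500                                   # placeholder network size
    er = nx.erdos_renyi_graph(n, p=0.02, seed=42)
    ba = nx.barabasi_albert_graph(n, m=5, seed=42)

    summarize("Erdos-Renyi    ", er)
    summarize("Barabasi-Albert", ba)
    ```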

  5. Analyzing coastal turbidity under complex terrestrial loads characterized by a 'stress connectivity matrix' with an atmosphere-watershed-coastal ocean coupled model

    Science.gov (United States)

    Yamamoto, Takahiro; Nadaoka, Kazuo

    2018-04-01

    Atmospheric, watershed and coastal ocean models were integrated to provide a holistic analysis approach for coastal ocean simulation. The coupled model was applied to a coastal ocean area in the Philippines where terrestrial sediment loads delivered from several adjacent watersheds play a major role in influencing coastal turbidity and are partly responsible for coastal ecosystem degradation. The coupled model was validated against weather and hydrologic measurements to examine its potential applicability. The results revealed that coastal water quality may be governed by loads not only from the adjacent watershed but also from distant watersheds via coastal currents. This important feature of the multiple linkages can be quantitatively characterized by a "stress connectivity matrix", which indicates the complex underlying structure of environmental stresses in the coastal ocean. The multiple stress connectivity concept shows the potential advantage of the integrated modelling approach for coastal ocean assessment, which may also compensate for the lack of measured data, especially in tropical basins.
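
    The "stress connectivity matrix" is described only conceptually in the record. A purely hypothetical numerical illustration of the idea, with rows as watersheds, columns as coastal sites and entries as the share of each site's sediment load attributable to each watershed (all values invented), might look like the following:

    ```python
    import numpy as np

    # Hypothetical stress connectivity matrix: entry [i, j] = share of the sediment
    # load at coastal site j that originates from watershed i (all values invented).
    watersheds = ["W1 (adjacent)", "W2 (upstream)", "W3 (distant)"]
    sites = ["reef A", "reef B"]

    S = np.array([
        [0.55, 0.20],
        [0.30, 0.25],
        [0.15, 0.55],   # a distant watershed can still dominate via coastal currents
    ])

    for j, site in enumerate(sites):
        main = np.argmax(S[:, j])
        print(f"{site}: dominant terrestrial stress from {watersheds[main]} "
              f"({S[main, j]:.0%} of load)")
    ```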

  6. Study on the generalized WKB approximation for the inverse scattering problem at fixed energy for complex potentials

    International Nuclear Information System (INIS)

    Pozdnyakov, Yu.A.; Terenetskij, K.O.

    1981-01-01

    An approximate method for solving the inverse scattering problem (ISP) at fixed energy for complex spherically symmetric potentials decreasing faster than 1/r is considered. The method is based on a generalized WKB approximation. For the potential to be reconstructed, a sufficiently 'close' reference potential is chosen. For both potentials the S-matrix elements (ME) are calculated and the inversion procedure is carried out. The S-ME are calculated for integer and intermediate values of the angular momentum. The S-ME are presented graphically for the reference and restored potentials for the scattering of protons with energy E_p = 49.48 MeV on 12C nuclei. The restoration is the better, the 'closer' the sought-for potential is to the reference one. This allows the potential to be refined iteratively: the restored potential can be used as a new reference potential, and so on. A smoothing operation applied to the restored potential before the next iteration is introduced. Drawbacks and advantages of the ISP solution method under consideration are pointed out. The applicability of the method is strongly limited by the requirement that the energy be higher than a certain 'critical' value. The method is applicable in a wider region of particle energies (towards lower energies) than the ordinary WKB method, and it is simpler to implement for complex potentials. The investigations of the proposed fixed-energy ISP solution method for complex spherically symmetric potentials lead to the conclusion that the method can be successfully applied to determine the central part of the interaction of nucleons, α-particles and heavy ions of intermediate and high energies with atomic nuclei [ru

  7. The "adjuvant effect" of the polymorphic B-G antigens of the chicken major histocompatibility complex analyzed using purified molecules incorporated in liposomes

    DEFF Research Database (Denmark)

    Salomonsen, J; Eriksson, H; Skjødt, K

    1991-01-01

    The polymorphic B-G region of the chicken major histocompatibility complex has previously been shown to mediate an "adjuvant effect" on the humoral response to other erythrocyte alloantigens. We demonstrate here that B-G molecules purified with monoclonal antibodies exert this adjuvant effect on the production of alloantibodies to chicken class I (B-F) molecules, when the two are in the same liposome. The adjuvant effect may in part be mediated by antibodies, since the antibody response to B-G molecules occurs much faster than the response to B-F molecules, and conditions in which antibodies to B-G are present increase the speed of the response to B-F molecules. We also found that the presence of B-G molecules in separate liposomes results in a lack of response to B-F molecules. In the light of this and other data, we consider the possible roles for the polymorphic B-G molecules, particularly …

  8. Comparison of net CO2 fluxes measured with open- and closed-path infrared gas analyzers in an urban complex environment

    DEFF Research Database (Denmark)

    Järvi, L.; Mammarella, I.; Eugster, W.

    2009-01-01

    … and their suitability to accurately measure CO2 exchange in such non-ideal landscape. In addition, this study examined the effect of open-path sensor heating on measured fluxes in urban terrain, and these results were compared with similar measurements made above a temperate beech forest in Denmark. The correlation between the two fluxes was good (R2 = 0.93) at the urban site, but during the measurement period the open-path net surface exchange (NSE) was 17% smaller than the closed-path NSE, indicating apparent additional uptake of CO2 by open-path measurements. At both sites, sensor heating corrections evidently improved the performance of the open-path analyzer by reducing discrepancies in NSE at the urban site to 2% and decreasing the difference in NSE from 67% to 7% at the forest site. Overall, the site-specific approach gave the best results at both sites and, if possible, it should be preferred in the sensor …

  9. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    New types of disclosure and reporting are argued to be vital in order to convey a transparent picture of the true state of the company. However, they are unfortunately not without problems, as these types of information are somewhat more complex than the information provided in the traditional … stakeholders in a form that corresponds to the stakeholders' understanding, then disclosure and interpretation of key performance indicators will also be facilitated.

  10. On the complexity of determining tolerances for ε-optimal solutions to min-max combinatorial optimization problems

    NARCIS (Netherlands)

    Ghosh, D.; Sierksma, G.

    2000-01-01

    Sensitivity analysis of ε-optimal solutions is the problem of calculating the range within which a problem parameter may lie so that the given solution remains ε-optimal. In this paper we study the sensitivity analysis problem for ε-optimal solutions to combinatorial optimization problems with

  11. Computer-aided design system for a complex of problems on calculation and analysis of engineering and economical indexes of NPP power units

    International Nuclear Information System (INIS)

    Stepanov, V.I.; Koryagin, A.V.; Ruzankov, V.N.

    1988-01-01

    A computer-aided design system for a complex of problems concerning the calculation and analysis of engineering and economic indices of NPP power units is described. The system provides means for the automated preparation and debugging of the database software complex, which implements the developed algorithm in the power unit control system. In addition, the system provides facilities for the automated preparation and registration of technical documentation

  12. HSTLBO: A hybrid algorithm based on Harmony Search and Teaching-Learning-Based Optimization for complex high-dimensional optimization problems.

    Directory of Open Access Journals (Sweden)

    Shouheng Tuo

    Harmony Search (HS) and Teaching-Learning-Based Optimization (TLBO), as new swarm intelligence optimization algorithms, have received much attention in recent years. Both of them have shown outstanding performance in solving NP-hard optimization problems. However, they also suffer dramatic performance degradation on some complex high-dimensional optimization problems. Through extensive experiments, we find that HS and TLBO are strongly complementary. HS has strong global exploration power but low convergence speed. Conversely, TLBO converges much faster but is easily trapped in local regions of the search space. In this work, we propose a hybrid search algorithm named HSTLBO that merges the two algorithms for synergistically solving complex optimization problems using a self-adaptive selection strategy. In HSTLBO, both HS and TLBO are modified with the aim of balancing the global exploration and exploitation abilities, where HS aims mainly to explore the unknown regions and TLBO aims to rapidly exploit high-precision solutions in the known regions. Our experimental results demonstrate better performance and faster speed than five state-of-the-art HS variants, and better exploration power than five good TLBO variants with similar run time, which illustrates that our method is promising for solving complex high-dimensional optimization problems. Experiments on portfolio optimization problems also demonstrate that HSTLBO is effective in solving complex real-world applications.
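
    One compact way to see how the two heuristics can be merged is to alternate, or adaptively select between, a Harmony Search improvisation step and a TLBO teacher-phase step acting on a shared population. The sketch below is a simplified illustration on a sphere function, not the HSTLBO algorithm of the record: the parameter values, the 50/50 selection rule and the omission of the TLBO learner phase are all simplifying assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    dim, pop_size, n_iter = 10, 20, 2000
    lo, hi = -5.0, 5.0
    hmcr, par, bw = 0.9, 0.3, 0.05            # HS parameters (illustrative)

    def f(x):                                  # toy objective: sphere function
        return float(np.sum(x * x))

    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])

    for _ in range(n_iter):
        if rng.random() < 0.5:
            # --- Harmony Search improvisation: build one new vector from memory ---
            new = np.empty(dim)
            for d in range(dim):
                if rng.random() < hmcr:                      # pick from harmony memory
                    new[d] = pop[rng.integers(pop_size), d]
                    if rng.random() < par:                   # pitch adjustment
                        new[d] += bw * rng.uniform(-1, 1)
                else:                                        # random exploration
                    new[d] = rng.uniform(lo, hi)
            new = np.clip(new, lo, hi)
            worst = np.argmax(fit)
            if f(new) < fit[worst]:                          # replace the worst harmony
                pop[worst], fit[worst] = new, f(new)
        else:
            # --- TLBO teacher phase: move the population toward the best solution ---
            teacher = pop[np.argmin(fit)]
            mean = pop.mean(axis=0)
            tf = rng.integers(1, 3)                          # teaching factor in {1, 2}
            for i in range(pop_size):
                cand = np.clip(pop[i] + rng.random(dim) * (teacher - tf * mean), lo, hi)
                if f(cand) < fit[i]:
                    pop[i], fit[i] = cand, f(cand)

    print("best objective found:", fit.min())
    ```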

  13. Solution of complex measuring problems for automation of a scientific experiment and technological processes; Reshenie slozhnykh izmeritel`nykh problem pri avtomatizatsii nauchnogo ehksperimenta i tekhnologicheskikh protsessov

    Energy Technology Data Exchange (ETDEWEB)

    Gribov, A A; Zhukov, V A; Sdobnov, S I; Yakovlev, G V [Rossijskij Nauchnyj Tsentr Kurchatovskij Inst., Moskva (Russian Federation)

    1996-12-31

    The paper discusses problems associated with the automation of reactor measurements. It describes an automated system for carrying out neutron-physics experiments involving the measurement of the slowly varying current of ionization chambers. The system is based on the bus-module principle and uses a specialized 16-bit bus. The total information capacity for one current channel is 5 bytes. 4 refs.; 1 fig.

  14. How Health Care Complexity Leads to Cooperation and Affects the Autonomy of Health Care Professionals

    NARCIS (Netherlands)

    Molleman, Eric; Broekhuis, Manda; Stoffels, Renee; Jaspers, Frans

    2008-01-01

    Health professionals increasingly face patients with complex health problems, and this puts pressure on them to cooperate. The authors have analyzed how the complexity of health care problems relates to two types of cooperation: consultation and multidisciplinary teamwork (MTW). Moreover, they have

  15. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  16. Dealing with Complex and Ill-Structured Problems: Results of a Plan-Do-Check-Act Experiment in a Business Engineering Semester

    Science.gov (United States)

    Riis, Jens Ove; Achenbach, Marlies; Israelsen, Poul; Kyvsgaard Hansen, Poul; Johansen, John; Deuse, Jochen

    2017-01-01

    Challenged by increased globalisation and fast technological development, we carried out an experiment in the third semester of a global business engineering programme aimed at identifying conditions for training students in dealing with complex and ill-structured problems of forming a new business. As this includes a fuzzy front end, learning…

  17. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
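
    For a small acyclic network with discrete arc-length distributions, the exact shortest-path length distribution can still be obtained by complete enumeration, which is the baseline the factoring approach is designed to beat. A minimal sketch on a made-up four-node network follows; the topology and probabilities are illustrative, not taken from the paper.

    ```python
    from itertools import product
    from collections import defaultdict

    # Arcs of a small acyclic network: (tail, head) -> list of (length, probability)
    arcs = {
        ("s", "a"): [(1, 0.5), (3, 0.5)],
        ("s", "b"): [(2, 1.0)],
        ("a", "t"): [(2, 0.7), (5, 0.3)],
        ("b", "t"): [(2, 0.4), (4, 0.6)],
    }
    paths = [[("s", "a"), ("a", "t")], [("s", "b"), ("b", "t")]]   # all s->t paths

    dist = defaultdict(float)
    arc_list = list(arcs)
    # Enumerate every joint realization of the arc lengths (complete enumeration)
    for combo in product(*(arcs[a] for a in arc_list)):
        lengths = {a: c[0] for a, c in zip(arc_list, combo)}
        prob = 1.0
        for c in combo:
            prob *= c[1]
        shortest = min(sum(lengths[a] for a in p) for p in paths)
        dist[shortest] += prob

    for value in sorted(dist):
        print(f"P(shortest path length = {value}) = {dist[value]:.3f}")
    ```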

  18. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

    Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles in their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded using GAMS, and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results. Therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization procedures in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization procedures in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or
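
    As a reduced illustration of the standard SDP approach mentioned for the ESPAT_SDP module, the sketch below computes a single-reservoir release policy by backward recursion over discretized storage with an i.i.d. two-scenario inflow. All numbers (capacity, demand, scenario probabilities, benefit function) are invented placeholders, and the model is far simpler than the Mijares case study.

    ```python
    import numpy as np

    # Discretized single-reservoir SDP: maximize expected benefits of meeting a demand.
    capacity, demand, n_stages = 10, 4, 12            # storage units, demand per stage, stages
    storages = np.arange(capacity + 1)                # feasible storage levels 0..capacity
    inflows = [(2, 0.5), (6, 0.5)]                    # (inflow, probability), i.i.d. scenarios

    def benefit(release):
        """Deficit-penalizing benefit of releasing 'release' against the demand."""
        return -float((demand - min(release, demand)) ** 2)

    value = np.zeros(len(storages))                   # terminal value function
    policy = []
    for stage in reversed(range(n_stages)):
        new_value = np.full(len(storages), -np.inf)
        stage_policy = np.zeros(len(storages), dtype=int)
        for s in storages:
            for r in range(0, s + 1):                 # candidate releases from storage s
                exp_val = 0.0
                for q, p in inflows:
                    s_next = min(s - r + q, capacity) # spill anything above capacity
                    exp_val += p * (benefit(r) + value[s_next])
                if exp_val > new_value[s]:
                    new_value[s], stage_policy[s] = exp_val, r
        value, policy = new_value, [stage_policy] + policy

    print("first-stage release by storage level:",
          dict(zip(storages.tolist(), policy[0].tolist())))
    ```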

  19. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    The diffraction enhanced imaging (DEI) technique is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguishable with conventional techniques due to their close absorption coefficients. This problem becomes more pronounced when an object has scattering properties. A simple approach is introduced to utilize the scattered radiation to obtain 'pure absorption' and 'pure refraction' images
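
    Analyses of this kind typically reduce the measured angular intensity profile to its low-order statistical moments (area, mean angle, angular spread). A small sketch on a synthetic profile is shown below; the Gaussian test profile and its parameters are assumptions for illustration only.

    ```python
    import numpy as np

    # Synthetic angular intensity profile (rocking-curve-like), angles in microradians.
    theta = np.linspace(-20.0, 20.0, 401)
    intensity = np.exp(-0.5 * ((theta - 2.5) / 4.0) ** 2)   # shifted, broadened peak

    dtheta = theta[1] - theta[0]
    area = intensity.sum() * dtheta                          # zeroth moment ~ transmitted intensity
    mean = (theta * intensity).sum() * dtheta / area         # first moment  ~ angular shift
    var = ((theta - mean) ** 2 * intensity).sum() * dtheta / area   # second central moment

    print(f"zeroth moment (area)      : {area:.3f}")
    print(f"first moment (mean shift) : {mean:.3f} urad")
    print(f"angular spread (std dev)  : {np.sqrt(var):.3f} urad")
    ```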

  20. Combining blue native polyacrylamide gel electrophoresis with liquid chromatography tandem mass spectrometry as an effective strategy for analyzing potential membrane protein complexes of Mycobacterium bovis bacillus Calmette-Guérin

    Directory of Open Access Journals (Sweden)

    Li Weijun

    2011-01-01

    Background: Tuberculosis is an infectious bacterial disease in humans caused primarily by Mycobacterium tuberculosis, and infects one-third of the world's total population. Mycobacterium bovis bacillus Calmette-Guérin (BCG) vaccine has been widely used to prevent tuberculosis worldwide since 1921. Membrane proteins play important roles in various cellular processes, and the protein-protein interactions involved in these processes may provide further information about molecular organization and cellular pathways. However, membrane proteins are notoriously under-represented by traditional two-dimensional polyacrylamide gel electrophoresis (2-D PAGE) and little is known about mycobacterial membrane and membrane-associated protein complexes. Here we investigated M. bovis BCG by an alternative proteomic strategy coupling blue native PAGE to liquid chromatography tandem mass spectrometry (LC-MS/MS) to characterize potential protein-protein interactions in membrane fractions. Results: Using this approach, we analyzed the native molecular composition of protein complexes in BCG membrane fractions. As a result, 40 proteins (including 12 integral membrane proteins), which were organized in 9 different gel bands, were unambiguously identified. The proteins identified have been experimentally confirmed using 2-D SDS PAGE. We identified MmpL8 and four neighboring proteins that were involved in lipid transport complexes, and all subunits of the ATP synthase complex in their monomeric states. Two phenolpthiocerol synthases and three arabinosyltransferases belonging to individual operons were obtained in different gel bands. Furthermore, two giant multifunctional enzymes, Pks7 and Pks8, and four mycobacterial Hsp family members were determined. Additionally, seven ribosomal proteins involved in the polyribosome complex and two subunits of the succinate dehydrogenase complex were also found. Notably, some proteins with high hydrophobicity or multiple transmembrane

  1. Combining blue native polyacrylamide gel electrophoresis with liquid chromatography tandem mass spectrometry as an effective strategy for analyzing potential membrane protein complexes of Mycobacterium bovis bacillus Calmette-Guérin.

    Science.gov (United States)

    Zheng, Jianhua; Wei, Candong; Zhao, Lina; Liu, Liguo; Leng, Wenchuan; Li, Weijun; Jin, Qi

    2011-01-18

    Tuberculosis is an infectious bacterial disease in humans caused primarily by Mycobacterium tuberculosis, and infects one-third of the world's total population. Mycobacterium bovis bacillus Calmette-Guérin (BCG) vaccine has been widely used to prevent tuberculosis worldwide since 1921. Membrane proteins play important roles in various cellular processes, and the protein-protein interactions involved in these processes may provide further information about molecular organization and cellular pathways. However, membrane proteins are notoriously under-represented by traditional two-dimensional polyacrylamide gel electrophoresis (2-D PAGE) and little is known about mycobacterial membrane and membrane-associated protein complexes. Here we investigated M. bovis BCG by an alternative proteomic strategy coupling blue native PAGE to liquid chromatography tandem mass spectrometry (LC-MS/MS) to characterize potential protein-protein interactions in membrane fractions. Using this approach, we analyzed the native molecular composition of protein complexes in BCG membrane fractions. As a result, 40 proteins (including 12 integral membrane proteins), which were organized in 9 different gel bands, were unambiguously identified. The proteins identified have been experimentally confirmed using 2-D SDS PAGE. We identified MmpL8 and four neighboring proteins that were involved in lipid transport complexes, and all subunits of the ATP synthase complex in their monomeric states. Two phenolpthiocerol synthases and three arabinosyltransferases belonging to individual operons were obtained in different gel bands. Furthermore, two giant multifunctional enzymes, Pks7 and Pks8, and four mycobacterial Hsp family members were determined. Additionally, seven ribosomal proteins involved in the polyribosome complex and two subunits of the succinate dehydrogenase complex were also found. Notably, some proteins with high hydrophobicity or multiple transmembrane helices were well identified in our work. In this

  2. Can motto-goals outperform learning and performance goals? Influence of goal setting on performance and affect in a complex problem solving task

    Directory of Open Access Journals (Sweden)

    Miriam S. Rohe

    2016-09-01

    In this paper, we bring together research on complex problem solving with that of motivational psychology on goal setting. Complex problems require motivational effort because of their inherent difficulties. Goal Setting Theory has shown with simple tasks that high, specific performance goals lead to better performance outcomes than do-your-best goals. However, in complex tasks, learning goals have proven more effective than performance goals. Based on the Zurich Resource Model (Storch & Krause, 2014), so-called motto-goals (e.g., "I breathe happiness") should activate a person's resources through positive affect. It was found that motto-goals are effective with unpleasant duties. Therefore, we tested the hypothesis that motto-goals outperform learning and performance goals in the case of complex problems. A total of N = 123 subjects participated in the experiment. Depending on their goal condition, subjects developed a personal motto, learning, or performance goal. This goal was adapted for the computer-simulated complex scenario Tailorshop, where subjects worked as managers in a small fictional company. Contrary to expectations, there was no main effect of goal condition on management performance. As hypothesized, motto goals led to higher positive and lower negative affect than the other two goal types. Even though positive affect decreased and negative affect increased in all three groups during Tailorshop completion, participants with motto goals reported the lowest rates of negative affect over time. Exploratory analyses investigated the role of affect in complex problem solving via mediational analyses and the influence of goal type on perceived goal attainment.

  3. Providing Formative Assessment to Students Solving Multipath Engineering Problems with Complex Arrangements of Interacting Parts: An Intelligent Tutor Approach

    Science.gov (United States)

    Steif, Paul S.; Fu, Luoting; Kara, Levent Burak

    2016-01-01

    Problems faced by engineering students involve multiple pathways to solution. Students rarely receive effective formative feedback on handwritten homework. This paper examines the potential for computer-based formative assessment of student solutions to multipath engineering problems. In particular, an intelligent tutor approach is adopted and…

  4. Hemiequilibrium problems

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam Noor

    2004-01-01

    We consider a new class of equilibrium problems, known as hemiequilibrium problems. Using the auxiliary principle technique, we suggest and analyze a class of iterative algorithms for solving hemiequilibrium problems, the convergence of which requires either pseudomonotonicity or partially relaxed strong monotonicity. As a special case, we obtain a new method for hemivariational inequalities. Since hemiequilibrium problems include hemivariational inequalities and equilibrium problems as special cases, the results proved in this paper still hold for these problems.
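
    For context, the classical equilibrium problem that such formulations generalize is stated below (Blum–Oettli form); the hemiequilibrium variant studied in the record adds a hemivariational term, which is not reproduced here.

    ```latex
    % Classical equilibrium problem: given a convex set K and a bifunction
    % F : K \times K \to \mathbb{R}, find u \in K such that
    F(u, v) \;\ge\; 0 \qquad \text{for all } v \in K .
    ```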

  5. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Directory of Open Access Journals (Sweden)

    Heinz-Martin Süß

    2018-05-01

    Full Text Available The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly

  6. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Science.gov (United States)

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the

  7. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    Science.gov (United States)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. An application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems—model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the concept of models-components. A model-component is endowed with a more complicated structure than, for example, the object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to respond in a standard way to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of models-components into complexes. This fact allows one, firstly, to construct fractal models of any complexity and, secondly, to implement the computational process of such constructions uniformly, by a single universal program. In addition, the proposed paradigm allows one to exclude imperative programming and to generate computer code with a high degree of parallelism.
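
    The paper develops this formally with Bourbaki-style structures; purely as an informal illustration of the idea that a complex built from models-components is itself a model-component with the same standard behavior, so that arbitrarily deep compositions run through one uniform computation, consider the sketch below. The class and method names are invented and do not come from the paper.

```python
from typing import Dict, List


class ModelComponent:
    """A unit with standard behavior: it responds uniformly to standard
    requests from its internal and external environment."""

    def respond(self, request: str) -> Dict[str, float]:
        raise NotImplementedError


class Reservoir(ModelComponent):
    """An elementary component with toy internal dynamics."""

    def __init__(self, level: float):
        self.level = level

    def respond(self, request: str) -> Dict[str, float]:
        if request == "advance":
            self.level *= 0.95
        return {"level": self.level}


class Complex(ModelComponent):
    """A complex of models-components is itself a model-component, so the
    same universal loop drives compositions of any depth ('fractal' models)."""

    def __init__(self, parts: List[ModelComponent]):
        self.parts = parts

    def respond(self, request: str) -> Dict[str, float]:
        merged: Dict[str, float] = {}
        for i, part in enumerate(self.parts):
            for key, value in part.respond(request).items():
                merged[f"{i}.{key}"] = value
        return merged


system = Complex([Reservoir(10.0), Complex([Reservoir(5.0), Reservoir(2.0)])])
print(system.respond("advance"))
```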

  8. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The developed system had a severe performance problem: the resulting spectrum lacked smoothness and was very noisy, full of spikes and surges, so it was impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance.
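
    The abstract does not say which smoothing method the improved system finally adopted; the sketch below only illustrates one common way to suppress spikes and surges in a recorded spectrum, a median filter followed by a short moving average. The synthetic spectrum, the window sizes and the use of SciPy are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import medfilt

def clean_spectrum(counts, spike_window=5, smooth_window=7):
    """Suppress isolated spikes with a median filter, then smooth the
    remaining statistical noise with a short moving average."""
    despiked = medfilt(counts.astype(float), kernel_size=spike_window)
    kernel = np.ones(smooth_window) / smooth_window
    return np.convolve(despiked, kernel, mode="same")

# Synthetic example: a Gaussian photopeak on a decaying background,
# corrupted by Poisson noise and a few random spikes/surges.
rng = np.random.default_rng(0)
channels = np.arange(1024)
true = 200 * np.exp(-channels / 400) + 500 * np.exp(-((channels - 600) / 15) ** 2)
counts = rng.poisson(true).astype(float)
counts[rng.integers(0, 1024, size=10)] += 5000   # injected spikes
smoothed = clean_spectrum(counts)
```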

  9. L’argumentation rhétorique et le problème de l’auditoire complexe Rhetorical Argumentation and the Problem of the Complex Audience

    Directory of Open Access Journals (Sweden)

    Christopher W. Tindale

    2009-04-01

    Full Text Available It is a commonplace of argumentation theory that an arguer needs to know her or his audience in order to be persuasive. But beyond arguments directed to oneself or to a single interlocutor, the audiences we address are complex in make-up, reflecting the diversity of our own identities and the different groups to which we belong. How should arguers accommodate such diversity within audiences? Drawing principally from the work of Perelman and Olbrechts-Tyteca, as well as Amartya Sen, this paper explores the ways aspects of identities are chosen by audiences, and how arguers can encourage such choices as a preliminary move to persuasion itself.

  10. How the Center for Public Partnerships and Research Navigates Complex Social Problems to Make a Collective Difference.

    Science.gov (United States)

    Counts, Jacqueline; Gillam, Rebecca; Garstka, Teri A; Urbach, Ember

    2018-01-01

    The challenge of maximizing the well-being of children, youth, and families is recognizing that change occurs within complex social systems. Organizations dedicated to improving practice, advancing knowledge, and informing policy for the betterment of all must have the right approach, structure, and personnel to work in these complex systems. The University of Kansas Center for Public Partnerships and Research cultivates a portfolio of innovation, research, and data science approaches positioned to help move social service fields locally, regionally, and nationally. Mission, leadership, and smart growth guide our work and drive our will to effect positive change in the world.

  11. Solution of environmental protection problems and complex utilization of raw materials during mining and processing of uranium ores

    International Nuclear Information System (INIS)

    Litvinenko, V.G.; Savva, P.P.

    1993-01-01

    Consideration is given to the complex of measures taken in the Priargunsky industrial mine-chemical association and directed at environmental protection and the complex utilization of raw materials during the mining and processing of uranium ores. These measures include: 1) reduction of toxic chemical effluents into the atmosphere due to the introduction of new methods and gas cleaning systems; 2) rational use of water resources owing to the application of circulating water supply systems, waste water treatment and effective control of the state of water consumption by industrial enterprises; 3) utilization of gangue and industrial solid wastes

  12. Emotion regulation in interpersonal problems: the role of cognitive-emotional complexity, emotion regulation goals, and expressivity.

    Science.gov (United States)

    Coats, Abby Heckman; Blanchard-Fields, Fredda

    2008-03-01

    Young, middle-aged, and older adults' emotion regulation strategies in interpersonal problems were examined. Participants imagined themselves in anger- or sadness-eliciting situations with a close friend. Factor analyses of a new questionnaire supported a 4-factor model of emotion regulation strategies, including passivity, expressing emotions, seeking emotional information or support, and solving the problem. Results suggest that age differences in emotion regulation (such as older adults' increased endorsement of passive emotion regulation relative to young adults) are partially due to older adults' decreased ability to integrate emotion and cognition, increased prioritization of emotion regulation goals, and decreased tendency to express anger. (c) 2008 APA, all rights reserved.

  13. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited on-line memory capacity (M = 4030 for the computer).
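
    The report is concerned with program organization on memory-limited machines rather than with the fault equations themselves, but the core calculation such programs automate can be sketched briefly: assemble the nodal admittance matrix, take the Thevenin (driving-point) impedance at the faulted bus from its inverse, and divide the pre-fault voltage by it. The three-bus data below are invented and far smaller than the 1000-branch-point networks the programs target.

```python
import numpy as np

# Per-unit branch impedances of a small illustrative 3-bus network
# (from_bus, to_bus, series impedance); all values are made up.
branches = [(0, 1, 0.1j), (1, 2, 0.2j), (0, 2, 0.25j)]
n_bus = 3

# Assemble the nodal admittance matrix Ybus.
Y = np.zeros((n_bus, n_bus), dtype=complex)
for i, j, z in branches:
    y = 1 / z
    Y[i, i] += y
    Y[j, j] += y
    Y[i, j] -= y
    Y[j, i] -= y
# Generator source impedances tie the network to the reference node.
Y[0, 0] += 1 / 0.05j
Y[1, 1] += 1 / 0.08j

Z = np.linalg.inv(Y)                              # bus impedance matrix
fault_bus = 2
v_prefault = 1.0                                  # per unit
i_fault = v_prefault / Z[fault_bus, fault_bus]    # three-phase bolted fault
print(abs(i_fault), "p.u.")
```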

  14. Addressing Complex Problems: Using Authentic Audiences and Challenges to Develop Adaptive Leadership and Socially Responsible Agency in Leadership Learners

    Science.gov (United States)

    Andenoro, Anthony C.; Sowcik, Matthew J.; Balser, Teresa C.

    2017-01-01

    Complex and adaptive challenges threaten human well-being and sustainability. However, our leadership graduates often lack the capacity and or commitment to address these challenges in a meaningful way. This paper details a five-year study exploring the impact of an interdisciplinary undergraduate course on the development of global capacities,…

  15. THE PROBLEM OF ARCHITECTURE DESIGN IN A CONTEXT OF PARTIALLY KNOWN REQUIREMENTS OF COMPLEX WEB BASED APPLICATION "KSU FEEDBACK"

    Directory of Open Access Journals (Sweden)

    A. Spivakovsky

    2013-03-01

    Full Text Available This paper addresses the problem of designing a flexible architecture for critical parts of the "KSU Feedback" application which do not have full requirements or a clearly defined scope. Recommended practices for solving such types of tasks are investigated, and it is shown how they are applied in the "KSU Feedback" architecture.

  16. The management of cognitive load during complex cognitive skill acquisition by means of computer-simulated problem solving.

    NARCIS (Netherlands)

    Kester, Liesbeth; Kirschner, Paul A.; Van Merriënboer, Jeroen

    2007-01-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram,

  17. Simple Solutions to Complex Problems: Moral Panic and the Fluid Shift from "Equity" to "Quality" in Education

    Science.gov (United States)

    Mockler, Nicole

    2014-01-01

    Education is increasingly conceptualised by governments and policymakers in western democracies in terms of productivity and human capital, emphasising elements of individualism and competition over concerns around democracy and equity. More and more, solutions to intransigent educational problems related to equity are seen in terms of quality and…

  18. Self-Regulation in the Midst of Complexity: A Case Study of High School Physics Students Engaged in Ill-Structured Problem Solving

    Science.gov (United States)

    Milbourne, Jeffrey David

    The purpose of this dissertation study was to explore the experiences of high school physics students who were solving complex, ill-structured problems, in an effort to better understand how self-regulatory behavior mediated the project experience. Consistent with Voss, Green, Post, and Penner's (1983) conception of an ill-structured problem in the natural sciences, the 'problems' consisted of scientific research projects that students completed under the supervision of a faculty mentor. Zimmerman and Campillo's (2003) self-regulatory framework of problem solving provided a holistic guide to data collection and analysis of this multi-case study, with five individual student cases. The study's results are explored in two manuscripts, each targeting a different audience. The first manuscript, intended for the Science Education Research community, presents a thick, rich description of the students' project experiences, consistent with a qualitative, case study analysis. Findings suggest that intrinsic interest was an important self-regulatory factor that helped motivate students throughout their project work, and that the self-regulatory cycle of forethought, performance monitoring, and self-reflection was an important component of the problem-solving process. Findings also support the application of Zimmerman and Campillo's framework to complex, ill-structured problems, particularly the cyclical nature of the framework. Finally, this study suggests that scientific research projects, with the appropriate support, can be a mechanism for improving students' self-regulatory behavior. The second manuscript, intended for Physics practitioners, combines the findings of the first manuscript with the perspectives of the primary, on-site research mentor, who has over a decade's worth of experience mentoring students doing physics research. His experience suggests that a successful research experience requires certain characteristics, including: a slow 'on-ramp' to the research

  19. An integrated in silico approach to analyze the involvement of single amino acid polymorphisms in FANCD1/BRCA2-PALB2 and FANCD1/BRCA2-RAD51 complex.

    Science.gov (United States)

    Doss, C George Priya; Nagasundaram, N

    2014-11-01

    Fanconi anemia (FA) is an autosomal recessive human disease characterized by genomic instability and a marked increase in cancer risk. The importance of the FANCD1 gene is manifested by the fact that deleterious amino acid substitutions were found to confer susceptibility to hereditary breast and ovarian cancers. Attaining experimental knowledge about the possible disease-associated substitutions is laborious and time consuming. Recently introduced in silico tools for analyzing genome variation have the capability to identify deleterious variants in an efficient manner. In this study, we conducted in silico variation analysis of deleterious non-synonymous SNPs at both the functional and structural levels in the breast cancer and FA susceptibility gene BRCA2/FANCD1. To identify and characterize deleterious mutations in this study, five in silico tools based on two different prediction methods, namely pathogenicity prediction (SIFT, PolyPhen, and PANTHER) and protein stability prediction (I-Mutant 2.0 and MuStab), were analyzed. Based on the deleterious scores that overlap in these in silico approaches, and the availability of three-dimensional structures, structure analysis was carried out with the major mutations that occurred in the native protein coded by the FANCD1/BRCA2 gene. In this work, we report the results of the first molecular dynamics (MD) simulation study performed to analyze the structural-level changes over time with respect to the native and mutated protein complexes (G25R, W31C, W31R in FANCD1/BRCA2-PALB2, and F1524V, V1532F in FANCD1/BRCA2-RAD51). Analysis of the MD trajectories indicated that predicted deleterious variants alter the structural behavior of BRCA2-PALB2 and BRCA2-RAD51 protein complexes. In addition, statistical analysis was employed to test the significance of these in silico tool predictions. Based on these predictions, we conclude that the identification of disease-related SNPs by in silico methods, in combination with MD

  20. Complex solution of problem of all-season construction of roads and pipelines on universal composite pontoon units

    Science.gov (United States)

    Ryabkov, A. V.; Stafeeva, N. A.; Ivanov, V. A.; Zakuraev, A. F.

    2018-05-01

    A complex has been designed for Siberia and the Far North consisting of a universal floating pontoon road on whose body pipelines can be laid in automatic mode all year round and in any weather. A new method is proposed for the construction of pipelines on pontoon modules, which are made of composite materials. Pontoons made of composite materials for bedding pipelines, with track-forming guides for automated wheeled transport and a pipelayer, are designed. The proposed system eliminates the need to construct a road along the route and ensures the buoyancy and smooth movement of the self-propelled automated stacker in the form of a "centipede", which gives a number of significant advantages in the construction and operation of the entire complex in swampy and waterlogged areas without overburden.

  1. The management of cognitive load during complex cognitive skill acquisition by means of computer-simulated problem solving

    OpenAIRE

    Kester, L.; Kirschner, P.A.; Merriënboer, J.J.G.

    2005-01-01

    This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition was integrated in the circuit diagram. It was hypothesized that learners in the integrated format would achieve better test results than the learne...

  2. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    Science.gov (United States)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-01-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations which were then solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and data base structure for three-dimensional computer codes, which will eliminate or improve on page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data in each step. As a result, in-core grid points were increased in number by 50% to 150,000, with a 10% execution time increase. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage limited than compute rate limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.
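
    The storage-versus-recomputation trade-off mentioned above, with transformation metric data recalculated at each step instead of being stored, can be pictured on a toy problem. The sketch below advances linear advection on a stretched grid with a first-order explicit upwind scheme and evaluates the metric analytically inside the time loop; the equation, grid mapping and step sizes are assumptions, not details from the paper.

```python
import numpy as np

# Toy 1-D linear advection u_t + a u_x = 0 on a stretched grid x = xi**2,
# solved in the computational coordinate xi with an explicit upwind scheme.
a = 1.0
n = 201
xi = np.linspace(0.1, 1.0, n)
dxi = xi[1] - xi[0]
u = np.exp(-200 * (xi**2 - 0.2) ** 2)    # initial pulse in physical x
dt = 0.4 * dxi * 2 * xi.min() / a        # CFL-limited time step

def metric(xi):
    """Transformation metric dxi/dx for x = xi**2, computed on the fly
    each step instead of being held in a stored array."""
    return 1.0 / (2.0 * xi)

for _ in range(200):
    dudxi = np.zeros_like(u)
    dudxi[1:] = (u[1:] - u[:-1]) / dxi   # first-order upwind (a > 0)
    u = u - a * dt * metric(xi) * dudxi
```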

  3. A case-based, problem-based learning approach to prepare master of public health candidates for the complexities of global health.

    Science.gov (United States)

    Leon, Juan S; Winskell, Kate; McFarland, Deborah A; del Rio, Carlos

    2015-03-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013-2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health-Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned.

  4. Modeling wake effects in large wind farms in complex terrain: the problem, the methods and the issues

    DEFF Research Database (Denmark)

    Politis, E.S.; Prospathopoulos, J.; Cabezon, D.

    2012-01-01

    Computational fluid dynamic (CFD) methods are used in this paper to predict the power production from entire wind farms in complex terrain and to shed some light into the wake flow patterns. Two full three-dimensional Navier–Stokes solvers for incompressible fluid flow, employing k-ε and k-ω turbulence closures, are used. The wind turbines are modeled as momentum absorbers by means of their thrust coefficient through the actuator disk approach. Alternative methods for estimating the reference wind speed in the calculation of the thrust are tested. The work presented in this paper is part...
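
    The actuator disk treatment mentioned above reduces each turbine to an axial momentum sink proportional to its thrust coefficient; a minimal sketch of that thrust term, with an invented C_T curve and rotor size, is given below. In the actual solvers this force is distributed over the disk cells, and the choice of reference wind speed U_ref is precisely one of the points being tested.

```python
import numpy as np

def actuator_disk_force(u_ref, rotor_diameter, ct_curve, rho=1.225):
    """Axial thrust of a turbine modeled as an actuator disk:
    T = 0.5 * rho * A * C_T(U_ref) * U_ref**2."""
    area = np.pi * (rotor_diameter / 2) ** 2
    ct = np.interp(u_ref, ct_curve[:, 0], ct_curve[:, 1])
    return 0.5 * rho * area * ct * u_ref**2

# Illustrative thrust-coefficient curve (wind speed [m/s], C_T); values invented.
ct_curve = np.array([[4.0, 0.80], [8.0, 0.78], [12.0, 0.60],
                     [16.0, 0.35], [25.0, 0.15]])
print(actuator_disk_force(u_ref=9.0, rotor_diameter=80.0, ct_curve=ct_curve))
```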

  5. Dealing with complex and ill-structured problems: results of a Plan-Do-Check-Act experiment in a business engineering semester

    Science.gov (United States)

    Riis, Jens Ove; Achenbach, Marlies; Israelsen, Poul; Kyvsgaard Hansen, Poul; Johansen, John; Deuse, Jochen

    2017-07-01

    Challenged by increased globalisation and fast technological development, we carried out an experiment in the third semester of a global business engineering programme aimed at identifying conditions for training students in dealing with complex and ill-structured problems of forming a new business. As this includes a fuzzy front end, learning cannot be measured in traditional, quantitative terms; therefore, we have explored the use of reflection to convert tacit knowledge to explicit knowledge. The experiment adopted a Plan-Do-Check-Act approach and concluded with developing a plan for new learning initiatives in the subsequent year's semester. The findings conclude that (1) problem-based learning develops more competencies than ordinarily measured at the examination, especially the social/communication and personal competencies are developed; (2) students are capable of dealing with a complex and ambiguous problem, if properly guided. Four conditions were identified; (3) most students are not conscious of their learning, but are able to reflect if properly encouraged; and (4) improving engineering education should be considered as an organisational learning process.

  6. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software, 'PDA' (Pooled DNA Analyzer), to analyze pooled DNA data. Results We develop the software, PDA, for the analysis of pooled-DNA data. PDA was originally implemented with the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in the four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
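
    PDA's exact algorithms are given in the paper and its manual; purely as a rough sketch of the kind of correction involved, the snippet below estimates a coefficient of preferential amplification from individually genotyped heterozygotes and uses it to adjust an allele-frequency estimate obtained from pooled signal intensities. The function names, the simple averaging and the example numbers are assumptions.

```python
import numpy as np

def preferential_amplification(het_heights_a, het_heights_b):
    """Estimate the coefficient of preferential amplification k from
    individually genotyped heterozygotes, for whom the true allele ratio
    is 1:1, so any systematic deviation reflects amplification bias."""
    return np.mean(np.asarray(het_heights_a) / np.asarray(het_heights_b))

def pooled_allele_frequency(pool_height_a, pool_height_b, k):
    """Bias-corrected frequency of allele A estimated from pooled signals."""
    return pool_height_a / (pool_height_a + k * pool_height_b)

# Invented example numbers:
k = preferential_amplification([1.15, 1.22, 1.18], [1.0, 1.0, 1.0])
print(pooled_allele_frequency(pool_height_a=62.0, pool_height_b=41.0, k=k))
```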

  7. The feasibility of using expert systems to cope with the complexity and extent of the indoor radon problem

    International Nuclear Information System (INIS)

    Raes, F.; Poffijn, A.; Eggermont, G.

    1988-01-01

    The main problems in predicting the average radon concentration in a single house are: (1) to obtain specific high resolution information about the house, and (2) to handle qualitative but relevant information. We introduce the idea of using an expert system to obtain high resolution data by interrogating the inhabitants about their house, as well as to interpret the qualitative information obtained in this way. To study the feasibility of this approach, a prototype expert system has been written which was given the obvious name Radon Expert System (RAES). RAES derives a radon index starting from information obtained from geological maps and other data bases. It subsequently refines this information and focusses on a single house by asking for information from the inhabitants. With the help of RAES, we interrogated the inhabitants of a number of houses where radon measurements had previously been performed. The correspondence between prediction and measurement is encouraging. (author)
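
    The paper does not publish the RAES rule base; the sketch below only illustrates the general pattern it describes, refining a coarse map-derived radon index with qualitative answers obtained from the inhabitants. The rules, weights and scale are invented for illustration.

```python
def radon_index(geology_index, answers):
    """Refine a coarse geological radon index (1 = low .. 5 = high) with
    qualitative house information. Rules and weights are invented."""
    index = float(geology_index)
    if answers.get("basement") == "unsealed earth floor":
        index += 1.5
    if answers.get("ventilation") == "poor":
        index += 1.0
    if answers.get("construction") == "slab on grade, sealed":
        index -= 0.5
    return max(1.0, min(5.0, index))

print(radon_index(3, {"basement": "unsealed earth floor", "ventilation": "poor"}))
```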

  8. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
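
    The report quotes accuracy figures rather than algorithms; one simple way to picture how absorbance at several near-infrared channels can be turned into oil, water and gas fractions is a linear unmixing of pure-component responses solved by least squares, sketched below with invented response values.

```python
import numpy as np

# Rows: wavelength channels; columns: pure-component responses for
# oil, water, gas (values invented for illustration).
A = np.array([
    [0.90, 0.10, 0.02],
    [0.30, 0.85, 0.05],
    [0.15, 0.40, 0.60],
    [0.05, 0.70, 0.10],
])
measured = A @ np.array([0.55, 0.40, 0.05])      # synthetic mixture spectrum

# Ordinary least squares, then clip and renormalize to fractions summing to one.
fractions, *_ = np.linalg.lstsq(A, measured, rcond=None)
fractions = np.clip(fractions, 0, None)
fractions /= fractions.sum()
print(dict(zip(["oil", "water", "gas"], fractions.round(3))))
```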

  9. Attention-deficit hyperactivity disorder (ADHD), substance use disorders, and criminality: a difficult problem with complex solutions.

    Science.gov (United States)

    Knecht, Carlos; de Alvaro, Raquel; Martinez-Raga, Jose; Balanza-Martinez, Vicent

    2015-05-01

    The association between attention-deficit hyperactivity disorder (ADHD) and criminality has been increasingly recognized as an important societal concern. Studies conducted in different settings have revealed high rates of ADHD among adolescent offenders. The risk for criminal behavior among individuals with ADHD is increased when there is psychiatric comorbidity, particularly conduct disorder and substance use disorder. The present report aims to systematically review the literature on the epidemiological, neurobiological, and other risk factors contributing to this association, as well as the key aspects of the assessment, diagnosis, and treatment of ADHD among offenders. A systematic literature search of electronic databases (PubMed, EMBASE, and PsycINFO) was conducted to identify potentially relevant studies published in English, in peer-reviewed journals. Studies conducted in various settings within the judicial system and in many different countries suggest that the rate of adolescent and adult inmates with ADHD far exceeds that reported in the general population; however, underdiagnosis is common. Similarly, follow-up studies of children with ADHD have revealed high rates of criminal behaviors, arrests, convictions, and imprisonment in adolescence and adulthood. Assessment of ADHD and comorbid conditions requires an ongoing and careful process. When treating offenders or inmates with ADHD, who commonly present with other comorbid psychiatric disorders, complex, comprehensive and tailored interventions combining pharmacological and psychosocial strategies are likely to be needed.

  10. Ultrasonic simulation - Imagine3D and SimScan: Tools to solve the inverse problem for complex turbine components

    International Nuclear Information System (INIS)

    Mair, H.D.; Ciorau, P.; Owen, D.; Hazelton, T.; Dunning, G.

    2000-01-01

    Two ultrasonic simulation packages, Imagine 3D and SIMSCAN, have been developed specifically to solve the inverse problem for the blade root and rotor steeple of low-pressure turbines. The software was integrated with the 3D drawing of the inspected parts, and with the dimensions of linear phased-array probes. SIMSCAN simulates the inspection scenario in both optional conditions: defect location and probe movement/refracted angle range. The results are displayed in Imagine 3D, with a variety of options: rendering, 1:1 display, grid, generated UT beam. The results are very useful for procedure developers, for training, and for optimizing the phased-array probe inspection sequence. A spreadsheet is generated to correlate the defect coordinates with UT data (probe position, skew and refracted angle, UT path, and probe movement). The simulation models were validated during experimental work with phased-array systems. The accuracy in probe position is ±1 mm, and the refracted/skew angle is within ±0.5 deg. Representative examples of phased-array focal laws/probe movement for a specific defect location are also included

  11. Problems of nuclear reactor safety. Vol. 1

    International Nuclear Information System (INIS)

    Shal'nov, A.V.

    1995-01-01

    Proceedings of the 9th Topical Meeting 'Problems of nuclear reactor safety' are presented. The papers include results of studies and developments associated with methods of calculation and complex computerized simulation of stationary and transient processes in nuclear power plants. The main problems of reactor safety are discussed, and reactor accidents at operating NPPs are analyzed

  12. Tackling Complex Emergency Response Solutions Evaluation Problems in Sustainable Development by Fuzzy Group Decision Making Approaches with Considering Decision Hesitancy and Prioritization among Assessing Criteria.

    Science.gov (United States)

    Qi, Xiao-Wen; Zhang, Jun-Ling; Zhao, Shu-Ping; Liang, Chang-Yong

    2017-10-02

    In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE have been facing the simultaneous challenging characteristics of decision hesitancy and prioritization relations among assessing criteria, due to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that hold the two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid potential information distortion in models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager's prioritized operators. Furthermore, on the strength of the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches.
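
    For readers unfamiliar with the base operators being extended, the following is Yager's prioritized averaging operator in its ordinary real-valued form; the paper's contribution is the interval-valued dual hesitant fuzzy generalization built on entropy-based weights, not this formula.

```latex
% Yager's prioritized averaging operator for criteria
% C_1 \succ C_2 \succ \dots \succ C_n with satisfaction degrees a_i \in [0,1]:
\[
\begin{aligned}
  T_1 &= 1, \qquad T_i = \prod_{k=1}^{i-1} a_k \quad (i = 2, \dots, n), \\
  w_i &= \frac{T_i}{\sum_{j=1}^{n} T_j}, \qquad
  \mathrm{PA}(a_1, \dots, a_n) = \sum_{i=1}^{n} w_i \, a_i .
\end{aligned}
\]
```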

  13. Developing a case mix classification for child and adolescent mental health services: the influence of presenting problems, complexity factors and service providers on number of appointments.

    Science.gov (United States)

    Martin, Peter; Davies, Roger; Macdougall, Amy; Ritchie, Benjamin; Vostanis, Panos; Whale, Andy; Wolpert, Miranda

    2017-09-01

    Case-mix classification is a focus of international attention in considering how best to manage and fund services, by providing a basis for fairer comparison of resource utilization. Yet there is little evidence of the best ways to establish case mix for child and adolescent mental health services (CAMHS). To develop a case mix classification for CAMHS that is clinically meaningful and predictive of number of appointments attended and to investigate the influence of presenting problems, context and complexity factors and provider variation. We analysed 4573 completed episodes of outpatient care from 11 English CAMHS. Cluster analysis, regression trees and a conceptual classification based on clinical best practice guidelines were compared regarding their ability to predict number of appointments, using mixed effects negative binomial regression. The conceptual classification is clinically meaningful and did as well as data-driven classifications in accounting for number of appointments. There was little evidence for effects of complexity or context factors, with the possible exception of school attendance problems. Substantial variation in resource provision between providers was not explained well by case mix. The conceptually-derived classification merits further testing and development in the context of collaborative decision making.
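
    The paper fits mixed-effects negative binomial models with a random effect for provider; as a simplified fixed-effects sketch of the same idea, the snippet below regresses appointment counts on a case-mix grouping using a plain negative binomial GLM in statsmodels. The column names and data are invented, and the provider random effect is omitted.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical episode-level data: appointment count, conceptual case-mix
# group and provider (all names and values invented for illustration).
episodes = pd.DataFrame({
    "appointments": [4, 12, 7, 25, 3, 9, 15, 6],
    "casemix_group": ["anxiety", "self_harm", "anxiety", "psychosis",
                      "anxiety", "self_harm", "psychosis", "anxiety"],
    "provider": ["A", "A", "B", "B", "C", "C", "D", "D"],
})

# Simplified model: case-mix group as the only predictor; the paper's model
# additionally includes a random effect for provider.
model = smf.glm("appointments ~ C(casemix_group)", data=episodes,
                family=sm.families.NegativeBinomial()).fit()
print(model.summary())
```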

  14. Tackling Complex Emergency Response Solutions Evaluation Problems in Sustainable Development by Fuzzy Group Decision Making Approaches with Considering Decision Hesitancy and Prioritization among Assessing Criteria

    Directory of Open Access Journals (Sweden)

    Xiao-Wen Qi

    2017-10-01

    Full Text Available In order to be prepared against potential balance-breaking risks affecting economic development, more and more countries have recognized emergency response solutions evaluation (ERSE) as an indispensable activity in their governance of sustainable development. Traditional multiple criteria group decision making (MCGDM) approaches to ERSE have been facing the simultaneous challenging characteristics of decision hesitancy and prioritization relations among assessing criteria, due to the complexity of practical ERSE problems. Therefore, aiming at the special type of ERSE problems that hold the two characteristics, we investigate effective MCGDM approaches by employing the interval-valued dual hesitant fuzzy set (IVDHFS) to comprehensively depict decision hesitancy. To exploit decision information embedded in prioritization relations among criteria, we first define a fuzzy entropy measure for IVDHFS so that its derivative decision models can avoid potential information distortion in models based on classic IVDHFS distance measures with a subjective supplementing mechanism; further, based on the defined entropy measure, we develop two fundamental prioritized operators for IVDHFS by extending Yager’s prioritized operators. Furthermore, on the strength of the above methods, we construct two hesitant fuzzy MCGDM approaches to tackle complex scenarios with or without known weights for decision makers, respectively. Finally, case studies have been conducted to show the effectiveness and practicality of our proposed approaches.

  15. Hypertension: Believe it or not, a Complex Problem La hipertensión arterial: aunque no lo parezca, un problema complejo

    Directory of Open Access Journals (Sweden)

    Alfredo Darío Espinosa Brito

    2011-03-01

    Full Text Available High blood pressure (hypertension) is recognized as a major health problem due both to its morbidity and the disability it causes and to its impact on mortality, especially cardiovascular mortality. However, effectively addressing its prevention and control, both in individuals and in the general population, does not seem to be an easy task, even these days. This paper aims to present different aspects of arterial hypertension (a concept through diagnosis, treatment and follow-up), focusing on this entity as a complex system including multiple elements related to cardiovascular disease.

  16. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence

    Science.gov (United States)

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H.

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students’ CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence. PMID:26283992

  17. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence.

    Science.gov (United States)

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students' CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence.

  18. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater, which turns into a microbial suspension after two weeks; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia. Out of them, 15 exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of useful resources and substances for animals and plants, 150 kinds of micro-algae are subjected to screening using protease and chitinase inhibiting activities as the indexes, and it is found that an extract of Isochrysis galbana displays an intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  19. Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

    CERN Multimedia

    2005-01-01

    Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

  20. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer for injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
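
    The thesis defines its own signature set and log handling; the sketch below only illustrates the general approach it describes, matching web server log lines against configurable regular-expression signatures for injection-style attacks. The patterns and the log format are simplified assumptions.

```python
import re

# Simplified, illustrative signatures for injection-style attacks.
SIGNATURES = {
    "sql_injection": re.compile(r"(?i)(union[\s+]+select|or\s+1=1|';--)"),
    "xss": re.compile(r"(?i)(<script\b|javascript:)"),
    "path_traversal": re.compile(r"\.\./\.\."),
}

def scan_log_line(line):
    """Return the names of all signatures that match a log line."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(line)]

access_log = [
    '10.0.0.1 - - "GET /index.php?id=1 HTTP/1.1" 200',
    '10.0.0.2 - - "GET /index.php?id=1+UNION+SELECT+password+FROM+users HTTP/1.1" 200',
    '10.0.0.3 - - "GET /search?q=<script>alert(1)</script> HTTP/1.1" 200',
]
for line in access_log:
    hits = scan_log_line(line)
    if hits:
        print(hits, line)
```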

  1. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e. g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (85Kr, 3H, or 63Ni)

  2. A complex analysis problem book

    CERN Document Server

    Alpay, Daniel

    2016-01-01

    This second edition presents a collection of exercises on the theory of analytic functions, including completed and detailed solutions. It introduces students to various applications and aspects of the theory of analytic functions not always touched on in a first course, while also addressing topics of interest to electrical engineering students (e.g., the realization of rational functions and its connections to the theory of linear systems and state space representations of such systems). It provides examples of important Hilbert spaces of analytic functions (in particular the Hardy space and the Fock space), and also includes a section reviewing essential aspects of topology, functional analysis and Lebesgue integration. Benefits of the 2nd edition Rational functions are now covered in a separate chapter. Further, the section on conformal mappings has been expanded.

  3. Complexity of Quantum Impurity Problems

    Science.gov (United States)

    Bravyi, Sergey; Gosset, David

    2017-12-01

    We give a quasi-polynomial time classical algorithm for estimating the ground state energy and for computing low energy states of quantum impurity models. Such models describe a bath of free fermions coupled to a small interacting subsystem called an impurity. The full system consists of n fermionic modes and has a Hamiltonian H = H_0 + H_imp, where H_0 is quadratic in creation-annihilation operators and H_imp is an arbitrary Hamiltonian acting on a subset of O(1) modes. We show that the ground energy of H can be approximated with an additive error 2^(-b) in time n^3 exp[O(b^3)]. Our algorithm also finds a low energy state that achieves this approximation. The low energy state is represented as a superposition of exp[O(b^3)] fermionic Gaussian states. To arrive at this result we prove several theorems concerning exact ground states of impurity models. In particular, we show that eigenvalues of the ground state covariance matrix decay exponentially with the exponent depending very mildly on the spectral gap of H_0. A key ingredient of our proof is Zolotarev's rational approximation to the √x function. We anticipate that our algorithms may be used in hybrid quantum-classical simulations of strongly correlated materials based on dynamical mean field theory. We implemented a simplified practical version of our algorithm and benchmarked it using the single impurity Anderson model.

  4. COMBINATION OF APICALLY POSITIONED AND CORONALLY ADVANCED FLAP IN THE TREATMENT OF A COMPLEX MUCOGINGIVAL AND RESTORATIVE PROBLEM. A 3-YEAR FOLLOW-UP. (Case Report)

    Directory of Open Access Journals (Sweden)

    Kamen Kotsilkov

    2015-07-01

    Full Text Available INTRODUCTION: Modern Periodontology has various approaches to achieve a complete functional and aesthetic rehabilitation of the mucogingival complex. These techniques include application of flaps with apical or coronal advancement in order to achieve different treatment objectives. Complex cases with different pathology on adjacent teeth require several surgeries, thereby increasing treatment time and patient discomfort. New combined approaches are needed to meet the challenges of such cases. OBJECTIVE: This report presents a case with a simultaneous application of a resective and a mucogingival technique in one dental sextant. METHODS: I.C. (36) with a localized chronic periodontitis, Miller Class I gingival recessions (13,14) and subgingival caries lesions (15,16). A combined approach with simultaneous crown lengthening with an apically positioned flap for 16,15 and root coverage with enamel matrix derivative and a coronally advanced flap for 14,13 was applied in order to avoid multiple surgical procedures. RESULTS: In the third month after the surgical procedure a complete root coverage (13,14) was achieved. The crown lengthening procedure enabled the restoration of the caries lesions and the placement of new crowns (15,16). The result at the third year demonstrates a stable gingival margin with no recurrence of the gingival recessions. CONCLUSION: The applied combined procedure led to a complete resolution of the existing problems with a single surgery. The simultaneous application of different procedures seems a promising approach aimed to reduce the treatment time and to diminish patient discomfort.

  5. The "Performance of Rotavirus and Oral Polio Vaccines in Developing Countries" (PROVIDE) study: description of methods of an interventional study designed to explore complex biologic problems.

    Science.gov (United States)

    Kirkpatrick, Beth D; Colgate, E Ross; Mychaleckyj, Josyf C; Haque, Rashidul; Dickson, Dorothy M; Carmolli, Marya P; Nayak, Uma; Taniuchi, Mami; Naylor, Caitlin; Qadri, Firdausi; Ma, Jennie Z; Alam, Masud; Walsh, Mary Claire; Diehl, Sean A; Petri, William A

    2015-04-01

    Oral vaccines appear less effective in children in the developing world. Proposed biologic reasons include concurrent enteric infections, malnutrition, breast milk interference, and environmental enteropathy (EE). Rigorous study design and careful data management are essential to begin to understand this complex problem while assuring research subject safety. Herein, we describe the methodology and lessons learned in the PROVIDE study (Dhaka, Bangladesh). A randomized clinical trial platform evaluated the efficacy of delayed-dose oral rotavirus vaccine as well as the benefit of an injectable polio vaccine replacing one dose of oral polio vaccine. This rigorous infrastructure supported the additional examination of hypotheses of vaccine underperformance. Primary and secondary efficacy and immunogenicity measures for rotavirus and polio vaccines were measured, as well as the impact of EE and additional exploratory variables. Methods for the enrollment and 2-year follow-up of a 700 child birth cohort are described, including core laboratory, safety, regulatory, and data management practices. Intense efforts to standardize clinical, laboratory, and data management procedures in a developing world setting provide clinical trials rigor to all outcomes. Although this study infrastructure requires extensive time and effort, it allows optimized safety and confidence in the validity of data gathered in complex, developing country settings. © The American Society of Tropical Medicine and Hygiene.

  6. Can Complexity be Planned?

    Directory of Open Access Journals (Sweden)

    Ilona Koutny

    2015-04-01

    Full Text Available The long-accepted complexity invariance of human languages has become controversial within the last decade. In investigations of the problem, both creole and planned languages have often been neglected. After a presentation of the scope of the invariance problem and the proposition of the natural-to-planned language continuum, this article will discuss the contribution of planned languages. It will analyze the complexity of Esperanto at the phonological, morphological, syntactic and semantic levels, using linguistic data bases. The role of the L2 speech community and development of the language will also be taken into account when discussing the endurance of the same level of simplicity of this planned international language. The author argues that complexity can be variable and to some extent planned and maintained.

  7. Innovative Use of the Law to Address Complex Global Health Problems Comment on "The Legal Strength of International Health Instruments - What It Brings to Global Health Governance?"

    Science.gov (United States)

    Walls, Helen L; Ooms, Gorik

    2017-05-20

    Addressing the increasingly globalised determinants of many important problems affecting human health is a complex task requiring collective action. We suggest that part of the solution to addressing intractable global health issues indeed lies with the role of new legal instruments in the form of globally binding treaties, as described in the recent article of Nikogosian and Kickbusch. However, in addition to the use of international law to develop new treaties, another part of the solution may lie in innovative use of existing legal instruments. A 2015 court ruling in The Hague, which ordered the Dutch government to cut greenhouse gas emissions by at least 25% within five years, complements this perspective, suggesting a way forward for addressing global health problems that critically involves civil society and innovative use of existing domestic legal instruments. © 2017 The Author(s); Published by Kerman University of Medical Sciences.

  8. Wellbeing and resilience: mechanisms of transmission of health and risk in parents with complex mental health problems and their offspring--The WARM Study.

    Science.gov (United States)

    Harder, Susanne; Davidsen, Kirstine; MacBeth, Angus; Lange, Theis; Minnis, Helen; Andersen, Marianne Skovsager; Simonsen, Erik; Lundy, Jenna-Marie; Nyström-Hansen, Maja; Trier, Christopher Høier; Røhder, Katrine; Gumley, Andrew

    2015-12-09

    The WARM study is a longitudinal cohort study following infants of mothers with schizophrenia, bipolar disorder or depression, and of a control group, from pregnancy to infant age 1 year. Children of parents diagnosed with complex mental health problems, including schizophrenia, bipolar disorder and depression, are at increased risk of developing mental health problems compared to the general population. Little is known regarding the early developmental trajectories of infants who are at ultra-high risk, and in particular the balance of risk and protective factors expressed in the quality of early caregiver interaction. We are establishing a cohort of pregnant women with a lifetime diagnosis of schizophrenia, bipolar disorder or major depressive disorder, and a non-psychiatric control group. Factors in the parents, the infant and the social environment will be evaluated at 1, 4, 16 and 52 weeks in terms of the evolution of very early indicators of developmental risk and resilience, focusing on three possible environmental transmission mechanisms: stress, maternal caregiver representation, and caregiver-infant interaction. The study will provide data on very early developmental risk status and associated psychosocial risk factors, which will be important for developing targeted preventive interventions for infants of parents with severe mental disorder. NCT02306551, date of registration November 12, 2014.

  9. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of Nuclear Power Plant Analyzers in the USA is described. Two different types of Analyzers are under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal hydraulics are described. (author)

  10. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  11. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  12. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  13. EL PROBLEMA DE LA SOSTENIBILIDAD DENTRO DE LA COMPLEJIDAD DE LOS SISTEMAS DE PRODUCCION AGROPECUARIOS THE PROBLEM OF SUSTAINABILITY WITHIN THE COMPLEXITY OF AGRICULTURAL PRODUCTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Alejandro Cotes Torres

    2005-12-01

    Full Text Available El problema de la sostenibilidad es una temática que desde finales del siglo XX, ha venido preocupando cada vez mas a los diferentes sectores de la sociedad; pasando a ser uno de los temas de mayor interés para empresarios, consumidores, académicos e investigadores, que conforman las diferentes cadenas agroalimentarias del mundo. Este artículo presenta desde el punto de vista de la Teoría General de Sistemas, algunos elementos de reflexión critica, abordando la problemática de la sostenibilidad desde la complejidad de los sistemas de producción agropecuarios, partiendo desde la concepción filosófica original de la agricultura, hasta llegar a plantear algunas consideraciones que se deben tener en cuenta para el desarrollo de avances científicos y tecnológicos acordes con las necesidades de las cadenas agroalimentarias del siglo XXI; las cuales permiten orientar no solo el trabajo de los profesionales que lideran los procesos de producción animal y vegetal, sino que crea un sentido de pertenencia en todos los participantes de la cadena, resaltando la importancia de estudiar a través de un pensamiento sistémico, la Agronomía y la Zootecnia, como disciplinas que se aproximan a las complejidades de la Agricultura la cual es la piedra angular de la civilización, tal y como la conocemos actualmente.The problem of sustainability is a topic that since the end of the XX century has been worrying more the different sectors of society; becoming one of the topics of greatest interest for managers, consumers, academics and investigators that conform the different agricultural food chains of the world. This paper presents from the General Systems Theory point of view some elements of critical reflection, approaching the problem of sustainability from the complexity of agricultural production systems, beginning with the original philosophical conception of agriculture and ending by outlining some considerations that should be kept in mind for

  14. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input
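
    The metric/model/monitor distinction above can be made concrete with a small run-time instrumentation sketch. The following Python snippet is a minimal, generic illustration (not any tool discussed in the record; all names are invented for the example): it uses the standard sys.settrace hook to count executed lines per function, i.e., a crude dynamic "metric" collected by a "monitor" attached at run time.

    ```python
    import sys
    from collections import Counter

    def make_line_counter(counter):
        """Trace callback that counts executed line events per function."""
        def tracer(frame, event, arg):
            if event == "line":
                counter[frame.f_code.co_name] += 1
            return tracer          # keep receiving line events for this frame
        return tracer

    def program_under_test(n):
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        counts = Counter()
        sys.settrace(make_line_counter(counts))   # start dynamic monitoring
        program_under_test(10)
        sys.settrace(None)                        # stop tracing before reporting
        for func, executed in counts.items():
            print(f"{func}: {executed} line events")
    ```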

  15. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water, followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but one must know which parameters have to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  16. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    The automation of the extraction spectrophotometric determination of uranium in solution is discussed. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design of an analyzer consisting of an analytical unit and a control unit is described. The analyzer promises increased productivity of labour, improved operating and hygiene conditions, and above all more accurate analytical results. (J.C.)

  17. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze in a discrete-time context and with a finite outcome space American options starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  18. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  19. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  20. Gauge cooling for the singular-drift problem in the complex Langevin method — a test in Random Matrix Theory for finite density QCD

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Keitaro [KEK Theory Center, High Energy Accelerator Research Organization,1-1 Oho, Tsukuba 305-0801 (Japan); Nishimura, Jun [KEK Theory Center, High Energy Accelerator Research Organization,1-1 Oho, Tsukuba 305-0801 (Japan); Department of Particle and Nuclear Physics, School of High Energy Accelerator Science,Graduate University for Advanced Studies (SOKENDAI), 1-1 Oho, Tsukuba 305-0801 (Japan); Shimasaki, Shinji [KEK Theory Center, High Energy Accelerator Research Organization,1-1 Oho, Tsukuba 305-0801 (Japan); Research and Education Center for Natural Sciences, Keio University,Hiyoshi 4-1-1, Yokohama, Kanagawa 223-8521 (Japan)

    2016-07-14

    Recently, the complex Langevin method has been applied successfully to finite density QCD either in the deconfinement phase or in the heavy dense limit with the aid of a new technique called the gauge cooling. In the confinement phase with light quarks, however, convergence to wrong limits occurs due to the singularity in the drift term caused by small eigenvalues of the Dirac operator including the mass term. We propose that this singular-drift problem should also be overcome by the gauge cooling with different criteria for choosing the complexified gauge transformation. The idea is tested in chiral Random Matrix Theory for finite density QCD, where exact results are reproduced at zero temperature with light quarks. It is shown that the gauge cooling indeed changes drastically the eigenvalue distribution of the Dirac operator measured during the Langevin process. Despite its non-holomorphic nature, this eigenvalue distribution has a universal diverging behavior at the origin in the chiral limit due to a generalized Banks-Casher relation as we confirm explicitly.
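
    As a purely illustrative aside (not the Random Matrix Theory setup of this record), the singular-drift issue can already be seen in a one-variable toy model: for an action containing a logarithmic term the drift has a pole, and complexified Langevin trajectories that approach it spoil convergence. The sketch below implements a plain complex Langevin update for such a toy action; the action, parameters and step size are assumptions chosen for demonstration only, and no gauge cooling is performed.

    ```python
    import numpy as np

    # Toy action S(z) = z**2 / 2 - nf * log(z + m): the log term produces a
    # singular drift at z = -m, loosely mimicking small Dirac eigenvalues.
    nf, m = 1.0, 0.1

    def drift(z):
        return -z + nf / (z + m)        # -dS/dz, singular at z = -m

    rng = np.random.default_rng(0)
    dt, nsteps = 1e-4, 200_000
    z = 1.0 + 0.0j                      # complexified dynamical variable
    samples = []
    for step in range(nsteps):
        eta = rng.normal(scale=np.sqrt(2.0 * dt))   # real Gaussian noise
        z = z + drift(z) * dt + eta
        if step % 100 == 0:
            samples.append(z)

    samples = np.array(samples)
    # Frequent close approaches to the singularity at z = -m are the warning
    # sign of the singular-drift problem discussed in the abstract.
    print("min |z + m| visited:", np.min(np.abs(samples + m)))
    print("<z> estimate:", samples.mean())
    ```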

  1. Application of a complex transport problem for simulating an acid rain episode in Europe. Anwendung eines komplexen Ausbreitungsmodells zur Simulation einer Episode saurer Deposition ueber Europa

    Energy Technology Data Exchange (ETDEWEB)

    Stern, R; Scherer, B

    1989-04-01

    For the first time in Europe, a comprehensive Eulerian regional tropospheric transport, transformation and removal model has been applied to a Europe-wide acid deposition episode. This model, the Transport And Deposition of Acidifying Pollutants (TADAP) model, incorporates detailed knowledge of the relevant physicochemical processes which lead to the formation of photochemical oxidants and acidifying pollutants. The EUROPA model (EUM) of the German Weather Service, a limited-area numerical weather prediction model, has been used to derive the meteorological and cloud variables. The application of the EUM/TADAP modelling system to a 20-day wintertime acid deposition episode in Europe showed that it is possible to model the principal features of the acid deposition system. In general, there is reasonable agreement between observed and predicted concentration and deposition patterns. Most discrepancies from observed trends can be explained by deviations between the modelled and the actual meteorology. First sensitivity studies with TADAP, aimed at revealing the influence of emission changes on the acid deposition system, showed that there are considerable non-proportionalities between the deposition of secondary pollutants and the emissions of the respective precursors. The nonlinearities arise from the chemical coupling of the SO{sub x}/NO{sub x}/VOC system. This makes the design of control strategies a highly complex task. Strategies developed to tackle different air pollution problems can therefore not be looked upon independently. (orig.) With 47 refs., 42 figs.
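
    For readers unfamiliar with Eulerian transport modelling, the sketch below shows the bare bones of such a scheme in one dimension: an explicit upwind advection step combined with first-order removal (deposition). It is a toy illustration only and bears no relation to the actual TADAP chemistry or meteorology; the grid size, wind speed and removal rate are assumed values.

    ```python
    import numpy as np

    # Toy 1D Eulerian transport: upwind advection plus first-order removal.
    nx, dx = 200, 1000.0          # grid cells, cell size [m] (assumed)
    u, k_dep = 5.0, 1e-5          # wind speed [m/s], removal rate [1/s] (assumed)
    dt = 0.5 * dx / u             # time step respecting the CFL condition

    c = np.zeros(nx)
    c[10:20] = 1.0                # initial pollutant puff (arbitrary units)

    for _ in range(200):
        flux = u * c                                # upwind flux for u > 0
        c[1:] -= dt / dx * (flux[1:] - flux[:-1])   # advection update
        c *= np.exp(-k_dep * dt)                    # deposition/removal
        c[0] = 0.0                                  # clean inflow boundary

    print("remaining airborne mass fraction:", c.sum() / 10.0)
    ```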

  2. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
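
    To make the hard-decision/soft-decision distinction concrete, here is a small simulation sketch unrelated to the SDA implementation itself (modulation, SNR and block length are assumed for illustration): BPSK symbols over an AWGN channel, where the noisy matched-filter outputs are the soft decisions and thresholding them at zero gives the hard decisions used for a conventional BER estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_bits, ebn0_db = 100_000, 4.0          # assumed block length and Eb/N0

    bits = rng.integers(0, 2, n_bits)
    symbols = 1.0 - 2.0 * bits              # BPSK mapping: 0 -> +1, 1 -> -1
    sigma = np.sqrt(1.0 / (2.0 * 10 ** (ebn0_db / 10.0)))
    soft = symbols + rng.normal(scale=sigma, size=n_bits)   # soft decisions

    hard = (soft < 0).astype(int)           # hard decisions by sign
    ber = np.mean(hard != bits)             # traditional hard-decision BER

    # Soft statistics carry more information than the BER alone, e.g. how many
    # decisions fall close to the decision boundary.
    print(f"hard-decision BER: {ber:.4f}")
    print(f"marginal soft decisions (|soft| < 0.2): {np.mean(np.abs(soft) < 0.2):.4f}")
    ```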

  3. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real-time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available at comparatively low cost right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, with no control room equipment; and (5) coupling of computers by telecommunication via telephone

  4. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  5. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and applies hereafter advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility......Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites...... and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ....

  6. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), were examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  7. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  8. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  9. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  10. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  11. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desire for long-term reliability of the large-scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppm. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppm in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  12. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high- sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  13. ROBOT TASK SCENE ANALYZER

    International Nuclear Information System (INIS)

    Hamel, William R.; Everett, Steven

    2000-01-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--are existent

  14. ROBOT TASK SCENE ANALYZER

    Energy Technology Data Exchange (ETDEWEB)

    William R. Hamel; Steven Everett

    2000-08-01

    Environmental restoration and waste management (ER and WM) challenges in the United States Department of Energy (DOE), and around the world, involve radiation or other hazards which will necessitate the use of remote operations to protect human workers from dangerous exposures. Remote operations carry the implication of greater costs since remote work systems are inherently less productive than contact human work due to the inefficiencies/complexities of teleoperation. To reduce costs and improve quality, much attention has been focused on methods to improve the productivity of combined human operator/remote equipment systems; the achievements to date are modest at best. The most promising avenue in the near term is to supplement conventional remote work systems with robotic planning and control techniques borrowed from manufacturing and other domains where robotic automation has been used. Practical combinations of teleoperation and robotic control will yield telerobotic work systems that outperform currently available remote equipment. It is believed that practical telerobotic systems may increase remote work efficiencies significantly. Increases of 30% to 50% have been conservatively estimated for typical remote operations. It is important to recognize that the basic hardware and software features of most modern remote manipulation systems can readily accommodate the functionality required for telerobotics. Further, several of the additional system ingredients necessary to implement telerobotic control--machine vision, 3D object and workspace modeling, automatic tool path generation and collision-free trajectory planning--are existent.

  15. Searching for Order Within Chaos: Complexity Theory's Implications to Intelligence Support During Joint Operational Planning

    Science.gov (United States)

    2017-06-09

    Subject terms: Complexity Theory, Complex Systems Theory, Complex Adaptive Systems, Dynamical Systems, Joint Operational Planning. The thesis asks how complexity theory can be used to analyze military problems and increase joint staff understanding of the operational environment during joint operational planning. Complex Systems Theory: “the study of the behavior of [complex adaptive] systems” (Ilachinski 2004, 4). For the purpose of this thesis there is

  16. Social learning for solving complex problems: a promising solution or wishful thinking? A case study of multi-actor negotiation for the integrated management and sustainable use of the Drentsche Aa area in the Netherlands

    NARCIS (Netherlands)

    Bommel, van S.; Roling, N.G.; Aarts, M.N.C.; Turnhout, E.

    2009-01-01

    Social learning has been championed as a promising approach to address complex resource problems. According to theory, social learning requires several pre-conditions to be met, including (1) a divergence of interests, (2) mutual interdependence and (3) the ability to communicate. This article

  17. Social learning for solving complex problems: a promising solution or wishful thinking?: a case-study of multi-actor negotiation for the integrated management and the sustainable use of the Drentsche Aa area in the Netherlands

    NARCIS (Netherlands)

    van Bommel, S.; Röling, N.; Aarts, N.; Turnhout, E.

    2009-01-01

    Social learning has been championed as a promising approach to address complex resource problems. According to theory, social learning requires several pre-conditions to be met, including (1) a divergence of interests, (2) mutual interdependence and (3) the ability to communicate. This article

  18. The problem of complex eigensystems in the semianalytical solution for advancement of time in solute transport simulations: a new method using real arithmetic

    Science.gov (United States)

    Umari, Amjad M.J.; Gorelick, Steven M.

    1986-01-01

    In the numerical modeling of groundwater solute transport, explicit solutions may be obtained for the concentration field at any future time without computing concentrations at intermediate times. The spatial variables are discretized and time is left continuous in the governing differential equation. These semianalytical solutions have been presented in the literature and involve the eigensystem of a coefficient matrix. This eigensystem may be complex (i.e., have imaginary components) due to the asymmetry created by the advection term in the governing advection-dispersion equation. Previous investigators have either used complex arithmetic to represent a complex eigensystem or chosen large dispersivity values for which the imaginary components of the complex eigenvalues may be ignored without significant error. It is shown here that the error due to ignoring the imaginary components of complex eigenvalues is large for small dispersivity values. A new algorithm that represents the complex eigensystem by converting it to a real eigensystem is presented. The method requires only real arithmetic.
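
    One standard way to keep such computations in real arithmetic, offered here as a generic illustration rather than as the specific algorithm of the paper, is the real Schur form, in which each complex-conjugate eigenpair of a real nonsymmetric matrix appears as a real 2x2 block. The sketch below uses SciPy to compare the complex eigensystem and the real factorization of a small advection-like coefficient matrix (the matrix entries are invented for the example).

    ```python
    import numpy as np
    from scipy.linalg import schur, expm

    # Small nonsymmetric (advection-like) coefficient matrix; values are arbitrary.
    A = np.array([[-2.0,  1.5,  0.0],
                  [-1.5, -2.0,  1.5],
                  [ 0.0, -1.5, -2.0]])

    # The eigensystem is complex: eigenvalues come in conjugate pairs.
    print("eigenvalues:", np.linalg.eigvals(A))

    # Real Schur form: A = Z T Z^T with T quasi-triangular; each conjugate pair
    # is represented by a real 2x2 block, so no complex arithmetic is needed.
    T, Z = schur(A, output="real")
    print("quasi-triangular T:\n", np.round(T, 3))

    # Advancing the concentration field c(t) = expm(A t) c0 also stays real.
    c0 = np.array([1.0, 0.0, 0.0])
    print("c(t=1):", expm(A * 1.0) @ c0)
    ```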

  19. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter which are controlled by software performing irradiation control, loss-free gamma-spectrometry, spectra evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time, or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals, the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials in excellent agreement with consensus values. (author)

  20. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  1. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
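
    The idea of a co-occurrence matrix over visibility values can be sketched in a few lines. The snippet below is a generic illustration on a binary visibility map, not the clustered-surface version described in the abstract: for each pair of adjacent entries it counts how often the combinations visible/visible, visible/occluded, etc. occur, yielding a 2x2 descriptor of the visibility structure.

    ```python
    import numpy as np

    def visibility_cooccurrence(vis, offset=(0, 1)):
        """2x2 co-occurrence matrix of a binary visibility map for a pixel offset."""
        dy, dx = offset
        a = vis[: vis.shape[0] - dy, : vis.shape[1] - dx]
        b = vis[dy:, dx:]
        m = np.zeros((2, 2), dtype=int)
        for i in (0, 1):
            for j in (0, 1):
                m[i, j] = np.sum((a == i) & (b == j))
        return m

    # Example: a coherent occluded block vs. a scattered occlusion pattern.
    coherent = np.ones((8, 8), dtype=int)
    coherent[2:6, 2:6] = 0
    scattered = np.indices((8, 8)).sum(axis=0) % 2        # checkerboard
    print("coherent:\n", visibility_cooccurrence(coherent))
    print("scattered:\n", visibility_cooccurrence(scattered))
    ```

    The two maps have the same fraction of occluded entries but very different co-occurrence matrices, which is exactly the kind of structural distinction a degree-of-visibility scalar cannot capture.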

  2. Communication complexity and information complexity

    Science.gov (United States)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result does not only strengthen the lower bound on communication complexity of disjointness by making it more exact, but it also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information
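
    For reference, the internal information cost that underlies such results is commonly defined as follows; this is standard textbook notation stated as background, not a claim about the particular constructions of this thesis.

    ```latex
    % Internal information cost of a protocol \Pi on inputs (X,Y) ~ \mu,
    % and the information complexity of a function f:
    \[
      \mathrm{IC}_\mu(\Pi) \;=\; I(X;\Pi \mid Y) \;+\; I(Y;\Pi \mid X),
      \qquad
      \mathrm{IC}_\mu(f) \;=\; \inf_{\Pi\ \text{computing}\ f} \mathrm{IC}_\mu(\Pi).
    \]
    ```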

  3. Some problems in recording and analyzing South African English ...

    African Journals Online (AJOL)

    ... english, etymology, family names, folk etymology, french, german, hebrew, initialisms, latin, lexicography, misprints, nonce forms, overdefinition, personal names, place names, postal terms, prepositions, productivization, reflexive pronouns, slang, slips of the tongue, south african english, spelling, status and usage labels, ...

  4. Analyzing Human Communication Networks in Organizations: Applications to Management Problems.

    Science.gov (United States)

    Farace, Richard V.; Danowski, James A.

    Investigating the networks of communication in organizations leads to an understanding of efficient and inefficient information dissemination as practiced in large systems. Most important in organizational communication is the role of the "liaison person"--the coordinator of intercommunication. When functioning efficiently, coordinators maintain…

  5. Monte Carlo techniques for analyzing deep-penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1986-01-01

    Current methods and difficulties in Monte Carlo deep-penetration calculations are reviewed, including statistical uncertainty and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multigroup Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications
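
    The weight bookkeeping behind splitting and Russian roulette is simple enough to show in a few lines. The sketch below is a generic textbook illustration, not code from any of the reviewed systems, and the weight-window thresholds are assumed values; it demonstrates that both variance-reduction games preserve the expected particle weight.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    W_LOW, W_SURVIVE, W_HIGH = 0.25, 1.0, 4.0   # assumed weight window

    def roulette_and_split(weight):
        """Return the list of particle weights after Russian roulette / splitting."""
        if weight < W_LOW:
            # Russian roulette: survive with probability weight / W_SURVIVE.
            if rng.random() < weight / W_SURVIVE:
                return [W_SURVIVE]
            return []                    # killed; weight is conserved on average
        if weight > W_HIGH:
            # Splitting: replace the particle by n copies of reduced weight.
            n = int(np.ceil(weight / W_SURVIVE))
            return [weight / n] * n
        return [weight]

    # Expected total weight is preserved by both games:
    for w in (0.1, 0.5, 8.0):
        outcomes = [sum(roulette_and_split(w)) for _ in range(200_000)]
        print(f"w = {w}: mean surviving weight = {np.mean(outcomes):.3f}")
    ```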

  6. Monte Carlo techniques for analyzing deep penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications

  7. Some Problems in Recording and Analyzing South African English ...

    African Journals Online (AJOL)

    1994-05-24

    May 24, 1994 ... tial dictionary of South African English should be written by comparing it to ... For instance, the South African English labor term go-slow sounds odd to the American ear .... Yiddish has oykh / oykhet (literally 'also') here ("kh'vel.

  8. PROBLEMS IN ANALYZING INFLATION DURING THE CIVIL WAR

    Directory of Open Access Journals (Sweden)

    Paul R. Auerbach

    2002-01-01

    Full Text Available In the American Civil War, a drastic increase in the level of “high powered money” with the issuance of the greenbacks had a relatively modest effect on the measured price level. The existence of a free market in gold and the presence of specie are offered as an explanation for the constrained movements both in the money multiplier and in movements in measured income velocity. These unusual results largely reflect the fact that in such a world of freely fluctuating multiple currencies, a rise in the measured price level does not reflect the decline in the value of money.

  9. Monte Carlo techniques for analyzing deep penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs

  10. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are presented, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  11. Mental health problems in deaf and severely hard of hearing children and adolescents : findings on prevalence, pathogenesis and clinical complexities, and implications for prevention, diagnosis and intervention

    NARCIS (Netherlands)

    Gent, Tiejo van

    2012-01-01

    The aim of this thesis is to expand the knowledge of mental health problems among deaf and severely hard of hearing children and adolescents in the following domains: 1. The prevalence of mental health problems; 2. Specific intra- and interpersonal aspects of pathogenesis; 3. Characteristics of the

  12. Organization of a multichannel analyzer for gamma ray spectrometry

    International Nuclear Information System (INIS)

    Robinet, Genevieve

    1988-06-01

    This report describes the software organization of a medium-scale multichannel analyzer for qualitative and quantitative measurements of the gamma rays emitted by radioactive samples. The first part reviews the basics of radioactivity, the principle of gamma-ray detection, and the data processing used for interpretation of a nuclear spectrum. The second part first describes the general organization of the software and then gives some details on interactivity, multidetector capabilities, and the integration of complex algorithms for peak search and nuclide identification; problems encountered during the design phase are mentioned and solutions are given. Basic ideas are presented for further developments, such as an expert system which should improve interpretation of the results. The present software has been integrated in a manufactured multichannel analyzer named 'POLYGAM NU416'. [fr
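
    Peak search on a gamma spectrum is the kind of algorithm such an analyzer integrates. As a generic, hedged illustration (not the algorithm of the instrument described above; peak positions, widths and thresholds are invented), the sketch below builds a synthetic spectrum with two Gaussian photopeaks on a decaying background and locates them with SciPy's find_peaks.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(3)
    channels = np.arange(2048)

    def gaussian(x, mu, sigma, area):
        return area * np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    # Synthetic spectrum: exponential background + two photopeaks + counting noise.
    expected = (500.0 * np.exp(-channels / 600.0)
                + gaussian(channels, 662.0, 4.0, 5000.0)
                + gaussian(channels, 1173.0, 5.0, 3000.0))
    spectrum = rng.poisson(expected)

    # Simple peak search based on prominence above the local background.
    peaks, props = find_peaks(spectrum, prominence=100, width=3)
    print("peak channels found:", peaks)
    ```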

  13. An interdisciplinary complex problem as a starting point for learning: Impact of the PBL method in second-year Environmental engineering students

    Directory of Open Access Journals (Sweden)

    E. Saez de Camara

    2015-09-01

    Full Text Available Three second-year courses of the degree in Environmental Engineering (Geology and Pedology, Ecology, and Economics and Business Administration) have been remodeled using the Problem-Based Learning methodology. The proposed problem is a real-life, integrative problem related to the students' specialization which must be solved in the three courses at the same time. The results reveal that during this experience students were considerably more active, cooperative and involved, and the success rate doubled that of similar engineering courses of the Faculty. Regarding students’ opinion, it should be emphasized that they perceive this method as functional and encouraging. A high percentage of the students describe the experience as positive or very positive. Additionally, they stated that Problem-Based Learning promoted the development of skills that, in their own view, are essential for their career, such as teamwork and communication.

  14. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
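
    To illustrate the kind of iteration a nonlinear steady-state thermal solver performs, here is a schematic example only (the actual SPAR processors, data base and modified Newton-Raphson variant are not modeled): a small steady conduction problem with temperature-dependent conductivity, solved by ordinary Newton-Raphson on the residual with a finite-difference Jacobian. Geometry, material values and boundary temperatures are assumptions for the example.

    ```python
    import numpy as np

    # Steady conduction in a 4-element rod, ends held at 300 K and 400 K,
    # conductivity k(T) = k0 * (1 + beta * T): solve R(T) = 0 at interior nodes.
    k0, beta, L, n_el = 10.0, 2e-3, 1.0, 4
    h = L / n_el
    T_left, T_right = 300.0, 400.0

    def residual(T_int):
        T = np.concatenate(([T_left], T_int, [T_right]))
        R = np.zeros(n_el + 1)
        for e in range(n_el):
            Tm = 0.5 * (T[e] + T[e + 1])            # element mean temperature
            ke = k0 * (1.0 + beta * Tm) / h          # element conductance
            R[e]     += ke * (T[e] - T[e + 1])
            R[e + 1] += ke * (T[e + 1] - T[e])
        return R[1:-1]                               # interior energy balances

    # Newton-Raphson with a finite-difference Jacobian.
    T_int = np.full(n_el - 1, 300.0)
    for it in range(20):
        R = residual(T_int)
        if np.max(np.abs(R)) < 1e-8:
            break
        J = np.zeros((n_el - 1, n_el - 1))
        for j in range(n_el - 1):
            dT = np.zeros(n_el - 1)
            dT[j] = 1e-6
            J[:, j] = (residual(T_int + dT) - R) / 1e-6
        T_int = T_int - np.linalg.solve(J, R)

    print("interior temperatures:", np.round(T_int, 2), "iterations:", it)
    ```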

  15. Effective economics of nuclear fuel power complex

    International Nuclear Information System (INIS)

    Shevelev, Ya.V.; Klimenko, A.V.

    1996-01-01

    Problems of the economic theory and practice of the functioning of the nuclear fuel power complex (NFPC) are considered. The use of the principle of market equilibrium for optimization of the NFPC hierarchical system is analyzed. The main attention is paid to determining the prices of production and consumption of the NFPC enterprises. Economic approaches to the optimal calculations are described. The ecological safety of NPPs and NFPC enterprises is analyzed. A concept of market socialism is presented

  16. XI International conference Problems of solvation and complex formation in solutions, and VI Conference of young scientists Theoretical and experimental chemistry of liquid-phase systems (Krestovsky readings). Summary of reports

    International Nuclear Information System (INIS)

    2011-01-01

    The collection contains materials of plenary, sectional and poster sessions, presented at the XI International conference Problems of solvation and complex formation in solutions, and VI Conference of young scientists Theoretical and experimental chemistry of liquid-phase systems (Krestovsky readings). Theoretical questions and new experimental methods of chemistry of solutions, structure and dynamics of molecular and ion-molecular systems in solution and at the phase boundary; modern aspects of applied chemistry of solutions are discussed [ru

  17. Tourette Syndrome: Overview and Classroom Interventions. A Complex Neurobehavioral Disorder Which May Involve Learning Problems, Attention Deficit Hyperactivity Disorder, Obsessive Compulsive Symptoms, and Stereotypical Behaviors.

    Science.gov (United States)

    Fisher, Ramona A.; Collins, Edward C.

    Tourette Syndrome is conceptualized as a neurobehavioral disorder, with behavioral aspects that are sometimes difficult for teachers to understand and deal with. The disorder has five layers of complexity: (1) observable multiple motor, vocal, and cognitive tics and sensory involvement; (2) Attention Deficit Hyperactivity Disorder; (3)…

  18. Developing Seventh Grade Students' Understanding of Complex Environmental Problems with Systems Tools and Representations: A Quasi-Experimental Study

    Science.gov (United States)

    Doganca Kucuk, Zerrin; Saysel, Ali Kerem

    2018-01-01

    A systems-based classroom intervention on environmental education was designed for seventh grade students; the results were evaluated to see its impact on the development of systems thinking skills and standard science achievement and whether the systems approach is a more effective way to teach environmental issues that are dynamic and complex. A…

  19. L.E.A.D.: A Framework for Evidence Gathering and Use for the Prevention of Obesity and Other Complex Public Health Problems

    Science.gov (United States)

    Chatterji, Madhabi; Green, Lawrence W.; Kumanyika, Shiriki

    2014-01-01

    This article summarizes a comprehensive, systems-oriented framework designed to improve the use of a wide variety of evidence sources to address population-wide obesity problems. The L.E.A.D. framework (for "Locate" the evidence, "Evaluate" the evidence, "Assemble" the evidence, and inform "Decisions"),…

  20. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD-converters

  1. Creativity for Problem Solvers

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    2009-01-01

    This paper presents some modern and interdisciplinary concepts about creativity and creative processes specially related to problem solving. Central publications related to the theme are briefly reviewed. Creative tools and approaches suitable to support problem solving are also presented. Finally......, the paper outlines the author’s experiences using creative tools and approaches to: Facilitation of problem solving processes, strategy development in organisations, design of optimisation systems for large scale and complex logistic systems, and creative design of software optimisation for complex non...

  2. Going beyond the hero in leadership development: the place of healthcare context, complexity and relationships: Comment on "Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?".

    Science.gov (United States)

    Ford, Jackie

    2015-04-01

    There remains a conviction that the torrent of publications and the financial outlay on leadership development will create managers with the skills and characters of perfect leaders, capable of guiding healthcare organisations through the challenges and crises of the 21st century. The focus of much attention continues to be the search for the (illusory) core set of heroic qualities, abilities or competencies that will enable the development of leaders to achieve levels of supreme leadership and organisational performance. This brief commentary adds support to McDonald's (1) call for recognition of the complexity of the undertaking.

  3. Going beyond the Hero in Leadership Development: The Place of Healthcare Context, Complexity and Relationships; Comment on “Leadership and Leadership Development in Healthcare Settings – A Simplistic Solution to Complex Problems?”

    Directory of Open Access Journals (Sweden)

    Jackie Ford

    2015-04-01

    Full Text Available There remains a conviction that the torrent of publications and the financial outlay on leadership development will create managers with the skills and characters of perfect leaders, capable of guiding healthcare organisations through the challenges and crises of the 21st century. The focus of much attention continues to be the search for the (illusory) core set of heroic qualities, abilities or competencies that will enable the development of leaders to achieve levels of supreme leadership and organisational performance. This brief commentary adds support to McDonald’s (1) call for recognition of the complexity of the undertaking.

  4. Using machine-learning methods to analyze economic loss function of quality management processes

    Science.gov (United States)

    Dzedik, V. A.; Lontsikh, P. A.

    2018-05-01

    In the analysis of quality management systems, the economic component is often analyzed insufficiently. To overcome this issue, it is necessary to take the concept of the economic loss function beyond tolerance-based thinking and address it directly. Input data about economic losses in processes have a complex form, so using standard tools to solve this problem is complicated. The use of machine learning techniques allows one to obtain precise models of the economic loss function even from the most complex input data. The results of such analysis contain data about the true efficiency of a process and can be used to make investment decisions.
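
    A minimal, purely illustrative sketch of the idea (not the authors' method or data): a smooth economic loss function is fitted to noisy loss observations with a simple polynomial regression; the synthetic data, the cubic degree and all variable names are assumptions made for the example.

        # Sketch: learn an economic loss function L(x) from noisy process data instead of
        # assuming a fixed tolerance-based (step) loss. All data and parameters are
        # illustrative assumptions, not the authors' data.
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(-3.0, 3.0, 200)        # deviation of a quality characteristic from target
        loss = 4.0 * x**2 + 0.5 * x**3 + rng.normal(0.0, 2.0, x.size)  # observed economic loss

        # Fit a cubic polynomial as a simple stand-in for a machine-learned loss model.
        coeffs = np.polyfit(x, loss, deg=3)
        loss_model = np.poly1d(coeffs)

        print("Fitted loss coefficients (highest degree first):", np.round(coeffs, 2))
        print("Predicted loss at a deviation of 1.5:", round(float(loss_model(1.5)), 2))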

  5. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    With a new modification of the parallel plate analyzer, second-order focus is obtained at an arbitrary injection angle. An analyzer of this kind with a small injection angle will have the advantage of a small operational voltage, compared to the Proca and Green analyzer where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high energy particles in the MeV range. (author)

  6. Prioritization and analysis of health problems from an equity perspective: experience at the local level in Venezuela

    Directory of Open Access Journals (Sweden)

    Henny Heredia

    2011-03-01

    Full Text Available This article analyzes the application of the explanatory stage of Strategic Situational Planning (SSP) and the Analysis of the Health Situation (ASIS) as approaches that, combined, make it possible to prioritize, from an equity perspective, health problems at the local level that are feasible to address. The application of both approaches is presented through a case study carried out in the parish of Zuata, Aragua State, Venezuela. The key actors of that parish prioritized the low coverage of drinking water as a health problem. In analyzing the problem, the following key causes were selected to prepare the proposed action plan: scarce community participation, weak governmental plans, absence of urban planning policies, inadequate administration of public resources, and little awareness of the rational use of water. It is concluded that the articulation of SSP and ASIS helps generate inputs which, when turned by the actors into an action plan, can contribute to reducing inequities. Likewise, the active participation of the actors makes it possible to bring out the real problems of the population and to build an agenda of demands.

  7. Outcomes-Based Authentic Learning, Portfolio Assessment, and a Systems Approach to ‘Complex Problem-Solving’: Related Pillars for Enhancing the Innovative Role of PBL in Future Higher Education

    Directory of Open Access Journals (Sweden)

    Cameron Richards

    2015-06-01

    Full Text Available The challenge of better reconciling individual and collective aspects of innovative problem-solving can be productively addressed to enhance the role of PBL as a key focus of the creative process in future higher education. This should involve ‘active learning’ approaches supported by related processes of teaching, assessment and curriculum. As Biggs & Tan (2011) have suggested, an integrated or systemic approach is needed for the most effective practice of outcomes-based education, one that is also especially relevant for addressing relatively simple as well as more complex problems. Such a model will be discussed in relation to the practical example of a Masters subject conceived with interdisciplinary implications, applications, and transferability: ‘sustainable policy studies in science, technology and innovation’. Different modes of PBL might be encouraged in terms of the authentic kinds of ‘complex problem-solving’ issues and challenges which increasingly confront an interdependent and changing world. PBL can be further optimized when projects or cases also involve contexts and examples of research and inquiry. However, perhaps the most crucial pillar is a model of portfolio assessment for linking and encouraging as well as distinguishing individual contributions to collaborative projects and activities.

  8. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach for analyzing and verifying inter-system communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so that systems of systems must communicate with each other to fulfill their tasks. Such systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low level network traffic, converts the low level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  9. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

    Full Text Available Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of vector-sensor arrays. The weight vector update of the CQLMS algorithm is derived based on the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
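
    For orientation only, the sketch below implements the ordinary complex-valued LMS weight update for an adaptive beamformer, i.e. the scalar-complex analogue of the CQLMS rule described above rather than the quaternion algorithm itself; the array size, step size, arrival angle and signal model are all assumptions.

        # Minimal complex-valued LMS adaptive beamformer (not the quaternion CQLMS of the paper).
        # Array geometry, step size and signals are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        n_sensors, n_snapshots, mu = 8, 2000, 0.01
        theta = np.deg2rad(20.0)                          # assumed direction of the desired signal
        steering = np.exp(-1j * np.pi * np.arange(n_sensors) * np.sin(theta))

        s = (rng.standard_normal(n_snapshots) + 1j * rng.standard_normal(n_snapshots)) / np.sqrt(2)
        noise = 0.1 * (rng.standard_normal((n_sensors, n_snapshots))
                       + 1j * rng.standard_normal((n_sensors, n_snapshots)))
        x = np.outer(steering, s) + noise                 # array snapshots, one column per time step

        w = np.zeros(n_sensors, dtype=complex)            # beamformer weights
        for k in range(n_snapshots):
            y = np.vdot(w, x[:, k])                       # beamformer output  w^H x
            e = s[k] - y                                  # error against the reference signal
            w += mu * x[:, k] * np.conj(e)                # LMS weight update

        print("Array gain toward the assumed direction:", round(abs(np.vdot(w, steering)), 3))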

  10. Encyclopedia of Complexity and Systems Science

    CERN Document Server

    Meyers, Robert A

    2009-01-01

    Encyclopedia of Complexity and Systems Science provides an authoritative single source for understanding and applying the concepts of complexity theory together with the tools and measures for analyzing complex systems in all fields of science and engineering. The science and tools of complexity and systems science include theories of self-organization, complex systems, synergetics, dynamical systems, turbulence, catastrophes, instabilities, nonlinearity, stochastic processes, chaos, neural networks, cellular automata, adaptive systems, and genetic algorithms. Examples of near-term problems and major unknowns that can be approached through complexity and systems science include: The structure, history and future of the universe; the biological basis of consciousness; the integration of genomics, proteomics and bioinformatics as systems biology; human longevity limits; the limits of computing; sustainability of life on earth; predictability, dynamics and extent of earthquakes, hurricanes, tsunamis, and other n...

  11. THE PROBLEMS OF MODELING THE RELIABILITY STRUCTURE OF THE COMPLEX TECHNICAL SYSTEM ON THE BASIS OF A STEAM‐WATER SYSTEM OF THE ENGINE ROOM

    Directory of Open Access Journals (Sweden)

    Leszek CHYBOWSKI

    2012-04-01

    Full Text Available In the paper the concept of a system structure, with particular emphasis on the reliability structure, has been presented. Advantages and disadvantages of modeling the reliability structure of a system using reliability block diagrams (RBD) have been shown. RBD models of a marine steam-water system constructed according to the concepts of ‘multi-component’, ‘one-component’ and mixed models have been discussed. Critical remarks on the practical application of models which recognize only the structural surplus have been made. The significant value of the model by professors Smalko and Jaźwiński, called by them the ‘default reliability structure’, has been pointed out. The necessity of building a new type of model, quality-quantity models, useful in the author's methodology of multi-criteria analysis of the importance of elements in the reliability structure of complex technical systems, has also been pointed out.
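
    To make the RBD evaluation concrete, here is a minimal generic sketch (not the authors' model of the steam-water system): it combines component reliabilities through series and parallel blocks; the block structure and the numerical reliabilities are invented for the illustration.

        # Sketch: system reliability from a reliability block diagram (RBD).
        # The block structure and component reliabilities are invented for illustration.
        import numpy as np

        def series(*r):
            """Series structure: the system works only if every block works."""
            return float(np.prod(r))

        def parallel(*r):
            """Parallel (redundant) structure: the system fails only if every block fails."""
            return 1.0 - float(np.prod([1.0 - ri for ri in r]))

        # Example: two redundant pumps in series with a condenser and a boiler.
        r_system = series(parallel(0.95, 0.95), 0.99, 0.98)
        print(f"System reliability: {r_system:.4f}")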

  12. Class III malocclusion with complex problems of lateral open bite and severe crowding successfully treated with miniscrew anchorage and lingual orthodontic brackets.

    Science.gov (United States)

    Yanagita, Takeshi; Kuroda, Shingo; Takano-Yamamoto, Teruko; Yamashiro, Takashi

    2011-05-01

    In this article, we report the successful use of miniscrews in a patient with an Angle Class III malocclusion, lateral open bite, midline deviation, and severe crowding. Simultaneously resolving such problems with conventional Class III treatment is difficult. In this case, the treatment procedure was even more challenging because the patient preferred to have lingual brackets on the maxillary teeth. As a result, miniscrews were used to facilitate significant asymmetric tooth movement in the posterior and downward directions; this contributed to the camouflage of the skeletal mandibular protrusion together with complete resolution of the severe crowding and lateral open bite. Analysis of the jaw motion showed that irregularities in chewing movement were also resolved, and a stable occlusion was achieved. Improvements in the facial profile and dental arches remained stable at the 18-month follow-up. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  13. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents. Part 4: IDAC causal model of operator problem-solving response

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.H.J. [Center for Risk and Reliability, University of Maryland, College Park, MD 20742 (United States) and Paul Scherrer Institute, 5232 Villigen PSI (Switzerland)]. E-mail: yhc@umd.edu; Mosleh, A. [Center for Risk and Reliability, University of Maryland, College Park, MD 20742 (United States)

    2007-08-15

    This is the fourth in a series of five papers describing the Information, Decision, and Action in Crew context (IDAC) operator response model for human reliability analysis. An example application of this modeling technique is also discussed in this series. The model has been developed to probabilistically predict the responses of a nuclear power plant control room operating crew in accident conditions. The operator response spectrum includes cognitive, emotional, and physical activities during the course of an accident. This paper assesses the effects of the performance-influencing factors (PIFs) affecting the operators' problem-solving responses, including information pre-processing (I), diagnosis and decision making (D), and action execution (A). Literature support and justifications are provided for the assessment of the influences of PIFs.

  14. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents. Part 4: IDAC causal model of operator problem-solving response

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the fourth in a series of five papers describing the Information, Decision, and Action in Crew context (IDAC) operator response model for human reliability analysis. An example application of this modeling technique is also discussed in this series. The model has been developed to probabilistically predict the responses of a nuclear power plant control room operating crew in accident conditions. The operator response spectrum includes cognitive, emotional, and physical activities during the course of an accident. This paper assesses the effects of the performance-influencing factors (PIFs) affecting the operators' problem-solving responses, including information pre-processing (I), diagnosis and decision making (D), and action execution (A). Literature support and justifications are provided for the assessment of the influences of PIFs.

  15. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  16. Waste Collection Vehicle Routing Problem: Literature Review

    OpenAIRE

    Hui Han; Eva Ponce Cueto

    2015-01-01

    Waste generation is an issue which has caused wide public concern in modern societies, not only because of the quantitative rise in the amount of waste generated, but also because of the increasing complexity of some products and components. Waste collection is a highly relevant activity in the reverse logistics system, and how to collect waste in an efficient way is an area that needs to be improved. This paper analyzes the major contributions on the Waste Collection Vehicle Routing Problem (WCVRP) in the litera...

  17. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  18. The death of neonates: the multi-professional team’s perceptions of the problem in the light of its complexity

    Directory of Open Access Journals (Sweden)

    Larissa Spies Subutzki

    2018-01-01

    Full Text Available Objective: to learn about the perception which the multi-professional team of a neonatal intensive care unit have of the process of dying and death of neonates. Method: this is a qualitative, exploratory-descriptive study, undertaken with three focus groups made up of 35 professionals from the multi-professional team of the neonatal and pediatric intensive care unit of a hospital located in the northwest region of the State of Rio Grande do Sul in Brazil. The data were analysed on the basis of content analysis. Results: the data yielded four thematic categories: Death: an interruption of the natural order; Death: dying is a complex process for which there are no answers; Death: awakening to a new state of life; and Death: the coexistence of the tangible and the intangible. Conclusion: the study concluded that death is still conceived of as a fragmented phenomenon, dissociated from the process of human life, although there is evidence that the professionals believe in the prospect of being able to speak about and reflect on the matter and expand their theoretical and practical understanding of death.

  19. Complex organic pollutant mixtures originating from industrial and municipal emissions in surface waters of the megacity Jakarta-an example of a water pollution problem in emerging economies.

    Science.gov (United States)

    Dsikowitzky, Larissa; Hagemann, Lukas; Dwiyitno; Ariyani, Farida; Irianto, Hari Eko; Schwarzbauer, Jan

    2017-12-01

    During the last decades, the global industrial production partly shifted from industrialized nations to emerging and developing countries. In these upcoming economies, the newly developed industrial centers are generally located in densely populated areas, resulting in the discharge of often only partially treated industrial and municipal wastewaters into the surface waters. There is a huge gap in knowledge about the composition of the complex organic pollutant mixtures occurring in such heavily impacted areas. Therefore, we applied a non-target screening to comprehensively assess river pollution in a large industrial area located in the megacity Jakarta. More than 100 structurally diverse organic contaminants were identified, some of which were reported here for the first time as environmental contaminants. The concentrations of paper manufacturing chemicals in river water - for example, of the endocrine-disrupting compound bisphenol A (50-8000 ng L-1) - were as high as in pure untreated paper industry wastewaters. The non-target screening approach is the adequate tool for the identification of water contaminants in the new global centers of industrial manufacturing - as the first crucial step towards the evaluation of as yet unrecognized environmental risks.

  20. Central Asia’s Ili River Ecosystem as a Wicked Problem: Unraveling Complex Interrelationships at the Interface of Water, Energy, and Food

    Directory of Open Access Journals (Sweden)

    Steven G. Pueppke

    2018-04-01

    Full Text Available The Ili River originates in the mountains of Xinjiang, China, and flows across an increasingly arid landscape before terminating in Kazakhstan’s Lake Balkhash, which has no outlet to the ocean. The river has been extensively impounded and diverted over the past half century to produce hydroelectric power and food on irrigated land. Water withdrawals are increasing to the extent that they are beginning to threaten the ecosystem, just as it is becoming stressed by altered inflows as glaciers retreat and disappear. If the Ili River ecosystem is to be preserved, it is crucial that we thoroughly understand the spatial and temporal nuances of the interrelationships between water, energy, and food—and the vulnerability of these components to climate change. The ecosystem has all of the characteristics of a classically-defined “wicked problem”, and so it warrants treatment as a complex and dynamic challenge subject to changing assumptions, unexpected consequences, and strong social and economic overtones. Research should thus focus not just on new knowledge about the water, energy, or food component, but on advancing our understanding of the ecosystem as a whole. This will require the participation of interdisciplinary teams of researchers with both tacit and specialized knowledge.

  1. Problems created on delayed supply of feedstock for the HDPE plant of Jam Petrochemical Complex (JPC) in Iran : a case study[The 1. international construction specialty conference

    Energy Technology Data Exchange (ETDEWEB)

    Etemadzadeh, S.; Mortaheb, M. [Sharif Univ. of Technology, Tehran (Iran, Islamic Republic of). Dept. of Civil Engineering; Beigi, H. [Jam Petrochemical Co., Assaluyeh, Bushehr (Iran, Islamic Republic of)

    2006-07-01

    The total loss incurred due to delays in the supply of feedstock and utilities over the past 2 years, during Iran's construction boom in petrochemical plants, was evaluated. The problems associated with the delay of feedstock supply and its impact on the final stages of a petrochemical project were discussed, and the factors that affect the financial viability of a project were identified. In particular, the paper reviewed issues regarding equipment warranties and their premature expiration; unavoidable rework prior to pre-commissioning; preservation and maintenance costs of equipment in a humid and hot environment; changes in technology and market demands; and additional fixed costs covering salaries and maintenance costs. Remedial action plans addressing these issues were proposed in order to reduce the costs and any further delays of a project. The importance of technical audits at the feasibility stage of a project was emphasized, along with the need to verify the accuracy of initial data for proper design and completion of a project.

  2. Waste Collection Vehicle Routing Problem: Literature Review

    Directory of Open Access Journals (Sweden)

    Hui Han

    2015-08-01

    Full Text Available Waste generation is an issue which has caused wide public concern in modern societies, not only because of the quantitative rise in the amount of waste generated, but also because of the increasing complexity of some products and components. Waste collection is a highly relevant activity in the reverse logistics system, and how to collect waste in an efficient way is an area that needs to be improved. This paper analyzes the major contributions on the Waste Collection Vehicle Routing Problem (WCVRP) in the literature. Based on a classification of waste collection (residential, commercial and industrial), the key findings for these three types of waste collection are first presented. Then, according to the models (node routing problems and arc routing problems) used to represent the WCVRP, the different methods and techniques applied to solve the WCVRP are analyzed. This paper attempts to serve as a roadmap of the research literature produced in the field of WCVRP.
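
    As a purely illustrative companion to this review (it is not one of the surveyed algorithms), the sketch below applies a nearest-neighbour heuristic to a toy node-routing instance of waste collection; the depot, the bin coordinates and the single-vehicle, unlimited-capacity setting are invented assumptions.

        # Toy node-routing sketch for waste collection: a nearest-neighbour tour from a depot.
        # Coordinates and the single-vehicle, unlimited-capacity setting are illustrative assumptions.
        import math

        depot = (0.0, 0.0)
        bins = [(2.0, 3.0), (5.0, 1.0), (1.0, 7.0), (6.0, 6.0), (3.0, 4.0)]   # collection points

        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        route, current, remaining = [depot], depot, list(bins)
        while remaining:
            nxt = min(remaining, key=lambda p: dist(current, p))   # greedily visit the closest bin
            route.append(nxt)
            remaining.remove(nxt)
            current = nxt
        route.append(depot)                                        # return to the depot

        length = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
        print("Route:", route)
        print(f"Total distance: {length:.2f}")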

  3. Balance Problems

    Science.gov (United States)

    ... often, it could be a sign of a balance problem. Balance problems can make you feel unsteady. You may ... related injuries, such as a hip fracture. Some balance problems are due to problems in the inner ...

  4. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator-controlled logical windows. It is implemented in the CAMAC standard. A single crate contains all required modules and is controlled by a PDP-11/10 minicomputer. The configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  5. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach. It aims to offer a productive and interactive environment through the combination of FCC and SWAN software.

  6. Problems of nuclear reactor safety. Vol. 1; Problemy bezopasnosti yaderno-ehnergeticheskikh ustanovok. Tom 1

    Energy Technology Data Exchange (ETDEWEB)

    Shal'nov, A V [Moskovskij Inzhenerno-Fizicheskij Inst., Moscow (Russian Federation)]

    1996-12-31

    Proceedings of the 9. Topical Meeting 'Problems of nuclear reactor safety' are presented. The papers include results of studies and developments associated with methods of calculation and complex computerized simulation of stationary and transient processes in nuclear power plants. The main problems of reactor safety are discussed, and reactor accidents at operating NPPs are analyzed.

  7. [Population problem, comprehension problem].

    Science.gov (United States)

    Tallon, F

    1993-08-01

    Overpopulation of developing countries in general, and Rwanda in particular, is not just their problem but a problem for developed countries as well. Rapid population growth is a key factor in the increase of poverty in sub-Saharan Africa. Population growth outstrips food production. Africa receives more and more foreign food, economic, and family planning aid each year. The Government of Rwanda encourages reduced population growth. Some people criticize it, but this criticism results in mortality and suffering. One must combat this ignorance, but attitudes change slowly. Some of these same people find the government's acceptance of family planning an invasion of their privacy. Others complain that rich countries do not have campaigns to reduce births, so why should Rwanda do so? The rate of schooling does not increase in Africa, even though the number of children in school increases, because of rapid population growth. Education is key to improvements in Africa's socioeconomic growth. Thus, Africa is underpopulated in terms of potentiality but overpopulated in terms of reality, current conditions, and possibilities of overexploitation. Africa needs to invest in human resources. Families need to save, and to do so, they must refrain from having many children. Africa should resist the temptation to waste, as rich countries do, and denounce it. Africa needs to become more independent of these countries, but structural adjustment plans, growing debt, and rapid population growth limit national independence. Food aid is a means for developed countries to dominate developing countries. Modernization through foreign aid has had some positive effects on developing countries (e.g., improved hygiene, mortality reduction), but these also sparked rapid population growth. Rwandan society is no longer traditional, but it is also not yet modern. A change in mentality to fewer births, better quality of life for living infants, better education, and less burden for women must occur

  8. Using systems thinking and the Intervention Level Framework to analyse public health planning for complex problems: Otitis media in Aboriginal and Torres Strait Islander children.

    Science.gov (United States)

    Durham, Jo; Schubert, Lisa; Vaughan, Lisa; Willis, Cameron D

    2018-01-01

    Middle ear disease (otitis media) is endemic among Aboriginal and Torres Strait Islander children in Australia and represents an important cause of hearing loss. The disease is the result of a mix of biological, environmental and host risk factors that interact in complex, non-linear ways along a dynamic continuum. As such, it is generally recognised that a holistic, systems approach is required to reverse the high rates of otitis media in Aboriginal and Torres Strait Islander children. The objective of this paper is to examine the alignment between efforts designed to address otitis media in Aboriginal and Torres Strait Islander children in Queensland, Australia and core concepts of systems thinking. This paper's overall purpose is to identify which combination of activities, and at which level, hold the potential to facilitate systems changes to better support ear health among Aboriginal and Torres Strait Islander children. We began with a review of documents identified in consultation with stakeholders and an online search. In addition, key informants were invited to participate in an online survey and a face-to-face or phone interview. Qualitative interviews using a semi-structured interview guide were used to explore survey responses in more depth. We also undertook interviews at the community level to elicit a diverse range of views. Ideas, statements or activities reported in the documents and interviews as being performed under the Intervention Level Framework were identified using qualitative thematic and content analysis. A quantitative descriptive analysis was also undertaken, whereby data was extracted into an Excel spreadsheet and coded under the relevant strategic directions and performance indicators of the Framework. Subsequently, we coded activities against the five-level intervention framework developed by Malhi and colleagues, that is: 1) paradigm; 2) goals; 3) system structure; 4) feedback and delays; and 5) structural elements. Overall, twenty

  9. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection methods, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, the detection methods, the number of tests per unit time, and the analytical time and speed per test.

  10. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built on large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of use and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive licence.

  11. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  12. Computational complexity in entanglement transformations

    Science.gov (United States)

    Chitambar, Eric A.

    In physics, systems having three parts are typically much more difficult to analyze than those having just two. Even in classical mechanics, predicting the motion of three interacting celestial bodies remains an insurmountable challenge while the analogous two-body problem has an elementary solution. It is as if just by adding a third party, a fundamental change occurs in the structure of the problem that renders it unsolvable. In this thesis, we demonstrate how such an effect is likewise present in the theory of quantum entanglement. In fact, the complexity differences between two-party and three-party entanglement become quite conspicuous when comparing the difficulty in deciding what state changes are possible for these systems when no additional entanglement is consumed in the transformation process. We examine this entanglement transformation question and its variants in the language of computational complexity theory, a powerful subject that formalizes the concept of problem difficulty. Since deciding feasibility of a specified bipartite transformation is relatively easy, this task belongs to the complexity class P. On the other hand, for tripartite systems, we find the problem to be NP-Hard, meaning that its solution is at least as hard as the solution to some of the most difficult problems humans have encountered. One can then rigorously defend the assertion that a fundamental complexity difference exists between bipartite and tripartite entanglement since unlike the former, the full range of forms realizable by the latter is incalculable (assuming P≠NP). However, similar to the three-body celestial problem, when one examines a special subclass of the problem---invertible transformations on systems having at least one qubit subsystem---we prove that the problem can be solved efficiently. As a hybrid of the two questions, we find that the question of tripartite to bipartite transformations can be solved by an efficient randomized algorithm. Our results are

  13. A nonlinear oscillatory problem

    International Nuclear Information System (INIS)

    Zhou Qingqing.

    1991-10-01

    We have studied the nonlinear oscillatory problem of an orthotropic cylindrical shell and analyzed the character of the oscillatory system. The stability condition of the oscillatory system has been given. (author). 6 refs

  14. Solving applied mathematical problems with Matlab

    CERN Document Server

    Xue, Dingyu

    2008-01-01

    Computer Mathematics Language-An Overview. Fundamentals of MATLAB Programming. Calculus Problems. MATLAB Computations of Linear Algebra Problems. Integral Transforms and Complex Variable Functions. Solutions to Nonlinear Equations and Optimization Problems. MATLAB Solutions to Differential Equation Problems. Solving Interpolations and Approximations Problems. Solving Probability and Mathematical Statistics Problems. Nontraditional Solution Methods for Mathematical Problems.

  15. Problems over Information Systems

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    The problems of estimating the minimum average time complexity of decision trees and designing efficient algorithms are complex in the general case. The upper bounds described in Chap. 2.4.3 cannot be applied directly due to the large computational complexity of the parameter M(z). Under reasonable assumptions about the relation of P and NP, there are no polynomial time algorithms with a good approximation ratio [12, 32]. One of the possible solutions is to consider particular classes of problems and improve the existing results using characteristics of the considered classes. © Springer-Verlag Berlin Heidelberg 2011.

  16. Managing Complexity

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Posse, Christian; Malard, Joel M.

    2004-08-01

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today’s most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically-based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and for deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood.

  17. Complex variables

    CERN Document Server

    Flanigan, Francis J

    2010-01-01

    A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer, noting its originality, wrote: "A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation." Not so here: Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion

  18. Speech Problems

    Science.gov (United States)

    ... Staying Safe Videos for Educators Search English Español Speech Problems KidsHealth / For Teens / Speech Problems What's in ... a person's ability to speak clearly. Some Common Speech and Language Disorders Stuttering is a problem that ...

  19. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  20. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many applications, making it a very significant and useful tool, but it can also be dangerous for living beings exposed to uncontrolled doses. However, due to its characteristics, it cannot be perceived by any of the human senses, so radiation detectors and additional devices are required to quantify and classify it. A multichannel analyzer is responsible for sorting the different pulse heights generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, and for the internal control of the multichannel analyzer an application was written for the ARM processor in the C language. In the second phase, the virtual instrument was developed for management, control and visualization of the results. The data obtained from the system were displayed graphically as a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable to those of commercial multichannel analyzers. (Author)
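
    The pulse-height sorting step at the heart of a multichannel analyzer can be illustrated in software; the sketch below bins simulated ADC pulse heights into 2^N channels and accumulates a spectrum, with the ADC resolution and the simulated data being assumptions (the actual design described above is implemented in VHDL on the FPGA).

        # Software sketch of the multichannel-analyzer binning step: sort digitized pulse
        # heights into 2**adc_bits channels and accumulate a spectrum (histogram).
        # The ADC resolution and the simulated pulses are illustrative assumptions.
        import numpy as np

        adc_bits = 12
        n_channels = 2 ** adc_bits

        rng = np.random.default_rng(2)
        pulse_heights = rng.normal(loc=1800, scale=40, size=50_000).astype(int)   # fake photopeak
        pulse_heights = np.clip(pulse_heights, 0, n_channels - 1)

        spectrum = np.bincount(pulse_heights, minlength=n_channels)   # counts per channel
        peak_channel = int(np.argmax(spectrum))
        print(f"Peak at channel {peak_channel} with {spectrum[peak_channel]} counts")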

  1. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  2. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming computer language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., "sensors" and "cameras") upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a "search" mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen "intruder detection" probability and elapsed time criteria (i.e., the program finds the "weakest paths" from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
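
    The third function (the "weakest path" search) can be illustrated with a small graph-search sketch; it is written in Python rather than Prolog, and the facility layout, detection probabilities, traversal times and thresholds are all invented assumptions, not data from the actual tool.

        # Sketch of the "weakest path" search: enumerate paths between two locations and keep
        # those whose cumulative detection probability and elapsed time satisfy user-chosen
        # criteria. The facility graph and all numbers are invented for illustration.
        site = {                 # edge: (detection probability, traversal time in seconds)
            "fence":  {"yard": (0.6, 30)},
            "yard":   {"door_a": (0.8, 60), "door_b": (0.5, 90)},
            "door_a": {"vault": (0.9, 45)},
            "door_b": {"vault": (0.7, 45)},
        }

        def weak_paths(start, goal, max_p_detect, max_time, path=None, p_miss=1.0, t=0):
            path = (path or []) + [start]
            if start == goal:
                if 1.0 - p_miss <= max_p_detect and t <= max_time:
                    yield path, 1.0 - p_miss, t
                return
            for nxt, (p, dt) in site.get(start, {}).items():
                if nxt not in path:                      # avoid cycles
                    yield from weak_paths(nxt, goal, max_p_detect, max_time,
                                          path, p_miss * (1.0 - p), t + dt)

        for route, p_detect, t in weak_paths("fence", "vault", max_p_detect=0.95, max_time=200):
            print(route, f"detection probability {p_detect:.2f}", f"time {t} s")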

  3. Methodological problems of monetary evaluations of complex environmental damages - the example of the forest damages in the Federal Republic of Germany. Zur monetaeren Bewertung von Umweltschaeden - methodische Untersuchung am Beispiel der Waldschaeden

    Energy Technology Data Exchange (ETDEWEB)

    Ewers, H J; Brabaender, H D; Brechtel, H M; Both, M; Hayessen, E; Jahn, A; Moehring, B; Moog, M; Nohl, W; Richter, U

    1986-01-01

    The study analyses the methodological problems of monetary evaluations of complex environmental damages by treating an actual example: the monetary evaluation of the present and further expectable forest damages in the Federal Republic of Germany. The state of the forest ecosystems that can be expected under different plausible assumptions about emissions and immissions of important air contaminants such as SO2 and NOx is established using expert surveys and scenario techniques. The monetary consequences of the different states of the forest system (status-quo scenario: air pollution of the early 1980s continues; trend scenario: strong but possible reductions of SO2 and NOx emissions during the simulation period; reference scenario: development path as if no air pollution existed) are estimated for three fields (timber production, recreation and tourism, water and soil protection) and for a period of 77 years (1984 to 2060).

  4. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  5. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  6. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  7. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  8. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film

  9. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    Full Text Available This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the time of events, and has prompted efforts to identify events and to solve problems of storing information, such as keeping it up to date and documented. In this regard, the data distribution systems in a network environment should be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  10. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant and Uhrig modulation schemes for spectral analysis.

  11. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwie; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao; Jing Shiwei

    2005-01-01

    This article introduces the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector and a 4096-channel multichannel analyzer were applied in this system. The multiple linear regression method employed to process the data solved the problem of interference among multiple elements. The prototype (model MZ-MKFY) has been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as low calorific power, total water, ash content, volatile content, and sulfur content, with precision acceptable to the coal industry, are presented.
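
    The multi-element interference correction by multiple linear regression can be sketched as an ordinary least-squares fit; in the sketch below the gamma-line intensities, the target coal parameter and all coefficients are synthetic assumptions made for illustration.

        # Sketch of the multiple-linear-regression step: predict a coal parameter (e.g. ash
        # content) from several gamma-line intensities. All data are synthetic assumptions.
        import numpy as np

        rng = np.random.default_rng(3)
        n_samples = 60
        intensities = rng.uniform(0.5, 2.0, size=(n_samples, 4))   # peak areas of 4 element lines
        true_coeffs = np.array([3.0, -1.2, 0.8, 2.5])
        ash = intensities @ true_coeffs + 5.0 + rng.normal(0, 0.2, n_samples)   # "measured" ash %

        X = np.column_stack([intensities, np.ones(n_samples)])      # add an intercept column
        coeffs, *_ = np.linalg.lstsq(X, ash, rcond=None)            # ordinary least-squares fit

        print("Fitted coefficients (4 lines + intercept):", np.round(coeffs, 2))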

  12. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyser: a model builder, a run-time unit and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, the command and control logics and reactor protection systems. The run-time unit carries out the dialog between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility.

  13. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Full Text Available Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become the key to project management. This paper first analyzes the current state of research on software complexity systematically and points out existing problems in current research. Then, it proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may have a better understanding of complexity. People are the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the composing factors of RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and analyzes and validates the scientific soundness and rationality of this measurement method through a large number of cases.
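
    As a hedged illustration of the entropy-based idea (the paper's exact formulas for organization-hierarchy and communication complexity may differ), the sketch below computes the Shannon entropy of an invented personnel communication distribution.

        # Sketch: Shannon entropy as a complexity measure of personnel communication.
        # The communication-frequency matrix is an invented example; the paper's exact
        # formulation of the RL complexity measures may differ.
        import numpy as np

        # messages exchanged between 4 project members over some period (illustrative)
        comm = np.array([
            [0, 12,  3,  1],
            [9,  0,  7,  2],
            [4,  6,  0, 10],
            [1,  3,  8,  0],
        ], dtype=float)

        p = comm / comm.sum()              # empirical probability of each communication channel
        p = p[p > 0]                       # drop unused channels before taking logarithms
        entropy = -np.sum(p * np.log2(p))  # Shannon entropy in bits

        print(f"Communication entropy: {entropy:.3f} bits "
              f"(maximum for {p.size} active channels: {np.log2(p.size):.3f} bits)")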

  14. Mathematical Models to Determine Stable Behavior of Complex Systems

    Science.gov (United States)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. The functioning of the complex dynamic system is described in terms of chaotic states, self-organized criticality and bifurcations. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models, and by taking strange attractors into account.

  15. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how a CPM network can be used for analysing complex problems in engineering projects.
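
    A minimal generic sketch of a CPM forward pass (not tied to the report's own examples): it computes earliest finish times and the project duration for a small activity network whose activities, durations and dependencies are invented assumptions.

        # Sketch: critical path method (CPM) forward pass over a small activity network.
        # Activities, durations and dependencies are illustrative assumptions.
        durations = {"A": 3, "B": 5, "C": 2, "D": 4, "E": 6}
        predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

        earliest_finish = {}

        def finish(act):
            """Earliest finish = activity duration + latest earliest finish of its predecessors."""
            if act not in earliest_finish:
                start = max((finish(p) for p in predecessors[act]), default=0)
                earliest_finish[act] = start + durations[act]
            return earliest_finish[act]

        project_duration = max(finish(a) for a in durations)
        print("Earliest finish times:", earliest_finish)
        print("Project duration:", project_duration)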

  16. Complex Systems: Control and Modeling Problems

    Science.gov (United States)

    2004-08-23

  17. Parameters calculation of fuel assembly with complex geometry

    International Nuclear Information System (INIS)

    Wu Hongchun; Ju Haitao; Yao Dong

    2006-01-01

    The DRAGON code was developed for the CANDU reactor by Ecole Polytechnique de Montreal of Canada. In order to validate the applicability of the DRAGON code to fuel assemblies with complex geometry, the rod-type fuel assembly of a PWR benchmark problem and the plate-type fuel assembly of an MTR benchmark problem were analyzed with the DRAGON code. Some other fuel assembly shapes are also discussed briefly. The calculation results show that the DRAGON code can be used to calculate fuel assemblies of various shapes with high precision. (authors)

  18. Problem situations in management activity

    OpenAIRE

    N.A. DUBINKO

    2009-01-01

    This article reviews contemporary methodological and theoretical approaches to problem situations in management activity. The types of problem situations that managers deal with in their activity are revealed and analyzed. Rank correlation of problem situations shows distinctions depending on management work experience. Gender distinctions in managers' ideas of management problems are also revealed.

  19. Structural qualia: a solution to the hard problem of consciousness.

    Science.gov (United States)

    Loorits, Kristjan

    2014-01-01

    The hard problem of consciousness has been often claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which also the apparently non-structural aspects of consciousness, namely the so called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms and thereby the hard problem is solved.

  20. Structural qualia: a solution to the hard problem of consciousness

    Directory of Open Access Journals (Sweden)

    Kristjan eLoorits

    2014-03-01

    Full Text Available The hard problem of consciousness has been often claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which also the apparently non-structural aspects of consciousness, namely the so called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms and thereby the hard problem is solved.