WorldWideScience

Sample records for analyzing complex problems

  1. Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.

    Science.gov (United States)

    Lepley, C J

    1998-12-01

    The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.

  2. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  3. Advice Complexity of the Online Search Problem

    DEFF Research Database (Denmark)

    Clemente, Jhoirene; Hromkovič, Juraj; Komm, Dennis

    2016-01-01

    The online search problem is a fundamental problem in finance. The numerous direct applications include searching for optimal prices for commodity trading and trading foreign currencies. In this paper, we analyze the advice complexity of this problem. In particular, we are interested in identifying the minimum amount of information needed in order to achieve a certain competitive ratio. We design an algorithm that reads $b$ bits of advice and achieves a competitive ratio of $(M/m)^{1/(2^b+1)}$, where $M$ and $m$ are the maximum and minimum price in the input. We also give a matching lower bound. Furthermore, we compare the power of advice and randomization for this problem.
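
    To make the quoted bound concrete, the following is a minimal sketch that evaluates the competitive ratio $(M/m)^{1/(2^b+1)}$ for a few advice lengths; the function name and example prices are illustrative choices, not part of the paper.

```python
# Illustrative sketch of the advice-complexity bound quoted above: with b bits of
# advice the algorithm achieves a competitive ratio of (M/m)^(1/(2^b + 1)), where
# M and m are the maximum and minimum price. Example values are hypothetical.

def competitive_ratio(M: float, m: float, b: int) -> float:
    """Competitive ratio (M/m)^(1/(2^b + 1)) for b bits of advice."""
    if not (0 < m <= M):
        raise ValueError("prices must satisfy 0 < m <= M")
    return (M / m) ** (1.0 / (2 ** b + 1))

if __name__ == "__main__":
    M, m = 100.0, 4.0  # example maximum and minimum prices
    for b in range(6):
        print(f"b = {b}: competitive ratio = {competitive_ratio(M, m, b):.4f}")
```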

  4. Quantum trajectories in complex space: One-dimensional stationary scattering problems

    International Nuclear Information System (INIS)

    Chou, C.-C.; Wyatt, Robert E.

    2008-01-01

    One-dimensional time-independent scattering problems are investigated in the framework of the quantum Hamilton-Jacobi formalism. The equation for the local approximate quantum trajectories near the stagnation point of the quantum momentum function is derived, and the first derivative of the quantum momentum function is related to the local structure of quantum trajectories. Exact complex quantum trajectories are determined for two examples by numerically integrating the equations of motion. For the soft potential step, some particles penetrate into the nonclassical region, and then turn back to the reflection region. For the barrier scattering problem, quantum trajectories may spiral into the attractors or away from the repellers in the barrier region. Although the classical potentials extended to complex space show different pole structures for each problem, the quantum potentials present the same second-order pole structure in the reflection region. This paper not only analyzes complex quantum trajectories and the total potentials for these examples but also demonstrates general properties and similar structures of the complex quantum trajectories and the quantum potentials for one-dimensional time-independent scattering problems.

  5. Understanding the determinants of problem-solving behavior in a complex environment

    Science.gov (United States)

    Casner, Stephen A.

    1994-01-01

    It is often argued that problem-solving behavior in a complex environment is determined as much by the features of the environment as by the goals of the problem solver. This article explores a technique to determine the extent to which measured features of a complex environment influence problem-solving behavior observed within that environment. In this study, the technique is used to determine how the complex flight deck and air traffic control environment influences the strategies used by airline pilots when controlling the flight path of a modern jetliner. Data collected aboard 16 commercial flights are used to measure selected features of the task environment. A record of the pilots' problem-solving behavior is analyzed to determine to what extent behavior is adapted to the environmental features that were measured. The results suggest that the measured features of the environment account for as much as half of the variability in the pilots' problem-solving behavior and provide estimates of the probable effects of each environmental feature.

  6. Ordinal optimization and its application to complex deterministic problems

    Science.gov (United States)

    Yang, Mike Shang-Yu

    1998-10-01

    We present in this thesis a new perspective to approach a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
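
    The selection idea sketched in this abstract (goal softening plus ordinal comparison on a cheap, noisy surrogate) can be illustrated with a toy example; the quadratic "true" cost, the noise level, and the set sizes below are assumptions for demonstration only and have nothing to do with the thesis's turbine-blade model.

```python
# Toy illustration of ordinal optimization (goal softening + ordinal comparison):
# rank designs by a cheap noisy estimate and keep a "selected set" of the best-ranked
# ones; with high probability it overlaps the true "good enough" set.
# The performance function, noise level, and set sizes below are assumptions.
import random

random.seed(0)
N, g, s = 1000, 50, 50                      # candidate designs, true good set, selected set
true_perf = {i: (i / N - 0.3) ** 2 for i in range(N)}  # hypothetical true cost (lower is better)
noisy_est = {i: true_perf[i] + random.gauss(0, 0.05) for i in range(N)}  # crude, noisy model

good_set = set(sorted(true_perf, key=true_perf.get)[:g])   # true top-g designs
selected = set(sorted(noisy_est, key=noisy_est.get)[:s])   # ordinal selection on noisy ranks

print(f"overlap between selected and truly good designs: {len(good_set & selected)} of {g}")
```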

  7. Qubit Complexity of Continuous Problems

    National Research Council Canada - National Science Library

    Papageorgiou, A; Traub, J. F

    2005-01-01

    The authors show how to obtain the classical query complexity for continuous problems. They then establish a simple formula for a lower bound on the qubit complexity in terms of the classical query complexity...

  8. Application of NASA management approach to solve complex problems on earth

    Science.gov (United States)

    Potate, J. S.

    1972-01-01

    The application of the NASA management approach to solving complex problems on earth is discussed. The management of the Apollo program is presented as an example of effective management techniques. Four key elements of effective management are analyzed. Photographs of the Cape Kennedy launch sites and supporting equipment are included to support the discussions.

  9. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
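
    For readers unfamiliar with the dissimilarity measure named above, a minimal sketch of the square root of the Jensen-Shannon divergence between two discrete distributions follows; the example distributions are arbitrary.

```python
# Minimal sketch: square root of the Jensen-Shannon divergence between two
# discrete probability distributions, the dissimilarity measure cited above.
# The example distributions P and Q are arbitrary illustrations.
import math

def js_distance(p, q):
    """sqrt(JSD(P, Q)) with JSD = H(M) - (H(P) + H(Q)) / 2 and M = (P + Q) / 2 (natural log)."""
    def entropy(dist):
        return -sum(x * math.log(x) for x in dist if x > 0)
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    jsd = entropy(m) - (entropy(p) + entropy(q)) / 2
    return math.sqrt(max(jsd, 0.0))  # guard against tiny negative rounding errors

P = [0.1, 0.4, 0.5]
Q = [0.3, 0.3, 0.4]
print(js_distance(P, Q))
```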

  10. Problems of development of Kuzbass fuel power-engineering complex

    International Nuclear Information System (INIS)

    Mazikin, V.P.; Razumnyak, N.L.; Shatirov, S.V.; Gladyshev, G.P.

    2000-01-01

    Problems of the development of the Kuzbass fuel and energy complex, with bituminous and brown coal as its main resource, are discussed. Balance reserves of bituminous coal in Kuzbass are estimated at 59 bln. tons, which makes up 29% of the world reserves and nearly 60% of the bituminous coal reserves in Russia. The dynamics of price rises for energy-grade Kuzbass coal are analyzed. The structure of the Kuzbass energy system is considered and the characteristics of its major state district electric power plants and heat and power generating plants are provided. Water-coal and water-black oil fuels are of interest for Kuzbass energy production as alternative sources of energy. Special attention is paid to the environmental problems of coal concentration [ru

  11. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  12. Complex saddle points and the sign problem in complex Langevin simulation

    International Nuclear Information System (INIS)

    Hayata, Tomoya; Hidaka, Yoshimasa; Tanizaki, Yuya

    2016-01-01

    We show that complex Langevin simulation converges to a wrong result within the semiclassical analysis, by relating it to the Lefschetz-thimble path integral, when the path-integral weight has different phases among dominant complex saddle points. Equilibrium solution of the complex Langevin equation forms local distributions around complex saddle points. Its ensemble average approximately becomes a direct sum of the average in each local distribution, where relative phases among them are dropped. We propose that by taking these phases into account through reweighting, we can solve the wrong convergence problem. However, this prescription may lead to a recurrence of the sign problem in the complex Langevin method for quantum many-body systems.

  13. Complex Problems in Entrepreneurship Education: Examining Complex Problem-Solving in the Application of Opportunity Identification

    Directory of Open Access Journals (Sweden)

    Yvette Baggen

    2017-01-01

    In opening up the black box of what entrepreneurship education (EE) should be about, this study focuses on the exploration of relationships between two constructs: opportunity identification (OI) and complex problem-solving (CPS). OI, as a domain-specific capability, is at the core of entrepreneurship research, whereas CPS is a more domain-general skill. On a conceptual level, there are reasons to believe that CPS skills can help individuals to identify potential opportunities in dynamic and nontransparent environments. Therefore, we empirically investigated whether CPS relates to OI among 113 master's students. Data are analyzed using multiple regression. The results show that CPS predicts the number of concrete ideas that students generate, suggesting that having CPS skills supports the generation of detailed, potential business ideas of good quality. The results of the current study suggest that training CPS, as a more domain-general skill, could be a valuable part of what should be taught in EE.

  14. Complex Sequencing Problems and Local Search Heuristics

    NARCIS (Netherlands)

    Brucker, P.; Hurink, Johann L.; Osman, I.H.; Kelly, J.P.

    1996-01-01

    Many problems can be formulated as complex sequencing problems. We will present problems in flexible manufacturing that have such a formulation and apply local search methods like iterative improvement, simulated annealing and tabu search to solve these problems. Computational results are reported.

  15. Common ground, complex problems and decision making

    NARCIS (Netherlands)

    Beers, P.J.; Boshuizen, H.P.A.; Kirschner, P.A.; Gijselaers, W.H.

    2006-01-01

    Organisations increasingly have to deal with complex problems. They often use multidisciplinary teams to cope with such problems where different team members have different perspectives on the problem, different individual knowledge and skills, and different approaches on how to solve the problem.

  16. Complex network problems in physics, computer science and biology

    Science.gov (United States)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases or the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that there are some problems which are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  17. Setting up problems raised by construction of the EDF-Eurodif complex

    International Nuclear Information System (INIS)

    Fontaine, J.P.; Roux, J.P.

    1977-01-01

    After a presentation of the Tricastin site and the nuclear complex to be built there, the main problems of a social, economic or administrative nature arising from the establishment of the site are analyzed, and the solutions applied in order to overcome them are described. In conclusion, the authors note that the largest construction site in Europe should be carried through to completion in the best interests of the local communities, the engineers and the populations concerned [fr

  18. Conceptual and Developmental Analysis of Mental Models: An Example with Complex Change Problems.

    Science.gov (United States)

    Poirier, Louise

    Defining better implicit models of children's actions in a series of situations is of paramount importance to understanding how knowledge is constructed. The objective of this study was to analyze the implicit mental models used by children in complex change problems to understand the stability of the models and their evolution with the child's…

  19. The Process of Solving Complex Problems

    Science.gov (United States)

    Fischer, Andreas; Greiff, Samuel; Funke, Joachim

    2012-01-01

    This article is about Complex Problem Solving (CPS), its history in a variety of research domains (e.g., human problem solving, expertise, decision making, and intelligence), a formal definition and a process theory of CPS applicable to the interdisciplinary field. CPS is portrayed as (a) knowledge acquisition and (b) knowledge application…

  20. Environmental problems in the nuclear weapons complex

    International Nuclear Information System (INIS)

    Fultz, K.O.

    1989-04-01

    This paper provides the authors' views on the environmental problems facing the Department of Energy. Testimony is based on a large body of work, over 50 reports and testimonies since 1981, on environmental, safety, and health aspects of DOE's nuclear weapons complex. This work has shown that the complex faces a wide variety of serious problem areas, including aging facilities, safety concerns which have shut down DOE's production reactors, and environmental cleanup.

  1. Solving Complex Problems to Create Charter Extension Options

    DEFF Research Database (Denmark)

    Tippmann, Esther; Nell, Phillip Christopher

    This study examines subsidiary-driven problem solving processes and their potential to create advanced solutions for charter extension options. Problem solving theory suggests that biases in problem formulation and solution search can confine problem solving potential. We thus argue that balanced solution search, or activities to reconcile the need for some solution features to be locally-tailored while others can be internationally standardized, mediates the relationships between problem complexity/headquarters involvement and the capacity to create advanced solutions. An analysis of 67 projects undertaken by 29 subsidiary units supports our hypotheses, demonstrating that these activities are a means to systematically reduce inherent problem solving biases. This study contributes to problem solving theory, the literature on headquarters' roles in complex organizations, as well as the literature...

  2. Applications of systems thinking and soft operations research in managing complexity from problem framing to problem solving

    CERN Document Server

    2016-01-01

    This book captures current trends and developments in the field of systems thinking and soft operations research which can be applied to solve today's problems of dynamic complexity and interdependency. Such ‘wicked problems’ and messes are seemingly intractable problems, characterized as value-laden, ambiguous, and unstable, that resist being tamed by classical problem solving. Actions and interventions associated with this complex problem space can have highly unpredictable and unintended consequences. Examples of such complex problems include health care reform, global climate change, transnational serious and organized crime, terrorism, homeland security, human security, disaster management, and humanitarian aid. Moving towards the development of solutions to these complex problem spaces depends on the lens we use to examine them and how we frame the problem. It will be shown that systems thinking and soft operations research have had great success in contributing to the management of complexity.

  3. Cross-national comparisons of complex problem-solving strategies in two microworlds.

    Science.gov (United States)

    Güss, C Dominik; Tuason, Ma Teresa; Gerhard, Christiane

    2010-04-01

    Research in the fields of complex problem solving (CPS) and dynamic decision making using microworlds has been mainly conducted in Western industrialized countries. This study analyzes the CPS process by investigating thinking-aloud protocols in five countries. Participants were 511 students from Brazil, Germany, India, the Philippines, and the United States who worked on two microworlds. On the basis of cultural-psychological theories, specific cross-national differences in CPS strategies were hypothesized. Following theories of situatedness of cognition, hypotheses about the specific frequency of problem-solving strategies in the two microworlds were developed. Results of the verbal protocols showed (a) modification of the theoretical CPS model, (b) task dependence of CPS strategies, and (c) cross-national differences in CPS strategies. Participants' CPS processes were particularly influenced by country-specific problem-solving strategies. Copyright © 2009 Cognitive Science Society, Inc.

  4. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  5. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. Therefore, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making, and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each of which takes a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships between trust and action processes and team performance exist. Likewise, no relationships were found for cohesion. Only collective orientation of team members positively influences team performance in complex environments mediated by action processes such as

  6. Finding practical solutions to complex problems: IDRC's fifth annual ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-15

    “IDRC staff share a common goal with the researchers they work with – to find low-cost, down-to-earth solutions to complex problems ...”

  7. One Problem, Many Solutions : Simple Statistical Approaches Help Unravel the Complexity of the Immune System in an Ecological Context

    NARCIS (Netherlands)

    Buehler, Deborah M.; Versteegh, Maaike A.; Matson, Kevin D.; Tieleman, Irene

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many

  8. One problem, many solutions: simple statistical approaches help unravel the complexity of the immune system in an ecological context

    NARCIS (Netherlands)

    Buehler, D.M.; Versteegh, M.A.; Matson, K.D.; Tieleman, B.I.

    2011-01-01

    The immune system is a complex collection of interrelated and overlapping solutions to the problem of disease. To deal with this complexity, researchers have devised multiple ways to measure immune function and to analyze the resulting data. In this way both organisms and researchers employ many

  9. Preparing new nurses with complexity science and problem-based learning.

    Science.gov (United States)

    Hodges, Helen F

    2011-01-01

    Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can see emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in prevalent but dated nursing education models for rising graduates. The science of complexity coupled with problem-based learning and peer review contributes a feasible framework for a constructivist learning environment to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher than expected levels of student satisfaction, increased breadth and analysis of complex data, acknowledgment of community as complex adaptive systems, and overall higher-level thinking skills than in previous years. © 2011, SLACK Incorporated.

  10. Quantum complexity of graph and algebraic problems

    International Nuclear Information System (INIS)

    Doern, Sebastian

    2008-01-01

    This thesis is organized as follows: In Chapter 2 we give some basic notations, definitions and facts from linear algebra, graph theory, group theory and quantum computation. In Chapter 3 we describe three important methods for the construction of quantum algorithms. We present the quantum search algorithm by Grover, the quantum amplitude amplification and the quantum walk search technique by Magniez et al. These three tools are the basis for the development of our new quantum algorithms for graph and algebra problems. In Chapter 4 we present two tools for proving quantum query lower bounds. We present the quantum adversary method by Ambainis and the polynomial method introduced by Beals et al. The quantum adversary tool is very useful to prove good lower bounds for many graph and algebra problems. The part of the thesis containing the original results is organized in two parts. In the first part we consider the graph problems. In Chapter 5 we give a short summary of known quantum graph algorithms. In Chapters 6 to 8 we study the complexity of our new algorithms for matching problems, graph traversal and independent set problems on quantum computers. In the second part of our thesis we present new quantum algorithms for algebraic problems. In Chapters 9 and 10 we consider group testing problems and prove quantum complexity bounds for important problems from linear algebra. (orig.)

  11. Quantum complexity of graph and algebraic problems

    Energy Technology Data Exchange (ETDEWEB)

    Doern, Sebastian

    2008-02-04

    This thesis is organized as follows: In Chapter 2 we give some basic notations, definitions and facts from linear algebra, graph theory, group theory and quantum computation. In Chapter 3 we describe three important methods for the construction of quantum algorithms. We present the quantum search algorithm by Grover, the quantum amplitude amplification and the quantum walk search technique by Magniez et al. These three tools are the basis for the development of our new quantum algorithms for graph and algebra problems. In Chapter 4 we present two tools for proving quantum query lower bounds. We present the quantum adversary method by Ambainis and the polynomial method introduced by Beals et al. The quantum adversary tool is very useful to prove good lower bounds for many graph and algebra problems. The part of the thesis containing the original results is organized in two parts. In the first part we consider the graph problems. In Chapter 5 we give a short summary of known quantum graph algorithms. In Chapters 6 to 8 we study the complexity of our new algorithms for matching problems, graph traversal and independent set problems on quantum computers. In the second part of our thesis we present new quantum algorithms for algebraic problems. In Chapters 9 and 10 we consider group testing problems and prove quantum complexity bounds for important problems from linear algebra. (orig.)

  12. The ESTER particle and plasma analyzer complex for the Phobos mission

    Energy Technology Data Exchange (ETDEWEB)

    Afonin, V.V.; Shutte, N.M. (AN SSSR, Moscow (USSR). Inst. Kosmicheskikh Issledovanij); McKenna-Lawlor, S.; Rusznyak, P. (Space Technology Ireland Ltd., Maynooth (Ireland)); Kiraly, P.; Szabo, L.; Szalai, S.; Szucs, I.T.; Varhalmi, L. (Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics); Marsden, R. (European Space Agency, Noordwijk (Netherlands). Space Science Dept.); Richter, A.; Witte, M. (Max-Planck-Institut fuer Aeronomie, Katlenburg-Lindau (Germany, F.R.))

    1990-05-01

    The ESTER particle and plasma analyzer system for the Phobos Mission comprised a complex of three instruments (LET, SLED and HARP) serviced by a common Data Processing Unit. An account is provided of this complex, its objectives and excellent performance in space. (orig.).

  13. Modeling the Structure and Complexity of Engineering Routine Design Problems

    NARCIS (Netherlands)

    Jauregui Becker, Juan Manuel; Wits, Wessel Willems; van Houten, Frederikus J.A.M.

    2011-01-01

    This paper proposes a model to structure routine design problems as well as a model of their design complexity. The idea is that having a proper model of the structure of such problems enables understanding their complexity, and likewise, a proper understanding of their complexity enables the development

  14. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    Science.gov (United States)

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  15. On the complexity of container stowage planning problems

    DEFF Research Database (Denmark)

    Tierney, Kevin; Pacino, Dario; Jensen, Rune Møller

    2014-01-01

    The optimization of container ship and depot operations embeds the k-shift problem, in which containers must be stowed in stacks such that at most k containers must be removed in order to reach containers below them. We first solve an open problem introduced by Avriel et al. (2000) by showing that changing from uncapacitated to capacitated stacks reduces the complexity of this problem from NP-complete to polynomial. We then examine the complexity of the current state-of-the-art abstraction of container ship stowage planning, wherein containers and slots are grouped together. To do this, we define the hatch overstow problem, in which a set of containers are placed on top of the hatches of a container ship such that the number of containers that are stowed on hatches that must be accessed is minimized. We show that this problem is NP-complete by a reduction from the set-covering problem, which means...

  16. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  17. Correcting environmental problems facing the nuclear weapons complex

    International Nuclear Information System (INIS)

    Rezendes, V.S.

    1990-06-01

    This report discusses DOE's efforts to correct the environmental problems facing the nuclear weapons complex. It focuses on three main points. First, the weapons complex faces a variety of serious and costly environmental problems. Second, during the past year, DOE has made some important changes to its organization that should help change its management focus from one that emphasizes materials production to one that more clearly focuses on environmental concerns. Third, because resolution of DOE's environmental problems will require considerable resources during a period of budgetary constraints, it is imperative that DOE have internal controls in place to ensure that resources are spent efficiently.

  18. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Science.gov (United States)

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. Therefore, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making, and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each of which takes a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships between trust and action processes and team performance exist. Likewise, no relationships were found for cohesion. Only collective orientation of team members positively influences team performance in complex environments mediated by action processes such as

  19. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    Directory of Open Access Journals (Sweden)

    Vera Hagemann

    2017-09-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. Therefore, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes such as coordination, namely emergent states like collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making, and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios had been developed, each of which takes a maximum of 15 min. All teams worked on these two scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show that no relationships between trust and action processes and team performance exist. Likewise, no relationships were found for cohesion. Only collective orientation of team members positively influences team performance in complex environments mediated by action processes

  20. Non-commutative cryptography and complexity of group-theoretic problems

    CERN Document Server

    Myasnikov, Alexei; Ushakov, Alexander

    2011-01-01

    This book is about relations between three different areas of mathematics and theoretical computer science: combinatorial group theory, cryptography, and complexity theory. It explores how non-commutative (infinite) groups, which are typically studied in combinatorial group theory, can be used in public-key cryptography. It also shows that there is remarkable feedback from cryptography to combinatorial group theory because some of the problems motivated by cryptography appear to be new to group theory, and they open many interesting research avenues within group theory. In particular, a lot of emphasis in the book is put on studying search problems, as compared to decision problems traditionally studied in combinatorial group theory. Then, complexity theory, notably generic-case complexity of algorithms, is employed for cryptanalysis of various cryptographic protocols based on infinite groups, and the ideas and machinery from the theory of generic-case complexity are used to study asymptotically dominant prop...

  1. Explicitly solvable complex Chebyshev approximation problems related to sine polynomials

    Science.gov (United States)

    Freund, Roland

    1989-01-01

    Explicitly solvable real Chebyshev approximation problems on the unit interval are typically characterized by simple error curves. A similar principle is presented for complex approximation problems with error curves induced by sine polynomials. As an application, some new explicit formulae for complex best approximations are derived.

  2. New complex variable meshless method for advection-diffusion problems

    International Nuclear Information System (INIS)

    Wang Jian-Fei; Cheng Yu-Min

    2013-01-01

    In this paper, an improved complex variable meshless method (ICVMM) for two-dimensional advection-diffusion problems is developed based on the improved complex variable moving least-squares (ICVMLS) approximation. The equivalent functional of two-dimensional advection-diffusion problems is formed, the variational method is used to obtain the equation system, and the penalty method is employed to impose the essential boundary conditions. The difference method for two-point boundary value problems is used to obtain the discrete equations. Then the corresponding formulas of the ICVMM for advection-diffusion problems are presented. Two numerical examples with different node distributions are used to validate and investigate the accuracy and efficiency of the new method in this paper. It is shown that the ICVMM is very effective for advection-diffusion problems, and has good convergence, accuracy, and computational efficiency.

  3. Analyzing the relationship between problem solving skills and personality characteristics of university students

    OpenAIRE

    SÜLEYMAN DÜNDAR

    2013-01-01

    The aim of this study is to analyze the problem solving skills of university students according to their personal characteristics. We try to find out whether there is a difference in problem solving skills with respect to sex, class and personality harmony characteristics. A personal data form, the Problem Solving Scale and the Hacettepe Personality Scale are used as measurement tools. The results of the study indicate that there is no difference between male and female students in problem solving skills. Problem s...

  4. Addressing complex design problems through inductive learning

    OpenAIRE

    Hanna, S.

    2012-01-01

    Optimisation and related techniques are well suited to clearly defined problems involving systems that can be accurately simulated, but not to tasks in which the phenomena in question are highly complex or the problem ill-defined. These latter are typical of architecture and particularly creative design tasks, which therefore currently lack viable computational tools. It is argued that as design teams and construction projects of unprecedented scale are increasingly frequent, this is just whe...

  5. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  6. Analyzing Integrated Cost-Schedule Risk for Complex Product Systems R&D Projects

    Directory of Open Access Journals (Sweden)

    Zhe Xu

    2014-01-01

    The vast majority of the research efforts in project risk management tend to assess cost risk and schedule risk independently. However, project cost and time are related in reality and the relationship between them should be analyzed directly. We propose an integrated cost and schedule risk assessment model for complex product systems R&D projects. The graphical evaluation and review technique (GERT), Monte Carlo simulation, and probability distribution theory are utilized to establish the model. In addition, statistical analysis and regression analysis techniques are employed to analyze the simulation outputs. Finally, a complex product systems R&D project is modeled as an example using the proposed approach, and the simulation outputs are analyzed to illustrate the effectiveness of the risk assessment model. It seems that integrating cost and schedule risk assessment can provide more reliable risk estimation results.
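
    The integrated view described above, treating cost and schedule risk jointly rather than independently, can be illustrated with a minimal Monte Carlo sketch; the distributions, the correlation value, and the targets are assumptions for illustration and do not reproduce the paper's GERT-based model.

```python
# Minimal Monte Carlo sketch of joint (integrated) cost-schedule risk:
# sample correlated schedule and cost outcomes and estimate the probability of
# exceeding both targets. Distributions, correlation, and targets are assumed
# for illustration; this is not the paper's GERT-based model.
import random

random.seed(1)
N = 100_000
rho = 0.6                      # assumed cost-schedule correlation
exceed_both = 0
for _ in range(N):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    schedule = 24 + 4 * z1                                   # months, assumed N(24, 4^2)
    cost = 10 + 2 * (rho * z1 + (1 - rho ** 2) ** 0.5 * z2)  # M$, assumed N(10, 2^2), correlated
    if schedule > 28 and cost > 12:                          # assumed targets
        exceed_both += 1

print(f"P(schedule > 28 months and cost > 12 M$) ~ {exceed_both / N:.3f}")
```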

  7. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    Science.gov (United States)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

    The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influences the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposure) in the derivatives market between 202 financial intermediaries based in USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital) but is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.

  8. Analyzing the Problems of Ayandeh Bank Branches across the Country Using Data Mining Technique

    Directory of Open Access Journals (Sweden)

    Shabnam Mohammadi

    2014-06-01

    In order to manage the problems and complaints of customers and branches, many banks in the country outsource parts of their customer relationship management to companies such as call centers. Since this important unit is managed outside the banks, analyzing the data and evaluating the performance of call centers are very important. On the other hand, many banks are not able to analyze the data and do not know how to use the hidden patterns in them. Hence, by presenting the RFS model in this paper, we have tried to cluster bank branches based on the R factor (recently announced problem), F (frequency or number of difficulties) and S (branch satisfaction with the call center) and to find the relationship between these factors and the mentioned problems. Moreover, the call center's ability to resolve the problems of branches in each cluster can be assessed using the S factor. Branches were distributed into four optimized clusters based on their behavior pattern. Finally, the results were analyzed and recommendations were presented to improve the performance of call centers.
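
    The clustering step described above can be sketched with a generic k-means run on synthetic (R, F, S) features; only the idea of grouping branches into four clusters follows the abstract, while the data, feature scaling, and library choice are illustrative assumptions.

```python
# Illustrative k-means clustering of branches on (R, F, S) features, echoing the
# RFS model described above. The synthetic data are made up; only the idea of
# grouping branches into four clusters follows the abstract.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_branches = 200
R = rng.integers(1, 365, size=n_branches)   # days since the most recently reported problem
F = rng.poisson(5, size=n_branches)         # number of reported problems
S = rng.uniform(1, 5, size=n_branches)      # satisfaction score with the call center
X = np.column_stack([R, F, S]).astype(float)

# Standardize features so no single scale dominates the distance metric.
X = (X - X.mean(axis=0)) / X.std(axis=0)

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for c in range(4):
    print(f"cluster {c}: {np.sum(labels == c)} branches")
```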

  9. Knowledge based method for solving complexity in design problems

    NARCIS (Netherlands)

    Vermeulen, B.

    2007-01-01

    The process of designing aircraft systems is becoming more and more complex, due to an increasing number of requirements. Moreover, the knowledge on how to solve these complex design problems becomes less readily available, because of a decrease in availability of intellectual resources and reduced

  10. Fluid leadership: inviting diverse inputs to address complex problems

    OpenAIRE

    Moir, Sylvia

    2016-01-01

    Approved for public release; distribution is unlimited. History is replete with examples of misapplied leadership strategies. When singular methods are used to solve multifaceted problems, negative results are often the consequence. Complex issues in a complex environment require complex perspectives; the homeland security enterprise (HSE) needs leaders who can adapt their leadership styles according to emerging environments. Furthermore, the diverse agencies within the HSE must work togeth...

  11. Making mobility-related disability better: a complex response to a complex problem.

    Science.gov (United States)

    Rockwood, Kenneth

    2012-10-15

    Mobility disability in older adults can arise from single system problems, such as discrete musculoskeletal injury. In frail older adults, however, mobility disability is part of a complex web of problems. The approach to their rehabilitation must take that complexity into account, as is reported by Fairhall et al. First, their overall health state must be assessed, which is achieved by a comprehensive geriatric assessment. The assessment can show how a particular patient came to be disabled, so that an individualized care plan can be worked out. Whether this approach works in general can be evaluated by looking at group differences in mean mobility test scores. Knowing whether it has worked in the individual patient requires an individualized measure. This is because not every patient starts from the same point, and not every patient achieves success by aiming for the same goal. For one patient, walking unassisted for three metres would be a triumph; for another it would be a tragedy. Unless we understand the complexity of the needs of frail older adults, we will neither be able to treat them effectively nor evaluate our efforts sensibly.Please see related article http://www.biomedcentral.com/1741-7015/10/120.

  12. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    Science.gov (United States)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of Nastran Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  13. Solving complex problems a handbook

    CERN Document Server

    Schönwandt, Walter; Grunau, Jens; Utz, Jürgen; Voermanek, Katrin

    2014-01-01

    When you're planning something big, problems appear rather quickly. We hear of them on a daily basis. The bigger or more complex a task, the more we have to deal with complicated, multidisciplinary task formulations. In many cases it is architecture, including urban and spatial planning, but also politics and all types of organizational forms, irrespective of whether they are public authorities or private enterprises, which are expected to deliver functional solutions for such challenges. This is precisely where this book is helpful. It introduces a methodology for developing target-specific,

  14. Developing an agent-based model on how different individuals solve complex problems

    Directory of Open Access Journals (Sweden)

    Ipek Bozkurt

    2015-01-01

    Purpose: Research that focuses on the emotional, mental, behavioral and cognitive capabilities of individuals has been abundant within disciplines such as psychology, sociology, and anthropology, among others. However, when facing complex problems, a new perspective to understand individuals is necessary. The main purpose of this paper is to develop an agent-based model and simulation to gain understanding of the decision-making and problem-solving abilities of individuals. Design/Methodology/Approach: The micro-level modeling and simulation paradigm of Agent-Based Modeling is used. Through the use of Agent-Based Modeling, insight is gained on how different individuals with different profiles deal with complex problems. Using previous literature from different bodies of knowledge, established theories and certain assumptions as input parameters, a model is built and executed through a computer simulation. Findings: The results indicate that individuals with certain profiles have better capabilities to deal with complex problems. Moderate profiles could solve the entire complex problem, whereas profiles within extreme conditions could not. This indicates that having a strong predisposition is not the ideal way when approaching complex problems, and there should always be a component from the other perspective. The probability that an individual may use these capabilities provided by the opposite predisposition proves to be a useful option. Originality/Value: The originality of the present research stems from how individuals are profiled, and the model and simulation that are built to understand how they solve complex problems. The development of the agent-based model adds value to the existing body of knowledge within both social sciences and modeling and simulation.

  15. Solving complex band structure problems with the FEAST eigenvalue algorithm

    Science.gov (United States)

    Laux, S. E.

    2012-08-01

    With straightforward extension, the FEAST eigenvalue algorithm [Polizzi, Phys. Rev. B 79, 115112 (2009)] is capable of solving the generalized eigenvalue problems representing traveling-wave problems—as exemplified by the complex band-structure problem—even though the matrices involved are complex, non-Hermitian, and singular, and hence outside the originally stated range of applicability of the algorithm. The obtained eigenvalues/eigenvectors, however, contain spurious solutions which must be detected and removed. The efficiency and parallel structure of the original algorithm are unaltered. The complex band structures of Si layers of varying thicknesses and InAs nanowires of varying radii are computed as test problems.
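
    The algebraic form behind this abstract, a generalized eigenvalue problem A x = λ B x with complex, non-Hermitian and possibly singular matrices, can be illustrated with a small dense solve; the sketch below uses a generic LAPACK-based routine rather than FEAST, and the matrices are arbitrary examples.

```python
# Small illustration of the generalized eigenvalue problem A x = lambda B x with
# complex, non-Hermitian matrices, the algebraic form behind the complex band
# structure problem discussed above. This uses a generic dense solver, not FEAST,
# and the matrices are arbitrary examples.
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B[:, 0] = 0.0  # make B singular, as can happen in traveling-wave formulations

# Infinite eigenvalues correspond to the singular directions of B; such
# spurious/infinite solutions must be filtered out, much as the abstract notes.
w, v = eig(A, B)
finite = np.isfinite(w)
print("finite eigenvalues:", w[finite])
```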

  16. Complex multiplication and lifting problems

    CERN Document Server

    Chai, Ching-Li; Oort, Frans

    2013-01-01

    Abelian varieties with complex multiplication lie at the origins of class field theory, and they play a central role in the contemporary theory of Shimura varieties. They are special in characteristic 0 and ubiquitous over finite fields. This book explores the relationship between such abelian varieties over finite fields and over arithmetically interesting fields of characteristic 0 via the study of several natural CM lifting problems which had previously been solved only in special cases. In addition to giving complete solutions to such questions, the authors provide numerous examples to illustrate the general theory and present a detailed treatment of many fundamental results and concepts in the arithmetic of abelian varieties, such as the Main Theorem of Complex Multiplication and its generalizations, the finer aspects of Tate's work on abelian varieties over finite fields, and deformation theory. This book provides an ideal illustration of how modern techniques in arithmetic geometry (such as descent the...

  17. Making mobility-related disability better: a complex response to a complex problem

    Directory of Open Access Journals (Sweden)

    Rockwood Kenneth

    2012-10-01

    Full Text Available Abstract Mobility disability in older adults can arise from single system problems, such as discrete musculoskeletal injury. In frail older adults, however, mobility disability is part of a complex web of problems. The approach to their rehabilitation must take that complexity into account, as is reported by Fairhall et al. First, their overall health state must be assessed, which is achieved by a comprehensive geriatric assessment. The assessment can show how a particular patient came to be disabled, so that an individualized care plan can be worked out. Whether this approach works in general can be evaluated by looking at group differences in mean mobility test scores. Knowing whether it has worked in the individual patient requires an individualized measure. This is because not every patient starts from the same point, and not every patient achieves success by aiming for the same goal. For one patient, walking unassisted for three metres would be a triumph; for another it would be a tragedy. Unless we understand the complexity of the needs of frail older adults, we will neither be able to treat them effectively nor evaluate our efforts sensibly. Please see related article http://www.biomedcentral.com/1741-7015/10/120

  18. Program for Analyzing Flows in a Complex Network

    Science.gov (United States)

    Majumdar, Alok Kumar

    2006-01-01

    Generalized Fluid System Simulation Program (GFSSP) version 4 is a general-purpose computer program for analyzing steady-state and transient flows in a complex fluid network. The program is capable of modeling compressibility, fluid transients (e.g., water hammers), phase changes, mixtures of chemical species, and such externally applied body forces as gravitational and centrifugal ones. A graphical user interface enables the user to interactively develop a simulation of a fluid network consisting of nodes and branches. The user can also run the simulation and view the results in the interface. The system of equations for conservation of mass, energy, chemical species, and momentum is solved numerically by a combination of the Newton-Raphson and successive-substitution methods.
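
    GFSSP's own solver is far more general; as a minimal sketch of the Newton-Raphson idea mentioned in the record, the toy example below balances mass flow at the two internal nodes of a three-branch network with a hypothetical square-root pressure-drop law (all coefficients and pressures are made up for illustration).

```python
import numpy as np

C = 1.0e-3                      # hypothetical branch flow coefficient
P_in, P_out = 2.0e5, 1.0e5      # fixed boundary pressures [Pa]

def branch_flow(p_hi, p_lo):
    """Toy flow law: Q = C * sign(dP) * sqrt(|dP|)."""
    dp = p_hi - p_lo
    return C * np.sign(dp) * np.sqrt(abs(dp))

def residuals(p):
    """Net mass flow into each internal node (zero at the solution)."""
    p1, p2 = p
    r1 = branch_flow(P_in, p1) - branch_flow(p1, p2)
    r2 = branch_flow(p1, p2) - branch_flow(p2, P_out)
    return np.array([r1, r2])

def jacobian(p, h=1.0):
    """Finite-difference Jacobian of the residual vector."""
    J = np.zeros((2, 2))
    base = residuals(p)
    for j in range(2):
        dp = p.copy()
        dp[j] += h
        J[:, j] = (residuals(dp) - base) / h
    return J

p = np.array([1.8e5, 1.2e5])    # initial guess for the two internal node pressures
for it in range(50):
    r = residuals(p)
    if np.max(np.abs(r)) < 1e-9:
        break
    p = p - np.linalg.solve(jacobian(p), r)   # Newton-Raphson update

print(f"converged in {it} iterations: P1 = {p[0]:.1f} Pa, P2 = {p[1]:.1f} Pa")
```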

  19. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation... understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques (ANOVA and GEMANOVA). However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug release.

  20. Analogy as a strategy for supporting complex problem solving under uncertainty.

    Science.gov (United States)

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  1. Particle swarm as optimization tool in complex nuclear engineering problems

    International Nuclear Information System (INIS)

    Medeiros, Jose Antonio Carlos Canedo

    2005-06-01

    Due to their low computational cost, gradient-based search techniques combined with linear programming techniques are being used as optimization tools. These techniques, however, when applied to multimodal search spaces, can lead to local optima. When finding solutions for complex multimodal domains, random search techniques are used with great efficacy. In this work we exploit the search power of the particle swarm optimization algorithm as a tool for solving nuclear problems with complex, high-dimension, multimodal search spaces. Due to its easy and natural representation of high-dimension domains, particle swarm optimization was applied with success to complex nuclear problems, showing its efficacy in the search for solutions in high-dimension and complex multimodal spaces. In one of these applications it enabled a natural and trivial solution in a way not obtained with other methods, confirming the validity of its application. (author)
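
    The record gives no implementation details, so the sketch below is just a generic particle swarm optimizer applied to a standard multimodal test function (Rastrigin), chosen here only to illustrate the kind of high-dimension, multimodal search the abstract refers to; the swarm parameters are conventional textbook values, not the authors'.

```python
import numpy as np

def rastrigin(x):
    """Standard multimodal test function (global minimum 0 at the origin)."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def pso(f, dim=10, n_particles=40, iters=500, w=0.72, c1=1.49, c2=1.49, bound=5.12):
    rng = np.random.default_rng(1)
    x = rng.uniform(-bound, bound, (n_particles, dim))     # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # velocity update: inertia + pull toward personal and global bests
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, -bound, bound)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best, best_val = pso(rastrigin)
print("best value found:", best_val)
```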

  2. Simulation Gaming as a Social Development Instrument : Dealing with Complex Problems

    NARCIS (Netherlands)

    Klievink, B.; Janssen, M.

    Improving public service delivery is a very complex task, and the complexity is difficult to grasp for stakeholders having various degrees of knowledge and involvement. An emergent and promising method for dealing with complex problems is simulation gaming, which can be used to capitalize on the

  3. SCHOOL VIOLENCE: A COMPLEX PROBLEM

    Directory of Open Access Journals (Sweden)

    María del Rosario Ayala-Carrillo

    2015-07-01

    Full Text Available School violence is one type of violence that reflects the breakdown of current society. It is impossible to speak of school violence as an isolated phenomenon without establishing nexuses between public and private life, between collective and individual behaviors, between family and community aspects, without making reference to differences in gender and the life stories of those who are the aggressors or the victims, and without considering the patriarchal culture and interpersonal relationships. When all these factors are interrelated, they make the problem of violence a very complex one that requires us to know the different factors in order to understand it and deal with it.

  4. Structuring and assessing large and complex decision problems using MCDA

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    This paper presents an approach for the structuring and assessing of large and complex decision problems using multi-criteria decision analysis (MCDA). The MCDA problem is structured in a decision tree and assessed using the REMBRANDT technique featuring a procedure for limiting the number of pair...

  5. Dependability problems of complex information systems

    CERN Document Server

    Zamojski, Wojciech

    2014-01-01

    This monograph presents original research results on selected problems of dependability in contemporary Complex Information Systems (CIS). The ten chapters are concentrated around the following three aspects: methods for modelling of the system and its components, tasks (or, in a more generic and more adequate interpretation, functionalities) accomplished by the system, and conditions for their correct realization in the dynamic operational environment. While the main focus is on theoretical advances and roadmaps for implementations of new technologies, a much needed forum for sharing of the bes

  6. Addressing Complex Challenges through Adaptive Leadership: A Promising Approach to Collaborative Problem Solving

    Science.gov (United States)

    Nelson, Tenneisha; Squires, Vicki

    2017-01-01

    Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…

  7. Analyzing the Responses of 7-8 Year Olds When Solving Partitioning Problems

    Science.gov (United States)

    Badillo, Edelmira; Font, Vicenç; Edo, Mequè

    2015-01-01

    We analyze the mathematical solutions of 7- to 8-year-old pupils while individually solving an arithmetic problem. The analysis was based on the "configuration of objects," an instrument derived from the onto-semiotic approach to mathematical knowledge. Results are illustrated through a number of cases. From the analysis of mathematical…

  8. arXiv Spin models in complex magnetic fields: a hard sign problem

    CERN Document Server

    de Forcrand, Philippe

    2018-01-01

    Coupling spin models to complex external fields can give rise to interesting phenomena like zeroes of the partition function (Lee-Yang zeroes, edge singularities) or oscillating propagators. Unfortunately, it usually also leads to a severe sign problem that can be overcome only in special cases; if the partition function has zeroes, the sign problem is even representation-independent at these points. In this study, we couple the N-state Potts model in different ways to a complex external magnetic field and discuss the above mentioned phenomena and their relations based on analytic calculations (1D) and results obtained using a modified cluster algorithm (general D) that in many cases either cures or at least drastically reduces the sign-problem induced by the complex external field.

  9. How Cognitive Style and Problem Complexity Affect Preservice Agricultural Education Teachers' Abilities to Solve Problems in Agricultural Mechanics

    Science.gov (United States)

    Blackburn, J. Joey; Robinson, J. Shane; Lamm, Alexa J.

    2014-01-01

    The purpose of this experimental study was to determine the effects of cognitive style and problem complexity on Oklahoma State University preservice agriculture teachers' (N = 56) ability to solve problems in small gasoline engines. Time to solution was operationalized as problem solving ability. Kirton's Adaption-Innovation Inventory was…

  10. The Similar Structures and Control Problems of Complex Systems

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In this paper, naturally evolving complex systems, such as biotic and social ones, are considered. Focusing on their structures, one feature is noteworthy: the similarity in structures. The relations between the functions and behaviors of these systems and their similar structures are studied. Since the management of social systems and the course of evolution of biotic systems may be regarded as control processes, the research falls within the scope of control problems. Moreover, since biotic and social systems are difficult to model, the study starts with the control problems of engineering complex systems possessing similar structures. The obtained results show that, for either linear or nonlinear systems and for many control problems, similar structures lead to a series of simplifications. In general, the original system may be decomposed into a reduced number of subsystems with lower dimensions and simpler structures. By virtue of such subsystems, the control problems of the original system can be solved more simply. Finally, the paper returns to biotic and social systems, and some analyses are given.

  11. Tourists' mental representations of complex travel decision problems

    NARCIS (Netherlands)

    Dellaert, B.G.C.; Arentze, T.A.; Horeni, O.

    2014-01-01

    Tourism research has long recognized the complexity of many decisions that tourists make and proposed models to describe and analyze tourist decision processes. This article complements this previous research by proposing a view that moves away from the process of making a decision and instead

  12. Solution of a Complex Least Squares Problem with Constrained Phase.

    Science.gov (United States)

    Bydder, Mark

    2010-12-30

    The least squares solution of a complex linear equation is in general a complex vector with independent real and imaginary parts. In certain applications in magnetic resonance imaging, a solution is desired such that each element has the same phase. A direct method for obtaining the least squares solution to the phase constrained problem is described.
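
    The record refers to a direct method, which is not reproduced here; the sketch below solves the same phase-constrained problem by brute force, writing x = exp(i*phi)*u with u real, solving a real least-squares problem for each candidate phase on a grid, and keeping the best fit. It is meant only to make the problem statement concrete.

```python
import numpy as np

def phase_constrained_lstsq(A, b, n_phases=360):
    """Brute-force illustration: constrain every element of x to share one phase."""
    best = None
    for phi in np.linspace(0, np.pi, n_phases, endpoint=False):
        Ap = A * np.exp(1j * phi)
        # Stack real and imaginary parts so the unknown vector u stays real.
        A_real = np.vstack([Ap.real, Ap.imag])
        b_real = np.concatenate([b.real, b.imag])
        u, *_ = np.linalg.lstsq(A_real, b_real, rcond=None)
        res = np.linalg.norm(A @ (np.exp(1j * phi) * u) - b)
        if best is None or res < best[0]:
            best = (res, phi, u)
    res, phi, u = best
    return np.exp(1j * phi) * u, phi, res

# Small random test problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3)) + 1j * rng.standard_normal((8, 3))
b = rng.standard_normal(8) + 1j * rng.standard_normal(8)
x, phi, res = phase_constrained_lstsq(A, b)
print("common phase:", phi, "residual:", res)
```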

  13. Tracing the development of complex problems and the methods of its information support

    International Nuclear Information System (INIS)

    Belenki, A.; Ryjov, A.

    1999-01-01

    This article is dedicated to the development of a technology for information monitoring of complex problems such as IAEA safeguards tasks. The main purpose of this technology is to create human-machine systems for monitoring problems with complex subject areas such as political science, social science, business, ecology, etc. (author)

  14. How Students Circumvent Problem-Solving Strategies that Require Greater Cognitive Complexity.

    Science.gov (United States)

    Niaz, Mansoor

    1996-01-01

    Analyzes the great diversity in problem-solving strategies used by students in solving a chemistry problem and discusses the relationship between these variables and different cognitive variables. Concludes that students try to circumvent certain problem-solving strategies by adapting flexible and stylistic innovations that render the cognitive…

  15. Advice Complexity of the Online Induced Subgraph Problem

    DEFF Research Database (Denmark)

    Komm, Dennis; Královič, Rastislav; Královič, Richard

    2016-01-01

    We study these problems by investigating a generalized problem: for an arbitrary but fixed hereditary property, find some maximal induced subgraph having the property. We investigate this problem from the point of view of advice complexity, i.e., we ask how some additional information about the yet unrevealed parts of the input can influence the solution quality. We evaluate the information in a quantitative way by considering the best possible advice of given size that describes the unknown input. Using a result from Boyar et al., we give a tight trade-off relationship stating that, for inputs of length n, roughly n... For the induced subgraph problem, we show that preemption does not significantly help by giving a lower bound of Omega(n/(c^2\log c)) on the bits of advice that are needed to obtain competitive ratio c, where c is any increasing function bounded from above by \sqrt{n/\log n}. We also give a linear lower bound for c close to 1.

  16. Data Mining and Complex Problems: Case Study in Composite Materials

    Science.gov (United States)

    Rabelo, Luis; Marin, Mario

    2009-01-01

    Data mining is defined as the discovery of useful, possibly unexpected, patterns and relationships in data using statistical and non-statistical techniques in order to develop schemes for decision and policy making. Data mining can be used to discover the sources and causes of problems in complex systems. In addition, data mining can support simulation strategies by finding the different constants and parameters to be used in the development of simulation models. This paper introduces a framework for data mining and its application to complex problems. To further explain some of the concepts outlined in this paper, the potential application to the NASA Shuttle Reinforced Carbon-Carbon structures and genetic programming is used as an illustration.

  17. Complexity of hierarchically and 1-dimensional periodically specified problems

    Energy Technology Data Exchange (ETDEWEB)

    Marathe, M.V.; Hunt, H.B. III; Stearns, R.E.; Radhakrishnan, V.

    1995-08-23

    We study the complexity of various combinatorial and satisfiability problems when instances are specified using one of the following specifications: (1) the 1-dimensional finite periodic narrow specifications of Wanke and Ford et al.; (2) the 1-dimensional finite periodic narrow specifications with explicit boundary conditions of Gale; (3) the 2-way infinite 1-dimensional narrow periodic specifications of Orlin et al.; and (4) the hierarchical specifications of Lengauer et al. We obtain three general types of results. First, we prove that there is a polynomial time algorithm that, given a 1-FPN- or 1-FPN(BC)-specification of a graph (or a CNF formula), constructs a level-restricted L-specification of an isomorphic graph (or formula). This theorem, along with the hardness results proved here, provides alternative and unified proofs of many hardness results proved in the past either by Lengauer and Wagner or by Orlin. Second, we study the complexity of generalized CNF satisfiability problems of Schaefer. Assuming P ≠ PSPACE, we characterize completely the polynomial time solvability of these problems when instances are specified as in (1), (2), (3) or (4). As applications of our first two types of results, we obtain a number of new PSPACE-hardness and polynomial time algorithms for problems specified as in (1), (2), (3) or (4). Many of our results also hold for O(log N) bandwidth bounded planar instances.

  18. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    Science.gov (United States)

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  19. Conceptual and procedural knowledge community college students use when solving a complex science problem

    Science.gov (United States)

    Steen-Eibensteiner, Janice Lee

    2006-07-01

    A strong science knowledge base and problem solving skills have always been highly valued for employment in the science industry. Skills currently needed for employment include being able to problem solve (Overtoom, 2000). Academia also recognizes the need for effectively teaching students to apply problem solving skills in clinical settings. This thesis investigates how students solve complex science problems in an academic setting in order to inform the development of problem solving skills for the workplace. Students' use of problem solving skills in the form of learned concepts and procedural knowledge was studied as students completed a problem that might come up in real life. Students were taking a community college sophomore biology course, Human Anatomy & Physiology II. The problem topic was negative feedback inhibition of the thyroid and parathyroid glands. The research questions answered were (1) How well do community college students use a complex of conceptual knowledge when solving a complex science problem? (2) What conceptual knowledge are community college students using correctly, incorrectly, or not using when solving a complex science problem? (3) What problem solving procedural knowledge are community college students using successfully, unsuccessfully, or not using when solving a complex science problem? From the whole class the high academic level participants performed at a mean of 72% correct on chapter test questions which was a low average to fair grade of C-. The middle and low academic participants both failed (F) the test questions (37% and 30% respectively); 29% (9/31) of the students show only a fair performance while 71% (22/31) fail. From the subset sample population of 2 students each from the high, middle, and low academic levels selected from the whole class 35% (8/23) of the concepts were used effectively, 22% (5/23) marginally, and 43% (10/23) poorly. Only 1 concept was used incorrectly by 3/6 of the students and identified as

  20. Modal and Mixed Specifications: Key Decision Problems and their Complexities

    DEFF Research Database (Denmark)

    Antonik, Adam; Huth, Michael; Larsen, Kim Guldstrand

    2010-01-01

    Modal and mixed transition systems are specification formalisms that allow mixing of over- and under-approximation. We discuss three fundamental decision problems for such specifications: whether a set of specifications has a common implementation, whether a sole specification has an implementation, and whether all implementations of one specification are implementations of another one. For each of these decision problems we investigate the worst-case computational complexity for the modal and mixed case. We show that the first decision problem is EXPTIME-complete for modal as well as for mixed specifications. We prove that the second decision problem is EXPTIME-complete for mixed specifications (while it is known to be trivial for modal ones). The third decision problem is furthermore demonstrated to be EXPTIME-complete for mixed specifications.

  1. Solving the three-body Coulomb breakup problem using exterior complex scaling

    Energy Technology Data Exchange (ETDEWEB)

    McCurdy, C.W.; Baertschy, M.; Rescigno, T.N.

    2004-05-17

    Electron-impact ionization of the hydrogen atom is the prototypical three-body Coulomb breakup problem in quantum mechanics. The combination of subtle correlation effects and the difficult boundary conditions required to describe two electrons in the continuum has made this one of the outstanding challenges of atomic physics. A complete solution of this problem in the form of a "reduction to computation" of all aspects of the physics is given by the application of exterior complex scaling, a modern variant of the mathematical tool of analytic continuation of the electronic coordinates into the complex plane that was used historically to establish the formal analytic properties of the scattering matrix. This review first discusses the essential difficulties of the three-body Coulomb breakup problem in quantum mechanics. It then describes the formal basis of exterior complex scaling of electronic coordinates as well as the details of its numerical implementation using a variety of methods including finite difference, finite elements, discrete variable representations, and B-splines. Given these numerical implementations of exterior complex scaling, the scattering wave function can be generated with arbitrary accuracy on any finite volume in the space of electronic coordinates, but there remains the fundamental problem of extracting the breakup amplitudes from it. Methods are described for evaluating these amplitudes. The question of the volume-dependent overall phase that appears in the formal theory of ionization is resolved. A summary is presented of accurate results that have been obtained for the case of electron-impact ionization of hydrogen as well as a discussion of applications to the double photoionization of helium.

  2. Is Principled Pragmatism a Viable Framework for Addressing Complex Problems?

    Science.gov (United States)

    Islam, S.

    2017-12-01

    Complex water problems are connected with many competing and often conflicting values, interests, and tools. These problems can't be addressed by simply applying dogmatic principles or a deal-making pragmatic approach. Because these problems are interconnected and interdependent, a final solution can't be pre-specified. Any intervention in a complex problem requires attention to both principles and pragmatism. Strict adherence to principles without pragmatism is often not actionable; pure pragmatism exercised without guiding principles is not sustainable. In a colloquial sense, pragmatism is often taken to suggest practical, opportunistic, and expedient approaches at the expense of principles. This perception appears to be rooted in the dichotomy between "being pragmatic" and "being ideological". The notion of principled pragmatism attempts to get away from this duality by focusing on how to make ideas clear and actionable. In other words, how to connect our thoughts to action given the context, constraints, and capacity. The principled pragmatism approach - rooted in equity and sustainability as guiding principles for water management - attempts to synthesize symbolic aspirations with realistic assessment to chart a trajectory through an actionable subset of implementable solutions. Case studies from the Ganges Basin will show the utility of principled pragmatism for water management in a changing world.

  3. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  4. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  5. Identification of effective visual problem solving strategies in a complex visual domain

    NARCIS (Netherlands)

    Van Meeuwen, Ludo; Jarodzka, Halszka; Brand-Gruwel, Saskia; Kirschner, Paul A.; De Bock, Jeano; Van Merriënboer, Jeroen

    2018-01-01

    Students in complex visual domains must acquire visual problem solving strategies that allow them to make fast decisions and come up with good solutions to real-time problems. In this study, 31 air traffic controllers at different levels of expertise (novice, intermediate, expert) were confronted

  6. DOE's efforts to correct environmental problems of the nuclear weapons complex

    International Nuclear Information System (INIS)

    Rezendes, V.S.

    1990-03-01

    This report focuses on four main issues: the environmental problems at DOE's nuclear weapons complex, recent changes in DOE's organizational structure, DOE's 1991 budget request, and the need for effective management systems. This report concludes that the environmental problems are enormous and will take decades to resolve. Widespread contamination can be found at many DOE sites, and the full extent of the environmental problems is unknown. DOE has taken several steps during the past year to better deal with these problems, including making organizational improvements and requesting additional funds for environmental restoration and waste management activities

  7. Fluid Ability (Gf and Complex Problem Solving (CPS

    Directory of Open Access Journals (Sweden)

    Patrick Kyllonen

    2017-07-01

    Full Text Available Complex problem solving (CPS) has emerged over the past several decades as an important construct in education and in the workforce. We examine the relationship between CPS and general fluid ability (Gf) both conceptually and empirically. A review of definitions of the two factors, prototypical tasks, and the information processing analyses of performance on those tasks suggest considerable conceptual overlap. We review three definitions of CPS: a general definition emerging from the human problem solving literature; a more specialized definition from the “German School” emphasizing performance in many-variable microworlds, with high domain-knowledge requirements; and a third definition based on performance in Minimal Complex Systems (MCS), with fewer variables and reduced knowledge requirements. We find a correlation of 0.86 between expert ratings of the importance of CPS and Gf across 691 occupations in the O*NET database. We find evidence that employers value both Gf and CPS skills, but CPS skills more highly, even after controlling for the importance of domain knowledge. We suggest that this may be due to CPS requiring not just cognitive ability but additionally skill in applying that ability in domains. We suggest that a fruitful future direction is to explore the importance of domain knowledge in CPS.

  8. Analyzing the causation of a railway accident based on a complex network

    Science.gov (United States)

    Ma, Xin; Li, Ke-Ping; Luo, Zi-Yan; Zhou, Jin

    2014-02-01

    In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents.
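
    As a rough illustration of the approach (with invented factor names, not the actual accident data), causal factors can be loaded into a directed graph and ranked with standard network indicators such as degree and betweenness centrality:

```python
import networkx as nx

# Hypothetical causal links between accident factors (illustration only).
edges = [
    ("lightning strike", "signal equipment failure"),
    ("signal equipment failure", "wrong signal displayed"),
    ("inadequate signal inspection", "wrong signal displayed"),
    ("inadequate line-condition check", "dispatching error"),
    ("wrong signal displayed", "dispatching error"),
    ("dispatching error", "rear-end collision"),
]
G = nx.DiGraph(edges)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
for node in G.nodes:
    print(f"{node:32s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")
```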

  9. Automatic Algorithm Selection for Complex Simulation Problems

    CERN Document Server

    Ewald, Roland

    2012-01-01

    To select the most suitable simulation algorithm for a given task is often difficult. This is due to intricate interactions between model features, implementation details, and runtime environment, which may strongly affect the overall performance. An automated selection of simulation algorithms supports users in setting up simulation experiments without demanding expert knowledge on simulation. Roland Ewald analyzes and discusses existing approaches to solve the algorithm selection problem in the context of simulation. He introduces a framework for automatic simulation algorithm selection and

  10. Child outcomes of home-visiting for families with complex and multiple problems

    NARCIS (Netherlands)

    van Assen, Arend; Dickscheit, Jana; Post, Wendy; Grietens, Hans

    2016-01-01

    Introduction Families with complex and multiple problems are faced with an accumulation of problems across multiple areas of life. Furthermore, these families are often considered to be ‘difficult to treat’. Children and teenagers growing up in these families are exposed to an accumulation of risks

  11. Fluid Ability (Gf) and Complex Problem Solving (CPS)

    OpenAIRE

    Patrick Kyllonen; Cristina Anguiano Carrasco; Harrison J. Kell

    2017-01-01

    Complex problem solving (CPS) has emerged over the past several decades as an important construct in education and in the workforce. We examine the relationship between CPS and general fluid ability (Gf) both conceptually and empirically. A review of definitions of the two factors, prototypical tasks, and the information processing analyses of performance on those tasks suggest considerable conceptual overlap. We review three definitions of CPS: a general definition emerging from the human pr...

  12. Beyond Psychometrics: The Difference between Difficult Problem Solving and Complex Problem Solving

    Directory of Open Access Journals (Sweden)

    Jens F. Beckmann

    2017-10-01

    Full Text Available In this paper we argue that a synthesis of findings across the various sub-areas of research in complex problem solving and consequently progress in theory building is hampered by an insufficient differentiation of complexity and difficulty. In the proposed framework of person, task, and situation (PTS), complexity is conceptualized as a quality that is determined by the cognitive demands that the characteristics of the task and the situation impose. Difficulty represents the quantifiable level of a person’s success in dealing with such demands. We use the well-documented “semantic effect” as an exemplar for testing some of the conceptual assumptions derived from the PTS framework. We demonstrate how a differentiation between complexity and difficulty can help take us beyond a potentially too narrowly defined psychometric perspective and subsequently gain a better understanding of the cognitive mechanisms behind this effect. In an empirical study a total of 240 university students were randomly allocated to one of four conditions. The four conditions resulted from contrasting the semanticity level of the variable labels used in the CPS system (high vs. low) and two instruction conditions for how to explore the CPS system’s causal structure (starting with the assumption that all relationships between variables existed vs. starting with the assumption that none of the relationships existed). The variation in the instruction aimed at inducing knowledge acquisition processes of either (1) systematic elimination of presumptions, or (2) systematic compilation of a mental representation of the causal structure underpinning the system. Results indicate that (a) it is more complex to adopt a “blank slate” perspective under high semanticity as it requires processes of inhibiting prior assumptions, and (b) it seems more difficult to employ a systematic heuristic when testing against presumptions. In combination, situational characteristics, such as the

  13. New Approach to Analyzing Physics Problems: A Taxonomy of Introductory Physics Problems

    Science.gov (United States)

    Teodorescu, Raluca E.; Bennhold, Cornelius; Feldman, Gerald; Medsker, Larry

    2013-01-01

    This paper describes research on a classification of physics problems in the context of introductory physics courses. This classification, called the Taxonomy of Introductory Physics Problems (TIPP), relates physics problems to the cognitive processes required to solve them. TIPP was created in order to design educational objectives, to develop…

  14. Complex analogues of real problems

    DEFF Research Database (Denmark)

    Esdahl-Schou, Rune

    This thesis will be a mix of different problems in number theory. As such it is split into two natural parts. The first part focuses on normal numbers and construction of numbers that are normal to a given complex base. It is written in the style of a thorough and introductory paper on that subject.... Certain classical theorems are stated without proof but with a reference instead, though usually a proof is given. This part of the thesis represents the pinnacle of the author's work during the first two years of his PhD study. The work presented is greatly inspired by the work of Madritsch, Thuswaldner...... and Tichy in [Madritsch et al., 2008] and [Madritsch, 2008] and contains a generalisation of the main theorem in [Madritsch, 2008]. The second part of the thesis focuses on Diophantine approximation, mainly on a famous conjecture by Schmidt from the 1980s. This conjecture was solved by Badziahin, Pollington

  15. Using Model Checking for Analyzing Distributed Power Control Problems

    Directory of Open Access Journals (Sweden)

    Thomas Brihaye

    2010-01-01

    Full Text Available Model checking (MC) is a formal verification technique which is well known and still enjoys resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors wanted to know whether this approach can be applied to distributed PC. It turns out that it can be applied successfully and allows one to analyze realistic scenarios including the case of discrete transmit powers and games with incomplete information. The proposed methodology is as follows. We state some objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to check whether the desired properties are verified and to determine a winning strategy.

  16. The Streaming Complexity of Cycle Counting, Sorting by Reversals, and Other Problems

    DEFF Research Database (Denmark)

    Verbin, Elad; Yu, Wei

    2011-01-01

    ... By designing reductions from BHH, we prove lower bounds for the streaming complexity of approximating the sorting by reversal distance, of approximately counting the number of cycles in a 2-regular graph, and of other problems. For example, here is one lower bound that we prove, for a cycle-counting problem...

  17. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problems of antigen-antibody complex and free antigen separation. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific of the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and its lower end. The component has a large developed surface and is so shaped that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific of the biological compound [fr

  18. Understanding and quantifying cognitive complexity level in mathematical problem solving items

    Directory of Open Access Journals (Sweden)

    SUSAN E. EMBRETSON

    2008-09-01

    Full Text Available The linear logistic test model (LLTM; Fischer, 1973) has been applied to a wide variety of new tests. When the LLTM application involves item complexity variables that are both theoretically interesting and empirically supported, several advantages can result. These advantages include elaborating construct validity at the item level, defining variables for test design, predicting parameters of new items, item banking by sources of complexity and providing a basis for item design and item generation. However, despite the many advantages of applying LLTM to test items, it has been applied less often to understand the sources of complexity for large-scale operational test items. Instead, previously calibrated item parameters are modeled using regression techniques because raw item response data often cannot be made available. In the current study, both LLTM and regression modeling are applied to mathematical problem solving items from a widely used test. The findings from the two methods are compared and contrasted for their implications for continued development of ability and achievement tests based on mathematical problem solving items.
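
    A minimal sketch of the regression route mentioned in the record: previously calibrated item difficulties are modeled as a linear function of item complexity features. The feature names and all numbers below are invented for illustration only.

```python
import numpy as np

# Hypothetical complexity features per item:
# [number of operations, requires equation set-up (0/1), abstraction level]
features = np.array([
    [1, 0, 1],
    [2, 0, 1],
    [2, 1, 2],
    [3, 1, 2],
    [3, 1, 3],
    [4, 1, 3],
], dtype=float)
difficulty = np.array([-1.2, -0.6, 0.1, 0.5, 1.0, 1.6])   # calibrated item parameters

X = np.hstack([np.ones((features.shape[0], 1)), features])  # add intercept column
weights, *_ = np.linalg.lstsq(X, difficulty, rcond=None)
print("intercept and complexity weights:", weights)
print("predicted difficulty of a new item [2, 1, 3]:",
      np.array([1, 2, 1, 3]) @ weights)
```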

  19. A Real-Life Case Study of Audit Interactions--Resolving Messy, Complex Problems

    Science.gov (United States)

    Beattie, Vivien; Fearnley, Stella; Hines, Tony

    2012-01-01

    Real-life accounting and auditing problems are often complex and messy, requiring the synthesis of technical knowledge in addition to the application of generic skills. To help students acquire the necessary skills to deal with these problems effectively, educators have called for the use of case-based methods. Cases based on real situations (such…

  20. Redundant interferometric calibration as a complex optimization problem

    Science.gov (United States)

    Grobler, T. L.; Bernardi, G.; Kenyon, J. S.; Parsons, A. R.; Smirnov, O. M.

    2018-05-01

    Observations of the redshifted 21 cm line from the epoch of reionization have recently motivated the construction of low-frequency radio arrays with highly redundant configurations. These configurations provide an alternative calibration strategy - `redundant calibration' - and boost sensitivity on specific spatial scales. In this paper, we formulate calibration of redundant interferometric arrays as a complex optimization problem. We solve this optimization problem via the Levenberg-Marquardt algorithm. This calibration approach is more robust to initial conditions than current algorithms and, by leveraging an approximate matrix inversion, allows for further optimization and an efficient implementation (`redundant STEFCAL'). We also investigated using the preconditioned conjugate gradient method as an alternative to the approximate matrix inverse, but found that its computational performance is not competitive with respect to `redundant STEFCAL'. The efficient implementation of this new algorithm is made publicly available.
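
    The paper's 'redundant STEFCAL' implementation is not reproduced here; the toy sketch below only illustrates the formulation of redundant calibration as a complex least-squares problem, solved with SciPy's Levenberg-Marquardt on synthetic data from a short regular antenna line (all sizes and noise levels are arbitrary).

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
n_ant = 6
ants = np.arange(n_ant)
baselines = [(p, q) for p in ants for q in ants if p < q]
groups = sorted({q - p for p, q in baselines})          # redundant spacings
gidx = {g: i for i, g in enumerate(groups)}

# Synthetic "true" gains and per-group visibilities, plus noisy measurements.
true_g = 1 + 0.1 * (rng.standard_normal(n_ant) + 1j * rng.standard_normal(n_ant))
true_y = rng.standard_normal(len(groups)) + 1j * rng.standard_normal(len(groups))
vis = np.array([true_g[p] * np.conj(true_g[q]) * true_y[gidx[q - p]]
                for p, q in baselines])
vis += 0.01 * (rng.standard_normal(vis.size) + 1j * rng.standard_normal(vis.size))

def unpack(x):
    """Map the real parameter vector back to complex gains and visibilities."""
    g = x[:n_ant] + 1j * x[n_ant:2 * n_ant]
    y = x[2 * n_ant:2 * n_ant + len(groups)] + 1j * x[2 * n_ant + len(groups):]
    return g, y

def residuals(x):
    g, y = unpack(x)
    model = np.array([g[p] * np.conj(g[q]) * y[gidx[q - p]] for p, q in baselines])
    r = vis - model
    return np.concatenate([r.real, r.imag])

# Initial guess: unit gains, group-averaged visibilities.
y0 = np.array([np.mean([v for (p, q), v in zip(baselines, vis) if q - p == g])
               for g in groups])
x0 = np.concatenate([np.ones(n_ant), np.zeros(n_ant), y0.real, y0.imag])

sol = least_squares(residuals, x0, method="lm")   # Levenberg-Marquardt
print("final cost:", sol.cost)
```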

  1. Analyzing the causation of a railway accident based on a complex network

    International Nuclear Information System (INIS)

    Ma Xin; Li Ke-Ping; Luo Zi-Yan; Zhou Jin

    2014-01-01

    In this paper, a new model is constructed for the causation analysis of railway accident based on the complex network theory. In the model, the nodes are defined as various manifest or latent accident causal factors. By employing the complex network theory, especially its statistical indicators, the railway accident as well as its key causations can be analyzed from the overall perspective. As a case, the “7.23” China—Yongwen railway accident is illustrated based on this model. The results show that the inspection of signals and the checking of line conditions before trains run played an important role in this railway accident. In conclusion, the constructed model gives a theoretical clue for railway accident prediction and, hence, greatly reduces the occurrence of railway accidents. (interdisciplinary physics and related areas of science and technology)

  2. Upper estimates of complexity of algorithms for multi-peg Tower of Hanoi problem

    Directory of Open Access Journals (Sweden)

    Sergey Novikov

    2007-06-01

    Full Text Available Explicit upper estimates of complexity are proved for algorithms for the multi-peg Tower of Hanoi problem with a limited number of disks, for Reve's puzzle, and for the 5-peg Tower of Hanoi problem with an unrestricted number of disks.
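
    The explicit estimates in the record are not reproduced here, but the quantity they bound can be computed directly from the Frame-Stewart recurrence, the standard move-count recursion for the multi-peg Tower of Hanoi; a short memoized version:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def frame_stewart(n, p):
    """Frame-Stewart move count for n disks on p pegs
    (conjectured optimal in general; known optimal for 4 pegs)."""
    if n == 0:
        return 0
    if n == 1:
        return 1
    if p == 3:
        return 2 ** n - 1
    # Move k disks aside using all p pegs, the rest with p-1 pegs, then k back.
    return min(2 * frame_stewart(k, p) + frame_stewart(n - k, p - 1)
               for k in range(1, n))

print(frame_stewart(10, 3))   # 1023 (classical three-peg count)
print(frame_stewart(10, 4))   # 49   (Reve's puzzle, four pegs)
print(frame_stewart(10, 5))   # 31   (five pegs)
```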

  3. Determining the Effects of Cognitive Style, Problem Complexity, and Hypothesis Generation on the Problem Solving Ability of School-Based Agricultural Education Students

    Science.gov (United States)

    Blackburn, J. Joey; Robinson, J. Shane

    2016-01-01

    The purpose of this experimental study was to assess the effects of cognitive style, problem complexity, and hypothesis generation on the problem solving ability of school-based agricultural education students. Problem solving ability was defined as time to solution. Kirton's Adaption-Innovation Inventory was employed to assess students' cognitive…

  4. Implementation of exterior complex scaling in B-splines to solve atomic and molecular collision problems

    International Nuclear Information System (INIS)

    McCurdy, C William; MartIn, Fernando

    2004-01-01

    B-spline methods are now well established as widely applicable tools for the evaluation of atomic and molecular continuum states. The mathematical technique of exterior complex scaling has been shown, in a variety of other implementations, to be a powerful method with which to solve atomic and molecular scattering problems, because it allows the correct imposition of continuum boundary conditions without their explicit analytic application. In this paper, an implementation of exterior complex scaling in B-splines is described that can bring the well-developed technology of B-splines to bear on new problems, including multiple ionization and breakup problems, in a straightforward way. The approach is demonstrated for examples involving the continuum motion of nuclei in diatomic molecules as well as electronic continua. For problems involving electrons, a method based on Poisson's equation is presented for computing two-electron integrals over B-splines under exterior complex scaling

  5. Internet of THings Area Coverage Analyzer (ITHACA) for Complex Topographical Scenarios

    Directory of Open Access Journals (Sweden)

    Raúl Parada

    2017-10-01

    Full Text Available The number of connected devices is increasing worldwide, not only in contexts like the Smart City, but also in rural areas, where connectivity provides advanced features like smart farming or smart logistics. Thus, wireless network technologies to efficiently allocate Internet of Things (IoT) and Machine to Machine (M2M) communications are necessary. Traditional cellular networks like Global System for Mobile communications (GSM) are widely used worldwide for IoT environments. Nevertheless, Low Power Wide Area Networks (LP-WAN) are becoming widespread as infrastructure for present and future IoT and M2M applications. Based also on a subscription service, the LP-WAN technology SIGFOX™ may compete with cellular networks in the M2M and IoT communications market, for instance in those projects where deploying the whole communications infrastructure is too complex or expensive. For decision makers to decide the most suitable technology for each specific application, signal coverage is among the key features. Unfortunately, besides simulated coverage maps, decision-makers do not have real coverage maps for SIGFOX™, as they can be found for cellular networks. Therefore, we propose the Internet of THings Area Coverage Analyzer (ITHACA), a signal analyzer prototype to provide automated signal coverage maps and analytics for LP-WAN. Experiments performed in the Gran Canaria Island, Spain (with both urban and complex topographic rural environments), returned a real SIGFOX™ service availability above 97% and above 11% more coverage with respect to the company-provided simulated maps. We expect that ITHACA may help decision makers to deploy the most suitable technologies for future IoT and M2M projects.

  6. What Do Employers Pay for Employees' Complex Problem Solving Skills?

    Science.gov (United States)

    Ederer, Peer; Nedelkoska, Ljubica; Patt, Alexander; Castellazzi, Silvia

    2015-01-01

    We estimate the market value that employers assign to the complex problem solving (CPS) skills of their employees, using individual-level Mincer-style wage regressions. For the purpose of the study, we collected new and unique data using psychometric measures of CPS and an extensive background questionnaire on employees' personal and work history.…

  7. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
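
    The conditional-factoring algorithm itself is not reproduced here; the sketch below only makes the target quantity concrete by brute-force enumeration on a tiny invented network: each arc length is a small discrete random variable, and the exact distribution of the shortest s-t path length is accumulated over all joint outcomes (exactly the kind of enumeration the factoring approach is designed to avoid).

```python
from collections import defaultdict
from itertools import product

# Each arc takes one of a few lengths with given probabilities (hypothetical data).
arcs = {
    ("s", "a"): [(1, 0.5), (3, 0.5)],
    ("s", "b"): [(2, 0.7), (4, 0.3)],
    ("a", "t"): [(2, 0.6), (5, 0.4)],
    ("b", "t"): [(1, 0.8), (6, 0.2)],
    ("a", "b"): [(1, 1.0)],
}
paths = [[("s", "a"), ("a", "t")],
         [("s", "b"), ("b", "t")],
         [("s", "a"), ("a", "b"), ("b", "t")]]

dist = defaultdict(float)
arc_names = list(arcs)
for outcome in product(*(arcs[a] for a in arc_names)):
    length = {a: l for a, (l, _) in zip(arc_names, outcome)}
    prob = 1.0
    for _, p in outcome:
        prob *= p
    shortest = min(sum(length[a] for a in path) for path in paths)
    dist[shortest] += prob

for value in sorted(dist):
    print(f"P(shortest path length = {value}) = {dist[value]:.4f}")
```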

  8. Medicines counterfeiting is a complex problem: a review of key challenges across the supply chain.

    Science.gov (United States)

    Tremblay, Michael

    2013-02-01

    The paper begins by asking why there is a market for counterfeit medicines, which in effect creates the problem of counterfeiting itself. Contributing factors include supply chain complexity and the lack of whole-systems thinking. These two underpin the author's view that counterfeiting is a complex (i.e. wicked) problem, and that corporate, public policy and regulatory actions need to be mindful of how their actions may be causal. The paper offers a problem-based review of key components of this complexity, viz., the knowledge end-users/consumers have of medicines; whether restrictive information policies may hamper information provision to patients; the internet's direct access to consumers; internet-enabled distribution of unsafe and counterfeit medicines; whether the internet is a parallel and competitive supply chain to legitimate routes; organised crime as an emerging medicines manufacturer and supplier and whether substandard medicines is really the bigger problem. Solutions respect the perceived complexity of the supply chain challenges. The paper identifies the need to avoid technologically-driven solutions, calling for 'technological agnosticism'. Both regulation and public policy need to reflect the dynamic nature of the problem and avoid creating perverse incentives; it may be, for instance, that medicines pricing and reimbursement policies, which affect consumer/patient access may act as market signals to counterfeiters, since this creates a cash market in cheaper drugs.

  9. Generalist solutions to complex problems: generating practice-based evidence--the example of managing multi-morbidity.

    Science.gov (United States)

    Reeve, Joanne; Blakeman, Tom; Freeman, George K; Green, Larry A; James, Paul A; Lucassen, Peter; Martin, Carmel M; Sturmberg, Joachim P; van Weel, Chris

    2013-08-07

    A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet the concept of multi-morbidity (the presence of >2 diseases) is a product of the design of health care systems which define health care need on the basis of disease status. So does the solution lie in an alternative model of healthcare? Strengthening generalist practice has been proposed as part of the solution to tackling multi-morbidity. Generalism is a professional philosophy of practice, deeply known to many practitioners, and described as expertise in whole person medicine. But generalism lacks the evidence base needed by policy makers and planners to support service redesign. The challenge is to fill this practice-research gap in order to critically explore if and when generalist care offers a robust alternative to management of this complex problem. We need practice-based evidence to fill this gap. By recognising generalist practice as a 'complex intervention' (intervening in a complex system), we outline an approach to evaluate impact using action-research principles. We highlight the implications for those who both commission and undertake research in order to tackle this problem. Answers to the complex problem of multi-morbidity won't come from doing more of the same. We need to change systems of care, and so the systems for generating evidence to support that care. This paper contributes to that work through outlining a process for generating practice-based evidence of generalist solutions to the complex problem of person-centred care for people with multi-morbidity.

  10. Learning about Complex Multi-Stakeholder Issues: Assessing the Visual Problem Appraisal

    NARCIS (Netherlands)

    Witteveen, L.M.; Put, M.; Leeuwis, C.

    2010-01-01

    This paper presents an evaluation of the visual problem appraisal (VPA) learning environment in higher education. The VPA has been designed for the training of competences that are required in complex stakeholder settings in relation to sustainability issues. The design of VPA incorporates a

  11. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem.

    Science.gov (United States)

    Williams, Patricia Ah; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat.

  12. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    Science.gov (United States)

    Williams, Patricia AH; Woodward, Andrew J

    2015-01-01

    The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. PMID:26229513

  13. A modified probabilistic genetic algorithm for the solution of complex constrained optimization problems

    OpenAIRE

    Vorozheikin, A.; Gonchar, T.; Panfilov, I.; Sopov, E.; Sopov, S.

    2009-01-01

    A new algorithm for the solution of complex constrained optimization problems, based on the probabilistic genetic algorithm with optimal solution prediction, is proposed. Results of an efficiency investigation in comparison with the standard genetic algorithm are presented.

  14. Inverse problems in complex material design: Applications to non-crystalline solids

    Science.gov (United States)

    Biswas, Parthapratim; Drabold, David; Elliott, Stephen

    The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While impressive developments of ab-initio simulation methods during the past several decades have brought tremendous success in understanding materials property from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to design materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently-developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g. a-SiH, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.
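    The record above names only the class of method (a population-based stochastic search over a multi-objective, non-convex cost). As a purely illustrative sketch of that class, and not the authors' code, the following minimal Python example evolves a population of candidate "structures" against a weighted sum of a toy experimental-misfit term and a toy empirical-constraint penalty; both objective functions and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def experimental_misfit(x):
    # Toy stand-in for disagreement with measured data (e.g. a pair-correlation target).
    target = np.linspace(0.0, 1.0, x.size)
    return np.sum((np.sort(x) - target) ** 2)

def empirical_penalty(x):
    # Toy stand-in for violations of prior chemical/structural knowledge (a box constraint).
    return np.sum(np.clip(np.abs(x - 0.5) - 0.5, 0.0, None) ** 2)

def cost(x, w_exp=1.0, w_emp=10.0):
    # Scalarized multi-objective cost: data misfit plus constraint penalty.
    return w_exp * experimental_misfit(x) + w_emp * empirical_penalty(x)

def stochastic_search(dim=16, pop_size=30, generations=200, step=0.1):
    pop = rng.random((pop_size, dim))                        # random initial "structures"
    for _ in range(generations):
        scores = np.array([cost(p) for p in pop])
        elite = pop[np.argsort(scores)[: pop_size // 2]]     # keep the best half
        children = elite + step * rng.normal(size=elite.shape)  # mutate the survivors
        pop = np.vstack([elite, children])
    scores = np.array([cost(p) for p in pop])
    return pop[np.argmin(scores)], scores.min()

best, best_cost = stochastic_search()
print("best cost:", round(best_cost, 4))
```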

  15. Putting the puzzle together: the role of 'problem definition' in complex clinical judgement.

    Science.gov (United States)

    Cristancho, Sayra; Lingard, Lorelei; Forbes, Thomas; Ott, Michael; Novick, Richard

    2017-02-01

    We teach judgement in pieces; that is, we talk about each aspect separately (patient, plan, resources, technique, etc.). We also let trainees figure out how to put the pieces together. In complex situations, this might be problematic. Using data from a drawing-based study on surgeons' experiences with complex situations, we explore the notion of 'problem definition' in real-world clinical judgement using the theoretical lens of systems engineering. 'Emergence', the sensitising concept for analysis, is rooted in two key systems premises: that person and context are inseparable and that what emerges is an act of choice. Via a 'gallery walk' we used these premises to perform analysis on individual drawings as well as cross-comparisons of multiple drawings. Our focus was to understand similarities and differences among the vantage points used by multiple surgeons. In this paper we challenge two assumptions from current models of clinical judgement: that experts hold a fixed and static definition of the problem and that consequently the focus of the expert's work is on solving the problem. Each situation described by our participants revealed different but complementary perspectives of what a surgical problem might come to be: from concerns about ensuring standard of care, to balancing personal emotions versus care choices, to coordinating resources, and to maintaining control while in the midst of personality clashes. We suggest that it is only at the situation and system level, not at the individual level, that we are able to appreciate the nuances of defining the problem when experts make judgements during real-world complex situations. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  16. Problems of nuclear reactor safety. Vol. 1

    International Nuclear Information System (INIS)

    Shal'nov, A.V.

    1995-01-01

    Proceedings of the 9th Topical Meeting 'Problems of nuclear reactor safety' are presented. Papers include results of studies and developments associated with methods of calculation and complex computerized simulation of stationary and transient processes in nuclear power plants. The main problems of reactor safety are discussed, and reactor accidents at operating NPPs are analyzed.

  17. A Hybrid DGTD-MNA Scheme for Analyzing Complex Electromagnetic Systems

    KAUST Repository

    Li, Peng

    2015-01-07

    A hybrid electromagnetics (EM)-circuit simulator for analyzing complex systems consisting of EM devices loaded with nonlinear multi-port lumped circuits is described. The proposed scheme splits the computational domain into two subsystems: EM and circuit subsystems, where field interactions are modeled using Maxwell and Kirchhoff equations, respectively. Maxwell equations are discretized using a discontinuous Galerkin time domain (DGTD) scheme while Kirchhoff equations are discretized using a modified nodal analysis (MNA)-based scheme. The coupling between the EM and circuit subsystems is realized at the lumped ports, where related EM fields and circuit voltages and currents are allowed to “interact” via numerical flux. To account for nonlinear lumped circuit elements, the standard Newton-Raphson method is applied at every time step. Additionally, a local time-stepping scheme is developed to improve the efficiency of the hybrid solver. Numerical examples consisting of EM systems loaded with single and multiport linear/nonlinear circuit networks are presented to demonstrate the accuracy, efficiency, and applicability of the proposed solver.
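    The abstract states only that the standard Newton-Raphson method is applied to the nonlinear lumped elements at every time step; it gives no formulation. A minimal, self-contained illustration of that idea, re-solving a nonlinear port equation (here an assumed diode-plus-resistor model, not the paper's circuit) by Newton-Raphson inside a toy time loop, might look like this.

```python
import math

def newton_port_voltage(v_source, r=50.0, i_s=1e-12, v_t=0.025852,
                        v0=0.6, tol=1e-12, max_iter=50):
    """Solve f(v) = (v_source - v)/r - i_s*(exp(v/v_t) - 1) = 0 for the diode voltage v."""
    v = v0
    for _ in range(max_iter):
        f = (v_source - v) / r - i_s * (math.exp(v / v_t) - 1.0)
        df = -1.0 / r - (i_s / v_t) * math.exp(v / v_t)
        step = f / df
        v -= step
        if abs(step) < tol:
            break
    return v

# Toy "time loop": the source voltage delivered by the field solver changes each step,
# and the nonlinear port equation is re-solved by Newton-Raphson at every step.
for n in range(5):
    vs = 1.0 + 0.2 * n
    v = newton_port_voltage(vs)
    print(f"step {n}: source {vs:.2f} V -> diode voltage {v:.4f} V")
```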

  18. Can Complexity be Planned?

    Directory of Open Access Journals (Sweden)

    Ilona Koutny

    2015-04-01

    Full Text Available The long accepted complexity invariance of human languages has become controversial within the last decade. In investigations of the problem, both creole and planned languages have often been neglected. After a presentation of the scope of the invariance problem and the proposition of the natural to planned language continuum, this article will discuss the contribution of planned languages. It will analyze the complexity of Esperanto at the phonological, morphological, syntactic and semantic levels, using linguistic data bases. The role of the L2 speech community and development of the language will also be taken into account when discussing the endurance of the same level of simplicity of this planned international language. The author argues that complexity can be variable and to some extent planned and maintained.

  19. Topographical memory analyzed in mice using the Hamlet test, a novel complex maze.

    Science.gov (United States)

    Crouzier, Lucie; Gilabert, Damien; Rossel, Mireille; Trousse, Françoise; Maurice, Tangui

    2018-03-01

    The Hamlet test is an innovative device providing a complex environment for testing topographic memory in mice. Animals were trained in groups for weeks in a small village with a central agora and streets extending from it towards five functionalized houses, where they could drink, eat, hide, run, and interact with a stranger mouse. Memory was tested by depriving mice of water or food and analyzing their ability to locate the Drink/Eat house. Exploration and memory were analyzed across different strains and genders, and after different training periods and delays. After 2 weeks of training, differences in exploration patterns were observed between strains, but not between genders. Neuroanatomical structures activated by training, identified using FosB/ΔFosB immunolabelling, showed an involvement of the hippocampus-subiculum-parahippocampal gyrus axis and dopaminergic structures. Training increased hippocampal neurogenesis (cell proliferation and neuronal maturation) and modified the amnesic efficacy of muscarinic or nicotinic cholinergic antagonists. Moreover, topographical disorientation in Alzheimer's disease was addressed using intracerebroventricular injection of amyloid β 25-35 peptide in trained mice. When retested after 7 days, Aβ 25-35 -treated mice showed memory impairment. The Hamlet test specifically allows analysis of topographical memory in mice, based on a complex environment. It offers an innovative tool for various ethological or pharmacological research needs. For instance, it allowed examination of topographical disorientation, a warning sign in Alzheimer's disease. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Ability to analyze the statement of a problem as a metasubject result of learning

    Directory of Open Access Journals (Sweden)

    V.A. Guruzhapov

    2014-08-01

    Full Text Available We present the results of experimental research on younger school students' ability to analyze and understand the missing terms of a mathematical problem as one of the components of metasubject educational outcomes. The pupils were offered tasks from the diagnostic technique developed by V.A. Guruzhapov, aimed at assessing the relationships of varying quantities of items. The sample consisted of 168 students of forms I-III of two Moscow schools. It was found that this technique can estimate the metasubject component of the educational process in the traditional system of education in terms of how adequately the properties of an object are displayed in its model. The validity of the methodology was tested in a training experiment conducted by L.N. Shilenkova. An analysis of tasks with subject content different from that presented in the diagnostic tasks was performed with younger students. After learning, the results of the experimental group students improved significantly. On this basis it is concluded that the proposed diagnostic tasks can be used to assess the ability of younger school students to analyze and understand the missing statements of a problem as one of the components of metasubject educational outcomes. The designed developmental educational situation can be used in the practice of the modern elementary school to enhance learning.

  1. Implementation of inter-unit analysis for C and C++ languages in a source-based static code analyzer

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available The proliferation of automated testing capabilities gives rise to a need for thorough testing of large software systems, including system inter-component interfaces. The objective of this research is to build a method for inter-procedural inter-unit analysis, which allows us to analyse large and complex software systems, including multi-architecture projects (like Android OS), as well as to support complex assembly systems of projects. Since the selected Clang Static Analyzer uses source code directly as input data, we need to develop a special technique to enable inter-unit analysis for such an analyzer. This problem is of a special nature because of C and C++ language features that assume and encourage the separate compilation of project files. We describe the build and analysis system that was implemented around Clang Static Analyzer to enable inter-unit analysis and consider problems related to the support of complex projects. We also consider the task of merging abstract syntax trees of translation units and its related problems, such as handling conflicting definitions, complex build systems, and support for complex projects, including multi-architecture projects, with examples. We consider both issues related to language design and human-related mistakes (that may be intentional). We describe some heuristics that were used in this work to make the merging process faster. The developed system was tested using Android OS as the input to show it is applicable even for such complicated projects. This system does not depend on the inter-procedural analysis method and allows its algorithm to be changed arbitrarily.

  2. Solving complex fisheries management problems

    DEFF Research Database (Denmark)

    Petter Johnsen, Jahn; Eliasen, Søren Qvist

    2011-01-01

    A crucial issue for the new EU common fisheries policy is how to solve the discard problem. Through a study of the institutional set up and the arrangements for solving the discard problem in Denmark, the Faroe Islands, Iceland and Norway, the article identifies the discard problem as related...

  3. ATHENA [Advanced Thermal Hydraulic Energy Network Analyzer] solutions to developmental assessment problems

    International Nuclear Information System (INIS)

    Carlson, K.E.; Ransom, V.H.; Roth, P.A.

    1987-03-01

    The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code has been developed to perform transient simulation of the thermal hydraulic systems that may be found in fusion reactors, space reactors, and other advanced systems. As an assessment of current capability the code was applied to a number of physical problems, both conceptual and actual experiments. Results indicate that the numerical solution to the basic conservation equations is technically sound, and that generally good agreement can be obtained when modeling relevant hydrodynamic experiments. The assessment also demonstrates basic fusion system modeling capability and verifies compatibility of the code with both CDC and CRAY mainframes. Areas where improvements could be made include constitutive modeling, which describes the interfacial exchange term. 13 refs., 84 figs

  4. Level of satisfaction of older persons with their general practitioner and practice: role of complexity of health problems.

    Directory of Open Access Journals (Sweden)

    Antonius J Poot

    Full Text Available BACKGROUND: Satisfaction is widely used to evaluate and direct delivery of medical care; a complicated relationship exists between patient satisfaction, morbidity and age. This study investigates the relationships between complexity of health problems and level of patient satisfaction of older persons with their general practitioner (GP) and practice. METHODS AND FINDINGS: This study is embedded in the ISCOPE (Integrated Systematic Care for Older Persons) study. Enlisted patients aged ≥75 years from 59 practices received a written questionnaire to screen for complex health problems (somatic, functional, psychological and social). For 2664 randomly chosen respondents (median age 82 years; 68% female) information was collected on level of satisfaction (satisfied, neutral, dissatisfied) with their GP and general practice, and demographic and clinical characteristics including complexity of health problems. Of all participants, 4% was dissatisfied with their GP care, 59% neutral and 37% satisfied. Between these three categories no differences were observed in age, gender, country of birth or education level. The percentage of participants dissatisfied with their GP care increased from 0.4% in those with 0 problem domains to 8% in those with 4 domains, i.e. having complex health problems (p<0.001). Per additional health domain with problems, the risk of being dissatisfied increased 1.7 times (95% CI 1.4-2.14; p<0.001). This was independent of age, gender, and demographic and clinical parameters (adjusted OR 1.4, 95% CI 1.1-1.8; p = 0.021). CONCLUSION: In older persons, dissatisfaction with general practice is strongly correlated with rising complexity of health problems, independent of age, demographic and clinical parameters. It remains unclear whether complexity of health problems is a patient characteristic influencing the perception of care, or whether the care is unable to handle the demands of these patients. Prospective studies are needed to investigate the causal associations between care organization, patient characteristics, indicators of quality, and patient perceptions.

  5. Two-Level Solutions to Exponentially Complex Problems in Glass Science

    DEFF Research Database (Denmark)

    Mauro, John C.; Smedskjær, Morten Mattrup

    Glass poses an especially challenging problem for physicists. The key to making progress in theoretical glass science is to extract the key physics governing properties of practical interest. In this spirit, we discuss several two-level solutions to exponentially complex problems in glass science....... Topological constraint theory, originally developed by J.C. Phillips, is based on a two-level description of rigid and floppy modes in a glass network and can be used to derive quantitatively accurate and analytically solvable models for a variety of macroscopic properties. The temperature dependence...... that captures both primary and secondary relaxation modes. Such a model also offers the ability to calculate the distinguishability of particles during glass transition and relaxation processes. Two-level models can also be used to capture the distribution of various network-forming species in mixed...

  6. How students process equations in solving quantitative synthesis problems? Role of mathematical complexity in students’ mathematical performance

    Directory of Open Access Journals (Sweden)

    Bashirah Ibrahim

    2017-10-01

    Full Text Available We examine students’ mathematical performance on quantitative “synthesis problems” with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students’ mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students’ simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students’ formulation and combination of equations. Several reasons may explain this difference, including the students’ different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.

  7. The problem of sustainability within the complexity of agricultural production systems

    International Nuclear Information System (INIS)

    Cotes Torres, Alejandro; Cotes Torres, Jose Miguel

    2005-01-01

    The problem of sustainability is a topic that, since the end of the XX century, has increasingly worried the different sectors of society, becoming one of the topics of greatest interest for the managers, consumers, academics and researchers who make up the different agricultural food chains of the world. This paper presents, from the general systems theory point of view, some elements of critical reflection, approaching the problem of sustainability from the complexity of agricultural production systems, beginning with the original philosophical conception of agriculture and ending by outlining some considerations that should be kept in mind for the development of scientific and technological advances consistent with the needs of the agricultural food chains of the XXI century. These considerations orient not only the work of the professionals who lead the processes of animal and vegetable production, but also create a sense of belonging among all of the participants in the chain, highlighting the importance of studying, by means of systemic thought, agronomy and animal science as disciplines that approach the complexities of agriculture, which is the cornerstone of civilization as we know it at the moment.

  8. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem

    Directory of Open Access Journals (Sweden)

    Williams PAH

    2015-07-01

    Full Text Available Patricia AH Williams, Andrew J Woodward eHealth Research Group and Security Research Institute, Edith Cowan University, Perth, WA, Australia Abstract: The increased connectivity to existing computer networks has exposed medical devices to cybersecurity vulnerabilities from which they were previously shielded. For the prevention of cybersecurity incidents, it is important to recognize the complexity of the operational environment as well as to catalog the technical vulnerabilities. Cybersecurity protection is not just a technical issue; it is a richer and more intricate problem to solve. A review of the factors that contribute to such a potentially insecure environment, together with the identification of the vulnerabilities, is important for understanding why these vulnerabilities persist and what the solution space should look like. This multifaceted problem must be viewed from a systemic perspective if adequate protection is to be put in place and patient safety concerns addressed. This requires technical controls, governance, resilience measures, consolidated reporting, context expertise, regulation, and standards. It is evident that a coordinated, proactive approach to address this complex challenge is essential. In the interim, patient safety is under threat. Keywords: cybersecurity, security, safety, wireless, risk, medical devices

  9. Molecular computing towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main differences between molecular circuits and traditional digital circuits in handling the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to computational aspects of digital computers and molecular computing, data representation in molecular computing, molecular operations in molecular computing and number representation in molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single-precision and double-precision floating-point numbers, to implement addition and subtraction of unsigned integers, and to construct logic operations...

  10. The Consensus String Problem and the Complexity of Comparing Hidden Markov Models

    DEFF Research Database (Denmark)

    Lyngsø, Rune Bang; Pedersen, Christian Nørgaard Storm

    2002-01-01

    The basic theory of hidden Markov models was developed and applied to problems in speech recognition in the late 1960s, and has since then been applied to numerous problems, e.g. biological sequence analysis. Most applications of hidden Markov models are based on efficient algorithms for computing...... the probability of generating a given string, or computing the most likely path generating a given string. In this paper we consider the problem of computing the most likely string, or consensus string, generated by a given model, and its implications on the complexity of comparing hidden Markov models. We show...... that computing the consensus string, and approximating its probability within any constant factor, is NP-hard, and that the same holds for the closely related labeling problem for class hidden Markov models. Furthermore, we establish the NP-hardness of comparing two hidden Markov models under the L∞- and L1...
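    The hardness results concern the consensus string; the efficient computation the abstract contrasts them with, the probability that a hidden Markov model generates a given string, is the classical forward algorithm. A minimal sketch with an arbitrary toy two-state HMM follows; the parameters are illustrative, not from the paper.

```python
import numpy as np

def forward_probability(obs, init, trans, emit):
    """Probability that an HMM generates the observation sequence `obs`
    (the classical forward algorithm, O(len(obs) * n_states^2))."""
    alpha = init * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return alpha.sum()

# Toy 2-state HMM over the alphabet {0, 1}.
init = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
emit = np.array([[0.9, 0.1],
                 [0.2, 0.8]])

print(forward_probability([0, 1, 1, 0], init, trans, emit))
```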

  11. How Health Care Complexity Leads to Cooperation and Affects the Autonomy of Health Care Professionals

    NARCIS (Netherlands)

    Molleman, Eric; Broekhuis, Manda; Stoffels, Renee; Jaspers, Frans

    2008-01-01

    Health professionals increasingly face patients with complex health problems and this pressurizes them to cooperate. The authors have analyzed how the complexity of health care problems relates to two types of cooperation: consultation and multidisciplinary teamwork (MTW). Moreover, they have

  12. Individual Differences in Students' Complex Problem Solving Skills: How They Evolve and What They Imply

    Science.gov (United States)

    Wüstenberg, Sascha; Greiff, Samuel; Vainikainen, Mari-Pauliina; Murphy, Kevin

    2016-01-01

    Changes in the demands posed by increasingly complex workplaces in the 21st century have raised the importance of nonroutine skills such as complex problem solving (CPS). However, little is known about the antecedents and outcomes of CPS, especially with regard to malleable external factors such as classroom climate. To investigate the relations…

  13. Building University Capacity to Visualize Solutions to Complex Problems in the Arctic

    Science.gov (United States)

    Broderson, D.; Veazey, P.; Raymond, V. L.; Kowalski, K.; Prakash, A.; Signor, B.

    2016-12-01

    Rapidly changing environments are creating complex problems across the globe, which are particularly magnified in the Arctic. These worldwide challenges can best be addressed through diverse and interdisciplinary research teams. It is incumbent on such teams to promote co-production of knowledge and data-driven decision-making by identifying effective methods to communicate their findings and to engage with the public. Decision Theater North (DTN) is a new semi-immersive visualization system that provides a space for teams to collaborate and develop solutions to complex problems, relying on diverse sets of skills and knowledge. It provides a venue to synthesize the talents of scientists, who gather information (data); modelers, who create models of complex systems; artists, who develop visualizations; communicators, who connect and bridge populations; and policymakers, who can use the visualizations to develop sustainable solutions to pressing problems. The mission of Decision Theater North is to provide a cutting-edge visual environment to facilitate dialogue and decision-making by stakeholders including government, industry, communities and academia. We achieve this mission by adopting a multi-faceted approach reflected in the theater's design, technology, networking capabilities, user support, community relationship building, and strategic partnerships. DTN is a joint project of Alaska's National Science Foundation Experimental Program to Stimulate Competitive Research (NSF EPSCoR) and the University of Alaska Fairbanks (UAF), who have brought the facility up to full operational status and are now expanding its development space to support larger team science efforts. Based in Fairbanks, Alaska, DTN is uniquely poised to address changes taking place in the Arctic and subarctic, and is connected with a larger network of decision theaters that include the Arizona State University Decision Theater Network and the McCain Institute in Washington, DC.

  14. Predictability problems of global change as seen through natural systems complexity description. 2. Approach

    Directory of Open Access Journals (Sweden)

    Vladimir V. Kozoderov

    1998-01-01

    Full Text Available Developing the general statements of the proposed global change theory, outlined in Part 1 of the publication, Kolmogorov's probability space is used to study the properties of information measures (unconditional, joint and conditional entropies, information divergence, mutual information, etc.). Sets of elementary events, the specified algebra of their subsets and probability measures for the algebra are the constituent parts of the space. The information measures are analyzed using the mathematical expectation operator and the correspondence between an additive function of sets and its equivalents in the form of the measures. As a result, explanations are given of multispectral satellite imagery visualization procedures using Markov chains of random variables represented by pixels of the imagery. The proposed formalism of information measures makes it possible to describe the complexity of natural targets by syntactically governing probabilities. Posed as the problem of finding signal/noise ratios for anomalies of natural processes, the predictability problem is solved by analyzing temporal data sets of related measurements for key regions and their background within contextually coherent structures of natural targets and between particular boundaries of the structures.
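    For readers unfamiliar with the information measures listed above, a minimal sketch of how entropy and mutual information are computed from a discrete joint distribution follows; the joint distribution is an arbitrary illustration, not data from the study.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a discrete joint distribution."""
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Illustrative joint distribution of two pixel-derived variables.
joint = np.array([[0.30, 0.10],
                  [0.05, 0.55]])

print("H(X)   =", round(entropy(joint.sum(axis=1)), 4))
print("I(X;Y) =", round(mutual_information(joint), 4))
```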

  15. Case study method and problem-based learning: utilizing the pedagogical model of progressive complexity in nursing education.

    Science.gov (United States)

    McMahon, Michelle A; Christopher, Kimberly A

    2011-08-19

    As the complexity of health care delivery continues to increase, educators are challenged to determine educational best practices to prepare BSN students for the ambiguous clinical practice setting. Integrative, active, and student-centered curricular methods are encouraged to foster student ability to use clinical judgment for problem solving and informed clinical decision making. The proposed pedagogical model of progressive complexity in nursing education suggests gradually introducing students to complex and multi-contextual clinical scenarios through the utilization of case studies and problem-based learning activities, with the intention to transition nursing students into autonomous learners and well-prepared practitioners at the culmination of a nursing program. Exemplar curricular activities are suggested to potentiate student development of a transferable problem solving skill set and a flexible knowledge base to better prepare students for practice in future novel clinical experiences, which is a mutual goal for both educators and students.

  16. Level of satisfaction of older persons with their general practitioner and practice: role of complexity of health problems.

    Science.gov (United States)

    Poot, Antonius J; den Elzen, Wendy P J; Blom, Jeanet W; Gussekloo, Jacobijn

    2014-01-01

    Satisfaction is widely used to evaluate and direct delivery of medical care; a complicated relationship exists between patient satisfaction, morbidity and age. This study investigates the relationships between complexity of health problems and level of patient satisfaction of older persons with their general practitioner (GP) and practice. This study is embedded in the ISCOPE (Integrated Systematic Care for Older Persons) study. Enlisted patients aged ≥75 years from 59 practices received a written questionnaire to screen for complex health problems (somatic, functional, psychological and social). For 2664 randomly chosen respondents (median age 82 years; 68% female) information was collected on level of satisfaction (satisfied, neutral, dissatisfied) with their GP and general practice, and demographic and clinical characteristics including complexity of health problems. Of all participants, 4% was dissatisfied with their GP care, 59% neutral and 37% satisfied. Between these three categories no differences were observed in age, gender, country of birth or education level. The percentage of participants dissatisfied with their GP care increased from 0.4% in those with 0 problem domains to 8% in those with 4 domains, i.e. having complex health problems (p<0.001). Per additional health domain with problems, the risk of being dissatisfied increased 1.7 times (95% CI 1.4-2.14; p<0.001). This was independent of age, gender, and demographic and clinical parameters (adjusted OR 1.4, 95% CI 1.1-1.8; p = 0.021). In older persons, dissatisfaction with general practice is strongly correlated with rising complexity of health problems, independent of age, demographic and clinical parameters. It remains unclear whether complexity of health problems is a patient characteristic influencing the perception of care, or whether the care is unable to handle the demands of these patients. Prospective studies are needed to investigate the causal associations between care organization, patient characteristics, indicators of quality, and patient perceptions.

  17. Application of Artificial Neural Networks to Complex Groundwater Management Problems

    International Nuclear Information System (INIS)

    Coppola, Emery; Poulton, Mary; Charles, Emmanuel; Dustman, John; Szidarovszky, Ferenc

    2003-01-01

    As water quantity and quality problems become increasingly severe, accurate prediction and effective management of scarcer water resources will become critical. In this paper, the successful application of artificial neural network (ANN) technology is described for three types of groundwater prediction and management problems. In the first example, an ANN was trained with simulation data from a physically based numerical model to predict head (groundwater elevation) at locations of interest under variable pumping and climate conditions. The ANN achieved a high degree of predictive accuracy, and its derived state-transition equations were embedded into a multiobjective optimization formulation and solved to generate a trade-off curve depicting water supply in relation to contamination risk. In the second and third examples, ANNs were developed with real-world hydrologic and climate data for different hydrogeologic environments. For the second problem, an ANN was developed using data collected for a 5-year, 8-month period to predict heads in a multilayered surficial and limestone aquifer system under variable pumping, state, and climate conditions. Using weekly stress periods, the ANN substantially outperformed a well-calibrated numerical flow model for the 71-day validation period, and provided insights into the effects of climate and pumping on water levels. For the third problem, an ANN was developed with data collected automatically over a 6-week period to predict hourly heads in 11 high-capacity public supply wells tapping a semiconfined bedrock aquifer and subject to large well-interference effects. Using hourly stress periods, the ANN accurately predicted heads for 24-hour periods in all public supply wells. These test cases demonstrate that the ANN technology can solve a variety of complex groundwater management problems and overcome many of the problems and limitations associated with traditional physically based flow models
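    As a rough illustration of the kind of ANN surrogate described above, and not the study's model or data, the following sketch trains a small multilayer perceptron on synthetic pumping/recharge/prior-head inputs to predict head for the next stress period. scikit-learn is assumed to be available, and the synthetic response function is invented for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stress-period data: pumping rate, recharge, and prior head.
n = 500
pumping = rng.uniform(0.0, 100.0, n)
recharge = rng.uniform(0.0, 10.0, n)
prior_head = rng.uniform(40.0, 60.0, n)
X = np.column_stack([pumping, recharge, prior_head])

# Synthetic "true" head response plus noise (illustrative only).
y = prior_head - 0.05 * pumping + 0.8 * recharge + rng.normal(0.0, 0.3, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
ann.fit(X_train, y_train)
print("R^2 on held-out stress periods:", round(ann.score(X_test, y_test), 3))
```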

  18. The Development of Complex Problem Solving in Adolescence: A Latent Growth Curve Analysis

    Science.gov (United States)

    Frischkorn, Gidon T.; Greiff, Samuel; Wüstenberg, Sascha

    2014-01-01

    Complex problem solving (CPS) as a cross-curricular competence has recently attracted more attention in educational psychology as indicated by its implementation in international educational large-scale assessments such as the Programme for International Student Assessment. However, research on the development of CPS is scarce, and the few…

  19. Nanotechnology for sustainability: what does nanotechnology offer to address complex sustainability problems?

    Energy Technology Data Exchange (ETDEWEB)

    Wiek, Arnim, E-mail: arnim.wiek@asu.edu; Foley, Rider W. [Arizona State University, School of Sustainability (United States); Guston, David H. [Arizona State University, Center for Nanotechnology in Society, Consortium for Science, Policy and Outcomes (United States)

    2012-09-15

    Nanotechnology is widely associated with the promise of positively contributing to sustainability. However, this view often focuses on end-of-pipe applications, for instance, for water purification or energy efficiency, and relies on a narrow concept of sustainability. Approaching sustainability problems and solution options from a comprehensive and systemic perspective instead may yield quite different conclusions about the contribution of nanotechnology to sustainability. This study conceptualizes sustainability problems as complex constellations with several potential intervention points and amenable to different solution options. The study presents results from interdisciplinary workshops and literature reviews that appraise the contribution of the selected nanotechnologies to mitigate such problems. The study focuses exemplarily on the urban context to make the appraisals tangible and relevant. The solution potential of nanotechnology is explored not only for well-known urban sustainability problems such as water contamination and energy use but also for less obvious ones such as childhood obesity. Results indicate not only potentials but also limitations of nanotechnology's contribution to sustainability and can inform anticipatory governance of nanotechnology in general, and in the urban context in particular.

  20. Nanotechnology for sustainability: what does nanotechnology offer to address complex sustainability problems?

    International Nuclear Information System (INIS)

    Wiek, Arnim; Foley, Rider W.; Guston, David H.

    2012-01-01

    Nanotechnology is widely associated with the promise of positively contributing to sustainability. However, this view often focuses on end-of-pipe applications, for instance, for water purification or energy efficiency, and relies on a narrow concept of sustainability. Approaching sustainability problems and solution options from a comprehensive and systemic perspective instead may yield quite different conclusions about the contribution of nanotechnology to sustainability. This study conceptualizes sustainability problems as complex constellations with several potential intervention points and amenable to different solution options. The study presents results from interdisciplinary workshops and literature reviews that appraise the contribution of the selected nanotechnologies to mitigate such problems. The study focuses exemplarily on the urban context to make the appraisals tangible and relevant. The solution potential of nanotechnology is explored not only for well-known urban sustainability problems such as water contamination and energy use but also for less obvious ones such as childhood obesity. Results indicate not only potentials but also limitations of nanotechnology’s contribution to sustainability and can inform anticipatory governance of nanotechnology in general, and in the urban context in particular.

  1. Organization of a multichannel analyzer for gamma ray spectrometry

    International Nuclear Information System (INIS)

    Robinet, Genevieve

    1988-06-01

    This report describes the software organization of a medium-scale multichannel analyzer for qualitative and quantitative measurements of the gamma rays emitted by radioactive samples. The first part reviews the basics of radioactivity, the principle of gamma-ray detection, and the data processing used for interpretation of a nuclear spectrum. The second part first describes the general organization of the software and then gives some details on interactivity, multidetector capabilities, and the integration of complex algorithms for peak search and nuclide identification; problems encountered during the design phase are mentioned and solutions are given. Basic ideas are presented for further developments, such as an expert system that should improve interpretation of the results. The present software has been integrated into a manufactured multichannel analyzer named 'POLYGAM NU416'. [fr]

  2. Complex analysis and dynamical systems new trends and open problems

    CERN Document Server

    Golberg, Anatoly; Jacobzon, Fiana; Shoikhet, David; Zalcman, Lawrence

    2018-01-01

    This book focuses on developments in complex dynamical systems and geometric function theory over the past decade, showing strong links with other areas of mathematics and the natural sciences. Traditional methods and approaches surface in physics and in the life and engineering sciences with increasing frequency – the Schramm‐Loewner evolution, Laplacian growth, and quadratic differentials are just a few typical examples. This book provides a representative overview of these processes and collects open problems in the various areas, while at the same time showing where and how each particular topic evolves. This volume is dedicated to the memory of Alexander Vasiliev.

  3. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    Science.gov (United States)

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  4. Structural qualia: a solution to the hard problem of consciousness.

    Science.gov (United States)

    Loorits, Kristjan

    2014-01-01

    The hard problem of consciousness has often been claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which even the apparently non-structural aspects of consciousness, namely the so-called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms and thereby the hard problem is solved.

  5. Structural qualia: a solution to the hard problem of consciousness

    Directory of Open Access Journals (Sweden)

    Kristjan eLoorits

    2014-03-01

    Full Text Available The hard problem of consciousness has often been claimed to be unsolvable by the methods of traditional empirical sciences. It has been argued that all the objects of empirical sciences can be fully analyzed in structural terms but that consciousness is (or has) something over and above its structure. However, modern neuroscience has introduced a theoretical framework in which even the apparently non-structural aspects of consciousness, namely the so-called qualia or qualitative properties, can be analyzed in structural terms. That framework allows us to see qualia as something compositional with internal structures that fully determine their qualitative nature. Moreover, those internal structures can be identified with certain neural patterns. Thus consciousness as a whole can be seen as a complex neural pattern that misperceives some of its own highly complex structural properties as monadic and qualitative. Such a neural pattern is analyzable in fully structural terms and thereby the hard problem is solved.

  6. Addressing Complex Societal Problems: Enabling Multiple Dimensions of Proximity to Sustain Partnerships for Collective Impact in Quebec

    Directory of Open Access Journals (Sweden)

    Nii A. Addy

    2018-03-01

    Full Text Available Sustainable solutions for complex societal problems, like poverty, require informing stakeholders about progress and changes needed as they collaborate. Yet, inter-organizational collaboration researchers highlight monumental challenges in measuring seemingly intangible factors during collective impact processes. We grapple with the question: How can decision-makers coherently conceptualize and measure seemingly intangible factors to sustain partnerships for the emergence of collective impact? We conducted an inductive process case study to address this question, analyzing data from documents, observations, and interviews of 24 philanthropy leaders and multiple stakeholders in a decades-long partnership involving Canada’s largest private family foundation, government and community networks, and during which a “collective impact project” emerged in Quebec Province, Canada. The multidimensional proximity framework provided an analytical lens. During the first phase of the partnership studied, there was a lack of baseline measurement of largely qualitative factors—conceptualized as cognitive, social, and institutional proximity between stakeholders—which evaluations suggested were important for explaining which community networks successfully brought about desired outcomes. Non-measurement of these factors was a problem in providing evidence for sustained engagement of stakeholders, such as government and local businesses. We develop a multidimensional proximity model that coherently conceptualizes qualitative proximity factors, for measuring their change over time.

  7. The problem of motivating teaching staff in a complex amalgamation.

    Science.gov (United States)

    Kenrick, M A

    1993-09-01

    This paper addresses some of the problems brought about by the merger of a number of schools of nursing into a new complex amalgamation. A very real concern in the new colleges of nursing and midwifery in the United Kingdom is the effect of amalgamation on management systems and staff morale. The main focus of this paper is the motivation of staff during this time of change. There is currently a lack of security amongst staff and in many instances the personal job satisfaction of nurse teachers and managers of nurse education has been reduced, which has made the task of motivating staff difficult. Hence, two major theories of motivation and the implications of these theories for managers of nurse education are discussed. The criteria used for the selection of managers within the new colleges, leadership styles and organizational structures are reviewed. The amalgamations have brought about affiliation with higher-education institutions. Some problems associated with these mergers and the effects on the motivation of staff both within the higher-education institutions and the nursing colleges are outlined. Strategies for overcoming some of the problems are proposed including job enlargement, job enrichment, potential achievement rewards and the use of individual performance reviews which may be useful for assessing the ability of all staff, including managers, in the new amalgamations.

  8. Dynamic Modeling as a Cognitive Regulation Scaffold for Developing Complex Problem-Solving Skills in an Educational Massively Multiplayer Online Game Environment

    Science.gov (United States)

    Eseryel, Deniz; Ge, Xun; Ifenthaler, Dirk; Law, Victor

    2011-01-01

    Following a design-based research framework, this article reports two empirical studies with an educational MMOG, called "McLarin's Adventures," on facilitating 9th-grade students' complex problem-solving skill acquisition in interdisciplinary STEM education. The article discusses the nature of complex and ill-structured problem solving…

  9. Exploring Corn-Ethanol As A Complex Problem To Teach Sustainability Concepts Across The Science-Business-Liberal Arts Curriculum

    Science.gov (United States)

    Oches, E. A.; Szymanski, D. W.; Snyder, B.; Gulati, G. J.; Davis, P. T.

    2012-12-01

    The highly interdisciplinary nature of sustainability presents pedagogic challenges when sustainability concepts are incorporated into traditional disciplinary courses. At Bentley University, where over 90 percent of students major in business disciplines, we have created a multidisciplinary course module centered on corn ethanol that explores a complex social, environmental, and economic problem and develops basic data analysis and analytical thinking skills in several courses spanning the natural, physical, and social sciences within the business curriculum. Through an NSF-CCLI grant, Bentley faculty from several disciplines participated in a summer workshop to define learning objectives, create course modules, and develop an assessment plan to enhance interdisciplinary sustainability teaching. The core instructional outcome was a data-rich exercise for all participating courses in which students plot and analyze multiple parameters of corn planted and harvested for various purposes including food (human), feed (animal), ethanol production, and commodities exchanged for the years 1960 to present. Students then evaluate patterns and trends in the data and hypothesize relationships among the plotted data and environmental, social, and economic drivers, responses, and unintended consequences. After the central data analysis activity, students explore corn ethanol production as it relates to core disciplinary concepts in their individual classes. For example, students in Environmental Chemistry produce ethanol using corn and sugar as feedstocks and compare the efficiency of each process, while learning about enzymes, fermentation, distillation, and other chemical principles. Principles of Geology students examine the effects of agricultural runoff on surface water quality associated with extracting greater agricultural yield from mid-continent croplands. The American Government course examines the role of political institutions, the political process, and various

  10. Student Learning of Complex Earth Systems: A Model to Guide Development of Student Expertise in Problem-Solving

    Science.gov (United States)

    Holder, Lauren N.; Scherer, Hannah H.; Herbert, Bruce E.

    2017-01-01

    Engaging students in problem-solving concerning environmental issues in near-surface complex Earth systems involves developing student conceptualization of the Earth as a system and applying that scientific knowledge to the problems using practices that model those used by professionals. In this article, we review geoscience education research…

  11. Analysis of decision alternatives of the deep borehole filter restoration problem

    International Nuclear Information System (INIS)

    Abdildin, Yerkin G.; Abbas, Ali E.

    2016-01-01

    The energy problem is one of the biggest challenges facing the world in the 21st century. Nuclear energy is the fastest-growing contributor to world energy, and uranium mining is the primary step in its chain. One of the fundamental problems in the uranium extraction industry is the deep borehole filter restoration problem. This decision problem is very complex due to multiple objectives and various uncertainties. Besides improving uranium production, the decision makers often need to meet internationally recognized standards (ISO 14001) for labor protection, safety measures, and preservation of the environment. The problem can be simplified by constructing a multiattribute utility function, but the choice of the appropriate functional form requires the practical evaluation of different methods. In the present work, we evaluate the alternatives of this complex problem with two distinct approaches for analyzing decision problems. The decision maker and assessor is a Deputy Director General of a transnational corporation. - Highlights: • Analyzes 5 borehole recovery methods across the 4 most important attributes (criteria). • Considers financial, technological, environmental, and safety factors. • Compares two decision analysis approaches and the profit analysis. • Illustrates the assessments of the decision maker's preferences. • Determines that the assumption of independence of attributes yields imprecise recommendations.
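    The abstract does not report the elicited utilities or weights. Purely to illustrate the additive multiattribute-utility calculation that underlies one of the compared approaches, the sketch below scores five hypothetical restoration methods on the four attribute groups named in the highlights; all numbers are invented.

```python
import numpy as np

# Hypothetical single-attribute utilities (0-1) of five borehole restoration
# methods across four attributes: financial, technological, environmental, safety.
utilities = np.array([
    [0.80, 0.55, 0.60, 0.70],   # method A
    [0.60, 0.75, 0.70, 0.65],   # method B
    [0.90, 0.40, 0.50, 0.60],   # method C
    [0.50, 0.85, 0.80, 0.75],   # method D
    [0.70, 0.60, 0.65, 0.85],   # method E
])

# Hypothetical attribute weights elicited from the decision maker (sum to 1).
weights = np.array([0.35, 0.25, 0.20, 0.20])

# The additive form assumes (additive) independence of the attributes.
overall = utilities @ weights
for i, u in enumerate(overall):
    print(f"method {'ABCDE'[i]}: U = {u:.3f}")
print("recommended:", "ABCDE"[overall.argmax()])
```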

  12. Numerical nonlinear complex geometrical optics algorithm for the 3D Calderón problem

    DEFF Research Database (Denmark)

    Delbary, Fabrice; Knudsen, Kim

    2014-01-01

    to the generalized Laplace equation. The 3D problem was solved in theory in the late 1980s using complex geometrical optics solutions and a scattering transform. Several approximations to the reconstruction method have been suggested and implemented numerically in the literature, but here, for the first time, a complete...... computer implementation of the full nonlinear algorithm is given. First, a boundary integral equation is solved by a Nystrom method for the traces of the complex geometrical optics solutions; second, the scattering transform is computed and inverted using the fast Fourier transform; and finally a boundary value

  13. HSTLBO: A hybrid algorithm based on Harmony Search and Teaching-Learning-Based Optimization for complex high-dimensional optimization problems.

    Directory of Open Access Journals (Sweden)

    Shouheng Tuo

    Full Text Available Harmony Search (HS) and Teaching-Learning-Based Optimization (TLBO), as new swarm intelligent optimization algorithms, have received much attention in recent years. Both of them have shown outstanding performance in solving NP-hard optimization problems. However, they also suffer dramatic performance degradation on some complex high-dimensional optimization problems. Through extensive experiments, we find that HS and TLBO are strongly complementary to each other. HS has strong global exploration power but low convergence speed. Conversely, TLBO has a much faster convergence speed but is easily trapped in local optima. In this work, we propose a hybrid search algorithm named HSTLBO that merges the two algorithms together to synergistically solve complex optimization problems using a self-adaptive selection strategy. In HSTLBO, both HS and TLBO are modified with the aim of balancing the global exploration and exploitation abilities, where HS aims mainly to explore the unknown regions and TLBO aims to rapidly exploit high-precision solutions in the known regions. Our experimental results demonstrate better performance and faster speed than five state-of-the-art HS variants and show better exploration power than five good TLBO variants with similar run time, which illustrates that our method is promising in solving complex high-dimensional optimization problems. The experiments on portfolio optimization problems also demonstrate that HSTLBO is effective in solving complex real-world applications.
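    The authors' exact operators and self-adaptive selection rule are not given in the abstract. The sketch below is a simplified stand-in for the hybrid idea: each iteration chooses between an HS-style improvisation and a TLBO-style teacher phase, and the selection probability is nudged toward whichever operator most recently improved the best solution. Operator details, parameter values, and the sphere test function are all assumptions, not the published HSTLBO.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    # Toy high-dimensional objective to minimize.
    return float(np.sum(x ** 2))

def hs_step(pop, fit, hmcr=0.9, par=0.3, bw=0.05, lo=-5.0, hi=5.0):
    """Harmony-Search-style improvisation: build one new vector from memory."""
    dim = pop.shape[1]
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < hmcr:                         # memory consideration
            new[j] = pop[rng.integers(pop.shape[0]), j]
            if rng.random() < par:                      # pitch adjustment
                new[j] += bw * rng.uniform(-1.0, 1.0)
        else:                                           # random consideration
            new[j] = rng.uniform(lo, hi)
    worst = int(np.argmax(fit))
    f_new = sphere(new)
    if f_new < fit[worst]:                              # replace the worst harmony
        pop[worst], fit[worst] = new, f_new
    return pop, fit

def tlbo_step(pop, fit):
    """TLBO-style teacher phase: move everyone toward the current best."""
    teacher = pop[int(np.argmin(fit))]
    mean = pop.mean(axis=0)
    tf = rng.integers(1, 3)                             # teaching factor in {1, 2}
    for i in range(pop.shape[0]):
        cand = pop[i] + rng.random(pop.shape[1]) * (teacher - tf * mean)
        f_cand = sphere(cand)
        if f_cand < fit[i]:
            pop[i], fit[i] = cand, f_cand
    return pop, fit

def hstlbo(dim=20, pop_size=30, iters=300):
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    fit = np.array([sphere(p) for p in pop])
    p_hs = 0.5                                          # self-adaptive selection probability
    for _ in range(iters):
        best_before = fit.min()
        use_hs = rng.random() < p_hs
        pop, fit = hs_step(pop, fit) if use_hs else tlbo_step(pop, fit)
        improved = fit.min() < best_before
        # Nudge the selection probability toward whichever operator just helped.
        p_hs = np.clip(p_hs + (0.02 if (use_hs == improved) else -0.02), 0.1, 0.9)
    return fit.min()

print("best objective found:", hstlbo())
```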

  14. Using machine-learning methods to analyze economic loss function of quality management processes

    Science.gov (United States)

    Dzedik, V. A.; Lontsikh, P. A.

    2018-05-01

    In the analysis of quality management systems, the economic component is often analyzed insufficiently. To overcome this issue, it is necessary to take the concept of the economic loss function beyond tolerance-based thinking and address it directly. Input data on economic losses in processes have a complex form; thus, using standard tools to solve this problem is difficult. The use of machine-learning techniques allows one to obtain precise models of the economic loss function based on even the most complex input data. The results of such analysis contain data about the true efficiency of a process and can be used to make investment decisions.
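    The paper does not specify which machine-learning model is used. As one hedged illustration of moving from a pass/fail tolerance band to a fitted economic loss curve, the sketch below fits a quadratic (Taguchi-style) loss model to synthetic deviation/cost observations with ordinary polynomial regression; the data and the quadratic form are assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Synthetic observations: deviation of a process characteristic from target
# versus the economic loss actually incurred (illustrative data only).
deviation = rng.uniform(-3.0, 3.0, 200).reshape(-1, 1)
loss = 4.0 * deviation.ravel() ** 2 + rng.normal(0.0, 2.0, 200)

# Fit a smooth loss model instead of relying on a pass/fail tolerance band.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(deviation, loss)

for d in (0.0, 1.0, 2.5):
    print(f"predicted loss at deviation {d}: {model.predict([[d]])[0]:.2f}")
```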

  15. Status of the Monte Carlo library least-squares (MCLLS) approach for non-linear radiation analyzer problems

    Science.gov (United States)

    Gardner, Robin P.; Xu, Libai

    2009-10-01

    The Center for Engineering Applications of Radioisotopes (CEAR) has been working for over a decade on the Monte Carlo library least-squares (MCLLS) approach for treating non-linear radiation analyzer problems including: (1) prompt gamma-ray neutron activation analysis (PGNAA) for bulk analysis, (2) energy-dispersive X-ray fluorescence (EDXRF) analyzers, and (3) carbon/oxygen tool analysis in oil well logging. This approach essentially consists of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required background libraries. These libraries are then used in the linear library least-squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. Iterations of this are used until the LLS values agree with the composition used to generate the libraries. The current status of the methods (and topics) necessary to implement the MCLLS approach is reported. This includes: (1) the Monte Carlo codes such as CEARXRF, CEARCPG, and CEARCO for forward generation of the necessary elemental library spectra for the LLS calculation for X-ray fluorescence, neutron capture prompt gamma-ray analyzers, and carbon/oxygen tools; (2) the correction of spectral pulse pile-up (PPU) distortion by Monte Carlo simulation with the code CEARIPPU; (3) generation of detector response functions (DRF) for detectors with linear and non-linear responses for Monte Carlo simulation of pulse-height spectra; and (4) the use of the differential operator (DO) technique to make the necessary iterations for non-linear responses practical. In addition to commonly analyzed single spectra, coincidence spectra or even two-dimensional (2-D) coincidence spectra can also be used in the MCLLS approach and may provide more accurate results.
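    The linear library least-squares (LLS) step at the core of the approach can be illustrated with a toy example: an unknown spectrum is fitted as a non-negative combination of elemental library spectra. The Gaussian "library" spectra and noise level below are invented for illustration and are unrelated to CEAR's codes.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)

def toy_library(peak_channels, n_channels=128, width=3.0):
    """Build toy elemental library spectra as single Gaussian peaks."""
    ch = np.arange(n_channels)
    return np.column_stack([
        np.exp(-0.5 * ((ch - c) / width) ** 2) for c in peak_channels
    ])

# Libraries for three "elements" plus a flat background library.
A = toy_library([30, 60, 95])
A = np.column_stack([A, np.ones(A.shape[0]) * 0.05])

true_amounts = np.array([5.0, 2.0, 8.0, 1.0])
measured = A @ true_amounts + rng.normal(0.0, 0.05, A.shape[0])

# Linear library least-squares with non-negativity on the library multipliers.
amounts, residual = nnls(A, measured)
print("estimated library multipliers:", np.round(amounts, 2))
```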

  16. Effective economics of nuclear fuel power complex

    International Nuclear Information System (INIS)

    Shevelev, Ya.V.; Klimenko, A.V.

    1996-01-01

    Problems in the economic theory and practice of operating the nuclear fuel power complex (NFPC) are considered. The use of the market-equilibrium principle for optimizing the hierarchical NFPC system is analyzed. The main attention is paid to determining production and consumption prices for NFPC enterprises. Economic approaches to the optimization calculations are described. The ecological safety of NPPs and NFPC enterprises is analyzed, and a concept of market socialism is presented.

  17. Analyzing Pre-Service Primary Teachers' Fraction Knowledge Structures through Problem Posing

    Science.gov (United States)

    Kilic, Cigdem

    2015-01-01

    This study aimed to determine pre-service primary teachers' knowledge structures of fractions through problem-posing activities. A total of 90 pre-service primary teachers participated in the study. A problem-posing test consisting of two questions was used, and the participants were asked to generate as many problems as possible based on the…

  18. Searching for Order Within Chaos: Complexity Theory's Implications to Intelligence Support During Joint Operational Planning

    Science.gov (United States)

    2017-06-09

    Subject terms: Complexity Theory, Complex Systems Theory, Complex Adaptive Systems, Dynamical Systems, Joint Operational Planning. The thesis asks how complexity theory can be used to analyze military problems and increase joint staff understanding of the operational environment during joint operational planning. Complex Systems Theory is defined as “the study of the behavior of [complex adaptive] systems” (Ilachinski 2004, 4).

  19. Using iMCFA to Perform the CFA, Multilevel CFA, and Maximum Model for Analyzing Complex Survey Data.

    Science.gov (United States)

    Wu, Jiun-Yu; Lee, Yuan-Hsuan; Lin, John J H

    2018-01-01

    To construct CFA, MCFA, and maximum MCFA models with LISREL v.8 and below, we provide iMCFA (integrated Multilevel Confirmatory Analysis) to examine the potential multilevel factorial structure in complex survey data. Modeling multilevel structure for complex survey data is complicated because building a multilevel model is not an infallible statistical strategy unless the hypothesized model is close to the real data structure. Methodologists have suggested using different modeling techniques to investigate the potential multilevel structure of survey data. Using iMCFA, researchers can visually set the between- and within-level factorial structure to fit MCFA, CFA and/or MAX MCFA models for complex survey data. iMCFA can then yield between- and within-level variance-covariance matrices, calculate intraclass correlations, perform the analyses and generate the outputs for the respective models. The summary of the analytical outputs from LISREL is gathered and tabulated for further model comparison and interpretation. iMCFA also provides LISREL syntax of the different models for researchers' future use. An empirical and a simulated multilevel dataset with complex and simple structures in the within or between level were used to illustrate the usability and effectiveness of the iMCFA procedure in analyzing complex survey data. The analytic results of iMCFA using Muthen's limited information estimator were compared with those of Mplus using Full Information Maximum Likelihood regarding the effectiveness of different estimation methods.
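
    Two of the quantities mentioned above, intraclass correlations and the between/within variance decomposition, can be illustrated independently of iMCFA or LISREL on simulated clustered data. The ANOVA-based ICC(1) estimator used below is a standard textbook formula, not necessarily the one implemented in iMCFA, and the simulated variances are arbitrary.

```python
# Standalone ICC(1) sketch on simulated clustered data (true ICC = 1/(1+4) = 0.2).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
groups, n_per = 50, 20
cluster_effect = rng.normal(0.0, 1.0, size=(groups, 1))       # between-level SD = 1
data = pd.DataFrame({
    "cluster": np.repeat(np.arange(groups), n_per),
    "y": (cluster_effect.repeat(n_per, axis=0).ravel()
          + rng.normal(0.0, 2.0, groups * n_per)),            # within-level SD = 2
})

def icc1(df, var, cluster):
    k = df.groupby(cluster)[var].size().mean()                # average cluster size
    msb = k * df.groupby(cluster)[var].mean().var()           # between mean square
    msw = df.groupby(cluster)[var].var().mean()               # within mean square
    return (msb - msw) / (msb + (k - 1) * msw)

print("ICC(1) for y:", round(icc1(data, "y", "cluster"), 3))  # close to 0.2
```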

  20. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    Science.gov (United States)

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-06

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using less information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires less inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as for example lowering cognitive effort and working memory load. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. Analyzing a problem-solution pattern in the transcription of a conversation: suggestions for the EFL classroom

    Directory of Open Access Journals (Sweden)

    Navas Brenes, César A.

    2005-12-01

    Full Text Available This paper analyzes a problem-solution pattern shown in the transcription of a conversation. The analysis is based on a conversation elicited from three native speakers of English, who were given a topic dealing with the problem of children being constantly exposed to violent video games. As a result, the writer recorded an oral text that contains several elements related to a problem-solution pattern, such as the main issue, opinions, personal examples, possible solutions, and the evaluation of these solutions. The writer analyzed this pattern in terms of discourse analysis using idea units from the transcription. Moreover, the writer points out how appropriate this transcription is for preparing teaching tasks for the EFL classroom.

  2. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    Science.gov (United States)

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  3. The Problem of the Oedipus Complex in Haruki Murakami's Novel Umibe no Kafuka

    Directory of Open Access Journals (Sweden)

    Linda Unsriana

    2011-05-01

    Full Text Available Haruki Murakami is a Japanese novelist whose works have been translated into various languages for a wide range of readers around the world. He was awarded the Yomiuri Literary Award for his novel The Wind-up Bird Chronicle in 1996. In his novel Umibe no Kafuka, Murakami describes the life of a young man and the Oedipus complex problems he experiences. This study looks for the root causes of Kafka Tamura's Oedipus complex and analyzes them through the method of role characterization.

  4. Generalist solutions to complex problems: generating practice-based evidence - the example of managing multi-morbidity

    NARCIS (Netherlands)

    Reeve, J.; Blakeman, T.; Freeman, G.K.; Green, L.A.; James, P.A.; Lucassen, P.L.; Martin, C.M.; Sturmberg, J.P.; Weel, C. van

    2013-01-01

    BACKGROUND: A growing proportion of people are living with long term conditions. The majority have more than one. Dealing with multi-morbidity is a complex problem for health systems: for those designing and implementing healthcare as well as for those providing the evidence informing practice. Yet

  5. Problem-solving with multiple interdependent criteria: better solution to complex problems

    International Nuclear Information System (INIS)

    Carlsson, C.; Fuller, R.

    1996-01-01

    We consider multiple objective programming (MOP) problems with additive interdependencies, that is, when the states of a chosen objective are attained through supportive or inhibitory feedbacks from several other objectives. MOP problems with independent objectives (when the cause-effect relations between the decision variables and the objectives are completely known) are treated as special cases of MOP problems with interdependent objectives. We illustrate our ideas with a simple three-objective real-life problem.

  6. Encyclopedia of Complexity and Systems Science

    CERN Document Server

    Meyers, Robert A

    2009-01-01

    Encyclopedia of Complexity and Systems Science provides an authoritative single source for understanding and applying the concepts of complexity theory together with the tools and measures for analyzing complex systems in all fields of science and engineering. The science and tools of complexity and systems science include theories of self-organization, complex systems, synergetics, dynamical systems, turbulence, catastrophes, instabilities, nonlinearity, stochastic processes, chaos, neural networks, cellular automata, adaptive systems, and genetic algorithms. Examples of near-term problems and major unknowns that can be approached through complexity and systems science include: The structure, history and future of the universe; the biological basis of consciousness; the integration of genomics, proteomics and bioinformatics as systems biology; human longevity limits; the limits of computing; sustainability of life on earth; predictability, dynamics and extent of earthquakes, hurricanes, tsunamis, and other n...

  7. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    Science.gov (United States)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  8. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyzes the shortcomings of the classic capital market theories based on the EMH and discloses the essentially complex nature of the capital market. Considering the capital market a complicated, interactive and adaptive dynamic system, and taking complexity science as the method for researching the operating laws of the capital market, the paper constructs a nonlinear logical model to analyze the scope of application, focal points and interrelationships of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics, catastrophe theory and scale theory, and it summarizes and discusses the achievements and open problems of each theory. Based on this research, the paper foretells the developing direction of complexity science in the capital market.

  9. Waste Collection Vehicle Routing Problem: Literature Review

    Directory of Open Access Journals (Sweden)

    Hui Han

    2015-08-01

    Full Text Available Waste generation is an issue which has caused wide public concern in modern societies, not only because of the quantitative rise in the amount of waste generated, but also because of the increasing complexity of some products and components. Waste collection is a highly relevant activity in the reverse logistics system, and how to collect waste efficiently is an area that needs to be improved. This paper analyzes the major contributions on the Waste Collection Vehicle Routing Problem (WCVRP) in the literature. Based on a classification of waste collection (residential, commercial and industrial), the key findings for these three types of waste collection are first presented. Then, according to the models (Node Routing Problems and Arc Routing Problems) used to represent the WCVRP, the different methods and techniques used to solve the WCVRP are analyzed. This paper attempts to serve as a roadmap of the research literature produced in the field of WCVRP.
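
    As a minimal illustration of the node-routing view of waste collection, the sketch below builds capacity-respecting routes with a nearest-neighbour heuristic; the coordinates, waste amounts and vehicle capacity are invented for the example and the heuristic is only one of many methods surveyed in this literature.

```python
# Toy capacitated nearest-neighbour routes for a made-up waste collection instance.
import math

depot = (0.0, 0.0)
bins = {  # bin id -> ((x, y), waste amount)
    1: ((2, 3), 4), 2: ((5, 1), 2), 3: ((6, 5), 3),
    4: ((1, 7), 5), 5: ((8, 2), 1), 6: ((3, 8), 2),
}
capacity = 8

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

unvisited = set(bins)
routes = []
while unvisited:
    load, position, route = 0, depot, []
    while True:
        feasible = [i for i in unvisited if load + bins[i][1] <= capacity]
        if not feasible:
            break                                   # vehicle full: return to depot
        nxt = min(feasible, key=lambda i: dist(position, bins[i][0]))
        route.append(nxt)
        load += bins[nxt][1]
        position = bins[nxt][0]
        unvisited.remove(nxt)
    routes.append(route)

print(routes)   # [[1, 2, 5], [4, 6], [3]] for this data
```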

  10. Mathematical Models to Determine Stable Behavior of Complex Systems

    Science.gov (United States)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors affecting its behavior. The functioning of the complex dynamic system is described in terms of chaotic states, self-organized criticality and bifurcations. This problem may be resolved by modeling such systems as deterministic dynamical ones, without applying stochastic models, and by taking strange attractors into account.
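
    The paper's own model is not reproduced here; as a stand-in, the sketch below shows how regular and chaotic regimes of a simple deterministic nonlinear system, the logistic map, can be told apart via the Lyapunov exponent, with positive values indicating chaos.

```python
# Lyapunov exponent of the logistic map x_{n+1} = r x_n (1 - x_n).
import numpy as np

def lyapunov_logistic(r, n_transient=500, n_iter=2000, x0=0.4):
    x = x0
    for _ in range(n_transient):                  # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        acc += np.log(abs(r * (1 - 2 * x)))       # log |f'(x)|
    return acc / n_iter

for r in (2.8, 3.2, 3.5, 3.7, 3.9):
    lam = lyapunov_logistic(r)
    print(f"r = {r}: lambda = {lam:+.3f} ({'chaotic' if lam > 0 else 'regular'})")
```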

  11. Problems in creating enviroment and health protection systems

    International Nuclear Information System (INIS)

    Vorob'ev, E.I.; Reznichenko, V.Yu.

    1981-01-01

    The problems of creating environmental and health protection systems are considered in relation to the development of nuclear power facilities. The problem of transitioning from a system of detection and observation to a unified system of environmental and health protection and control is posed. The objectives and problems of such a system are analyzed, and the basic principles of its construction are outlined. A system concept for a fuel-energy complex is described. The usefulness of such systems in solving problems of siting industrial facilities, including nuclear power facilities, of removing these facilities from service, etc., is shown. New requirements for the medical-biological investigations involved in designing such a system are discussed.

  12. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

    Full Text Available Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of a vector-sensor array. The weight vector update of the CQLMS algorithm is derived based on the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
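
    A reduced sketch of the adaptive step is given below using the ordinary complex-valued LMS on a scalar-sensor uniform linear array with a pilot signal; the complex-quaternion CQLMS formulation for 3-component vector-sensors is not reproduced, and the angles, powers and step size are arbitrary choices.

```python
# Complex-valued LMS beamformer sketch (scalar sensors, pilot-aided), not CQLMS.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_snapshots, mu = 8, 4000, 0.005

def steering(theta_deg):
    n = np.arange(n_sensors)                       # half-wavelength spaced ULA
    return np.exp(-1j * np.pi * n * np.sin(np.deg2rad(theta_deg)))

a_sig, a_int = steering(0.0), steering(40.0)
s = (rng.normal(size=n_snapshots) + 1j * rng.normal(size=n_snapshots)) / np.sqrt(2)
i = 3 * (rng.normal(size=n_snapshots) + 1j * rng.normal(size=n_snapshots)) / np.sqrt(2)
noise = 0.1 * (rng.normal(size=(n_sensors, n_snapshots))
               + 1j * rng.normal(size=(n_sensors, n_snapshots)))
X = np.outer(a_sig, s) + np.outer(a_int, i) + noise   # array snapshots

w = np.zeros(n_sensors, dtype=complex)
for k in range(n_snapshots):
    x = X[:, k]
    y = np.vdot(w, x)                  # beamformer output w^H x
    e = s[k] - y                       # error against the pilot signal
    w = w + mu * np.conj(e) * x        # complex LMS weight update

# the adapted beampattern should show gain near 0 deg and a null near 40 deg
for theta in (-40, 0, 40):
    print(theta, round(abs(np.vdot(w, steering(theta))), 3))
```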

  13. Computational complexity in entanglement transformations

    Science.gov (United States)

    Chitambar, Eric A.

    In physics, systems having three parts are typically much more difficult to analyze than those having just two. Even in classical mechanics, predicting the motion of three interacting celestial bodies remains an insurmountable challenge while the analogous two-body problem has an elementary solution. It is as if just by adding a third party, a fundamental change occurs in the structure of the problem that renders it unsolvable. In this thesis, we demonstrate how such an effect is likewise present in the theory of quantum entanglement. In fact, the complexity differences between two-party and three-party entanglement become quite conspicuous when comparing the difficulty in deciding what state changes are possible for these systems when no additional entanglement is consumed in the transformation process. We examine this entanglement transformation question and its variants in the language of computational complexity theory, a powerful subject that formalizes the concept of problem difficulty. Since deciding feasibility of a specified bipartite transformation is relatively easy, this task belongs to the complexity class P. On the other hand, for tripartite systems, we find the problem to be NP-Hard, meaning that its solution is at least as hard as the solution to some of the most difficult problems humans have encountered. One can then rigorously defend the assertion that a fundamental complexity difference exists between bipartite and tripartite entanglement since unlike the former, the full range of forms realizable by the latter is incalculable (assuming P≠NP). However, similar to the three-body celestial problem, when one examines a special subclass of the problem---invertible transformations on systems having at least one qubit subsystem---we prove that the problem can be solved efficiently. As a hybrid of the two questions, we find that the question of tripartite to bipartite transformations can be solved by an efficient randomized algorithm. Our results are

  14. Efficient Solutions to Two-Party and Multiparty Millionaires’ Problem

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2017-01-01

    Full Text Available The millionaires’ problem is the basis of secure multiparty computation and has many applications. Using a vectorization method and the Paillier encryption scheme, we first propose a secure two-party solution to the millionaires’ problem, which can determine whether x = y, x < y, or x > y in one execution. Subsequently, using the vectorization and secret splitting methods, we propose an information-theoretically secure protocol to solve the multiparty millionaires’ problem (a.k.a. the secure sorting problem), and this protocol can resist collusion attacks. We analyze the accuracy and security of our protocols in the semi-honest model and compare the computational and communication complexities of the proposed protocols with those of existing ones.
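
    A heavily simplified, semi-honest illustration of a Paillier-based comparison in the spirit of the vectorization idea is sketched below. It is not the authors' protocol: it assumes both inputs lie in a small public range, the encoding of the comparison vector is an assumption made for the example, and it relies on the third-party python-paillier (phe) package.

```python
# Simplified Paillier comparison sketch (semi-honest), NOT the paper's protocol.
from phe import paillier

N = 16                       # public bound on the inputs
x, y = 9, 5                  # Alice's and Bob's private values

# Alice: encode her value as a comparison vector and encrypt it element-wise
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)
plain_vector = [0 if i < x else (1 if i == x else 2) for i in range(N)]
encrypted_vector = [public_key.encrypt(v) for v in plain_vector]   # sent to Bob

# Bob: select the entry at his own index and re-randomize it before returning
# it, so the returned ciphertext does not reveal which position was chosen
reply = encrypted_vector[y] + public_key.encrypt(0)

# Alice: decrypt the reply; she learns only the relation, not Bob's value y
relation = private_key.decrypt(reply)
print({0: "y < x", 1: "y == x", 2: "y > x"}[relation])   # prints "y < x"
```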

  15. A method for evaluating the problem complex of choosing the ventilation system for a new building

    DEFF Research Database (Denmark)

    Hviid, Christian Anker; Svendsen, Svend

    2007-01-01

    The application of a ventilation system in a new building is a multidimensional complex problem that involves quantifiable and non-quantifiable data like energy consumption, indoor environment, building integration and architectural expression. This paper presents a structured method for evaluat...

  16. Nuclear Society and non-proliferation problems

    International Nuclear Information System (INIS)

    Gagarinskij, A.Ya.; Kushnarev, S.V.; Ponomarev-Stepnoj, N.N.; Sukhoruchkin, V.K.; Khromov, V.V.; Shmelev, V.M.

    1997-01-01

    In 1991, a special working group on the problems of nuclear weapons non-proliferation and nuclear materials control, uniting experts of different backgrounds (nuclear physicists, lawyers, teachers), was created within the USSR Nuclear Society. This group became the mechanism for realizing the Nuclear Society's practical activity in this sphere. Three milestones of this innovative activity can be specified. First milestone: in January 1992 the Central Nuclear Society Board (of the International Public Nuclear Society Association) published a special appeal to the first leaders of all countries that were former USSR republics. This address paid special attention to the unity of the USSR power-industrial complex and pointed out the numerous problems arising from the separation of this complex, including nuclear weapons non-proliferation problems. Second milestone: in 1992 and 1993 the Nuclear Society experts issued two collections, 'Nuclear Non-proliferation and Control Problems', including basic review papers. In addition, materials on non-proliferation and control are published regularly in the Society's press organs. Third milestone: in 1993-1997 several special scientific and technical events (conferences, workshops, meetings) were held, making it possible to analyze the outcomes of joint international projects and contracts and to establish new contacts between specialists of the NIS, the Baltic states and other countries.

  17. Parameters calculation of fuel assembly with complex geometry

    International Nuclear Information System (INIS)

    Wu Hongchun; Ju Haitao; Yao Dong

    2006-01-01

    The DRAGON code was developed for the CANDU reactor by the Ecole Polytechnique de Montreal of Canada. In order to validate the DRAGON code's applicability to complex-geometry fuel assembly calculations, the rod-type fuel assembly of a PWR benchmark problem and the plate-type fuel assembly of an MTR benchmark problem were analyzed with the DRAGON code. Some other fuel assembly shapes were also briefly discussed. The calculation results show that the DRAGON code can be used to calculate variously shaped fuel assemblies with high precision. (authors)

  18. The Assessment of 21st Century Skills in Industrial and Organizational Psychology: Complex and Collaborative Problem Solving

    OpenAIRE

    Neubert, Jonas; Mainert, Jakob; Kretzschmar, André; Greiff, Samuel

    2015-01-01

    In the current paper, we highlight why and how industrial and organizational psychology can take advantage of research on 21st century skills and their assessment. We present vital theoretical perspectives, a suitable framework for assessment, and exemplary instruments with a focus on advances in the assessment of Human Capital. Specifically, Complex Problem Solving (CPS) and Collaborative Problem Solving (ColPS) are two transversal skills (i.e., skills that span multiple domains) that are...

  19. Qualitative aspects of nonlinear wave motion: Complexity and simplicity

    International Nuclear Information System (INIS)

    Engelbrecht, J.

    1993-01-01

    Nonlinear wave processes possess many qualitative properties which cannot be described by linear theories. In this presentation, an attempt is made to systematize the main aspects of this fascinating area. The sources of nonlinearities are analyzed in order to understand why and how nonlinear mathematical models are formulated. The technique of evolution equations is then discussed as the main mathematical tool for separating multiwave processes into single waves. The evolution equations give a concise but in many cases sufficient description of wave processes in solids, permitting analysis of spectral changes, phase changes and velocities, coupling of waves, and the interaction of nonlinearities with other physical effects of the same order. Several new problems are listed. Once the underlying reasons are known, seemingly complex problems can be analyzed effectively. 61 refs

  20. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    Science.gov (United States)

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  1. Employing the Hilbert-Huang Transform to analyze observed natural complex signals: Calm wind meandering cases

    Science.gov (United States)

    Martins, Luis Gustavo Nogueira; Stefanello, Michel Baptistella; Degrazia, Gervásio Annes; Acevedo, Otávio Costa; Puhales, Franciano Scremin; Demarco, Giuliano; Mortarini, Luca; Anfossi, Domenico; Roberti, Débora Regina; Costa, Felipe Denardin; Maldaner, Silvana

    2016-11-01

    In this study we analyze natural complex signals employing Hilbert-Huang spectral analysis. Specifically, low-wind meandering meteorological data are decomposed into turbulent and non-turbulent components. These non-turbulent movements, responsible for the absence of a preferential direction of the horizontal wind, provoke negative lobes in the meandering autocorrelation functions. The meandering characteristic time scales (meandering periods) are determined from the spectral peak provided by the Hilbert-Huang marginal spectrum. The magnitudes of the temperature and horizontal wind meandering periods obtained agree with the results found from the best fit of the heuristic meandering autocorrelation functions. Therefore, the method provides a new procedure for evaluating meandering periods that does not employ mathematical expressions to represent observed meandering autocorrelation functions.
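
    Only the Hilbert stage of the analysis is sketched below: the instantaneous frequency of an oscillatory component is obtained from the analytic signal. A synthetic 10-minute oscillation stands in for a meandering component, and the empirical mode decomposition step of the full Hilbert-Huang transform (as well as the meteorological data themselves) is not reproduced.

```python
# Instantaneous frequency via the analytic signal (Hilbert stage of HHT only).
import numpy as np
from scipy.signal import hilbert

dt = 1.0                                    # sampling interval [s]
t = np.arange(0, 3600, dt)                  # one hour of synthetic data
period = 600.0                              # 10-minute oscillation
x = np.sin(2 * np.pi * t / period) + 0.1 * np.random.default_rng(0).normal(size=t.size)

analytic = hilbert(x)                       # x + i * H[x]
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi * dt)

print("mean instantaneous period [s]:", round(1.0 / np.mean(inst_freq), 1))  # ~600
```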

  2. Can motto-goals outperform learning and performance goals? Influence of goal setting on performance and affect in a complex problem solving task

    Directory of Open Access Journals (Sweden)

    Miriam S. Rohe

    2016-09-01

    Full Text Available In this paper, we bring together research on complex problem solving with research in motivational psychology on goal setting. Complex problems require motivational effort because of their inherent difficulties. Goal Setting Theory has shown with simple tasks that high, specific performance goals lead to better performance outcomes than do-your-best goals. However, in complex tasks, learning goals have proven more effective than performance goals. Based on the Zurich Resource Model (Storch & Krause, 2014), so-called motto-goals (e.g., "I breathe happiness") should activate a person's resources through positive affect. It was found that motto-goals are effective with unpleasant duties. Therefore, we tested the hypothesis that motto-goals outperform learning and performance goals in the case of complex problems. A total of N = 123 subjects participated in the experiment. Depending on their goal condition, subjects developed a personal motto, learning, or performance goal. This goal was adapted for the computer-simulated complex scenario Tailorshop, where subjects worked as managers in a small fictional company. Contrary to expectations, there was no main effect of goal condition on management performance. As hypothesized, motto-goals led to higher positive and lower negative affect than the other two goal types. Even though positive affect decreased and negative affect increased in all three groups during Tailorshop completion, participants with motto-goals reported the lowest rates of negative affect over time. Exploratory analyses investigated the role of affect in complex problem solving via mediational analyses and the influence of goal type on perceived goal attainment.

  3. Fourth youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects. Theses of reports

    International Nuclear Information System (INIS)

    2007-01-01

    Theses of reports from the Fourth youth scientific and practical conference 'Nuclear-industrial complex of Ural: problems and prospects' (18-20 April 2007, Ozersk) are presented. The book contains theses of reports from seven subject sections: NFC: science and industry; Ecological problems in NFC development: radiation safety, radioecology and radiobiology; Nuclear power engineering: economics, safety, field experience; Atomic branch: history, today and future; New technologies in education, education and training for NFC plants, public opinion; Information technologies and telecommunications; Long-term science-intensive technologies and new materials.

  4. Problems of nuclear reactor safety. Vol. 1; Problemy bezopasnosti yaderno-ehnergeticheskikh ustanovok. Tom 1

    Energy Technology Data Exchange (ETDEWEB)

    Shal` nov, A V [Moskovskij Inzhenerno-Fizicheskij Inst., Moscow (Russian Federation)

    1996-12-31

    Proceedings of the 9th Topical Meeting 'Problems of nuclear reactor safety' are presented. The papers include results of studies and developments associated with methods of calculation and complex computerized simulation of stationary and transient processes in nuclear power plants. The main problems of reactor safety are discussed, and reactor accidents at operating NPPs are analyzed.

  5. Geographical National Condition and Complex System

    Directory of Open Access Journals (Sweden)

    WANG Jiayao

    2016-01-01

    Full Text Available The significance of studying the complex system of geographical national conditions lies in rationally expressing the complex relationships of the “resources-environment-ecology-economy-society” system. Addressing the problems faced by the statistical analysis of geographical national conditions, including the disunity of research contents, the inconsistency of scope and the uncertainty of goals, the present paper conducts a range of discussions from the perspectives of concept, theory and method, and designs solutions based on complex system theory and coordination degree analysis methods. By analyzing the concepts of geographical national conditions, the geographical national conditions survey and geographical national conditions statistical analysis, as well as investigating the relationships among them, the statistical contents and the analytical scope of geographical national conditions are clarified and defined. The investigation also clarifies the goals of the statistical analysis by analyzing the basic characteristics of geographical national conditions and of the complex system, and the consistency between coordination degree analysis and statistical analysis. It outlines their goals, proposes a concept for the complex system of geographical national conditions, and describes that concept. Complex system theory provides new theoretical guidance for the statistical analysis of geographical national conditions. The degree of coordination offers new approaches to undertaking the analysis, based on the measurement method and decision-making analysis scheme upon which the complex system of geographical national conditions is based. It analyzes the overall trend via the degree of coordination of the complex system on the macro level, and it determines the direction of remediation on the micro level based on the degree of coordination among various subsystems and of single systems. These results establish

  6. Learning by Preparing to Teach: Fostering Self-Regulatory Processes and Achievement during Complex Mathematics Problem Solving

    Science.gov (United States)

    Muis, Krista R.; Psaradellis, Cynthia; Chevrier, Marianne; Di Leo, Ivana; Lajoie, Susanne P.

    2016-01-01

    We developed an intervention based on the learning by teaching paradigm to foster self-regulatory processes and better learning outcomes during complex mathematics problem solving in a technology-rich learning environment. Seventy-eight elementary students were randomly assigned to 1 of 2 conditions: learning by preparing to teach, or learning for…

  7. A Reliability Test of a Complex System Based on Empirical Likelihood

    OpenAIRE

    Zhou, Yan; Fu, Liya; Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic for the complex system and derive the limit distribution of the test statistic, so that we can obtain confidence intervals for the reliability and make statistical inferences. Simulation studies also demonstrate the theoretical results.

  8. Class II malocclusion with complex problems treated with a novel combination of lingual orthodontic appliances and lingual arches.

    Science.gov (United States)

    Yanagita, Takeshi; Nakamura, Masahiro; Kawanabe, Noriaki; Yamashiro, Takashi

    2014-07-01

    This case report describes a novel method of combining lingual appliances and lingual arches to control horizontal problems. The patient, who was 25 years of age at her first visit to our hospital with a chief complaint of crooked anterior teeth, was diagnosed with skeletal Class II and Angle Class II malocclusion with anterior deep bite, lateral open bite, premolar crossbite, and severe crowding in both arches. She was treated with premolar extractions and temporary anchorage devices. Conventionally, it is ideal to use labial brackets simultaneously with appliances, such as a lingual arch, a quad-helix, or a rapid expansion appliance, in patients with complex problems requiring horizontal, anteroposterior, and vertical control; however, this patient strongly requested orthodontic treatment with lingual appliances. A limitation of lingual appliances is that they cannot be used with other conventional appliances. In this report, we present the successful orthodontic treatment of a complex problem using modified lingual appliances that enabled combined use of a conventional lingual arch. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  9. FOCUS, Neutron Transport System for Complex Geometry Reactor Core and Shielding Problems by Monte-Carlo

    International Nuclear Information System (INIS)

    Hoogenboom, J.E.

    1980-01-01

    1 - Description of problem or function: FOCUS enables the calculation of any quantity related to neutron transport in reactor or shielding problems, but was especially designed to calculate differential quantities, such as point values at one or more of the space, energy, direction and time variables of quantities like neutron flux, detector response, reaction rate, etc. or averages of such quantities over a small volume of the phase space. Different types of problems can be treated: systems with a fixed neutron source which may be a mono-directional source located outside the system, and eigenfunction problems in which the neutron source distribution is given by the (unknown) fundamental mode eigenfunction distribution. Using Monte Carlo methods complex 3-dimensional geometries and detailed cross section information can be treated. Cross section data are derived from ENDF/B, with anisotropic scattering and discrete or continuous inelastic scattering taken into account. Energy is treated as a continuous variable and time dependence may also be included. 2 - Method of solution: A transformed form of the adjoint Boltzmann equation in integral representation is solved for the space, energy, direction and time variables by Monte Carlo methods. Adjoint particles are defined with properties in some respects contrary to those of neutrons. Adjoint particle histories are constructed from which estimates are obtained of the desired quantity. Adjoint cross sections are defined with which the nuclide and reaction type are selected in a collision. The energy after a collision is selected from adjoint energy distributions calculated together with the adjoint cross sections in advance of the actual Monte Carlo calculation. For multiplying systems successive generations of adjoint particles are obtained which will die out for subcritical systems with a fixed neutron source and will be kept approximately stationary for eigenfunction problems. Completely arbitrary problems can

  10. [Methamphetamine - just another stimulant or a more complex problem?].

    Science.gov (United States)

    Lecomte, Tania; Massé, Marjolaine

    2014-01-01

    Methamphetamine (MA) has recently become very popular in the media, due in part to its increasing popularity as well as its psychotropic effects and the negative consequences of its use. Is it a stimulant like any other, or does methamphetamine use lead to specific difficulties in its users? The aim of this article is to provide a brief review of the literature by explaining some of the reasons for its popularity in Canada as well as the physical, dental, psychiatric, cognitive and legal problems associated with its use. MA's popularity: Regarding its popularity, MA has benefitted from multiple factors, namely its low cost for users and manufacturers, its quick and intense psychotropic effects (increased energy, sexual arousal, rapid thinking, sleeplessness, lack of appetite), its easy access, as well as its various methods of ingestion (nasal, oral, injection). MA abuse also results in a multitude of negative effects, both physical and mental. MA's physical effects: In terms of negative physical effects, cardiac problems, skin infections, sexually transmitted (and injection-related) diseases as well as meth mouth are described. MA's mental effects: In terms of mental consequences, two recently published Canadian studies revealing high rates of depression symptoms and of sustained psychotic symptoms in a subgroup of MA users are presented. Studies reporting various cognitive deficits in MA user are also reviewed, including reports of high prevalence of childhood attention deficit and hyperactivity disorder diagnoses among adult MA users. Furthermore, MA abusers are documented as having been highly exposed to trauma in their lives, with many presenting with post-traumatic stress disorder criteria. This manuscript also explores the reasons behind the forensic profiles of individuals using MA, particularly the increased tendency toward violent acts, the high incarceration rates of the homeless users and the high percentage of individuals diagnosed with antisocial

  11. Performance-complexity tradeoff in sequential decoding for the unconstrained AWGN channel

    KAUST Repository

    Abediseid, Walid

    2013-06-01

    In this paper, the performance limits and the computational complexity of the lattice sequential decoder are analyzed for the unconstrained additive white Gaussian noise channel. The performance analysis available in the literature for such a channel has been studied only under the use of the minimum Euclidean distance decoder that is commonly referred to as the lattice decoder. Lattice decoders based on solutions to the NP-hard closest vector problem are very complex to implement, and the search for low complexity receivers for the detection of lattice codes is considered a challenging problem. However, the low computational complexity advantage that sequential decoding promises, makes it an alternative solution to the lattice decoder. In this work, we characterize the performance and complexity tradeoff via the error exponent and the decoding complexity, respectively, of such a decoder as a function of the decoding parameter - the bias term. For the above channel, we derive the cut-off volume-to-noise ratio that is required to achieve a good error performance with low decoding complexity. © 2013 IEEE.

  12. Numerical sensitivity computation for discontinuous gradient-only optimization problems using the complex-step method

    CSIR Research Space (South Africa)

    Wilke, DN

    2012-07-01

    Full Text Available … problems that utilise remeshing (i.e. the mesh topology is allowed to change) between design updates. Here, changes in mesh topology result in abrupt changes in the discretization error of the computed response. These abrupt changes in turn manifest … in shape optimization but may be present whenever (partial) differential equations are approximated numerically with non-constant discretization methods, e.g. remeshing of spatial domains or automatic time stepping in temporal domains. Keywords: Complex...
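
    For reference, the complex-step derivative named in the title is easy to state and to verify numerically: f'(x) is approximated by Im f(x + ih) / h, with no subtraction of nearly equal numbers, so the step h can be made tiny without loss of accuracy. The test function below is a common benchmark choice, not one taken from the paper.

```python
# Complex-step first derivative versus central finite differences.
import numpy as np

def f(x):
    return np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

def complex_step_derivative(func, x, h=1e-30):
    return func(x + 1j * h).imag / h

def central_difference(func, x, h=1e-8):
    return (func(x + h) - func(x - h)) / (2 * h)

x0 = 1.5
print("complex step      :", complex_step_derivative(f, x0))   # both approx. 4.0534
print("central difference:", central_difference(f, x0))
```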

  13. Ethical and legal issues arising from complex genetic disorders. DOE final report

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Lori

    2002-10-09

    The project analyzed the challenges raised by complex genetic disorders for genetic counselling, clinical practice, public health, quality assurance, and protection against discrimination. The research found that, in some settings, solutions created in the context of single-gene disorders are more difficult to apply to complex disorders. In other settings, the single-gene solutions actually backfired and created additional problems when applied to complex genetic disorders. The literature on five common, complex genetic disorders--Alzheimer's, asthma, coronary heart disease, diabetes, and psychiatric illnesses--was evaluated in depth.

  14. On the complexity of a bundle pricing problem

    NARCIS (Netherlands)

    Grigoriev, Alexander; van Lohn, Joyce; Uetz, Marc Jochen

    2010-01-01

    We consider the problem of pricing items in order to maximize the revenue obtainable from a set of single-minded customers. We relate the tractability of the problem to structural properties of customers' valuations: the problem admits an efficient approximation algorithm, parameterized along the

  15. Analyzing the problems with the current adoption of IFRS in the companies among India, China, Germany, Russia and Kenya

    Directory of Open Access Journals (Sweden)

    Robert Mosomi Ombati

    2018-01-01

    Full Text Available Accounting information provides past and current financial information about an economic unit for business managers, potential investors, and other interested parties. Internally generated accounting information that helps business managers with planning, controlling, and making decisions is referred to as managerial accounting information. However, if companies that have adopted International Financial Reporting Standards (IFRS) globally cannot generate the same information, then accounting practices need to be improved. For this purpose, the current study was performed with the objectives of measuring the relationship between profitability and market capitalization and of analyzing the challenges faced by listed firms of various countries in association with the implementation of IFRS. To this end, 15 companies were selected from 5 countries: India, China, Germany, Russia and Kenya. The secondary data regarding profitability and market capitalization were analyzed to calculate the correlations. The primary data regarding managers' perceptions were analyzed with the multiple regression method using SPSS-19 software to find out the company-related, investor-related and government-agency-related variables responsible for problems in the current adoption of IFRS.

  16. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

    This slide presentation reviews a project for developing an approach to analyzing and verifying inter-system communication. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so that systems of systems must communicate with each other to fulfill their tasks. Such systems of systems require reliable communication. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  17. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The developed system had a severe performance problem: the resulting spectrum lacked smoothness; it was very noisy and full of spikes and surges, and therefore it was impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance.

  18. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    New types of disclosure and reporting are argued to be vital in order to convey a transparent picture of the true state of the company. However, they are unfortunately not without problems, as these types of information are somewhat more complex than the information provided in the traditional… stakeholders in a form that corresponds to the stakeholders' understanding, then disclosure and interpretation of key performance indicators will also be facilitated.

  19. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in front of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  20. ReaderBench: A Multi-lingual Framework for Analyzing Text Complexity

    NARCIS (Netherlands)

    Dascalu, Mihai; Gutu, Gabriel; Ruseti, Stefan; Paraschiv, Ionut Cristian; Dessus, Philippe; McNamara, Danielle S.; Crossley, Scott; Trausan-Matu, Stefan

    2017-01-01

    Assessing textual complexity is a difficult but important endeavor, especially for adapting learning materials to students’ and readers’ levels of understanding. With the continuous growth of information technologies spanning various research fields, automated assessment tools have

  1. Using Model Checking for Analyzing Distributed Power Control Problems

    DEFF Research Database (Denmark)

    Brihaye, Thomas; Jungers, Marc; Lasaulce, Samson

    2010-01-01

    Model checking (MC) is a formal verification technique which has known, and still knows, a resounding success in the computer science community. Realizing that the distributed power control (PC) problem can be modeled by a timed game between a given transmitter and its environment, the authors... objectives a transmitter-receiver pair would like to reach. The network is modeled by a game where transmitters are considered as timed automata interacting with each other. The objectives are then translated into timed alternating-time temporal logic formulae, and MC is exploited to determine whether the desired

  2. The virtual product-process design laboratory to manage the complexity in the verification of formulated products

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Malik, Tahir I.

    2011-01-01

    … mixtures need to be predicted. This complexity has to be managed through decomposition of the problem into sub-problems. Each sub-problem is solved and analyzed and, from the knowledge gained, an overall evaluation of the complex chemical system representing the product is made. The virtual Product-Process Design laboratory (virtual PPD-lab) software is based on this decomposition strategy for the design of formulated liquid products. When the needed models are available in the software, the solution of formulation design/verification problems is straightforward, while when models are not available in the software library, they need to be developed and/or implemented. The potential of the virtual PPD-lab in managing the complexity in the verification of formulated products, after the needed models have been developed and implemented, is highlighted in this paper through a case study from industry dealing…

  3. Collecting and Analyzing Stakeholder Feedback for Signing at Complex Interchanges

    Science.gov (United States)

    2014-10-01

    The purpose of this project was to identify design constraints related to signing, markings, and geometry for complex interchanges, and then to identify useful topics for future research that will yield findings that can address those design issues. ...

  4. Comparisons of complex network based models and real train flow model to analyze Chinese railway vulnerability

    International Nuclear Information System (INIS)

    Ouyang, Min; Zhao, Lijing; Hong, Liu; Pan, Zhezhe

    2014-01-01

    Recently, numerous studies have applied complex network based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards. But how effectively these models capture real performance response is still a question worthy of research. Taking the Chinese railway system as an example, this paper selects three typical complex network based models, including the purely topological model (PTM), the purely shortest path model (PSPM), and the weight (link length) based shortest path model (WBSPM), to analyze railway accessibility and flow-based vulnerability, and compares their results with those from the real train flow model (RTFM). The results show that the WBSPM can produce train routes with 83% of stations and 77% of railway links identical to the real routes and approaches the RTFM best for railway vulnerability under both single and multiple component failures. The correlation coefficient for accessibility vulnerability from the WBSPM and RTFM under single station failures is 0.96, while it is 0.92 for flow-based vulnerability; under multiple station failures, where each station has the same failure probability fp, the WBSPM can produce almost identical vulnerability results to those from the RTFM under almost all failure scenarios when fp is larger than 0.62 for accessibility vulnerability and 0.86 for flow-based vulnerability.
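
    The accessibility side of the weight-based shortest path idea can be sketched on a toy graph with NetworkX: accessibility is taken as the sum of inverse length-weighted shortest-path distances over all station pairs, and a station's vulnerability as the relative accessibility drop when it is removed. The network, the edge lengths and the exact vulnerability metric below are illustrative assumptions, not the paper's data or definitions.

```python
# Toy accessibility-vulnerability sketch on a made-up weighted rail graph.
import itertools
import networkx as nx

G = nx.Graph()
edges = [  # (station, station, length in km)
    ("A", "B", 120), ("B", "C", 90), ("C", "D", 150),
    ("A", "E", 200), ("E", "D", 110), ("B", "E", 80), ("C", "F", 60),
]
G.add_weighted_edges_from(edges)

def accessibility(graph):
    total = 0.0
    for u, v in itertools.combinations(graph.nodes, 2):
        try:
            total += 1.0 / nx.shortest_path_length(graph, u, v, weight="weight")
        except nx.NetworkXNoPath:
            pass                    # disconnected pairs contribute nothing
    return total

baseline = accessibility(G)
for station in sorted(G.nodes):
    H = G.copy()
    H.remove_node(station)
    drop = 1.0 - accessibility(H) / baseline
    print(f"removing {station}: accessibility vulnerability = {drop:.2f}")
```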

  5. Analyzing the Implicit Computational Complexity of object-oriented programs

    OpenAIRE

    Marion , Jean-Yves; Péchoux , Romain

    2008-01-01

    A sup-interpretation is a tool which provides upper bounds on the size of the values computed by the function symbols of a program. Sup-interpretations have shown their interest in dealing with the complexity of first-order functional programs. This paper is an attempt to adapt the framework of sup-interpretations to a fragment of object-oriented programs, including loop and while constructs and methods with side effects. We give a criterion, called the brotherly criterion, w...

  6. Assessment of Time Series Complexity Using Improved Approximate Entropy

    International Nuclear Information System (INIS)

    Kong De-Ren; Xie Hong-Bo

    2011-01-01

    Approximate entropy (ApEn), a measure quantifying complexity and/or regularity, is believed to be an effective method for analyzing diverse settings. However, the similarity definition of vectors based on the Heaviside function may cause some problems for the validity and accuracy of ApEn. To overcome these problems, an improved approximate entropy (iApEn) based on the sigmoid function is proposed. The performance of iApEn is tested on independent identically distributed (IID) Gaussian noise, the MIX stochastic model, the Rossler map, the logistic map, and the high-dimensional Mackey-Glass oscillator. The results show that iApEn is superior to ApEn in several aspects, including better relative consistency, freedom of parameter selection, robustness to noise, and less dependence on record length when characterizing time series with different complexities. (general)
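
    A compact sketch of the contrast between the Heaviside similarity of standard ApEn and a smooth sigmoid similarity is given below. The particular sigmoid used, and its steepness of 0.2 times the tolerance, are one plausible choice and not necessarily the exact definition proposed in the paper.

```python
# ApEn with Heaviside versus sigmoid similarity on two toy series.
import numpy as np
from scipy.special import expit

def approx_entropy(u, m=2, r=0.2, sigmoid=False):
    u = np.asarray(u, dtype=float)
    tol = r * np.std(u)                          # tolerance as a fraction of the SD

    def phi(m):
        n = len(u) - m + 1
        emb = np.array([u[i:i + m] for i in range(n)])
        # Chebyshev distances between all pairs of embedded vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        if sigmoid:
            sim = expit(-(d - tol) / (0.2 * tol))    # smooth similarity
        else:
            sim = (d <= tol).astype(float)           # Heaviside similarity
        return np.mean(np.log(sim.mean(axis=1)))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
noise = rng.normal(size=1000)                        # irregular -> higher entropy
regular = np.sin(np.linspace(0, 40 * np.pi, 1000))   # regular   -> lower entropy
for name, series in (("white noise", noise), ("sine wave", regular)):
    print(name, round(approx_entropy(series, sigmoid=False), 3),
          round(approx_entropy(series, sigmoid=True), 3))
```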

  7. On solution to the problem of criticality by alternative Monte Carlo method

    International Nuclear Information System (INIS)

    Kyncl, J.

    2005-03-01

    The problem of criticality for the neutron transport equation is analyzed. The problem is transformed into an equivalent problem in a suitable set of complex functions, and the existence and uniqueness of its solution is demonstrated. The source iteration method is discussed. It is shown that the final result of the iterative process is strongly affected by insufficient accuracy of the individual iterations. A modified method is suggested to circumvent this problem, based on the theory of positive operators; the criticality problem is solved by the Monte Carlo method by constructing a special random process and a random variable so that the difference between the result and the true value can be made arbitrarily small. The efficiency of this alternative method is analysed.

  8. Design patterns for instructional materials that foster proficiency at analyzing and interpreting complex geoscience data

    Science.gov (United States)

    Kastens, K. A.; Krumhansl, R.

    2016-12-01

    The Next Generation Science Standards incorporate a stronger emphasis on having students work with data than did prior standards. This emphasis is most obvious in Practice 4: Analyzing and Interpreting Data, but also permeates performance expectations built on Practice 2 when students test models, Practice 6 when students construct explanations, and Practice 7 when student test claims with evidence. To support curriculum developers who wish to guide high school students towards more sophisticated engagement with complex data, we analyzed a well-regarded body of instructional materials designed for use in introductory college courses (http://serc.carleton.edu/integrate/teaching_materials/). Our analysis sought design patterns that can be reused for a variety of topics at the high school or college level. We found five such patterns, each of which was used in at least half of the modules analyzed. We describe each pattern, provide an example, and hypothesize a theory of action that could explain how the sequence of activities leverages known perceptual, cognitive and/or social processes to foster learning from and about data. In order from most to least frequent, the observed design patterns are as follows: In Data Puzzles, students respond to guiding questions about high-value snippets of data pre-selected and sequenced by the curriculum developer to lead to an Aha! inference. In Pooling Data to See the Big Picture, small groups analyze different instances of analogous phenomenon (e.g. different hurricanes, or different divergent plate boundaries) and pool their insights to extract the commonalities that constitute the essence of that phenomenon. In Make a Decision or Recommendation, students combine geoscience data with other factors (such as economic or environmental justice concerns) to make a decision or recommendation about a human or societal action. In Predict-Observe-Explain, students make a prediction about what the Earth will look like under conditions

  9. Waste Collection Vehicle Routing Problem: Literature Review

    OpenAIRE

    Hui Han; Eva Ponce Cueto

    2015-01-01

    Waste generation is an issue which has caused wide public concern in modern societies, not only because of the quantitative rise in the amount of waste generated, but also because of the increasing complexity of some products and components. Waste collection is a highly relevant activity in the reverse logistics system, and how to collect waste in an efficient way is an area that needs to be improved. This paper analyzes the major contributions on the Waste Collection Vehicle Routing Problem (WCVRP) in litera...

  10. Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem.

    Science.gov (United States)

    Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk

    2017-01-01

    Randomized search heuristics are frequently applied to NP-hard combinatorial optimization problems. The runtime analysis of randomized search heuristics has contributed tremendously to our theoretical understanding. Recently, randomized search heuristics have been examined regarding their achievable progress within a fixed-time budget. We follow this approach and present a fixed-budget analysis for an NP-hard combinatorial optimization problem. We consider the well-known Traveling Salesperson Problem (TSP) and analyze the fitness increase that randomized search heuristics are able to achieve within a given fixed-time budget. In particular, we analyze Manhattan and Euclidean TSP instances and Randomized Local Search (RLS), (1+1) EA and (1+[Formula: see text]) EA algorithms for the TSP in a smoothed complexity setting, and derive the lower bounds of the expected fitness gain for a specified number of generations.
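
    As a concrete illustration of the setting, the sketch below runs Randomized Local Search with random 2-opt moves on a Euclidean instance and records the fitness trajectory, so the gain achieved within a fixed budget of iterations can be read off; it is a toy experiment, not the smoothed-complexity analysis of the paper.

```python
import math
import random

def tour_length(tour, pts):
    """Total Euclidean length of a closed tour through pts."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def rls_fixed_budget(pts, budget, seed=0):
    """Randomized Local Search with random 2-opt moves under a fixed budget.

    Returns the fitness (tour-length) trajectory so the gain within the
    budget can be inspected, in the spirit of a fixed-budget analysis.
    """
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    rng.shuffle(tour)
    best = tour_length(tour, pts)
    trace = [best]
    for _ in range(budget):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt: reverse a segment
        length = tour_length(cand, pts)
        if length <= best:                                     # accept if not worse
            tour, best = cand, length
        trace.append(best)
    return tour, trace

rng = random.Random(42)
points = [(rng.random(), rng.random()) for _ in range(30)]
_, trace = rls_fixed_budget(points, budget=2000)
print(trace[0], trace[-1])   # fitness gain achieved within the budget
```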

  11. Research and assessment of competitiveness of large engineering complexes

    Directory of Open Access Journals (Sweden)

    Krivorotov V.V.

    2017-01-01

    Full Text Available The urgency of the problem of ensuring the competitiveness of manufacturing and high-tech sectors is shown. The decisive role of large industrial complexes in shaping the results of the national economy is substantiated, and the author's interpretation of the concept of "industrial complex" with regard to current economic systems is given. Current approaches to assessing the competitiveness of enterprises and industrial complexes are analyzed, and their main advantages and disadvantages are shown. A scientific-methodological approach to the study and management of the competitiveness of a large industrial complex is provided, and its main units are described. As the central element of this approach, a methodology for assessing the competitiveness of a large industrial complex based on the Pattern method is proposed; a modular system of competitiveness indicators is developed and adapted to large engineering complexes. Using the developed methodology, the competitiveness of one of the largest engineering complexes, the group of companies Uralelectrotyazhmash, which comprises leading enterprises of the Russian electrotechnical industry, is assessed. The evaluation identified the main problems and bottlenecks in the development of these enterprises, and a comparison with leading competitors is provided. Based on the results of the study, the main conclusions and recommendations are formulated.

  12. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Full Text Available Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become the key to project management. This paper first analyzes the current research situation of software complexity systematically and points out problems in current research. It then proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may have a better understanding of complexity. People are the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the composing factors of RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and analyzes and validates the scientific soundness and rationality of this measurement method through a large number of cases.
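
    The measurement above is said to be based on information entropy; the sketch below shows only that underlying calculation on hypothetical communication and hierarchy data, not the paper's exact formulas or weighting.

```python
import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy H = -sum p_i * log2(p_i) of a frequency distribution."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# hypothetical example: number of project messages exchanged per communication channel
messages_per_channel = Counter({
    ("dev_team", "test_team"): 120,
    ("dev_team", "architect"): 45,
    ("test_team", "manager"): 30,
    ("manager", "customer"): 15,
})
h_comm = shannon_entropy(messages_per_channel.values())
print(f"communication-information complexity ~ {h_comm:.3f} bits")

# hypothetical example: staff counts per level of the organizational hierarchy
staff_per_level = [1, 3, 9, 27]
h_org = shannon_entropy(staff_per_level)
print(f"organization-hierarchy complexity ~ {h_org:.3f} bits")
```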

  13. Education for complex problem solving

    DEFF Research Database (Denmark)

    Kjær-Rasmussen, Lone Krogh

    The Problem-Based Learning model as it is practiced at Aalborg University grew out of expectations for future graduates in the 1970s. Many changes and developments have taken place since then in the ways the principles and methodologies are practiced, due to changes in society and governmental regulations. However, the basic educational principles and methodologies are still the same and seem to meet expectations from society and academic workplaces today. This is what surveys and research, done regularly, document (see for instance Krogh, 2013).

  14. A new differential calculus on a complex banach space with application to variational problems of quantum theory

    International Nuclear Information System (INIS)

    Sharma, C.S.; Rebelo, I.

    1975-01-01

    It is proved that a semilinear function on a complex Banach space is not differentiable according to the usual definition of differentiability in the calculus on Banach spaces. It is shown that this result makes the calculus largely inapplicable to the solution of variational problems of quantum mechanics. A new concept of differentiability called semidifferentiability is defined. This generalizes the standard concept of differentiability in a Banach space, and the resulting calculus is particularly suitable for optimizing real-valued functions on a complex Banach space and is directly applicable to the solution of quantum mechanical variational problems. As an example of such application a rigorous proof of a generalized version of a result due to Sharma (J. Phys. A; 2:413 (1969)) is given. In the course of this work a new concept of prelinearity is defined and some standard results in the calculus on Banach spaces are extended and generalized into more powerful ones applicable directly to prelinear functions and hence yielding the standard results for linear functions as particular cases. (author)

  15. Fifth Anniversary youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects. Theses of reports

    International Nuclear Information System (INIS)

    2009-01-01

    Theses of reports of the Fifth Anniversary youth scientifically-practical conference Nuclear-industrial complex of Ural: problems and prospects (21-23 April 2009, Ozersk) are presented. The book contains abstracts of papers from four thematic sections: SNF reprocessing: science and industry; Radioecology and radiobiology; Advanced science-intensive technologies and materials; Education and training for NFC plants

  16. Ranking in evolving complex networks

    Science.gov (United States)

    Liao, Hao; Mariani, Manuel Sebastian; Medo, Matúš; Zhang, Yi-Cheng; Zhou, Ming-Yang

    2017-05-01

    Complex networks have emerged as a simple yet powerful framework to represent and analyze a wide range of complex systems. The problem of ranking the nodes and the edges in complex networks is critical for a broad range of real-world problems because it affects how we access online information and products, how success and talent are evaluated in human activities, and how scarce resources are allocated by companies and policymakers, among others. This calls for a deep understanding of how existing ranking algorithms perform and of the possible biases that may impair their effectiveness. Many popular ranking algorithms (such as Google's PageRank) are static in nature and, as a consequence, they exhibit important shortcomings when applied to real networks that rapidly evolve in time. At the same time, recent advances in the understanding and modeling of evolving networks have enabled the development of a wide and diverse range of ranking algorithms that take the temporal dimension into account. The aim of this review is to survey the existing ranking algorithms, both static and time-aware, and their applications to evolving networks. We emphasize both the impact of network evolution on well-established static algorithms and the benefits from including the temporal dimension for tasks such as prediction of network traffic, prediction of future links, and identification of significant nodes.
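
    As a point of reference for the static algorithms discussed above, here is a minimal PageRank power iteration; the time-aware variants surveyed in the review additionally weight or decay edges by age, which this sketch does not do.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10, max_iter=200):
    """Basic (static) PageRank by power iteration on a directed adjacency matrix.

    adj[i, j] = 1 means a link from node i to node j.
    """
    adj = np.asarray(adj, dtype=float)
    n = adj.shape[0]
    out = adj.sum(axis=1)
    # row-stochastic transition matrix; dangling nodes jump uniformly
    P = np.where(out[:, None] > 0, adj / np.maximum(out[:, None], 1e-12), 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1 - d) / n + d * (P.T @ r)   # teleportation + link-following
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]])
print(pagerank(A))
```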

  17. Analyzing maintenance strategies by agent-based simulations: A feasibility study

    International Nuclear Information System (INIS)

    Kaegi, M.; Mock, R.; Kroeger, W.

    2009-01-01

    Thoroughly planned and implemented maintenance strategies save time and cost. However, the integration of maintenance work into reliability analysis is difficult, as common modeling techniques are often not applicable due to state explosion, which calls for restrictive model assumptions and oversimplification. From the authors' point of view, agent-based modeling (ABM) of technical and organizational systems is a promising approach to overcome such problems. But since ABM is not well established in reliability analysis, its feasibility in this area still has to be demonstrated. For this purpose ABM is compared with Markov chains, namely by analyzing the reliability of a maintained n-unit system with dependent repair events, applying both modeling approaches. Although ABM and Markov chains lead to the same numerical results, the former points out the potential of an improved system state handling. This is demonstrated by extending the ABM with operators as additional 'agents' featuring their location (x;y), availability (0;1) and different maintenance strategies. This extension highlights the capability of ABM to analyze complex emergent system behavior and allows a systematic refinement and optimization of the maintenance strategies.
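
    A minimal version of the kind of comparison described above, assuming a 2-unit parallel system with a single repair crew (so repairs are dependent): the steady-state availability is computed from the birth-death Markov chain and re-estimated by an event-driven simulation in which each unit acts as a tiny 'agent'. The rates and horizon are illustrative, not taken from the paper.

```python
import numpy as np

RATE_FAIL, RATE_REPAIR = 0.01, 0.1   # per-hour failure and repair rates (illustrative)

def markov_availability():
    """Steady-state availability of a 2-unit parallel system with one repair crew.

    States = number of failed units (0, 1, 2); repairs are dependent because
    the single crew serves one unit at a time (birth-death CTMC).
    """
    lam, mu = RATE_FAIL, RATE_REPAIR
    Q = np.array([[-2 * lam, 2 * lam, 0.0],
                  [mu, -(mu + lam), lam],
                  [0.0, mu, -mu]])
    # solve pi Q = 0 with sum(pi) = 1
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    return pi[0] + pi[1]              # system up if at least one unit works

def agent_availability(horizon=2_000_000.0, seed=1):
    """Event-driven simulation: each unit is a minimal 'agent', one repair crew."""
    rng = np.random.default_rng(seed)
    lam, mu = RATE_FAIL, RATE_REPAIR
    t, up_time = 0.0, 0.0
    failed = 0                         # number of failed units waiting at / under repair
    while t < horizon:
        rates = [(2 - failed) * lam, mu if failed > 0 else 0.0]
        total = sum(rates)
        dt = min(rng.exponential(1.0 / total), horizon - t)
        if failed < 2:
            up_time += dt              # at least one unit operating during dt
        t += dt
        if t >= horizon:
            break
        failed += 1 if rng.random() < rates[0] / total else -1
    return up_time / horizon

print(markov_availability(), agent_availability())
```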

  18. An Experimental Analysis on Dispatching Rules for the Train Platforming Problem in Busy Complex Passenger Stations

    Directory of Open Access Journals (Sweden)

    Qiongfang Zeng

    2017-09-01

    platforming problem (TPP) by using mixed integer linear programming and job shop scheduling theory. First, the operation procedures and scheduled time adjustment costs of different train types specific to busy complex passenger stations are explicitly represented. Second, a multi-criteria scheduling model (MCS) for the TPP without an earliness and tardiness time window (ETTW) and a time window scheduling model (TWS) with ETTW for the TPP are proposed. Third, various dispatching rules were designed by incorporating dispatcher experience with modern scheduling theory, and a rule-based metaheuristic to solve the above models is presented. With solution improvement strategies analogous to those used in practice by dispatchers, realistic-size problems can be solved in acceptable time.
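
    To make the dispatching-rule idea concrete, below is a toy earliest-free-platform rule with a priority ordering and a minimum headway; all names, values and the rule itself are illustrative assumptions and much simpler than the MCS/TWS models and the rule-based metaheuristic of the paper.

```python
from dataclasses import dataclass

@dataclass
class Train:
    name: str
    arrival: int      # minutes
    departure: int    # minutes
    priority: int     # smaller = more important (e.g., through passenger trains first)

def assign_platforms(trains, n_platforms, headway=3):
    """Toy dispatching rule for platforming: sort trains by (priority, arrival)
    and give each one the first platform that is free, respecting a minimum
    headway between consecutive occupations. Unassignable trains are reported
    so their scheduled times could be adjusted (at a cost) in a fuller model."""
    free_at = [0] * n_platforms                    # time each platform becomes available
    plan, unassigned = {}, []
    for tr in sorted(trains, key=lambda t: (t.priority, t.arrival)):
        candidates = [p for p in range(n_platforms)
                      if free_at[p] + headway <= tr.arrival]
        if candidates:
            p = min(candidates, key=lambda q: free_at[q])   # earliest-free platform rule
            plan[tr.name] = p
            free_at[p] = tr.departure
        else:
            unassigned.append(tr.name)
    return plan, unassigned

trains = [Train("IC101", 10, 18, 1), Train("R202", 12, 25, 2),
          Train("IC103", 20, 28, 1), Train("F404", 22, 60, 3)]
print(assign_platforms(trains, n_platforms=2))
```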

  19. Individual Differences in Strategy Use on Division Problems: Mental versus Written Computation

    Science.gov (United States)

    Hickendorff, Marian; van Putten, Cornelis M.; Verhelst, Norman D.; Heiser, Willem J.

    2010-01-01

    Individual differences in strategy use (choice and accuracy) were analyzed. A sample of 362 Grade 6 students solved complex division problems under 2 different conditions. In the choice condition students were allowed to use either a mental or a written strategy. In the subsequent no-choice condition, they were required to use a written strategy.…

  20. Automated System for Teaching Computational Complexity of Algorithms Course

    Directory of Open Access Journals (Sweden)

    Vadim S. Roublev

    2017-01-01

    Full Text Available This article describes the problems of designing an automated teaching system for the "Computational complexity of algorithms" course. This system should provide students with means to familiarize themselves with a complex mathematical apparatus and improve their mathematical thinking in the respective area. The article introduces the technique of an algorithm symbol scroll table that allows estimating lower and upper bounds of computational complexity. Further, we introduce a set of theorems that facilitate the analysis in cases when the integer rounding of algorithm parameters is involved and when analyzing the complexity of a sum. At the end, the article introduces a normal system of symbol transformations that both allows one to perform arbitrary symbol transformations and simplifies the automated validation of such transformations. The article is published in the authors' wording.

  1. Dealing with complex and ill-structured problems: results of a Plan-Do-Check-Act experiment in a business engineering semester

    Science.gov (United States)

    Riis, Jens Ove; Achenbach, Marlies; Israelsen, Poul; Kyvsgaard Hansen, Poul; Johansen, John; Deuse, Jochen

    2017-07-01

    Challenged by increased globalisation and fast technological development, we carried out an experiment in the third semester of a global business engineering programme aimed at identifying conditions for training students in dealing with the complex and ill-structured problems of forming a new business. As this includes a fuzzy front end, learning cannot be measured in traditional, quantitative terms; therefore, we have explored the use of reflection to convert tacit knowledge to explicit knowledge. The experiment adopted a Plan-Do-Check-Act approach and concluded with developing a plan for new learning initiatives in the subsequent year's semester. The findings conclude that (1) problem-based learning develops more competencies than are ordinarily measured at the examination; especially the social/communication and personal competencies are developed; (2) students are capable of dealing with a complex and ambiguous problem, if properly guided. Four conditions were identified; (3) most students are not conscious of their learning, but are able to reflect if properly encouraged; and (4) improving engineering education should be considered as an organisational learning process.

  2. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  3. ANALYZING ALGEBRAIC THINKING USING “GUESS MY NUMBER” PROBLEMS

    Directory of Open Access Journals (Sweden)

    Estella De Los Santos

    2012-01-01

    Full Text Available The purpose of this study was to assess student knowledge of numeric, visual and algebraic representations. A definite gap between arithmetic and algebra has been documented in the research. The researchers' goal was to identify a link between the two. Using four "Guess My Number" problems, seventh and tenth grade students were asked to write numeric, visual, and algebraic representations. Seventh-grade students had significantly higher scores than tenth-grade students on visual representation responses. There were no significant differences between the seventh and tenth grade students' responses on the numeric and algebraic representations. The researchers believed that semi-concrete and visual models, such as those used in this study, may provide the link between numeric and algebraic concepts for many students.

  4. Efficient algorithms for analyzing the singularly perturbed boundary value problems of fractional order

    Science.gov (United States)

    Sayevand, K.; Pichaghchi, K.

    2018-04-01

    In this paper, we were concerned with the description of singularly perturbed boundary value problems in the scope of fractional calculus. We should mention that one of the main methods used to solve these problems in classical calculus is the so-called matched asymptotic expansion method. However, this was not achievable via the existing classical definitions of the fractional derivative, because they do not obey the chain rule, which is one of the key elements of the matched asymptotic expansion method. In order to accommodate this method to the fractional derivative, we employ a relatively new derivative, the so-called local fractional derivative. Using the properties of the local fractional derivative, we extend the matched asymptotic expansion method to the scope of fractional calculus and introduce a reliable new algorithm to develop approximate solutions of singularly perturbed boundary value problems of fractional order. In the new method, the original problem is partitioned into inner and outer solution equations. The reduced equation is solved with suitable boundary conditions which provide the terminal boundary conditions for the boundary layer correction. The inner solution problem is next solved as a solvable boundary value problem. The width of the boundary layer is approximated using an appropriate resemblance function. Some theoretical results are established and proved. Some illustrative examples are solved and the results are compared with those of the matched asymptotic expansion method and the homotopy analysis method to demonstrate the accuracy and efficiency of the method. It can be observed that the proposed method approximates the exact solution very well, not only in the boundary layer, but also away from the layer.
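
    For readers unfamiliar with the classical (integer-order) method being extended, here is the standard textbook boundary-layer example worked by matched asymptotic expansions; it is illustrative only and is not taken from the paper.

```latex
% Classical matched asymptotic expansion for the model problem
%   \varepsilon y'' + y' + y = 0,  y(0)=0,\; y(1)=1,\; 0<\varepsilon\ll 1 .
\begin{align*}
\text{outer (away from } x=0\text{):}\quad & y' + y = 0 \;\Rightarrow\; y_{\mathrm{out}}(x) = e^{\,1-x}
   \quad (\text{fixed by } y(1)=1),\\
\text{inner (stretched } \xi = x/\varepsilon\text{):}\quad & Y'' + Y' = 0 \;\Rightarrow\;
   Y_{\mathrm{in}}(\xi) = B\bigl(1 - e^{-\xi}\bigr) \quad (\text{fixed by } Y(0)=0),\\
\text{matching:}\quad & \lim_{\xi\to\infty} Y_{\mathrm{in}}(\xi) = \lim_{x\to 0} y_{\mathrm{out}}(x)
   \;\Rightarrow\; B = e,\\
\text{composite:}\quad & y(x) \approx y_{\mathrm{out}} + Y_{\mathrm{in}} - (\text{common limit})
   = e^{\,1-x} - e^{\,1-x/\varepsilon}.
\end{align*}
```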

  5. Vibrations and stability of complex beam systems

    CERN Document Server

    Stojanović, Vladimir

    2015-01-01

     This book reports on solved problems concerning vibrations and stability of complex beam systems. The complexity of a system is considered from two points of view: the complexity originating from the nature of the structure, in the case of two or more elastically connected beams; and the complexity derived from the dynamic behavior of the system, in the case of a damaged single beam, resulting from the harm done to its simple structure. Furthermore, the book describes the analytical derivation of equations of two or more elastically connected beams, using four different theories (Euler, Rayleigh, Timoshenko and Reddy-Bickford). It also reports on a new, improved p-version of the finite element method for geometrically nonlinear vibrations. The new method provides more accurate approximations of solutions, while also allowing us to analyze geometrically nonlinear vibrations. The book describes the appearance of longitudinal vibrations of damaged clamped-clamped beams as a result of discontinuity (damage). It...

  6. Environmental problems and economic development in an endogenous fertility model

    OpenAIRE

    Frank Joest; Martin Quaas; Johannes Schiller

    2006-01-01

    Population growth is often viewed as a most oppressive global problem with respect to environmental deterioration, but the relationships between population development, economic dynamics and environmental pollution are complex due to various feedback mechanisms. We analyze society’s economic decisions on birth rates, investment into human and physical capital, and polluting emissions within an optimal control model of the coupled demographic-economic-environmental system. We show that a long-...

  7. Using the Van Hiele theory to analyze primary school teachers' written work on geometrical proof problems

    Science.gov (United States)

    Jupri, A.

    2018-05-01

    The lack of ability of primary school teachers in deductive thinking, such as doing geometrical proof, is an issue that needs to be dealt with. In this paper, we report on the results of a three-step field document study. The study was part of a pilot study for improving the deductive thinking ability of primary school teachers. First, we designed geometrical proof problems adapted from the literature. Second, we administered an individual written test involving nine master's students in a primary education program who have experience as primary school mathematics teachers. Finally, we analyzed the written work from the point of view of the Van Hiele theory. The results revealed that even though about half of the teachers show the ability to do formal proof, the rest still provide inappropriate proofs. For further investigation, we wonder whether primary school teachers would show better deductive thinking if the teaching of geometry were designed in a systematic and appropriate manner according to the Van Hiele theory.

  8. Eye-Tracking Study of Complexity in Gas Law Problems

    Science.gov (United States)

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  9. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Directory of Open Access Journals (Sweden)

    Heinz-Martin Süß

    2018-05-01

    Full Text Available The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly

  10. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Science.gov (United States)

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the

  11. Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals

    KAUST Repository

    Martinez, Josue G.; Huang, Jianhua Z.; Burghardt, Robert C.; Barhoumi, Rola; Carroll, Raymond J.

    2009-01-01

    ) to extract the signals from such movies, in a way that is semi-automatic and tuned closely to the actual data and their many complexities. These complexities include the following. First, the images themselves are of no interest: all interest focuses

  12. Dependency visualization for complex system understanding

    Energy Technology Data Exchange (ETDEWEB)

    Smart, J. Allison Cory [Univ. of California, Davis, CA (United States)

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now sought and developed for reverse engineering and design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed graph based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.

  13. Element Free Lattice Boltzmann Method for Fluid-Flow Problems

    International Nuclear Information System (INIS)

    Jo, Jong Chull; Roh, Kyung Wan; Yune, Young Gill; Kim, Hho Jhung; Kwon, Young Kwon

    2007-01-01

    The Lattice Boltzmann Method (LBM) has been developed for application to thermal-fluid problems. Most of those studies considered a regular shape of lattice or mesh, such as square and cubic grids. In order to apply the LBM to more practical cases, it is necessary to be able to solve complex or irregular shapes of problem domains. Some techniques were based on the finite element method. Generally, the finite element method is very powerful for solving two- or three-dimensional complex or irregular shapes of domains using the iso-parametric element formulation, which is based on a mathematical mapping from a regular shape of element in an imaginary domain to a more general and irregular shape of element in the physical domain. In addition, the element free technique is also quite useful for analyzing a complex shape of domain because there is no need to divide the domain by a compatible finite element mesh. This paper presents new finite element and element free formulations for the lattice Boltzmann equation using the general weighted residual technique. Then, a series of validation examples is presented
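
    For context, the sketch below is a plain D2Q9 BGK lattice Boltzmann step on a regular periodic grid, the baseline that the paper generalizes; the finite element and element-free formulations themselves are not reproduced here.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights (standard BGK lattice Boltzmann
# on a regular grid with periodic boundaries)
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """Equilibrium distributions f_i^eq for D2Q9."""
    eu = np.einsum('id,xyd->xyi', E, u)             # e_i . u at every node
    usq = np.einsum('xyd,xyd->xy', u, u)
    return W * rho[..., None] * (1 + 3*eu + 4.5*eu**2 - 1.5*usq[..., None])

def lbm_step(f, tau=0.6):
    """One BGK collision + streaming step with periodic boundaries."""
    rho = f.sum(axis=-1)
    u = np.einsum('xyi,id->xyd', f, E) / rho[..., None]
    f = f - (f - equilibrium(rho, u)) / tau          # collision (BGK relaxation)
    for i, (ex, ey) in enumerate(E):                 # streaming along lattice directions
        f[..., i] = np.roll(np.roll(f[..., i], ex, axis=0), ey, axis=1)
    return f

nx = ny = 32
f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny, 2)))   # start from rest
f[nx//2, ny//2, :] *= 1.05                                   # small density perturbation
for _ in range(100):
    f = lbm_step(f)
print(f.sum())   # total mass is conserved by collision and streaming
```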

  14. Element Free Lattice Boltzmann Method for Fluid-Flow Problems

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jong Chull; Roh, Kyung Wan; Yune, Young Gill; Kim, Hho Jhung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kwon, Young Kwon [US Naval Postgraduate School, New York (United States)

    2007-10-15

    The Lattice Boltzmann Method (LBM) has been developed for application to thermal-fluid problems. Most of those studies considered a regular shape of lattice or mesh, such as square and cubic grids. In order to apply the LBM to more practical cases, it is necessary to be able to solve complex or irregular shapes of problem domains. Some techniques were based on the finite element method. Generally, the finite element method is very powerful for solving two- or three-dimensional complex or irregular shapes of domains using the iso-parametric element formulation, which is based on a mathematical mapping from a regular shape of element in an imaginary domain to a more general and irregular shape of element in the physical domain. In addition, the element free technique is also quite useful for analyzing a complex shape of domain because there is no need to divide the domain by a compatible finite element mesh. This paper presents new finite element and element free formulations for the lattice Boltzmann equation using the general weighted residual technique. Then, a series of validation examples is presented.

  15. Genetic algorithms applied to nonlinear and complex domains

    International Nuclear Information System (INIS)

    Barash, D; Woodin, A E

    1999-01-01

    The dissertation, titled "Genetic Algorithms Applied to Nonlinear and Complex Domains", describes and then applies a new class of powerful search algorithms (GAs) to certain domains. GAs are capable of solving complex and nonlinear problems where many parameters interact to produce a "final" result, such as the optimization of the laser pulse in the interaction of an atom with an intense laser field. GAs can very efficiently locate the global maximum by searching parameter space in problems which are unsuitable for a search using traditional methods. In particular, the dissertation contains new scientific findings in two areas. First, the dissertation examines the interaction of an ultra-intense short laser pulse with atoms. GAs are used to find the optimal frequency for stabilizing atoms in the ionization process. This leads to a new theoretical formulation to explain what is happening during the ionization process and how the electron is responding to finite (real-life) laser pulse shapes. It is shown that the dynamics of the process can be very sensitive to the ramp of the pulse at high frequencies. The new theory which is formulated also uses a novel concept (known as the (t,t') method) to numerically solve the time-dependent Schrodinger equation. Second, the dissertation also examines the use of GAs in modeling decision making problems. It compares GAs with traditional techniques to solve a class of problems known as Markov Decision Processes. The conclusion of the dissertation should give a clear idea of where GAs are applicable, especially in the physical sciences, in problems which are nonlinear and complex, i.e. difficult to analyze by other means

  16. Contentious problems in bioscience and biotechnology: a pilot study of an approach to ethics education.

    Science.gov (United States)

    Berry, Roberta M; Borenstein, Jason; Butera, Robert J

    2013-06-01

    This manuscript describes a pilot study in ethics education employing a problem-based learning approach to the study of novel, complex, ethically fraught, unavoidably public, and unavoidably divisive policy problems, called "fractious problems," in bioscience and biotechnology. Diverse graduate and professional students from four US institutions and disciplines spanning science, engineering, humanities, social science, law, and medicine analyzed fractious problems employing "navigational skills" tailored to the distinctive features of these problems. The students presented their results to policymakers, stakeholders, experts, and members of the public. This approach may provide a model for educating future bioscientists and bioengineers so that they can meaningfully contribute to the social understanding and resolution of challenging policy problems generated by their work.

  17. Reliability Standards of Complex Engineering Systems

    Science.gov (United States)

    Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.

    2017-11-01

    Production and manufacture play an important role in today’s modern society. Industrial production is nowadays characterized by increased and complex communications between its parts. The problem of preventing accidents in a large industrial enterprise becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance. Potential damage caused by an accident at such enterprise may lead to substantial material losses and, in some cases, can even cause a loss of human lives. That is why industrial enterprise functioning reliability is immensely important. In terms of their reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions: operable and non-operable. A complex object exists in more than two conditions. The main characteristic here is the stability of its operation. This paper develops the reliability indicator combining the set theory methodology and a state space method. Both are widely used to analyze dynamically developing probability processes. The research also introduces a set of reliability indicators for complex technical systems.

  18. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 1. [thermal analyzer manual

    Science.gov (United States)

    Lee, H. P.

    1977-01-01

    The NASTRAN Thermal Analyzer Manual describes the fundamental and theoretical treatment of the finite element method, with emphasis on the derivations of the constituent matrices of different elements and solution algorithms. Necessary information and data relating to the practical applications of engineering modeling are included.

  19. Creativity for Problem Solvers

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    2009-01-01

    This paper presents some modern and interdisciplinary concepts about creativity and creative processes specially related to problem solving. Central publications related to the theme are briefly reviewed. Creative tools and approaches suitable to support problem solving are also presented. Finally......, the paper outlines the author’s experiences using creative tools and approaches to: Facilitation of problem solving processes, strategy development in organisations, design of optimisation systems for large scale and complex logistic systems, and creative design of software optimisation for complex non...

  20. Lessons Learned from Crowdsourcing Complex Engineering Tasks.

    Science.gov (United States)

    Staffelbach, Matthew; Sempolinski, Peter; Kijewski-Correa, Tracy; Thain, Douglas; Wei, Daniel; Kareem, Ahsan; Madey, Gregory

    2015-01-01

    Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and image tagging. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: analysis and creation of wind simulations. Our investigation examined the feasibility of using crowdsourcing for complex, highly technical tasks. This was done to determine if the benefits of crowdsourcing could be harnessed to accurately and effectively contribute to solving complex real world engineering problems. Of course, untrained crowds cannot be used as a mere substitute for trained expertise. Rather, we sought to understand how crowd workers can be used as a large pool of labor for a preliminary analysis of complex data. We compared the skill of the anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students, making a first pass at analyzing wind simulation data. For the first phase, we posted analysis questions to Amazon crowd workers and to two groups of civil engineering graduate students. A second phase of our experiment instructed crowd workers and students to create simulations on our Virtual Wind Tunnel website to solve a more complex task. With a sufficiently comprehensive tutorial and compensation similar to typical crowd-sourcing wages, we were able to enlist crowd workers to effectively complete longer, more complex tasks with competence comparable to that of graduate students with more comprehensive, expert-level knowledge. Furthermore, more complex tasks require increased communication with the workers. As tasks become more complex, the employment relationship begins to become more akin to outsourcing than crowdsourcing. Through this investigation, we were able to stretch and explore the limits of crowdsourcing as a tool for solving complex problems.

  1. Hemiequilibrium problems

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam Noor

    2004-01-01

    Full Text Available We consider a new class of equilibrium problems, known as hemiequilibrium problems. Using the auxiliary principle technique, we suggest and analyze a class of iterative algorithms for solving hemiequilibrium problems, the convergence of which requires either pseudomonotonicity or partially relaxed strong monotonicity. As a special case, we obtain a new method for hemivariational inequalities. Since hemiequilibrium problems include hemivariational inequalities and equilibrium problems as special cases, the results proved in this paper still hold for these problems.
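
    For orientation, the block below states the classical equilibrium problem and a generic auxiliary-principle iteration of the kind referred to above; it is a hedged sketch, and the hemiequilibrium formulation of the paper carries an additional hemivariational term that is not written here.

```latex
% Classical equilibrium problem EP(F,K): for a convex set K and a bifunction F,
%   find  u \in K  such that  F(u, v) \ge 0  \quad \forall\, v \in K .
%
% Generic auxiliary-principle iteration (a sketch only):
%   given  u_n \in K  and a parameter  \rho > 0,  find  u_{n+1} \in K  such that
\[
  \rho\, F(u_n, v) \;+\; \langle u_{n+1} - u_n,\; v - u_{n+1} \rangle \;\ge\; 0
  \qquad \forall\, v \in K ,
\]
% whose convergence is typically established under pseudomonotonicity or
% partially relaxed strong monotonicity of F.
```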

  2. Modeling and Analyzing Operational Decision-Making Synchronization of C2 Organization in Complex Environment

    Directory of Open Access Journals (Sweden)

    Zou Zhigang

    2013-01-01

    Full Text Available In order to improve the capability of operational decision-making synchronization (ODMS) in a command and control (C2) organization, the paper proposes that ODMS is the negotiation process of situation cognition with three phases, "situation cognition, situation interaction and decision-making synchronization", in a complex environment, and then the model and strategies of ODMS are given quantitatively. Firstly, measurement indexes for the three phases above are given based on the time consumed in negotiation, and three patterns are proposed for negotiating timely and in high quality during situation interaction. Secondly, the ODMS model with two stages in a continuously changing situation is put forward, and ODMS strategies are analyzed under environmental influence and time restrictions. Thirdly, simulation cases are given to validate the process of ODMS under different continuously changing situations; the results of this model fulfill the actual restrictions better than previous models, and the process of ODMS can be adjusted more reasonably to improve the capability of ODMS. We then discuss the cases and summarize the influence factors of ODMS in the C2 organization as organization structure, shared information resources, negotiation patterns, and allocation of decision rights.

  3. Complex processing and utilization of waste as the basis for sustainable economic development district

    Directory of Open Access Journals (Sweden)

    V.М. Ilchenko

    2015-06-01

    Full Text Available The article describes the main environmental problems of Ukraine. The problems connected with complex processing and recycling are considered using the example of the Dnieper economic region, which allows the environmental situation of the country at this stage to be presented in more detail. Recent environmental performance data and the basic problems of waste disposal and recycling are used and analyzed. The basic research methods are observation, analysis and comparison. The aim was to find ways to overcome the ecological crisis in Ukraine. As a result of the research, it was determined which types of waste prevail in Ukraine, and the best solutions to problems related to waste and its processing were found. It was possible to identify the main problem that has caused the serious environmental situation, as well as the main task for the country at this stage. The main problems and tasks of the Dnieper economic region are outlined, as are the savings possible through complex waste processing. The article is relevant because it presents the basic problems and tasks of Ukraine concerning the ecological situation. It also focuses on ecological problems to which the government does not pay enough attention.

  4. Detection of expression quantitative trait Loci in complex mouse crosses: impact and alleviation of data quality and complex population substructure.

    Science.gov (United States)

    Iancu, Ovidiu D; Darakjian, Priscila; Kawane, Sunita; Bottomly, Daniel; Hitzemann, Robert; McWeeney, Shannon

    2012-01-01

    Complex Mus musculus crosses, e.g., heterogeneous stock (HS), provide increased resolution for quantitative trait loci detection. However, increased genetic complexity challenges detection methods, with discordant results due to low data quality or complex genetic architecture. We quantified the impact of these factors across three mouse crosses and two different detection methods, identifying procedures that greatly improve detection quality. Importantly, HS populations have complex genetic architectures not fully captured by the whole genome kinship matrix, calling for incorporating chromosome-specific relatedness information. We analyze three increasingly complex crosses, using gene expression levels as quantitative traits. The three crosses were an F(2) intercross, a HS formed by crossing four inbred strains (HS4), and a HS (HS-CC) derived from the eight lines found in the collaborative cross. Brain (striatum) gene expression and genotype data were obtained using the Illumina platform. We found large disparities between methods, with concordance varying as genetic complexity increased; this problem was more acute for probes with distant regulatory elements (trans). A suite of data filtering steps resulted in substantial increases in reproducibility. Genetic relatedness between samples generated an overabundance of detected eQTLs; an adjustment procedure that includes the kinship matrix attenuates this problem. However, we find that relatedness between individuals is not evenly distributed across the genome; information from distinct chromosomes results in relatedness structure different from the whole genome kinship matrix. Shared polymorphisms from distinct chromosomes collectively affect expression levels, confounding eQTL detection. We suggest that considering chromosome-specific relatedness can result in improved eQTL detection.
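
    A small sketch of the relatedness machinery mentioned above: a standardized-genotype kinship matrix and a leave-one-chromosome-out variant, which is one common way to act on the observation that relatedness differs across chromosomes. The genotype coding, scaling and toy data are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def kinship(G):
    """Realized genetic relatedness from a genotype matrix G (individuals x SNPs,
    coded 0/1/2): standardize each SNP and average the cross-products."""
    p = G.mean(axis=0) / 2.0                               # allele frequencies
    Z = (G - 2 * p) / np.sqrt(2 * p * (1 - p) + 1e-12)     # standardized genotypes
    return Z @ Z.T / G.shape[1]

def loco_kinships(G, chrom):
    """Leave-one-chromosome-out kinships: for each chromosome, build relatedness
    from all *other* chromosomes, so the local genotype signal is not absorbed
    by the correction."""
    chrom = np.asarray(chrom)
    return {c: kinship(G[:, chrom != c]) for c in np.unique(chrom)}

# toy data: 50 mice, 200 SNPs spread over 4 chromosomes
rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(50, 200))
chrom = np.repeat(np.arange(1, 5), 50)
K_all = kinship(G)
K_loco = loco_kinships(G, chrom)
print(K_all.shape, sorted(K_loco))
```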

  5. Collectives and the design of complex systems

    CERN Document Server

    Wolpert, David

    2004-01-01

    Increasingly powerful computers are making possible distributed systems comprised of many adaptive and self-motivated computational agents. Such systems, when distinguished by system-level performance criteria, are known as "collectives." Collectives and the Design of Complex Systems lays the foundation for a science of collectives and describes how to design them for optimal performance. An introductory survey chapter is followed by descriptions of information-processing problems that can only be solved by the joint actions of large communities of computers, each running its own complex, decentralized machine-learning algorithm. Subsequent chapters analyze the dynamics and structures of collectives, as well as address economic, model-free, and control-theory approaches to designing complex systems. The work assumes a modest understanding of basic statistics and calculus. Topics and Features: Introduces the burgeoning science of collectives and its practical applications in a single useful volume Combines ap...

  6. Methodological issues in analyzing human communication – the complexities of multimodality

    DEFF Research Database (Denmark)

    Høegh, Tina

    2017-01-01

    This chapter develops a multimodal method for transcribing speech, communication, and performance. The chapter discusses the methodological solutions to the complex translation of speech, language rhythm and gesture in time and space into the two-dimensional format of a piece of paper. The focus...

  7. Knowledge to action for solving complex problems: insights from a review of nine international cases.

    Science.gov (United States)

    Riley, B L; Robinson, K L; Gamble, J; Finegood, D T; Sheppard, D; Penney, T L; Best, A

    2015-05-01

    Solving complex problems such as preventing chronic diseases introduces unique challenges for the creation and application of knowledge, or knowledge to action (KTA). KTA approaches that apply principles of systems thinking are thought to hold promise, but practical strategies for their application are not well understood. In this paper we report the results of a scan of systems approaches to KTA with a goal to identify how to optimize their implementation and impact. A 5-person advisory group purposefully selected 9 initiatives to achieve diversity on issues addressed and organizational forms. Information on each case was gathered from documents and through telephone interviews with primary contacts within each organization. Following verification of case descriptions, an inductive analysis was conducted within and across cases. The cases revealed 5 guidelines for moving from conceiving KTA systems to implementing them: (1) establish and nurture relationships, (2) co-produce and curate knowledge, (3) create feedback loops, (4) frame as systems interventions rather than projects, and (5) consider variations across time and place. Results from the environmental scan are a modest start to translating systems concepts for KTA into practice. Use of the strategies revealed in the scan may improve KTA for solving complex public health problems. The strategies themselves will benefit from the development of a science that aims to understand adaptation and ongoing learning from policy and practice interventions, strengthens enduring relationships, and fills system gaps in addition to evidence gaps. Systems approaches to KTA will also benefit from robust evaluations.

  8. Managing the Complexity of Design Problems through Studio-Based Learning

    Science.gov (United States)

    Cennamo, Katherine; Brandt, Carol; Scott, Brigitte; Douglas, Sarah; McGrath, Margarita; Reimer, Yolanda; Vernon, Mitzi

    2011-01-01

    The ill-structured nature of design problems makes them particularly challenging for problem-based learning. Studio-based learning (SBL), however, has much in common with problem-based learning and indeed has a long history of use in teaching students to solve design problems. The purpose of this ethnographic study of an industrial design class,…

  9. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates an improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images.

  10. Economic interpretation of sustainable development of the flax complex

    Directory of Open Access Journals (Sweden)

    Oleg Ivanovich Botkin

    2012-09-01

    Full Text Available This paper reviews the definition of the notions "stability" and "stable development" and analyzes the factors influencing development stability. We suggest a definition of stable development of the flax complex and an approach to its assessment. We also examine the factors causing instability in the functioning of the flax complex. An integral index is proposed to determine the stability of the flax complex; this index takes into account the rate of growth (or decline) of major product output, commodity output, profit from product sales, accounts receivable and accounts payable, investment in fixed capital, labor productivity, the coefficient of manufacturing capacity utilization and the renewal of fixed assets. The paper deals with the problems of development and the current state of the flax sub-complex of the agro-industrial complex, as well as with disproportions between the complex's branches. It covers the causes of tolling schemes in the work of flax processing businesses and the role of the state in the formation of the domestic market for flax products. The necessity of industry diversification and innovation-driven development is substantiated.
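
    A toy illustration of what such an integral index could look like, aggregating year-on-year growth indices by a geometric mean; the indicator list, weighting and data below are hypothetical, and the paper's actual index may be constructed differently.

```python
import math

def integral_stability_index(growth_rates):
    """Aggregate yearly growth indices (output, sales revenue, profit, investment,
    labour productivity, capacity utilisation, ...) into one figure via the
    geometric mean. A value above 1.0 suggests the complex developed sustainably
    over the period; this is an illustrative aggregation only."""
    product = math.prod(growth_rates)
    return product ** (1.0 / len(growth_rates))

# hypothetical year-on-year indices for a flax complex (1.0 = no change)
indicators = {
    "major product output": 1.04,
    "sales revenue": 1.07,
    "profit from sales": 0.98,
    "investment in fixed capital": 1.10,
    "labour productivity": 1.03,
    "capacity utilisation": 1.01,
}
print(round(integral_stability_index(list(indicators.values())), 3))
```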

  11. Computer-aided design system for a complex of problems on calculation and analysis of engineering and economical indexes of NPP power units

    International Nuclear Information System (INIS)

    Stepanov, V.I.; Koryagin, A.V.; Ruzankov, V.N.

    1988-01-01

    A computer-aided design system for a complex of problems concerning the calculation and analysis of engineering and economic indices of NPP power units is described. The system provides means for the automated preparation and debugging of the database software complex, which realizes the plotted algorithm in the power unit control system. In addition, the system provides devices for the automated preparation and registration of technical documentation

  12. Linking Complex Problem Solving and General Mental Ability to Career Advancement: Does a Transversal Skill Reveal Incremental Predictive Validity?

    Science.gov (United States)

    Mainert, Jakob; Kretzschmar, André; Neubert, Jonas C.; Greiff, Samuel

    2015-01-01

    Transversal skills, such as complex problem solving (CPS) are viewed as central twenty-first-century skills. Recent empirical findings have already supported the importance of CPS for early academic advancement. We wanted to determine whether CPS could also contribute to the understanding of career advancement later in life. Towards this end, we…

  13. Inverse Problems in Systems Biology: A Critical Review.

    Science.gov (United States)

    Guzzi, Rodolfo; Colombo, Teresa; Paci, Paola

    2018-01-01

    Systems biology may be regarded as a symbiotic, cyclic interplay between the forward and inverse problems. Computational models need to be continuously refined through experiments, and in turn they help us to make limited experimental resources more efficient. Every time one does an experiment, we know that there will be some noise that can disrupt our measurements. Although the noise is certainly a problem, the inverse problems already involve the inference of missing information, even if the data are entirely reliable. So the addition of a certain limited noise does not fundamentally change the situation, but can be used to address the so-called ill-posed problem, as defined by Hadamard. It can be seen as an extra source of information. Recent studies have shown that complex systems, among them those of systems biology, are poorly constrained and ill-conditioned because it is difficult to use experimental data to fully estimate their parameters. For these reasons the concept of sloppy models was born: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. Furthermore, the concept of sloppy models also contains the concept of un-identifiability, because the models are characterized by many parameters that are poorly constrained by experimental data. A strategy therefore needs to be designed to infer, analyze, and understand biological systems. The aim of this work is to provide a critical review of the inverse problems in systems biology, defining a strategy to determine the minimal set of information needed to overcome the problems arising from dynamic biological models that generally may have many unknown, non-measurable parameters.

  14. On the Chern Yamabe Problem

    DEFF Research Database (Denmark)

    Angella, Daniele; Calamai, Simone; Spotti, Cristiano

    2017-01-01

    We undertake the study of an analogue of the Yamabe problem for complex manifolds. More precisely, for any conformal Hermitian structure on a compact complex manifold, we are concerned with the existence of metrics with constant Chern scalar curvature. In this note, we set the problem and we provid...

  15. The emerging problem of physical child abuse in South Korea.

    Science.gov (United States)

    Hahm, H C; Guterman, N B

    2001-05-01

    South Korea has had remarkably high incidence and prevalence rates of physical violence against children, yet the problem has received only limited public and professional attention until very recently. This article represents the first attempt in English to systematically analyze South Korea's recent epidemiological studies on child maltreatment. Discussed are sociocultural factors that have contributed both to delays in child protection laws and a low public awareness of the problem of child abuse. The article highlights methodological issues concerning the definition of physical abuse in South Korea and the complex attitudes toward violence. It also examines the role of the Korean women's movement in the reform of family laws and the recent establishment of new child protection legislation. Suggestions for future directions for the problem of child maltreatment within South Korea are presented.

  16. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies continuously invest time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces the workload on the offshore organisation and, most importantly, reduces discharges to air and sea. A number of mechanisms and tools are available for achieving high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, all of these tools are only effective if quick and proper analysis of fluids and deposits is carried out. In fact, analytical backup is a powerful tool for maintaining optimised oil production, and should therefore be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when requested by production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems are presented: 1) Deposition in a Complex Platform Processing System; 2) Contaminated Production Chemicals; 3) Improved Monitoring of Scale Inhibitor, Suspended Solids and Ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author) (tk)

  17. Comparison of Degrees of Potential-Energy-Surface Anharmonicity for Complexes and Clusters with Hydrogen Bonds

    Science.gov (United States)

    Kozlovskaya, E. N.; Doroshenko, I. Yu.; Pogorelov, V. E.; Vaskivskyi, Ye. V.; Pitsevich, G. A.

    2018-01-01

    Previously calculated multidimensional potential-energy surfaces of the MeOH monomer and dimer, water dimer, malonaldehyde, formic acid dimer, free pyridine-N-oxide/trichloroacetic acid complex, and protonated water dimer were analyzed. The corresponding harmonic potential-energy surfaces near the global minima were constructed for series of clusters and complexes with hydrogen bonds of different strengths based on the behavior of the calculated multidimensional potential-energy surfaces. This enabled the introduction of an obvious anharmonicity parameter for the calculated potential-energy surfaces. The anharmonicity parameter was analyzed as a function of the size of the analyzed area near the energy minimum, the number of points over which energies were compared, and the dimensionality of the solved vibrational problem. Anharmonicity parameters for potential-energy surfaces in complexes with strong, medium, and weak H-bonds were calculated under identical conditions. The obtained anharmonicity parameters were compared with the corresponding diagonal anharmonicity constants for stretching vibrations of the bridging protons and the lengths of the hydrogen bridges.

  18. Addressing Complexity in Environmental Management and Governance

    Directory of Open Access Journals (Sweden)

    Sabrina Kirschke

    2017-06-01

    Governance for complex problem solving has been increasingly discussed in environmental sustainability research. Above all, researchers continuously observe that sustainability problems are complex or “wicked”, and suggest participatory models to address these problems in practice. In order to add to this debate, this study suggests a more differentiated theoretical approach to define governance for complex environmental problem solving than in previous studies. The approach consists of two vital steps: First, we operationalize complexity and define management strategies for solving environmental sustainability problems based on findings from psychology research. Second, we identify governance strategies that facilitate these management strategies. Linking those strategies suggests that the role of diverse institutions, actors, and interactions differs for five key dimensions of complexity: goals, variables, dynamics, interconnections, and informational uncertainty. The results strengthen systematic analyses of environmental sustainability problems in both theory and practice.

  19. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

    The problems of reliability standardization in computer-aided manufacturing for NPP units are analyzed under two approaches: computer-aided manufacturing of NPP units as part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the set of reliability indices for computer-aided manufacturing of NPP units is substantiated for each of the approaches considered.

  20. Genetic algorithms applied to nonlinear and complex domains; TOPICAL

    International Nuclear Information System (INIS)

    Barash, D; Woodin, A E

    1999-01-01

    The dissertation, titled "Genetic Algorithms Applied to Nonlinear and Complex Domains", describes and then applies a new class of powerful search algorithms (GAs) to certain domains. GAs are capable of solving complex and nonlinear problems where many parameters interact to produce a "final" result, such as the optimization of the laser pulse in the interaction of an atom with an intense laser field. GAs can very efficiently locate the global maximum by searching parameter space in problems which are unsuitable for a search using traditional methods. In particular, the dissertation contains new scientific findings in two areas. First, the dissertation examines the interaction of an ultra-intense short laser pulse with atoms. GAs are used to find the optimal frequency for stabilizing atoms in the ionization process. This leads to a new theoretical formulation to explain what is happening during the ionization process and how the electron responds to finite (real-life) laser pulse shapes. It is shown that the dynamics of the process can be very sensitive to the ramp of the pulse at high frequencies. The new theory also uses a novel concept (known as the (t,t') method) to numerically solve the time-dependent Schrödinger equation. Second, the dissertation examines the use of GAs in modeling decision making problems. It compares GAs with traditional techniques for solving a class of problems known as Markov Decision Processes. The conclusion of the dissertation should give a clear idea of where GAs are applicable, especially in the physical sciences, in problems which are nonlinear and complex, i.e. difficult to analyze by other means.
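
    The dissertation's own code is not reproduced here. As a minimal, hedged sketch of the kind of search a GA performs, the following Python snippet evolves a population toward the global maximum of a made-up multimodal function; the objective, population size, and mutation rate are illustrative assumptions, not values from the thesis.

      import math
      import random

      def fitness(x):
          # Illustrative multimodal objective with several local maxima on [0, 1].
          return math.sin(5 * math.pi * x) * (1 - abs(x - 0.5))

      def evolve(pop_size=60, generations=200, mutation=0.05):
          pop = [random.random() for _ in range(pop_size)]
          for _ in range(generations):
              # Tournament selection: keep the fittest of three random individuals.
              parents = [max(random.sample(pop, 3), key=fitness) for _ in range(pop_size)]
              # Blend crossover on consecutive parent pairs.
              children = []
              for a, b in zip(parents[::2], parents[1::2]):
                  w = random.random()
                  children += [w * a + (1 - w) * b, (1 - w) * a + w * b]
              # Gaussian mutation, clipped back into [0, 1].
              pop = [min(1.0, max(0.0, c + random.gauss(0, mutation))) for c in children]
          return max(pop, key=fitness)

      best = evolve()
      print("best x found:", round(best, 4), "fitness:", round(fitness(best), 4))

    In the physics application described above, evaluating the fitness would involve a full simulation of the laser-atom interaction rather than a cheap analytic function, but the selection-crossover-mutation loop is the same.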

  1. Analyzing complex networks through correlations in centrality measurements

    International Nuclear Information System (INIS)

    Ricardo Furlan Ronqui, José; Travieso, Gonzalo

    2015-01-01

    Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example we show also that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network. (paper)
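
    As a hedged illustration of the centrality correlation profile idea (not the authors' code), the sketch below computes pairwise Pearson correlations between a few standard centralities on a random stand-in graph using networkx and scipy; the choice of centralities and the Erdos-Renyi substitute network are assumptions made only for the example.

      import itertools

      import networkx as nx
      from scipy.stats import pearsonr

      # Stand-in network; a real network of interest would be loaded here instead.
      G = nx.erdos_renyi_graph(200, 0.05, seed=1)

      centralities = {
          "degree": nx.degree_centrality(G),
          "betweenness": nx.betweenness_centrality(G),
          "closeness": nx.closeness_centrality(G),
          "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
      }

      nodes = list(G.nodes())
      # The correlation profile is the vector of correlations over all centrality pairs.
      for (name_a, cent_a), (name_b, cent_b) in itertools.combinations(centralities.items(), 2):
          r, _ = pearsonr([cent_a[v] for v in nodes], [cent_b[v] for v in nodes])
          print(f"{name_a} vs {name_b}: r = {r:.3f}")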

  2. Approximate solutions for the two-dimensional integral transport equation. Solution of complex two-dimensional transport problems

    International Nuclear Information System (INIS)

    Sanchez, Richard.

    1980-11-01

    This work is divided into two parts: the first part deals with the solution of complex two-dimensional transport problems, the second one (note CEA-N-2166) treats the critically mixed methods of resolution. A set of approximate solutions for the isotropic two-dimensional neutron transport problem has been developed using the interface current formalism. The method has been applied to regular lattices of rectangular cells containing a fuel pin, cladding, and water, or homogenized structural material. The cells are divided into zones that are homogeneous. A zone-wise flux expansion is used to formulate a direct collision probability problem within a cell. The coupling of the cells is effected by making extra assumptions on the currents entering and leaving the interfaces. Two codes have been written: CALLIOPE uses a cylindrical cell model and one or three terms for the flux expansion, and NAUSICAA uses a two-dimensional flux representation and does a truly two-dimensional calculation inside each cell. In both codes, one or three terms can be used to make a space-independent expansion of the angular fluxes entering and leaving each side of the cell. The accuracies and computing times achieved with the different approximations are illustrated by numerical studies on two benchmark problems and by calculations performed in the APOLLO multigroup code [fr

  3. Using threshold regression to analyze survival data from complex surveys: With application to mortality linked NHANES III Phase II genetic data.

    Science.gov (United States)

    Li, Yan; Xiao, Tao; Liao, Dandan; Lee, Mei-Ling Ting

    2018-03-30

    The Cox proportional hazards (PH) model is a common statistical technique used for analyzing time-to-event data. The assumption of PH, however, is not always appropriate in real applications. In cases where the assumption is not tenable, threshold regression (TR) and other survival methods, which do not require the PH assumption, are available and widely used. These alternative methods generally assume that the study data constitute simple random samples. In particular, TR has not been studied in the setting of complex surveys that involve (1) differential selection probabilities of study subjects and (2) intracluster correlations induced by multistage cluster sampling. In this paper, we extend TR procedures to account for complex sampling designs. The pseudo-maximum likelihood estimation technique is applied to estimate the TR model parameters. Computationally efficient Taylor linearization variance estimators that consider both the intracluster correlation and the differential selection probabilities are developed. The proposed methods are evaluated by using simulation experiments with various complex designs and illustrated empirically by using mortality-linked Third National Health and Nutrition Examination Survey Phase II genetic data. Copyright © 2017 John Wiley & Sons, Ltd.

  4. The brachistochrone problem in open quantum systems

    International Nuclear Information System (INIS)

    Rotter, Ingrid

    2007-01-01

    Recently, the quantum brachistochrone problem has been discussed in the literature by using non-Hermitian Hamilton operators of different types. Here, it is demonstrated that the passage time is tunable in realistic open quantum systems due to the biorthogonality of the eigenfunctions of the non-Hermitian Hamilton operator. As an example, the numerical results obtained by Bulgakov et al. for the transmission through microwave cavities of different shapes are analyzed from the point of view of the brachistochrone problem. The passage time is shortened in the crossover from the weak-coupling to the strong-coupling regime, where the resonance states overlap and many branch points (exceptional points) exist in the complex plane. The effect cannot be described in the framework of standard quantum mechanics with a Hermitian Hamilton operator and consideration of S-matrix poles.

  5. Principles of big data preparing, sharing, and analyzing complex information

    CERN Document Server

    Berman, Jules J

    2013-01-01

    Principles of Big Data helps readers avoid the common mistakes that endanger all Big Data projects. By stressing simple, fundamental concepts, this book teaches readers how to organize large volumes of complex data, and how to achieve data permanence when the content of the data is constantly changing. General methods for data verification and validation, as specifically applied to Big Data resources, are stressed throughout the book. The book demonstrates how adept analysts can find relationships among data objects held in disparate Big Data resources, when the data objects are endo

  6. The graph-theoretic minimum energy path problem for ionic conduction

    Directory of Open Access Journals (Sweden)

    Ippei Kishida

    2015-10-01

    A new computational method was developed to analyze the ionic conduction mechanism in crystals through graph theory. The graph was organized into nodes, which represent the crystal structures modeled by ionic site occupation, and edges, which represent structure transitions via ionic jumps. We proposed a minimum energy path problem, which is similar to the shortest path problem, and established an effective algorithm to solve it. Since our method uses neither randomized algorithms nor time parameters, the computational cost of analyzing conduction paths and a migration energy is very low. The power of the method was verified by applying it to α-AgI, and the ionic conduction mechanism in α-AgI was revealed. The analysis using single point calculations found the minimum energy path for long-distance ionic conduction, which consists of 12 steps of ionic jumps in a unit cell. From these results, the detailed theoretical migration energy was calculated as 0.11 eV by geometry optimization and the nudged elastic band method. Our method can refine candidates for possible jumps in crystals and can be adapted to other computational methods, such as the nudged elastic band method. We expect that our method will be a powerful tool for analyzing ionic conduction mechanisms, even for large complex crystals.
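
    The paper's actual objective and graph construction are more involved than can be shown here. As a hedged sketch of one way a graph-theoretic "minimum energy path" search can be posed, the snippet below finds the path that minimizes the highest configuration energy encountered, using a bottleneck variant of Dijkstra's algorithm; the toy graph and energies are invented for illustration.

      import heapq

      def minimax_energy_path(energy, jumps, start, goal):
          """Return (barrier, path) minimizing the highest node energy along the path.

          energy: dict mapping configuration -> energy (e.g. in eV)
          jumps:  dict mapping configuration -> configurations reachable by one ionic jump
          """
          best = {start: energy[start]}
          heap = [(energy[start], start, [start])]
          while heap:
              barrier, node, path = heapq.heappop(heap)
              if node == goal:
                  return barrier, path
              for nxt in jumps.get(node, []):
                  candidate = max(barrier, energy[nxt])
                  if candidate < best.get(nxt, float("inf")):
                      best[nxt] = candidate
                      heapq.heappush(heap, (candidate, nxt, path + [nxt]))
          return None

      # Invented configuration graph with made-up energies.
      energy = {"A": 0.00, "B": 0.12, "C": 0.30, "D": 0.05}
      jumps = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
      print(minimax_energy_path(energy, jumps, "A", "D"))   # -> (0.12, ['A', 'B', 'D'])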

  7. Problems over Information Systems

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    The problems of estimating the minimum average time complexity of decision trees and of designing efficient algorithms are complex in the general case. The upper bounds described in Chap. 2.4.3 cannot be applied directly due to the large computational complexity of the parameter M(z). Under reasonable assumptions about the relation of P and NP, there are no polynomial time algorithms with a good approximation ratio [12, 32]. One possible solution is to consider particular classes of problems and improve the existing results using characteristics of the considered classes. © Springer-Verlag Berlin Heidelberg 2011.

  8. Electromagnetic waves in complex systems selected theoretical and applied problems

    CERN Document Server

    Velychko, Lyudmyla

    2016-01-01

    This book gives guidance for solving problems in electromagnetics, providing both examples of solving serious research problems and original results to encourage further investigations. The book contains seven chapters on various aspects of resonant wave scattering, each solving one original problem. All of them are unified by the authors’ desire to show the advantages of rigorous approaches at all stages, from the formulation of a problem and the selection of a method to the interpretation of results. The book reveals a range of problems associated with wave propagation and scattering in natural and artificial environments or with the design of antenna elements. The authors invoke both theoretical (analytical and numerical) and experimental techniques for handling the problems. Attention is given to mathematical simulations, computational efficiency, and physical interpretation of the experimental results. The book is written for students, graduate students and young researchers.

  9. Complex Time-Delay Systems Theory and Applications

    CERN Document Server

    Atay, Fatihcan M

    2010-01-01

    Time delays in dynamical systems arise as an inevitable consequence of finite speeds of information transmission. Realistic models increasingly demand the inclusion of delays in order to properly understand, analyze, design, and control real-life systems. The goal of this book is to present the state-of-the-art in research on time-delay dynamics in the framework of complex systems and networks. While the mathematical theory of delay equations is quite mature, its application to the particular problems of complex systems and complexity is a newly emerging field, and the present volume aims to play a pioneering role in this perspective. The chapters in this volume are authored by renowned experts and cover both theory and applications in a wide range of fields, with examples extending from neuroscience and biology to laser physics and vehicle traffic. Furthermore, all chapters include sufficient introductory material and extensive bibliographies, making the book a self-contained reference for both students and ...

  10. Analyzing discourse and text complexity for learning and collaborating a cognitive approach based on natural language processing

    CERN Document Server

    Dascălu, Mihai

    2014-01-01

    With the advent and increasing popularity of Computer Supported Collaborative Learning (CSCL) and e-learning technologies, the need for automatic assessment and for teacher/tutor support for the two tightly intertwined activities of comprehension of reading materials and of collaboration among peers has grown significantly. In this context, a polyphonic model of discourse derived from Bakhtin’s work is used as a paradigm for analyzing both general texts and CSCL conversations in a unique framework focused on different facets of textual cohesion. As a specific feature of our analysis, the individual learning perspective is focused on the identification of reading strategies and on providing a multi-dimensional textual complexity model, whereas the collaborative learning dimension is centered on the evaluation of participants’ involvement, as well as on collaboration assessment. Our approach based on advanced Natural Language Processing techniques provides a qualitative estimation of the learning process and enhance...

  11. Edge-Matching Problems with Rotations

    DEFF Research Database (Denmark)

    Ebbesen, Martin; Fischer, Paul; Witt, Carsten

    2011-01-01

    Edge-matching problems, also called puzzles, are abstractions of placement problems with neighborhood conditions. Pieces with colored edges have to be placed on a board such that adjacent edges have the same color. The problem has gained interest recently with the (now terminated) Eternity II...... puzzle, and new complexity results. In this paper we consider a number of settings which differ in size of the puzzles and the manipulations allowed on the pieces. We investigate the effect of allowing rotations of the pieces on the complexity of the problem, an aspect that is only marginally treated so...
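
    The paper itself is about complexity results rather than solvers, but a brute-force baseline makes the problem concrete. The following hedged sketch backtracks over placements of square pieces, given as (top, right, bottom, left) colour tuples, allowing rotations; the 2x2 instance is invented for illustration only.

      def rotations(piece):
          # piece = (top, right, bottom, left); each rotation is a cyclic shift.
          return [piece[i:] + piece[:i] for i in range(4)]

      def solve(pieces, n, board=(), used=frozenset()):
          # board is a tuple of oriented pieces, filled row by row.
          if len(board) == n * n:
              return board
          row, col = divmod(len(board), n)
          for i, piece in enumerate(pieces):
              if i in used:
                  continue
              for p in rotations(piece):
                  if col > 0 and board[-1][1] != p[3]:   # left neighbour's right edge vs our left edge
                      continue
                  if row > 0 and board[-n][2] != p[0]:   # upper neighbour's bottom edge vs our top edge
                      continue
                  result = solve(pieces, n, board + (p,), used | {i})
                  if result is not None:
                      return result
          return None

      # Invented solvable 2x2 instance; colours are encoded as integers.
      pieces = [(2, 3, 4, 1), (7, 2, 5, 6), (3, 8, 9, 1), (8, 7, 4, 5)]
      print(solve(pieces, 2))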

  12. Analysis of a finite PML approximation to the three dimensional elastic wave scattering problem

    KAUST Repository

    Bramble, James H.

    2010-01-01

    We consider the application of a perfectly matched layer (PML) technique to approximate solutions to the elastic wave scattering problem in the frequency domain. The PML is viewed as a complex coordinate shift in spherical coordinates which leads to a variable complex coefficient equation for the displacement vector posed on an infinite domain (the complement of the scatterer). The rapid decay of the PML solution suggests truncation to a bounded domain with a convenient outer boundary condition and subsequent finite element approximation (for the truncated problem). We prove existence and uniqueness of the solutions to the infinite domain and truncated domain PML equations (provided that the truncated domain is sufficiently large). We also show exponential convergence of the solution of the truncated PML problem to the solution of the original scattering problem in the region of interest. We then analyze a Galerkin numerical approximation to the truncated PML problem and prove that it is well posed provided that the PML damping parameter and mesh size are small enough. Finally, computational results illustrating the efficiency of the finite element PML approximation are presented. © 2010 American Mathematical Society.

  13. The Ultimatum Game in complex networks

    International Nuclear Information System (INIS)

    Sinatra, R; Gómez-Gardeñes, J; Latora, V; Iranzo, J; Floría, L M; Moreno, Y

    2009-01-01

    We address the problem of how cooperative (altruistic-like) behavior arises in natural and social systems by analyzing an Ultimatum Game in complex networks. Specifically, players of three types are considered: (a) empathetic, whose aspiration levels, and offers, are equal, (b) pragmatic, who do not distinguish between the different roles and aim to obtain the same benefit, and (c) agents whose aspiration levels, and offers, are independent. We analyze the asymptotic behavior of pure populations with different topologies using two kinds of strategic update rules: natural selection, which relies on replicator dynamics, and social penalty, inspired by the Bak–Sneppen dynamics, in which players are subject to a social selection rule penalizing not only the less fit individuals, but also their first neighbors. We discuss the emergence of fairness in the different settings and network topologies
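
    The authors' exact update rules are not reproduced here. As a hedged sketch of the kind of simulation the abstract describes, the snippet below runs empathetic players (offer equal to acceptance threshold) on a scale-free graph with a simple imitate-the-best update; the network model, population size, and update rule are assumptions made only for illustration.

      import random

      import networkx as nx

      G = nx.barabasi_albert_graph(100, 2, seed=0)
      # Empathetic players: offer and aspiration level are the same value p.
      p = {v: random.random() for v in G.nodes()}

      def payoffs(p):
          pay = {v: 0.0 for v in p}
          for u, v in G.edges():
              # Each player proposes to the other; a proposal is accepted if it
              # meets the responder's aspiration level.
              if p[u] >= p[v]:
                  pay[u] += 1 - p[u]
                  pay[v] += p[u]
              if p[v] >= p[u]:
                  pay[v] += 1 - p[v]
                  pay[u] += p[v]
          return pay

      for _ in range(200):
          pay = payoffs(p)
          new_p = {}
          for v in G.nodes():
              # Imitate the best-scoring player in the closed neighbourhood.
              best = max(list(G.neighbors(v)) + [v], key=lambda w: pay[w])
              new_p[v] = p[best]
          p = new_p

      print("mean offer after the update dynamics:", round(sum(p.values()) / len(p), 3))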

  14. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM network can be used for analysing complex problems in engineering projects.
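
    The record gives no further detail, so the following is only a hedged illustration of the CPM idea it refers to: a forward and a backward pass over a small, invented activity network to find the project duration and the critical activities.

      # Minimal CPM forward/backward pass over a made-up activity network.
      # Each activity: (duration, list of predecessors).
      activities = {
          "define problem":  (2, []),
          "collect data":    (4, ["define problem"]),
          "analyse causes":  (3, ["collect data"]),
          "design solution": (5, ["analyse causes"]),
          "review":          (1, ["design solution"]),
      }

      # Forward pass: earliest start/finish (insertion order is topological here).
      es, ef = {}, {}
      for a, (dur, preds) in activities.items():
          es[a] = max((ef[p] for p in preds), default=0)
          ef[a] = es[a] + dur

      # Backward pass: latest start/finish.
      project_end = max(ef.values())
      lf, ls = {}, {}
      for a in reversed(list(activities)):
          succs = [s for s, (_, ps) in activities.items() if a in ps]
          lf[a] = min((ls[s] for s in succs), default=project_end)
          ls[a] = lf[a] - activities[a][0]

      critical = [a for a in activities if es[a] == ls[a]]   # zero-slack activities
      print("project duration:", project_end, "critical path:", critical)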

  15. Dealing with Complex and Ill-Structured Problems: Results of a Plan-Do-Check-Act Experiment in a Business Engineering Semester

    Science.gov (United States)

    Riis, Jens Ove; Achenbach, Marlies; Israelsen, Poul; Kyvsgaard Hansen, Poul; Johansen, John; Deuse, Jochen

    2017-01-01

    Challenged by increased globalisation and fast technological development, we carried out an experiment in the third semester of a global business engineering programme aimed at identifying conditions for training students in dealing with complex and ill-structured problems of forming a new business. As this includes a fuzzy front end, learning…

  16. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    International Nuclear Information System (INIS)

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-01-01

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  17. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Tahvili, Sahar [Mälardalen University (Sweden); Österberg, Jonas; Silvestrov, Sergei [Division of Applied Mathematics, Mälardalen University (Sweden); Biteus, Jonas [Scania CV (Sweden)

    2014-12-10

    One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  18. Role and capabilities of financial structures in development of fuel-power complex in Russia

    International Nuclear Information System (INIS)

    Nevzlin, L.B.; Kukin, N.V.

    1993-01-01

    The present-day problems of financing the enterprises of the fuel-power complex (FPC) in Russia are discussed. The causes of the FPC's difficult financial situation are analyzed. The forms and methods of financing investment activity, and the participation of financial structures in the stock-holding of FPC enterprises, which can improve the present situation, are shown.

  19. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  20. Complexity Analysis of Industrial Organizations Based on a Perspective of Systems Engineering Analysts

    Directory of Open Access Journals (Sweden)

    I. H. Garbie

    2011-12-01

    Complexity in industrial organizations has become more difficult to resolve and needs more attention from academics and practitioners. For these reasons, complexity in industrial organizations represents a new challenge for the coming decades. The analysis of industrial organization complexity remains a research topic of immense international interest, and organizations require a reduction in their complexity. In this paper, an analysis of complexity in industrial organizations is presented from the perspective of a systems engineering analyst. In this perspective, the analysis of complexity is divided into different levels, defined as complexity levels. A framework for analyzing these levels is proposed based on the complexity in industrial organizations. The analysis is divided into four main issues: industrial system vision, industrial system structure, industrial system operation, and industrial system evaluation. The analysis shows that the complexity of industrial organizations is still an ill-structured, multi-dimensional problem.

  1. Towards a theoretical framework for analyzing complex linguistic networks

    CERN Document Server

    Lücking, Andy; Banisch, Sven; Blanchard, Philippe; Job, Barbara

    2016-01-01

    The aim of this book is to advocate and promote network models of linguistic systems that are both based on thorough mathematical models and substantiated in terms of linguistics. In this way, the book contributes first steps towards establishing a statistical network theory as a theoretical basis of linguistic network analysis at the border of the natural sciences and the humanities. This book addresses researchers who want to become familiar with theoretical developments, computational models and their empirical evaluation in the field of complex linguistic networks. It is intended for all those who are interested in statistical models of linguistic systems from the point of view of network research. This includes all relevant areas of linguistics, ranging from phonological, morphological and lexical networks on the one hand to syntactic, semantic and pragmatic networks on the other. In this sense, the volume concerns readers from many disciplines such as physics, linguistics, computer science and information scien...

  2. Complexity analysis of accelerated MCMC methods for Bayesian inversion

    International Nuclear Information System (INIS)

    Hoang, Viet Ha; Schwab, Christoph; Stuart, Andrew M

    2013-01-01

    The Bayesian approach to inverse problems, in which the posterior probability distribution on an unknown field is sampled for the purposes of computing posterior expectations of quantities of interest, is starting to become computationally feasible for partial differential equation (PDE) inverse problems. Balancing the sources of error arising from finite-dimensional approximation of the unknown field, the PDE forward solution map and the sampling of the probability space under the posterior distribution are essential for the design of efficient computational Bayesian methods for PDE inverse problems. We study Bayesian inversion for a model elliptic PDE with an unknown diffusion coefficient. We provide complexity analyses of several Markov chain Monte Carlo (MCMC) methods for the efficient numerical evaluation of expectations under the Bayesian posterior distribution, given data δ. Particular attention is given to bounds on the overall work required to achieve a prescribed error level ε. Specifically, we first bound the computational complexity of ‘plain’ MCMC, based on combining MCMC sampling with linear complexity multi-level solvers for elliptic PDE. Our (new) work versus accuracy bounds show that the complexity of this approach can be quite prohibitive. Two strategies for reducing the computational complexity are then proposed and analyzed: first, a sparse, parametric and deterministic generalized polynomial chaos (gpc) ‘surrogate’ representation of the forward response map of the PDE over the entire parameter space, and, second, a novel multi-level Markov chain Monte Carlo strategy which utilizes sampling from a multi-level discretization of the posterior and the forward PDE. For both of these strategies, we derive asymptotic bounds on work versus accuracy, and hence asymptotic bounds on the computational complexity of the algorithms. In particular, we provide sufficient conditions on the regularity of the unknown coefficients of the PDE and on the
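
    The paper's setting (elliptic PDE forward maps, multi-level solvers, gpc surrogates) is far beyond a short snippet. Purely for orientation, the hedged sketch below runs a plain Metropolis random walk on a toy scalar inverse problem whose cheap analytic forward map stands in for an expensive PDE solve; the forward map, prior, noise level, and step size are all invented.

      import math
      import random

      # Toy inverse problem: recover k from noisy observations of u(k) = exp(-k),
      # standing in for an expensive PDE forward solve.
      def forward(k):
          return math.exp(-k)

      k_true, sigma = 1.3, 0.05
      random.seed(0)
      data = [forward(k_true) + random.gauss(0, sigma) for _ in range(20)]

      def log_posterior(k):
          if not 0.0 < k < 5.0:          # uniform prior on (0, 5)
              return -float("inf")
          misfit = sum((d - forward(k)) ** 2 for d in data)
          return -misfit / (2 * sigma ** 2)

      # Plain Metropolis random walk; every step costs one forward solve, which is
      # exactly the cost the paper's accelerated strategies try to reduce.
      chain, k = [], 2.5
      for _ in range(20000):
          proposal = k + random.gauss(0, 0.1)
          if math.log(random.random()) < log_posterior(proposal) - log_posterior(k):
              k = proposal
          chain.append(k)

      burned = chain[5000:]
      print("posterior mean of k:", round(sum(burned) / len(burned), 3), "(true value 1.3)")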

  3. Solution of problems of automation of elevator complex

    Directory of Open Access Journals (Sweden)

    V. S. Kudryashov

    2018-01-01

    The article is devoted to solving automation tasks in the development of the operator's workstation (AWP) for controlling a grain elevator with a capacity of 280 tons per hour, as part of the work of LLC "Intelligent Automation Complexes". Existing elevator complexes provide only grain transportation: there are no control systems for automatic grain drying with high-accuracy moisture measurement, and automatic generation of grain transportation routes is not provided (each route requires a technical specification and changes to the system's control program). At the same time, more precise regulation of grain flows is required (the automatic gates used have only "open/close" positions). The goals of elevator automation are: to reduce equipment downtime by tracking equipment operating time and the number of accidents and by informing the operator about equipment that is susceptible to failure; to reduce the time for setting up and servicing the elevator; to improve product quality; to decrease the percentage of rejects; and to decrease the influence of the human factor on the process. The paper provides a brief description of the proposed gate valve control algorithms, the automatic building of grain drying routes, the filtering of grain moisture readings, and fragments of the operator's workstation program (in InduSoft Web Studio) for controlling the elevator complex. The proposed solutions make it possible to reduce equipment downtime by 20% and the total service time of the complex, to separate insufficiently dried grain automatically for repeated drying, to improve product quality through automatic control of grain overheating, to reduce production waste by 3%, and to reduce the influence of the human factor on the process of grain transportation and drying.

  4. On the complexity of the balanced vertex ordering problem

    Directory of Open Access Journals (Sweden)

    Jan Kara

    2007-01-01

    We consider the problem of finding a balanced ordering of the vertices of a graph. More precisely, we want to minimise the sum, taken over all vertices v, of the difference between the number of neighbours to the left and to the right of v. This problem, which has applications in graph drawing, was recently introduced by Biedl et al. [Discrete Applied Math. 148:27-48, 2005]. They proved that the problem is solvable in polynomial time for graphs with maximum degree three, but NP-hard for graphs with maximum degree six. One of our main results closes the gap in these results by proving NP-hardness for graphs with maximum degree four. Furthermore, we prove that the problem remains NP-hard for planar graphs with maximum degree four and for 5-regular graphs. On the other hand, we introduce a polynomial time algorithm that determines whether there is a vertex ordering with total imbalance smaller than a fixed constant, and a polynomial time algorithm that determines whether a given multigraph with even degrees has an 'almost balanced' ordering.
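
    To make the objective concrete, the hedged sketch below computes the total imbalance of an ordering exactly as defined above and finds an optimal ordering of a small graph by exhaustive search; the brute force is only for illustration and is unrelated to the algorithms and hardness proofs of the paper.

      import itertools

      import networkx as nx

      def total_imbalance(G, order):
          # Sum over vertices of |#neighbours to the left - #neighbours to the right|.
          pos = {v: i for i, v in enumerate(order)}
          total = 0
          for v in order:
              left = sum(1 for u in G.neighbors(v) if pos[u] < pos[v])
              total += abs(left - (G.degree(v) - left))
          return total

      # Exhaustive search is only feasible for very small graphs; shown for clarity.
      G = nx.cubical_graph()   # 8 vertices, 3-regular
      best = min(itertools.permutations(G.nodes()), key=lambda o: total_imbalance(G, o))
      print("minimum total imbalance:", total_imbalance(G, best))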

  5. Self-Regulation in the Midst of Complexity: A Case Study of High School Physics Students Engaged in Ill-Structured Problem Solving

    Science.gov (United States)

    Milbourne, Jeffrey David

    The purpose of this dissertation study was to explore the experiences of high school physics students who were solving complex, ill-structured problems, in an effort to better understand how self-regulatory behavior mediated the project experience. Consistent with Voss, Green, Post, and Penner's (1983) conception of an ill-structured problem in the natural sciences, the 'problems' consisted of scientific research projects that students completed under the supervision of a faculty mentor. Zimmerman and Campillo's (2003) self-regulatory framework of problem solving provided a holistic guide to data collection and analysis of this multi-case study, with five individual student cases. The study's results are explored in two manuscripts, each targeting a different audience. The first manuscript, intended for the Science Education Research community, presents a thick, rich description of the students' project experiences, consistent with a qualitative, case study analysis. Findings suggest that intrinsic interest was an important self-regulatory factor that helped motivate students throughout their project work, and that the self-regulatory cycle of forethought, performance monitoring, and self-reflection was an important component of the problem-solving process. Findings also support the application of Zimmerman and Campillo's framework to complex, ill-structured problems, particularly the cyclical nature of the framework. Finally, this study suggests that scientific research projects, with the appropriate support, can be a mechanism for improving students' self-regulatory behavior. The second manuscript, intended for Physics practitioners, combines the findings of the first manuscript with the perspectives of the primary, on-site research mentor, who has over a decade's worth of experience mentoring students doing physics research. His experience suggests that a successful research experience requires certain characteristics, including: a slow, 'on-ramp' to the research

  6. Student Perception Problems in Using Historical Language: Semantic/Phonetic Connotation and Concept Loss

    Directory of Open Access Journals (Sweden)

    Erhan METİN

    2012-05-01

    Historical language matters for the advanced education of students, for understanding new and complex terminology to which we are not accustomed, and for following a language that has gradually become more complex. Awareness of historical language may be a product of learning history; in other words, historical language is an important factor in learning and teaching history. Words may have different references according to the context and situations in which they are used, and these may also change with the situations in which the listener and the speaker encounter them. Moreover, communication involves not only the use of individual symbols (words) but also whole structures of symbolic patterns (word groups and sentences). These larger groups have the same detailed relationships (related to reference areas) as the separate words. Part of the general problem of communication arises from the following fact: words, word groups, clauses and sentences are symbols which have specific references and reference areas for the speaker, but it is not certain that these references and reference areas are transferred to listeners. The aim of this study is to determine the problems stemming from language use in history classes by analyzing the relationship between history education and language, taking into account the students' perspective on that relationship. At the same time, the study aims to offer a unique model showing how the relationship between history education and language should be analyzed. The study is based on a descriptive research model. Both qualitative and quantitative research methods are used. First, qualitative data were collected and analyzed and findings were reached; then, during the experimental process designed on the basis of these findings, quantitative data were collected and analyzed and the results were obtained. While collecting the data

  7. Problem analysis of geotechnical well drilling in complex environment

    International Nuclear Information System (INIS)

    Kasenov, A K; Biletskiy, M T; Ratov, B T; Korotchenko, T V

    2015-01-01

    The article examines the primary causes of problems occurring during the drilling of geotechnical wells (injection, production and monitoring wells) for in-situ leaching to extract uranium in South Kazakhstan. The drilling problem of hole caving, which is caused by various chemical and physical factors (hydraulic, mechanical, etc.), has been thoroughly investigated. The analysis of packing causes has revealed that this problem usually occurs because of an insufficient amount of drilling mud, associated with a small cross-section downward flow and a relatively large cross-section upward flow. This is explained by the fact that when spear bores are used to drill clay rocks, the cutting size is usually rather large and there is a risk that clay particles will coagulate.

  8. [Pollution-ecological problems of old industrial and mining areas and future research prospects].

    Science.gov (United States)

    Zhou, Qixing

    2005-06-01

    Environmental pollution and the associated ecological problems of old industrial and mining areas have become a worldwide technological puzzle restricting sustainable economic and social development. However, the definition and scope of old industrial and mining areas, though an important concept, are still disputed. In this paper, the concept of the old industrial and mining area was discussed in theory. Proceeding from an analysis of the complexity of the current situation and the environmental pollution problems of old industrial and mining areas in China, particular attention was paid to the secondary pollution problems of such areas as an important frontier of science. On the basis of expounding the complexity and characteristics of environmental pollution in old industrial and mining areas, it was suggested that two key scientific problems in environmental sciences and ecology, namely the formation mechanisms and control technologies of secondary pollution in old industrial and mining areas and the responses of new-type diseases to environmental pollution based on molecular ecotoxicology, should be systematically studied on the national scale and become an important component of China's future environmental protection strategy.

  9. Fostering Creative Problem Solvers in Higher Education

    DEFF Research Database (Denmark)

    Zhou, Chunfang

    2016-01-01

    Recent studies have emphasized issues of social emergence based on thinking of societies as complex systems. The complexity of professional practice has been recognized as the root of challenges for higher education. To foster creative problem solvers is a key response of higher education in order to meet such challenges. This chapter aims to illustrate how to understand: 1) complexity as the nature of professional practice; 2) creative problem solving as the core skill in professional practice; 3) creativity as interplay between persons and their environment; 4) higher education as the context of fostering creative problem solvers; and 5) some innovative strategies such as Problem-Based Learning (PBL) and building a learning environment by Information Communication Technology (ICT) as potential strategies of creativity development. Accordingly, this chapter contributes to bridge the complexity

  10. System sight at a problem of efficiency of enterprises’s operaton of the Russian chemical complex

    Directory of Open Access Journals (Sweden)

    Svyatoslav Arkadyevich Nikitin

    2011-06-01

    The chemical industry plays an important role in the development of the domestic economy as one of the basic branches of Russia's economy, laying the foundation for its long-term and stable development. As a major supplier of raw materials, intermediates and products of various materials (plastics, chemical fibers, tires, paints and varnishes, dyes, fertilizers, feed additives, pharmaceuticals, medical equipment, etc.) to almost all sectors of industry, agriculture, health care, human services, commerce, science, culture and education, and the defense industry, the chemical complex has a direct impact on the efficiency of their operation and their development in new directions. Therefore, the condition and development of domestic chemistry determine the level of national competitiveness, economic growth and Russia's wealth. However, like most industries in Russia today, the chemical industry is going through a difficult period. A set of common economic problems (technological backwardness and a high level of depreciation, low innovation activity of domestic chemical enterprises, an ineffective investment process, infrastructure and resource constraints, etc.), as well as internal management problems, leaves Russian chemical products increasingly uncompetitive on the world market. Under these conditions, not only a radical adjustment of the internal control systems of chemical plants but also significant organizational and economic change is required. Thus, unless measures are taken to improve the domestic chemical industry in the coming years, almost all of it may fall behind and find itself struggling for survival.

  11. Probing the topological properties of complex networks modeling short written texts.

    Directory of Open Access Journals (Sweden)

    Diego R Amancio

    In recent years, graph theory has been widely employed to probe several language properties. More specifically, the so-called word adjacency model has proven useful for tackling several practical problems, especially those relying on textual stylistic analysis. The most common approach to treating texts as networks has simply considered either large pieces of texts or entire books. This approach has certainly worked well; many informative discoveries have been made this way. But it raises an uncomfortable question: could there be important topological patterns in small pieces of texts? To address this problem, the topological properties of subtexts sampled from entire books were probed. Statistical analyses performed on a dataset comprising 50 novels revealed that most of the traditional topological measurements are stable for short subtexts. When the performance of the authorship recognition task was analyzed, it was found that proper sampling yields a discriminability similar to the one found with full texts. Surprisingly, the support vector machine classification based on the characterization of short texts outperformed the one performed with entire books. These findings suggest that a local topological analysis of large documents might improve their global characterization. Most importantly, it was verified, as a proof of principle, that short texts can be analyzed with the methods and concepts of complex networks. As a consequence, the techniques described here can be extended in a straightforward fashion to analyze texts as time-varying complex networks.
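
    As a hedged illustration of the word adjacency model (not the authors' pipeline), the snippet below builds a word adjacency network from a short invented sentence with networkx and reports a couple of standard topological measurements; the tokenization and the chosen measures are simplifying assumptions.

      import networkx as nx

      def word_adjacency_network(text):
          # Nodes are word types; edges link words that appear next to each other.
          words = [w.strip(".,;:!?").lower() for w in text.split()]
          G = nx.Graph()
          G.add_edges_from(zip(words, words[1:]))
          return G

      sample = ("Complex networks can model short written texts and the topological "
                "properties of such networks remain informative even for short samples")
      G = word_adjacency_network(sample)

      print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
      print("average clustering:", round(nx.average_clustering(G), 3))
      print("average shortest path:", round(nx.average_shortest_path_length(G), 3))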

  12. Analysis of the spectrum of a Cartesian Perfectly Matched Layer (PML) approximation to acoustic scattering problems

    KAUST Repository

    Kim, Seungil

    2010-01-01

    In this paper, we study the spectrum of the operator which results when the Perfectly Matched Layer (PML) is applied in Cartesian geometry to the Laplacian on an unbounded domain. This is often thought of as a complex change of variables or "complex stretching." The reason that such an operator is of interest is that it can be used to provide a very effective domain truncation approach for approximating acoustic scattering problems posed on unbounded domains. Stretching associated with polar or spherical geometry leads to constant coefficient operators outside of a bounded transition layer, and so even though they are posed on unbounded domains, they (and their numerical approximations) can be analyzed by more standard compact perturbation arguments. In contrast, operators associated with Cartesian stretching are non-constant in unbounded regions and hence cannot be analyzed via a compact perturbation approach. Alternatively, to show that the scattering problem PML operator associated with Cartesian geometry is stable for real nonzero wave numbers, we show that the essential spectrum of the higher order part only intersects the real axis at the origin. This enables us to conclude stability of the PML scattering problem from a uniqueness result given in a subsequent publication. © 2009 Elsevier Inc. All rights reserved.

  13. A case study of analyzing 11th graders’ problem solving ability on heat and temperature topic

    Science.gov (United States)

    Yulianawati, D.; Muslim; Hasanah, L.; Samsudin, A.

    2018-05-01

    Problem-solving ability must be possessed by students after the process of physics learning so that physics concepts become meaningful. Consequently, this research aims to describe their problem-solving ability. Metacognition contributes to physics learning and to students' success in solving problems. The research was implemented with 37 science students (30 women and 7 men) of eleventh grade from one of the secondary schools in Bandung. The research method was a single case study with an embedded research design. The instrument is the Heat and Temperature Problem Solving Ability Test (HT-PSAT), which consists of twelve questions from three context problems. The result shows that the average score on the test is 8.27 out of a maximum total of 36. In conclusion, eleventh graders' problem-solving ability is still below what is expected. The implication of the findings is the need to create learning situations that are likely to develop students' problem-solving ability.

  14. Analyzing the BBOB results by means of benchmarking concepts.

    Science.gov (United States)

    Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C

    2015-01-01

    We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first one is: which algorithm is the "best" one? and the second one is: which algorithm should I use for my real-world problem? Both are connected and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step in answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.
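
    The paper's framework for aggregating per-problem rankings into a consensus is richer than any single rule, and it explicitly warns about the pitfalls of naive aggregation. Purely to make the aggregation step concrete, here is a hedged Borda-count sketch over invented rankings of hypothetical optimizers.

      # Invented per-problem rankings of hypothetical optimizers (best first).
      rankings = [
          ["cma-es", "de", "pso", "random-search"],
          ["de", "cma-es", "pso", "random-search"],
          ["cma-es", "pso", "de", "random-search"],
      ]

      # Borda count: on each problem an algorithm ranked r-th (0-based) among n
      # candidates scores n - 1 - r points; scores are summed over problems.
      scores = {}
      for ranking in rankings:
          n = len(ranking)
          for r, algorithm in enumerate(ranking):
              scores[algorithm] = scores.get(algorithm, 0) + (n - 1 - r)

      consensus = sorted(scores, key=scores.get, reverse=True)
      print("consensus ranking:", consensus)
      print("scores:", scores)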

  15. A complex approach to the blue-loop problem

    Science.gov (United States)

    Ostrowski, Jakub; Daszynska-Daszkiewicz, Jadwiga

    2015-08-01

    The problem of the blue loops during core helium burning, outstanding for almost fifty years, is one of the most difficult and poorly understood problems in stellar astrophysics. Most of the work on blue loops done so far has been performed with old stellar evolution codes and with limited computational resources, so the conclusions obtained were based on small samples of models and could not take into account more advanced effects and the interactions between them. The emergence of the blue loops depends on many details of the evolution calculations, in particular on chemical composition, opacity, mixing processes, etc. The non-linear interactions between these factors mean that in most cases it is hard to predict, without precise stellar modeling, whether a loop will emerge or not. The high sensitivity of the blue loops to even small changes in the internal structure of a star raises one more issue: sensitivity to numerical problems, which are common in calculations of stellar models at advanced stages of evolution. To tackle this problem we used the modern stellar evolution code MESA. We calculated a large grid of evolutionary tracks (about 8000 models) with masses in the range of 3.0 - 25.0 solar masses, from the zero age main sequence to the depletion of helium in the core. In order to make a comparative analysis, we varied the metallicity, helium abundance and different mixing parameters resulting from convective overshooting, rotation, etc. A better understanding of the properties of the blue loops is crucial for our knowledge of the population of blue supergiants and of pulsating variables such as Cepheids, α-Cygni or Slowly Pulsating B-type supergiants. In the case of more massive models it is also of great importance for studies of the progenitors of supernovae.

  16. An overview on polynomial approximation of NP-hard problems

    Directory of Open Access Journals (Sweden)

    Paschos Vangelis Th.

    2009-01-01

    The fact that a polynomial time algorithm is very unlikely to be devised for optimally solving NP-hard problems strongly motivates both researchers and practitioners to try to solve such problems heuristically, by making a trade-off between computational time and solution quality. In other words, heuristic computation consists of trying to find not the best solution but a solution which is 'close to' the optimal one in reasonable time. Among the classes of heuristic methods for NP-hard problems, polynomial approximation algorithms aim at solving a given NP-hard problem in polynomial time by computing feasible solutions that are, under some predefined criterion, as near to the optimal ones as possible. The polynomial approximation theory deals with the study of such algorithms. This survey first presents and analyzes polynomial time approximation algorithms for some classical examples of NP-hard problems. Secondly, it shows how classical notions and tools of complexity theory, such as polynomial reductions, can be matched with polynomial approximation in order to devise structural results for NP-hard optimization problems. Finally, it presents a quick description of what is commonly called inapproximability results. Such results provide limits on the approximability of the problems tackled.
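
    As a hedged, textbook-style illustration of the survey's subject (not an algorithm taken from the survey itself), the snippet below implements the classical 2-approximation for minimum vertex cover that takes both endpoints of every uncovered edge; the Petersen graph is used only as a convenient test instance.

      import networkx as nx

      def vertex_cover_2approx(G):
          # Take both endpoints of each edge not yet covered; the chosen edges form
          # a maximal matching, so the cover is at most twice the optimum size.
          cover = set()
          for u, v in G.edges():
              if u not in cover and v not in cover:
                  cover.update((u, v))
          return cover

      G = nx.petersen_graph()
      cover = vertex_cover_2approx(G)
      assert all(u in cover or v in cover for u, v in G.edges())
      print("approximate cover size:", len(cover), "(the optimum for the Petersen graph is 6)")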

  17. Subsynchronous Oscillation Problem Research in the UHVDC System of a Regional Power Grid in China

    Directory of Open Access Journals (Sweden)

    Qu Ying

    2016-01-01

    As grid structures become more and more complex and HVDC systems develop rapidly, studying the subsynchronous oscillation (SSO) problem in HVDC systems gains practical engineering significance. This paper studies the subsynchronous oscillation problem of generators near a ±800 kV UHVDC converter station and analyzes the possibility of subsynchronous oscillations through PSCAD/EMTDC simulation. Finally, although the thermal plants near the studied UHVDC link show no SSO risk, additional measures are needed to return the relevant generators to normal operation.

  18. Development of a Novel Cu(II) Complex Modified Electrode and a Portable Electrochemical Analyzer for the Determination of Dissolved Oxygen (DO) in Water

    Directory of Open Access Journals (Sweden)

    Salvatore Gianluca Leonardi

    2016-04-01

    The development of an electrochemical dissolved oxygen (DO) sensor based on a novel Cu(II) complex-modified screen printed carbon electrode is reported. The voltammetric behavior of the modified electrode was investigated at different scan rates and oxygen concentrations in PBS (pH = 7). An increase of the cathodic current (at about −0.4 V vs. Ag/AgCl) with the addition of oxygen was observed. The modified Cu(II) complex electrode was demonstrated for the determination of DO in water using chronoamperometry. A small-size, low-power-consumption, home-made portable electrochemical analyzer, based on custom electronics for sensor interfacing and operating in voltammetry and amperometry modes, has also been designed and fabricated. Its performance in the monitoring of DO in water was compared with that of a commercial analyzer.

  19. Communication complexity and information complexity

    Science.gov (United States)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result not only strengthens the lower bound on the communication complexity of disjointness by making it more exact, but also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information

  20. Evaluation of fine ceramics raw powders with particle size analyzers having different measuring principle and its problem

    International Nuclear Information System (INIS)

    Hayakawa, Osamu; Nakahira, Kenji; Tsubaki, Junichiro.

    1995-01-01

    Many kinds of analyzers based on various principles have been developed for measuring the particle size distribution of fine ceramics powders. However, the reproducibility of the results, the interchangeability between models, and the reliability of the ends of the measured distribution have not been investigated for each principle. In this paper, these important points for particle size analysis were clarified by measuring raw material powders of fine ceramics. (1) In the case of the laser diffraction and scattering method, the reproducibility within the same model is good; however, the interchangeability between different models is not so good, especially at the ends of the distribution. Submicron powders with a high refractive index show this tendency markedly. (2) The photo sedimentation method has some problems to be overcome, especially in measuring submicron powders with a high refractive index or flaky-shaped particles. The reproducibility of the X-ray sedimentation method is much better than that of photo sedimentation. (3) The light obscuration and electrical sensing zone methods show good reproducibility; however, interchangeability is sometimes degraded by calibration and other factors. (author)

  1. A novel multilayer model for missing link prediction and future link forecasting in dynamic complex networks

    Science.gov (United States)

    Yasami, Yasser; Safaei, Farshad

    2018-02-01

    The traditional complex network theory is particularly focused on network models in which all network constituents are dealt with equivalently, while failing to consider the supplementary information related to the dynamic properties of the network interactions. This is a main constraint leading to incorrect descriptions of some real-world phenomena or incomplete capture of the details of certain real-life problems. To cope with the problem, this paper addresses the multilayer aspects of dynamic complex networks by analyzing the properties of intrinsically multilayered co-authorship networks, DBLP and Astro Physics, and presenting a novel multilayer model of dynamic complex networks. The model examines the evolution of layers (layer birth/death processes and lifetimes) throughout the network evolution. Particularly, this paper models the evolution of each node's membership in different layers by an Infinite Factorial Hidden Markov Model considering feature cascade, and thereby formulates the link generation process for intra-layer and inter-layer links. Although adjacency matrices are useful to describe traditional single-layer networks, such a representation is not sufficient to describe and analyze multilayer dynamic networks. This paper also extends a generalized mathematical infrastructure to address the problems raised by multilayer complex networks. The model inference is performed using some Markov Chain Monte Carlo sampling strategies, given synthetic and real complex network data. Experimental results indicate a tremendous improvement in the performance of the proposed multilayer model in terms of sensitivity, specificity, positive and negative predictive values, positive and negative likelihood ratios, F1-score, Matthews correlation coefficient, and accuracy for two important applications of missing link prediction and future link forecasting. The experimental results also indicate the strong predictive power of the proposed model for the application of
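
    The evaluation metrics listed above are all standard functions of a binary confusion matrix. As a quick illustration (not the paper's code), the sketch below computes them from true and predicted link labels; the variable names are mine.

```python
import numpy as np

def link_prediction_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Standard binary-classification metrics from true/predicted link labels (0/1)."""
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))

    sens = tp / (tp + fn)               # sensitivity (recall)
    spec = tn / (tn + fp)               # specificity
    ppv = tp / (tp + fp)                # positive predictive value (precision)
    npv = tn / (tn + fn)                # negative predictive value
    mcc_den = np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": ppv,
        "npv": npv,
        "lr_plus": sens / (1 - spec),   # positive likelihood ratio
        "lr_minus": (1 - sens) / spec,  # negative likelihood ratio
        "f1": 2 * ppv * sens / (ppv + sens),
        "mcc": (tp * tn - fp * fn) / mcc_den,
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

# Toy labels for ten candidate links.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])
print(link_prediction_metrics(y_true, y_pred))
```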

  2. Solving applied mathematical problems with Matlab

    CERN Document Server

    Xue, Dingyu

    2008-01-01

    Computer Mathematics Language-An Overview. Fundamentals of MATLAB Programming. Calculus Problems. MATLAB Computations of Linear Algebra Problems. Integral Transforms and Complex Variable Functions. Solutions to Nonlinear Equations and Optimization Problems. MATLAB Solutions to Differential Equation Problems. Solving Interpolations and Approximations Problems. Solving Probability and Mathematical Statistics Problems. Nontraditional Solution Methods for Mathematical Problems.

  3. Performance Comparison of OpenMP, MPI, and MapReduce in Practical Problems

    Directory of Open Access Journals (Sweden)

    Sol Ji Kang

    2015-01-01

    Full Text Available With problem size and complexity increasing, several parallel and distributed programming models and frameworks have been developed to efficiently handle such problems. This paper briefly reviews the parallel computing models and describes three widely recognized parallel programming frameworks: OpenMP, MPI, and MapReduce. OpenMP is the de facto standard for parallel programming on shared memory systems. MPI is the de facto industry standard for distributed memory systems. The MapReduce framework has become the de facto standard for large-scale data-intensive applications. Qualitative pros and cons of each framework are known, but quantitative performance indexes help provide a clearer picture of which framework to use for a given application. As benchmark problems to compare the frameworks, two problems are chosen: the all-pairs-shortest-path problem and the data join problem. This paper presents the parallel programs for these problems implemented on the three frameworks, respectively, and shows the experimental results on a cluster of computers. It also discusses which is the right tool for the job by analyzing the characteristics and performance of the paradigms.
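
    As a point of reference for the first benchmark (not the paper's implementation), a serial Floyd-Warshall all-pairs-shortest-path kernel is sketched below; the triple loop over k, i, j is what the OpenMP, MPI, and MapReduce versions would parallelize in their respective ways.

```python
import math

def floyd_warshall(weights):
    """Serial all-pairs-shortest-path on an adjacency matrix (math.inf = no edge)."""
    n = len(weights)
    dist = [row[:] for row in weights]        # copy so the input is not modified
    for k in range(n):                        # intermediate vertex
        for i in range(n):
            dik = dist[i][k]
            for j in range(n):
                if dik + dist[k][j] < dist[i][j]:
                    dist[i][j] = dik + dist[k][j]
    return dist

INF = math.inf
graph = [
    [0,   3,   INF, 7  ],
    [8,   0,   2,   INF],
    [5,   INF, 0,   1  ],
    [2,   INF, INF, 0  ],
]
print(floyd_warshall(graph))
```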

  4. Problem Solving Reasoning and Problem Based Instruction in Geometry Learning

    Science.gov (United States)

    Sulistyowati, F.; Budiyono, B.; Slamet, I.

    2017-09-01

    This research aims to compare Problem Solving Reasoning (PSR) and Problem Based Instruction (PBI) with respect to problem solving and mathematical communication abilities, viewed from Self-Regulated Learning (SRL). Learning was given to 8th-grade junior high school students. This research uses a quasi-experimental method, followed by descriptive analysis. Data were analyzed using two-way multivariate analysis of variance (MANOVA) and one-way analysis of variance (ANOVA) with different cells. The data analysis showed that the learning model gives a different effect, the level of SRL gives the same effect, and there is no interaction between the learning model and SRL on problem solving and mathematical communication abilities. The t-test statistic was used to identify the more effective learning model. Based on the test, regardless of the level of SRL, PSR is more effective than PBI for problem-solving ability. The descriptive analysis showed that PSR has the advantage of creating learning that optimizes learners' reasoning ability when solving a mathematical problem. Consequently, PSR is a suitable learning model to apply in the classroom to improve learners' problem solving ability.

  5. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
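
    The Newton-Raphson treatment of nonlinear steady-state problems mentioned here can be illustrated on a tiny example (not SPAR itself, and using plain Newton with a numerical Jacobian as a simplified stand-in for the modified scheme): a 1-D steady heat residual with temperature-dependent conductivity, assuming k(T) = k0*(1 + beta*T) and invented parameter values.

```python
import numpy as np

def residual(T, q, k0=1.0, beta=0.01, h=0.1, T_left=0.0, T_right=0.0):
    """Residual of a 1-D steady heat equation with temperature-dependent
    conductivity k(T) = k0*(1 + beta*T); T holds the interior node values."""
    Tfull = np.concatenate(([T_left], T, [T_right]))
    R = np.empty_like(T)
    for i in range(1, len(Tfull) - 1):
        k_w = k0 * (1 + beta * 0.5 * (Tfull[i - 1] + Tfull[i]))   # west face conductivity
        k_e = k0 * (1 + beta * 0.5 * (Tfull[i] + Tfull[i + 1]))   # east face conductivity
        R[i - 1] = (k_w * (Tfull[i] - Tfull[i - 1])
                    - k_e * (Tfull[i + 1] - Tfull[i])) / h**2 - q
    return R

def newton_raphson(T0, q, tol=1e-10, max_iter=50):
    """Plain Newton-Raphson iteration with a finite-difference Jacobian."""
    T = T0.copy()
    for _ in range(max_iter):
        R = residual(T, q)
        if np.linalg.norm(R) < tol:
            break
        J = np.empty((len(T), len(T)))
        eps = 1e-6
        for j in range(len(T)):
            Tp = T.copy()
            Tp[j] += eps
            J[:, j] = (residual(Tp, q) - R) / eps
        T -= np.linalg.solve(J, R)
    return T

print(newton_raphson(np.zeros(5), q=100.0))   # converged interior temperatures
```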

  6. “Robots in Space” Multiagent Problem: Complexity, Information and Cryptographic Aspects

    Directory of Open Access Journals (Sweden)

    A. Yu. Bernstein

    2013-01-01

    Full Text Available We study a multiagent algorithmic problem that we call Robot in Space (RinS): there are n ≥ 2 autonomous robots that need to agree, without outside interference, on a distribution of shelters so that the straight paths to the shelters will not intersect. The problem is closely related to the assignment problem in graph theory, to the convex hull problem in combinatorial geometry, and to the path-planning problem in artificial intelligence. Our algorithm grew out of a local search solution of the problem suggested by E.W. Dijkstra. We present a multiagent, anonymous and scalable algorithm (protocol) solving the problem, give an upper bound for the algorithm, prove (manually) its correctness, and examine two communication aspects of the RinS problem — the informational and the cryptographic. We prove that (1) there is no protocol solving RinS that transfers only a bounded number of bits, and (2) we suggest a protocol that allows robots to check whether their paths intersect without revealing additional information about their relative positions (with respect to shelters). The present paper continues the research presented in Mars Robot Puzzle (a Multiagent Approach to the Dijkstra Problem) by E.V. Bodin, N.O. Garanina, and N.V. Shilov, published in Modeling and Analysis of Information Systems, 18(2), 2011.
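
    The geometric condition at the heart of RinS — whether two straight robot-to-shelter paths cross — is a standard segment-intersection test. The sketch below is my illustration using orientation predicates, not the paper's protocol, and it makes no attempt at the cryptographic aspect.

```python
def orient(p, q, r):
    """Sign of the cross product (q-p) x (r-p): >0 left turn, <0 right turn, 0 collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def on_segment(p, q, r):
    """True if the collinear point r lies within the bounding box of segment pq."""
    return (min(p[0], q[0]) <= r[0] <= max(p[0], q[0]) and
            min(p[1], q[1]) <= r[1] <= max(p[1], q[1]))

def segments_intersect(a, b, c, d):
    """True if segment ab intersects segment cd (including touching endpoints)."""
    o1, o2 = orient(a, b, c), orient(a, b, d)
    o3, o4 = orient(c, d, a), orient(c, d, b)
    if (o1 > 0) != (o2 > 0) and (o3 > 0) != (o4 > 0) and 0 not in (o1, o2, o3, o4):
        return True
    # Collinear / touching special cases.
    return ((o1 == 0 and on_segment(a, b, c)) or (o2 == 0 and on_segment(a, b, d)) or
            (o3 == 0 and on_segment(c, d, a)) or (o4 == 0 and on_segment(c, d, b)))

# Robot 1 goes from (0, 0) to shelter (4, 4); robot 2 from (0, 4) to shelter (4, 0).
print(segments_intersect((0, 0), (4, 4), (0, 4), (4, 0)))  # True: the paths cross
```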

  7. NASGRO 3.0: A Software for Analyzing Aging Aircraft

    Science.gov (United States)

    Mettu, S. R.; Shivakumar, V.; Beek, J. M.; Yeh, F.; Williams, L. C.; Forman, R. G.; McMahon, J. J.; Newman, J. C., Jr.

    1999-01-01

    Structural integrity analysis of aging aircraft is a critical necessity in view of the increasing numbers of such aircraft in general aviation, the airlines and the military. Efforts are in progress by NASA, the FAA and the DoD to focus attention on aging aircraft safety. The present paper describes the NASGRO software which is well-suited for effectively analyzing the behavior of defects that may be found in aging aircraft. The newly revised Version 3.0 has many features specifically implemented to suit the needs of the aircraft community. The fatigue crack growth computer program NASA/FLAGRO 2.0 was originally developed to analyze space hardware such as the Space Shuttle, the International Space Station and the associated payloads. Due to popular demand, the software was enhanced to suit the needs of the aircraft industry. Major improvements in Version 3.0 are the incorporation of the ability to read aircraft spectra of unlimited size, generation of common aircraft fatigue load blocks, and the incorporation of crack-growth models which include load-interaction effects such as retardation due to overloads and acceleration due to underloads. Five new crack-growth models, viz., generalized Willenborg, modified generalized Willenborg, constant closure model, Walker-Chang model and the deKoning-Newman strip-yield model, have been implemented. To facilitate easier input of geometry, material properties and load spectra, a Windows-style graphical user interface has been developed. Features to quickly change the input and rerun the problem as well as examine the output are incorporated. NASGRO has been organized into three modules, the crack-growth module being the primary one. The other two modules are the boundary element module and the material properties module. The boundary-element module provides the ability to model and analyze complex two-dimensional problems to obtain stresses and stress-intensity factors. The material properties module allows users to store and
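
    NASGRO's crack-growth models with load-interaction effects are far more elaborate than can be shown here; as a hedged, minimal stand-in, the sketch below integrates the classical Paris law da/dN = C (ΔK)^m cycle by cycle for a center crack in a wide plate (ΔK ≈ Δσ √(πa)), with invented material constants for illustration only.

```python
import math

def paris_law_growth(a0, delta_sigma, C, m, a_crit, max_cycles=10_000_000):
    """Cycle-by-cycle integration of da/dN = C * (dK)^m for a center crack in a
    wide plate, dK = delta_sigma * sqrt(pi * a). Units: MPa and metres."""
    a, n = a0, 0
    while a < a_crit and n < max_cycles:
        dK = delta_sigma * math.sqrt(math.pi * a)   # stress-intensity range
        a += C * dK ** m                            # crack growth in this cycle
        n += 1
    return n, a

# Hypothetical aluminium-like constants (illustrative only).
cycles, a_final = paris_law_growth(a0=0.001, delta_sigma=100.0,
                                   C=1e-11, m=3.0, a_crit=0.025)
print(f"{cycles} cycles to grow from 1 mm to {a_final * 1000:.1f} mm")
```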

  8. Analyzing scheduling in the food-processing industry

    DEFF Research Database (Denmark)

    Akkerman, Renzo; van Donk, Dirk Pieter

    2009-01-01

    Production scheduling has been widely studied in several research areas, resulting in a large number of methods, prescriptions, and approaches. However, the impact on scheduling practice seems relatively low. This is also the case in the food-processing industry, where industry-specific characteristics induce specific and complex scheduling problems. Based on ideas about decomposition of the scheduling task and the production process, we develop an analysis methodology for scheduling problems in food processing. This combines an analysis of structural (technological) elements of the production process with an analysis of the tasks of the scheduler. This helps to understand, describe, and structure scheduling problems in food processing, and forms a basis for improving scheduling and applying methods developed in literature. It also helps in evaluating the organisational structures

  9. Analyzing Interpersonal Problem Solving in Terms of Solution Focused Approach and Humor Styles of University Student

    Science.gov (United States)

    Koc, Hayri; Arslan, Coskun

    2017-01-01

    In this study, university students' interpersonal problem solving approaches were investigated in terms of solution focused approach and humor styles. The participants were 773 (542 female and 231 male, between 17-33 years old) university students. To determine the university students' problem solving approaches, the "Interpersonal Problem Solving…

  10. Leadership and leadership development in healthcare settings - a simplistic solution to complex problems?

    Science.gov (United States)

    McDonald, Ruth

    2014-10-01

    There is a trend in health systems around the world to place great emphasis on and faith in improving 'leadership'. Leadership has been defined in many ways and the elitist implications of traditional notions of leadership sit uncomfortably with modern healthcare organisations. The concept of distributed leadership incorporates inclusivity, collectiveness and collaboration, with the result that, to some extent, all staff, not just those in senior management roles, are viewed as leaders. Leadership development programmes are intended to equip individuals to improve leadership skills, but we know little about their effectiveness. Furthermore, the content of these programmes varies widely and the fact that many lack a sense of how they fit with individual or organisational goals raises questions about how they are intended to achieve their aims. It is important to avoid simplistic assumptions about the ability of improved leadership to solve complex problems. It is also important to evaluate leadership development programmes in ways that go beyond descriptive accounts.

  11. Sleep, Cognition, and Behavioral Problems in School-Age Children: A Century of Research Meta-Analyzed

    NARCIS (Netherlands)

    Astill, R.G.; van der Heijden, K.B.; van IJzendoorn, M.H.; van Someren, E.J.W.

    2012-01-01

    Clear associations of sleep, cognitive performance, and behavioral problems have been demonstrated in meta-analyses of studies in adults. This meta-analysis is the first to systematically summarize all relevant studies reporting on sleep, cognition, and behavioral problems in healthy school-age

  12. Some problems of treating acute arterial obstruction using hyperbaric oxygenation ultraviolet irradiation of blood and hemosorption

    International Nuclear Information System (INIS)

    Karyakin, A.M.

    1988-01-01

    The current state of the problem of acute arterial obstruction (AAO) is considered, and clinical observations of patients with acute emboli and thrombosis of the abdominal aorta and the main arteries of the lower extremities are analyzed. A complex of detoxication therapy and measures for controlling AAO in patients during the postoperative period is presented. Combined application of hemosorption, ultraviolet irradiation of autologous blood, and hyperbaric oxygenation made it possible to correct some indices. Reliable evidence of the high therapeutic efficiency of the combined application of these methods in patients with reconstructed blood flow is presented. 38 refs.; 3 tabs

  13. Selected problems in experimental intermediate energy physics

    International Nuclear Information System (INIS)

    Mayes, B.W.; Hungerford, E.V.; Pinsky, L.S.

    1990-09-01

    The objectives of this research program are to: investigate forefront problems in experimental intermediate energy physics; educate students in this field of research; and develop the instrumentation necessary to undertake this experimental program. Generally, the research is designed to search for physical processes which cannot be explained by conventional models of elementary interactions. This includes the use of nuclear targets, where the nucleus provides a many-body environment that strongly perturbs a known interaction. Unfortunately, such effects may be masked by the complexity of the many-body problem and may be difficult to observe. Therefore, experiments must be carefully chosen and analyzed for deviations from the more conventional models. There were three major thrusts of the program: strange particle physics, where a strange quark is embedded in the nuclear medium; muon electro-weak decay, which involves a search for a violation of the standard model of the electro-weak interaction; and measurement of the spin-dependent structure function of the neutron

  14. "What constitutes a 'problem'?" Producing 'alcohol problems' through online counselling encounters.

    Science.gov (United States)

    Savic, Michael; Ferguson, Nyssa; Manning, Victoria; Bathish, Ramez; Lubman, Dan I

    2017-08-01

    Typically, health policy, practice and research views alcohol and other drug (AOD) 'problems' as objective things waiting to be detected, diagnosed and treated. However, this approach to policy development and treatment downplays the role of clinical practices, tools, discourses, and systems in shaping how AOD use is constituted as a 'problem'. For instance, people might present to AOD treatment with multiple psycho-social concerns, but usually only a singular AOD-associated 'problem' is considered serviceable. As the assumed nature of 'the serviceable problem' influences what treatment responses people receive, and how they may come to be enacted as 'addicted' or 'normal' subjects, it is important to subject clinical practices of problem formulation to critical analysis. Given that the reach of AOD treatment has expanded via the online medium, in this article we examine how 'problems' are produced in online alcohol counselling encounters involving people aged 55 and over. Drawing on poststructural approaches to problematisation, we not only trace how and what 'problems' are produced, but also what effects these give rise to. We discuss three approaches to problem formulation: (1) Addiction discourses at work; (2) Moving between concerns and alcohol 'problems'; (3) Making 'problems' complex and multiple. On the basis of this analysis, we argue that online AOD counselling does not just respond to pre-existing 'AOD problems'. Rather, through the social and clinical practices of formulation at work in clinical encounters, online counselling also produces them. Thus, given a different set of circumstances, practices and relations, 'problems' might be defined or emerge differently-perhaps not as 'problems' at all or perhaps as different kinds of concerns. We conclude by highlighting the need for a critical reflexivity in AOD treatment and policy in order to open up possibilities for different ways of engaging with, and responding to, people's needs in their complexity

  15. Effective algorithm for solving complex problems of production control and of material flows control of industrial enterprise

    Science.gov (United States)

    Mezentsev, Yu A.; Baranova, N. V.

    2018-05-01

    A universal economic-mathematical model designed to determine optimal strategies for managing subsystems (components of subsystems) of production and logistics of enterprises is considered. The declared universality makes it possible to take into account at the system level both production components, including limitations on the ways of converting raw materials and components into sold goods, and resource and logical restrictions on input and output material flows. The presented model and the generated control problems are developed within the framework of a unified approach that allows one to implement logical conditions of any complexity and to define the corresponding formal optimization tasks. The conceptual meaning of the criteria and limitations used is explained. The generated mixed-programming problems are shown to belong to the class NP. An approximate polynomial algorithm is proposed for solving the posed mixed-programming optimization problems of realistic dimension and high computational complexity. Results of testing the algorithm on problems in a wide range of dimensions are presented.
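
    As a generic illustration of the mixed (discrete-continuous) programming class referred to here — not the authors' actual model — a production-planning problem with logical on/off decisions can be written as follows; all symbols are generic placeholders.

```latex
\begin{aligned}
\min_{x,\,y}\quad & \sum_{j} c_j x_j + \sum_{j} f_j y_j
  && \text{(variable production cost + fixed setup cost)}\\
\text{s.t.}\quad & \sum_{j} a_{ij} x_j \le b_i, && i \in \text{resources (capacity restrictions)},\\
 & x_j \le M_j\, y_j, && j \in \text{products (logical link: produce only if set up)},\\
 & x_j \ge 0,\; y_j \in \{0,1\}, && j \in \text{products}.
\end{aligned}
```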

  16. Analyzing complex wake-terrain interactions and its implications on wind-farm performance.

    Science.gov (United States)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

    Rotating wind turbine blades generate complex wakes involving vortices (helical tip vortex, root vortex, etc.). These wakes are regions of high velocity deficits and high turbulence intensities, and they tend to degrade the performance of downstream turbines. Hence, a conservative inter-turbine distance of up to 10 times the turbine diameter (10D) is sometimes used in wind-farm layout (particularly in cases of flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In the case of complex terrain, within a short distance (say 10D) the nearby terrain can rise in altitude and be high enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially smaller eddies generated by small ridges within the terrain), or (b) directly, by obstructing the wake region partially or fully in its flow path. Hence, enhanced understanding of wake development due to wake-terrain interaction will help in wind farm design. To this end the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on vortex-terrain interaction have shown a much faster decay of the vortex for complex terrain compared to flatter terrain. The potential reasons identified to explain the observation are (a) the formation of secondary vortices in the flow and their interaction with the primary vortex, and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence. The implications of
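
    For context on why inter-turbine spacing matters, a common engineering shorthand (not the vortex-resolving CFD approach of this study) is the Jensen/Park wake model, in which the fractional velocity deficit behind a rotor of diameter D with thrust coefficient Ct decays with downstream distance x as (1 − √(1 − Ct)) / (1 + 2kx/D)² for a wake-decay constant k. The values below are illustrative.

```python
import math

def jensen_deficit(x, D=100.0, Ct=0.8, k=0.05):
    """Fractional velocity deficit at downstream distance x (m) behind the rotor,
    Jensen/Park top-hat wake model with wake-decay constant k."""
    return (1.0 - math.sqrt(1.0 - Ct)) / (1.0 + 2.0 * k * x / D) ** 2

# Deficit at 3D, 5D and 10D downstream for a 100 m rotor.
for spacing in (3, 5, 10):
    x = spacing * 100.0
    print(f"{spacing:>2}D: {jensen_deficit(x):.1%} velocity deficit")
```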

  17. Noise problems in coal mining complex- a case discussion

    International Nuclear Information System (INIS)

    Mishra, Y.; Mitra, H.; Ghosh, S.; Pal, A.K.

    1996-01-01

    A noise monitoring study was conducted at the Moonidih mining complex of the Jharia coal-field. The study included monitoring and analysis of ambient as well as workplace noise levels. An attempt has been made to critically analyse the noise situation through octave band analysis, thereby identifying alarming noise frequencies for each piece of noise-generating equipment having an Leq level of more than 90 dBA. A noise model has also been developed to draw noise contours of the entire mining complex. Based on these studies, suitable control measures have been suggested. (author). 6 refs., 3 figs.
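
    The equivalent continuous sound level Leq used for the 90 dBA criterion here is a standard energy average of sound pressure levels over time, Leq = 10·log10((1/N)·Σ 10^(Li/10)). A minimal sketch with made-up readings (my illustration, not the study's data):

```python
import math

def leq(spl_samples_dba):
    """Equivalent continuous sound level from A-weighted SPL readings (dBA),
    assuming equal time weighting for every sample."""
    mean_energy = sum(10 ** (L / 10.0) for L in spl_samples_dba) / len(spl_samples_dba)
    return 10.0 * math.log10(mean_energy)

# Hypothetical readings near a piece of mining equipment, one per minute.
readings = [88.0, 91.5, 93.0, 89.5, 94.0, 90.0]
print(f"Leq = {leq(readings):.1f} dBA")   # flags equipment above the 90 dBA criterion
```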

  18. Problem statement for optimal design of steel structures

    Directory of Open Access Journals (Sweden)

    Ginzburg Aleksandr Vital'evich

    2014-07-01

    Full Text Available The presented article considers the following complex of tasks. The main stages of the life cycle of a building structure are described, with an indication of process inputs and outputs. Requirements imposed on steel structures are considered. The optimum range of application for steel structures is specified, as well as the merits and shortcomings of the material. The nomenclature of metal structures is listed and a block diagram is constructed. Possible optimality criteria for steel structures, offered by various authors for various types of structures, are considered. It is established that the criterion of minimum structural mass is most often accepted as the optimality criterion; more rarely, a minimum of reduced expenses or a minimum of in-service cost is used. In the present article special attention is paid to the form of the objective function of the optimization problem. It is also established that, depending on the accepted optimality criterion, different types of functions may be used; the complexity of the objective function depends on how completely the optimality criterion is applied. The authors consider the following objective functions: the mass of the main element of a structure; an objective function based on the criterion of factory cost; and an objective function based on the criterion of in-service cost. These examples show that objective functions based on the criteria of labor expenses for fabrication are generally non-linear, which complicates solving the optimization problem. Another important factor influencing the optimal design of steel structures, which is analyzed here, is accounting for operating restrictions; eight groups of restrictions are analyzed. Attempting to account completely for the parameters of the objective function under a particular optimality criterion, while taking into account all operating restrictions, considerably complicates the design problem. For solving this
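
    As a toy counterpart to mass minimization under operating restrictions (not the formulation discussed in the article), the sketch below minimizes the mass of a hypothetical two-bar layout subject to simple stress and size constraints using scipy.optimize.minimize; the lengths, loads, and constraint forms are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

RHO = 7850.0                        # steel density, kg/m^3
LENGTHS = np.array([2.0, 3.0])      # bar lengths, m (hypothetical layout)
FORCES = np.array([50e3, 80e3])     # axial forces, N (hypothetical loads)
SIGMA_ALLOW = 160e6                 # allowable stress, Pa

def mass(areas):
    """Objective: total mass of the two bars for cross-section areas in m^2."""
    return RHO * np.dot(LENGTHS, areas)

def stress_margin(areas):
    """Inequality constraints (must be >= 0): allowable stress minus actual stress."""
    return SIGMA_ALLOW - FORCES / areas

result = minimize(
    mass,
    x0=np.array([1e-3, 1e-3]),                     # initial guess, m^2
    bounds=[(1e-5, 1e-2)] * 2,                     # practical size limits
    constraints=[{"type": "ineq", "fun": stress_margin}],
)
print(result.x, f"mass = {mass(result.x):.1f} kg")
```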

  19. Problem solving using soft systems methodology.

    Science.gov (United States)

    Land, L

    This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.

  20. Effects of intense ultraviolet radiation on electrostatic energy analyzers

    International Nuclear Information System (INIS)

    Mathew, J.; Jennings, W.C.; Hickok, R.L.; Connor, K.A.; Schoch, P.M.; Hallock, G.A.

    1984-01-01

    Intense ultraviolet radiation from the plasma poses a significant problem for the implementation of heavy ion beam probe diagnostic systems on fusion-oriented confinement devices. The radiation enters the electrostatic energy analyzer used to detect secondary ions, resulting in both a distortion of the electric field inside the analyzer and noise generation in the detector channels. Data acquisition procedures and mechanical design techniques have been developed to significantly reduce these effects. We have also been successful in modelling the electric field distortion and have developed a data correction procedure based on this model. Methods for approaching the problems anticipated in future devices are also suggested

  1. Application of the random phase approximation to complex problems in materials science

    International Nuclear Information System (INIS)

    Schimka, L.

    2012-01-01

    This thesis is devoted to the assessment and application of the random phase approximation (RPA) in the adiabatic-connection fluctuation-dissipation (ACFD) framework in solid state physics. The first part presents a review of density functional theory (DFT) and the ACFD theorem in the RPA. This includes an introduction to the many-body problem as well as a description of the implementation of the RPA in the Vienna Ab-initio Simulation Package (VASP). In the results part, the quality of the RPA is assessed and its performance compared to three (beyond) DFT functionals. The experimental values are corrected for the effect of phonon zero-point vibrational energies, which were calculated ab initio at the DFT level. We find that the RPA describes all bonding situations very accurately, making it a promising candidate for more complex problems in solid state physics. In light of these findings, we investigate the carbon-water interaction in two specific cases: the adsorption of water on benzene and the adsorption of water on a graphene layer. We compare our results to a different correlated method: diffusion Monte Carlo (DMC). We find very good agreement and thus believe that our values can serve as a benchmark for the development of other DFT functionals to treat water-carbon interfaces. The highlight of this thesis is the successful application of the RPA to the long-standing and (at the DFT level) unsolved CO adsorption puzzle. We show results for CO adsorption on Cu, late 4d metals and Pt. RPA is at present the only ab-initio method that describes adsorption and surface energies accurately at the same time and predicts the correct adsorption site in every single case. (author)

  2. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    Science.gov (United States)

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Preliminary Study of 2-D Time Domain Electromagnetic (TDEM) Modeling to Analyze Subsurface Resistivity Distribution and its Application to the Geothermal Systems

    Science.gov (United States)

    Aji Hapsoro, Cahyo; Purqon, Acep; Srigutomo, Wahyu

    2017-07-01

    2-D time domain electromagnetic (TDEM) modeling has been successfully conducted to illustrate the distribution of the electric field beneath the Earth's surface. The electric field, compared with the magnetic field, is used to analyze resistivity, and resistivity is one of the physical properties most important for determining the reservoir potential of geothermal systems, one of the renewable energy sources. In this modeling we used the time domain electromagnetic method because it can solve EM field interaction problems with complex geometry and analyze transient problems. TDEM methods are used to model the electric and magnetic fields as functions of time combined with distance and depth. The result of this modeling is the electric field intensity, which is capable of describing the structure of the Earth's subsurface. The result of this modeling can be applied to describe the Earth's subsurface resistivity values and thus determine the reservoir potential of geothermal systems.

  4. Adjustment problems and residential care environment

    Directory of Open Access Journals (Sweden)

    Jan Sebastian Novotný

    2015-01-01

    Full Text Available Problem: The residential care environment represents a specific social space that is associated with a number of negative consequences, covering most aspects of children's and youths' functioning. The paper analyzes the presence of adjustment problems among adolescents from the institutional care environment and compares these results with a population of adolescents who grew up in a family. Methods: The sample consisted of two groups of adolescents. The first group included 285 adolescents currently growing up in a residential care environment, aged 13 to 21 (M = 16.23, SD = 1.643). The second group consisted of 214 adolescents growing up in a family, aged 15 to 20 (M = 17.07, SD = 1.070). We used the Youth Self Report questionnaire. Data were analyzed using descriptive statistics and MANOVA. Results: Results showed that adolescents in residential care exhibit higher average values in all adjustment problems. Also, in the context of diagnostic categories, the residential care adolescents fall more frequently in the non-normal range (borderline and clinical), primarily in the borderline range. The greatest differences were reflected in Thought problems and Rule-breaking behavior. MANOVA showed a significant multivariate effect between groups of adolescents, Hotelling's T = .803, F(8, 490) = 49.202, p < .001, d = .445 (large effect). Univariate analysis further showed a significant effect for Withdrawn/depressed (p = .044, d = .089, small effect), Somatic complaints (p = .002, d = .139, medium effect), Social problems (p = .004, d = .127, small effect), Thought problems (p < .001, d = .633, strong effect), Attention problems (p < .001, d = .320, strong effect), Rule-breaking behavior (p < .001, d = .383, strong effect), and Aggressive behavior (p = .015, d = .110, small effect). Results for the dimension Anxious/depressed were not significant (p = .159). Discussion: The results did not confirm the assumption that more than 30% of residential care adolescents have adjustment

  5. Using Problem-Based Learning in Accounting

    Science.gov (United States)

    Hansen, James D.

    2006-01-01

    In this article, the author describes the process of writing a problem-based learning (PBL) problem and shows how a typical end-of-chapter accounting problem can be converted to a PBL problem. PBL uses complex, real-world problems to motivate students to identify and research the concepts and principles they need to know to solve these problems.…

  6. Complex centers of polynomial differential equations

    Directory of Open Access Journals (Sweden)

    Mohamad Ali M. Alwash

    2007-07-01

    Full Text Available We present some results on the existence and nonexistence of centers for polynomial first order ordinary differential equations with complex coefficients. In particular, we show that binomial differential equations without linear terms do not have complex centers. Classes of polynomial differential equations, with more than two terms, are presented that do not have complex centers. We also study the relation between complex centers and the Pugh problem. An algorithm is described to solve the Pugh problem for equations without complex centers. The method of proof involves phase plane analysis of the polar equations and a local study of periodic solutions.

  7. Framework based on communicability and flow to analyze complex network dynamics

    Science.gov (United States)

    Gilson, M.; Kouvaris, N. E.; Deco, G.; Zamora-López, G.

    2018-05-01

    Graph theory constitutes a widely used and established field providing powerful tools for the characterization of complex networks. The intricate topology of networks can also be investigated by means of the collective dynamics observed in the interactions of self-sustained oscillations (synchronization patterns) or propagation-like processes such as random walks. However, networks are often inferred from real data forming dynamic systems, which are different from those employed to reveal their topological characteristics. This stresses the necessity for a theoretical framework dedicated to the mutual relationship between the structure and dynamics in complex networks, as the two sides of the same coin. Here we propose a rigorous framework based on the network response over time (i.e., the Green function) to study interactions between nodes across time. For this purpose we define the flow that describes the interplay between the network connectivity and external inputs. This multivariate measure relates to the concepts of graph communicability and the map equation. We illustrate our theory using the multivariate Ornstein-Uhlenbeck process, which describes stable and non-conservative dynamics, but the formalism can be adapted to other local dynamics for which the Green function is known. We provide applications to classical network examples, such as small-world ring and hierarchical networks. Our theory defines a comprehensive framework that is canonically related to directed and weighted networks, thus paving a way to revise the standards for network analysis, from the pairwise interactions between nodes to the global properties of networks, including community detection.
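
    For the multivariate Ornstein-Uhlenbeck case mentioned here, the network response (Green function) to an impulse is the matrix exponential of the Jacobian. A minimal sketch in my own notation (not the authors' code), for dx/dt = J x + input with J built from a connectivity matrix C and a leak time constant tau:

```python
import numpy as np
from scipy.linalg import expm

def ou_green_function(C, tau, t):
    """Network response e^{J t} of a multivariate Ornstein-Uhlenbeck process
    dx/dt = J x + input, with Jacobian J = -I/tau + C (C[i, j]: weight j -> i)."""
    n = C.shape[0]
    J = -np.eye(n) / tau + C
    return expm(J * t)

# Toy 3-node chain 0 -> 1 -> 2 with leak time constant 1.
C = np.array([[0.0, 0.0, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.5, 0.0]])
for t in (1.0, 2.0, 4.0):
    G = ou_green_function(C, tau=1.0, t=t)
    print(f"t={t}: response of node 2 to an impulse at node 0 = {G[2, 0]:.4f}")
```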

  8. A case-based, problem-based learning approach to prepare master of public health candidates for the complexities of global health.

    Science.gov (United States)

    Leon, Juan S; Winskell, Kate; McFarland, Deborah A; del Rio, Carlos

    2015-03-01

    Global health is a dynamic, emerging, and interdisciplinary field. To address current and emerging global health challenges, we need a public health workforce with adaptable and collaborative problem-solving skills. In the 2013-2014 academic year, the Hubert Department of Global Health at the Rollins School of Public Health-Emory University launched an innovative required core course for its first-year Master of Public Health students in the global health track. The course uses a case-based, problem-based learning approach to develop global health competencies. Small teams of students propose solutions to these problems by identifying learning issues and critically analyzing and synthesizing new information. We describe the course structure and logistics used to apply this approach in the context of a large class and share lessons learned.

  9. Analyzing State Security Risks in South China Sea Conflict

    Directory of Open Access Journals (Sweden)

    Дмитрий Владимирович Пивоваров

    2009-09-01

    Full Text Available The article is devoted to regional security issues in South East Asia. The author analyses international relations that are closely tied to foreign policy and foreign-policy strategy problems. The author proposes risk analysis as a new and promising method in political science for generating foreign policy plans and analyzing international conflicts and problems.

  10. Assessing Student Written Problem Solutions: A Problem-Solving Rubric with Application to Introductory Physics

    Science.gov (United States)

    Docktor, Jennifer L.; Dornfeld, Jay; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Jackson, Koblar Alan; Mason, Andrew; Ryan, Qing X.; Yang, Jie

    2016-01-01

    Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic…

  11. Complex concentrate pretreatment

    International Nuclear Information System (INIS)

    Lokken, R.O.; Scheele, R.D.; Strachan, D.M.; Toste, A.P.

    1991-03-01

    After removal of the transuranics (TRU) by the TRUEX process, complex concentrate waste will be grouted for final storage. The purpose of this project, conducted at the Pacific Northwest Laboratory, is to support a future decision to grout the complexant waste without destruction of the organic contents. It has been demonstrated that grouts with acceptable parameters for the Transportable Grout Facility can be made using actual waste. The acceptability of these grouts from a regulatory view seems to be less of a problem than previously assumed. None of the organics found in the waste have been found on the EPA hazardous chemicals list. Two potential problems with the processing of the complex concentrate wastes were identified during the use of the TRUEX process on samples of several milliliters. One was the amount of foam that is generated during acid addition to the alkaline waste. Some of this foam appears to be of a waxy nature but does redissolve when the waste is strongly acidic. The second potential problem is that noticeable amounts of NOx gases are generated. No quantitative measure of the NOx gas generation was made. The problem relates to processing the waste in B-plant, where there are no facilities to handle NOx gases. 5 refs., 4 figs., 4 tabs

  12. Parallel Optimization of Polynomials for Large-scale Problems in Stability and Control

    Science.gov (United States)

    Kamyar, Reza

    In this thesis, we focus on some of the NP-hard problems in control theory. Thanks to the converse Lyapunov theory, these problems can often be modeled as optimization over polynomials. To avoid the problem of intractability, we establish a trade off between accuracy and complexity. In particular, we develop a sequence of tractable optimization problems --- in the form of Linear Programs (LPs) and/or Semi-Definite Programs (SDPs) --- whose solutions converge to the exact solution of the NP-hard problem. However, the computational and memory complexity of these LPs and SDPs grow exponentially with the progress of the sequence - meaning that improving the accuracy of the solutions requires solving SDPs with tens of thousands of decision variables and constraints. Setting up and solving such problems is a significant challenge. The existing optimization algorithms and software are only designed to use desktop computers or small cluster computers --- machines which do not have sufficient memory for solving such large SDPs. Moreover, the speed-up of these algorithms does not scale beyond dozens of processors. This in fact is the reason we seek parallel algorithms for setting-up and solving large SDPs on large cluster- and/or super-computers. We propose parallel algorithms for stability analysis of two classes of systems: 1) Linear systems with a large number of uncertain parameters; 2) Nonlinear systems defined by polynomial vector fields. First, we develop a distributed parallel algorithm which applies Polya's and/or Handelman's theorems to some variants of parameter-dependent Lyapunov inequalities with parameters defined over the standard simplex. The result is a sequence of SDPs which possess a block-diagonal structure. We then develop a parallel SDP solver which exploits this structure in order to map the computation, memory and communication to a distributed parallel environment. Numerical tests on a supercomputer demonstrate the ability of the algorithm to
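
    The connection between stability and optimization invoked here can be seen in its simplest form: for a linear system dx/dt = A x, searching for a quadratic Lyapunov function V(x) = xᵀPx reduces to a small semidefinite program. The sketch below is a standard textbook LMI solved with cvxpy, far from the large-scale Polya/Handelman machinery of the thesis; the test matrix is invented.

```python
import numpy as np
import cvxpy as cp

# A stable test matrix (eigenvalues with negative real parts).
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
n = A.shape[0]

# Find P = P^T > 0 with A^T P + P A < 0 (a feasibility SDP).
P = cp.Variable((n, n), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)
prob.solve()

print("status:", prob.status)
print("P =\n", P.value)   # any feasible P certifies asymptotic stability
```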

  13. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements gained in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of function distribution between the operator and technical means are outlined. The processes subjected to automation are enumerated. Development of optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shutdown, or emergency situations, becomes important

  14. Early days in complex dynamics a history of complex dynamics in one variable during 1906-1942

    CERN Document Server

    Alexander, Daniel S; Rosa, Alessandro

    2011-01-01

    The theory of complex dynamics, whose roots lie in 19th-century studies of the iteration of complex functions conducted by Kœnigs, Schröder, and others, flourished remarkably during the first half of the 20th century, when many of the central ideas and techniques of the subject developed. This book by Alexander, Iavernaro, and Rosa paints a robust picture of the field of complex dynamics between 1906 and 1942 through detailed discussions of the work of Fatou, Julia, Siegel, and several others. A recurrent theme of the authors' treatment is the center problem in complex dynamics. They present its complete history during this period and, in so doing, bring out analogies between complex dynamics and the study of differential equations, in particular, the problem of stability in Hamiltonian systems. Among these analogies are the use of iteration and problems involving small divisors which the authors examine in the work of Poincaré and others, linking them to complex dynamics, principally via the work of Samuel...

  15. The Impact of Complexity on Shaping Logistics Strategies in Global Supply Chains

    Directory of Open Access Journals (Sweden)

    Agnieszka Szmelter

    2017-04-01

    Full Text Available Aim/purpose - The paper aims to summarize approaches to complexity management by implementing particular logistics concepts within logistics strategies in global supply chains and to highlight a research gap in this regard. Additionally, complexity management concepts are presented. Design/methodology/approach - To achieve the research objective, a systematic literature review was used. Eleven research papers were analyzed using a review protocol. Findings - Approaches to the above research problem are heterogeneous in the current literature, and there is a research gap in complexity studies in logistics that precludes further research, for example on complexity measurement systems. Research implications/limitations - The identified research gap will require further studies. The studied area requires more empirical research, especially in the field of complexity measurement and management techniques in particular global supply chains. Originality/value/contribution - The paper summarizes current knowledge about logistics concepts helping to manage complexity in global supply chains and defines research gaps. No literature summary of that kind is currently available. The article contains a full review of logistics complexity management concepts presented in scientific literature until the end of 2016.

  16. Untangling the Complex Needs of People Experiencing Gambling Problems and Homelessness

    Science.gov (United States)

    Holdsworth, Louise; Tiyce, Margaret

    2013-01-01

    People with gambling problems are now recognised among those at increased risk of homelessness, and the link between housing and gambling problems has been identified as an area requiring further research. This paper discusses the findings of a qualitative study that explored the relationship between gambling problems and homelessness. Interviews…

  17. Influence of metal loading and humic acid functional groups on the complexation behavior of trivalent lanthanides analyzed by CE-ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Kautenburger, Ralf, E-mail: r.kautenburger@mx.uni-saarland.de [Institute of Inorganic Solid State Chemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 3-5, D-66125 Saarbrücken (Germany); Hein, Christina; Sander, Jonas M. [Institute of Inorganic Solid State Chemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 3-5, D-66125 Saarbrücken (Germany); Beck, Horst P. [Institute of Inorganic and Analytical Chemistry and Radiochemistry, Saarland University, Campus Dudweiler, Am Markt Zeile 5, D-66125 Saarbrücken (Germany)

    2014-03-01

    Highlights: • Free and complexed HA-Ln species are separated by CE-ICP-MS. • Weaker and stronger HA-binding sites for Ln-complexation can be detected. • Complexation by original and modified humic acid (HA) with blocked phenolic hydroxyl- and carboxyl-groups is compared. • Stronger HA-binding sites for Ln³⁺ can be assumed as chelating complexes. • Chelates consist of trivalent Ln and a combination of both OH- and COOH-groups. Abstract: The complexation behavior of Aldrich humic acid (AHA) and a modified humic acid (AHA-PB) with blocked phenolic hydroxyl groups for trivalent lanthanides (Ln) is compared, and their influence on the mobility of Ln(III) in an aquifer is analyzed. As the speciation technique, capillary electrophoresis (CE) was hyphenated with inductively coupled plasma mass spectrometry (ICP-MS). For metal loading experiments 25 mg L⁻¹ of AHA and different concentrations (cLn(Eu+Gd) = 100–6000 μg L⁻¹) of Eu(III) and Gd(III) in 10 mM NaClO₄ at pH 5 were applied. By CE-ICP-MS, three Ln-fractions, assumed to be uncomplexed, weakly and strongly AHA-complexed metal, can be detected. For the Ln/AHA ratios used, conservative complex stability constants log βLnAHA decrease from 6.33 (100 μg L⁻¹ Ln³⁺) to 4.31 (6000 μg L⁻¹ Ln³⁺) with growing Ln-content. In order to verify the postulated weaker and stronger humic acid binding sites for trivalent Eu and Gd, a modified AHA with blocked functional groups was used. For these experiments 500 μg L⁻¹ Eu and 25 mg L⁻¹ AHA and AHA-PB in 10 mM NaClO₄ at pH-values ranging from 3 to 10 have been applied. With AHA-PB, where 84% of the phenolic OH-groups and 40% of the COOH-groups were blocked, Eu complexation was significantly lower, especially at the strong binding sites. The log β-values decrease from 6.11 (pH 10) to 5.61 at pH 3 (AHA) and for AHA-PB from 6.01 (pH 7) to 3.94 at pH 3. As a potential consequence, particularly humic acids with a high amount of

  18. Problem structuring in interactive decision-making processes : How interaction, problem perceptions and knowledge contribute to a joint formulation of a problem and solutions

    NARCIS (Netherlands)

    De Kruijf, J.

    2007-01-01

    Water management issues are often complex, unstructured problems. They are complex because they are part of a natural and human system which consists of many diverse, interdependent elements, e.g. upstream events influence the water system downstream, different interdependent government layers,

  19. StoRMon: an event log analyzer for Grid Storage Element based on StoRM

    International Nuclear Information System (INIS)

    Zappi, Riccardo; Dal Pra, Stefano; Dibenedetto, Michele; Ronchieri, Elisabetta

    2011-01-01

    Managing a collaborative production Grid infrastructure requires identifying and handling every issue that might arise in a timely manner. Currently, the most complex problem of the data Grid infrastructure relates to data management because of its distributed nature. To ensure that problems are quickly addressed and solved, each site should contribute to the solution by providing any useful information about services that run in its administrative domain. To be effective, Grid site administrators often must collect, organize, and examine the scattered log events produced by every service and component of the Storage Element. This paper focuses on the problem of gathering the event logs of a Grid Storage Element and describes the design of a new service, called StoRMon. StoRMon will be able to collect, archive, analyze, and report on event logs produced by each service of the Storage Element during the execution of its tasks. The data and the processed information will be available to site administrators through a single contact point, mainly to easily identify security incidents, fraudulent activity, and operational issues. The new service is applied to a Grid Storage Element characterized by StoRM, GridFTP and YAMSS, and collects the usage data of the StoRM, transfer, and hierarchical storage services.

  20. Herding Complex Networks

    KAUST Repository

    Ruf, Sebastian F.

    2018-04-12

    The problem of controlling complex networks is of interest to disciplines ranging from biology to swarm robotics. However, controllability can be too strict a condition, failing to capture a range of desirable behaviors. Herdability, which describes the ability to drive a system to a specific set in the state space, was recently introduced as an alternative network control notion. This paper considers the application of herdability to the study of complex networks. The herdability of a class of networked systems is investigated and two problems related to ensuring system herdability are explored. The first is the input addition problem, which investigates which nodes in a network should receive inputs to ensure that the system is herdable. The second is the related problem of selecting the best single node from which to herd the network, in the case that a single node is guaranteed to make the system herdable. In order to select the best herding node, a novel control-energy-based herdability centrality measure is introduced.
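
    For contrast with the classical notion that the paper relaxes, the standard controllability test for a networked linear system dx/dt = Ax + Bu is the Kalman rank condition rank[B, AB, ..., A^(n-1)B] = n. The sketch below illustrates that classical check only, not the herdability criterion itself; the toy network is mine.

```python
import numpy as np

def is_controllable(A, B):
    """Kalman rank test: the pair (A, B) is controllable iff the
    controllability matrix [B, AB, ..., A^(n-1)B] has full row rank."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    ctrb = np.hstack(blocks)
    return np.linalg.matrix_rank(ctrb) == n

# A 3-node directed path 1 -> 2 -> 3 with an input injected at node 1.
A = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
B = np.array([[1.0], [0.0], [0.0]])
print(is_controllable(A, B))   # True: one input at the head of the chain suffices
```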

  1. Energy-weighted moments in the problems of fragmentation

    International Nuclear Information System (INIS)

    Kuz'min, V.A.

    1986-01-01

    The problem of fragmentation of simple nuclear states over complex ones is reduced to an eigenvector and eigenvalue problem for a real symmetric matrix. Based on the spectral decomposition of this matrix, a simple and computationally economical algorithm for calculating energy-weighted moments of the strength function is obtained. This makes it possible to investigate the sensitivity of the solution of the fragmentation problem to a reduction of the basis of complex states. It is shown that the full width of the strength function is determined only by the complex states directly connected with the simple ones
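
    In the usual definition, the energy-weighted moments of a strength function are m_k = Σ_i E_i^k |⟨i|simple⟩|², summed over eigenstates of the Hamiltonian matrix. A minimal numpy sketch with a random symmetric toy matrix (my illustration, not the paper's algorithm):

```python
import numpy as np

def strength_function_moments(H, simple_state, k_max=3):
    """Energy-weighted moments m_k = sum_i E_i^k |<i|simple>|^2 for a real
    symmetric Hamiltonian matrix H, computed from its spectral decomposition."""
    energies, vectors = np.linalg.eigh(H)               # E_i and eigenvectors as columns
    strengths = np.abs(vectors.T @ simple_state) ** 2   # |<i|simple>|^2
    return [float(np.sum(energies ** k * strengths)) for k in range(k_max + 1)]

rng = np.random.default_rng(0)
M = rng.normal(size=(50, 50))
H = (M + M.T) / 2                        # toy real symmetric "Hamiltonian"
simple = np.zeros(50)
simple[0] = 1.0                          # the "simple" state is basis vector 0

m0, m1, m2, m3 = strength_function_moments(H, simple)
centroid = m1 / m0                       # m0 = 1 (normalization), m1 = <simple|H|simple>
width = np.sqrt(m2 / m0 - centroid ** 2)     # spreading width estimate from the moments
print(m0, centroid, width)
```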

  2. From geometry to algebra and vice versa: Realistic mathematics education principles for analyzing geometry tasks

    Science.gov (United States)

    Jupri, Al

    2017-04-01

    In this article we address how Realistic Mathematics Education (RME) principles, including the intertwinement and the reality principles, are used to analyze geometry tasks. To do so, we carried out three phases of a small-scale study. First we analyzed four geometry problems - considered as tasks inviting the use of problem solving and reasoning skills - theoretically in the light of the RME principles. Second, we tested two of the problems with 31 undergraduate students of a mathematics education program and the other two problems with 16 master's students of a primary mathematics education program. Finally, we analyzed the students' written work and compared these empirical results to the theoretical ones. We found that there are discrepancies between what we expected theoretically and what occurred empirically in terms of mathematization and of the intertwinement of mathematical concepts from geometry to algebra and vice versa. We conclude that the RME principles provide a fruitful framework for analyzing geometry tasks that, for instance, are intended for assessing student problem solving and reasoning skills.

  3. Thinking in complexity the complex dynamics of matter, mind, and mankind

    CERN Document Server

    Mainzer, Klaus

    1994-01-01

    The theory of nonlinear complex systems has become a successful and widely used problem-solving approach in the natural sciences - from laser physics, quantum chaos and meteorology to molecular modeling in chemistry and computer simulations of cell growth in biology. In recent times it has been recognized that many of the social, ecological and political problems of mankind are also of a global, complex and nonlinear nature. And one of the most exciting topics of present scientific and public interest is the idea that even the human mind is governed largely by the nonlinear dynamics of complex systems. In this wide-ranging but concise treatment Prof Mainzer discusses, in nontechnical language, the common framework behind these endeavours. Special emphasis is given to the evolution of new structures in natural and cultural systems and it is seen clearly how the new integrative approach of complexity theory can give new insights that were not available using traditional reductionistic methods.

  4. Workshop on Recommendation in Complex Scenarios (ComplexRec 2017)

    DEFF Research Database (Denmark)

    Bogers, Toine; Koolen, Marijn; Mobasher, Bamshad

    2017-01-01

    Recommendation algorithms for ratings prediction and item ranking have steadily matured during the past decade. However, these state-of-the-art algorithms are typically applied in relatively straightforward scenarios. In reality, recommendation is often a more complex problem: it is usually just a single step in the user's more complex background need. These background needs can often place a variety of constraints on which recommendations are interesting to the user and when they are appropriate. However, relatively little research has been done on these complex recommendation scenarios. The ComplexRec 2017 workshop addressed this by providing an interactive venue for discussing approaches to recommendation in complex scenarios that have no simple one-size-fits-all solution.

  5. Technical study on semi-object emulation of structural statics problem

    International Nuclear Information System (INIS)

    Mo Jun; Shi Pingan; Liu Xingfu; Liu Zhiyong; Fu Chunyu

    2002-01-01

    Structural strength analysis depends mainly on the finite element method and on experiments. For a complex structural system, a rather large error can be caused by uncertain factors such as load distributions, boundary conditions and constitutive relations in the numerical analysis. At the same time, owing to the limitations of measuring and testing techniques, the strength and stiffness of key components cannot be estimated from the limited test data alone. To simulate stresses accurately under a complex static environment, improve the man-machine interactive system, and make the best use of pre- and post-processing functions in graphic data processing, the authors combine numerical analysis with experimental techniques and have developed a semi-object emulation technique to analyze nonlinear problems in structural statics. Modern optical measurement and image processing techniques are first used to acquire displacement data on the vessel surface, and these data serve as the boundary condition for determining the geometric size of flaws in the vessel wall and the stress level. Experimental verification on a given test model shows that this inverse problem can be solved by using the semi-object emulation technique.

  6. Problem situations in management activity

    OpenAIRE

    N.A. DUBINKO

    2009-01-01

    This article reviews contemporary methodological and theoretical approaches to problem situations in management activity. The types of problem situations that managers deal with in their work are revealed and analyzed. A rank correlation of the problem situations shows distinctions depending on managerial work experience. Gender distinctions in managers' ideas of management problems are also revealed.

  7. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on its syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input
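    As a small illustration of the kind of static metric mentioned in this record, the sketch below approximates McCabe's cyclomatic complexity for Python source by counting branching constructs with the standard ast module. The particular set of node types treated as decision points is a simplifying assumption; real analyzers use more refined rules.

        import ast
        import textwrap

        BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

        def cyclomatic_complexity(source: str) -> int:
            # Approximate McCabe complexity: 1 + number of decision points in the code.
            tree = ast.parse(textwrap.dedent(source))
            decisions = sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))
            return 1 + decisions

        example = """
        def classify(x):
            if x < 0:
                return "negative"
            for _ in range(3):
                if x % 2 == 0 and x > 10:
                    return "big even"
            return "other"
        """
        print(cyclomatic_complexity(example))   # 5 with the node set above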

  8. Engaged Problem Formulation in IS Research

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Persson, John Stouby

    2016-01-01

    problems requires a more substantial engagement with the different stakeholders, especially when their problems are ill structured and situated in complex organizational settings. On this basis, we present an engaged approach to formulating IS problems with, not for, IS practitioners. We have come...

  9. Introduction to n-adaptive fuzzy models to analyze public opinion on AIDS

    CERN Document Server

    Kandasamy, D W B V; Kandasamy, Dr.W.B.Vasantha; Smarandache, Dr.Florentin

    2006-01-01

    There are many fuzzy models, like fuzzy matrices, Fuzzy Cognitive Maps, Fuzzy Relational Maps, Fuzzy Associative Memories, Bidirectional Associative Memories and so on. But almost all these models can give only a one-sided solution, like a hidden pattern or a resultant output vector dependent on the input vector, depending on the problem at hand. So for the first time we have defined an n-adaptive fuzzy model which can view or analyze the problem in n ways (n >= 2). Though we have defined these n-adaptive fuzzy models theoretically, we are not in a position to get an n-adaptive fuzzy model for n > 2 for practical real-world problems. The highlight of this model is its capacity to analyze the same problem in different ways, thereby arriving at various solutions that mirror multiple perspectives. We have used the 2-adaptive fuzzy model, comprising the fuzzy matrices model and the BAM model, to analyze the views of the public about the HIV/AIDS disease, patients and the awareness program. This book has five chapters ...

  10. Simon on problem solving

    DEFF Research Database (Denmark)

    Foss, Kirsten; Foss, Nicolai Juul

    2006-01-01

    Two of Herbert Simon's best-known papers are 'The Architecture of Complexity' and 'The Structure of Ill-Structured Problems.' We discuss the neglected links between these two papers, highlighting the role of decomposition in the context of problems on which constraints have been imposed, as a general approach to problem solving. We apply these Simonian ideas to organisational issues, specifically new organisational forms: Simonian ideas allow us to develop a morphology of new organisational forms and to point to some design problems that characterise these forms.

  11. Particle Swarm Optimization and Uncertainty Assessment in Inverse Problems

    Directory of Open Access Journals (Sweden)

    José L. G. Pallero

    2018-01-01

    Most inverse problems in industry (and particularly in geophysical exploration) are highly underdetermined, because the number of model parameters is higher than needed to achieve accurate data predictions and because the sampling of the data space is scarce and incomplete and is always affected by different kinds of noise. Additionally, the physics of the forward problem is a simplification of reality. All these facts mean that the inverse problem solution is not unique; that is, there are different inverse solutions (called equivalent), compatible with the prior information, that fit the observed data within similar error bounds. In the case of nonlinear inverse problems, these equivalent models are located in disconnected flat curvilinear valleys of the cost-function topography. The uncertainty analysis consists of obtaining a representation of this complex topography via different sampling methodologies. In this paper, we focus on the use of a particle swarm optimization (PSO) algorithm to sample the region of equivalence in nonlinear inverse problems. Although this methodology is general purpose, we show its application to the uncertainty assessment of the solution of a geophysical problem concerning gravity inversion in sedimentary basins, showing that it is possible to efficiently perform this task in a sampling-while-optimizing mode. In particular, we explain how to use and analyze the geophysical models sampled by exploratory PSO family members to infer different descriptors of nonlinear uncertainty.
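    To make the sampling-while-optimizing idea concrete, the sketch below runs a bare-bones PSO on a two-parameter toy cost function and simply keeps every visited model whose misfit falls below a threshold, as a crude sample of the region of equivalence. The cost function, bounds, and coefficients are all hypothetical; the gravity-inversion application and the exploratory PSO variants discussed in the paper are not reproduced here.

        import numpy as np

        def pso_sample(cost, bounds, n_particles=30, n_iters=100, tol=0.5, seed=0):
            # Minimal particle swarm optimizer that also stores every visited model
            # with cost below `tol`, as a rough sample of the equivalence region.
            rng = np.random.default_rng(seed)
            lo, hi = bounds
            dim = lo.size
            x = rng.uniform(lo, hi, size=(n_particles, dim))
            v = np.zeros_like(x)
            pbest = x.copy()
            pbest_cost = np.array([cost(p) for p in x])
            gbest = pbest[pbest_cost.argmin()].copy()
            equivalent = []
            w, c1, c2 = 0.7, 1.5, 1.5                      # common PSO coefficients
            for _ in range(n_iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = np.clip(x + v, lo, hi)
                c = np.array([cost(p) for p in x])
                improved = c < pbest_cost
                pbest[improved] = x[improved]
                pbest_cost[improved] = c[improved]
                gbest = pbest[pbest_cost.argmin()].copy()
                equivalent.extend(x[c < tol])              # sampling while optimizing
            return gbest, np.array(equivalent)

        # Toy "inverse problem": two parameters, with a shallow valley of nearly equivalent models.
        true_m = np.array([2.0, -1.0])
        cost = lambda m: np.sum((m - true_m) ** 2) + 0.1 * abs(m[0] + m[1] - 1.0)
        best, eq = pso_sample(cost, (np.full(2, -5.0), np.full(2, 5.0)))
        print(best, eq.shape)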

  12. Simon on Problem-Solving

    DEFF Research Database (Denmark)

    Foss, Kirsten; Foss, Nicolai Juul

    Two of Herbert Simon's best-known papers are "The Architecture of Complexity" and "The Structure of Ill-Structured Problems." We discuss the neglected links between these two papers, highlighting the role of decomposition in the context of problems on which constraints have been imposed, as a general approach to problem solving. We apply these Simonian ideas to organizational issues, specifically new organizational forms: Simonian ideas allow us to develop a morphology of new organizational forms and to point to some design problems that characterize these forms. Keywords: Herbert Simon, problem-solving, new organizational forms. JEL Code: D23, D83

  13. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwie; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao; Jing Shiwei

    2005-01-01

    This article describes the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector and a 4096-channel multichannel analyzer were applied in this system. The multiple linear regression method employed to process the data solved the problem of interference among multiple elements. The prototype (model MZ-MKFY) had been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as lower calorific value, total moisture, ash content, volatile matter, and sulfur content, with a precision acceptable to the coal industry, are presented.
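    In its simplest form, the interference-correction step described here amounts to an ordinary least-squares fit of a coal parameter against counts in several spectral regions. The sketch below shows that idea on entirely made-up calibration numbers; the real calibration model, spectral regions, and coefficients of the MZ-MKFY analyzer are not reproduced.

        import numpy as np

        # Hypothetical calibration data: counts in 3 spectral regions for 5 samples,
        # where each region receives contributions from more than one element (interference).
        X = np.array([[120.,  30.,  10.],
                      [ 80.,  60.,  20.],
                      [ 60.,  90.,  15.],
                      [100.,  45.,  30.],
                      [ 70.,  75.,  25.]])
        y_ash = np.array([12.1, 18.5, 24.0, 15.3, 21.2])   # known ash content (%) of the samples

        # Fit ash = b0 + b1*region1 + b2*region2 + b3*region3 by ordinary least squares.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y_ash, rcond=None)

        # Predict the ash content of a new, unknown sample from its region counts.
        new_counts = np.array([1.0, 90., 55., 18.])         # leading 1.0 is the intercept term
        print("predicted ash %:", new_counts @ coef)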

  14. Multi Criteria Decision Making (MCDM). Complex problems made easy; Multi Criteria Decision Making (MCDM). Complexe vraagstukken behapbaar maken

    Energy Technology Data Exchange (ETDEWEB)

    Van Oeffelen, E.C.M.; Van Zundert, K.; Westerlaekn, A.C. [TNO, Delft (Netherlands)

    2011-12-15

    The existing housing stock needs to become smarter and more sustainable in its energy use. From a technical viewpoint, renovations can usually be realized successfully, but the multitude of preconditions such as phasing and the degree of inconvenience for residents often turn renovation into a complex matter. The MCDM method can be a suitable instrument in handling complex renovation issues.

  15. Insight and analysis problem solving in microbes to machines.

    Science.gov (United States)

    Clark, Kevin B

    2015-11-01

    A key feature for obtaining solutions to difficult problems, insight is oftentimes vaguely regarded as a special discontinuous intellectual process and/or a cognitive restructuring of problem representation or goal approach. However, this nearly century-old state of the art devised by the Gestalt tradition to explain the non-analytical or non-trial-and-error, goal-seeking aptitude of primate mentality tends to neglect problem-solving capabilities of lower animal phyla, Kingdoms other than Animalia, and advancing smart computational technologies built from biological, artificial, and composite media. Attempting to provide an inclusive, precise definition of insight, two major criteria of insight, discontinuous processing and problem restructuring, are here reframed using terminology and statistical mechanical properties of computational complexity classes. Discontinuous processing becomes abrupt state transitions in algorithmic/heuristic outcomes or in types of algorithms/heuristics executed by agents using classical and/or quantum computational models. And problem restructuring becomes combinatorial reorganization of resources, problem-type substitution, and/or exchange of computational models. With insight bounded by computational complexity, humans, ciliated protozoa, and complex technological networks, for example, show insight when restructuring time requirements, combinatorial complexity, and problem type to solve polynomial and nondeterministic polynomial decision problems. Similar effects are expected from other problem types, supporting the idea that insight might be an epiphenomenon of analytical problem solving and consequently of a larger information processing framework. Thus, this computational complexity definition of insight improves the power, external and internal validity, and reliability of operational parameters with which to classify, investigate, and produce the phenomenon for computational agents ranging from microbes to man-made devices.

  16. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  17. Good practices in managing work-related indoor air problems: a psychosocial perspective.

    Science.gov (United States)

    Lahtinen, Marjaana; Huuhtanen, Pekka; Vähämäki, Kari; Kähkönen, Erkki; Mussalo-Rauhamaa, Helena; Reijula, Kari

    2004-07-01

    Indoor air problems at workplaces are often exceedingly complex. Technical questions are interrelated with the dynamics of the work community, and the cooperation and interaction skills of the parties involved in the problem solving process are also put to the test. The objective of our study was to analyze the process of managing and solving indoor air problems from a psychosocial perspective. This collective case study was based on data from questionnaires, interviews and various documentary materials. Technical inspections of the buildings and indoor air measurements were also carried out. The following four factors best differentiated successful cases from impeded cases: extensive multiprofessional collaboration and participative action, systematic action and perseverance, investment in information and communication, and process thinking and learning. The study also proposed a theoretical model for the role of the psychosocial work environment in indoor air problems. The expertise related to social and human aspects of problem solving plays a significant role in solving indoor air problems. Failures to properly handle these aspects may lead to resources being wasted and result in a problematic situation becoming stagnant or worse. Copyright 2004 Wiley-Liss, Inc.

  18. Exponential-Time Algorithms and Complexity of NP-Hard Graph Problems

    DEFF Research Database (Denmark)

    Taslaman, Nina Sofia

    NP-hard problems are deemed highly unlikely to be solvable in polynomial time. Still, one can often find algorithms that are substantially faster than brute force solutions. This thesis concerns such algorithms for problems from graph theory; techniques for constructing and improving this type of algorithms, as well as investigations into how far such improvements can get under reasonable assumptions. The first part is concerned with detection of cycles in graphs, especially parameterized generalizations of Hamiltonian cycles. A remarkably simple Monte Carlo algorithm is presented, and with high probability any found solution is shortest possible. Moreover, the algorithm can be used to find a cycle of given parity through the specified elements. The second part concerns the hardness of problems encoded as evaluations of the Tutte polynomial at some fixed point in the rational plane...

  19. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  20. The Complex Economic System of Supply Chain Financing

    Science.gov (United States)

    Zhang, Lili; Yan, Guangle

    Supply Chain Financing (SCF) refers to a series of innovative and complicated financial services based on the supply chain. The SCF set-up is a complex system in which supply chain management and financing services for Small and Medium Enterprises (SMEs) systematically interpenetrate. This paper establishes the organizational structure of the SCF system and presents two financing models, with and without the participation of a third-party logistics provider (3PL). Using information economics and game theory, the interrelationships among the diverse economic sectors are analyzed, and the economic mechanism for the development and existence of the SCF system is demonstrated. New thoughts and approaches to solving the SMEs' financing problem are given.

  1. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, a Severity Index, and an Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
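    One common way such indices are computed (the paper's exact formulas may differ) is to normalize summed Likert-scale scores and take the importance index as the product of the frequency and severity indices. The sketch below shows that arithmetic on made-up survey responses for three hypothetical delay causes.

        import numpy as np

        # Hypothetical 1-4 scale responses (rows: delay causes, columns: respondents).
        frequency = np.array([[4, 3, 4, 3, 4],
                              [2, 2, 3, 2, 1],
                              [3, 4, 4, 4, 3]])
        severity = np.array([[4, 4, 3, 4, 4],
                             [2, 1, 2, 2, 2],
                             [3, 3, 4, 3, 4]])
        max_score = 4

        # One common convention: index = sum of scores / (max score * number of respondents) * 100.
        freq_index = frequency.sum(axis=1) / (max_score * frequency.shape[1]) * 100
        sev_index = severity.sum(axis=1) / (max_score * severity.shape[1]) * 100
        imp_index = freq_index * sev_index / 100            # importance = frequency x severity

        for rank, i in enumerate(np.argsort(-imp_index), start=1):
            print(f"rank {rank}: cause {i}  FI={freq_index[i]:.1f}  SI={sev_index[i]:.1f}  II={imp_index[i]:.1f}")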

  2. The Social Process of Analyzing Real Water Resource Systems Plans and Management Policies

    Science.gov (United States)

    Loucks, Daniel

    2016-04-01

    Developing and applying systems analysis methods for improving the development and management of real world water resource systems, I have learned, is primarily a social process. This talk is a call for more recognition of this reality in the modeling approaches we propose in the papers and books we publish. The mathematical models designed to inform planners and managers of water systems that we see in many of our journals often seem more complex than they need be. They also often seem not as connected to reality as they could be. While it may be easier to publish descriptions of complex models than simpler ones, and while adding complexity to models might make them better able to mimic or resemble the actual complexity of the real physical and/or social systems or processes being analyzed, the usefulness of such models often can be an illusion. Sometimes the important features of reality that are of concern or interest to those who make decisions can be adequately captured using relatively simple models. Finding the right balance for the particular issues being addressed or the particular decisions that need to be made is an art. When applied to real world problems or issues in specific basins or regions, systems modeling projects often involve more attention to the social aspects than the mathematical ones. Mathematical models addressing connected interacting interdependent components of complex water systems are in fact some of the most useful methods we have to study and better understand the systems we manage around us. They can help us identify and evaluate possible alternative solutions to problems facing humanity today. The study of real world systems of interacting components using mathematical models is commonly called applied systems analyses. Performing such analyses with decision makers rather than of decision makers is critical if the needed trust between project personnel and their clients is to be developed. Using examples from recent and ongoing

  3. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  4. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-01-01

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  5. Decomposition of overlapping protein complexes: A graph theoretical method for analyzing static and dynamic protein associations

    Directory of Open Access Journals (Sweden)

    Guimarães Katia S

    2006-04-01

    Background: Most cellular processes are carried out by multi-protein complexes, groups of proteins that bind together to perform a specific task. Some proteins form stable complexes, while other proteins form transient associations and are part of several complexes at different stages of a cellular process. A better understanding of this higher-order organization of proteins into overlapping complexes is an important step towards unveiling functional and evolutionary mechanisms behind biological networks. Results: We propose a new method for identifying and representing overlapping protein complexes (or larger units called functional groups) within a protein interaction network. We develop a graph-theoretical framework that enables automatic construction of such a representation. We illustrate the effectiveness of our method by applying it to the TNFα/NF-κB and pheromone signaling pathways. Conclusion: The proposed representation helps in understanding the transitions between functional groups and allows for tracking a protein's path through a cascade of functional groups. Therefore, depending on the nature of the network, our representation is capable of elucidating temporal relations between functional groups. Our results show that the proposed method opens a new avenue for the analysis of protein interaction networks.

  6. Analyze the optimal solutions of optimization problems by means of fractional gradient based system using VIM

    Directory of Open Access Journals (Sweden)

    Firat Evirgen

    2016-04-01

    In this paper, a class of Nonlinear Programming (NLP) problems is modeled with a gradient-based system of fractional order differential equations in Caputo's sense. To see the overlap between the equilibrium point of the fractional order dynamic system and the optimal solution of the NLP problem over a longer timespan, the Multistage Variational Iteration Method is applied. The comparisons among the multistage variational iteration method, the variational iteration method and the fourth order Runge-Kutta method in fractional and integer order show that the fractional order model and techniques can be seen as an effective and reliable tool for finding optimal solutions of Nonlinear Programming problems.
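    The core idea of following a gradient-based dynamical system to an optimum can be illustrated with the simpler integer-order case: integrate dx/dt = -grad f(x) until it settles at the equilibrium, which coincides with the unconstrained minimizer. The sketch below uses classical fourth-order Runge-Kutta on a hypothetical convex objective; the fractional-order Caputo system and the multistage variational iteration method of the paper are not reproduced.

        import numpy as np

        def grad_f(x):
            # Gradient of the toy objective f(x) = (x0 - 1)^2 + 2*(x1 + 2)^2.
            return np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)])

        def rk4_gradient_flow(x0, h=0.05, steps=400):
            # Integrate dx/dt = -grad f(x) with classical RK4; the equilibrium point
            # of this dynamic system is the minimizer of f.
            x = np.asarray(x0, dtype=float)
            for _ in range(steps):
                k1 = -grad_f(x)
                k2 = -grad_f(x + 0.5 * h * k1)
                k3 = -grad_f(x + 0.5 * h * k2)
                k4 = -grad_f(x + h * k3)
                x = x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            return x

        print(rk4_gradient_flow([5.0, 5.0]))   # approaches the minimizer (1, -2)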

  7. Problem of quality assurance during metal constructions welding via robotic technological complexes

    Science.gov (United States)

    Fominykh, D. S.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.

    2018-05-01

    The problem of minimizing the probability of critical combinations of events that lead to a loss in welding quality under robotic process automation is examined. The problem is formulated, and models and algorithms for its solution are developed. The problem is solved by minimizing a criterion characterizing the losses caused by defective products. Solving the problem may enhance the quality and accuracy of the operations performed and reduce the losses caused by defective products.

  8. Analyzing composability of applications on MPSoC platforms

    NARCIS (Netherlands)

    Kumar, A.; Mesman, B.; Theelen, B.D.; Corporaal, H.; Yajun, H.

    2008-01-01

    Modern-day applications require the use of multi-processor systems for reasons of performance, scalability and power efficiency. As more and more applications are integrated in a single system, mapping and analyzing them on a multi-processor platform becomes a multidimensional problem. Each possible set

  9. The need for simulation in complex industrial systems

    Directory of Open Access Journals (Sweden)

    Aboura Khalid

    2012-10-01

    We discuss the concept of simulation and its application to the resolution of problems in complex industrial systems. Most problems of serious scale, be it an inventory problem, a production and distribution problem, the management of resources or process improvement, require a mix of generic, data-algorithmic and ad-hoc solutions that make the best of available information. We describe two projects in which analytical solutions were applied or contemplated. The first case study uses linear programming for the optimal allocation of advertising resources by a major internet service provider. The second study, in a series of projects, analyses options for the expansion of the production and distribution network of mining products, as part of a sensitive strategic business review. Using these examples, we make the case for the need for simulation in complex industrial problems, where analytical solutions may be attempted but where the size and complexity of the problem force a Monte Carlo approach.

  10. Guidance for modeling causes and effects in environmental problem solving

    Science.gov (United States)

    Armour, Carl L.; Williamson, Samuel C.

    1988-01-01

    Environmental problems are difficult to solve because their causes and effects are not easily understood. When attempts are made to analyze causes and effects, the principal challenge is the organization of information into a framework that is logical, technically defensible, and easy to understand and communicate. When decisionmakers attempt to solve complex problems before an adequate cause and effect analysis is performed, there are serious risks. These risks include: greater reliance on subjective reasoning, lessened chance of scoping an effective problem solving approach, impaired recognition of the need for supplemental information to attain understanding, increased chance of making unsound decisions, and lessened chance of gaining approval and financial support for a program. Cause and effect relationships can be modeled. This type of modeling has been applied to various environmental problems, including cumulative impact assessment (Dames and Moore 1981; Meehan and Weber 1985; Williamson et al. 1987; Raley et al. 1988) and evaluation of effects of quarrying (Sheate 1986). This guidance for field users was written because of the current interest in documenting cause-effect logic as a part of ecological problem solving. Principal literature sources relating to the modeling approach are: Riggs and Inouye (1975a, b), Erickson (1981), and United States Office of Personnel Management (1986).

  11. The Effect of Learning Environments Based on Problem Solving on Students' Achievements of Problem Solving

    Science.gov (United States)

    Karatas, Ilhan; Baki, Adnan

    2013-01-01

    Problem solving is recognized as an important life skill involving a range of processes including analyzing, interpreting, reasoning, predicting, evaluating and reflecting. For that reason educating students as efficient problem solvers is an important role of mathematics education. Problem solving skill is the centre of mathematics curriculum.…

  12. Problems of Forecast

    OpenAIRE

    Kucharavy , Dmitry; De Guio , Roland

    2005-01-01

    The ability to foresee future technology is a key task of Innovative Design. The paper focuses on the obstacles to reliable prediction of technological evolution for the purpose of Innovative Design. First, a brief analysis of the problems with existing forecasting methods is presented. The causes of the complexity of technology prediction are discussed in the context of reducing forecast errors. Second, using a contradiction analysis, a set of problems related to ...

  13. Wind energy system time-domain (WEST) analyzers

    Science.gov (United States)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.

  14. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recording or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.

  15. Complex numbers from A to Z

    CERN Document Server

    Andreescu, Titu

    2014-01-01

    It is impossible to imagine modern mathematics without complex numbers. The second edition of Complex Numbers from A to … Z introduces the reader to this fascinating subject that, from the time of L. Euler, has become one of the most utilized ideas in mathematics. The exposition concentrates on key concepts and then elementary results concerning these numbers. The reader learns how complex numbers can be used to solve algebraic equations and to understand the geometric interpretation of complex numbers and the operations involving them. The theoretical parts of the book are augmented with rich exercises and problems at various levels of difficulty. Many new problems and solutions have been added in this second edition. A special feature of the book is the last chapter, a selection of outstanding Olympiad and other important mathematical contest problems solved by employing the methods already presented. The book reflects the unique experience of the authors. It distills a vast mathematical literature, most ...

  16. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    Science.gov (United States)

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation between large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN) and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight in combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association
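    As one concrete example of the non-parametric approaches mentioned above, the sketch below fits a random forest to synthetic genotype data and ranks SNPs by impurity-based variable importance. The data, effect sizes and settings are invented for illustration; this is not an analysis from the studies reviewed.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n_subjects, n_snps = 500, 200

        # Synthetic genotypes coded 0/1/2; only SNPs 3 and 17 influence case/control status.
        X = rng.integers(0, 3, size=(n_subjects, n_snps)).astype(float)
        logit = 0.8 * X[:, 3] + 0.6 * X[:, 17] - 1.5
        y = (rng.random(n_subjects) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        forest = RandomForestClassifier(n_estimators=300, random_state=0)
        forest.fit(X, y)

        # Rank SNPs by variable importance; the causal SNPs should float towards the top.
        top = np.argsort(forest.feature_importances_)[::-1][:5]
        print("top-ranked SNPs:", top)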

  17. Anatomy of safety-critical computing problems

    International Nuclear Information System (INIS)

    Swu Yih; Fan Chinfeng; Shirazi, Behrooz

    1995-01-01

    This paper analyzes the obstacles faced by current safety-critical computing applications. The major problem lies in the difficulty of providing complete and convincing safety evidence to prove that the software is safe. We explain this problem from a fundamental perspective by analyzing the essence of safety analysis against that of software developed by current practice. Our basic belief is that in order to perform a successful safety analysis, the state space structure of the analyzed system must have some properties as prerequisites. We propose the concept of safety analyzability, and derive its necessary and sufficient conditions; namely, definability, finiteness, commensurability, and tractability. We then examine software state space structures against these conditions, and affirm that the safety analyzability of safety-critical software developed by current practice is severely restricted by its state space structure and by the problem of exponentially growing cost. Thus, except for small and simple systems, the safety evidence may not be complete and convincing. Our concepts and arguments successfully explain the current problematic situation faced by the safety-critical computing domain. The implications are also discussed

  18. Translating concepts of complexity to the field of ergonomics.

    Science.gov (United States)

    Walker, Guy H; Stanton, Neville A; Salmon, Paul M; Jenkins, Daniel P; Rafferty, Laura

    2010-10-01

    Since 1958 more than 80 journal papers from the mainstream ergonomics literature have used either the words 'complex' or 'complexity' in their titles. Of those, more than 90% have been published in only the past 20 years. This observation communicates something interesting about the way in which contemporary ergonomics problems are being understood. The study of complexity itself derives from non-linear mathematics but many of its core concepts have found analogies in numerous non-mathematical domains. Set against this cross-disciplinary background, the current paper aims to provide a similar initial mapping to the field of ergonomics. In it, the ergonomics problem space, complexity metrics and powerful concepts such as emergence raise complexity to the status of an important contingency factor in achieving a match between ergonomics problems and ergonomics methods. The concept of relative predictive efficiency is used to illustrate how this match could be achieved in practice. What is clear overall is that a major source of, and solution to, complexity are the humans in systems. Understanding complexity on its own terms offers the potential to leverage disproportionate effects from ergonomics interventions and to tighten up the often loose usage of the term in the titles of ergonomics papers. STATEMENT OF RELEVANCE: This paper reviews and discusses concepts from the study of complexity and maps them to ergonomics problems and methods. It concludes that humans are a major source of and solution to complexity in systems and that complexity is a powerful contingency factor, which should be considered to ensure that ergonomics approaches match the true nature of ergonomics problems.

  19. On-line analyzers to distributed control system linking

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, S.F.; Buchanan, B.R.; Sanders, M.A.

    1990-01-01

    The Analytical Development Section (ADS) of the Savannah River Laboratory is developing on-line analyzers to monitor various site processes. Data from some of the on-line analyzers (OLA's) will be used for process control by distributed control systems (DCS's) such as the Fisher PRoVOX. A problem in the past has been finding an efficient and cost-effective way to get analyzer data onto the DCS data highway. ADS is developing a system to accomplish the linking of OLA's to PRoVOX DCS's. The system will be described, and results of operation in a research and development environment given. Plans for the installation in the production environment will be discussed.

  20. Computational Complexity of Some Problems on Generalized Cellular Automations

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2012-03-01

    We prove that the preimage problem for a generalized cellular automaton is NP-hard. The results of this work are important for supporting the security of ciphers based on cellular automata.
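    For the ordinary (non-generalized) elementary cellular automaton, the preimage problem can at least be stated very compactly, which makes the exponential brute-force baseline easy to see. The sketch below enumerates all 2^n ring configurations of an n-cell rule-30 automaton and returns those that map onto a given target; the rule number and sizes are arbitrary, and nothing here reflects the cipher constructions analyzed in the paper.

        from itertools import product

        def step(config, rule=30):
            # One synchronous update of an elementary CA with periodic boundary conditions.
            n = len(config)
            table = [(rule >> k) & 1 for k in range(8)]
            return tuple(table[(config[(i - 1) % n] << 2) | (config[i] << 1) | config[(i + 1) % n]]
                         for i in range(n))

        def preimages(target, rule=30):
            # Brute-force search over all 2^n configurations (exponential in n).
            n = len(target)
            return [c for c in product((0, 1), repeat=n) if step(c, rule) == tuple(target)]

        target = step((0, 0, 1, 0, 1, 1), rule=30)    # guaranteed to have at least one preimage
        print(target, preimages(target, rule=30))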

  1. Bridging disciplines through problem based learning

    DEFF Research Database (Denmark)

    Stentoft, Diana

    2011-01-01

    This paper examines whether a problem based approach to students' learning may support interdisciplinary education at university level, where students are required to engage with the complexities inherent in constructing knowledge across disciplinary boundaries. These complexities include students engaging with multiple and conflicting epistemologies, identification and contextualisation of problems involving several disciplines in their solution, etc. A practical example found in the case of the newly developed BSc and MSc programs in Techno-Anthropology is provided. The paper includes some examples...

  2. CLASSROOM SHARING EXPERIENCES: BUILDING STUDENTS’ AWARENESS FOR PROBLEM SOLVING IN TRANSLATING POETRY

    Directory of Open Access Journals (Sweden)

    Sri Handayani

    2015-12-01

    This research was aimed at describing classroom sharing experiences to build students' awareness of problem solving in translating poetry. The data were collected through a questionnaire, interviews and classroom observation involving 85 sixth-semester students in two different classes and two lecturers of the Translating Literary Works course at the English Language and Literature Studies in one state university in Bandung city. The questionnaire was completed by 55 (out of 85) students invited to fill it in. Interviews were done to complete and cross-check the information derived from the questionnaire. Meanwhile, the observation was administered in the two parallel classes to observe the activities done by the two lecturers and the students in the two classes. The observation was focused on the course materials, teaching methods and techniques applied by the lecturers, and the problems faced and techniques used to solve those problems by the students in translating poetry. The data were then analyzed based on some relevant theories of translation. The result of the research showed that the classroom sharing experiences gave some advantages to the students for several reasons: (1) motivating students to do their translation work more seriously, since they had to present their translation work to the class; (2) developing the students' self-confidence in translating the tasks, since their translation work was given feedback; (3) training the students to analyze the problems to find out the most appropriate techniques to solve them; (4) introducing the students to more critical knowledge of both source and target languages; and (5) building the students' awareness of how the problems that appeared in a very complex translation process were solved. Keywords: awareness, problem solving, sharing experience

  3. Hypertension: Believe it or not, a Complex Problem La hipertensión arterial: aunque no lo parezca, un problema complejo

    Directory of Open Access Journals (Sweden)

    Alfredo Darío Espinosa Brito

    2011-03-01

    High blood pressure (hypertension) is recognized as a major health problem due both to its morbidity and the disability it causes and to its impact on mortality, especially cardiovascular mortality. However, effectively addressing its prevention and control, both in individuals and in the general population, does not seem to be an easy task, even these days. This paper aims to present different aspects of arterial hypertension, from concept through diagnosis, treatment and follow-up, focusing on this entity as a complex system including multiple elements related to cardiovascular disease.

  4. On the Combinatorics of SAT and the Complexity of Planar Problems

    DEFF Research Database (Denmark)

    Talebanfard, Navid

    In this thesis we study several problems arising in Boolean satisfiability ranging from lower bounds for SAT algorithms and proof systems to extremal properties of formulas. The first problem is about construction of hard instances for k-SAT algorithms. For PPSZ algorithm [40] we give the first...

  5. Complexity classifications for different equivalence and audit problems for Boolean circuits

    OpenAIRE

    Böhler, Elmar; Creignou, Nadia; Galota, Matthias; Reith, Steffen; Schnoor, Henning; Vollmer, Heribert

    2010-01-01

    We study Boolean circuits as a representation of Boolean functions and consider different equivalence, audit, and enumeration problems. For a number of restricted sets of gate types (bases) we obtain efficient algorithms, while for all other gate types we show that these problems are at least NP-hard.

  6. The effects of monitoring environment on problem-solving performance.

    Science.gov (United States)

    Laird, Brian K; Bailey, Charles D; Hester, Kim

    2018-01-01

    While effective and efficient solving of everyday problems is important in business domains, little is known about the effects of workplace monitoring on problem-solving performance. In a laboratory experiment, we explored the monitoring environment's effects on an individual's propensity to (1) establish pattern solutions to problems, (2) recognize when pattern solutions are no longer efficient, and (3) solve complex problems. Under three work monitoring regimes-no monitoring, human monitoring, and electronic monitoring-114 participants solved puzzles for monetary rewards. Based on research related to worker autonomy and theory of social facilitation, we hypothesized that monitored (versus non-monitored) participants would (1) have more difficulty finding a pattern solution, (2) more often fail to recognize when the pattern solution is no longer efficient, and (3) solve fewer complex problems. Our results support the first two hypotheses, but in complex problem solving, an interaction was found between self-assessed ability and the monitoring environment.

  7. Theories of computational complexity

    CERN Document Server

    Calude, C

    1988-01-01

    This volume presents four machine-independent theories of computational complexity, which have been chosen for their intrinsic importance and practical relevance. The book includes a wealth of results - classical, recent, and others which have not been published before. In developing the mathematics underlying the size, dynamic and structural complexity measures, various connections with mathematical logic, constructive topology, probability and programming theories are established. The facts are presented in detail. Extensive examples are provided, to help clarify notions and constructions. The lists of exercises and problems include routine exercises, interesting results, as well as some open problems.

  8. Path optimization method for the sign problem

    Directory of Open Access Journals (Sweden)

    Ohnishi Akira

    2018-01-01

    We propose a path optimization method (POM) to evade the sign problem in Monte-Carlo calculations for complex actions. Among many approaches to the sign problem, the Lefschetz-thimble path-integral method and the complex Langevin method are promising and extensively discussed. In these methods, real field variables are complexified and the integration manifold is determined by the flow equations or sampled stochastically. When we have singular points of the action or multiple critical points near the original integration surface, however, we run the risk of encountering the residual and global sign problems or the singular drift term problem. One of the ways to avoid the singular points is to optimize the integration path, which is designed not to hit the singular points of the Boltzmann weight. By specifying the one-dimensional integration path as z = t + if(t) (f(t) ∈ R) and by optimizing f(t) to enhance the average phase factor, we demonstrate that we can avoid the sign problem in a one-variable toy model for which the complex Langevin method is found to fail. In these proceedings, we propose POM and discuss how we can avoid the sign problem in a toy model. We also discuss the possibility of utilizing a neural network to optimize the path.
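    A minimal numerical sketch of the path-optimization idea, on an even simpler Gaussian toy model than the one in the paper: with S(z) = z^2/2 + icz the weight on the real axis oscillates, and shifting the contour by a constant imaginary amount (a legitimate deformation for this analytic, rapidly decaying integrand) removes the phase entirely. Here the "path" has a single parameter a, optimized by grid search; the paper instead optimizes a whole function f(t), e.g. with a neural network, so everything below is only an illustration.

        import numpy as np

        c = 2.0
        S = lambda z: 0.5 * z**2 + 1j * c * z      # toy complex action

        t = np.linspace(-10.0, 10.0, 4001)

        def average_phase_factor(a):
            # Average phase factor on the trial path z(t) = t + i*a (constant shift, Jacobian = 1).
            w = np.exp(-S(t + 1j * a))
            return abs(w.sum()) / np.abs(w).sum()

        # Optimize the single path parameter by a simple grid search.
        grid = np.linspace(-4.0, 0.0, 401)
        best_a = grid[np.argmax([average_phase_factor(a) for a in grid])]
        print(best_a, average_phase_factor(0.0), average_phase_factor(best_a))
        # best_a comes out close to -c = -2, where the integrand on the path is purely real.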

  9. Experimental realization of a one-way quantum computer algorithm solving Simon's problem.

    Science.gov (United States)

    Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G

    2014-11-14

    We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem-a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.

  10. Analysis and Reduction of Complex Networks Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger G [University of Southern California

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  11. Features of rare earth element (3) complexing

    International Nuclear Information System (INIS)

    Martynenko, L.I.

    1991-01-01

    Reasons for deviations from the regularity of the tetrad ('W') effect in rare earth chelate complex compounds are discussed in the review. The concept of a metal-ligand ionic bond in rare earth complexes is put at the basis of the consideration. From this viewpoint, the mutual influence of ligands in lower, higher, polynuclear and mixed-ligand complexes, formed by ligands of low, medium and high denticity, is discussed. Problems of the intermolecular interaction of complexes with different structures are considered in relation to the variation of chelate volatility and selectivity in the processes of sublimation and precipitation

  12. Identifying and localizing network problems using the PuNDIT project

    International Nuclear Information System (INIS)

    Batista, Jorge; McKee, Shawn; Dovrolis, Constantine; Lee, Danny

    2015-01-01

    In today's world of distributed collaborations of scientists, there are many challenges to providing effective infrastructures to couple these groups of scientists with their shared computing and storage resources. The Pythia Network Diagnostic InfrasTructure (PuNDIT) [1] project is integrating and scaling research tools and creating robust code suitable for operational needs, addressing the difficult challenge of automating the detection and location of network problems. PuNDIT is building upon the de-facto standard perfSONAR [2] network measurement infrastructure deployed in the Open Science Grid (OSG) [3] and the Worldwide LHC Computing Grid (WLCG) [4] to gather and analyze complex real-world network topologies coupled with their corresponding network metrics, to identify possible signatures of network problems from a set of symptoms. The PuNDIT team is working closely with the perfSONAR developers from ESnet and Internet2 to integrate PuNDIT components as part of the perfSONAR Toolkit. A primary goal for PuNDIT is to convert complex network metrics into easily understood diagnoses in an automated way. We will report on the project progress to date in working with the OSG and WLCG communities, describe the current implementation including some initial results, and discuss future plans and the project timeline. (paper)

  13. Metallic materials for the hydrogen energy industry and main gas pipelines: complex physical problems of aging, embrittlement, and failure

    International Nuclear Information System (INIS)

    Nechaev, Yu S

    2008-01-01

    The possibilities for effectively solving relevant technological problems are considered, based on an analysis of the fundamental physical aspects and an elucidation of the micromechanisms and interrelations of aging and hydrogen embrittlement of materials in the hydrogen energy and gas-pipeline industries. The adverse effects these mechanisms and processes have on the service properties and technological lifetime of materials are analyzed. The concomitant fundamental process of formation of carbohydride-like and other nanosegregation structures at dislocations (with a segregation capacity 1 to 1.5 orders of magnitude greater than in the widely used Cottrell 'atmosphere' model) and at grain boundaries is discussed, as is the way in which these structures affect technological processes (aging, hydrogen embrittlement, stress corrosion damage, and failure) and the physicomechanical properties of the metallic materials (including the technological lifetimes of pipeline steels). (reviews of topical problems)

  14. Problem based learning - A brief review

    Science.gov (United States)

    Nunes, Sandra; Oliveira, Teresa A.; Oliveira, Amílcar

    2017-07-01

    Teaching is a complex mission that requires not only the transmission of theoretical knowledge but also providing students with the skills they need to solve real problems in their respective professional activities, where complex issues must frequently be faced. For more than twenty years we have been experiencing an increase in academic failure in mathematics, which means that teaching mathematics and related areas can be an even more complex and difficult task. Academic failure is a complex phenomenon that depends on various factors, such as social, scholastic or biophysical factors. After numerous attempts to reduce academic failure, our goal in this paper is to understand the role of "Problem Based Learning" and how this methodology can contribute both to increasing success in mathematics courses and to improving the skills of future professionals in Portugal. Before designing a proposal for applying this technique in our institutions, we decided to conduct a survey to provide us with the necessary information about the methodology and its respective advantages and disadvantages; that is the aim of this brief review.

  15. Solving black box computation problems using expert knowledge theory and methods

    International Nuclear Information System (INIS)

    Booker, Jane M.; McNamara, Laura A.

    2004-01-01

    The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation.

  16. Developing Seventh Grade Students' Understanding of Complex Environmental Problems with Systems Tools and Representations: a Quasi-experimental Study

    Science.gov (United States)

    Doganca Kucuk, Zerrin; Saysel, Ali Kerem

    2017-03-01

    A systems-based classroom intervention on environmental education was designed for seventh grade students; the results were evaluated to assess its impact on the development of systems thinking skills and standard science achievement, and to determine whether the systems approach is a more effective way to teach environmental issues that are dynamic and complex. A quasi-experimental methodology was used to compare performances of the participants in various dimensions, including systems thinking skills, competence in dynamic environmental problem solving and success in science achievement tests. The same pre-, post- and delayed tests were used with both the comparison and experimental groups in the same public middle school in Istanbul. Classroom activities designed for the comparison group (N = 20) followed the directives of the Science and Technology Curriculum, while the experimental group (N = 22) covered the same subject matter through activities benefiting from systems tools and representations such as behaviour over time graphs, causal loop diagrams, stock-flow structures and hands-on dynamic modelling. After one month of systems-based instruction, the experimental group demonstrated significantly better systems thinking and dynamic environmental problem solving skills. Achievement in dynamic problem solving was found to be relatively stable over time. However, standard science achievement did not improve at all. This paper focuses on the quantitative analysis of the results, the weaknesses of the curriculum and educational implications.

  17. Waste Management: An integrated modeling approach for analyzing change in NWC production processes

    International Nuclear Information System (INIS)

    Christensen, D.C.; Sohn, C.L.; Helm, T.M.; Farish, T.J.; Reid, R.A.

    1991-01-01

    A problem-driven, integrated modeling, decision-support framework has been conceptualized to aid a team of experts in determining the set of evolving technologies that should receive additional developmental support. This conceptual framework utilizes a variety of decision-aiding models, including Flowsheeting, the Analytical Hierarchy Process, Linear and Goal Programming, and Object-Oriented Discrete Event Simulation. A number of the technologies under consideration are strong candidates to overcome current plutonium processing problems, so that effective technology will be available for implementation in Complex 21. Complex 21 is a participatory, inter-installation planning effort sponsored by the US DOE to consolidate and revitalize the nuclear weapons complex facilities by the 21st century. A computer-based dynamic simulation model has been constructed that will allow testing of alternative combinations of developing technologies. The modeling of new configurations of technologies under a number of different operating conditions and material flow assumptions provides information needed for effective decision making for Complex 21. 4 figs

  18. Political Limits to the Processing of Policy Problems

    Directory of Open Access Journals (Sweden)

    Peter J. May

    2013-07-01

    This contribution addresses political limits to the processing of policy problems in the United States. Our foci are the forces that limit policymakers' attention to different aspects of problems and how this affects the prospects for problem resolution. We theorize about three sets of forces: interest engagement, linkages among relevant institutions for policymaking, and partisan conflict. We show how the interplay of these forces limits efforts to address complex problems. Based on secondary accounts, we consider these underlying dynamics for ten complex problems. These include the thorny problems of the financial crisis, climate change, and health care; the persistent problems of K-12 education, drug abuse, and food safety; and the looming problems associated with critical infrastructure, the obesity epidemic, ocean health, and terrorism and extreme events. From these accounts we identify different patterns that we label fractured, allied, bureaucratic, and anemic policymaking.

  19. Nuclear weapons complex

    International Nuclear Information System (INIS)

    Rezendes, V.S.

    1991-03-01

    In this book, GAO characterizes DOE's January 1991 Nuclear Weapons Complex Reconfiguration Study as a starting point for reaching agreement on solutions to many of the complex's safety and environmental problems. Key decisions still need to be made about the size of the complex, where to relocate plutonium operations, what technologies to use for new tritium production, and what to do with excess plutonium. The total cost for reconfiguring and modernizing the complex is still uncertain, and some management issues remain unresolved. Congress faces a difficult task in making these decisions given the conflicting demands for scarce resources in a time of growing budget deficits and war in the Persian Gulf.

  20. A parallel wavelet-enhanced PWTD algorithm for analyzing transient scattering from electrically very large PEC targets

    KAUST Repository

    Liu, Yang

    2014-07-01

    The computational complexity and memory requirements of classically formulated marching-on-in-time (MOT)-based surface integral equation (SIE) solvers scale as O(Nt Ns^2) and O(Ns^2), respectively; here Nt and Ns denote the number of temporal and spatial degrees of freedom of the current density. The multilevel plane wave time domain (PWTD) algorithm, viz., the time domain counterpart of the multilevel fast multipole method, reduces these costs to O(Nt Ns log^2 Ns) and O(Ns^1.5) (Ergin et al., IEEE Trans. Antennas Mag., 41, 39-52, 1999). Previously, PWTD-accelerated MOT-SIE solvers have been used to analyze transient scattering from perfect electrically conducting (PEC) and homogeneous dielectric objects discretized in terms of a million spatial unknowns (Shanker et al., IEEE Trans. Antennas Propag., 51, 628-641, 2003). More recently, an efficient parallelized solver that employs an advanced hierarchical and provably scalable spatial, angular, and temporal load partitioning strategy has been developed to analyze transient scattering problems that involve ten million spatial unknowns (Liu et al., in URSI Digest, 2013).
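
    To put the quoted scalings in perspective, the following back-of-the-envelope sketch compares the classical MOT cost O(Nt Ns^2) with the PWTD-accelerated cost O(Nt Ns log^2 Ns); the unit constant factors are purely illustrative and hide the real prefactors of both methods.

```python
import math

def classical_mot_cost(nt, ns):
    # O(Nt * Ns^2), with a unit constant assumed for illustration
    return nt * ns**2

def pwtd_cost(nt, ns):
    # O(Nt * Ns * log^2 Ns), with a unit constant assumed for illustration
    return nt * ns * math.log2(ns) ** 2

nt = 1_000
for ns in (10_000, 1_000_000, 10_000_000):
    ratio = classical_mot_cost(nt, ns) / pwtd_cost(nt, ns)
    print(f"Ns = {ns:>10,}: PWTD is roughly {ratio:,.0f}x cheaper (same constants)")
```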

  1. A longitudinal study of higher-order thinking skills: working memory and fluid reasoning in childhood enhance complex problem solving in adolescence

    Science.gov (United States)

    Greiff, Samuel; Wüstenberg, Sascha; Goetz, Thomas; Vainikainen, Mari-Pauliina; Hautamäki, Jarkko; Bornstein, Marc H.

    2015-01-01

    Scientists have studied the development of the human mind for decades and have accumulated an impressive number of empirical studies that have provided ample support for the notion that early cognitive performance during infancy and childhood is an important predictor of later cognitive performance during adulthood. As children move from childhood into adolescence, their mental development increasingly involves higher-order cognitive skills that are crucial for successful planning, decision-making, and problem solving skills. However, few studies have employed higher-order thinking skills such as complex problem solving (CPS) as developmental outcomes in adolescents. To fill this gap, we tested a longitudinal developmental model in a sample of 2,021 Finnish sixth grade students (M = 12.41 years, SD = 0.52; 1,041 female, 978 male, 2 missing sex). We assessed working memory (WM) and fluid reasoning (FR) at age 12 as predictors of two CPS dimensions: knowledge acquisition and knowledge application. We further assessed students’ CPS performance 3 years later as a developmental outcome (N = 1696; M = 15.22 years, SD = 0.43; 867 female, 829 male). Missing data partly occurred due to dropout and technical problems during the first days of testing and varied across indicators and time with a mean of 27.2%. Results revealed that FR was a strong predictor of both CPS dimensions, whereas WM exhibited only a small influence on one of the two CPS dimensions. These results provide strong support for the view that CPS involves FR and, to a lesser extent, WM in childhood and from there evolves into an increasingly complex structure of higher-order cognitive skills in adolescence. PMID:26283992

  3. Multidimensional integral representations problems of analytic continuation

    CERN Document Server

    Kytmanov, Alexander M

    2015-01-01

    The monograph is devoted to integral representations for holomorphic functions in several complex variables, such as the Bochner-Martinelli, Cauchy-Fantappiè, Koppelman and multidimensional logarithmic residue representations, and their boundary properties. The applications considered are problems of analytic continuation of functions from the boundary of a bounded domain in C^n. In contrast to the well-known Hartogs-Bochner theorem, this book investigates functions with the one-dimensional property of holomorphic extension along complex lines, and includes the problems of obtaining multidimensional boundary analogues of the Morera theorem. This book is a valuable resource for specialists in complex analysis and theoretical physics, as well as graduate and postgraduate students with an understanding of standard university courses in complex, real and functional analysis, as well as algebra and geometry.

  4. Assessing student written problem solutions: A problem-solving rubric with application to introductory physics

    OpenAIRE

    Jennifer L. Docktor; Jay Dornfeld; Evan Frodermann; Kenneth Heller; Leonardo Hsu; Koblar Alan Jackson; Andrew Mason; Qing X. Ryan; Jie Yang

    2016-01-01

    Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of...

  5. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    Science.gov (United States)

    Wiswall, John D.

    -temporal characteristic scales of the flow on the resulting time-area-averaged concentration measurements. Two series of experiments were performed to verify the probe's design; the first used Schlieren photography and verified that the probe sampled from the supersonic flowfield isokinetically. The second series involved traversing the probe across a free mixing layer of air and helium, to obtain both mean concentration and high frequency measurements. The high-frequency data were statistically analyzed, and inspection of the Probability Density Function (PDF) of the hot-film response was instrumental in interpreting how well the resulting average mixing measurements represent these types of complex flows. The probe is minimally intrusive, has accuracy comparable to its predecessors, has an improved frequency response for mean concentration measurements, and samples from a very small area in the flowfield.

  6. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. This study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th grade students selected by purposive sampling. The data comprise students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems; the data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems can be divided into three categories. The first relates to difficulties in understanding the probabilistic problem; the second to difficulties in choosing and using appropriate solution strategies; and the third to difficulties with the computational process. The results suggest that students still have difficulties in solving probabilistic problems and have not yet been able to use their knowledge and abilities to respond to such problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  7. Population SAMC vs SAMC: Convergence and Applications to Gene Selection Problems

    KAUST Repository

    Faming Liang, Mingqi Wu

    2013-01-01

    The Bayesian model selection approach has been adopted by more and more people when analyzing large data sets. However, it is known that the reversible jump MCMC (RJMCMC) algorithm, which is perhaps the most popular MCMC algorithm for Bayesian model selection, is prone to getting trapped in local modes when the model space is complex. The stochastic approximation Monte Carlo (SAMC) algorithm essentially overcomes the local-trap problem suffered by conventional MCMC algorithms by introducing a self-adjusting mechanism based on past samples. In this paper, we propose a population SAMC (Pop-SAMC) algorithm, which works on a population of SAMC chains and can make use of crossover operators from genetic algorithms to further improve its efficiency. Under mild conditions, we show the convergence of this algorithm. Compared to the single-chain SAMC algorithm, Pop-SAMC provides a more efficient self-adjusting mechanism and thus can converge faster. The effectiveness of Pop-SAMC for Bayesian model selection problems is examined through a change-point identification problem and a gene selection problem. The numerical results indicate that Pop-SAMC significantly outperforms both the single-chain SAMC and RJMCMC.
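
    The paper itself supplies the algorithm and its convergence theory; the sketch below is only a stripped-down illustration of the shared self-adjusting mechanism behind a population of SAMC chains, applied to a one-dimensional double-well energy. The energy function, the energy-based partition, the gain sequence and the omission of crossover moves are all simplifying assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def U(x):
    # Toy double-well energy with two separated modes (illustrative target).
    return 2.0 * (x**2 - 4.0) ** 2

# Partition the energy range into m subregions and aim for uniform visiting frequencies.
m = 20
edges = np.linspace(0.0, 60.0, m)          # energy levels defining the subregions
pi = np.full(m, 1.0 / m)                   # desired sampling distribution
theta = np.zeros(m)                        # self-adjusting log-weights

def region(x):
    return min(np.searchsorted(edges, U(x)), m - 1)

K = 10                                      # population size (number of chains)
x = rng.uniform(-3, 3, size=K)
t0, n_iter, step = 500.0, 20_000, 0.8

for t in range(1, n_iter + 1):
    counts = np.zeros(m)
    for k in range(K):
        xp = x[k] + rng.normal(0.0, step)   # random-walk proposal
        jo, jp = region(x[k]), region(xp)
        # MH ratio for the weighted target exp(-U(x)) / exp(theta_{J(x)})
        log_r = (-U(xp) + U(x[k])) - (theta[jp] - theta[jo])
        if np.log(rng.uniform()) < log_r:
            x[k] = xp
        counts[region(x[k])] += 1
    gamma = t0 / max(t0, t)                 # decreasing gain sequence
    theta += gamma * (counts / K - pi)      # shared update driven by the whole population

visited = np.unique([region(xi) for xi in x])
print("chains currently occupy energy subregions:", visited)
```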

  8. On the Use of an Algebraic Signature Analyzer for Mixed-Signal Systems Testing

    Directory of Open Access Journals (Sweden)

    Vadim Geurkov

    2014-01-01

    We propose an approach to the design of an algebraic signature analyzer that can be used for mixed-signal systems testing. The analyzer does not contain carry-propagating circuitry, which improves its performance as well as its fault tolerance. The common design technique for a signature analyzer for mixed-signal systems is based on the rules of an arithmetic finite field. Applying this technique to systems with an arbitrary radix is a challenging task, and the resulting devices possess high hardware complexity. The proposed technique is simple and applicable to systems of any size and radix, and its hardware complexity is low. The technique can also be used in arithmetic/algebraic coding and cryptography.
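
    As a point of reference for the carry-free idea, the sketch below implements a conventional single-input signature register over GF(2) (polynomial-division compression of a bit stream), in which all feedback is XOR-based and no carries propagate between stages. The feedback polynomial and the response stream are arbitrary examples; this is not the mixed-signal design proposed in the paper.

```python
def signature(bits, taps=(16, 12, 5, 0), width=16):
    """Compress a bit stream into a 16-bit LFSR-style signature over GF(2).

    taps lists the exponents of the feedback polynomial x^16 + x^12 + x^5 + 1;
    all arithmetic is XOR-based, so no carries propagate between stages.
    """
    poly = 0
    for t in taps:
        if t < width:            # the x^width term is implicit
            poly |= 1 << t
    reg = 0
    for b in bits:
        msb = (reg >> (width - 1)) & 1
        reg = ((reg << 1) & ((1 << width) - 1)) | (b & 1)
        if msb:                  # feedback = XOR with the polynomial (carry-free)
            reg ^= poly
    return reg

# Example: the signature changes if any single response bit is flipped.
stream = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1] * 4
good = signature(stream)
faulty = signature([stream[0] ^ 1] + stream[1:])
print(f"good signature   : {good:#06x}")
print(f"faulty signature : {faulty:#06x}")
```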

  9. The process model of problem solving difficulty

    NARCIS (Netherlands)

    Pala, O.; Rouwette, E.A.J.A.; Vennix, J.A.M.

    2002-01-01

    Groups and organizations, or in general multi-actor decision-making groups, frequently come across complex problems in which neither the problem definition nor the interrelations of parts that make up the problem are well defined. In these kinds of situations, members of a decision-making group

  10. Modern Church Construction in Urals. Problems and Prospects

    Science.gov (United States)

    Surin, D. N.; Tereshina, O. B.

    2017-11-01

    The article analyzes the problems of modern Orthodox church architecture in Russia, with special attention paid to the Ural region. It justifies the importance of addressing this issue, connected with the revival of Orthodox traditions in Russia over the last decades and the need to compensate for the tens of thousands of churches destroyed in the Soviet period. Works on the theory and history of Russian architecture and art, together with studies of the architectural heritage and the building craft of the Ural craftsmen, are used as the scientific and methodological basis for the development of church architecture. The article describes the historically formed architectural features of Russian Orthodox churches, whose artistic image is designed to create a particular religious and aesthetic experience. It is argued that the restoration of the Russian church-building tradition is possible on the basis of this architectural heritage. The article identifies the trends and vital tasks in church construction and outlines a set of measures to address these tasks at the public and regional levels.

  11. A restricted Steiner tree problem is solved by Geometric Method II

    Science.gov (United States)

    Lin, Dazhi; Zhang, Youlin; Lu, Xiaoxu

    2013-03-01

    The minimum Steiner tree problem has a wide range of applications, such as transportation systems, communication networks, pipeline design and VLSI. Unfortunately, the computational complexity of the problem is NP-hard, so it is common to consider special cases. In this paper, we first put forward a restricted Steiner tree problem in which the fixed vertices lie on the same side of a line L, and we seek a vertex on L such that the length of the tree is minimal. From the definition and the complexity of the Steiner tree problem, we know that this restricted problem is also NP-complete. In Part I we considered the restricted Steiner tree problem with two fixed vertices; here we naturally consider the case of three fixed vertices, and we again use a geometric method to solve the problem.
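
    The paper itself uses a purely geometric construction. As a loose numerical counterpart, and under the simplifying assumption that the three fixed vertices are all joined directly to a single junction placed on L (a star topology rather than a full Steiner topology), the best junction on the line can be located by one-dimensional minimization, as in the hypothetical sketch below.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Three fixed vertices on the same side of the line L: y = 0 (illustrative data).
P = np.array([[0.0, 2.0],
              [3.0, 1.5],
              [5.0, 3.0]])

def star_length(x):
    """Total length when every fixed vertex is joined directly to (x, 0) on L."""
    junction = np.array([x, 0.0])
    return np.linalg.norm(P - junction, axis=1).sum()

res = minimize_scalar(star_length,
                      bounds=(P[:, 0].min() - 5, P[:, 0].max() + 5),
                      method="bounded")
print(f"best junction on L: ({res.x:.3f}, 0), total length = {res.fun:.3f}")
```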

  12. Introduction to complex theory of differential equations

    CERN Document Server

    Savin, Anton

    2017-01-01

    This book discusses the complex theory of differential equations or more precisely, the theory of differential equations on complex-analytic manifolds. Although the theory of differential equations on real manifolds is well known – it is described in thousands of papers and its usefulness requires no comments or explanations – to date specialists on differential equations have not focused on the complex theory of partial differential equations. However, as well as being remarkably beautiful, this theory can be used to solve a number of problems in real theory, for instance, the Poincaré balayage problem and the mother body problem in geophysics. The monograph does not require readers to be familiar with advanced notions in complex analysis, differential equations, or topology. With its numerous examples and exercises, it appeals to advanced undergraduate and graduate students, and also to researchers wanting to familiarize themselves with the subject.

  13. Accelerator complex for unstable beams at INS

    International Nuclear Information System (INIS)

    Tomizawa, M.; Arai, S.; Doi, M.; Katayama, T.; Niki, K.; Tokuda, N.; Yoshizawa, M.

    1992-11-01

    The construction of the prototype facility of the Exotic arena of the Japan Hadron Project (JHP) was started in 1992 at the Institute for Nuclear Study (INS), University of Tokyo. The purpose of this facility is to study the various technical problems of the Exotic arena and to perform experiments in nuclear physics and astrophysics with unstable nuclear beams. The unstable nuclei, produced by bombarding a thick target with a 40 MeV proton beam from the existing SF cyclotron, are ionized in the ion sources, mass-analyzed by an ISOL, and transported to the accelerator complex. The accelerator complex consists of a split coaxial RFQ (SCRFQ) and an interdigital H-type (IH) linac. The construction of the accelerator will be completed in fiscal year 1994. The development of the SCRFQ and of the IH linac, which is suitable as a post-accelerator for the SCRFQ, is reported. The charge stripper and the beam matching between the SCRFQ and the IH linac are explained. A buncher is necessary for matching the longitudinal phase space between the SCRFQ and the IH linac. (K.I.)

  14. Assessing Complexity in Learning Outcomes--A Comparison between the SOLO Taxonomy and the Model of Hierarchical Complexity

    Science.gov (United States)

    Stålne, Kristian; Kjellström, Sofia; Utriainen, Jukka

    2016-01-01

    An important aspect of higher education is to educate students who can manage complex relationships and solve complex problems. Teachers need to be able to evaluate course content with regard to complexity, as well as evaluate students' ability to assimilate complex content and express it in the form of a learning outcome. One model for evaluating…

  15. Holographic complexity for time-dependent backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Momeni, Davood, E-mail: davoodmomeni78@gmail.com [Eurasian International Center for Theoretical Physics and Department of General Theoretical Physics, Eurasian National University, Astana 010008 (Kazakhstan); Faizal, Mir, E-mail: mirfaizalmir@googlemail.com [Irving K. Barber School of Arts and Sciences, University of British Columbia, Okanagan, 3333 University Way, Kelowna, British Columbia V1V 1V7 (Canada); Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta, T1K 3M4 (Canada); Bahamonde, Sebastian, E-mail: sebastian.beltran.14@ucl.ac.uk [Department of Mathematics, University College London, Gower Street, London, WC1E 6BT (United Kingdom); Myrzakulov, Ratbay [Eurasian International Center for Theoretical Physics and Department of General Theoretical Physics, Eurasian National University, Astana 010008 (Kazakhstan)

    2016-11-10

    In this paper, we will analyze the holographic complexity for time-dependent asymptotically AdS geometries. We will first use a covariant zero mean curvature slicing of the time-dependent bulk geometries, and then use this co-dimension one spacelike slice of the bulk spacetime to define a co-dimension two minimal surface. The time-dependent holographic complexity will be defined using the volume enclosed by this minimal surface. This time-dependent holographic complexity will reduce to the usual holographic complexity for static geometries. We will analyze the time-dependence as a perturbation of the asymptotically AdS geometries. Thus, we will obtain time-dependent asymptotically AdS geometries, and we will calculate the holographic complexity for such time-dependent geometries.
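
    For orientation, the static "complexity = volume" prescription that the construction above generalizes is usually written schematically as follows. This is a standard expression from the holographic-complexity literature rather than a formula quoted from the paper, with the maximization taken over bulk slices Σ anchored on the boundary time slice.

```latex
% Schematic "complexity = volume" relation (static case); G_N is the bulk Newton
% constant and \ell a characteristic AdS length scale.
\mathcal{C}_V(t) \;\sim\; \max_{\partial\Sigma \,=\, \Sigma_t} \frac{V(\Sigma)}{G_N\,\ell}
```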

  16. Application of Raptor-M3G to reactor dosimetry problems on massively parallel architectures - 026

    International Nuclear Information System (INIS)

    Longoni, G.

    2010-01-01

    The solution of complex 3-D radiation transport problems requires significant resources both in terms of computation time and memory availability. Therefore, parallel algorithms and multi-processor architectures are required to efficiently solve large 3-D radiation transport problems. This paper presents the application of RAPTOR-M3G (Rapid Parallel Transport Of Radiation - Multiple 3D Geometries) to reactor dosimetry problems. RAPTOR-M3G is a newly developed parallel computer code designed to solve the discrete ordinates (SN) equations on multi-processor computer architectures. This paper presents the results for a reactor dosimetry problem using a 3-D model of a commercial 2-loop pressurized water reactor (PWR). The accuracy and performance of RAPTOR-M3G will be analyzed and the numerical results obtained from the calculation will be compared directly to measurements of the neutron field in the reactor cavity air gap. The parallel performance of RAPTOR-M3G on massively parallel architectures, where the number of computing nodes is in the order of hundreds, will be analyzed up to four hundred processors. The performance results will be presented based on two supercomputing architectures: the POPLE supercomputer operated by the Pittsburgh Supercomputing Center and the Westinghouse computer cluster. The Westinghouse computer cluster is equipped with a standard Ethernet network connection and an InfiniBand interconnect capable of a bandwidth in excess of 20 Gbit/s. Therefore, the impact of the network architecture on RAPTOR-M3G performance will be analyzed as well. (authors)

  17. A new theoretical approach to analyze complex processes in cytoskeleton proteins.

    Science.gov (United States)

    Li, Xin; Kolomeisky, Anatoly B

    2014-03-20

    Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because the theoretical mean-field models utilized neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on the analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.

  18. A Different Trolley Problem: The Limits of Environmental Justice and the Promise of Complex Moral Assessments for Transportation Infrastructure.

    Science.gov (United States)

    Epting, Shane

    2016-12-01

    Transportation infrastructure tremendously affects the quality of life for urban residents, influences public and mental health, and shapes social relations. Historically, the topic is rich with social and political controversy and the resultant transit systems in the United States cause problems for minority residents and issues for the public. Environmental justice frameworks provide a means to identify and address harms that affect marginalized groups, but environmental justice has limits that cannot account for the mainstream population. To account for this condition, I employ a complex moral assessment measure that provides a way to talk about harms that affect the public.

  19. Problems and solutions in quantum physics

    CERN Document Server

    Ficek, Zbigniew

    2016-01-01

    This book contains tutorial problems with solutions for the textbook Quantum Physics for Beginners. The reader studying the abstract field of quantum physics needs to solve plenty of practical, especially quantitative, problems. This book places emphasis on basic problems of quantum physics together with some instructive, stimulating, and useful applications. A considerable range of complexity is presented by these problems, and not too many of them can be solved using formulas alone.

  20. The relationships between problem characteristics, achievement-related behaviors, and academic achievement in problem-based learning

    NARCIS (Netherlands)

    N. Sockalingam (Nachamma); J.I. Rotgans (Jerome); H.G. Schmidt (Henk)

    2011-01-01

    textabstractThis study investigated the influence of five problem characteristics on students' achievement-related classroom behaviors and academic achievement. Data from 5,949 polytechnic students in PBL curricula across 170 courses were analyzed by means of path analysis. The five problem

  1. Pinning Synchronization of Switched Complex Dynamical Networks

    Directory of Open Access Journals (Sweden)

    Liming Du

    2015-01-01

    Network topology and node dynamics play a key role in forming synchronization of complex networks. Unfortunately, there is no effective synchronization criterion for pinning synchronization of complex dynamical networks with switching topology. In this paper, pinning synchronization of complex dynamical networks with switching topology is studied. Two basic problems are considered: one is pinning synchronization of switched complex networks under arbitrary switching; the other is pinning synchronization of switched complex networks by design of the switching signal when synchronization cannot be achieved using any individual connection topology alone. For the two problems, the common Lyapunov function method and the single Lyapunov function method are used, respectively; some global synchronization criteria are proposed and the designed switching law is given. Finally, simulation results verify the validity of the results.
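
    As a toy illustration of the fixed-topology ingredient behind such criteria (not of the switching analysis itself), the sketch below checks a standard sufficient condition for pinning a network of linearly coupled nodes with scalar dynamics x_i' = a x_i toward the origin: the matrix a I - c L - D must be Hurwitz, where L is the graph Laplacian and D collects the pinning gains. The graph, the gains and the node dynamics are all assumed for illustration.

```python
import numpy as np

# Undirected example network on 5 nodes (assumed topology).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 1, 0, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A           # graph Laplacian

a = 1.0                                   # unstable isolated node dynamics x' = a*x
c = 3.0                                   # coupling strength
pinned = [0, 2]                           # indices of pinned nodes
d = 5.0                                   # pinning feedback gain
D = np.zeros((5, 5))
for i in pinned:
    D[i, i] = d

M = a * np.eye(5) - c * L - D             # closed-loop system matrix
lam_max = np.max(np.real(np.linalg.eigvals(M)))
print(f"largest real part of eigenvalues: {lam_max:.3f}")
print("pinning synchronization to the origin:",
      "achieved" if lam_max < 0 else "not guaranteed")

# Quick simulation (explicit Euler) to confirm the eigenvalue test.
x = np.array([2.0, -1.5, 0.7, 3.0, -2.2])
dt = 0.01
for _ in range(2000):
    x = x + dt * (M @ x)
print(f"state norm after 20 time units: {np.linalg.norm(x):.2e}")
```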

  2. Are middle school mathematics teachers able to solve word problems without using variable?

    Science.gov (United States)

    Gökkurt Özdemir, Burçin; Erdem, Emrullah; Örnek, Tuğba; Soylu, Yasin

    2018-01-01

    Many people consider problem solving to be a complex process in which variables such as x and y are used, but problems do not have to be solved using variables alone. Problem solving can be rationalized and made easier using practical strategies, and especially when the development of younger children is considered, it is clear that mathematics teachers should be able to solve problems through concrete processes. In this context, middle school mathematics teachers' skills in solving word problems without using variables were examined in the current study. Using the case study method, this study was conducted with 60 middle school mathematics teachers with different levels of professional experience in five provinces of Turkey. A test consisting of five open-ended word problems was used as the data collection tool, and the content analysis technique was used to analyze the data. The analysis showed that most of the teachers used a trial-and-error strategy or an area model as the solution strategy. On the other hand, teachers who solved the problems using variables such as x, a, n or symbols such as Δ, □, ○, *, and who mistakenly considered these solutions to be variable-free, were also observed in the study.
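
    As a small computational analogue of the trial-and-error strategy mentioned above, the sketch below brute-forces a classic "heads and legs" word problem without introducing any algebraic manipulation of variables in the solution narrative; the specific numbers are an invented example, not an item from the study's test.

```python
# Word problem (invented example): a farm has chickens and rabbits,
# 35 heads and 94 legs in total. How many of each animal are there?
HEADS, LEGS = 35, 94

# Trial-and-error: try every possible number of chickens and check the leg count.
for chickens in range(HEADS + 1):
    rabbits = HEADS - chickens
    if 2 * chickens + 4 * rabbits == LEGS:
        print(f"chickens = {chickens}, rabbits = {rabbits}")
        break
```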

  3. Probabilistic data integration and computational complexity

    Science.gov (United States)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the `forward' problem). This problem can be formulated more generally as a problem of `integration of information'. A probabilistic formulation of data integration is in principle simple: if all information available (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem either through an analytical description of the combined probability function, or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Because one type of information is too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem, with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave-equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency solving a data integration problem and lead to biased results, and under

  4. Zoonoses, One Health and complexity: wicked problems and constructive conflict.

    Science.gov (United States)

    Waltner-Toews, David

    2017-07-19

    Infectious zoonoses emerge from complex interactions among social and ecological systems. Understanding this complexity requires the accommodation of multiple, often conflicting, perspectives and narratives, rooted in different value systems and temporal-spatial scales. Therefore, to be adaptive, successful and sustainable, One Health approaches necessarily entail conflicts among observers, practitioners and scholars. Nevertheless, these integrative approaches have, both implicitly and explicitly, tended to marginalize some perspectives and prioritize others, resulting in a kind of technocratic tyranny. An important function of One Health approaches should be to facilitate and manage those conflicts, rather than to impose solutions. This article is part of the themed issue 'One Health for a changing world: zoonoses, ecosystems and human well-being'. © 2017 The Authors.

  5. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  6. Markov Renewal Methods in Restart Problems in Complex Systems

    DEFF Research Database (Denmark)

    Asmussen, Søren; Lipsky, Lester; Thompson, Stephen

    A task with ideal execution time L such as the execution of a computer program or the transmission of a file on a data link may fail, and the task then needs to be restarted. The task is handled by a complex system with features similar to the ones in classical reliability: failures may...

  7. [Modeling and implementation method for the automatic biochemistry analyzer control system].

    Science.gov (United States)

    Wang, Dong; Ge, Wan-cheng; Song, Chun-lin; Wang, Yun-guang

    2009-03-01

    The automatic biochemistry analyzer is a necessary instrument for clinical diagnostics. In this paper, the system structure is first analyzed, and the description of the system problems and the fundamental principles for dispatching are brought forward. The paper then puts emphasis on the modeling of the automatic biochemistry analyzer control system: the object model and the communication model are put forward. Finally, the implementation method is designed. The results indicate that a system based on this model has good performance.

  8. Novel Complexity Indicator of Manufacturing Process Chains and Its Relations to Indirect Complexity Indicators

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2017-01-01

    Manufacturing systems can be considered as a network of machines/workstations, where parts are produced in a flow shop or job shop environment, respectively. Such a network of machines/workstations can be depicted as a graph, with machines as nodes and material flow between the nodes as links. The aim of this paper is to use sequences of operations and the machine network to measure the static complexity of manufacturing processes. To this end, existing approaches to measuring the static complexity of manufacturing systems are analyzed and compared, and the competing complexity indicators are tested on two different manufacturing layout examples. The subsequent analysis showed the relevant potential of the proposed method.

  9. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories

    KAUST Repository

    Abdel-Azeim, Safwat

    2014-05-06

    Background: Molecular Dynamics (MD) simulations of protein complexes suffer from the lack of specific tools in the analysis step. Analyses of MD trajectories of protein complexes indeed generally rely on classical measures, such as the RMSD, RMSF and gyration radius, conceived and developed for single macromolecules. Researchers engaged in simulating the dynamics of a protein complex are instead mainly interested in characterizing the conservation/variation of its biological interface. Results: On these bases, herein we propose a novel approach to the analysis of MD trajectories or other conformational ensembles of protein complexes, MDcons, which uses the conservation of inter-residue contacts at the interface as a measure of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility, we used this novel approach to study two protein-protein complexes with interfaces of comparable size, both dominated by hydrophilic interactions but having binding affinities at the extremes of the experimental range. MDcons is demonstrated to be extremely useful to analyse the MD trajectories of the investigated complexes, adding important insight into the dynamic behavior of their biological interface. Conclusions: MDcons specifically allows the user to highlight and characterize the dynamics of the interface in protein complexes and can thus be used as a complementary tool for the analysis of MD simulations of both experimental and predicted structures of protein complexes.
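
    The sketch below is not the MDcons code itself, only a hypothetical illustration of the underlying idea: for each frame of a trajectory, mark which inter-chain residue pairs are in contact (distance below a cutoff), then average the contact indicator over frames to obtain a consensus map. The coordinates, chain sizes, one-point-per-residue representation and the 5 Å cutoff are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "trajectory": n_frames snapshots of one representative point per residue
# for two chains A (30 residues) and B (25 residues). Real analyses would use all
# heavy atoms per residue; this is just a stand-in.
n_frames, n_a, n_b = 50, 30, 25
chain_a = rng.normal(0.0, 8.0, size=(n_frames, n_a, 3))
chain_b = rng.normal(4.0, 8.0, size=(n_frames, n_b, 3))

CUTOFF = 5.0  # contact distance threshold in Angstrom (assumed)

def contact_map(xyz_a, xyz_b, cutoff=CUTOFF):
    """Boolean inter-chain contact map for a single frame."""
    d = np.linalg.norm(xyz_a[:, None, :] - xyz_b[None, :, :], axis=-1)
    return d < cutoff

# Consensus map: fraction of frames in which each residue pair is in contact.
consensus = np.mean([contact_map(chain_a[f], chain_b[f]) for f in range(n_frames)],
                    axis=0)

conserved = np.argwhere(consensus >= 0.5)   # contacts kept in at least half the frames
print(f"interface pairs conserved in >=50% of frames: {len(conserved)}")
print(f"mean number of inter-chain contacts per frame: {consensus.sum():.1f}")
```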

  10. Technical study on semi-object emulation of structural statics problem

    CERN Document Server

    MoJun; LiuXingFu; LiuZhiYong; Shi Pin Gan

    2002-01-01

    Structural strength analysis depends mainly on the finite element method and on experiments. For a complex structural system, a rather large error can be caused by uncertain factors such as load distributions, boundary conditions and constitutive relations in the numerical analysis. At the same time, owing to the limitations of measuring and testing techniques, the strength and stiffness of key components cannot be estimated from the limited test data alone. To simulate stresses accurately under a complex static environment, improve the man-machine interactive system, and make the best use of the pre- and post-processing functions of graphic data processing, we combine numerical analysis with experimental techniques and have developed a semi-object emulation technique to analyze the nonlinear problem of structural statics. Modern optical measuring and image processing techniques are first used to acquire displacement data on the vessel surface, and these data are used as the boundary condition to...

  11. Computational error and complexity in science and engineering computational error and complexity

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K; Chui, Charles K

    2005-01-01

    The book "Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Just mathematical quantities symbols are of no use to engineers/technologists. Computational complexity of the numerical method to solve the mathematical model, also computed along with the solution, on the other hand, will tell us how much computation/computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientific convincing answer to these questions. Specifically some of the disciplines in which the book w...

  12. Arithmetic of Complex Manifolds

    CERN Document Server

    Lange, Herbert

    1989-01-01

    It was the aim of the Erlangen meeting in May 1988 to bring together number theorists and algebraic geometers to discuss problems of common interest, such as moduli problems, complex tori, integral points, rationality questions, automorphic forms. In recent years such problems, which are simultaneously of arithmetic and geometric interest, have become increasingly important. This proceedings volume contains 12 original research papers. Its main topics are theta functions, modular forms, abelian varieties and algebraic three-folds.

  13. Sparsity regularization for parameter identification problems

    International Nuclear Information System (INIS)

    Jin, Bangti; Maass, Peter

    2012-01-01

    The investigation of regularization schemes with sparsity promoting penalty terms has been one of the dominant topics in the field of inverse problems over the last years, and Tikhonov functionals with ℓ^p-penalty terms for 1 ⩽ p ⩽ 2 have been studied extensively. The first investigations focused on regularization properties of the minimizers of such functionals with linear operators and on iteration schemes for approximating the minimizers. These results were quickly transferred to nonlinear operator equations, including nonsmooth operators and more general function space settings. The latest results on regularization properties additionally assume a sparse representation of the true solution as well as generalized source conditions, which yield some surprising and optimal convergence rates. The regularization theory with ℓ^p sparsity constraints is relatively complete in this setting; see the first part of this review. In contrast, the development of efficient numerical schemes for approximating minimizers of Tikhonov functionals with sparsity constraints for nonlinear operators is still ongoing. The basic iterated soft shrinkage approach has been extended in several directions and semi-smooth Newton methods are becoming applicable in this field. In particular, the extension to more general non-convex, non-differentiable functionals by variational principles leads to a variety of generalized iteration schemes. We focus on such iteration schemes in the second part of this review. A major part of this survey is devoted to applying sparsity constrained regularization techniques to parameter identification problems for partial differential equations, which we regard as the prototypical setting for nonlinear inverse problems. Parameter identification problems exhibit different levels of complexity and we aim at characterizing a hierarchy of such problems. The operator defining these inverse problems is the parameter-to-state mapping. We first summarize some
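
    For the simplest case of a linear operator A and an ℓ^1 penalty, the iterated soft shrinkage approach mentioned above can be sketched as the following toy implementation; the problem size, the random operator, the regularization parameter and the step size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear inverse problem y = A x + noise with a sparse ground truth x.
m, n = 60, 200
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=8, replace=False)] = rng.normal(scale=3.0, size=8)
y = A @ x_true + 0.01 * rng.normal(size=m)

def soft_shrink(v, tau):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

alpha = 0.05                              # regularization parameter (assumed)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # step size <= 1/||A||^2 for convergence

x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)              # gradient of the data-fidelity term
    x = soft_shrink(x - step * grad, step * alpha)

print(f"recovered support size: {np.count_nonzero(np.abs(x) > 1e-3)}")
print(f"relative error: {np.linalg.norm(x - x_true) / np.linalg.norm(x_true):.3f}")
```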

  14. Where humans meet machines innovative solutions for knotty natural-language problems

    CERN Document Server

    Markowitz, Judith

    2013-01-01

    Where Humans Meet Machines: Innovative Solutions for Knotty Natural-Language Problems brings humans and machines closer together by showing how linguistic complexities that confound the speech systems of today can be handled effectively by sophisticated natural-language technology. Some of the most vexing natural-language problems that are addressed in this book entail recognizing and processing idiomatic expressions, understanding metaphors, matching an anaphor correctly with its antecedent, performing word-sense disambiguation, and handling out-of-vocabulary words and phrases. This fourteen-chapter anthology consists of contributions from industry scientists and from academicians working at major universities in North America and Europe. They include researchers who have played a central role in DARPA-funded programs and developers who craft real-world solutions for corporations. These contributing authors analyze the role of natural language technology in the global marketplace; they explore the need f...

  15. Modern problems in applied analysis

    CERN Document Server

    Rogosin, Sergei

    2018-01-01

    This book features a collection of recent findings in Applied Real and Complex Analysis that were presented at the 3rd International Conference “Boundary Value Problems, Functional Equations and Applications” (BAF-3), held in Rzeszow, Poland on 20-23 April 2016. The contributions presented here develop a technique related to the scope of the workshop and touching on the fields of differential and functional equations, complex and real analysis, with a special emphasis on topics related to boundary value problems. Further, the papers discuss various applications of the technique, mainly in solid mechanics (crack propagation, conductivity of composite materials), biomechanics (viscoelastic behavior of the periodontal ligament, modeling of swarms) and fluid dynamics (Stokes and Brinkman type flows, Hele-Shaw type flows). The book is addressed to all readers who are interested in the development and application of innovative research results that can help solve theoretical and real-world problems.

  16. A note on the depth function of combinatorial optimization problems

    NARCIS (Netherlands)

    Woeginger, G.J.

    2001-01-01

    In a recent paper [Discrete Appl. Math. 43 (1993) 115–129], Kern formulates two conjectures on the relationship between the computational complexity of computing the depth function of a discrete optimization problem and the computational complexity of solving this optimization problem to optimality.

  17. Unit-time scheduling problems with time dependent resources

    NARCIS (Netherlands)

    Tautenhahn, T.; Woeginger, G.

    1997-01-01

    We investigate the computational complexity of scheduling problems, where the operations consume certain amounts of renewable resources which are available in time-dependent quantities. In particular, we consider unit-time open shop problems and unit-time scheduling problems with identical parallel

  18. Analysis on complex structure stability under different bar angle with BIM technology

    Directory of Open Access Journals (Sweden)

    Wang Xiongjue

    2016-03-01

    Sun Valley, the landmark building of the World Expo in Shanghai, which has a free-form surface with a single-layer reticulated shell structure, is a typical complex structure. A CAD/CAM integrated information system was used to design this complex structure; however, it is a very rigorous process that is difficult to apply widely. Since the relevant technology of the Sun Valley is not open to the public at present, we try to use BIM technology to model the Sun Valley, including architectural modelling and structural analysis. By analyzing the Sun Valley structure with this method, it is shown that the modelling problems can be solved by writing script code in the Rhino software and that the stability of the model can also be analyzed. The new approach, which combines different software packages such as Rhino, Revit and Midas for the modelling and calculation of complex free-form surface structures, is viable and effective.

  19. The context of ethical problems in medical volunteer work.

    Science.gov (United States)

    Wall, Anji

    2011-06-01

    Ethical problems are common in clinical medicine, so medical volunteers who practice clinical medicine in developing countries should expect to encounter them just as they would in their practice in the developed world. However, as this article argues, medical volunteers in developing countries should not expect to encounter the same ethical problems as those that dominate Western biomedicine or to address ethical problems in the same way as they do in their practice in developed countries. For example, poor health and advanced disease increase the risks and decrease the potential benefits of some interventions. Consequently, when medical volunteers intervene too readily, without considering the nutritional and general health status of patients, the results can be devastating. Medical volunteers cannot assume that the outcomes of interventions in developing countries will be comparable to the outcomes of the same interventions in developed countries. Rather, they must realistically consider the complex medical conditions of patients when determining whether or not to intervene. Similarly, medical volunteers may face the question of whether to provide a pharmaceutical or perform an intervention that is below the acceptable standard of care versus the alternative of doing nothing. This article critically explores the contextual features of medical volunteer work in developing countries that differentiate it from medical practice in developed countries, arguing that this context contributes to the creation of unique ethical problems and affects the way in which these problems should be analyzed and resolved.

  20. Computational issues in complex water-energy optimization problems: Time scales, parameterizations, objectives and algorithms

    Science.gov (United States)

    Efstratiadis, Andreas; Tsoukalas, Ioannis; Kossieris, Panayiotis; Karavokiros, George; Christofides, Antonis; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Modelling of large-scale hybrid renewable energy systems (HRES) is a challenging task, for which several open computational issues exist. HRES comprise typical components of hydrosystems (reservoirs, boreholes, conveyance networks, hydropower stations, pumps, water demand nodes, etc.), which are dynamically linked with renewables (e.g., wind turbines, solar parks) and energy demand nodes. In such systems, apart from the well-known shortcomings of water resources modelling (nonlinear dynamics, unknown future inflows, large number of variables and constraints, conflicting criteria, etc.), additional complexities and uncertainties arise due to the introduction of energy components and associated fluxes. A major difficulty is the need for coupling two different temporal scales, given that in hydrosystem modeling, monthly simulation steps are typically adopted, yet for a faithful representation of the energy balance (i.e. energy production vs. demand) a much finer resolution (e.g. hourly) is required. Another drawback is the increase of control variables, constraints and objectives, due to the simultaneous modelling of the two parallel fluxes (i.e. water and energy) and their interactions. Finally, since the driving hydrometeorological processes of the integrated system are inherently uncertain, it is often essential to use synthetically generated input time series of large length, in order to assess the system performance in terms of reliability and risk, with satisfactory accuracy. To address these issues, we propose an effective and efficient modeling framework, key objectives of which are: (a) the substantial reduction of control variables, through parsimonious yet consistent parameterizations; (b) the substantial decrease of computational burden of simulation, by linearizing the combined water and energy allocation problem of each individual time step, and solve each local sub-problem through very fast linear network programming algorithms, and (c) the substantial
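
    Point (b) above, linearizing the combined water-energy allocation of a single time step and solving it as a small linear program, can be illustrated with a toy example. Everything below is invented for illustration (one reservoir release split between turbines and an irrigation node, plus purchased backup energy), and `scipy.optimize.linprog` merely stands in for the fast linear network programming solvers the authors refer to.

    ```python
    from scipy.optimize import linprog

    # Decision variables for one simulation step:
    #   x0 = release routed through the turbines
    #   x1 = release diverted to the irrigation demand node
    #   x2 = backup energy bought from the grid
    energy_per_unit = 0.9                     # MWh produced per unit of turbined release (assumed)
    c = [-1.0 * energy_per_unit, 0.0, 2.5]    # minimise purchase cost minus value of hydropower

    # Mass balance: total release equals the available outflow this step.
    A_eq = [[1.0, 1.0, 0.0]]
    b_eq = [100.0]

    # Demands written as A_ub @ x <= b_ub:
    #   energy demand met:      -0.9*x0 - x2 <= -80
    #   irrigation demand met:          -x1 <= -30
    A_ub = [[-energy_per_unit, 0.0, -1.0],
            [0.0, -1.0, 0.0]]
    b_ub = [-80.0, -30.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * 3, method="highs")
    print(res.x, res.fun)
    ```

    Solving one such sub-problem per simulation interval, instead of a single monolithic nonlinear program, is what keeps the computational burden manageable.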

  1. Rapid Mission Design for Dynamically Complex Environments

    Data.gov (United States)

    National Aeronautics and Space Administration — Designing trajectories in dynamically complex environments is very challenging and easily becomes an intractable problem. More complex planning implies potentially...

  2. Nuclear weapons complex

    International Nuclear Information System (INIS)

    Rezendes, V.S.

    1992-04-01

    In addition to the long-standing safety and environmental problems plaguing the nuclear weapons complex, this paper reports that the Department of Energy (DOE) faces a major new challenge: how to reconfigure the weapons complex to meet the nation's defense needs in the 21st century. Key decisions still need to be made about the size of the complex; where, if necessary, to relocate various operations; what technologies to use for new tritium production; and what to do with excess weapons-grade material. The choices confronting DOE and Congress are difficult given the conflicting demands for limited resources

  3. Study on the generalized WKB approximation for the inverse scattering problem at fixed energy for complex potentials

    International Nuclear Information System (INIS)

    Pozdnyakov, Yu.A.; Terenetskij, K.O.

    1981-01-01

    An approximate method for solving the inverse scattering problem (ISP) at fixed energy for complex spherically symmetric potentials decreasing faster than 1/r is considered. The method is based on a generalized WKB approximation. For the potential V(r) to be determined, a sufficiently ''close'' reference potential is chosen. For both potentials the S-matrix elements (ME) are calculated and the inversion procedure is carried out. The S-ME are calculated for integer and intermediate values of the angular momentum. The S-ME are presented graphically for the reference and restored potentials for proton scattering at E_p = 49.48 MeV on 12C nuclei. The closer the sought-for potential is to the reference one, the better the restoration. This allows the potential to be refined by iteration: the restored potential can be used as the new reference potential, and so on. A smoothing operation applied to the restored potential before the next iteration is introduced. Drawbacks and advantages of the considered ISP solution method are pointed out. The applicability of the method is strongly limited by the requirement that the energy be higher than a certain ''critical'' value; nevertheless, it is applicable over a wider range of particle energies (towards lower energies) than the ordinary WKB method, and it is simpler to apply to complex potentials. The investigation of the proposed method of solving the ISP at fixed energy for complex spherically symmetric potentials shows that it can be successfully applied to refine the central part of the interaction of nucleons, alpha-particles and heavy ions of intermediate and high energies with atomic nuclei
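
    As background for readers unfamiliar with the WKB setting (this is the textbook semiclassical expression with the Langer correction, not the specific generalization developed in the paper), the fixed-energy phase shift from which such inversion schemes start is

    ```latex
    \delta_l^{\mathrm{WKB}} \;=\; \lim_{R\to\infty}\left[
    \int_{r_l}^{R}\sqrt{\,k^{2}-\frac{2\mu}{\hbar^{2}}V(r)-\frac{\left(l+\tfrac{1}{2}\right)^{2}}{r^{2}}}\;dr
    \;-\;\int_{r_l^{0}}^{R}\sqrt{\,k^{2}-\frac{\left(l+\tfrac{1}{2}\right)^{2}}{r^{2}}}\;dr\right],
    \qquad S_l = e^{\,2i\,\delta_l^{\mathrm{WKB}}},
    ```

    where r_l and r_l^0 are the outer classical turning points with and without V(r). For a complex (optical) potential the phase shift, and hence S_l, becomes complex, which is the case treated above.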

  4. Physical Complexity and Cognitive Evolution

    Science.gov (United States)

    Jedlicka, Peter

    Our intuition tells us that there is a general trend in the evolution of nature, a trend towards greater complexity. However, there are several definitions of complexity and hence it is difficult to argue for or against the validity of this intuition. Christoph Adami has recently introduced a novel measure called physical complexity that assigns low complexity to both ordered and random systems and high complexity to those in between. Physical complexity measures the amount of information that an organism stores in its genome about the environment in which it evolves. The theory of physical complexity predicts that evolution increases the amount of `knowledge' an organism accumulates about its niche. It might be fruitful to generalize Adami's concept of complexity to evolution as a whole (including the evolution of man). Physical complexity fits nicely into the philosophical framework of cognitive biology, which considers biological evolution as a progressive process of knowledge accumulation (a gradual increase of epistemic complexity). According to this paradigm, evolution is a cognitive `ratchet' that pushes organisms unidirectionally towards higher complexity. A dynamic environment continually creates problems to be solved. To survive in the environment means to solve the problem, and the solution is embodied knowledge. Cognitive biology (as well as the theory of physical complexity) uses the concepts of information and entropy and views evolution from both the information-theoretical and the thermodynamical perspective. Concerning humans as conscious beings, it seems necessary to postulate the emergence of a new kind of knowledge: self-aware and self-referential knowledge. The appearance of self-reflection in evolution indicates that the human brain has reached a new qualitative level of epistemic complexity.

  5. Analyzing the performance index for a hybrid electric vehicle

    NARCIS (Netherlands)

    Ngo, D. V.; Hofman, T.; Steinbuch, M.; Serrarens, A. F A

    2011-01-01

    The definition of a performance index for the optimization design and optimal control problem of a Hybrid Electric Vehicle is not often considered and analyzed explicitly. In literature, there is no study about proposing a method of building or evaluating whether a performance index is appropriate.

  6. A system to analyze the complex physiological states of coal solubilizing fungi

    Energy Technology Data Exchange (ETDEWEB)

    Hoelker, U.; Moenkemann, H.; Hoefer, M. [Universitaet Bonn, Bonn (Germany). Botanisches Institut

    1997-11-01

    The mechanism by which some microorganisms solubilize brown coal is still unknown. The paper discusses the deuteromycetes Fusarium oxysporum and Trichoderma atroviride as a suitable test system to analyse the complex fungal physiology relating to coal solubilization. The two fungi can occur in two different growth substrate-controlled physiological states: a coal-solubilizing one, when cells are grown on glutamate or gluconate as substrate and a non-solubilizing one, when grown on carbohydrates. When grown on carbohydrates, F.oxysporum produces the pigment bikaverein. Purified bikaverein inhibits also coal solubilization by T. atroviride. The ability to solubilize coal is constitutive in F. oxysporum, while in T. atroviride, it has to be induced. 10 refs., 3 figs., 3 tabs.

  7. Analysis and discussion on the experimental data of electrolyte analyzer

    Science.gov (United States)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

    During the subsequent verification of electrolyte analyzers, we found that an instrument can achieve good repeatability and stability in repeated measurements within a short period of time, in line with the requirements of the verification regulation for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of different manufacturers differ considerably. In order to identify and solve this problem, to help enterprises improve product quality, and to obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the resulting data statistically.

  8. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTABEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit and an analysis unit. The model builder is intended to build simulation models which describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other hand, the command and control logic and reactor protection systems. The run-time unit manages the dialogue between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of the International Standard Problem ISP-26, which is the simulation of a LOCA on the Japanese ROSA facility

  9. STRATEGIC MANAGEMENT OF TRANSPORT CARGO COMPLEX

    Directory of Open Access Journals (Sweden)

    A. M. Okorokov

    2014-06-01

    Full Text Available Purpose. Qualitative administrative decisions that define the strategy and tactics for developing transport cargo complexes (TCC) and their subsystems are possible only if a flexible optimization model is available. Such a model has to account for the multi-parameter and multi-criteria nature of the task and for uncertain and vague input information, and it must automate the search for the best parameters of the production facility in question. The purpose of the research is to develop procedures for the strategic management of the complex that take the most important factors and their stochastic nature into account and thereby improve the technical equipment of the TCC. Methodology. The strategic management problem is based on solving a set of issues: the optimal number of shunting locomotives, the optimal processing capacity of the handling front and the rational capacity of warehouses. The problem is solved on the basis of the proposed optimality criterion, the specific profit per unit of capital assets of the freight industry. These problems are solved using simulation modeling of the freight industry. Findings. The use of the developed procedure makes it possible to improve the technical equipment of freight stations and complexes. Originality. For the first time, a procedure for the strategic management of development has been devised. This procedure takes into account the probabilistic nature of the demand for the services of transport freight complexes and of the technological processes of client service at the complex stations. The proposed procedure can be applied when planning investments in the creation of transport freight complexes. Practical value. Using simulation models of complex cargo operation as a basic tool allows one to estimate the effectiveness of capital investments, the level of operating costs, as well as the quality of meeting the demands of potential customers in transportations at the stage of

  10. Optimal recombination in genetic algorithms for combinatorial optimization problems: Part II

    Directory of Open Access Journals (Sweden)

    Eremeev Anton V.

    2014-01-01

    Full Text Available This paper surveys results on complexity of the optimal recombination problem (ORP, which consists in finding the best possible offspring as a result of a recombination operator in a genetic algorithm, given two parent solutions. In Part II, we consider the computational complexity of ORPs arising in genetic algorithms for problems on permutations: the Travelling Salesman Problem, the Shortest Hamilton Path Problem and the Makespan Minimization on Single Machine and some other related problems. The analysis indicates that the corresponding ORPs are NP-hard, but solvable by faster algorithms, compared to the problems they are derived from.
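
    To make the notion of optimal recombination concrete, here is a small hypothetical sketch for the binary-string case treated in Part I of this survey (not the permutation ORPs analyzed in Part II): the offspring must inherit every gene from one of the two parents, so an exhaustive search only needs to enumerate the positions where the parents differ. The fitness function is supplied by the caller.

    ```python
    from itertools import product

    def optimal_recombination(parent1, parent2, fitness):
        """Return the best offspring that takes every gene from one of the parents.

        Brute force over the positions where the parents differ; exponential in
        the Hamming distance, so only meant to illustrate the ORP definition.
        """
        diff = [i for i, (a, b) in enumerate(zip(parent1, parent2)) if a != b]
        best, best_fit = None, float("-inf")
        for choice in product((0, 1), repeat=len(diff)):
            child = list(parent1)
            for pos, take_second in zip(diff, choice):
                if take_second:
                    child[pos] = parent2[pos]
            f = fitness(child)
            if f > best_fit:
                best, best_fit = child, f
        return best, best_fit

    # Toy usage: maximize the number of ones (OneMax).
    p1 = [1, 0, 1, 0, 1, 0]
    p2 = [0, 0, 1, 1, 0, 1]
    print(optimal_recombination(p1, p2, fitness=sum))
    ```

    The search is exponential in the Hamming distance between the parents, which is precisely why the complexity of ORPs is worth studying for harder representations such as permutations.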

  11. PROBLEMS OF UKRAINIAN ENERGY AND THEIR SOLUTIONS

    Directory of Open Access Journals (Sweden)

    G. Fyliuk

    2016-04-01

    Full Text Available The paper studies the current situation in the Ukrainian electric power industry. The problems that prevent development of the industry under current conditions are analyzed, and the problem of cross-subsidization is examined in particular. Ways of solving these problems are proposed.

  12. Optimal management with hybrid dynamics : The shallow lake problem

    NARCIS (Netherlands)

    Reddy, P.V.; Schumacher, Hans; Engwerda, Jacob; Camlibel, M.K.; Julius, A.A.; Pasumarthy, R.

    2015-01-01

    In this article we analyze an optimal management problem that arises in ecological economics using hybrid systems modeling. First, we introduce a discounted autonomous infinite horizon hybrid optimal control problem and develop a few tools to analyze the necessary conditions for optimality. Next,

  13. Learning about “wicked” problems in the Global South. Creating a film-based learning environment with “Visual Problem Appraisal”

    Directory of Open Access Journals (Sweden)

    Loes Witteveen

    2012-03-01

    Full Text Available The current complexity of sustainable development in the Global South calls for the design of learning strategies that can deal with this complexity. One such innovative learning strategy, called Visual Problem Appraisal (VPA, is highlighted in this article. The strategy is termed visual as it creates a learning environment that is film-based. VPA enhances the analysis of complex issues, and facilitates stakeholder dialogue and action planning. The strategy is used in workshops dealing with problem analysis and policy design, and involves the participants “meeting” stakeholders through filmed narratives. The article demonstrates the value of using film in multi stakeholder learning environments addressing issues concerning sustainable development.

  14. Composing complex EXAFS problems with severe information constraints

    International Nuclear Information System (INIS)

    Ravel, B

    2009-01-01

    In recent work, a model for the structural environment of Hg bound to a catalytic DNA sensor was proposed on the basis of EXAFS data analysis. Although severely constrained by limited data quality and scant supporting structural data, a compelling structural model was found which agreed with a similar but less detailed model proposed on the basis of NMR data. I discuss in detail the successes and limitations of the analytical strategy that was implemented in the earlier work. I then speculate on future software requirements needed to make this and similarly complex analytical strategies more available to the wider audience of EXAFS practitioners.

  15. Augmented neural networks and problem structure-based heuristics for the bin-packing problem

    Science.gov (United States)

    Kasap, Nihat; Agarwal, Anurag

    2012-08-01

    In this article, we report on a research project where we applied augmented-neural-networks (AugNNs) approach for solving the classical bin-packing problem (BPP). AugNN is a metaheuristic that combines a priority rule heuristic with the iterative search approach of neural networks to generate good solutions fast. This is the first time this approach has been applied to the BPP. We also propose a decomposition approach for solving harder BPP, in which subproblems are solved using a combination of AugNN approach and heuristics that exploit the problem structure. We discuss the characteristics of problems on which such problem structure-based heuristics could be applied. We empirically show the effectiveness of the AugNN and the decomposition approach on many benchmark problems in the literature. For the 1210 benchmark problems tested, 917 problems were solved to optimality and the average gap between the obtained solution and the upper bound for all the problems was reduced to under 0.66% and computation time averaged below 33 s per problem. We also discuss the computational complexity of our approach.
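
    The AugNN metaheuristic itself is not spelled out in the abstract; as a stand-in for its priority-rule component only (a hypothetical baseline, not the authors' code), the classical first-fit-decreasing rule for bin packing looks like this:

    ```python
    def first_fit_decreasing(items, capacity):
        """Pack item sizes into as few bins of the given capacity as possible,
        using the first-fit-decreasing priority rule."""
        bins = []      # remaining free space per open bin
        packing = []   # list of item lists, parallel to `bins`
        for size in sorted(items, reverse=True):
            for i, free in enumerate(bins):
                if size <= free:
                    bins[i] -= size
                    packing[i].append(size)
                    break
            else:                       # no open bin fits: open a new one
                bins.append(capacity - size)
                packing.append([size])
        return packing

    print(first_fit_decreasing([4, 8, 1, 4, 2, 1, 7, 3], capacity=10))
    ```

    As described above, an AugNN-style approach then embeds such a rule in an iterative, weight-adjusting search rather than applying it once in a fixed order.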

  16. Evolving hard problems: Generating human genetics datasets with a complex etiology

    Directory of Open Access Journals (Sweden)

    Himmelstein Daniel S

    2011-07-01

    Full Text Available Abstract Background A goal of human genetics is to discover genetic factors that influence individuals' susceptibility to common diseases. Most common diseases are thought to result from the joint failure of two or more interacting components instead of single component failures. This greatly complicates both the task of selecting informative genetic variants and the task of modeling interactions between them. We and others have previously developed algorithms to detect and model the relationships between these genetic factors and disease. Previously these methods have been evaluated with datasets simulated according to pre-defined genetic models. Results Here we develop and evaluate a model free evolution strategy to generate datasets which display a complex relationship between individual genotype and disease susceptibility. We show that this model free approach is capable of generating a diverse array of datasets with distinct gene-disease relationships for an arbitrary interaction order and sample size. We specifically generate eight-hundred Pareto fronts; one for each independent run of our algorithm. In each run the predictiveness of single genetic variation and pairs of genetic variants have been minimized, while the predictiveness of third, fourth, or fifth-order combinations is maximized. Two hundred runs of the algorithm are further dedicated to creating datasets with predictive four or five order interactions and minimized lower-level effects. Conclusions This method and the resulting datasets will allow the capabilities of novel methods to be tested without pre-specified genetic models. This allows researchers to evaluate which methods will succeed on human genetics problems where the model is not known in advance. We further make freely available to the community the entire Pareto-optimal front of datasets from each run so that novel methods may be rigorously evaluated. These 76,600 datasets are available from http://discovery.dartmouth.edu/model_free_data/.

  17. Advances in real and complex analysis with applications

    CERN Document Server

    Cho, Yeol; Agarwal, Praveen; Area, Iván

    2017-01-01

    This book discusses a variety of topics in mathematics and engineering as well as their applications, clearly explaining the mathematical concepts in the simplest possible way and illustrating them with a number of solved examples. The topics include real and complex analysis, special functions and analytic number theory, q-series, Ramanujan’s mathematics, fractional calculus, Clifford and harmonic analysis, graph theory, complex analysis, complex dynamical systems, complex function spaces and operator theory, geometric analysis of complex manifolds, geometric function theory, Riemannian surfaces, Teichmüller spaces and Kleinian groups, engineering applications of complex analytic methods, nonlinear analysis, inequality theory, potential theory, partial differential equations, numerical analysis , fixed-point theory, variational inequality, equilibrium problems, optimization problems, stability of functional equations, and mathematical physics.  It includes papers presented at the 24th International Confe...

  18. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    Science.gov (United States)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS/129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that astronauts "were awakened again", as they had been the previous day. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures with known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short and, especially, long-term activities have a persistent need for simulation from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  19. Ionic strength dependence of stability constants, complexation of Molybdenum(VI) with EDTA

    International Nuclear Information System (INIS)

    Zare, K.; Majlesi, K.; Teimoori, F.

    2002-01-01

    The stability constant of Mo(VI) complexes with EDTA in aqueous solution has been determined by various authors using different techniques, but according to the literature, no work has been reported on the ionic strength dependence of these complexes. The present work describes the complexation of Mo(VI) with EDTA over an ionic strength range of 0.1 to 1.0 mol dm-3 sodium perchlorate at 25 degrees C. The complexation of molybdenum(VI) with EDTA was investigated in aqueous solution in the pH range 5 to 7 using UV spectrophotometric techniques. The composition of the complex was determined by the continuous variations method. It was shown that molybdenum(VI) forms a 2:1 complex with EDTA of the type (MoO3)2L4- at pH 5.5. The parameters that define the dependence on ionic strength were analyzed with the aim of obtaining further information on their variation as a function of the charges involved in the complexation reaction. Moreover, a Debye-Huckel type equation makes it possible to estimate a stability constant at a fixed ionic strength when its value is known for another ionic medium in the range 0.1 to 1.0 mol dm-3. The evaluation may therefore make a significant contribution to solving many analytical and speciation problems
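
    The Debye-Huckel type equation mentioned above is usually written in an extended one-parameter form of the following kind (quoted here as the generic expression used in ionic-strength-dependence work; the coefficients fitted in the original study are not reproduced):

    ```latex
    \log\beta(I) \;=\; \log\beta(I^{*})
    \;-\; A\,Z^{*}\!\left(\frac{\sqrt{I}}{1+\sqrt{I}}-\frac{\sqrt{I^{*}}}{1+\sqrt{I^{*}}}\right)
    \;+\; C\,(I-I^{*}),
    \qquad
    Z^{*}=\sum z_{\mathrm{reactants}}^{2}-\sum z_{\mathrm{products}}^{2},
    ```

    where I and I* are two ionic strengths, A is the Debye-Huckel slope (about 0.509 at 25 degrees C in molal units) and C is an empirical parameter; higher-order terms in (I^{3/2} - I*^{3/2}) are sometimes added when a single C does not fit the data.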

  20. The application of game theory and cognitive economy to analyze the problem of undesired location

    International Nuclear Information System (INIS)

    Villani, S.

    2008-01-01

    Analysts of public decision-making processes have long been discussing how to establish strategies for managing environmental conflicts, above all the so-called problems of undesired location of public works and facilities, efficiently (i.e. on a short-term basis, so as to guarantee the stability of decisions and agreements) and fairly (the parties' satisfaction is itself a further guarantee of decision and agreement stability). Every such strategy, however, is still a work in progress, like a universe yet to be created and explored. In this paper we therefore focus on the analysis of the problem and put forward some theoretical proposals for a new interpretive model of public decision-making processes, based on the achievements of two recent disciplines, evolutionary game theory and cognitive economics, both of which share their field of investigation with law and economic science.

  1. A framework to approach problems of forensic anthropology using complex networks

    Science.gov (United States)

    Caridi, Inés; Dorso, Claudio O.; Gallo, Pablo; Somigliana, Carlos

    2011-05-01

    We have developed a method to analyze and interpret emerging structures in a set of data which lacks some information. It has been conceived to be applied to the problem of getting information about people who disappeared in the Argentine state of Tucumán from 1974 to 1981. Even if the military dictatorship formally started in Argentina had begun in 1976 and lasted until 1983, the disappearance and assassination of people began some months earlier. During this period several circuits of Illegal Detention Centres (IDC) were set up in different locations all over the country. In these secret centres, disappeared people were illegally kept without any sort of constitutional guarantees, and later assassinated. Even today, the final destination of most of the disappeared people’s remains is still unknown. The fundamental hypothesis in this work is that a group of people with the same political affiliation whose disappearances were closely related in time and space shared the same place of captivity (the same IDC or circuit of IDCs). This hypothesis makes sense when applied to the systematic method of repression and disappearances which was actually launched in Tucumán, Argentina (2007) [11]. In this work, the missing individuals are identified as nodes on a network and connections are established among them based on the individuals’ attributes while they were alive, by using rules to link them. In order to determine which rules are the most effective in defining the network, we use other kind of knowledge available in this problem: previous results from the anthropological point of view (based on other sources of information, both oral and written, historical and anthropological data, etc.); and information about the place (one or more IDCs) where some people were kept during their captivity. For these best rules, a prediction about these people’s possible destination is assigned (one or more IDCs where they could have been kept), and the success of the
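
    A minimal sketch of the network-construction step described above, assuming hypothetical attribute fields (`affiliation`, `date`, `place`) and one simple linking rule (same affiliation, same place, disappearances within 30 days); it uses `networkx` and invented toy records, not the actual Tucuman data.

    ```python
    from datetime import date
    import networkx as nx

    # Invented toy records standing in for the missing-person attributes.
    people = {
        "A": {"affiliation": "union", "date": date(1975, 3, 2), "place": "Famailla"},
        "B": {"affiliation": "union", "date": date(1975, 3, 20), "place": "Famailla"},
        "C": {"affiliation": "student", "date": date(1976, 7, 1), "place": "Tucuman"},
    }

    def linked(p, q, max_days=30):
        """Linking rule: same affiliation, same place, disappearances close in time."""
        return (p["affiliation"] == q["affiliation"]
                and p["place"] == q["place"]
                and abs((p["date"] - q["date"]).days) <= max_days)

    G = nx.Graph()
    G.add_nodes_from(people)
    ids = list(people)
    for i, u in enumerate(ids):
        for v in ids[i + 1:]:
            if linked(people[u], people[v]):
                G.add_edge(u, v)

    # Connected components are candidate groups that may share a detention centre.
    print([sorted(c) for c in nx.connected_components(G)])
    ```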

  2. [Problems encountered by hospital pharmacists with information systems: Analysis of exchanges within social networks].

    Science.gov (United States)

    Charpiat, B; Mille, F; Fombeur, P; Machon, J; Zawadzki, E; Bobay-Madic, A

    2018-05-21

    The development of information systems in French hospitals is mandatory. The aim of this work was to analyze the content of exchanges carried out within social networks, dealing with problems encountered with hospital pharmacies information systems. Messages exchanged via the mailing list of the Association pour le Digital et l'Information en Pharmacie and abstracts of communications presented at hospital pharmacists trade union congresses were analyzed. Those referring to information systems used in hospital pharmacies were selected. From March 2015 to June 2016, 122 e-mails sent by 80 pharmacists concerned information systems. From 2002 to 2016, 45 abstracts dealt with this topic. Problems most often addressed in these 167 documents were "parameterization and/or functionalities" (n=116), interfaces and complexity of the hospital information systems (n=52), relationship with health information technologies vendors and poor reactivity (n=32), additional workload (n=32), ergonomics (n=30), insufficient user training (n=22). These problems are interdependent, lead to errors and in order to mitigate their consequences, they compel pharmacy professionals to divert a significant amount of working hours to the detriment of pharmaceutical care and dispensing and preparing drugs. Hospital pharmacists are faced with many problems of insecurity and inefficiency generated by information systems. Researches are warranted to determine their cost, specify their deleterious effects on care and identify the safest information systems. Copyright © 2018 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  3. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% systems-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas both in the government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it?, What does it do?, What does it cost?, What else will do the task?, and What would that cost? Using logic and a disciplined approach, the result of the Value Analysis performs the necessary functions at a high quality and the lowest overall cost

  4. Complexities, Catastrophes and Cities: Emergency Dynamics in Varying Scenarios and Urban Topologies

    Science.gov (United States)

    Narzisi, Giuseppe; Mysore, Venkatesh; Byeon, Jeewoong; Mishra, Bud

    Complex Systems are often characterized by agents capable of interacting with each other dynamically, often in non-linear and non-intuitive ways. Trying to characterize their dynamics often results in partial differential equations that are difficult, if not impossible, to solve. A large city or a city-state is an example of such an evolving and self-organizing complex environment that efficiently adapts to different and numerous incremental changes to its social, cultural and technological infrastructure [1]. One powerful technique for analyzing such complex systems is Agent-Based Modeling (ABM) [9], which has seen an increasing number of applications in social science, economics and also biology. The agent-based paradigm facilitates easier transfer of domain specific knowledge into a model. ABM provides a natural way to describe systems in which the overall dynamics can be described as the result of the behavior of populations of autonomous components: agents, with a fixed set of rules based on local information and possible central control. As part of the NYU Center for Catastrophe Preparedness and Response (CCPR1), we have been exploring how ABM can serve as a powerful simulation technique for analyzing large-scale urban disasters. The central problem in Disaster Management is that it is not immediately apparent whether the current emergency plans are robust against such sudden, rare and punctuated catastrophic events.
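
    As a highly simplified, hypothetical illustration of the agent-based paradigm described above (not the CCPR models), the sketch below runs a toy evacuation: agents along a corridor move toward an exit at individual speeds, and the simulation reports how many have reached safety per time step. All parameters are invented.

    ```python
    import random

    class Evacuee:
        """An agent with a position and a speed; rule: move toward the exit at 0."""
        def __init__(self, position, speed):
            self.position = position
            self.speed = speed
            self.safe = False

        def step(self):
            if not self.safe:
                self.position = max(0.0, self.position - self.speed)
                self.safe = self.position == 0.0

    def run(n_agents=100, corridor_length=50.0, steps=60, seed=1):
        random.seed(seed)
        agents = [Evacuee(random.uniform(0, corridor_length), random.uniform(0.5, 2.0))
                  for _ in range(n_agents)]
        history = []
        for _ in range(steps):
            for a in agents:
                a.step()
            history.append(sum(a.safe for a in agents))
        return history

    print(run()[-1], "of 100 agents evacuated after 60 steps")
    ```

    Even a toy like this shows the appeal of the approach: the global evacuation curve emerges from purely local rules, and richer rules (congestion, blocked routes, central control) can be layered on without rewriting the model.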

  5. Improved Ant Colony Optimization for Seafood Product Delivery Routing Problem

    Directory of Open Access Journals (Sweden)

    Baozhen Yao

    2014-02-01

    Full Text Available This paper deals with a real-life vehicle delivery routing problem: a seafood product delivery routing problem. Considering the features of this problem, the paper formulates it as a multi-depot open vehicle routing problem. Since the multi-depot open vehicle routing problem is very complex, its complexity is reduced by converting it into an open vehicle routing problem with a dummy central depot. Ant colony optimization is then used to solve the problem, and a crossover operation and several adaptive strategies are employed to improve the performance of the algorithm. Finally, the computational results for the benchmark problems of the multi-depot vehicle routing problem indicate that the proposed ant colony optimization is an effective method for solving the multi-depot vehicle routing problem. Furthermore, the computational results for the seafood product delivery problem from Dalian, China also suggest that the proposed ant colony optimization is feasible for solving the seafood product delivery routing problem.
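
    The reduction described above, collapsing a multi-depot open VRP into a single-depot problem by adding a dummy central depot, amounts to a simple transformation of the cost matrix. The sketch below is a hypothetical illustration (zero-cost arcs from the dummy node to the real depots, and zero-cost return arcs because routes are open), not the authors' implementation.

    ```python
    import math

    def add_dummy_depot(dist, depot_indices):
        """Extend a distance matrix with a dummy depot at index 0.

        Outgoing arcs from the dummy node cost 0 only to the real depots
        (a vehicle must start at some depot); incoming arcs to the dummy
        node cost 0 from every node, modelling open routes that end
        wherever the last customer is.
        """
        n = len(dist)
        big = math.inf
        new = [[0.0] * (n + 1) for _ in range(n + 1)]
        for i in range(n):
            for j in range(n):
                new[i + 1][j + 1] = dist[i][j]
        for j in range(n):
            new[0][j + 1] = 0.0 if j in depot_indices else big   # dummy -> depots only
            new[j + 1][0] = 0.0                                   # any node -> dummy (open route)
        return new

    # Toy 4-node instance: nodes 0 and 1 are depots, 2 and 3 are customers.
    dist = [[0, 5, 3, 9],
            [5, 0, 6, 2],
            [3, 6, 0, 4],
            [9, 2, 4, 0]]
    print(add_dummy_depot(dist, depot_indices={0, 1})[0])
    ```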

  6. Dynamical complexity detection in geomagnetic activity indices using wavelet transforms and Tsallis entropy

    Science.gov (United States)

    Balasis, G.; Daglis, I. A.; Papadimitriou, C.; Kalimeri, M.; Anastasiadis, A.; Eftaxias, K.

    2008-12-01

    Dynamical complexity detection for output time series of complex systems is one of the foremost problems in physics, biology, engineering, and economic sciences. Especially in magnetospheric physics, accurate detection of the dissimilarity between normal and abnormal states (e.g. pre-storm activity and magnetic storms) can vastly improve space weather diagnosis and, consequently, the mitigation of space weather hazards. Herein, we examine the fractal spectral properties of the Dst data using a wavelet analysis technique. We show that distinct changes in associated scaling parameters occur (i.e., transition from anti-persistent to persistent behavior) as an intense magnetic storm approaches. We then analyze Dst time series by introducing the non-extensive Tsallis entropy, Sq, as an appropriate complexity measure. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). The Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization.
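
    The non-extensive Tsallis entropy used above has the standard definition S_q = (1 - sum_i p_i^q)/(q - 1). A minimal sketch of estimating it over successive windows of a time series follows; the window length, the number of histogram bins and the value q = 1.8 are arbitrary choices for the example, not the settings of the study.

    ```python
    import numpy as np

    def tsallis_entropy(window, q=1.8, bins=16):
        """Tsallis entropy S_q of one data window, estimated from a histogram."""
        counts, _ = np.histogram(window, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def sliding_tsallis(series, window_len=256, q=1.8):
        """S_q computed over non-overlapping windows of the series."""
        n = len(series) // window_len
        return [tsallis_entropy(series[i * window_len:(i + 1) * window_len], q=q)
                for i in range(n)]

    # Toy input: noise with a slowly drifting mean, standing in for a Dst-like index.
    rng = np.random.default_rng(0)
    x = np.cumsum(rng.normal(size=4096)) * 0.1 + rng.normal(size=4096)
    print(np.round(sliding_tsallis(x), 3))
    ```

    In a storm study, a drop in S_q over a window would flag the more organized ("pathological") state described above.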

  7. Can the complex networks help us in the resolution of the problem of power outages (blackouts) in Brazil?

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Paulo Alexandre de; Souza, Thaianne Lopes de [Universidade Federal de Goias (UFG), Catalao, GO (Brazil)

    2011-07-01

    Full text. What do the Brazilian soccer championship, Hollywood actors, the Internet, the spread of viruses and the electric distribution network have in common? Until less than two decades ago, the answer would have been 'nothing' or 'almost nothing'. Today, however, the answer to the same question is 'everything' or 'almost everything'. The answer can be found in a sub-area of statistical physics called the science of complex networks, which has been used to approach and study the most diverse natural and man-made systems, whether social, informational, technological or biological. In this work we study the electric power distribution network in Brazil (DEEB) from a complex-networks perspective, associating stations and/or substations with the vertices of a network and transmission lines with the links between the vertices. We also carry out a comparative study with the best-known models of complex networks, such as Erdoes-Renyi, the Configuration Model and Barabasi-Albert, and compare them with results obtained for real electrical distribution networks. Based on this information, we perform a comparative analysis using the following quantities, which are frequently used in studies of complex networks: connectivity distribution, diameter and clustering coefficient. We emphasize that the main objective of this study is to analyze the robustness of the DEEB network and then to propose alternatives for network connectivity which may contribute to increased robustness in maintenance and/or expansion projects; in other words, our goal is to make the network robust against blackouts, or at least to improve its endurance against them. For this purpose, we use information from the structural properties of networks, computer modeling and simulation. (author)
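
    A rough sketch of the kind of model comparison the abstract describes, computing mean degree, clustering coefficient and diameter for Erdos-Renyi and Barabasi-Albert graphs of matched size with `networkx`; the real DEEB grid data are not public, so only the synthetic models are generated here, at a toy size.

    ```python
    import networkx as nx

    def summarize(G, name):
        """Print the metrics used in such studies: mean degree, clustering, diameter."""
        degrees = [d for _, d in G.degree()]
        giant = G.subgraph(max(nx.connected_components(G), key=len))
        print(f"{name:16s} <k>={sum(degrees) / len(degrees):.2f} "
              f"C={nx.average_clustering(G):.3f} "
              f"diameter(giant)={nx.diameter(giant)}")

    n, m = 300, 2                                    # toy size; a real grid would replace this
    er = nx.gnm_random_graph(n, n * m, seed=1)       # Erdos-Renyi with a comparable edge count
    ba = nx.barabasi_albert_graph(n, m, seed=1)      # Barabasi-Albert (scale-free)

    summarize(er, "Erdos-Renyi")
    summarize(ba, "Barabasi-Albert")
    ```

    Robustness can then be probed by removing vertices (randomly or by degree) and re-measuring the giant component, which is the comparison relevant to blackouts.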

  8. Qualitative review of usability problems in health information systems for radiology.

    Science.gov (United States)

    Dias, Camila Rodrigues; Pereira, Marluce Rodrigues; Freire, André Pimenta

    2017-12-01

    Radiology processes are commonly supported by Radiology Information System (RIS), Picture Archiving and Communication System (PACS) and other software for radiology. However, these information technologies can present usability problems that affect the performance of radiologists and physicians, especially considering the complexity of the tasks involved. The purpose of this study was to extract, classify and analyze qualitatively the usability problems in PACS, RIS and other software for radiology. A systematic review was performed to extract usability problems reported in empirical usability studies in the literature. The usability problems were categorized as violations of Nielsen and Molich's usability heuristics. The qualitative analysis indicated the causes and the effects of the identified usability problems. From the 431 papers initially identified, 10 met the study criteria. The analysis of the papers identified 90 instances of usability problems, classified into categories corresponding to established usability heuristics. The five heuristics with the highest number of instances of usability problems were "Flexibility and efficiency of use", "Consistency and standards", "Match between system and the real world", "Recognition rather than recall" and "Help and documentation", respectively. These problems can make the interaction time consuming, causing delays in tasks, dissatisfaction, frustration, preventing users from enjoying all the benefits and functionalities of the system, as well as leading to more errors and difficulties in carrying out clinical analyses. Furthermore, the present paper showed a lack of studies performed on systems for radiology, especially usability evaluations using formal methods of evaluation involving the final users. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Modelling methodology for engineering of complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2014-10-01

    Full Text Available Different systems engineering techniques and approaches are applied to design and develop complex sociotechnical systems for complex problems. In a complex sociotechnical system cognitive and social humans use information technology to make sense...

  10. Model complexity control for hydrologic prediction

    NARCIS (Netherlands)

    Schoups, G.; Van de Giesen, N.C.; Savenije, H.H.G.

    2008-01-01

    A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore

  11. Toward Sustainable Anticipatory Governance: Analyzing and Assessing Nanotechnology Innovation Processes

    Science.gov (United States)

    Foley, Rider Williams

    Cities around the globe struggle with socio-economic disparities, resource inefficiency, environmental contamination, and quality-of-life challenges. Technological innovation, as one prominent approach to problem solving, promises to address these challenges; yet, introducing new technologies, such as nanotechnology, into society and cities has often resulted in negative consequences. Recent research has conceptually linked anticipatory governance and sustainability science: to understand the role of technology in complex problems our societies face; to anticipate negative consequences of technological innovation; and to promote long-term oriented and responsible governance of technologies. This dissertation advances this link conceptually and empirically, focusing on nanotechnology and urban sustainability challenges. The guiding question for this dissertation research is: How can nanotechnology be innovated and governed in responsible ways and with sustainable outcomes? The dissertation: analyzes the nanotechnology innovation process from an actor- and activities-oriented perspective (Chapter 2); assesses this innovation process from a comprehensive perspective on sustainable governance (Chapter 3); constructs a small set of future scenarios to consider future implications of different nanotechnology governance models (Chapter 4); and appraises the amenability of sustainability problems to nanotechnological interventions (Chapter 5). The four studies are based on data collected through literature review, document analysis, participant observation, interviews, workshops, and walking audits, as part of process analysis, scenario construction, and technology assessment. Research was conducted in collaboration with representatives from industry, government agencies, and civic organizations. The empirical parts of the four studies focus on Metropolitan Phoenix. Findings suggest that: predefined mandates and economic goals dominate the nanotechnology innovation process

  12. Atrial fibrillation management in older heart failure patients: a complex clinical problem

    Directory of Open Access Journals (Sweden)

    Giovanni Pulignano

    2016-09-01

    Full Text Available Background: Atrial fibrillation (AF) and heart failure (HF), two problems of growing prevalence as a consequence of the ageing population, are associated with high morbidity, mortality, and healthcare costs. AF and HF also share common risk factors, and pathophysiologic processes such as hypertension, diabetes mellitus, ischemic heart disease, and valvular heart disease often occur together. Although elderly patients with both HF and AF are affected by worse symptoms and poorer prognosis, there is a paucity of data on the appropriate management of these patients. Methods: PubMed was searched for studies on AF and older patients using the terms atrial fibrillation, elderly, heart failure, cognitive impairment, frailty, stroke, and anticoagulants. Results: The clinical picture of HF patients with AF is complex and heterogeneous, with a higher prevalence of frailty, cognitive impairment, and disability. Because of the association of mental and physical impairment with non-administration of oral anticoagulants (OACs), screening for these simple variables in clinical practice may allow better strategies for intervention in this high-risk population. Since novel direct OACs (NOACs) have a more favorable risk-benefit profile, they may be preferable to vitamin K antagonists (VKAs) in many frail elderly patients, especially those at higher risk of falls. Moreover, NOACs are simple to administer and monitor and may be associated with better adherence and safety in patients with cognitive deficits and mobility impairments. Conclusions: Large multicenter longitudinal studies are needed to examine the effects of VKAs and NOACs on long-term cognitive function and frailty; future studies should include geriatric conditions.

  13. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    Science.gov (United States)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information in order to solve problems of control, monitoring and diagnostics. The method is needed to reduce the dimensionality of the informational indicators of situations, to bring them to relative units, to calculate generalized information indicators on their basis, to rank them by characteristic levels, and to calculate the criterion of the efficiency of a system functioning in real time. On this basis, a design for an information evaluation system has been developed that allows information about an object to be analyzed, processed and assessed. Such an object can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.

  14. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater, which turns into a microbial suspension after two weeks; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia. Of these, 15 exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of substances useful for animals and plants, 150 kinds of micro-algae are subjected to screening using protease and chitinase inhibiting activities as the indexes, and it is found that an extract of Isochrysis galbana displays an intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  15. Probabilities and Predictions: Modeling the Development of Scientific Problem-Solving Skills

    Science.gov (United States)

    2005-01-01

    The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations, the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative manner. This article describes the development of probabilistic models of undergraduate student problem solving in molecular genetics that detailed the spectrum of strategies students used when problem solving, and how the strategic approaches evolved with experience. The actions of 776 university sophomore biology majors from three molecular biology lecture courses were recorded and analyzed. Each of six simulations were first grouped by artificial neural network clustering to provide individual performance measures, and then sequences of these performances were probabilistically modeled by hidden Markov modeling to provide measures of progress. The models showed that students with different initial problem-solving abilities choose different strategies. Initial and final strategies varied across different sections of the same course and were not strongly correlated with other achievement measures. In contrast to previous studies, we observed no significant gender differences. We suggest that instructor interventions based on early student performances with these simulations may assist students to recognize effective and efficient problem-solving strategies and enhance learning. PMID:15746978
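
    A compact sketch of the second stage of the modelling chain described above: scoring a sequence of discrete strategy labels under a small hidden Markov model with the scaled forward algorithm. The two states, their parameters and the observation sequence are all invented for illustration; this is generic HMM code, not the IMMEX models.

    ```python
    import numpy as np

    def forward_loglik(obs, start, trans, emit):
        """Log-likelihood of an observation sequence under a discrete HMM
        (scaled forward algorithm)."""
        alpha = start * emit[:, obs[0]]
        loglik = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ trans) * emit[:, o]
            loglik += np.log(alpha.sum())
            alpha /= alpha.sum()
        return loglik

    # Assumed 2-state model: state 0 ~ "limited search", state 1 ~ "efficient strategy".
    start = np.array([0.8, 0.2])
    trans = np.array([[0.7, 0.3],      # students tend to move toward efficiency
                      [0.1, 0.9]])
    emit = np.array([[0.6, 0.3, 0.1],  # 3 observable strategy clusters per state
                     [0.1, 0.3, 0.6]])

    sequence = [0, 0, 1, 2, 2, 2]      # invented strategy-cluster labels over 6 tasks
    print(forward_loglik(sequence, start, trans, emit))
    ```

    In the study's pipeline, the observation symbols would come from the neural-network clustering of individual performances, and the transition structure would be estimated from many such sequences rather than assumed.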

  16. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorials...... problems. Classical single objective optimization is examined first. They then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help...... to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com....

  17. Mitigating Local Natural Disaster through Social Aware Preparedness Using Complexity Approach

    Science.gov (United States)

    Supadli, Irwan; Saputri, Andini; Mawengkang, Herman

    2018-01-01

    During and after natural disaster, such as, eruption of vulcano, many people have to abandon their living place to a temporary shelter. Usually, there could be several time for the occurrence of the eruption. This situation, for example, happened at Sinabung vulcano, located in Karo district of North Sumatera Province, Indonesia. The people in the disaster area have become indifferent. In terms of the society, the local natural disaster problem belong to a complex societal problem. This research is to find a way what should be done to these society to raise their social awareness that they had experienced serious natural disaster and they will be able to live normally and sustainable as before. Societal complexity approach is used to solve the problems. Social studies referred to in this activity are to analyze the social impacts arising from the implementation of the relocation itself. Scope of social impact assessments include are The social impact of the development program of relocation, including the impact of construction activities and long-term impact of construction activity, particularly related to the source and use of clean water, sewerage system, drainage and waste management (solid waste), Social impacts arising associated with occupant relocation sites and the availability of infrastructure (public facilities, include: worship facilities, health and education) in the local environment (pre-existing). Social analysis carried out on the findings of the field, the study related documents and observations of the condition of the existing social environment Siosar settlements.

  18. Use of multiple singular value decompositions to analyze complex intracellular calcium ion signals

    KAUST Repository

    Martinez, Josue G.

    2009-12-01

    We compare calcium ion signaling (Ca(2+)) between two exposures; the data are present as movies, or, more prosaically, time series of images. This paper describes novel uses of singular value decompositions (SVD) and weighted versions of them (WSVD) to extract the signals from such movies, in a way that is semi-automatic and tuned closely to the actual data and their many complexities. These complexities include the following. First, the images themselves are of no interest: all interest focuses on the behavior of individual cells across time, and thus, the cells need to be segmented in an automated manner. Second, the cells themselves have 100+ pixels, so that they form 100+ curves measured over time, so that data compression is required to extract the features of these curves. Third, some of the pixels in some of the cells are subject to image saturation due to bit depth limits, and this saturation needs to be accounted for if one is to normalize the images in a reasonably un-biased manner. Finally, the Ca(2+) signals have oscillations or waves that vary with time and these signals need to be extracted. Thus, our aim is to show how to use multiple weighted and standard singular value decompositions to detect, extract and clarify the Ca(2+) signals. Our signal extraction methods then lead to simple although finely focused statistical methods to compare Ca(2+) signals across experimental conditions.
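
    A bare-bones sketch of the core SVD step described above, assuming one segmented cell stored as a pixels-by-time matrix: the leading singular vector serves as the cell's one-dimensional Ca(2+) time course (with a sign convention so the trace is predominantly positive). The saturation-aware weighting of the WSVD variant is omitted, and the data are synthetic.

    ```python
    import numpy as np

    def dominant_time_course(cell_matrix):
        """Leading right singular vector of a (pixels x time) matrix,
        used as a one-dimensional summary of the cell's Ca2+ signal."""
        U, s, Vt = np.linalg.svd(cell_matrix, full_matrices=False)
        v = Vt[0]
        if v.sum() < 0:          # fix the arbitrary sign of the singular vector
            v = -v
        return s[0] * v

    # Synthetic cell: 120 pixels sharing one oscillatory signal plus pixel noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 200)
    signal = 1.0 + 0.5 * np.sin(2 * np.pi * 0.4 * t)
    cell = np.outer(rng.uniform(0.5, 1.5, size=120), signal)
    cell += 0.1 * rng.normal(size=cell.shape)

    trace = dominant_time_course(cell)
    print(trace.shape, float(trace.max()), float(trace.min()))
    ```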

  19. Discovering Steiner Triple Systems through Problem Solving

    Science.gov (United States)

    Sriraman, Bharath

    2004-01-01

    An attempt to implement problem solving as a teacher of ninth grade algebra is described. The problems selected were not general ones; they involved combinations, represented various situations, and were more complex, which led to the discovery of Steiner triple systems.
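
    For readers who want a concrete object to put in front of students, the smallest non-trivial Steiner triple system, STS(7) (the Fano plane), together with a check that every pair of points lies in exactly one triple, fits in a few lines; the labelling below is one standard choice, given only as an illustration.

    ```python
    from itertools import combinations

    # One standard Steiner triple system of order 7 (the Fano plane).
    sts7 = [(1, 2, 3), (1, 4, 5), (1, 6, 7),
            (2, 4, 6), (2, 5, 7), (3, 4, 7), (3, 5, 6)]

    def is_steiner_triple_system(blocks, n):
        """Every pair from {1..n} must appear in exactly one triple."""
        seen = {}
        for block in blocks:
            for pair in combinations(sorted(block), 2):
                seen[pair] = seen.get(pair, 0) + 1
        return all(seen.get(pair, 0) == 1 for pair in combinations(range(1, n + 1), 2))

    print(is_steiner_triple_system(sts7, 7))   # True
    ```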

  20. Partnerships as panacea for addressing global problems?

    NARCIS (Netherlands)

    A. Kolk (Ans)

    2013-01-01

    textabstractThis chapter examines partnerships and their peculiarities, based on recent research from various disciplines, in the context of the large problems faced by (global) society. These problems are very complex, often cross national boundaries, and cannot easily be 'solved' by one single

  1. Interfacing a fieldable multichannel analyzer to a MicroVAX computer

    International Nuclear Information System (INIS)

    Litherland, K.R.; Johnson, M.W.

    1990-01-01

    This paper reports on software written for interfacing the D.S. Davidson Model 2056 portable multichannel analyzer to a MicroVAX computer running the VMS operating system. The operational objective of the software is to give the user a nearly transparent mechanism for controlling the analyzer with functions equivalent to those on the analyzer's own keyboard, thus minimizing the training requirement for the user. The software is written in VMS enhanced Fortran and consists of a main control program, several subprocesses, and libraries containing graphics commands and other information. Interfaces to other commercially available software packages for data storage and manipulation are provided. Problems encountered and their programming solutions are discussed

  2. Problem solving stages in the five square problem.

    Science.gov (United States)

    Fedor, Anna; Szathmáry, Eörs; Öllinger, Michael

    2015-01-01

    According to the restructuring hypothesis, insight problem solving typically progresses through consecutive stages of search, impasse, insight, and search again for someone, who solves the task. The order of these stages was determined through self-reports of problem solvers and has never been verified behaviorally. We asked whether individual analysis of problem solving attempts of participants revealed the same order of problem solving stages as defined by the theory and whether their subjective feelings corresponded to the problem solving stages they were in. Our participants tried to solve the Five-Square problem in an online task, while we recorded the time and trajectory of their stick movements. After the task they were asked about their feelings related to insight and some of them also had the possibility of reporting impasse while working on the task. We found that the majority of participants did not follow the classic four-stage model of insight, but had more complex sequences of problem solving stages, with search and impasse recurring several times. This means that the classic four-stage model is not sufficient to describe variability on the individual level. We revised the classic model and we provide a new model that can generate all sequences found. Solvers reported insight more often than non-solvers and non-solvers reported impasse more often than solvers, as expected; but participants did not report impasse more often during behaviorally defined impasse stages than during other stages. This shows that impasse reports might be unreliable indicators of impasse. Our study highlights the importance of individual analysis of problem solving behavior to verify insight theory.

  3. Problem solving stages in the five square problem

    Directory of Open Access Journals (Sweden)

    Anna eFedor

    2015-08-01

    According to the restructuring hypothesis, insight problem solving typically progresses through consecutive stages of search, impasse, insight, and renewed search for someone who solves the task. The order of these stages was determined through self-reports of problem solvers and has never been verified behaviourally. We asked whether individual analysis of participants' problem solving attempts revealed the same order of problem solving stages as defined by the theory and whether their subjective feelings corresponded to the problem solving stages they were in. 101 participants tried to solve the Five-Square problem in an online task, while we recorded the time and trajectory of their stick movements. After the task they were asked about their feelings related to insight, and 67 of them also had the possibility of reporting impasse while working on the task. We found that 49% (19 out of 39) of the solvers and 13% (8 out of 62) of the non-solvers followed the classic four-stage model of insight. The rest of the participants had more complex sequences of problem solving stages, with search and impasse recurring several times. This means that the classic four-stage model must be extended to explain variability on the individual level. We provide a model that can generate all sequences found. Solvers reported insight more often than non-solvers, and non-solvers reported impasse more often than solvers, as expected; but participants did not report impasse more often during behaviourally defined impasse stages than during other stages. This shows that impasse reports might be unreliable indicators of impasse. Our study highlights the importance of individual analysis of problem solving behaviour to verify insight theory.
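
    The authors' revised model is not reproduced in these records. As a guessed illustration only, the sketch below generates stage sequences from a simple Markov-style state machine in which search and impasse may recur before a solution or a give-up; the state names, transition structure, and probabilities are assumptions made for demonstration.

    ```python
    # Illustrative (not the authors') generator of problem-solving stage
    # sequences in which SEARCH and IMPASSE may recur before termination.
    import random

    # Assumed transition table: state -> list of (next_state, probability).
    TRANSITIONS = {
        "SEARCH":  [("IMPASSE", 0.4), ("INSIGHT", 0.2), ("GIVE_UP", 0.1), ("SEARCH", 0.3)],
        "IMPASSE": [("SEARCH", 0.6), ("INSIGHT", 0.2), ("GIVE_UP", 0.2)],
        "INSIGHT": [("SEARCH", 0.7), ("SOLUTION", 0.3)],
    }
    TERMINAL = {"SOLUTION", "GIVE_UP"}

    def generate_sequence(max_steps=20):
        state, sequence = "SEARCH", ["SEARCH"]
        while state not in TERMINAL and len(sequence) < max_steps:
            states, probs = zip(*TRANSITIONS[state])
            state = random.choices(states, probs)[0]
            sequence.append(state)
        return sequence

    if __name__ == "__main__":
        for _ in range(3):
            print(" -> ".join(generate_sequence()))
    ```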

  4. Complex analysis with applications to flows and fields

    CERN Document Server

    Braga da Costa Campos, Luis Manuel

    2012-01-01

    Complex Analysis with Applications to Flows and Fields presents the theory of functions of a complex variable, from the complex plane to the calculus of residues to power series to conformal mapping. The book explores numerous physical and engineering applications concerning potential flows, the gravity field, electro- and magnetostatics, steady heat conduction, and other problems. It provides the mathematical results to sufficiently justify the solution of these problems, eliminating the need to consult external references. The book is conveniently divided into four parts. In each part, the ma
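
    As a standard worked example of the calculus of residues mentioned in the blurb (not an excerpt from the book), a real integral can be evaluated by closing the contour in the upper half-plane around the pole at z = i:

    ```latex
    % Worked example: a real integral evaluated with the residue theorem.
    \int_{-\infty}^{\infty} \frac{dx}{1+x^{2}}
      = 2\pi i \,\operatorname{Res}_{z=i} \frac{1}{1+z^{2}}
      = 2\pi i \cdot \frac{1}{2i}
      = \pi .
    ```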

  5. Distributed Cooperation Solution Method of Complex System Based on MAS

    Science.gov (United States)

    Weijin, Jiang; Yuhui, Xu

    To adapt reconfigurable fault diagnosis models to dynamic environments and to fully meet the needs of diagnosing complex systems, this paper applies multi-agent technology to complicated fault diagnosis and studies an integrated intelligent control system. Based on a hierarchical model of diagnostic decision making and a multi-layer decomposition strategy for the diagnosis task, a multi-agent synchronous diagnosis federation integrating different knowledge representation modes and inference mechanisms is presented. The functions of the management agent, diagnosis agent, and decision agent are analyzed; the organization and evolution of agents in the system are proposed; the corresponding conflict resolution algorithm is given; and a layered structure of abstract agents with public attributes is built. The system architecture is realized on a distributed, layered MAS blackboard. A real-world application shows that the proposed control structure successfully solves the fault diagnosis problem of a complex plant and has particular advantages in distributed domains.
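
    The abstract describes the architecture only at a high level. The sketch below is a rough, hypothetical rendering of a layered blackboard on which management, diagnosis, and decision agents post and read entries; class names, the rule-based inference, and the trivial conflict-resolution step are invented for illustration and do not reproduce the paper's algorithms.

    ```python
    # Minimal, hypothetical sketch of a layered-blackboard multi-agent
    # diagnosis loop (management -> diagnosis -> decision).
    from collections import defaultdict

    class Blackboard:
        """Shared memory partitioned into named layers."""
        def __init__(self):
            self.layers = defaultdict(list)

        def post(self, layer, entry):
            self.layers[layer].append(entry)

        def read(self, layer):
            return list(self.layers[layer])

    class ManagementAgent:
        def decompose(self, bb, symptoms):
            # Split the overall diagnosis task into per-subsystem subtasks.
            for subsystem, reading in symptoms.items():
                bb.post("tasks", {"subsystem": subsystem, "reading": reading})

    class DiagnosisAgent:
        def __init__(self, subsystem, threshold):
            self.subsystem, self.threshold = subsystem, threshold

        def diagnose(self, bb):
            # Simple threshold rule standing in for richer knowledge modes.
            for task in bb.read("tasks"):
                if task["subsystem"] == self.subsystem:
                    fault = task["reading"] > self.threshold
                    bb.post("hypotheses", {"subsystem": self.subsystem, "fault": fault})

    class DecisionAgent:
        def decide(self, bb):
            # Conflict resolution here is just "any reported fault wins".
            faults = [h["subsystem"] for h in bb.read("hypotheses") if h["fault"]]
            return faults or ["no fault detected"]

    if __name__ == "__main__":
        bb = Blackboard()
        ManagementAgent().decompose(bb, {"pump": 7.2, "valve": 1.1})
        for agent in (DiagnosisAgent("pump", 5.0), DiagnosisAgent("valve", 3.0)):
            agent.diagnose(bb)
        print(DecisionAgent().decide(bb))   # -> ['pump']
    ```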

  6. Forecasting of Processes in Complex Systems for Real-World Problems

    Czech Academy of Sciences Publication Activity Database

    Pelikán, Emil

    2014-01-01

    Vol. 24, No. 6 (2014), pp. 567-589. ISSN 1210-0552. Institutional support: RVO:67985807. Keywords: complex systems * data assimilation * ensemble forecasting * forecasting * global solar radiation * judgmental forecasting * multimodel forecasting * pollution. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.479, year: 2014

  7. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    In this paper we analyze the complexity of time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support capturing of time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given the unsatisfactory results of the contemporary process modeling languages, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, the PSD process modeling language is presented, which supports the defined lifecycles of a time limit natively and therefore allows keeping the models simple and easy to understand.
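
    The paper's actual lifecycle definitions are not reproduced in this abstract. The sketch below only illustrates the general idea of modeling a statutory time limit as an explicit lifecycle with its own states and legal transitions; the state names and transition table are made-up assumptions, not the PSD language's constructs.

    ```python
    # Hypothetical lifecycle of a statutory time limit as a small state machine.
    from enum import Enum, auto

    class LimitState(Enum):
        RUNNING = auto()     # clock is ticking
        SUSPENDED = auto()   # e.g. waiting for the applicant to supply documents
        MET = auto()         # obligation fulfilled before expiry
        EXPIRED = auto()     # deadline passed without fulfilment

    # Allowed transitions; anything else is an illegal lifecycle step.
    TRANSITIONS = {
        LimitState.RUNNING:   {LimitState.SUSPENDED, LimitState.MET, LimitState.EXPIRED},
        LimitState.SUSPENDED: {LimitState.RUNNING},
        LimitState.MET:       set(),
        LimitState.EXPIRED:   set(),
    }

    class TimeLimit:
        def __init__(self):
            self.state = LimitState.RUNNING

        def move_to(self, new_state):
            if new_state not in TRANSITIONS[self.state]:
                raise ValueError(f"illegal transition {self.state} -> {new_state}")
            self.state = new_state

    if __name__ == "__main__":
        limit = TimeLimit()
        limit.move_to(LimitState.SUSPENDED)   # proceedings interrupted
        limit.move_to(LimitState.RUNNING)     # clock resumes
        limit.move_to(LimitState.MET)         # decision issued in time
        print(limit.state)
    ```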

  8. How to make a complex story understandable. Communication on nitrogen

    International Nuclear Information System (INIS)

    Bleeker, A.; Hensen, A.; Erisman, J.W.

    2011-01-01

    Understanding is the first step towards solving the nitrogen problem. Various applications have been developed to gain insight into the complex interactions between the nitrogen cycle and social-economic and environmental aspects. Experience has shown that many users have not only gained a clearer picture of the urgency and complexity of the problem; they now also have options for dealing with the nitrogen problem.

  9. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy

  10. Optimizing GC Injections when Analyzing δ2H of Vanillin for Traceability Studies

    DEFF Research Database (Denmark)

    Hansen, Anne-Mette Sølvbjerg; Fromberg, Arvid; Frandsen, Henrik Lauritz

    Column overloading is a problem when analyzing δ2H, due to the low natural abundance of deuterium and the poor ionization efficiency of H2. This problem can be overcome by using split injections instead of splitless injections. In this study we compared the influence on the measured isotopic ratios when ...

  11. Systemic decision making fundamentals for addressing problems and messes

    CERN Document Server

    Hester, Patrick T

    2017-01-01

    This expanded second edition of the 2014 textbook features dedicated sections on action and observation, so that the reader can combine the use of the developed theoretical basis with practical guidelines for deployment. It also includes a focus on selection and use of a dedicated modeling paradigm – fuzzy cognitive mapping – to facilitate use of the proposed multi-methodology. The end goal of the text is a holistic, interdisciplinary approach to structuring and assessing complex problems, including a dedicated discussion of thinking, acting, and observing complex problems. The multi-methodology developed is scientifically grounded in systems theory and its accompanying principles, while the process emphasizes the nonlinear nature of all complex problem-solving endeavors. The authors’ clear and consistent chapter structure facilitates the book’s use in the classroom.
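
    Fuzzy cognitive mapping is named above as the book's dedicated modeling paradigm. As a generic illustration of how such a map is iterated (not an example taken from the book), the sketch below applies the standard sigmoid-squashed weighted-sum update to a small concept map whose weights are purely illustrative assumptions.

    ```python
    # Standard fuzzy cognitive map iteration: x_{t+1} = sigmoid(W @ x_t).
    import numpy as np

    def sigmoid(x, lam=1.0):
        return 1.0 / (1.0 + np.exp(-lam * x))

    def run_fcm(weights, state, steps=50, tol=1e-6):
        """Iterate the FCM until the concept activations stabilise."""
        for _ in range(steps):
            new_state = sigmoid(weights @ state)
            if np.max(np.abs(new_state - state)) < tol:
                break
            state = new_state
        return state

    if __name__ == "__main__":
        # Toy 3-concept map (weights are illustrative assumptions).
        W = np.array([[0.0,  0.6, -0.4],
                      [0.3,  0.0,  0.5],
                      [0.0, -0.7,  0.0]])
        x0 = np.array([0.5, 0.2, 0.8])
        print(run_fcm(W, x0))
    ```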

  12. The Algebra of Complex Numbers.

    Science.gov (United States)

    LePage, Wilbur R.

    This programed text is an introduction to the algebra of complex numbers for engineering students, particularly because of its relevance to important applied problems in electrical engineering. It is designed for a person who is well experienced with the algebra of real numbers and calculus, but who has no experience with complex number…
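
    A typical electrical-engineering use of complex algebra, chosen here as a generic illustration rather than taken from the programed text, is computing the impedance of a series RC branch and the resulting phase shift:

    ```python
    # Complex arithmetic example: impedance of a series RC branch at 50 Hz.
    import cmath
    import math

    R = 100.0          # resistance in ohms
    C = 10e-6          # capacitance in farads
    f = 50.0           # frequency in hertz
    omega = 2 * math.pi * f

    Z = R + 1 / (1j * omega * C)       # series impedance R + 1/(jwC)
    magnitude, phase = cmath.polar(Z)  # phase in radians

    print(f"|Z| = {magnitude:.1f} ohm, phase = {math.degrees(phase):.1f} deg")
    ```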

  13. Reliability engineering: Old problems and new challenges

    International Nuclear Information System (INIS)

    Zio, E.

    2009-01-01

    The first recorded usage of the word reliability dates back to the 1800s, albeit referring to a person and not a technical system. Since then, the concept of reliability has become a pervasive attribute worthy of both qualitative and quantitative connotations. In particular, the revolutionary social, cultural and technological changes that have occurred from the 1800s to the 2000s have contributed to the need for a rational framework and quantitative treatment of the reliability of engineered systems and plants. This has led to the rise of reliability engineering as a scientific discipline. In this paper, some considerations are shared with respect to a number of problems and challenges which researchers and practitioners in reliability engineering face when analyzing today's complex systems. The focus is on the contribution of reliability to system safety and on its role within system risk analysis.

  14. Complexity of the positive semidefinite matrix completion problem with a rank constraint.

    NARCIS (Netherlands)

    M. Eisenberg-Nagy (Marianna); M. Laurent (Monique); A. Varvitsiotis (Antonios); K. Bezdek; A. Deza; Y. Ye

    2013-01-01

    We consider the decision problem asking whether a partial rational symmetric matrix with an all-ones diagonal can be completed to a full positive semidefinite matrix of rank at most k. We show that this problem is NP-hard for any fixed integer k ≥ 2. Equivalently, for k ≥ 2, it is

  15. Complexity of the positive semidefinite matrix completion problem with a rank constraint.

    NARCIS (Netherlands)

    M. Eisenberg-Nagy (Marianna); M. Laurent (Monique); A. Varvitsiotis (Antonios)

    2012-01-01

    We consider the decision problem asking whether a partial rational symmetric matrix with an all-ones diagonal can be completed to a full positive semidefinite matrix of rank at most k. We show that this problem is NP-hard for any fixed integer k ≥ 2. Equivalently, for k ≥ 2, it is
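
    Although finding a rank-k positive semidefinite completion is NP-hard for k ≥ 2, verifying a proposed completion is straightforward. The sketch below checks symmetry, positive semidefiniteness, and the rank bound for a candidate filled-in matrix; the example matrix and the numerical tolerance are illustrative assumptions, not data from the papers.

    ```python
    # Verify that a candidate completion is PSD with rank at most k.
    import numpy as np

    def is_psd_rank_at_most(matrix, k, tol=1e-9):
        """Check symmetry, positive semidefiniteness, and rank <= k."""
        if not np.allclose(matrix, matrix.T, atol=tol):
            return False
        eigenvalues = np.linalg.eigvalsh(matrix)   # ascending order
        if eigenvalues[0] < -tol:                  # a negative eigenvalue
            return False
        rank = int(np.sum(eigenvalues > tol))
        return rank <= k

    if __name__ == "__main__":
        # All-ones diagonal; the off-diagonal 0.5 entries stand for a
        # hypothetical completion of the unknown positions.
        A = np.array([[1.0, 0.5, 0.5],
                      [0.5, 1.0, 0.5],
                      [0.5, 0.5, 1.0]])
        print(is_psd_rank_at_most(A, k=2))   # False: this completion has rank 3
        print(is_psd_rank_at_most(A, k=3))   # True
    ```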

  16. Phase transitions in Pareto optimal complex networks.

    Science.gov (United States)

    Seoane, Luís F; Solé, Ricard

    2015-09-01

    The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as it occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need of drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization do play a determinant role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.
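
    As a generic illustration of Pareto optimization over competing network constraints (not the paper's actual model or objective functions), the sketch below filters candidate designs, each scored on two costs to be minimized, down to the non-dominated set; the design names and numbers are made up.

    ```python
    # Generic Pareto-front filter for candidate designs scored on two
    # objectives that are both to be minimized (e.g. wiring cost vs. delay).
    def pareto_front(candidates):
        """Return candidates not dominated by any other candidate."""
        front = []
        for name, objs in candidates:
            dominated = any(
                all(o2 <= o1 for o1, o2 in zip(objs, other_objs))
                and any(o2 < o1 for o1, o2 in zip(objs, other_objs))
                for other_name, other_objs in candidates
                if other_name != name
            )
            if not dominated:
                front.append((name, objs))
        return front

    if __name__ == "__main__":
        # (design, (cost, average path length)) -- illustrative numbers only.
        designs = [("star", (10.0, 1.8)), ("ring", (8.0, 2.5)),
                   ("dense", (25.0, 1.2)), ("tree", (9.0, 2.6))]
        print(pareto_front(designs))   # "tree" is dominated by "ring"
    ```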

  17. The complexity of the matching-cut problem for planar graphs and other graph classes

    NARCIS (Netherlands)

    Bonsma, P.S.

    2009-01-01

    The Matching-Cut problem is the problem of deciding whether a graph has an edge cut that is also a matching. Previously this problem was studied under the name of the Decomposable Graph Recognition problem, and it was proved to be NP-complete when restricted to graphs with maximum degree four. In this paper it
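
    As a small illustration of the object being decided (not of the paper's construction), the sketch below checks whether a given edge set of a graph is simultaneously an edge cut, meaning its removal disconnects the graph, and a matching, meaning no two chosen edges share an endpoint. The example graph and edge sets are made up.

    ```python
    # Check whether an edge set of a graph is a matching-cut:
    # (1) no two chosen edges share a vertex, (2) removing them disconnects the graph.
    from collections import deque

    def is_matching(edges):
        seen = set()
        for u, v in edges:
            if u in seen or v in seen:
                return False
            seen.update((u, v))
        return True

    def is_disconnected(vertices, edges, removed):
        remaining = [e for e in edges
                     if e not in removed and tuple(reversed(e)) not in removed]
        adj = {v: [] for v in vertices}
        for u, v in remaining:
            adj[u].append(v)
            adj[v].append(u)
        start = next(iter(vertices))
        reached, queue = {start}, deque([start])
        while queue:                      # breadth-first search from `start`
            for w in adj[queue.popleft()]:
                if w not in reached:
                    reached.add(w)
                    queue.append(w)
        return len(reached) < len(vertices)

    def is_matching_cut(vertices, edges, candidate):
        return is_matching(candidate) and is_disconnected(vertices, edges, set(candidate))

    if __name__ == "__main__":
        V = {1, 2, 3, 4}
        E = [(1, 2), (2, 3), (3, 4), (4, 1)]            # a 4-cycle
        print(is_matching_cut(V, E, [(1, 2), (3, 4)]))  # True: opposite edges
        print(is_matching_cut(V, E, [(1, 2), (2, 3)]))  # False: edges share vertex 2
    ```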

  18. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  19. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
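
    The article demonstrates its procedures in R with real datasets; the sketch below is only a rough Python analogue of the split/analyze/meta-analyze idea, estimating a correlation in each split of a large dataset and pooling the estimates with a fixed-effect (inverse-variance) meta-analysis of Fisher-z values. The simulated data, number of splits, and effect size are assumptions for illustration.

    ```python
    # Split/analyze/meta-analyze sketch: split a big dataset, estimate a
    # correlation in each split, then pool with a fixed-effect meta-analysis.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated "big" dataset with a true correlation of about 0.3.
    n_total, n_splits = 100_000, 20
    x = rng.normal(size=n_total)
    y = 0.3 * x + rng.normal(scale=np.sqrt(1 - 0.3**2), size=n_total)

    z_values, weights = [], []
    for x_chunk, y_chunk in zip(np.array_split(x, n_splits), np.array_split(y, n_splits)):
        r = np.corrcoef(x_chunk, y_chunk)[0, 1]   # analyze: per-split correlation
        z_values.append(np.arctanh(r))            # Fisher-z transform
        weights.append(len(x_chunk) - 3)          # inverse of var(z) = 1/(n - 3)

    # Meta-analyze: fixed-effect pooled estimate and its standard error.
    z_values, weights = np.array(z_values), np.array(weights)
    z_pooled = np.sum(weights * z_values) / np.sum(weights)
    se_pooled = 1.0 / np.sqrt(np.sum(weights))
    print(f"pooled r = {np.tanh(z_pooled):.3f} (z SE = {se_pooled:.4f})")
    ```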

  20. Wicked Problems in Large Organizations: Why Pilot Retention Continues to Challenge the Air Force

    Science.gov (United States)

    2017-05-25

    This study frames Air Force pilot retention as a wicked problem in a large organization. Complexity makes solving such problems even more challenging, and this idea of complexity extends to another theoretical concept, the complex adaptive system. The study also draws on ideas for mitigating potential groupthink in order to avoid the pitfalls and dangers of group problem solving.