WorldWideScience

Sample records for complex computer systems

  1. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and

  2. Unified Computational Intelligence for Complex Systems

    CERN Document Server

    Seiffertt, John

    2010-01-01

    Computational intelligence encompasses a wide variety of techniques that allow computation to learn, to adapt, and to seek. That is, they may be designed to learn information without explicit programming regarding the nature of the content to be retained, they may be imbued with the functionality to adapt to maintain their course within a complex and unpredictably changing environment, and they may help us seek out truths about our own dynamics and lives through their inclusion in complex system modeling. These capabilities place our ability to compute in a category apart from our ability to e

  3. Metasynthetic computing and engineering of complex systems

    CERN Document Server

    Cao, Longbing

    2015-01-01

    Provides a comprehensive overview and introduction to the concepts, methodologies, analysis, design and applications of metasynthetic computing and engineering. The author: Presents an overview of complex systems, especially open complex giant systems such as the Internet, complex behavioural and social problems, and actionable knowledge discovery and delivery in the big data era. Discusses ubiquitous intelligence in complex systems, including human intelligence, domain intelligence, social intelligence, network intelligence, data intelligence and machine intelligence, and their synergy thro

  4. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunity to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameters within a bounded environment, allowing for controlled experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window on the novel endeavours of the research communities, highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  5. Automated System for Teaching Computational Complexity of Algorithms Course

    Directory of Open Access Journals (Sweden)

    Vadim S. Roublev

    2017-01-01

    This article describes the problems of designing an automated teaching system for the “Computational complexity of algorithms” course. This system should provide students with means to familiarize themselves with a complex mathematical apparatus and improve their mathematical thinking in the respective area. The article introduces the technique of an algorithm symbol scroll table that allows estimating lower and upper bounds of computational complexity. Further, we introduce a set of theorems that facilitate the analysis in cases when the integer rounding of algorithm parameters is involved and when analyzing the complexity of a sum. At the end, the article introduces a normal system of symbol transformations that both allows one to perform arbitrary symbol transformations and simplifies their automated validation. The article is published in the authors’ wording.
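
    A worked example (ours, not the article's) of the two estimates the abstract mentions, bounding the complexity of a sum and handling integer rounding:

      \[
      T(n) \;=\; \sum_{i=1}^{n} \Theta(i) \;=\; \Theta\!\left(\frac{n(n+1)}{2}\right) \;=\; \Theta(n^2),
      \qquad
      \frac{n}{2} \;\le\; \left\lceil \frac{n}{2} \right\rceil \;\le\; \frac{n+1}{2}.
      \]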

  6. VBOT: Motivating computational and complex systems fluencies with constructionist virtual/physical robotics

    Science.gov (United States)

    Berland, Matthew W.

    As scientists use the tools of computational and complex systems theory to broaden science perspectives (e.g., Bar-Yam, 1997; Holland, 1995; Wolfram, 2002), so can middle-school students broaden their perspectives using appropriate tools. The goals of this dissertation project are to build, study, evaluate, and compare activities designed to foster both computational and complex systems fluencies through collaborative constructionist virtual and physical robotics. In these activities, each student builds an agent (e.g., a robot-bird) that must interact with fellow students' agents to generate a complex aggregate (e.g., a flock of robot-birds) in a participatory simulation environment (Wilensky & Stroup, 1999a). In a participatory simulation, students collaborate by acting in a common space, teaching each other, and discussing content with one another. As a result, the students improve both their computational fluency and their complex systems fluency, where fluency is defined as the ability to both consume and produce relevant content (DiSessa, 2000). To date, several systems have been designed to foster computational and complex systems fluencies through computer programming and collaborative play (e.g., Hancock, 2003; Wilensky & Stroup, 1999b); this study suggests that, by supporting the relevant fluencies through collaborative play, they become mutually reinforcing. In this work, I will present both the design of the VBOT virtual/physical constructionist robotics learning environment and a comparative study of student interaction with the virtual and physical environments across four middle-school classrooms, focusing on the contrast in systems perspectives differently afforded by the two environments. In particular, I found that while performance gains were similar overall, the physical environment supported agent perspectives on aggregate behavior, and the virtual environment supported aggregate perspectives on agent behavior. The primary research questions

  7. Computer modeling of properties of complex molecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Kulkova, E.Yu. [Moscow State University of Technology “STANKIN”, Vadkovsky per., 1, Moscow 101472 (Russian Federation); Khrenova, M.G.; Polyakov, I.V. [Lomonosov Moscow State University, Chemistry Department, Leninskie Gory 1/3, Moscow 119991 (Russian Federation); Nemukhin, A.V. [Lomonosov Moscow State University, Chemistry Department, Leninskie Gory 1/3, Moscow 119991 (Russian Federation); N.M. Emanuel Institute of Biochemical Physics, Russian Academy of Sciences, Kosygina 4, Moscow 119334 (Russian Federation)

    2015-03-10

    Large molecular aggregates present important examples of strongly nonhomogeneous systems. We apply combined quantum mechanics / molecular mechanics approaches that treat one part of the system with quantum-based methods and the rest of the system with conventional force fields. Herein we illustrate these computational approaches with two different examples: (1) large-scale molecular systems mimicking natural photosynthetic centers, and (2) components of prospective solar cells containing titanium dioxide and organic dye molecules. We demonstrate that modern computational tools are capable of predicting structures and spectra of such complex molecular aggregates.
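
    For context, the combined treatment described above is conventionally written as an additive energy partition (a generic textbook form, not a formula quoted from this paper), where the coupling term carries the electrostatic and van der Waals interactions between the two regions:

      \[
      E_{\text{total}} \;=\; E_{\text{QM}}(\text{inner region}) \;+\; E_{\text{MM}}(\text{outer region}) \;+\; E_{\text{QM-MM}}(\text{coupling}).
      \]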

  8. Complex systems: relationships between control, communications and computing

    CERN Document Server

    2016-01-01

    This book gives a wide-ranging description of the many facets of complex dynamic networks and systems within an infrastructure provided by integrated control and supervision: envisioning, design, experimental exploration, and implementation. The theoretical contributions and the case studies presented can reach control goals beyond those of stabilization and output regulation or even of adaptive control. Reporting on work of the Control of Complex Systems (COSY) research program, Complex Systems follows from and expands upon an earlier collection: Control of Complex Systems by introducing novel theoretical techniques for hard-to-control networks and systems. The major common feature of all the superficially diverse contributions encompassed by this book is that of spotting and exploiting possible areas of mutual reinforcement between control, computing and communications. These help readers to achieve not only robust stable plant system operation but also properties such as collective adaptivity, integrity an...

  9. The Intelligent Safety System: could it introduce complex computing into CANDU shutdown systems

    International Nuclear Information System (INIS)

    Hall, J.A.; Hinds, H.W.; Pensom, C.F.; Barker, C.J.; Jobse, A.H.

    1984-07-01

    The Intelligent Safety System is a computerized shutdown system being developed at the Chalk River Nuclear Laboratories (CRNL) for future CANDU nuclear reactors. It differs from current CANDU shutdown systems in both the algorithm used and the size and complexity of computers required to implement the concept. This paper provides an overview of the project, with emphasis on the computing aspects. Early in the project several needs leading to an introduction of computing complexity were identified, and a computing system that met these needs was conceived. The current work at CRNL centers on building a laboratory demonstration of the Intelligent Safety System, and evaluating the reliability and testability of the concept. Some fundamental problems must still be addressed for the Intelligent Safety System to be acceptable to a CANDU owner and to the regulatory authorities. These are also discussed along with a description of how the Intelligent Safety System might solve these problems.

  10. Atomic switch networks-nanoarchitectonic design of a complex system for natural computing.

    Science.gov (United States)

    Demis, E C; Aguilera, R; Sillin, H O; Scharnhorst, K; Sandouk, E J; Aono, M; Stieg, A Z; Gimzewski, J K

    2015-05-22

    Self-organized complex systems are ubiquitous in nature, and the structural complexity of these natural systems can be used as a model to design new classes of functional nanotechnology based on highly interconnected networks of interacting units. Conventional fabrication methods for electronic computing devices are subject to known scaling limits, confining the diversity of possible architectures. This work explores methods of fabricating a self-organized complex device known as an atomic switch network and discusses its potential utility in computing. Through a merger of top-down and bottom-up techniques guided by mathematical and nanoarchitectonic design principles, we have produced functional devices comprising nanoscale elements whose intrinsic nonlinear dynamics and memorization capabilities produce robust patterns of distributed activity and a capacity for nonlinear transformation of input signals when configured in the appropriate network architecture. Their operational characteristics represent a unique potential for hardware implementation of natural computation, specifically in the area of reservoir computing-a burgeoning field that investigates the computational aptitude of complex biologically inspired systems.
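
    The reservoir computing paradigm invoked here can be sketched in software with an echo state network; the toy below (ours, with arbitrary parameters and an invented target signal) illustrates the principle that only a linear readout on a fixed nonlinear dynamical system is trained. It is not a model of the atomic switch network hardware.

      # Minimal echo state network: fixed random reservoir, trained readout.
      import numpy as np

      rng = np.random.default_rng(0)
      n_res = 200
      W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
      W = rng.normal(0.0, 1.0, (n_res, n_res))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

      def run_reservoir(u):
          """Drive the reservoir with input sequence u; collect states."""
          x, states = np.zeros(n_res), []
          for u_t in u:
              x = np.tanh(W @ x + W_in[:, 0] * u_t)
              states.append(x.copy())
          return np.array(states)

      T = 1000
      u = rng.uniform(0.0, 0.5, T)
      y = np.sin(np.pi * u) * np.roll(u, 1)   # invented nonlinear target

      X, Y = run_reservoir(u)[100:], y[100:]  # discard initial transient
      W_out = np.linalg.lstsq(X, Y, rcond=None)[0]  # train linear readout
      print("train MSE:", np.mean((X @ W_out - Y) ** 2))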

  11. Atomic switch networks—nanoarchitectonic design of a complex system for natural computing

    International Nuclear Information System (INIS)

    Demis, E C; Aguilera, R; Sillin, H O; Scharnhorst, K; Sandouk, E J; Gimzewski, J K; Aono, M; Stieg, A Z

    2015-01-01

    Self-organized complex systems are ubiquitous in nature, and the structural complexity of these natural systems can be used as a model to design new classes of functional nanotechnology based on highly interconnected networks of interacting units. Conventional fabrication methods for electronic computing devices are subject to known scaling limits, confining the diversity of possible architectures. This work explores methods of fabricating a self-organized complex device known as an atomic switch network and discusses its potential utility in computing. Through a merger of top-down and bottom-up techniques guided by mathematical and nanoarchitectonic design principles, we have produced functional devices comprising nanoscale elements whose intrinsic nonlinear dynamics and memorization capabilities produce robust patterns of distributed activity and a capacity for nonlinear transformation of input signals when configured in the appropriate network architecture. Their operational characteristics represent a unique potential for hardware implementation of natural computation, specifically in the area of reservoir computing—a burgeoning field that investigates the computational aptitude of complex biologically inspired systems. (paper)

  12. Comparing Virtual and Physical Robotics Environments for Supporting Complex Systems and Computational Thinking

    Science.gov (United States)

    Berland, Matthew; Wilensky, Uri

    2015-01-01

    Both complex systems methods (such as agent-based modeling) and computational methods (such as programming) provide powerful ways for students to understand new phenomena. To understand how to effectively teach complex systems and computational content to younger students, we conducted a study in four urban middle school classrooms comparing…

  13. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    Science.gov (United States)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  14. Nature, computation and complexity

    International Nuclear Information System (INIS)

    Binder, P-M; Ellis, G F R

    2016-01-01

    The issue of whether the unfolding of events in the world can be considered a computation is explored in this paper. We come to different conclusions for inert and for living systems (‘no’ and ‘qualified yes’, respectively). We suggest that physical computation as we know it exists only as a tool of complex biological systems: us. (paper)

  15. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  16. Stochastic equations for complex systems theoretical and computational topics

    CERN Document Server

    Bessaih, Hakima

    2015-01-01

    Mathematical analyses and computational predictions of the behavior of complex systems are needed to effectively deal with weather and climate predictions, for example, and the optimal design of technical processes. Given the random nature of such systems and the recognized relevance of randomness, the equations used to describe such systems usually need to involve stochastics.  The basic goal of this book is to introduce the mathematics and application of stochastic equations used for the modeling of complex systems. A first focus is on the introduction to different topics in mathematical analysis. A second focus is on the application of mathematical tools to the analysis of stochastic equations. A third focus is on the development and application of stochastic methods to simulate turbulent flows as seen in reality.  This book is primarily oriented towards mathematics and engineering PhD students, young and experienced researchers, and professionals working in the area of stochastic differential equations ...

  17. Computer Simulations and Theoretical Studies of Complex Systems: from complex fluids to frustrated magnets

    Science.gov (United States)

    Choi, Eunsong

    Computer simulations are an integral part of research in modern condensed matter physics; they serve as a direct bridge between theory and experiment by systematically applying a microscopic model to a collection of particles that effectively imitate a macroscopic system. In this thesis, we study two very different condensed matter systems, namely complex fluids and frustrated magnets, primarily by simulating the classical dynamics of each system. In the first part of the thesis, we focus on ionic liquids (ILs) and polymers--the two complementary classes of materials that can be combined to provide various unique properties. The properties of polymer/IL systems, such as conductivity, viscosity, and miscibility, can be fine-tuned by choosing an appropriate combination of cations, anions, and polymers. However, designing a system that meets a specific need requires a concrete understanding of the physics and chemistry that dictate the complex interplay between polymers and ionic liquids. In this regard, molecular dynamics (MD) simulation is an efficient tool that provides a molecular-level picture of such complex systems. We study the behavior of poly(ethylene oxide) (PEO) and imidazolium-based ionic liquids, using MD simulations and statistical mechanics. We also discuss our efforts to develop reliable and efficient classical force fields for PEO and the ionic liquids. The second part is devoted to studies of geometrically frustrated magnets. In particular, a microscopic model which gives rise to an incommensurate spiral magnetic ordering observed in a pyrochlore antiferromagnet is investigated. The validation of the model is made via a comparison of the spin-wave spectra with the neutron scattering data. Since the standard Holstein-Primakoff method is difficult to employ in such a complex ground-state structure with a large unit cell, we carry out classical spin dynamics simulations to compute spin-wave spectra directly from the Fourier transform of spin trajectories. We

  18. EXAFS Phase Retrieval Solution Tracking for Complex Multi-Component System: Synthesized Topological Inverse Computation

    International Nuclear Information System (INIS)

    Lee, Jay Min; Yang, Dong-Seok; Bunker, Grant B

    2013-01-01

    Using the FEFF kernel A(k,r), we describe the inverse computation from χ(k)-data to g(r)-solution in terms of a singularity regularization method based on a complete Bayesian statistics process. In this work, we topologically decompose the system-matched invariant projection operators into two distinct types, (A^+AA^+A) and (AA^+AA^+), and achieve Synthesized Topological Inversion Computation (STIC) by employing a 12-operator closed-loop emulator of the symplectic transformation. This leads to a numerically self-consistent solution as the optimal near-singular regularization parameters are sought, dramatically suppressing instability problems connected with finite-precision arithmetic in ill-posed systems. By statistically correlating a pair of measured data, it was feasible to compute an optimal EXAFS phase retrieval solution expressed in terms of the complex-valued χ(k), and this approach was successfully used to determine the optimal g(r) for a complex multi-component system.
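
    The regularization idea underlying this approach can be illustrated generically: treat χ = A g as an ill-conditioned linear system and stabilize the inversion with a Tikhonov penalty. Everything below (grids, kernel, noise level, lambda) is synthetic, and the authors' STIC/Bayesian machinery is considerably more elaborate; this sketch shows only the baseline idea.

      # Generic Tikhonov-regularized inversion of chi(k) = A g(r).
      # Synthetic kernel and data; NOT the authors' STIC algorithm.
      import numpy as np

      rng = np.random.default_rng(1)
      k = np.linspace(1.0, 10.0, 80)    # k-grid (arbitrary units)
      r = np.linspace(1.0, 5.0, 60)     # r-grid
      # EXAFS-like oscillatory kernel, strongly ill-conditioned:
      A = np.sin(2.0 * k[:, None] * r[None, :]) / (k[:, None] * r[None, :] ** 2)

      g_true = np.exp(-((r - 2.5) ** 2) / 0.02)          # synthetic g(r)
      chi = A @ g_true + 1e-3 * rng.normal(size=k.size)  # noisy chi(k)

      lam = 1e-2  # regularization weight (would be tuned in practice)
      g_est = np.linalg.solve(A.T @ A + lam ** 2 * np.eye(r.size), A.T @ chi)
      print("relative error:",
            np.linalg.norm(g_est - g_true) / np.linalg.norm(g_true))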

  19. A computational approach to achieve situational awareness from limited observations of a complex system

    Science.gov (United States)

    Sherwin, Jason

    human activities. Nevertheless, since it is not constrained by computational details, the study of situational awareness provides a unique opportunity to approach complex tasks of operation from an analytical perspective. In other words, with SA, we get to see how humans observe, recognize and react to complex systems on which they exert some control. Reconciling this perspective on complexity with complex systems research, it might be possible to further our understanding of complex phenomena if we can probe the anatomical mechanisms by which we, as humans, do it naturally. At this unique intersection of two disciplines, a hybrid approach is needed. So in this work, we propose just such an approach. In particular, this research proposes a computational approach to the situational awareness (SA) of complex systems. Here we propose to implement certain aspects of situational awareness via a biologically-inspired machine-learning technique called Hierarchical Temporal Memory (HTM). In doing so, we will use either simulated or actual data to create and to test computational implementations of situational awareness. This will be tested in two example contexts, one being more complex than the other. The ultimate goal of this research is to demonstrate a possible approach to analyzing and understanding complex systems. By using HTM and carefully developing techniques to analyze the SA formed from data, it is believed that this goal can be obtained.

  20. Computations, Complexity, Experiments, and the World Outside Physics

    International Nuclear Information System (INIS)

    Kadanoff, L.P.

    2009-01-01

    Computer Models in the Sciences and Social Sciences. 1. Simulation and Prediction in Complex Systems: the Good, the Bad and the Awful. This lecture deals with the history of large-scale computer modeling, mostly in the context of the U.S. Department of Energy's sponsorship of modeling for weapons development and innovation in energy sources. 2. Complexity: Making a Splash-Breaking a Neck - The Making of Complexity in Physical Systems. For ages thinkers have been asking how complexity arises. The laws of physics are very simple. How come we are so complex? This lecture tries to approach this question by asking how complexity arises in physical fluids. 3. Forrester et al.: Social and Biological Model-Making. The partial collapse of the world's economy has raised the question of whether we could improve the performance of economic and social systems by a major effort on creating understanding via large-scale computer models. (author)

  1. The information exchange between modules in the system of modular programming of computational complexes

    International Nuclear Information System (INIS)

    Zinin, A.I.; Kolesov, V.E.; Nevinitsa, A.I.

    1975-01-01

    The report describes a method of constructing complexes of computational programs for M-220 computers programmed in the ALGOL-60 code. The complex is organized on the modular principle and can include a substantial number of module programs. Information exchange between separate modules is carried out by means of a special interpreting program, and the unit of exchanged information is a specially arranged file of data. To address the interpreting program within the ALGOL-60 framework, a small number of specially created procedures is used. The proposed method makes it possible to program separate modules of the complex independently and to expand the complex if necessary. Depending on how the general problem solved by the complex is segmented, separate modules or groups of modules can be of independent interest and can be used outside the complex as conventional programs. (author)

  2. Using calculational simulating complexes when developing computer process control systems for NPPs

    International Nuclear Information System (INIS)

    Zimakov, V.N.; Chernykh, V.P.

    1998-01-01

    The problems of creating calculational simulating complexes (CSC) and of applying them in the development of software and software-hardware means for computer-aided process control systems at NPPs are considered. The above complex is based on an all-mode real-time mathematical model running on a special complex of computerized means.

  3. Automation of multi-agent control for complex dynamic systems in heterogeneous computational network

    Science.gov (United States)

    Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan

    2017-01-01

    The rapid progress of high-performance computing entails new challenges related to solving large scientific problems in various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue. Control systems that operate in networks are especially affected by this issue. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. Advantages of the proposed approach are demonstrated on an example: the parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automated multi-agent control of the systems in parallel mode at various degrees of detail.

  4. Computational error and complexity in science and engineering

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K

    2005-01-01

    The book "Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Just mathematical quantities symbols are of no use to engineers/technologists. Computational complexity of the numerical method to solve the mathematical model, also computed along with the solution, on the other hand, will tell us how much computation/computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientific convincing answer to these questions. Specifically some of the disciplines in which the book w...

  5. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  6. Measuring Complexity of SAP Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-10-01

    The paper discusses the reasons for the rise of complexity in the ERP system SAP R/3. It proposes a method for measuring the complexity of SAP. Based on this method, a computer program in ABAP for measuring the complexity of a particular SAP implementation is proposed as a tool for keeping ERP complexity under control. The main principle of the measurement method is counting the number of items or relations in the system. The proposed computer program is based on counting records in SAP organization tables.
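
    The measurement principle, counting items and relations, is language-neutral; the sketch below restates it in Python with hypothetical table contents (the actual program is written in ABAP against real SAP organization tables).

      # Complexity as a count of items plus relations.
      # Table names and records are hypothetical, for illustration only.
      org_tables = {
          "company_codes": ["1000", "2000"],
          "plants": ["P100", "P200", "P300"],
          "sales_orgs": ["S100"],
      }
      # Relations link records across organizational units.
      relations = [("1000", "P100"), ("1000", "P200"),
                   ("2000", "P300"), ("1000", "S100")]

      items = sum(len(records) for records in org_tables.values())
      complexity = items + len(relations)
      print(f"items={items}, relations={len(relations)}, complexity={complexity}")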

  7. Stop: a fast procedure for the exact computation of the performance of complex probabilistic systems

    International Nuclear Information System (INIS)

    Corynen, G.C.

    1982-01-01

    A new set-theoretic method for the exact and efficient computation of the probabilistic performance of complex systems has been developed. The core of the method is a fast algorithm for disjointing a collection of product sets which is intended for systems with more than 1000 components and 100,000 cut sets. The method is based on a divide-and-conquer approach, in which a multidimensional problem is progressively decomposed into lower-dimensional subproblems along its dimensions. The method also uses a particular pointer system that eliminates the need to store the subproblems by only requiring the storage of pointers to those problems. Examples of the algorithm and the divide-and-conquer strategy are provided, and comparisons with other significant methods are made. Statistical complexity studies show that the expected time and space complexity of other methods is O(me^n), but that our method is O(nm^3 log(m)). Problems which would require days of Cray-1 computer time with present methods can now be solved in seconds. Large-scale systems that can only be approximated with other techniques can now also be evaluated exactly.
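
    For orientation, the quantity being computed can be defined by a brute-force inclusion-exclusion sketch over minimal cut sets (assuming independent component failures). This version is exponential in the number of cut sets, which is exactly the blow-up the disjointing algorithm avoids; the components and probabilities below are invented.

      # P(system fails) = P(union of cut-set events), by inclusion-exclusion.
      # Exponential in the number of cut sets; for definition only.
      from itertools import combinations
      from math import prod

      p = {1: 0.01, 2: 0.02, 3: 0.05, 4: 0.01}   # component failure probs
      cut_sets = [{1, 2}, {2, 3}, {3, 4}]        # minimal cut sets

      total = 0.0
      for size in range(1, len(cut_sets) + 1):
          for combo in combinations(cut_sets, size):
              union = set().union(*combo)        # components that must all fail
              total += (-1) ** (size + 1) * prod(p[c] for c in union)
      print(f"system failure probability: {total:.6e}")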

  8. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  9. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  10. Environmental Factors Affecting Computer Assisted Language Learning Success: A Complex Dynamic Systems Conceptual Model

    Science.gov (United States)

    Marek, Michael W.; Wu, Wen-Chi Vivian

    2014-01-01

    This conceptual, interdisciplinary inquiry explores Complex Dynamic Systems as the concept relates to the internal and external environmental factors affecting computer assisted language learning (CALL). Based on the results obtained by de Rosnay ["World Futures: The Journal of General Evolution", 67(4/5), 304-315 (2011)], who observed…

  11. Automated validation of a computer operating system

    Science.gov (United States)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to a complex computer operating system and measure the performance of that system under such loads. The technique lends itself to the checkout of computer software designed to monitor automated complex industrial systems.

  12. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  13. Computational complexity in entanglement transformations

    Science.gov (United States)

    Chitambar, Eric A.

    In physics, systems having three parts are typically much more difficult to analyze than those having just two. Even in classical mechanics, predicting the motion of three interacting celestial bodies remains an insurmountable challenge while the analogous two-body problem has an elementary solution. It is as if just by adding a third party, a fundamental change occurs in the structure of the problem that renders it unsolvable. In this thesis, we demonstrate how such an effect is likewise present in the theory of quantum entanglement. In fact, the complexity differences between two-party and three-party entanglement become quite conspicuous when comparing the difficulty in deciding what state changes are possible for these systems when no additional entanglement is consumed in the transformation process. We examine this entanglement transformation question and its variants in the language of computational complexity theory, a powerful subject that formalizes the concept of problem difficulty. Since deciding feasibility of a specified bipartite transformation is relatively easy, this task belongs to the complexity class P. On the other hand, for tripartite systems, we find the problem to be NP-Hard, meaning that its solution is at least as hard as the solution to some of the most difficult problems humans have encountered. One can then rigorously defend the assertion that a fundamental complexity difference exists between bipartite and tripartite entanglement since unlike the former, the full range of forms realizable by the latter is incalculable (assuming P≠NP). However, similar to the three-body celestial problem, when one examines a special subclass of the problem (invertible transformations on systems having at least one qubit subsystem) we prove that the problem can be solved efficiently. As a hybrid of the two questions, we find that the question of tripartite to bipartite transformations can be solved by an efficient randomized algorithm. Our results are

  14. Third International Conference on Complex Systems

    CERN Document Server

    Minai, Ali A; Unifying Themes in Complex Systems

    2006-01-01

    In recent years, scientists have applied the principles of complex systems science to increasingly diverse fields. The results have been nothing short of remarkable: their novel approaches have provided answers to long-standing questions in biology, ecology, physics, engineering, computer science, economics, psychology and sociology. The Third International Conference on Complex Systems attracted over 400 researchers from around the world. The conference aimed to encourage cross-fertilization between the many disciplines represented and to deepen our understanding of the properties common to all complex systems. This volume contains over 35 papers selected from those presented at the conference on topics including: self-organization in biology, ecological systems, language, economic modeling, ecological systems, artificial life, robotics, and complexity and art. ALI MINAI is an Affiliate of the New England Complex Systems Institute and an Associate Professor in the Department of Electrical and Computer Engine...

  15. Complexity in Dynamical Systems

    Science.gov (United States)

    Moore, Cristopher David

    The study of chaos has shown us that deterministic systems can have a kind of unpredictability based on a limited knowledge of their initial conditions; after a finite time, the motion appears essentially random. This observation has inspired a general interest in the subject of unpredictability and, more generally, complexity: how can we characterize how "complex" a dynamical system is? In this thesis, we attempt to answer this question with a paradigm of complexity that comes from computer science: we extract sets of symbol sequences, or languages, from a dynamical system using standard methods of symbolic dynamics; we then ask what kinds of grammars or automata are needed to generate these languages. This places them in the Chomsky hierarchy, which in turn tells us something about how subtle and complex the dynamical system's behavior is. This gives us insight into the question of unpredictability, since these automata can also be thought of as computers attempting to predict the system. In the culmination of the thesis, we find a class of smooth, two-dimensional maps which are equivalent to the highest class in the Chomsky hierarchy, the Turing machine; they are capable of universal computation. Therefore, these systems possess a kind of unpredictability qualitatively different from the usual "chaos": even if the initial conditions are known exactly, questions about the system's long-term dynamics are undecidable. No algorithm exists to answer them. Although this kind of unpredictability has been discussed in the context of distributed, many-degree-of-freedom systems (for instance, cellular automata) we believe this is the first example of such phenomena in a smooth, finite-degree-of-freedom system.
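
    The first step of the program described above, extracting a symbol sequence from a dynamical system, looks like this in miniature (a standard logistic-map example of ours, not code from the thesis):

      # Symbolic dynamics sketch: binary itinerary of the logistic map
      # under the standard partition at x = 1/2.
      def logistic_symbols(x0, n, r=4.0):
          x, symbols = x0, []
          for _ in range(n):
              x = r * x * (1.0 - x)
              symbols.append("1" if x >= 0.5 else "0")
          return "".join(symbols)

      print(logistic_symbols(0.123, 40))
      # Which grammar generates the set of such sequences (over all x0)
      # locates the system in the Chomsky hierarchy.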

  16. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    This report presents a description of the Onboard Computer Complex (CC) that was developed during the period 1994-1998 for the Russian Segment of the ISS. The system was developed in co-operation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. The general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on software simulators and on verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.

  17. Fourth International Conference on Complex Systems

    CERN Document Server

    Minai, Ali A; Unifying Themes in Complex Systems IV

    2008-01-01

    In June of 2002, over 500 professors, students and researchers met in Boston, Massachusetts for the Fourth International Conference on Complex Systems. The attendees represented a remarkably diverse collection of fields: biology, ecology, physics, engineering, computer science, economics, psychology and sociology, The goal of the conference was to encourage cross-fertilization between the many disciplines represented and to deepen understanding of the properties common to all complex systems. This volume contains 43 papers selected from the more than 200 presented at the conference. Topics include: cellular automata, neurology, evolution, computer science, network dynamics, and urban planning. About NECSI: For over 10 years, The New England Complex Systems Institute (NECSI) has been instrumental in the development of complex systems science and its applications. NECSI conducts research, education, knowledge dissemination, and community development around the world for the promotion of the study of complex sys...

  18. On the Computational Complexity of the Languages of General Symbolic Dynamical Systems and Beta-Shifts

    DEFF Research Database (Denmark)

    Simonsen, Jakob Grue

    2009-01-01

    We consider the computational complexity of languages of symbolic dynamical systems. In particular, we study complexity hierarchies and membership of the non-uniform class P/poly. We prove: 1. For every time-constructible, non-decreasing function t(n) = ω(n), there is a symbolic dynamical system with language decidable in deterministic time O(n^2 t(n)), but not in deterministic time o(t(n)). 2. For every space-constructible, non-decreasing function s(n) = ω(n), there is a symbolic dynamical system with language decidable in deterministic space O(s(n)), but not in deterministic space o(s(n)). 3. There are symbolic dynamical systems having hard and complete languages under ≤_m^logs- and ≤_m^p-reduction for every complexity class above LOGSPACE in the backbone hierarchy (hence, P-complete, NP-complete, coNP-complete, PSPACE-complete, and EXPTIME-complete sets). 4. There are decidable languages of symbolic...

  19. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

    Sequential simulation of large complex physical systems is often regarded as a computationally expensive task. In order to speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) has been introduced since the late 70s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator

  20. The Modeling and Complexity of Dynamical Systems by Means of Computation and Information Theories

    Directory of Open Access Journals (Sweden)

    Robert Logozar

    2011-12-01

    We present the modeling of dynamical systems and the finding of their complexity indicators by the use of concepts from computation and information theories, within the framework of J. P. Crutchfield's theory of ε-machines. A short formal outline of the ε-machines is given. In this approach, dynamical systems are analyzed directly from the time series that is received from a properly adjusted measuring instrument. The binary strings are parsed through the parse tree, within which morphologically and probabilistically unique subtrees or morphs are recognized as system states. The outline and precise interrelation of the information-theoretic entropies and complexities emanating from the model are given. The paper also serves as a theoretical foundation for the future presentation of the DSA program that implements the ε-machines modeling up to the stochastic finite automata level.
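
    A minimal sketch of the parse-tree step (ours; probabilities, and hence the "probabilistically unique" part, are omitted): group histories of a binary string by the set of futures that follow them, so that histories with identical future sets play the role of morphs.

      # Group length-L histories by their sets of length-F futures.
      # Histories with identical future sets are merged into candidate
      # "morphs" (causal states); a real epsilon-machine reconstruction
      # would compare future *distributions*, not just sets.
      from collections import defaultdict

      def morphs(s, L=3, F=2):
          futures = defaultdict(set)
          for i in range(len(s) - L - F + 1):
              futures[s[i:i + L]].add(s[i + L:i + L + F])
          states = defaultdict(list)
          for history, fut in futures.items():
              states[frozenset(fut)].append(history)
          return list(states.values())

      # A period-2 source collapses to two states:
      print(morphs("01" * 20))   # [['010'], ['101']]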

  1. Computational complexity of Boolean functions

    Energy Technology Data Exchange (ETDEWEB)

    Korshunov, Aleksei D [Sobolev Institute of Mathematics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk (Russian Federation)

    2012-02-28

    Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.
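
    The survey's point of departure, Shannon's counting argument, can be stated in one line (a standard result, quoted here for orientation rather than from the survey's text): for circuits over a complete finite basis, almost every Boolean function of n variables has complexity close to the maximum,

      \[
      L(f) \;\sim\; \frac{2^n}{n} \qquad (n \to \infty),
      \]

    the Shannon-Lupanov asymptotics.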

  2. Semiotics of constructed complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Landauer, C.; Bellman, K.L.

    1996-12-31

    The scope of this paper is limited to software and other constructed complex systems mediated or integrated by software. Our research program studies foundational issues that we believe will help us develop a theoretically sound approach to constructing complex systems. There have really been only two theoretical approaches that have helped us understand and develop computational systems: mathematics and linguistics. We show how semiotics can also play a role, whether we think of it as part of these other theories or as subsuming one or both of them. We describe our notion of "computational semiotics", which we define to be the study of computational methods of dealing with symbols, show how such a theory might be formed, and describe what we might get from it in terms of more interesting use of symbols by computing systems. This research was supported in part by the Federal Highway Administration's Office of Advanced Research and by the Advanced Research Projects Agency's Software and Intelligent Systems Technology Office.

  3. Advances in computational complexity theory

    CERN Document Server

    Cai, Jin-Yi

    1993-01-01

    This collection of recent papers on computational complexity theory grew out of activities during a special year at DIMACS. With contributions by some of the leading experts in the field, this book is of lasting value in this fast-moving field, providing expositions not found elsewhere. Although aimed primarily at researchers in complexity theory and graduate students in mathematics or computer science, the book is accessible to anyone with an undergraduate education in mathematics or computer science. By touching on some of the major topics in complexity theory, this book sheds light on this burgeoning area of research.

  4. Quantum Cybernetics and Complex Quantum Systems Science - A Quantum Connectionist Exploration

    OpenAIRE

    Gonçalves, Carlos Pedro

    2014-01-01

    Quantum cybernetics and its connections to complex quantum systems science are addressed from the perspective of complex quantum computing systems. In this way, the notion of an autonomous quantum computing system is introduced in regard to quantum artificial intelligence, and applied to quantum artificial neural networks, considered as autonomous quantum computing systems, which leads to a quantum connectionist framework within quantum cybernetics for complex quantum computing systems. Sever...

  5. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

    Present regulations on pollutant emissions and noise, together with economic constraints, require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system and not only isolated components. However, these aspects are still not well captured by the numerical approaches, or well understood, whatever the design stage considered. The main challenge essentially stems from the computational requirements that such complex systems impose if they are to be simulated on supercomputers. This paper shows how these new challenges can be addressed by using parallel computing platforms for distinct elements of more complex systems, as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the interest of high-performance computing for solving flows in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed on industrial systems are also described, with particular attention to computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis are also presented when dealing with these complex industrial applications.

  6. Complex fluids in biological systems experiment, theory, and computation

    CERN Document Server

    2015-01-01

    This book serves as an introduction to the continuum mechanics and mathematical modeling of complex fluids in living systems. The form and function of living systems are intimately tied to the nature of surrounding fluid environments, which commonly exhibit nonlinear and history dependent responses to forces and displacements. With ever-increasing capabilities in the visualization and manipulation of biological systems, research on the fundamental phenomena, models, measurements, and analysis of complex fluids has taken a number of exciting directions. In this book, many of the world’s foremost experts explore key topics such as: Macro- and micro-rheological techniques for measuring the material properties of complex biofluids and the subtleties of data interpretation Experimental observations and rheology of complex biological materials, including mucus, cell membranes, the cytoskeleton, and blood The motility of microorganisms in complex fluids and the dynamics of active suspensions Challenges and solut...

  7. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial ... problems. Classical single-objective optimization is examined first. The authors then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help ... to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com.
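
    The canonical opening example of this line of research (a standard illustration, not reproduced from the book) is the (1+1) evolutionary algorithm on the OneMax problem, whose expected optimization time is Θ(n log n):

      # (1+1) EA on OneMax: maximize the number of one-bits.
      # Expected number of iterations is Theta(n log n).
      import random

      def one_plus_one_ea(n, seed=0):
          random.seed(seed)
          x = [random.randint(0, 1) for _ in range(n)]
          steps = 0
          while sum(x) < n:
              # Offspring: flip each bit independently with prob. 1/n.
              y = [b ^ (random.random() < 1.0 / n) for b in x]
              if sum(y) >= sum(x):   # accept if not worse
                  x = y
              steps += 1
          return steps

      print("steps to optimum:", one_plus_one_ea(100))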

  8. Improvement of computer complex and interface system for compact nuclear simulator

    International Nuclear Information System (INIS)

    Lee, D. Y.; Park, W. M.; Cha, K. H.; Jung, C. H.; Park, J. C.

    1999-01-01

    The CNS (Compact Nuclear Simulator) was developed at the end of the 1980s and has been used as a training simulator for KAERI staff for 10 years. The operator panel interface cards and the graphic interface cards were designed specifically for the CNS. As these interface cards wore out over 10 years, it became very difficult to get spare parts and to repair them. The interface cards were also damaged by overcurrents caused by lamp failures in the operator panel. To solve these problems, the project 'Improvement of Compact Nuclear Simulator' was started in 1997. This paper introduces the improvement of the computer complex and interface system.

  9. Multi-agent and complex systems

    CERN Document Server

    Ren, Fenghui; Fujita, Katsuhide; Zhang, Minjie; Ito, Takayuki

    2017-01-01

    This book provides a description of advanced multi-agent and artificial intelligence technologies for the modeling and simulation of complex systems, as well as an overview of the latest scientific efforts in this field. A complex system features a large number of interacting components, whose aggregate activities are nonlinear and self-organized. A multi-agent system is a group or society of agents which interact with others cooperatively and/or competitively in order to reach their individual or common goals. Multi-agent systems are suitable for modeling and simulation of complex systems, which is difficult to accomplish using traditional computational approaches.

  10. Collectives and the design of complex systems

    CERN Document Server

    Wolpert, David

    2004-01-01

    Increasingly powerful computers are making possible distributed systems comprised of many adaptive and self-motivated computational agents. Such systems, when distinguished by system-level performance criteria, are known as "collectives." Collectives and the Design of Complex Systems lays the foundation for a science of collectives and describes how to design them for optimal performance. An introductory survey chapter is followed by descriptions of information-processing problems that can only be solved by the joint actions of large communities of computers, each running its own complex, decentralized machine-learning algorithm. Subsequent chapters analyze the dynamics and structures of collectives, as well as address economic, model-free, and control-theory approaches to designing complex systems. The work assumes a modest understanding of basic statistics and calculus. Topics and Features: Introduces the burgeoning science of collectives and its practical applications in a single useful volume Combines ap...

  11. Transition Manifolds of Complex Metastable Systems: Theory and Data-Driven Computation of Effective Dynamics.

    Science.gov (United States)

    Bittracher, Andreas; Koltai, Péter; Klus, Stefan; Banisch, Ralf; Dellnitz, Michael; Schütte, Christof

    2018-01-01

    We consider complex dynamical systems showing metastable behavior, but no local separation of fast and slow time scales. The article raises the question of whether such systems exhibit a low-dimensional manifold supporting their effective dynamics. To answer this question, we aim at finding nonlinear coordinates, called reaction coordinates, such that the projection of the dynamics onto these coordinates preserves the dominant time scales of the dynamics. We show that, based on a specific reducibility property, the existence of good low-dimensional reaction coordinates preserving the dominant time scales is guaranteed. Based on this theoretical framework, we develop and test a novel numerical approach for computing good reaction coordinates. The proposed algorithmic approach is fully local and thus not prone to the curse of dimension with respect to the state space of the dynamics. Hence, it is a promising method for data-based model reduction of complex dynamical systems such as molecular dynamics.

  12. Energy efficient distributed computing systems

    CERN Document Server

    Lee, Young-Choon

    2012-01-01

    The energy consumption issue in distributed computing systems raises various monetary, environmental and system performance concerns. Electricity consumption in the US doubled from 2000 to 2005.  From a financial and environmental standpoint, reducing the consumption of electricity is important, yet these reforms must not lead to performance degradation of the computing systems.  These contradicting constraints create a suite of complex problems that need to be resolved in order to lead to 'greener' distributed computing systems.  This book brings together a group of outsta

  13. Exponential rise of dynamical complexity in quantum computing through projections.

    Science.gov (United States)

    Burgarth, Daniel Klaus; Facchi, Paolo; Giovannetti, Vittorio; Nakazato, Hiromichi; Pascazio, Saverio; Yuasa, Kazuya

    2014-10-10

    The ability of quantum systems to host exponentially complex dynamics has the potential to revolutionize science and technology. Therefore, much effort has been devoted to developing protocols for computation, communication and metrology, which exploit this scaling, despite formidable technical difficulties. Here we show that the mere frequent observation of a small part of a quantum system can turn its dynamics from a very simple one into an exponentially complex one, capable of universal quantum computation. After discussing examples, we go on to show that this effect is generally to be expected: almost any quantum dynamics becomes universal once 'observed' as outlined above. Conversely, we show that any complex quantum dynamics can be 'purified' into a simpler one in larger dimensions. We conclude by demonstrating that even local noise can lead to an exponentially complex dynamics.

  14. A new decision sciences for complex systems

    OpenAIRE

    Lempert, Robert J.

    2002-01-01

    Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an...

  15. Computer Simulation of Complex Power System Faults under various Operating Conditions

    International Nuclear Information System (INIS)

    Khandelwal, Tanuj; Bowman, Mark

    2015-01-01

    A power system is normally treated as a balanced symmetrical three-phase network. When a fault occurs, the symmetry is normally upset, resulting in unbalanced currents and voltages appearing in the network. For the correct application of protection equipment, it is essential to know the fault current distribution throughout the system and the voltages in different parts of the system due to the fault. There may be situations where protection engineers have to analyze faults that are more complex than simple shunt faults. One type of complex fault is an open phase condition that can result from a fallen conductor or failure of a breaker pole. In the former case, the condition is often accompanied by a fault detectable with normal relaying. In the latter case, the condition may be undetected by standard line relaying. The effect on a generator is dependent on the location of the open phase and the load level. If an open phase occurs between the generator terminals and the high-voltage side of the GSU in the switchyard, and the generator is at full load, damaging negative sequence current can be generated. However, for the same operating condition, an open conductor at the incoming transmission lines located in the switchyard can result in minimal negative sequence current. In 2012, a nuclear power generating station (NPGS) suffered a series (open phase) fault due to a mechanical insulator failure in the 345 kV switchyard. This resulted in both reactor units tripping offline in two separate incidents. The series fault on one of the phases resulted in a voltage imbalance that was not detected by the degraded voltage relays. These under-voltage relays did not initiate a start signal to the emergency diesel generators (EDG) because they sensed adequate voltage on the remaining phases, exposing a design vulnerability. This paper is intended to help protection engineers calculate complex circuit faults, like the open phase condition, using a computer program. The impact of this type of
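
    The negative-sequence quantities central to this analysis come from the symmetrical-component transform, which protection engineers use to quantify unbalance. The sketch below applies the transform to an idealized open-phase condition (phase A current lost, per-unit values invented for illustration):

    ```python
    import numpy as np

    # Symmetrical-component transform: phase phasors -> (zero, positive,
    # negative) sequence components. 'a' is the 120-degree rotation operator.
    a = np.exp(2j * np.pi / 3)
    T_inv = (1 / 3) * np.array([[1, 1,    1],
                                [1, a,    a**2],
                                [1, a**2, a]])

    def sequence_components(i_a, i_b, i_c):
        """Return (I0, I1, I2) for the given phase current phasors."""
        return T_inv @ np.array([i_a, i_b, i_c])

    # Idealized open-phase condition: phase A current lost, B and C still
    # carrying their balanced 1.0 per-unit currents (values invented).
    i_b = np.exp(-2j * np.pi / 3)
    i_c = np.exp(+2j * np.pi / 3)
    I0, I1, I2 = sequence_components(0.0, i_b, i_c)
    # Substantial negative-sequence current appears despite two healthy phases.
    print(f"|I1| = {abs(I1):.3f} pu, |I2| = {abs(I2):.3f} pu")  # 0.667, 0.333
    ```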

  16. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  17. Encyclopedia of Complexity and Systems Science

    CERN Document Server

    Meyers, Robert A

    2009-01-01

    Encyclopedia of Complexity and Systems Science provides an authoritative single source for understanding and applying the concepts of complexity theory together with the tools and measures for analyzing complex systems in all fields of science and engineering. The science and tools of complexity and systems science include theories of self-organization, complex systems, synergetics, dynamical systems, turbulence, catastrophes, instabilities, nonlinearity, stochastic processes, chaos, neural networks, cellular automata, adaptive systems, and genetic algorithms. Examples of near-term problems and major unknowns that can be approached through complexity and systems science include: The structure, history and future of the universe; the biological basis of consciousness; the integration of genomics, proteomics and bioinformatics as systems biology; human longevity limits; the limits of computing; sustainability of life on earth; predictability, dynamics and extent of earthquakes, hurricanes, tsunamis, and other n...

  18. Designing Computer-Supported Complex Systems Curricula for the Next Generation Science Standards in High School Science Classrooms

    Directory of Open Access Journals (Sweden)

    Susan A. Yoon

    2016-12-01

    We present a curriculum and instruction framework for computer-supported teaching and learning about complex systems in high school science classrooms. This work responds to a need in K-12 science education research and practice for the articulation of design features for classroom instruction that can address the Next Generation Science Standards (NGSS) recently launched in the USA. We outline the features of the framework, including curricular relevance, cognitively rich pedagogies, computational tools for teaching and learning, and the development of content expertise, and provide examples of how the framework is translated into practice. We follow this up with evidence from a preliminary study conducted with 10 teachers and 361 students, aimed at understanding the extent to which students learned from the activities. Results demonstrated gains in students’ complex systems understanding and biology content knowledge. In interviews, students identified influences of various aspects of the curriculum and instruction framework on their learning.

  19. Complex adaptative systems and computational simulation in Archaeology

    Directory of Open Access Journals (Sweden)

    Salvador Pardo-Gordó

    2017-07-01

    Traditionally the concept of ‘complexity’ is used as a synonym for ‘complex society’, i.e., human groups with characteristics such as urbanism, inequalities, and hierarchy. The introduction of Nonlinear Systems and Complex Adaptive Systems to the discipline of archaeology has nuanced this concept. This theoretical turn has led to the rise of modelling as a method of analysis of historical processes. This work has a twofold objective: to present the theoretical current characterized by generative thinking in archaeology and to present a concrete application of agent-based modelling to an archaeological problem: the dispersal of the first ceramic production in the western Mediterranean.

  20. Wind power systems. Applications of computational intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lingfeng [Toledo Univ., OH (United States). Dept. of Electrical Engineering and Computer Science; Singh, Chanan [Texas A and M Univ., College Station, TX (United States). Electrical and Computer Engineering Dept.; Kusiak, Andrew (eds.) [Iowa Univ., Iowa City, IA (United States). Mechanical and Industrial Engineering Dept.

    2010-07-01

    Renewable energy sources such as wind power have attracted much attention because they are environmentally friendly, do not produce carbon dioxide and other emissions, and can enhance a nation's energy security. For example, increasingly significant amounts of wind power are being integrated into conventional power grids. Therefore, it is necessary to address various important and challenging issues related to wind power systems, which are significantly different from traditional generation systems. This book is a resource for engineers, practitioners, and decision-makers interested in studying or using the power of computational intelligence based algorithms in handling various important problems in wind power systems at the levels of power generation, transmission, and distribution. Researchers have been developing biologically-inspired algorithms in a wide variety of complex large-scale engineering domains. Distinguished from the traditional analytical methods, the new methods usually accomplish the task through their computationally efficient mechanisms. Computational intelligence methods such as evolutionary computation, neural networks, and fuzzy systems have attracted much attention in electric power systems. Meanwhile, modern electric power systems are becoming more and more complex in order to meet the demands of the growing electricity market. In particular, the grid complexity is continuously increased by the integration of intermittent wind power as well as the current restructuring efforts in the electricity industry. Quite often, the traditional analytical methods become less efficient or even unable to handle this increased complexity. As a result, it is natural to apply computational intelligence as a powerful tool to deal with various important and pressing problems in current wind power systems. This book presents the state-of-the-art development in the field of computational intelligence applied to wind power systems by reviewing the most up

  1. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  2. European Conference on Complex Systems 2012

    CERN Document Server

    Kirkilionis, Markus; Nicolis, Gregoire

    2013-01-01

    The European Conference on Complex Systems, held under the patronage of the Complex Systems Society, is an annual event that has become the leading European conference devoted to complexity science. ECCS'12, its ninth edition, took place in Brussels, during the first week of September 2012. It gathered about 650 scholars representing a wide range of topics relating to complex systems research, with emphasis on interdisciplinary approaches. More specifically, the following tracks were covered:  1. Foundations of Complex Systems 2. Complexity, Information and Computation 3. Prediction, Policy and Planning, Environment 4. Biological Complexity 5. Interacting Populations, Collective Behavior 6. Social Systems, Economics and Finance This book contains a selection of the contributions presented at the conference and its satellite meetings. Its contents reflect the extent, diversity and richness of research areas in the field, both fundamental and applied.  

  3. Computer-aided design system for a complex of problems on calculation and analysis of engineering and economical indexes of NPP power units

    International Nuclear Information System (INIS)

    Stepanov, V.I.; Koryagin, A.V.; Ruzankov, V.N.

    1988-01-01

    A computer-aided design system for a complex of problems concerning the calculation and analysis of engineering and economical indices of NPP power units is described. The system provides means for automated preparation and debugging of the database software complex, which realizes the plotted algorithm in the power unit control system. The system also provides facilities for automated preparation and registration of technical documentation

  4. Computer aided operation of complex systems

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1985-09-01

    Advanced technology is having the effect that industrial systems are becoming more highly automated and no longer rely on human intervention for the control of normally planned and/or predicted situations. Thus the importance of the operator has shifted from manual controller to systems manager and supervisory controller. At the same time, the use of advanced information technology in the control room and its potential impact on human-machine capabilities place additional demands on the designer. This report deals with work carried out to describe the plant-operator relationship in order to systematize the design and evaluation of suitable information systems in the control room. The design process starts with the control requirements of the plant and transforms them into corresponding sets of decision-making tasks with an appropriate allocation of responsibilities between computer and operator. To make this cooperation more effective, appropriate forms of information display and access are identified. The conceptual work has been supported by experimental studies on a small-scale simulator. (author)

  5. Sandpile model for relaxation in complex systems

    International Nuclear Information System (INIS)

    Vazquez, A.; Sotolongo-Costa, O.; Brouers, F.

    1997-10-01

    The relaxation in complex systems is, in general, nonexponential. After an initial rapid decay the system relaxes slowly, following a long time tail. In the present paper a sandpile model of relaxation in complex systems is analysed. Complexity is introduced by a process of avalanches in the Bethe lattice and a feedback mechanism which leads to slower decay with increasing time. In this way, some features of relaxation in complex systems (long time tail relaxation, aging, and a fractal distribution of characteristic times) are obtained by simple computer simulations. (author)
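
    The avalanche mechanism at the heart of such models is easy to prototype. The sketch below is a minimal 1D sandpile with open boundaries (not the Bethe-lattice model of the paper); its broad avalanche-size distribution is the generic ingredient behind slow, nonexponential relaxation. All parameters are illustrative.

    ```python
    import random

    def sandpile_avalanches(n=50, grains=20000, zc=2, seed=1):
        """1D sandpile: grains are added at random sites; any site holding
        more than zc grains topples, passing one grain to each neighbour
        (grains fall off the open boundaries). Returns all avalanche sizes."""
        rng = random.Random(seed)
        z = [0] * n
        sizes = []
        for _ in range(grains):
            z[rng.randrange(n)] += 1
            size = 0
            unstable = [i for i in range(n) if z[i] > zc]
            while unstable:
                i = unstable.pop()
                if z[i] <= zc:
                    continue
                z[i] -= 2
                size += 1
                for j in (i - 1, i + 1):
                    if 0 <= j < n:
                        z[j] += 1
                        if z[j] > zc:
                            unstable.append(j)
                if z[i] > zc:
                    unstable.append(i)
            sizes.append(size)
        return sizes

    sizes = sandpile_avalanches()
    # The avalanche-size distribution is broad (no single characteristic
    # scale), which is what produces the long time tails.
    print("largest avalanche:", max(sizes), "mean:", sum(sizes) / len(sizes))
    ```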

  6. A new decision sciences for complex systems.

    Science.gov (United States)

    Lempert, Robert J

    2002-05-14

    Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
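
    The core move the article describes, comparing policy options across a large ensemble of computational experiments, can be sketched as a minimax-regret search. Everything below (the toy payoff model, parameter ranges, and policy grid) is hypothetical and only illustrates the approach, not the article's climate analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical ensemble of futures: (damage sensitivity, technology cost)
    # pairs drawn from deliberately wide, uncertain ranges.
    futures = rng.uniform([0.5, 0.2], [3.0, 1.5], size=(1000, 2))

    def payoff(policy, future):
        """Toy payoff for an abatement level in [0, 1]: benefit scales with
        damage sensitivity, cost quadratically with technology cost."""
        damage, cost = future
        return damage * policy - cost * policy**2

    policies = np.linspace(0.0, 1.0, 21)
    payoffs = np.array([[payoff(p, f) for f in futures] for p in policies])

    # Regret of a policy in a future = best achievable payoff there minus
    # what the policy achieves; rank policies by their worst-case regret.
    regret = payoffs.max(axis=0) - payoffs
    robust = policies[np.argmin(regret.max(axis=1))]
    print(f"minimax-regret abatement level: {robust:.2f}")
    ```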

  7. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Several computer systems for the management of nuclear power reactors have been studied. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  8. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  9. Analysis and computer simulation for transient flow in complex system of liquid piping

    International Nuclear Information System (INIS)

    Mitry, A.M.

    1985-01-01

    This paper is concerned with unsteady-state analysis and the development of a digital computer program, FLUTRAN, that performs a simulation of transient flow behavior in a complex system of liquid piping. The program calculates pressure and flow transients in the liquid-filled piping system. The analytical model is based on the method of characteristics solution to the fluid hammer continuity and momentum equations. The equations are subject to a wide variety of boundary conditions that take into account the effects of hydraulic devices. Water column separation is treated as a boundary condition with known head. Experimental tests are presented that exhibit transients induced by pump failure and valve closure in the McGuire Nuclear Station Low Level Intake Cooling Water System. Numerical simulation is conducted to compare theory with test data. Analytical and test data are shown to be in good agreement and provide validation of the model
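
    FLUTRAN itself is not reproduced here, but the method of characteristics it is built on reduces, for a single pipe, to a pair of compatibility equations marched on a space-time grid. The sketch below is a minimal self-contained version (reservoir upstream, instantaneous valve closure downstream); all parameter values are illustrative, not from the paper.

    ```python
    import numpy as np

    # Single-pipe water hammer by the method of characteristics (MOC):
    # reservoir upstream, valve closed instantaneously downstream.
    a, L, n = 1000.0, 500.0, 50      # wave speed (m/s), pipe length (m), reaches
    g, D, f = 9.81, 0.3, 0.02        # gravity, diameter (m), friction factor
    A = np.pi * D**2 / 4             # flow area (m^2)
    dx, H_res, Q0 = L / n, 100.0, 0.1
    dt = dx / a                      # time step tied to dx by the Courant condition
    B = a / (g * A)                  # characteristic impedance
    R = f * dx / (2 * g * D * A**2)  # friction coefficient per reach

    H = np.full(n + 1, H_res)        # piezometric head (m)
    Q = np.full(n + 1, Q0)           # flow (m^3/s); valve closes at t = 0
    peak = H[-1]
    for _ in range(400):
        Cp = H[:-1] + B * Q[:-1] - R * Q[:-1] * np.abs(Q[:-1])  # C+ lines
        Cm = H[1:] - B * Q[1:] + R * Q[1:] * np.abs(Q[1:])      # C- lines
        H[1:-1] = 0.5 * (Cp[:-1] + Cm[1:])                      # interior nodes
        Q[1:-1] = (Cp[:-1] - Cm[1:]) / (2 * B)
        H[0], Q[0] = H_res, (H_res - Cm[0]) / B                 # reservoir
        Q[-1], H[-1] = 0.0, Cp[-1]                              # closed valve
        peak = max(peak, H[-1])
    # Peak should sit near the Joukowsky estimate H_res + B*Q0 (~244 m here).
    print(f"peak head at valve: {peak:.1f} m")
    ```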

  10. Computer systems and nuclear industry

    International Nuclear Information System (INIS)

    Nkaoua, Th.; Poizat, F.; Augueres, M.J.

    1999-01-01

    This article deals with computer systems in the nuclear industry. In most nuclear facilities it is necessary to handle a great deal of data and actions in order to help the plant operator run the plant, control physical processes and assure safety. The design of reactors requires reliable computer codes able to simulate neutronic, mechanical or thermo-hydraulic behaviours. Calculations and simulations play an important role in safety analysis. In each of these domains, computer systems have progressively emerged as efficient tools to challenge and master complexity. (A.C.)

  11. Reactor protection system design using micro-computers

    International Nuclear Information System (INIS)

    Fairbrother, D.B.

    1977-01-01

    Reactor Protection Systems for Nuclear Power Plants have traditionally been built using analog hardware. This hardware works quite well for single parameter trip functions; however, optimum protection against DNBR and KW/ft limits requires more complex trip functions than can easily be handled with analog hardware. For this reason, Babcock and Wilcox has introduced a Reactor Protection System, called the RPS-II, that utilizes a micro-computer to handle the more complex trip functions. This paper describes the design of the RPS-II and the operation of the micro-computer within the Reactor Protection System

  12. Reactor protection system design using micro-computers

    International Nuclear Information System (INIS)

    Fairbrother, D.B.

    1976-01-01

    Reactor protection systems for nuclear power plants have traditionally been built using analog hardware. This hardware works quite well for single parameter trip functions; however, optimum protection against DNBR and KW/ft limits requires more complex trip functions than can easily be handled with analog hardware. For this reason, Babcock and Wilcox has introduced a Reactor Protection System, called the RPS-II, that utilizes a micro-computer to handle the more complex trip functions. The paper describes the design of the RPS-II and the operation of the micro-computer within the Reactor Protection System

  13. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation

  14. Fast and accurate algorithm for the computation of complex linear canonical transforms.

    Science.gov (United States)

    Koç, Aykut; Ozaktas, Haldun M; Hesselink, Lambertus

    2010-09-01

    A fast and accurate algorithm is developed for the numerical computation of the family of complex linear canonical transforms (CLCTs), which represent the input-output relationship of complex quadratic-phase systems. Allowing the linear canonical transform parameters to be complex numbers makes it possible to represent paraxial optical systems that involve complex parameters. These include lossy systems such as Gaussian apertures, Gaussian ducts, or complex graded-index media, as well as lossless thin lenses and sections of free space and any arbitrary combinations of them. Complex-ordered fractional Fourier transforms (CFRTs) are a special case of CLCTs, and therefore a fast and accurate algorithm to compute CFRTs is included as a special case of the presented algorithm. The algorithm is based on decomposition of an arbitrary CLCT matrix into real and complex chirp multiplications and Fourier transforms. The samples of the output are obtained from the samples of the input in approximately N log N time, where N is the number of input samples. A space-bandwidth product tracking formalism is developed to ensure that the number of samples is information-theoretically sufficient to reconstruct the continuous transform, but not unnecessarily redundant.
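
    A naive O(N^2) quadrature of the transform integral is a useful correctness reference against which a fast N log N implementation like the paper's can be checked. The sketch below uses one common LCT convention (conventions differ across the literature) and an illustrative complex-parameter matrix; neither is taken from the paper.

    ```python
    import numpy as np

    def lct_direct(f, t, M):
        """Direct O(N^2) quadrature of the linear canonical transform with
        (possibly complex) parameters M = ((A, B), (C, D)), AD - BC = 1,
        using the convention
          (L_M f)(u) = (1/sqrt(1j*B)) * sum_t exp(1j*pi*(A*t^2 - 2*u*t + D*u^2)/B) * f(t) * dt
        evaluated on the same grid for u and t."""
        (A, B), (C, D) = M
        dt = t[1] - t[0]
        u = t[:, None]
        K = np.exp(1j * np.pi * (A * t[None, :]**2 - 2 * u * t[None, :] + D * u**2) / B)
        return (K @ f) * dt / np.sqrt(1j * B)

    N = 512
    t = np.linspace(-8, 8, N)
    f = np.exp(-np.pi * t**2)                 # unit Gaussian input

    # A complex B models a lossy quadratic-phase system (e.g. a Gaussian
    # duct); this matrix is an invented example, with AD - BC = 1.
    M = ((1.0, 1.0 - 0.2j), (0.0, 1.0))
    g = lct_direct(f, t, M)
    print("output energy:", ((np.abs(g)**2).sum() * (t[1] - t[0])).round(4))
    ```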

  15. Exact complexity: The spectral decomposition of intrinsic computation

    International Nuclear Information System (INIS)

    Crutchfield, James P.; Ellison, Christopher J.; Riechers, Paul M.

    2016-01-01

    We give exact formulae for a wide family of complexity measures that capture the organization of hidden nonlinear processes. The spectral decomposition of operator-valued functions leads to closed-form expressions involving the full eigenvalue spectrum of the mixed-state presentation of a process's ϵ-machine causal-state dynamic. Measures include correlation functions, power spectra, past-future mutual information, transient and synchronization informations, and many others. As a result, a direct and complete analysis of intrinsic computation is now available for the temporal organization of finitary hidden Markov models and nonlinear dynamical systems with generating partitions and for the spatial organization in one-dimensional systems, including spin systems, cellular automata, and complex materials via chaotic crystallography. - Highlights: • We provide exact, closed-form expressions for a hidden stationary process' intrinsic computation. • These include information measures such as the excess entropy, transient information, and synchronization information and the entropy-rate finite-length approximations. • The method uses an epsilon-machine's mixed-state presentation. • The spectral decomposition of the mixed-state presentation relies on the recent development of meromorphic functional calculus for nondiagonalizable operators.

  16. Complex network problems in physics, computer science and biology

    Science.gov (United States)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases and the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that some problems are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well-known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set, and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  17. Language Networks as Complex Systems

    Science.gov (United States)

    Lee, Max Kueiming; Ou, Sheue-Jen

    2008-01-01

    Starting in the late eighties, with a growing discontent with analytical methods in science and the growing power of computers, researchers began to study complex systems such as living organisms, evolution of genes, biological systems, brain neural networks, epidemics, ecology, economy, social networks, etc. In the early nineties, the research…

  18. The computational challenges of Earth-system science.

    Science.gov (United States)

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  19. SUPERCOMPUTER SIMULATION OF CRITICAL PHENOMENA IN COMPLEX SOCIAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Petrus M.A. Sloot

    2014-09-01

    The paper describes the problem of computer simulation of critical phenomena in complex social systems on petascale computing systems within the framework of the complex networks approach. A three-layer system of nested models of complex networks is proposed, including an aggregated analytical model to identify critical phenomena, a detailed model of individualized network dynamics, and a model to adjust the topological structure of a complex network. A scalable parallel algorithm covering all layers of complex network simulation is proposed. The performance of the algorithm is studied on different supercomputing systems. The issues of software and information infrastructure for complex network simulation are discussed, including the organization of distributed calculations, crawling data in social networks, and results visualization. Applications of the developed methods and technologies are considered, including simulation of criminal network disruption, fast rumor spreading in social networks, the evolution of financial networks, and epidemic spreading.
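
    One of the listed applications, fast rumor spreading in social networks, can be prototyped at toy scale with an SIR-style process on a random graph. The sketch below is only a small stand-in for the paper's petascale simulations; graph size, degree, and spreading probability are arbitrary.

    ```python
    import random

    def rumor_spread(n=2000, k=6, p_tell=0.3, seed=7):
        """SIR-style rumor spreading on a sparse random graph: each spreader
        tells every neighbour with probability p_tell, then becomes a
        stifler. Returns the number of rounds and the fraction reached."""
        rng = random.Random(seed)
        nbrs = [set() for _ in range(n)]
        edges = 0
        while edges < n * k // 2:             # random graph, average degree ~k
            a, b = rng.randrange(n), rng.randrange(n)
            if a != b and b not in nbrs[a]:
                nbrs[a].add(b); nbrs[b].add(a)
                edges += 1
        state = ["ignorant"] * n              # ignorant / spreader / stifler
        state[0] = "spreader"
        rounds = 0
        while "spreader" in state:
            for i in [j for j in range(n) if state[j] == "spreader"]:
                for j in nbrs[i]:
                    if state[j] == "ignorant" and rng.random() < p_tell:
                        state[j] = "spreader"
                state[i] = "stifler"
            rounds += 1
        return rounds, state.count("stifler") / n

    rounds, reached = rumor_spread()
    print(f"rumor died out after {rounds} rounds, reaching {reached:.0%} of nodes")
    ```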

  20. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task

  1. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    The book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real-world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented, including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  2. Annual Performance Assessment of Complex Fenestration Systems in Sunny Climates Using Advanced Computer Simulations

    Directory of Open Access Journals (Sweden)

    Chantal Basurto

    2015-12-01

    Complex Fenestration Systems (CFS) are advanced daylighting systems that are placed on the upper part of a window to improve the indoor daylight distribution within rooms. Due to their double function of daylight redirection and solar protection, they are considered a solution to mitigate the unfavorable effects of admitting direct sunlight into buildings located in predominantly sunny climates (risk of glare and overheating). Accordingly, an adequate assessment of their performance should include an annual evaluation of the main aspects relevant to the use of daylight in such regions: the indoor illuminance distribution, thermal comfort, and the visual comfort of the occupants. Such an evaluation is possible with the use of computer simulations combined with the bi-directional scattering distribution function (BSDF) data of these systems. This study explores the use of available methods to assess the visible and thermal annual performance of five different CFS using advanced computer simulations. On-site daylight monitoring was carried out in a building located in a predominantly sunny climate, and the collected data were used to create and calibrate a virtual model used to carry out the simulations. The results can be employed to select the CFS which best improves the visual and thermal interior environment for the occupants.

  3. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, "Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, "IEEE Recommended Practice for Software Requirements...

  4. Krylov Subspace Methods for Complex Non-Hermitian Linear Systems. Thesis

    Science.gov (United States)

    Freund, Roland W.

    1991-01-01

    We consider Krylov subspace methods for the solution of large sparse linear systems Ax = b with complex non-Hermitian coefficient matrices. Such linear systems arise in important applications, such as inverse scattering, numerical solution of time-dependent Schrodinger equations, underwater acoustics, eddy current computations, numerical computations in quantum chromodynamics, and numerical conformal mapping. Typically, the resulting coefficient matrices A exhibit special structures, such as complex symmetry, or they are shifted Hermitian matrices. In this paper, we first describe a Krylov subspace approach with iterates defined by a quasi-minimal residual property, the QMR method, for solving general complex non-Hermitian linear systems. Then, we study special Krylov subspace methods designed for the two families of complex symmetric respectively shifted Hermitian linear systems. We also include some results concerning the obvious approach to general complex linear systems by solving equivalent real linear systems for the real and imaginary parts of x. Finally, numerical experiments for linear systems arising from the complex Helmholtz equation are reported.
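
    For the general complex non-Hermitian case, the QMR iteration described here is available in standard libraries. The sketch below applies SciPy's qmr to a small complex symmetric model problem of the Helmholtz type mentioned in the abstract; the grid size, wavenumber, and damping are illustrative, not taken from the thesis.

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import qmr

    # 1D Helmholtz-type model problem -u'' - (k^2 + i*eps)u = f: the finite
    # difference matrix is complex symmetric (non-Hermitian), exactly the
    # structure the thesis targets.
    n, k, eps = 400, 40.0, 5.0
    h = 1.0 / (n + 1)
    main = 2.0 / h**2 - (k**2 + 1j * eps)
    A = diags([-1.0 / h**2, main, -1.0 / h**2], [-1, 0, 1],
              shape=(n, n), format="csr", dtype=complex)
    b = np.ones(n, dtype=complex)

    x, info = qmr(A, b)                      # quasi-minimal residual iteration
    res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
    print("converged" if info == 0 else f"qmr info = {info}", "residual:", res)
    ```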

  5. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  6. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    International Nuclear Information System (INIS)

    Brown, D.L.

    2009-01-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  7. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  8. On the complexity of computing two nonlinearity measures

    DEFF Research Database (Denmark)

    Find, Magnus Gausdal

    2014-01-01

    We study the computational complexity of two Boolean nonlinearity measures: the nonlinearity and the multiplicative complexity. We show that if one-way functions exist, no algorithm can compute the multiplicative complexity in time 2^(O(n)) given the truth table of length 2^n; in fact, under the same ...
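
    For contrast with the hardness result for multiplicative complexity, the first measure, nonlinearity, is computable in time polynomial in the truth-table length via the fast Walsh-Hadamard transform. A minimal sketch of that standard construction (not code from the paper):

    ```python
    import itertools

    def nonlinearity(truth_table):
        """Nonlinearity of an n-variable Boolean function from its length-2^n
        truth table (entries 0/1) via the fast Walsh-Hadamard transform:
        NL(f) = 2^(n-1) - max_a |W_f(a)| / 2, in O(n * 2^n) time."""
        w = [(-1) ** bit for bit in truth_table]
        n = len(truth_table).bit_length() - 1
        step = 1
        while step < len(w):                  # in-place WHT butterflies
            for i in range(0, len(w), 2 * step):
                for j in range(i, i + step):
                    w[j], w[j + step] = w[j] + w[j + step], w[j] - w[j + step]
            step *= 2
        return 2 ** (n - 1) - max(abs(v) for v in w) // 2

    # Example: f(x1, x2, x3) = (x1 AND x2) XOR x3, a quadratic function.
    tt = [(x1 & x2) ^ x3 for x1, x2, x3 in itertools.product([0, 1], repeat=3)]
    print(nonlinearity(tt))  # -> 2
    ```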

  9. Automated design of complex dynamic systems.

    Directory of Open Access Journals (Sweden)

    Michiel Hermans

    Several fields of study are concerned with uniting the concept of computation with that of the design of physical systems. For example, a recent trend in robotics is to design robots in such a way that they require a minimal control effort. Another example is found in the domain of photonics, where recent efforts try to benefit directly from the complex nonlinear dynamics to achieve more efficient signal processing. The underlying goal of these and similar research efforts is to internalize a large part of the necessary computations within the physical system itself by exploiting its inherent non-linear dynamics. This, however, often requires the optimization of large numbers of system parameters, related to both the system's structure as well as its material properties. In addition, many of these parameters are subject to fabrication variability or to variations through time. In this paper we apply a machine learning algorithm to optimize physical dynamic systems. We show that such algorithms, which are normally applied to abstract computational entities, can be extended to the field of differential equations and used to optimize an associated set of parameters which determine their behavior. We show that machine learning training methodologies are highly useful in designing robust systems, and we provide a set of both simple and complex examples using models of physical dynamical systems. Interestingly, the derived optimization method is intimately related to direct collocation, a method known in the field of optimal control. Our work suggests that the application domains of both machine learning and optimal control have a largely unexplored overlapping area which envelops a novel design methodology of smart and highly complex physical systems.
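
    The paper's central idea, running a machine-learning optimization loop over the parameters of a differential equation, can be illustrated in a few lines. The sketch below fits the stiffness and damping of a toy oscillator to a target trajectory using finite-difference gradient descent; the model and all numbers are invented for illustration.

    ```python
    import numpy as np

    def simulate(params, T=200, dt=0.05):
        """Euler integration of a damped oscillator x'' = -k*x - c*x',
        standing in for a physical dynamical system."""
        k, c = params
        x, v, traj = 1.0, 0.0, []
        for _ in range(T):
            x, v = x + dt * v, v + dt * (-k * x - c * v)
            traj.append(x)
        return np.array(traj)

    def loss(params, target):
        return np.mean((simulate(params) - target) ** 2)

    target = simulate((2.0, 0.3))             # trajectory of a 'hidden' system

    # Generic machine-learning loop over ODE parameters: gradient descent
    # with finite-difference gradients.
    params, lr, eps = np.array([1.0, 1.0]), 0.05, 1e-5
    for _ in range(2000):
        base = loss(params, target)
        grad = np.array([(loss(params + eps * np.eye(2)[i], target) - base) / eps
                         for i in range(2)])
        params -= lr * grad
    print("recovered (k, c):", params.round(3))   # should approach (2.0, 0.3)
    ```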

  10. Computational Complexity and Human Decision-Making.

    Science.gov (United States)

    Bossaerts, Peter; Murawski, Carsten

    2017-12-01

    The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
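
    A standard way to make the 'difficulty of finding the best option' concrete is an exhaustive-search problem such as the 0/1 knapsack task, a problem class also used in experimental work on decision-making. The sketch below, with invented item values, simply exhibits the exponential growth in search cost that CCT formalizes:

    ```python
    import itertools, random, time

    def best_subset(values, weights, capacity):
        """Exhaustive 0/1 knapsack: examine all 2^n item subsets and keep
        the best feasible value -- the exponential search CCT formalizes."""
        best = 0
        for mask in itertools.product([0, 1], repeat=len(values)):
            w = sum(wi for wi, m in zip(weights, mask) if m)
            if w <= capacity:
                best = max(best, sum(vi for vi, m in zip(values, mask) if m))
        return best

    random.seed(0)
    for n in (8, 12, 16):
        v = [random.randint(1, 50) for _ in range(n)]
        w = [random.randint(1, 50) for _ in range(n)]
        t0 = time.perf_counter()
        best_subset(v, w, sum(w) // 2)
        print(f"n={n}: {time.perf_counter() - t0:.3f} s")  # ~x16 per +4 items
    ```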

  11. Computational complexity of the landscape II-Cosmological considerations

    Science.gov (United States)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  12. Complex cellular logic computation using ribocomputing devices.

    Science.gov (United States)

    Green, Alexander A; Kim, Jongmin; Ma, Duo; Silver, Pamela A; Collins, James J; Yin, Peng

    2017-08-03

    Synthetic biology aims to develop engineering-driven approaches to the programming of cellular functions that could yield transformative technologies. Synthetic gene circuits that combine DNA, protein, and RNA components have demonstrated a range of functions such as bistability, oscillation, feedback, and logic capabilities. However, it remains challenging to scale up these circuits owing to the limited number of designable, orthogonal, high-performance parts, the empirical and often tedious composition rules, and the requirements for substantial resources for encoding and operation. Here, we report a strategy for constructing RNA-only nanodevices to evaluate complex logic in living cells. Our 'ribocomputing' systems are composed of de-novo-designed parts and operate through predictable and designable base-pairing rules, allowing the effective in silico design of computing devices with prescribed configurations and functions in complex cellular environments. These devices operate at the post-transcriptional level and use an extended RNA transcript to co-localize all circuit sensing, computation, signal transduction, and output elements in the same self-assembled molecular complex, which reduces diffusion-mediated signal losses, lowers metabolic cost, and improves circuit reliability. We demonstrate that ribocomputing devices in Escherichia coli can evaluate two-input logic with a dynamic range up to 900-fold and scale them to four-input AND, six-input OR, and a complex 12-input expression (A1 AND A2 AND NOT A1*) OR (B1 AND B2 AND NOT B2*) OR (C1 AND C2) OR (D1 AND D2) OR (E1 AND E2). Successful operation of ribocomputing devices based on programmable RNA interactions suggests that systems employing the same design principles could be implemented in other host organisms or in extracellular settings.
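
    Abstracted away from the RNA implementation, the reported 12-input expression is ordinary Boolean logic and can be checked directly; below is a quick census of its truth table (input labels follow the abstract, with the asterisked species written as A1s and B2s and acting as NOT inputs):

    ```python
    from itertools import product

    def circuit(A1, A2, A1s, B1, B2, B2s, C1, C2, D1, D2, E1, E2):
        """The 12-input expression reported for the ribocomputing devices,
        with the asterisked species (here A1s, B2s) acting as NOT inputs."""
        return ((A1 and A2 and not A1s) or
                (B1 and B2 and not B2s) or
                (C1 and C2) or (D1 and D2) or (E1 and E2))

    # Census of the truth table: how many of the 2^12 input patterns
    # switch the output on.
    on = sum(circuit(*bits) for bits in product([False, True], repeat=12))
    print(f"{on} of {2**12} input combinations activate the output")
    ```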

  13. Radwaste treatment complex. DRAWMACS planned maintenance system

    International Nuclear Information System (INIS)

    Keel, A.J.

    1992-07-01

    This document describes the operation of the Planned Maintenance System for the Radwaste Treatment Complex. The Planned Maintenance System forms part of the Decommissioning and Radwaste Management Computer System (DRAWMACS). Further detailed information about the data structure of the system is contained in Database Design for the DRAWMACS Planned Maintenance System (AEA-D and R-0285, 2nd issue, 25th February 1992). Information for other components of DRAWMACS is contained in Basic User Guide for the Radwaste Treatment Plant Computer System (AEA-D and R-0019, July 1990). (author)

  14. Intraoperative computed tomography with an integrated navigation system in stabilization surgery for complex craniovertebral junction malformation.

    Science.gov (United States)

    Yu, Xinguang; Li, Lianfeng; Wang, Peng; Yin, Yiheng; Bu, Bo; Zhou, Dingbiao

    2014-07-01

    This study was designed to report our preliminary experience with stabilization procedures for complex craniovertebral junction malformation (CVJM) using intraoperative computed tomography (iCT) with an integrated neuronavigation system (NNS). The aim was to evaluate the workflow, feasibility and clinical outcome of stabilization procedures using iCT image-guided navigation for complex CVJM. The stabilization procedures in CVJM are complex because of the area's intricate geometry and bony structures, its critical relationship to neurovascular structures, and the intricate biomechanical issues involved. A sliding gantry 40-slice computed tomography scanner was installed in a preexisting operating room. The images were transferred directly from the scanner to the NNS using an automated registration system. On the basis of the analysis of intraoperative computed tomographic images, 23 cases (11 males, 12 females) with complicated CVJM underwent navigated stabilization procedures to allow more control over screw placement. The ages of these patients were 19-52 years (mean: 33.5 y). We performed C1-C2 transarticular screw fixation in 6 patients to produce atlantoaxial arthrodesis with better reliability. Because of a high-riding transverse foramen on at least 1 side of the C2 vertebra and an anomalous vertebral artery position, 7 patients underwent C1 lateral mass and C2 pedicle screw fixation. Ten additional patients were treated with individualized occipitocervical fixation surgery because of hypoplasia of C1 or constraints due to the C2 bone structure. In total, 108 screws were inserted into 23 patients using navigational assistance. The screws comprised 20 C1 lateral mass screws, 26 C2, 14 C3 and 4 C4 pedicle screws, 32 occipital screws, and 12 C1-C2 transarticular screws. There were no vascular or neural complications except for pedicle perforations, which were detected for 2 (1.9%) of the screws and were corrected intraoperatively without any persistent nerve or vessel damage. The overall

  15. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser-resolution approaches when simulating large biological systems. High-performance parallel computers have the potential to address this computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling in the function bodies of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
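
    To give a flavor of the discrete agent-based modeling style this record describes, here is a deliberately naive Python toy (this is not the Biocellion API, which is a C++ framework built around pre-defined model routines; the grid size, agent types and random-walk rule are all invented):

```python
import random

# Toy discrete agent-based step on a 2D grid: two agent types ("A", "B")
# move randomly into empty sites ("."). The nested per-agent loop is the
# kind of work that motivates parallel frameworks at large scale.
SIZE = 20
grid = [[random.choice("AB.") for _ in range(SIZE)] for _ in range(SIZE)]

def step():
    for x in range(SIZE):
        for y in range(SIZE):
            if grid[x][y] == ".":
                continue
            dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            nx, ny = (x + dx) % SIZE, (y + dy) % SIZE
            if grid[nx][ny] == ".":   # move only into empty sites
                grid[nx][ny], grid[x][y] = grid[x][y], "."

for _ in range(50):
    step()
print("\n".join("".join(row) for row in grid))
```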

  16. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, Ispra on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code, and the probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  17. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, Ispra on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code, and the probabilistic results obtained, as well as the computation times, are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  18. Implicit computational complexity and compilers

    DEFF Research Database (Denmark)

    Rubiano, Thomas

    Complexity theory helps us predict and control the resources, usually time and space, consumed by programs. Static analysis on specific syntactic criteria allows us to categorize some programs. A common approach is to observe the behaviour of a program's data. For instance, the detection of non… … evolution and a lot of research came from this theory. Until now, these implicit complexity theories were essentially applied to more or less toy languages. This thesis applies implicit computational complexity methods to “real life” programs by manipulating intermediate representation languages…

  19. Computer control system of TRISTAN

    International Nuclear Information System (INIS)

    Kurokawa, Shin-ichi; Shinomoto, Manabu; Kurihara, Michio; Sakai, Hiroshi.

    1984-01-01

    To operate a large accelerator, an enormous number of electromagnets, power supplies, vacuum equipment, radio-frequency accelerating structures and so on must be connected and controlled harmoniously. For this purpose, a number of computers are adopted and connected with a network, constructing a large laboratory-automation computer system that integrates and controls the whole facility. As a large-scale distributed system, functions such as electromagnet control, file processing and operation control are assigned to respective computers, and total control is made feasible by the network connection. At the same time, CAMAC (computer-aided measurement and control) is adopted as the interface with the controlled equipment, to ensure flexibility and the possibility of expanding the system. Moreover, the language 'NODAL', which has network support functions, was developed so that software can easily be written without considering the composition of the complex distributed system. The accelerator in the TRISTAN project is composed of an electron linear accelerator, a 6 GeV accumulation ring and a 30 GeV main ring. The two ring accelerators must be operated synchronously as one body, and are controlled with one computer system. The hardware and software are outlined. (Kako, I.)

  20. Complex energy system management using optimization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bridgeman, Stuart; Hurdowar-Castro, Diana; Allen, Rick; Olason, Tryggvi; Welt, Francois

    2010-09-15

    Modern energy systems are often very complex with respect to the mix of generation sources, energy storage, transmission, and avenues to market. Historically, power was provided by government organizations to load centers, and pricing was set in a regulatory manner. In recent years, this process has been displaced by the independent system operator (ISO). This complexity makes the operation of these systems very difficult, since the components of the system are interdependent. Consequently, computer-based large-scale simulation and optimization methods such as decision support systems (DSS) are now being used. This paper discusses the application of a DSS to operations and planning systems.
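
    As an illustration of the optimization core such a decision support system might contain, here is a hedged sketch of least-cost generator dispatch as a linear program (assuming SciPy; all costs, capacities and the 180 MW load are made-up numbers):

```python
from scipy.optimize import linprog

# Dispatch three generators to meet a 180 MW load at minimum cost.
costs = [20.0, 35.0, 50.0]               # $/MWh, objective coefficients
capacity = [(0, 100), (0, 80), (0, 60)]  # MW bounds per generator
result = linprog(c=costs,
                 A_eq=[[1, 1, 1]], b_eq=[180],  # generation == load
                 bounds=capacity, method="highs")
print(result.x)  # [100., 80., 0.] -> cheapest units are dispatched first
```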

  1. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the Deutsche Forschungsgemeinschaft (DFG) approved the Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program. Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance. Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges. Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w…

  2. Modeling Cu²⁺-Aβ complexes from computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Alí-Torres, Jorge [Departamento de Química, Universidad Nacional de Colombia- Sede Bogotá, 111321 (Colombia); Mirats, Andrea; Maréchal, Jean-Didier; Rodríguez-Santiago, Luis; Sodupe, Mariona, E-mail: Mariona.Sodupe@uab.cat [Departament de Química, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2015-09-15

    Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox-active Cu²⁺ metal cation with Aβ has been found to interfere in amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu²⁺-Aβ complexes is thus important for a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with Cu²⁺-Aβ coordination and to build plausible Cu²⁺-Aβ models that will afterwards allow determining physicochemical properties of interest, such as their redox potential.

  3. Documentation Driven Development for Complex Real-Time Systems

    Science.gov (United States)

    2004-12-01

    This paper presents a novel approach to the development of complex real-time systems, called the documentation-driven development (DDD) approach. … DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main … stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real-time systems.

  4. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer-aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis.

  5. Introduction to the LaRC central scientific computing complex

    Science.gov (United States)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex, and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation), are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  6. Theories of computational complexity

    CERN Document Server

    Calude, C

    1988-01-01

    This volume presents four machine-independent theories of computational complexity, which have been chosen for their intrinsic importance and practical relevance. The book includes a wealth of results - classical, recent, and others which have not been published before.In developing the mathematics underlying the size, dynamic and structural complexity measures, various connections with mathematical logic, constructive topology, probability and programming theories are established. The facts are presented in detail. Extensive examples are provided, to help clarify notions and constructions. The lists of exercises and problems include routine exercises, interesting results, as well as some open problems.

  7. Increasing efficiency of job execution with resource co-allocation in distributed computer systems

    OpenAIRE

    Cankar, Matija

    2014-01-01

    The field of distributed computer systems, while not new in computer science, is still the subject of a lot of interest in both industry and academia. More powerful computers, faster and more ubiquitous networks, and complex distributed applications are accelerating the growth of distributed computing. Large numbers of computers interconnected in a single network provide additional computing power to users whenever required. Such systems are, however, expensive and complex to manage, which ca...

  8. A computer-controlled conformal radiotherapy system I: overview

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; McShan, Daniel L.; Kessler, Marc L.; Matrone, Gwynne M.; Lewis, James D.; Weaver, Tamar A.

    1995-01-01

    Purpose: Equipment developed for use with computer-controlled conformal radiotherapy (CCRT) treatment techniques, including multileaf collimators and/or computer-control systems for treatment machines, is now available. The purpose of this work is to develop a system that will allow the safe, efficient, and accurate delivery of CCRT treatments as routine clinical treatments, and permit modifications of the system so that the delivery process can be optimized. Methods and Materials: The needs and requirements for a system that can fully support modern computer-controlled treatment machines equipped with multileaf collimators and segmental or dynamic conformal therapy capabilities have been analyzed and evaluated. This analysis has been used to design and then implement a complete approach to the delivery of CCRT treatments. Results: The computer-controlled conformal radiotherapy system (CCRS) described here consists of a process for the delivery of CCRT treatments, and a complex software system that implements the treatment process. The CCRS system includes systems for plan transfer, treatment delivery planning, sequencing of the actual treatment delivery process, and graphical simulation and verification tools, as well as an electronic chart that is an integral part of the system. The CCRS system has been implemented for use with a number of different treatment machines. The system has been used clinically for more than 2 years to perform CCRT treatments for more than 200 patients. Conclusions: A comprehensive system for the implementation and delivery of computer-controlled conformal radiation therapy (CCRT) plans has been designed and implemented for routine clinical use with multisegment, computer-controlled, multileaf-collimated conformal therapy. The CCRS system has been successfully implemented to perform these complex treatments, and is considered quite important to the clinical use of modern computer-controlled treatment techniques.

  9. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on information and control system reliability. The authors propose a new complex approach to enhancing the reliability of information and control systems by applying elements of the fog-computing concept. The approach consists of a complex of optimization problems to be solved: estimation of the computational complexity that can be shifted to the edge of the network and the fog layer, distribution of computations among the data-processing elements, and distribution of computations among the sensors. The problems, as well as some simulation results and discussion, are formulated and presented within this paper.

  10. Risk assessment of computer-controlled safety systems for fusion reactors

    International Nuclear Information System (INIS)

    Fryer, M.O.; Bruske, S.Z.

    1983-01-01

    The complexity of fusion reactor systems and the need to display, analyze, and react promptly to large amounts of information during reactor operation will require a number of safety systems in fusion facilities to be computer controlled. Computer software, therefore, must be included in the reactor safety analyses. Unfortunately, the science of integrating computer software into safety analyses is in its infancy. Combined plant hardware and computer software systems are often treated by making simple assumptions about software performance. This method is not acceptable for assessing risks in complex fusion systems, and a new technique for risk assessment of combined plant hardware and computer software systems has been developed. This technique is an extension of traditional fault tree analysis and uses structured flow charts of the software in a manner analogous to wiring or piping diagrams of hardware. The software logic determines the form of much of the fault trees.
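
    The fault tree arithmetic that such an analysis builds on is simple to state; the following sketch (with invented probabilities) shows how independent basic events, including a software branch, combine through AND/OR gates into a top-event probability:

```python
# Basic fault tree gates for independent events.
def and_gate(probs):
    p = 1.0
    for q in probs:
        p *= q           # all inputs must fail
    return p

def or_gate(probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)   # survive only if no input fails
    return 1.0 - p

# Toy tree: top event = hardware failure OR (latent software fault AND trigger).
p_hw, p_sw_fault, p_trigger = 1e-3, 5e-3, 1e-2
p_top = or_gate([p_hw, and_gate([p_sw_fault, p_trigger])])
print(f"{p_top:.3e}")    # ~1.050e-03
```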

  11. 11th International Conference on Dependability and Complex Systems

    CERN Document Server

    Mazurkiewicz, Jacek; Sugier, Jarosław; Walkowiak, Tomasz; Kacprzyk, Janusz

    2016-01-01

    These proceedings present the results of the Eleventh International Conference on Dependability and Complex Systems DepCoS-RELCOMEX, which took place in the picturesque Brunów Palace in Poland from 27th June to 1st July, 2016. DepCoS-RELCOMEX is a series of international conferences organized annually by the Department of Computer Engineering of Wrocław University of Science and Technology since 2006. The roots of the series go back as far as the seventies of the previous century – the first RELCOMEX conference took place in 1977 – and now its main aim is to promote a multi-disciplinary approach to dependability problems in the theory and engineering practice of complex systems. Complex systems, nowadays most often computer-based and distributed, are built upon a variety of technical, information, software and human resources. The challenges in their design, analysis and maintenance not only originate from the involved technical and organizational structures but also from the complexity of the information proce…

  12. Complexity and Control: Towards a Rigorous Behavioral Theory of Complex Dynamical Systems

    Science.gov (United States)

    Ivancevic, Vladimir G.; Reid, Darryn J.

    We introduce our motive for writing this book on complexity and control with a popular "complexity myth," which seems to be quite widespread among chaos and complexity theory fashionistas: "Low-dimensional systems usually exhibit complex behaviours (which we know from May's studies of the Logistic map), while high-dimensional systems usually exhibit simple behaviours (which we know from synchronisation studies of the Kuramoto model)." We admit that this naive view of complex (e.g., human) systems versus simple (e.g., physical) systems might seem compelling to various technocratic managers and politicians; indeed, the idea makes for appealing sound-bites. However, it is enough to examine, both in the equations and in computer simulations, pendula of increasing degree (a single, a double, and a triple pendulum) to see that this popular myth is plain nonsense. The only thing that we can learn from it is what every tyrant already knows: by using force as a strong means of control, it is possible to effectively synchronise even hundreds of millions of people, at least for a while.

  13. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  14. Selection and implementation of a laboratory computer system.

    Science.gov (United States)

    Moritz, V A; McMaster, R; Dillon, T; Mayall, B

    1995-07-01

    The process of selecting a pathology computer system has become increasingly complex, as there is an increasing number of facilities that must be provided and stringent performance requirements under heavy computing loads from both human users and machine inputs. Furthermore, continuing advances in software and hardware technology provide more options and innovative new ways of tackling problems. Taken together, these factors pose a difficult and complex set of decisions and choices for the system analyst and designer. The selection process followed by the Microbiology Department at Heidelberg Repatriation Hospital included examination of existing systems and development of a functional specification, followed by a formal tender process. The successful tenderer, selected using predefined evaluation criteria, was a software development company that developed and supplied a system based on a distributed network using a SUN computer as the main processor. The software was written using Informix running on the UNIX operating system. This represents one of the first microbiology systems developed using a commercial relational database and a fourth-generation language. The advantages of this approach are discussed.

  15. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge, as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real-world information. To overcome these challenges, we present a probabilistic reasoning network-based framework called the complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well-defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, the framework provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  16. Distributed redundancy and robustness in complex systems

    KAUST Repository

    Randles, Martin

    2011-03-01

    The uptake and increasing prevalence of Web 2.0 applications, promoting new large-scale and complex systems such as Cloud computing and the emerging Internet of Services/Things, requires tools and techniques to analyse and model methods to ensure the robustness of these new systems. This paper reports on assessing and improving complex system resilience using distributed redundancy, termed degeneracy in biological systems, to endow large-scale complicated computer systems with the same robustness that emerges in complex biological and natural systems. However, in order to promote an evolutionary approach, through emergent self-organisation, it is necessary to specify the systems in an 'open-ended' manner where not all states of the system are prescribed at design-time. In particular, an observer system is used to select robust topologies, within system components, based on a measurement of the first non-zero eigenvalue in the Laplacian spectrum of the components' network graphs, also known as the algebraic connectivity. It is shown, through experimentation on a simulation, that increasing the average algebraic connectivity across the components in a network leads to an increase in the variety of individual components, termed distributed redundancy: the capacity for structurally distinct components to perform an identical function in a particular context. The results are applied to a specific application where active clustering of like services is used to aid load balancing in a highly distributed network. Using the described procedure is shown to improve performance and distribute redundancy. © 2010 Elsevier Inc.
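
    For concreteness, the robustness measure used by the observer system, the algebraic connectivity, is straightforward to compute; a minimal NumPy sketch with an arbitrary example graph:

```python
import numpy as np

# Algebraic connectivity: the second-smallest eigenvalue of the graph
# Laplacian (the first non-zero one for a connected graph).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # example adjacency matrix
L = np.diag(A.sum(axis=1)) - A              # graph Laplacian
eigenvalues = np.sort(np.linalg.eigvalsh(L))
print(eigenvalues[1])   # higher values -> the graph is harder to disconnect
```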

  17. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared.

  18. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were

  19. Third International Conference on Complex Systems

    CERN Document Server

    Minai, Ali A; Unifying Themes in Complex Systems

    2006-01-01

    In recent years, scientists have applied the principles of complex systems science to increasingly diverse fields. The results have been nothing short of remarkable: their novel approaches have provided answers to long-standing questions in biology, ecology, physics, engineering, computer science, economics, psychology and sociology. The Third International Conference on Complex Systems attracted over 400 researchers from around the world. The conference aimed to encourage cross-fertilization between the many disciplines represented and to deepen our understanding of the properties common to all complex systems. This volume contains selected transcripts from presentations given at the conference. Speakers include: Chris Adami, Kenneth Arrow, Michel Baranger, Dan Braha, Timothy Buchman, Michael Caramanis, Kathleen Carley, Greg Chaitin, David Clark, Jack Cohen, Jim Collins, George Cowan, Clay Easterly, Steven Eppinger, Irving Epstein, Dan Frey, Ary Goldberger, Helen Harte, Leroy Hood, Don Ingber, Atlee Jackson,...

  20. Impact of familiarity on information complexity in human-computer interfaces

    Directory of Open Access Journals (Sweden)

    Bakaev Maxim

    2016-01-01

    A quantitative measure of information complexity remains very much desirable in the HCI field, since it may aid in the optimization of user interfaces, especially in human-computer systems for controlling complex objects. Our paper is dedicated to exploration of the subjective (subject-dependent) aspect of this complexity, conceptualized as information familiarity. Although research on familiarity in human cognition and behaviour has been done in several fields, the accepted models in HCI, such as the Model Human Processor or the Hick-Hyman law, do not generally consider this issue. In our experimental study the subjects performed search and selection of digits and letters, whose familiarity was conceptualized as frequency of occurrence in numbers and texts. The analysis showed a significant effect of information familiarity on selection time and throughput in regression models, although the R² values were somewhat low. Still, we hope that our results might aid in the quantification of information complexity and its further application to optimizing interaction in human-machine systems.
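
    For reference, the Hick-Hyman law mentioned above models mean choice reaction time over n equally likely alternatives as

    $$T = a + b \log_2(n + 1),$$

    where a and b are empirically fitted constants. The study's observation is that this standard form carries no term for how familiar the individual items are, which is the gap the familiarity experiments probe.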

  1. Mathematical approaches for complexity/predictivity trade-offs in complex system models : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Mayo, Jackson R.; Bhattacharyya, Arnab (Massachusetts Institute of Technology, Cambridge, MA); Armstrong, Robert C.; Vanderveen, Keith

    2008-09-01

    The goal of this research was to examine foundational methods, both computational and theoretical, that can improve the veracity of entity-based complex system models and increase confidence in their predictions for emergent behavior. The strategy was to seek insight and guidance from simplified yet realistic models, such as cellular automata and Boolean networks, whose properties can be generalized to production entity-based simulations. We have explored the usefulness of renormalization-group methods for finding reduced models of such idealized complex systems. We have prototyped representative models that are both tractable and relevant to Sandia mission applications, and quantified the effect of computational renormalization on the predictive accuracy of these models, finding good predictivity from renormalized versions of cellular automata and Boolean networks. Furthermore, we have theoretically analyzed the robustness properties of certain Boolean networks, relevant for characterizing organic behavior, and obtained precise mathematical constraints on systems that are robust to failures. In combination, our results provide important guidance for more rigorous construction of entity-based models, which currently are often devised in an ad-hoc manner. Our results can also help in designing complex systems with the goal of predictable behavior, e.g., for cybersecurity.
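
    A toy probe of the Boolean-network robustness idea mentioned above, flipping one node and watching how far the perturbation spreads (network size, connectivity and update rules are arbitrary):

```python
import random

random.seed(1)
N, K = 12, 2   # N nodes, each reading K=2 inputs
inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]
tables = [[random.randrange(2) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    # Each node looks up its next value from its random truth table.
    return [tables[i][state[inputs[i][0]] * 2 + state[inputs[i][1]]]
            for i in range(N)]

state = [random.randrange(2) for _ in range(N)]
perturbed = state.copy()
perturbed[0] ^= 1                      # flip a single node
for _ in range(10):
    state, perturbed = step(state), step(perturbed)
print(sum(a != b for a, b in zip(state, perturbed)), "of", N, "nodes differ")
```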

  2. The impact of treatment complexity and computer-control delivery technology on treatment delivery errors

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; Lash, Kathy L.; Matrone, Gwynne M.; Volkman, Susan K.; McShan, Daniel L.; Kessler, Marc L.; Lichter, Allen S.

    1998-01-01

    Purpose: To analyze treatment delivery errors for three-dimensional (3D) conformal therapy performed at various levels of treatment delivery automation and complexity, ranging from manual field setup to virtually complete computer-controlled treatment delivery using a computer-controlled conformal radiotherapy system (CCRS). Methods and Materials: All treatment delivery errors which occurred in our department during a 15-month period were analyzed. Approximately 34,000 treatment sessions (114,000 individual treatment segments [ports]) on four treatment machines were studied. All treatment delivery errors logged by treatment therapists or quality assurance reviews (152 in all) were analyzed. Machines 'M1' and 'M2' were operated in a standard manual setup mode, with no record and verify system (R/V). MLC machines 'M3' and 'M4' treated patients under the control of the CCRS system, which (1) downloads the treatment delivery plan from the planning system; (2) performs some (or all) of the machine set up and treatment delivery for each field; (3) monitors treatment delivery; (4) records all treatment parameters; and (5) notes exceptions to the electronically-prescribed plan. Complete external computer control is not available on M3; therefore, it uses as many CCRS features as possible, while M4 operates completely under CCRS control and performs semi-automated and automated multi-segment intensity modulated treatments. Analysis of treatment complexity was based on numbers of fields, individual segments, nonaxial and noncoplanar plans, multisegment intensity modulation, and pseudoisocentric treatments studied for a 6-month period (505 patients) concurrent with the period in which the delivery errors were obtained. Treatment delivery time was obtained from the computerized scheduling system (for manual treatments) or from CCRS system logs. Treatment therapists rotate among the machines; therefore, this analysis does not depend on fixed therapist staff on particular

  3. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of the solutions of complex-number linear systems or of the eigenvalues of Hermitian matrices. This library contains routines for both sequential computers and parallel computers. The subroutines for linear-system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions, and so on. The error-estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background of error analysis on linear algebra and the usage of the subroutines. (author)
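
    The two basic quantities such routines report, the residual norm and the condition number, are easy to illustrate for a complex linear system (a NumPy sketch with random data; the library described in this record is a separate subroutine package):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
b = rng.normal(size=n) + 1j * rng.normal(size=n)

x = np.linalg.solve(A, b)
residual = np.linalg.norm(b - A @ x)   # ~0 up to rounding error
kappa = np.linalg.cond(A)              # condition number of A
# Rule of thumb: relative forward error <~ kappa * machine epsilon.
print(residual, kappa, kappa * np.finfo(float).eps)
```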

  4. Ubiquitous Computing, Complexity and Culture

    DEFF Research Database (Denmark)

    The ubiquitous nature of mobile and pervasive computing has begun to reshape and complicate our notions of space, time, and identity. In this collection, over thirty internationally recognized contributors reflect on ubiquitous computing’s implications for the ways in which we interact with our environments, experience time, and develop identities individually and socially. Interviews with working media artists lend further perspectives on these cultural transformations. Drawing on cultural theory, new media art studies, human-computer interaction theory, and software studies, this cutting-edge book critically unpacks the complex ubiquity-effects confronting us every day.

  5. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
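
    As a small illustration of the diversity analyses in (i), here is a Shannon-entropy computation over clone frequencies in a toy repertoire sample (the sequences are invented):

```python
import math
from collections import Counter

clones = ["CARDYW", "CARDYW", "CASSLG", "CAWSVG", "CASSLG", "CARDYW"]
counts = Counter(clones)
total = sum(counts.values())
# Shannon diversity over clone frequencies, in nats.
shannon = -sum((c / total) * math.log(c / total) for c in counts.values())
print(f"{shannon:.3f} nats over {len(counts)} unique clones")
```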

  6. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum, from soft condensed matter systems and bioinformatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or the energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  7. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
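
    To make the quadrature idea concrete: for Im z > 0 the complex probability (Faddeeva) function can be written as w(z) = (i/π) ∫ e^(−t²)/(z − t) dt, which an n-point Gauss-Hermite rule approximates directly. A hedged sketch (assuming SciPy; n and z are arbitrary, and accuracy degrades near the real axis, one of the shortcomings discussed):

```python
import numpy as np
from scipy.special import roots_hermite, wofz

def w_gauss_hermite(z, n=40):
    # Nodes and weights for the Hermite weight function e^{-t^2}.
    t, wts = roots_hermite(n)
    return (1j / np.pi) * np.sum(wts / (z - t))

z = 1.5 + 0.8j
print(w_gauss_hermite(z))   # quadrature approximation
print(wofz(z))              # SciPy's Faddeeva function as reference
```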

  8. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  9. Study of application technology of ultra-high speed computer to the elucidation of complex phenomena

    International Nuclear Information System (INIS)

    Sekiguchi, Tomotsugu

    1996-01-01

    As a first step toward applying ultra-high-speed computers to the elucidation of complex phenomena, the basic design of a numerical information library for decentralized computer networks is explained. Establishing the system makes it possible to construct an efficient application environment for ultra-high-speed computers that is scalable across different computing systems. We named the system Ninf (Network Information Library for High Performance Computing). The library's application technology is summarized as follows: application technology of the library under a distributed environment, numeric constants, retrieval of values, a library of special functions, a computing library, the Ninf library interface, the Ninf remote library, and registration. With this system, users can employ programs that concentrate numerical-analysis technology with high precision, reliability and speed. (S.Y.)

  10. Applications of Computer Technology in Complex Craniofacial Reconstruction

    Directory of Open Access Journals (Sweden)

    Kristopher M. Day, MD

    2018-03-01

    Conclusion: Modern 3D technology allows the surgeon to better analyze complex craniofacial deformities, precisely plan surgical correction with computer simulation of results, customize osteotomies, plan distractions, and print 3DPCI as needed. Advanced 3D computer technology can be applied safely and may improve aesthetic and functional outcomes after complex craniofacial reconstruction. These techniques warrant further study and may be reproducible in various centers of care.

  11. Smart modeling and simulation for complex systems practice and theory

    CERN Document Server

    Ren, Fenghui; Zhang, Minjie; Ito, Takayuki; Tang, Xijin

    2015-01-01

    This book provides a description of new Artificial Intelligence technologies and approaches for the modeling and simulation of complex systems, as well as an overview of the latest scientific efforts in this field, such as platforms and/or software tools for the smart modeling and simulation of complex systems. These tasks are difficult to accomplish using traditional computational approaches owing to the complex relationships among components and the distributed nature of resources, as well as dynamic work environments. In order to model complex systems effectively, intelligent technologies such as multi-agent systems and smart grids are employed to model and simulate complex systems in the areas of ecosystems, social and economic organization, web-based grid services, transportation systems, power systems and evacuation systems.

  12. Computability, complexity, logic

    CERN Document Server

    Börger, Egon

    1989-01-01

    The theme of this book is formed by a pair of concepts: the concept of formal language as carrier of the precise expression of meaning, facts and problems, and the concept of algorithm or calculus, i.e. a formally operating procedure for the solution of precisely described questions and problems. The book is a unified introduction to the modern theory of these concepts, to the way in which they developed first in mathematical logic and computability theory and later in automata theory, and to the theory of formal languages and complexity theory. Apart from considering the fundamental themes an

  13. ANCON: A code for the evaluation of complex fault trees in personal computers

    International Nuclear Information System (INIS)

    Napoles, J.G.; Salomon, J.; Rivero, J.

    1990-01-01

    Performing probabilistic safety analysis has been recognized worldwide as one of the more effective ways of further enhancing the safety of nuclear power plants, and the evaluation of fault trees plays a fundamental role in these analyses. Existing limitations in the RAM and execution speed of personal computers (PCs) have so far restricted their use in the analysis of complex fault trees. Starting from new approaches to the data structure, among other possibilities, the ANCON code can evaluate complex fault trees on a PC, allowing the user to perform a more comprehensive analysis of the considered system in reduced computing time.

  14. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over the network, and the widespread use of software for design and pre-production in mechanical engineering have led to a situation in which large industrial enterprises and small engineering companies alike implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key models of research, but the system-wide problems of efficient distribution (balancing) of the computational load and the accommodation of input, intermediate and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes, and the selection of a node to which a user's request is transferred, in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods for increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing optimal schedules in a distributed system that dynamically changes its infrastructure is an important task.
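
    A toy version of the balancing policy described above (send each request to the currently least-loaded node; the node count and task costs are invented):

```python
import random

class Node:
    def __init__(self, name):
        self.name, self.load = name, 0

nodes = [Node(f"n{i}") for i in range(4)]

def dispatch(task_cost):
    target = min(nodes, key=lambda n: n.load)  # selection algorithm
    target.load += task_cost                   # monitoring updates the load
    return target.name

for _ in range(20):
    dispatch(random.randint(1, 5))
print({n.name: n.load for n in nodes})  # loads end up roughly even
```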

  15. Sustaining Economic Exploitation of Complex Ecosystems in Computational Models of Coupled Human-Natural Networks

    OpenAIRE

    Martinez, Neo D.; Tonin, Perrine; Bauer, Barbara; Rael, Rosalyn C.; Singh, Rahul; Yoon, Sangyuk; Yoon, Ilmi; Dunne, Jennifer A.

    2012-01-01

    Understanding ecological complexity has stymied scientists for decades. Recent elucidation of the famously coined "devious strategies for stability in enduring natural systems" has opened up a new field of computational analyses of complex ecological networks where the nonlinear dynamics of many interacting species can be more realistically modeled and understood. Here, we describe the first extension of this field to include coupled human-natural systems. This extension elucidates new strat…

  16. Coherence and computational complexity of quantifier-free dependence logic formulas

    NARCIS (Netherlands)

    Kontinen, J.; Kontinen, J.; Väänänen, J.

    2010-01-01

    We study the computational complexity of model checking for quantifier-free dependence logic (D) formulas. We point out three thresholds in the computational complexity: logarithmic space, non-deterministic logarithmic space and non-deterministic polynomial time.

  17. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici…

  18. The Convergence of the telematic, computing and information services as a basis for using artificial intelligence to manage complex techno-organizational systems

    Directory of Open Access Journals (Sweden)

    Raikov Alexander

    2018-01-01

    The authors analyse the problems of using artificial intelligence to manage complex techno-organizational systems on the basis of the convergence of telematic, computing and information services in the aerospace industry. This means giving space objects a higher level of management based on the principle of self-organizing integration. Using elements of artificial intelligence allows us to obtain more nearly optimal and limiting values of parameters in ordinary and critical situations in real time. Thus, it helps us come closer to the limiting values of the managed objects' parameters, owing to increased managing and observation capabilities.

  19. Morphogenetic Engineering Toward Programmable Complex Systems

    CERN Document Server

    Sayama, Hiroki; Michel, Olivier

    2012-01-01

    Generally, spontaneous pattern formation phenomena are random and repetitive, whereas elaborate devices are the deterministic product of human design. Yet, biological organisms and collective insect constructions are exceptional examples of complex systems that are both self-organized and architectural.   This book is the first initiative of its kind toward establishing a new field of research, Morphogenetic Engineering, to explore the modeling and implementation of “self-architecturing” systems. Particular emphasis is placed on the programmability and computational abilities of self-organization, properties that are often underappreciated in complex systems science—while, conversely, the benefits of self-organization are often underappreciated in engineering methodologies.   Altogether, the aim of this work is to provide a framework for and examples of a larger class of “self-architecturing” systems, while addressing fundamental questions such as   > How do biological organisms carry out morphog...

  20. Stochastic transport in complex systems from molecules to vehicles

    CERN Document Server

    Schadschneider, Andreas; Nishinari, Katsuhiro

    2011-01-01

    What is common between a motor protein, an ant and a vehicle? Each can be modelled as a "self-propelled particle" whose forward movement can be hindered by another in front of it. Traffic flow of such interacting driven "particles" has become an active area of interdisciplinary research involving physics, civil engineering and computer science. We present a unified pedagogical introduction to the analytical and computational methods which are currently used for studying such complex systems far from equilibrium. We also review a number of applications ranging from intra-cellular molecular motor transport in living systems to ant trails and vehicular traffic. Researchers working on complex systems, in general, and on classical stochastic transport, in particular, will find the pedagogical style, scholarly critical overview and extensive list of references extremely useful.

  1. Computer-aided control system design

    International Nuclear Information System (INIS)

    Lebenhaft, J.R.

    1986-01-01

    Control systems are typically implemented using conventional PID controllers, which are then tuned manually during plant commissioning to compensate for interactions between feedback loops. As plants increase in size and complexity, such controllers can fail to provide adequate process regulation. Multivariable methods can be utilized to overcome these limitations. At the Chalk River Nuclear Laboratories, modern control systems are designed and analyzed with the aid of MVPACK, a system of computer programs that appears to the user like a high-level calculator. The software package solves complicated control problems and provides useful insight into the dynamic response and stability of multivariable systems.
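
    A minimal discrete PID loop of the kind tuned manually at commissioning, closed around a crude first-order plant (all gains and the plant model are invented for illustration):

```python
def pid_step(error, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1):
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

setpoint, y, state = 1.0, 0.0, (0.0, 0.0)
for _ in range(200):
    u, state = pid_step(setpoint - y, state)
    y += 0.05 * (u - y)   # toy first-order plant response
print(round(y, 3))        # y settles near the 1.0 setpoint
```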

  2. Interactive granular computations in networks and systems engineering a practical perspective

    CERN Document Server

    Jankowski, Andrzej

    2017-01-01

    The book outlines selected projects conducted under the supervision of the author. Moreover, it discusses significant relations between Interactive Granular Computing (IGrC) and numerous dynamically developing scientific domains worldwide, along with features characteristic of the author’s approach to IGrC. The results presented are a continuation and elaboration of various aspects of Wisdom Technology, initiated and developed in cooperation with Professor Andrzej Skowron. Based on the empirical findings from these projects, the author explores the following areas: (a) understanding the causes of the theory and practice gap problem (TPGP) in complex systems engineering (CSE);(b) generalizing computing models of complex adaptive systems (CAS) (in particular, natural computing models) by constructing an interactive granular computing (IGrC) model of networks of interrelated interacting complex granules (c-granules), belonging to a single agent and/or to a group of agents; (c) developing methodologies based ...

  3. Spectrometer user interface to computer systems

    International Nuclear Information System (INIS)

    Salmon, L.; Davies, M.; Fry, F.A.; Venn, J.B.

    1979-01-01

    A computer system for use in radiation spectrometry should be designed around the needs and comprehension of the user and his operating environment. To this end, the functions of the system should be built in a modular and independent fashion such that they can be joined to the back end of an appropriate user interface. The point that this interface should be designed, rather than just allowed to evolve, is illustrated by reference to four related computer systems of differing complexity and function. The physical user interfaces in all cases are keyboard terminals, and the virtues and otherwise of these devices are discussed and compared with others. The language interface needs to satisfy a number of requirements, often conflicting. Among these, simplicity and speed of operation compete with flexibility and scope. Both experienced and novice users need to be considered, and any individual's needs may vary from naive to complex. To be efficient and resilient, the implementation must use an operating system, but the user needs to be protected from its complex and unfamiliar syntax. At the same time the interface must allow the user access to all services appropriate to his needs. The user must also receive an image of privacy in a multi-user system. The interface itself must be stable and exhibit continuity between implementations. Some of these conflicting needs have been overcome by the SABRE interface, with languages operating at several levels. The foundation is a simple semi-mnemonic command language that activates individual and independent functions. The commands can be used with positional parameters or in an interactive dialogue, the precise nature of which depends upon the operating environment and the user's experience. A command procedure or macro language allows combinations of commands with conditional branching and arithmetic features. Thus complex but repetitive operations are easily performed.

  4. Physiological Dynamics in Demyelinating Diseases: Unraveling Complex Relationships through Computer Modeling

    Directory of Open Access Journals (Sweden)

    Jay S. Coggan

    2015-09-01

    Despite intense research, few treatments are available for most neurological disorders, and demyelinating diseases are no exception. This is perhaps not surprising considering the multifactorial nature of these diseases, which involve complex interactions between immune system cells, glia and neurons. In the case of multiple sclerosis, for example, there is no unanimity among researchers about the cause or even about which system or cell type could be ground zero. This situation precludes the development and strategic application of mechanism-based therapies. We will discuss how computational modeling applied to questions at different biological levels can help link together disparate observations and decipher complex mechanisms whose solutions are not amenable to simple reductionism. By making testable predictions and revealing critical gaps in existing knowledge, such models can help direct research and will provide a rigorous framework in which to integrate new data as they are collected. Nowadays, there is no shortage of data; the challenge is to make sense of it all. In that respect, computational modeling is an invaluable tool that could, ultimately, transform how we understand, diagnose, and treat demyelinating diseases.

  5. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  6. Infrastructure Support for Collaborative Pervasive Computing Systems

    DEFF Research Database (Denmark)

    Vestergaard Mogensen, Martin

    Collaborative Pervasive Computing Systems (CPCS) are currently being deployed to support areas such as clinical work, emergency situations, education, ad-hoc meetings, and other areas involving information sharing and collaboration. These systems allow the users to work together synchronously, but from different places, by sharing information and coordinating activities. Several researchers have shown the value of such distributed collaborative systems. However, building these systems is by no means a trivial task and introduces a lot of yet unanswered questions. The aforementioned areas are all characterized by unstable, volatile environments, either due to the underlying components changing or the nomadic work habits of users. A major challenge for the creators of collaborative pervasive computing systems is the construction of infrastructures supporting the system. The complexity...

  7. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. This book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  8. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  9. Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.

    Science.gov (United States)

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.

  10. Stephen Jay Kline on systems, or physics, complex systems, and the gap between.

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Philip LaRoche

    2011-06-01

    At the end of his life, Stephen Jay Kline, longtime professor of mechanical engineering at Stanford University, completed a book on how to address complex systems. The title of the book is 'Conceptual Foundations of Multi-Disciplinary Thinking' (1995), but the topic of the book is systems. Kline first establishes certain limits that are characteristic of our conscious minds. Kline then establishes a complexity measure for systems and uses that complexity measure to develop a hierarchy of systems. Kline then argues that our minds, due to their characteristic limitations, are unable to model the complex systems in that hierarchy. Computers are of no help to us here. Our attempts at modeling these complex systems are based on the way we successfully model some simple systems, in particular, 'inert, naturally-occurring' objects and processes, such as what is the focus of physics. But complex systems overwhelm such attempts. As a result, the best we can do in working with these complex systems is to use a heuristic, what Kline calls the 'Guideline for Complex Systems.' Kline documents the problems that have developed due to 'oversimple' system models and from the inappropriate application of a system model from one domain to another. One prominent such problem is the Procrustean attempt to make the disciplines that deal with complex systems be 'physics-like.' Physics deals with simple systems, not complex ones, using Kline's complexity measure. The models that physics has developed are inappropriate for complex systems. Kline documents a number of the wasteful and dangerous fallacies of this type.

  11. Soft computing in green and renewable energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, Kasthurirangan [Iowa State Univ., Ames, IA (United States). Iowa Bioeconomy Inst.; US Department of Energy, Ames, IA (United States). Ames Lab]; Kalogirou, Soteris [Cyprus Univ. of Technology, Limassol (Cyprus). Dept. of Mechanical Engineering and Materials Sciences and Engineering]; Khaitan, Siddhartha Kumar (eds.) [Iowa State Univ. of Science and Technology, Ames, IA (United States). Dept. of Electrical Engineering and Computer Engineering]

    2011-07-01

    Soft Computing in Green and Renewable Energy Systems provides a practical introduction to the application of soft computing techniques and hybrid intelligent systems for designing, modeling, characterizing, optimizing, forecasting, and performance prediction of green and renewable energy systems. Research is proceeding at jet speed on renewable energy (energy derived from natural resources such as sunlight, wind, tides, rain, geothermal heat, biomass, hydrogen, etc.) as policy makers, researchers, economists, and world agencies have joined forces in finding alternative sustainable energy solutions to current critical environmental, economic, and social issues. The innovative models, environmentally benign processes, data analytics, etc. employed in renewable energy systems are computationally-intensive, non-linear and complex as well as involve a high degree of uncertainty. Soft computing technologies, such as fuzzy sets and systems, neural science and systems, evolutionary algorithms and genetic programming, and machine learning, are ideal in handling the noise, imprecision, and uncertainty in the data, and yet achieve robust, low-cost solutions. As a result, intelligent and soft computing paradigms are finding increasing applications in the study of renewable energy systems. Researchers, practitioners, undergraduate and graduate students engaged in the study of renewable energy systems will find this book very useful. (orig.)

  12. Numerical Nuclear Second Derivatives on a Computing Grid: Enabling and Accelerating Frequency Calculations on Complex Molecular Systems.

    Science.gov (United States)

    Yang, Tzuhsiung; Berry, John F

    2018-06-04

    The computation of nuclear second derivatives of energy, or the nuclear Hessian, is an essential routine in quantum chemical investigations of ground and transition states, thermodynamic calculations, and molecular vibrations. Analytic nuclear Hessian computations require the resolution of costly coupled-perturbed self-consistent field (CP-SCF) equations, while numerical differentiation of analytic first derivatives has an unfavorable 6N (N = number of atoms) prefactor. Herein, we present a new method in which grid computing is used to accelerate and/or enable the evaluation of the nuclear Hessian via numerical differentiation: NUMFREQ@Grid. Nuclear Hessians were successfully evaluated by NUMFREQ@Grid at the DFT level as well as using RIJCOSX-ZORA-MP2 or RIJCOSX-ZORA-B2PLYP for a set of linear polyacenes with systematically increasing size. For the larger members of this group, NUMFREQ@Grid was found to outperform the wall clock time of analytic Hessian evaluation; at the MP2 or B2PLYP levels, these Hessians cannot even be evaluated analytically. We also evaluated a 156-atom catalytically relevant open-shell transition metal complex and found that NUMFREQ@Grid is faster (7.7 times shorter wall clock time) and less demanding (4.4 times less memory requirement) than an analytic Hessian. Capitalizing on the capabilities of parallel grid computing, NUMFREQ@Grid can outperform analytic methods in terms of wall time, memory requirements, and treatable system size. The NUMFREQ@Grid method presented herein demonstrates how grid computing can be used to facilitate embarrassingly parallel computational procedures and is a pioneer for future implementations.
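
    As a rough illustration of the numerical route described above (not the NUMFREQ@Grid code itself; the step size and toy gradient below are our own choices), the sketch builds a Hessian by central differences of an analytic gradient. Each displaced-gradient evaluation is independent of the others, which is what makes the procedure embarrassingly parallel and suitable for a computing grid.

        import numpy as np

        def numerical_hessian(grad, x0, h=1e-4):
            """Hessian by central differences of an analytic gradient.
            The 2*n displaced gradients are mutually independent, so each
            could be farmed out to a separate grid node."""
            n = len(x0)
            H = np.empty((n, n))
            for i in range(n):
                e = np.zeros(n)
                e[i] = h
                H[:, i] = (grad(x0 + e) - grad(x0 - e)) / (2.0 * h)
            return 0.5 * (H + H.T)  # symmetrize away numerical noise

        A = np.array([[2.0, 0.5], [0.5, 1.0]])

        def grad(x):
            return A @ x  # gradient of the quadratic 0.5 * x.T @ A @ x

        print(numerical_hessian(grad, np.zeros(2)))  # recovers A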

  13. Physical approach to complex systems

    Science.gov (United States)

    Kwapień, Jarosław; Drożdż, Stanisław

    2012-06-01

    Typically, complex systems are natural or social systems which consist of a large number of nonlinearly interacting elements. These systems are open, they interchange information or mass with environment and constantly modify their internal structure and patterns of activity in the process of self-organization. As a result, they are flexible and easily adapt to variable external conditions. However, the most striking property of such systems is the existence of emergent phenomena which cannot be simply derived or predicted solely from the knowledge of the systems’ structure and the interactions among their individual elements. This property points to the holistic approaches which require giving parallel descriptions of the same system on different levels of its organization. There is strong evidence, consolidated also in the present review, that different, even apparently disparate complex systems can have astonishingly similar characteristics both in their structure and in their behaviour. One can thus expect the existence of some common, universal laws that govern their properties. Physics methodology proves helpful in addressing many of the related issues. In this review, we advocate some of the computational methods which in our opinion are especially fruitful in extracting information on selected, but at the same time most representative, complex systems like the human brain, financial markets and natural language, from the time series representing the observables associated with these systems. The properties we focus on comprise the collective effects and their coexistence with noise, long-range interactions, the interplay between determinism and flexibility in evolution, scale invariance, criticality, multifractality and hierarchical structure. The methods described either originate from “hard” physics, like the random matrix theory, and then were transmitted to other fields of science via the field of complex systems research, or they originated elsewhere but
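
    One of the random-matrix-theory tools alluded to above can be shown in a few lines: comparing the spectrum of an empirical correlation matrix against the Marchenko-Pastur edge expected for purely random data, the standard null test used for financial time series. A minimal sketch with synthetic series (all sizes are illustrative):

        import numpy as np

        rng = np.random.default_rng(0)
        N, T = 100, 500                       # number of series, observations
        C = np.corrcoef(rng.standard_normal((N, T)))
        eig = np.linalg.eigvalsh(C)           # eigenvalues in ascending order

        q = N / T
        mp_edge = (1 + np.sqrt(q)) ** 2       # Marchenko-Pastur upper edge
        print("largest eigenvalue:", round(eig[-1], 3),
              "MP edge:", round(mp_edge, 3))
        # Eigenvalues well above the edge would signal genuine collective modes.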

  14. Molecular computing towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main difference of molecular circuits and traditional digital circuits to manipulate the same problem and you will also learn how to design a molecular algorithm of solving any a problem from start to finish. The book starts with an introduction to computational aspects of digital computers and molecular computing, data representation of molecular computing, molecular operations of molecular computing and number representation of molecular computing, and provides many molecular algorithm to construct the parity generator and the parity checker of error-detection codes on digital communication, to encode integers of different formats, single precision and double precision of floating-point numbers, to implement addition and subtraction of unsigned integers, to construct logic operations...

  15. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  16. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  17. Architecture, systems research and computational sciences

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 1) issue of the Nexus Network Journal is dedicated to the theme “Architecture, Systems Research and Computational Sciences”. This is an outgrowth of the session by the same name which took place during the eighth international, interdisciplinary conference “Nexus 2010: Relationships between Architecture and Mathematics”, held in Porto, Portugal, in June 2010. Today computer science is an integral part of even strictly historical investigations, such as those concerning the construction of vaults, where the computer is used to survey the existing building, analyse the data and draw the ideal solution. What the papers in this issue make especially evident is that information technology has had an impact at a much deeper level as well: architecture itself can now be considered as a manifestation of information and as a complex system. The issue is completed with other research papers, conference reports and book reviews.

  18. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    Science.gov (United States)

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties of security service recommendations in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.
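
    A minimal sketch of the friend-based idea, under our own simplifying assumption (not necessarily the paper's method) that crowd-sensed service ratings are aggregated with weights given by social tie strength; all names and numbers are illustrative:

        # Hypothetical friend-weighted rating aggregation: each fog node
        # rates security services; a querying node trusts ratings in
        # proportion to the strength of its social ("friend") ties.
        friend_weight = {"nodeA": 1.0, "nodeB": 0.5, "nodeC": 0.1}
        ratings = {   # service -> {node: crowd-sensed score in [0, 1]}
            "svc_ids":      {"nodeA": 0.9, "nodeB": 0.7, "nodeC": 0.2},
            "svc_firewall": {"nodeA": 0.4, "nodeB": 0.8, "nodeC": 0.9},
        }

        def recommend(ratings, weights):
            scored = {}
            for svc, votes in ratings.items():
                pairs = [(weights.get(n, 0.0), r) for n, r in votes.items()]
                total = sum(w for w, _ in pairs)
                scored[svc] = (sum(w * r for w, r in pairs) / total
                               if total else 0.0)
            return max(scored, key=scored.get), scored

        best, scores = recommend(ratings, friend_weight)
        print("recommended:", best, scores)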

  19. Simulating Complex Systems by Cellular Automata

    CERN Document Server

    Kroc, Jiri; Hoekstra, Alfons G

    2010-01-01

    Deeply rooted in fundamental research in Mathematics and Computer Science, Cellular Automata (CA) are recognized as an intuitive modeling paradigm for Complex Systems. Already very basic CA, with extremely simple micro dynamics such as the Game of Life, show an almost endless display of complex emergent behavior. Conversely, CA can also be designed to produce a desired emergent behavior, using either theoretical methodologies or evolutionary techniques. Meanwhile, beyond the original realm of applications - Physics, Computer Science, and Mathematics – CA have also become work horses in very different disciplines such as epidemiology, immunology, sociology, and finance. In this context of fast and impressive progress, spurred further by the enormous attraction these topics have on students, this book emerges as a welcome overview of the field for its practitioners, as well as a good starting point for detailed study on the graduate and post-graduate level. The book contains three parts, two major parts on th...
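
    The Game of Life mentioned above fits in a dozen lines and already shows the emergence the book is concerned with. A minimal sketch on a toroidal grid, with a glider as the test pattern (grid size and step count are illustrative):

        import numpy as np

        def life_step(grid):
            """One synchronous update of Conway's Game of Life on a torus."""
            n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
            # Birth on exactly 3 neighbours; survival on 2 or 3.
            return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

        glider = np.zeros((10, 10), dtype=int)
        for r, c in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:
            glider[r, c] = 1
        for _ in range(4):                 # after 4 steps the glider has
            glider = life_step(glider)     # translated itself diagonally
        print(glider.sum(), "live cells")  # still 5: the pattern persists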

  20. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual ''walk-throughs'' for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use

  1. Computational intelligence for decision support in cyber-physical systems

    CERN Document Server

    Ali, A; Riaz, Zahid

    2014-01-01

    This book is dedicated to applied computational intelligence and soft computing techniques with special reference to decision support in Cyber Physical Systems (CPS), where the physical as well as the communication segment of the networked entities interact with each other. The joint dynamics of such systems result in a complex combination of computers, software, networks and physical processes all combined to establish a process flow at system level. This volume provides the audience with an in-depth vision about how to ensure dependability, safety, security and efficiency in real time by making use of computational intelligence in various CPS applications ranging from the nano-world to large scale wide area systems of systems. Key application areas include healthcare, transportation, energy, process control and robotics where intelligent decision support has key significance in establishing dynamic, ever-changing and high confidence future technologies. A recommended text for graduate students and researche...

  2. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  3. Modelling, Estimation and Control of Networked Complex Systems

    CERN Document Server

    Chiuso, Alessandro; Frasca, Mattia; Rizzo, Alessandro; Schenato, Luca; Zampieri, Sandro

    2009-01-01

    The paradigm of complexity is pervading both science and engineering, leading to the emergence of novel approaches oriented at the development of a systemic view of the phenomena under study; the definition of powerful tools for modelling, estimation, and control; and the cross-fertilization of different disciplines and approaches. This book is devoted to networked systems which are one of the most promising paradigms of complexity. It is demonstrated that complex, dynamical networks are powerful tools to model, estimate, and control many interesting phenomena, like agent coordination, synchronization, social and economics events, networks of critical infrastructures, resources allocation, information processing, or control over communication networks. Moreover, it is shown how the recent technological advances in wireless communication and the decreasing cost and size of electronic devices are promoting the appearance of large inexpensive interconnected systems, each with computational, sensing and mobile cap...

  4. The use of CAMAC with small computers in the TRIUMF control system

    International Nuclear Information System (INIS)

    Gurd, D.P.; Heywood, D.R.; Johnson, R.R.

    1975-08-01

    The TRIUMF control system uses several small computers. This allows tasks to be partitioned in hardware rather than by a complex operating system. This flexibility was especially convenient during the developmental stages of TRIUMF. The multi-mini approach also improves mean time to repair. All control system computers are to be interfaced simultaneously to a single CAMAC system of 35 crates on seven branches. Other computers, belonging to separate systems, communicate with the control system via parallel CAMAC-to-CAMAC links. Modularity at both the computer and controller levels, combined with CAMAC multisourcing, has allowed the introduction of considerable redundancy, thereby increasing overall system reliability. (author)

  5. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    Science.gov (United States)

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
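
    The Gillespie algorithm named above is easy to state concretely. A minimal sketch for the simplest possible reaction network, the decay A -> 0, whose CME it samples exactly (rate constant and sizes are illustrative):

        import numpy as np

        def gillespie_decay(n0=100, k=0.1, t_end=50.0, seed=1):
            """Gillespie SSA for the single reaction A -> 0 at rate k*n:
            exact stochastic trajectories of the CME for this system."""
            rng = np.random.default_rng(seed)
            t, n = 0.0, n0
            ts, ns = [t], [n]
            while n > 0 and t < t_end:
                a = k * n                        # total propensity
                t += rng.exponential(1.0 / a)    # waiting time to next event
                n -= 1                           # the only possible event fires
                ts.append(t); ns.append(n)
            return np.array(ts), np.array(ns)

        ts, ns = gillespie_decay()
        print(f"{len(ts) - 1} reaction events; final copy number {ns[-1]}")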

  6. A general digital computer procedure for synthesizing linear automatic control systems

    International Nuclear Information System (INIS)

    Cummins, J.D.

    1961-10-01

    The fundamental concepts required for synthesizing a linear automatic control system are considered. A generalized procedure for synthesizing automatic control systems is demonstrated. This procedure has been programmed for the Ferranti Mercury and the IBM 7090 computers. Details of the programmes are given. The procedure uses the linearized set of equations which describe the plant to be controlled as the starting point. Subsequent computations determine the transfer functions between any desired variables. The programmes also compute the root and phase loci for any linear (and some non-linear) configurations in the complex plane, the open loop and closed loop frequency responses of a system, the residues of a function of the complex variable 's' and the time response corresponding to these residues. With these general programmes available the design of 'one point' automatic control systems becomes a routine scientific procedure. Also dynamic assessments of plant may be carried out. Certain classes of multipoint automatic control problems may also be solved with these procedures. Autonomous systems, invariant systems and orthogonal systems may also be studied. (author)
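
    The root-locus computation described above reduces, for a unity-feedback loop with gain K, to finding the roots of den(s) + K*num(s) as K sweeps. A minimal numerical sketch (the example plant is our own choice, not taken from the report):

        import numpy as np

        # Open-loop transfer function G(s) = num(s)/den(s) = 1/(s(s+1)(s+2))
        num = np.array([1.0])
        den = np.array([1.0, 3.0, 2.0, 0.0])

        # Characteristic equation of the closed loop: den(s) + K*num(s) = 0.
        # Sweeping K and solving for the roots traces the root locus.
        pad = len(den) - len(num)
        for K in (0.5, 2.0, 6.0, 10.0):
            char_poly = den + K * np.pad(num, (pad, 0))
            poles = np.roots(char_poly)
            stable = all(p.real < 0 for p in poles)
            print(f"K={K:5.1f}  poles={np.round(poles, 3)}  stable={stable}")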

  7. Evaluating the response of complex systems to environmental threats: the Σ II method

    International Nuclear Information System (INIS)

    Corynen, G.C.

    1983-05-01

    The Σ II method was developed to model and compute the probabilistic performance of systems that operate in a threatening environment. Although we emphasize the vulnerability of complex systems to earthquakes and to electromagnetic threats such as EMP (electromagnetic pulse), the method applies in general to most large-scale systems or networks that are embedded in a potentially harmful environment. Other methods exist for obtaining system vulnerability, but their complexity increases exponentially as the size of systems is increased. The complexity of the Σ II method is polynomial, and accurate solutions are now possible for problems for which current methods require the use of rough statistical bounds, confidence statements, and other approximations. For super-large problems, where the costs of precise answers may be prohibitive, a desired accuracy can be specified, and the Σ II algorithms will halt when that accuracy has been reached. We summarize the results of a theoretical complexity analysis - which is reported elsewhere - and validate the theory with computer experiments conducted both on worst-case academic problems and on more reasonable problems occurring in practice. Finally, we compare our method with the exact methods of Abraham and Nakazawa, and with current bounding methods, and we demonstrate the computational efficiency and accuracy of Σ II

  8. Characterizations and computational complexity of systolic trellis automata

    Energy Technology Data Exchange (ETDEWEB)

    Ibarra, O H; Kim, S M

    1984-03-01

    Systolic trellis automata are simple models for VLSI. The authors characterize the computing power of these models in terms of Turing machines. The characterizations are useful in proving new results as well as giving simpler proofs of known results. They also derive lower and upper bounds on the computational complexity of the models. 18 references.

  9. Reduction of treatment delivery variances with a computer-controlled treatment delivery system

    International Nuclear Information System (INIS)

    Fraass, B.A.; Lash, K.L.; Matrone, G.M.; Lichter, A.S.

    1997-01-01

    Purpose: To analyze treatment delivery variances for 3-D conformal therapy performed at various levels of treatment delivery automation, ranging from manual field setup to virtually complete computer-controlled treatment delivery using a computer-controlled conformal radiotherapy system. Materials and Methods: All external beam treatments performed in our department during six months of 1996 were analyzed to study treatment delivery variances versus treatment complexity. Treatments for 505 patients (40,641 individual treatment ports) on four treatment machines were studied. All treatment variances noted by treatment therapists or quality assurance reviews (39 in all) were analyzed. Machines 'M1' (Clinac 6/100) and 'M2' (Clinac 1800) were operated in a standard manual setup mode, with no record and verify system (R/V). Machines 'M3' (Clinac 2100CD/MLC) and 'M4' (MM50 racetrack microtron system with MLC) treated patients under the control of a computer-controlled conformal radiotherapy system (CCRS) which 1) downloads the treatment delivery plan from the planning system, 2) performs some (or all) of the machine set-up and treatment delivery for each field, 3) monitors treatment delivery, 4) records all treatment parameters, and 5) notes exceptions to the electronically-prescribed plan. Complete external computer control is not available on M3, so it uses as many CCRS features as possible, while M4 operates completely under CCRS control and performs semi-automated and automated multi-segment intensity modulated treatments. Analysis of treatment complexity was based on numbers of fields, individual segments (ports), non-axial and non-coplanar plans, multi-segment intensity modulation, and pseudo-isocentric treatments (and other plans with computer-controlled table motions). Treatment delivery time was obtained from the computerized scheduling system (for manual treatments) or from CCRS system logs. Treatment therapists rotate among the machines, so this analysis

  10. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    Science.gov (United States)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.
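
    A toy sketch of the coordination pattern described above (not the FIDO code itself; the discipline functions and the design variable are invented for illustration): two hypothetical discipline tasks run concurrently and deposit their results into centralized data.

        from concurrent.futures import ThreadPoolExecutor

        # Hypothetical discipline codes; in a FIDO-like setting each would
        # be a separate program on its own machine, coordinated centrally.
        def aerodynamics(design):
            return {"drag": 0.02 * design["span"]}

        def structures(design):
            return {"mass": 150.0 + design["span"] ** 2}

        design = {"span": 30.0}
        central_data = {}
        with ThreadPoolExecutor() as pool:   # disciplines run concurrently
            futures = {pool.submit(f, design): f.__name__
                       for f in (aerodynamics, structures)}
            for fut, name in futures.items():
                central_data[name] = fut.result()
        print(central_data)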

  11. Transition Manifolds of Complex Metastable Systems

    Science.gov (United States)

    Bittracher, Andreas; Koltai, Péter; Klus, Stefan; Banisch, Ralf; Dellnitz, Michael; Schütte, Christof

    2018-04-01

    We consider complex dynamical systems showing metastable behavior, but no local separation of fast and slow time scales. The article raises the question of whether such systems exhibit a low-dimensional manifold supporting its effective dynamics. For answering this question, we aim at finding nonlinear coordinates, called reaction coordinates, such that the projection of the dynamics onto these coordinates preserves the dominant time scales of the dynamics. We show that, based on a specific reducibility property, the existence of good low-dimensional reaction coordinates preserving the dominant time scales is guaranteed. Based on this theoretical framework, we develop and test a novel numerical approach for computing good reaction coordinates. The proposed algorithmic approach is fully local and thus not prone to the curse of dimension with respect to the state space of the dynamics. Hence, it is a promising method for data-based model reduction of complex dynamical systems such as molecular dynamics.

  12. Common data buffer system. [communication with computational equipment utilized in spacecraft operations

    Science.gov (United States)

    Byrne, F. (Inventor)

    1981-01-01

    A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.
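
    A toy sketch of the validity-check idea (the class and method names are ours, not the patent's): each transfer through the buffer is stored with a checksum and verified on read, with an exception standing in for the error notification described above.

        import zlib

        class CommonBuffer:
            """Toy shared buffer: every transfer is stored together with a
            CRC32 checksum and validated on read."""
            def __init__(self):
                self.slots = {}

            def write(self, key, payload: bytes):
                self.slots[key] = (payload, zlib.crc32(payload))

            def read(self, key):
                payload, crc = self.slots[key]
                if zlib.crc32(payload) != crc:
                    raise IOError(f"transfer {key}: checksum mismatch")
                return payload

        buf = CommonBuffer()
        buf.write("telemetry", b"\x01\x02\x03")
        print(buf.read("telemetry"))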

  13. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    Directory of Open Access Journals (Sweden)

    Jun Wu

    2017-07-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties of security service recommendations in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  14. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    Science.gov (United States)

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. The aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties of security service recommendations in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  15. Phase transition and computational complexity in a stochastic prime number generator

    Energy Technology Data Exchange (ETDEWEB)

    Lacasa, L; Luque, B [Departamento de Matemática Aplicada y Estadística, ETSI Aeronáuticos, Universidad Politécnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Miramontes, O [Departamento de Sistemas Complejos, Instituto de Física, Universidad Nacional Autónoma de México, México 01415 DF (Mexico)], E-mail: lucas@dmae.upm.es

    2008-02-15

    We introduce a prime number generator in the form of a stochastic algorithm. The character of this algorithm gives rise to a continuous phase transition which distinguishes a phase where the algorithm is able to reduce the whole system of numbers into primes and a phase where the system reaches a frozen state with low prime density. In this paper, we firstly present a broader characterization of this phase transition, both in analytical and numerical terms. Critical exponents are calculated, and data collapse is provided. Further on, we redefine the model as a search problem, fitting it in the hallmark of computational complexity theory. We suggest that the system belongs to the class NP. The computational cost is maximal around the threshold, as is common in many algorithmic phase transitions, revealing the presence of an easy-hard-easy pattern. We finally relate the nature of the phase transition to an average-case classification of the problem.
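
    The abstract does not spell the dynamics out, so the sketch below is a toy variant in the same spirit, not the authors' exact model: numbers in a pool are repeatedly divided by randomly chosen divisors from the pool, and the surviving prime density is measured (pool size, number range, and step count are illustrative).

        import random

        def is_prime(n):
            return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

        def reduce_pool(size=200, m=1000, steps=20000, seed=3):
            """Pick two pool members; if one divides the other, replace the
            multiple by the quotient. Primes act as absorbing states."""
            rng = random.Random(seed)
            pool = [rng.randint(2, m) for _ in range(size)]
            for _ in range(steps):
                i, j = rng.sample(range(size), 2)
                a, b = pool[i], pool[j]
                if b != a and b % a == 0:
                    pool[j] = b // a   # quotient is always >= 2 here
            return sum(is_prime(x) for x in pool) / size

        print("prime density after reduction:", reduce_pool())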

  16. Towards a global monitoring system for CMS computing operations

    CERN Multimedia

    CERN. Geneva; Bauerdick, Lothar A.T.

    2012-01-01

    The operation of the CMS computing system requires a complex monitoring system to cover all its aspects: central services, databases, the distributed computing infrastructure, production and analysis workflows, the global overview of the CMS computing activities and the related historical information. Several tools are available to provide this information, developed both inside and outside of the collaboration and often used in common with other experiments. Despite the fact that the current monitoring allowed CMS to successfully perform its computing operations, an evolution of the system is clearly required, to adapt to the recent changes in the data and workload management tools and models and to address some shortcomings that make its usage less than optimal. Therefore, a recent and ongoing coordinated effort was started in CMS, aiming at improving the entire monitoring system by identifying its weaknesses and the new requirements from the stakeholders, rationalise and streamline existing components and ...

  17. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    Science.gov (United States)

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and
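
    PeTTSy itself is a MATLAB package; as a language-agnostic illustration of the kind of experiment it automates, the sketch below applies a temporary pulse to one parameter of a van der Pol oscillator and estimates the trajectory's sensitivity to the pulse amplitude by finite differences (the model, the perturbation window, and all sizes are our own choices):

        import numpy as np
        from scipy.integrate import solve_ivp

        def vdp(t, y, mu, bump):
            # Temporary perturbation: mu is raised by `bump` for 2 < t < 4.
            mu_t = mu + (bump if 2.0 < t < 4.0 else 0.0)
            return [y[1], mu_t * (1 - y[0] ** 2) * y[1] - y[0]]

        t_eval = np.linspace(0, 20, 2001)
        base = solve_ivp(vdp, (0, 20), [2.0, 0.0], args=(1.0, 0.0),
                         t_eval=t_eval, rtol=1e-8, max_step=0.05)
        pert = solve_ivp(vdp, (0, 20), [2.0, 0.0], args=(1.0, 0.1),
                         t_eval=t_eval, rtol=1e-8, max_step=0.05)

        # First-order sensitivity of the solution to the pulse amplitude
        sens = (pert.y[0] - base.y[0]) / 0.1
        print("max |dx/dbump|:", round(float(np.abs(sens).max()), 3))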

  18. Complex Systems and Self-organization Modelling

    CERN Document Server

    Bertelle, Cyrille; Kadri-Dahmani, Hakima

    2009-01-01

    The concern of this book is the use of emergent computing and self-organization modelling within various applications of complex systems. The authors focus their attention both on the innovative concepts and implementations in order to model self-organizations, but also on the relevant applicative domains in which they can be used efficiently. This book is the outcome of a workshop meeting within ESM 2006 (Eurosis), held in Toulouse, France in October 2006.

  19. Complex computation in the retina

    Science.gov (United States)

    Deshmukh, Nikhil Rajiv

    Elucidating the general principles of computation in neural circuits is a difficult problem requiring both a tractable model circuit as well as sophisticated measurement tools. This thesis advances our understanding of complex computation in the salamander retina and its underlying circuitry and furthers the development of advanced tools to enable detailed study of neural circuits. The retina provides an ideal model system for neural circuits in general because it is capable of producing complex representations of the visual scene, and both its inputs and outputs are accessible to the experimenter. Chapter 2 describes the biophysical mechanisms that give rise to the omitted stimulus response in retinal ganglion cells described in Schwartz et al. (2007) and Schwartz and Berry (2008). The extra response to omitted flashes is generated at the input to bipolar cells, and is separable from the characteristic latency shift of the OSR apparent in ganglion cells, which must occur downstream in the circuit. Chapter 3 characterizes the nonlinearities at the first synapse of the ON pathway in response to high contrast flashes and develops a phenomenological model that captures the effect of synaptic activation and intracellular signaling dynamics on flash responses. This work is the first attempt to model the dynamics of the poorly characterized mGluR6 transduction cascade unique to ON bipolar cells, and explains the second lobe of the biphasic flash response. Complementary to the study of neural circuits, recent advances in wafer-scale photolithography have made possible new devices to measure the electrical and mechanical properties of neurons. Chapter 4 reports a novel piezoelectric sensor that facilitates the simultaneous measurement of electrical and mechanical signals in neural tissue. This technology could reveal the relationship between the electrical activity of neurons and their local mechanical environment, which is critical to the study of mechanoreceptors

  20. Computational complexity of symbolic dynamics at the onset of chaos

    Science.gov (United States)

    Lakdawala, Porus

    1996-05-01

    In a variety of studies of dynamical systems, the edge of order and chaos has been singled out as a region of complexity. It was suggested by Wolfram, on the basis of qualitative behavior of cellular automata, that the computational basis for modeling this region is the universal Turing machine. In this paper, following a suggestion of Crutchfield, we try to show that the Turing machine model may often be too powerful as a computational model to describe the boundary of order and chaos. In particular we study the region of the first accumulation of period doubling in unimodal and bimodal maps of the interval, from the point of view of language theory. We show that in relation to the ``extended'' Chomsky hierarchy, the relevant computational model in the unimodal case is the nested stack automaton or the related indexed languages, while the bimodal case is modeled by the linear bounded automaton or the related context-sensitive languages.
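
    The language-theoretic gap the paper locates can be glimpsed numerically: at the accumulation of period doubling the number of distinct length-k symbol blocks (the factor complexity) grows slowly, while in fully developed chaos it explodes toward 2^k. A minimal sketch for the logistic map (parameter values and orbit lengths are illustrative):

        def symbols(r, n, x=0.3, skip=1000):
            """Symbolic orbit of the logistic map: R if x > 1/2, else L."""
            seq = []
            for i in range(skip + n):
                x = r * x * (1 - x)
                if i >= skip:
                    seq.append("R" if x > 0.5 else "L")
            return "".join(seq)

        def factor_count(s, k):
            """Number of distinct length-k subwords of s."""
            return len({s[i:i + k] for i in range(len(s) - k + 1)})

        onset = symbols(3.5699456718, 4000)  # period-doubling accumulation
        chaos = symbols(4.0, 4000)           # fully developed chaos
        for k in (2, 4, 8):
            print(k, factor_count(onset, k), factor_count(chaos, k))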

  1. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    Science.gov (United States)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting are problems related to the study of complex processes under real physical conditions. Solving such problems requires the use of high performance computing systems of various types, for example, GRID systems and HPC clusters. Considering such time-consuming computational tasks, the need arises for software for automatic and unified monitoring of such computations. A complex computational task can be performed over different HPC systems. It requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem. It requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service, intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for the management of these calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.

  2. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    Directory of Open Access Journals (Sweden)

    Puzyrkov Dmitry

    2018-01-01

    At the present stage of computer technology development it is possible to study the properties and processes in complex systems at molecular and even atomic levels, for example, by means of molecular dynamics methods. The most interesting are problems related to the study of complex processes under real physical conditions. Solving such problems requires the use of high performance computing systems of various types, for example, GRID systems and HPC clusters. Considering such time-consuming computational tasks, the need arises for software for automatic and unified monitoring of such computations. A complex computational task can be performed over different HPC systems. It requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also quite a problem. It requires complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service, intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations and for the management of these calculations, and presents the part of its concept aimed at initial data generation on the HPC systems.

  3. Systems reliability analysis: applications of the SPARCS System-Reliability Assessment Computer Program

    International Nuclear Information System (INIS)

    Locks, M.O.

    1978-01-01

    SPARCS-2 (Simulation Program for Assessing the Reliabilities of Complex Systems, Version 2) is a PL/1 computer program for assessing (establishing interval estimates for) the reliability and the MTBF of a large and complex s-coherent system of any modular configuration. The system can consist of a complex logical assembly of independently failing attribute (binomial-Bernoulli) and time-to-failure (Poisson-exponential) components, without regard to their placement. Alternatively, it can be a configuration of independently failing modules, where each module has either or both attribute and time-to-failure components. SPARCS-2 also has an improved super modularity feature. Modules with minimal-cut unreliability calculations can be mixed with those having minimal-path reliability calculations. All output has been standardized to system reliability or probability of success, regardless of the form in which the input data is presented, and whatever the configuration of modules or elements within modules
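
    SPARCS-2 itself is a PL/1 program; the sketch below is a Monte Carlo stand-in for the kind of interval estimate it produces, on a small series-parallel system mixing attribute (Bernoulli) and time-to-failure (exponential) components. The structure and all parameters are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n, t_mission = 100_000, 1000.0       # trials, mission time in hours

        # Attribute components A, B in series with a parallel pair of
        # exponential time-to-failure components C, D.
        A = rng.random(n) < 0.99
        B = rng.random(n) < 0.95
        C = rng.exponential(5000.0, n) > t_mission   # MTBF 5000 h
        D = rng.exponential(3000.0, n) > t_mission   # MTBF 3000 h
        system_up = A & B & (C | D)

        p = system_up.mean()
        se = np.sqrt(p * (1 - p) / n)
        print(f"reliability = {p:.4f} (95% CI +/- {1.96 * se:.4f})")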

  4. Controlling Complex Systems and Developing Dynamic Technology

    Science.gov (United States)

    Avizienis, Audrius Victor

    In complex systems, control and understanding become intertwined. Following Ilya Prigogine, we define complex systems as having control parameters which mediate transitions between distinct modes of dynamical behavior. From this perspective, determining the nature of control parameters and demonstrating the associated dynamical phase transitions are practically equivalent and fundamental to engaging with complexity. In the first part of this work, a control parameter is determined for a non-equilibrium electrochemical system by studying a transition in the morphology of structures produced by an electroless deposition reaction. Specifically, changing the size of copper posts used as the substrate for growing metallic silver structures by the reduction of Ag+ from solution under diffusion-limited reaction conditions causes a dynamical phase transition in the crystal growth process. For Cu posts with edge lengths on the order of one micron, local forces promoting anisotropic growth predominate, and the reaction produces interconnected networks of Ag nanowires. As the post size is increased above 10 microns, the local interfacial growth reaction dynamics couple with the macroscopic diffusion field, leading to spatially propagating instabilities in the electrochemical potential which induce periodic branching during crystal growth, producing dendritic deposits. This result is interesting both as an example of control and understanding in a complex system, and as a useful combination of top-down lithography with bottom-up electrochemical self-assembly. The second part of this work focuses on the technological development of devices fabricated using this non-equilibrium electrochemical process, towards a goal of integrating a complex network as a dynamic functional component in a neuromorphic computing device. Self-assembled networks of silver nanowires were reacted with sulfur to produce interfacial "atomic switches": silver-silver sulfide junctions, which exhibit

  5. Advanced Kalman Filter for Real-Time Responsiveness in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Welch, Gregory Francis [UNC-Chapel Hill/University of Central Florida]; Zhang, Jinghe [UNC-Chapel Hill/Virginia Tech]

    2014-06-10

    Complex engineering systems pose fundamental challenges in real-time operations and control because they are highly dynamic systems consisting of a large number of elements with severe nonlinearities and discontinuities. Today’s tools for real-time complex system operations are mostly based on steady state models, unable to capture the dynamic nature and too slow to prevent system failures. We developed advanced Kalman filtering techniques and the formulation of dynamic state estimation using Kalman filtering techniques to capture complex system dynamics in aiding real-time operations and control. In this work, we looked at complex system issues including severe nonlinearity of system equations, discontinuities caused by system controls and network switches, sparse measurements in space and time, and real-time requirements of power grid operations. We sought to bridge the disciplinary boundaries between Computer Science and Power Systems Engineering, by introducing methods that leverage both existing and new techniques. While our methods were developed in the context of electrical power systems, they should generalize to other large-scale scientific and engineering applications.
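
    As a baseline for the advanced variants discussed above, a minimal linear Kalman filter on a constant-velocity model (the model and noise levels are illustrative, not taken from the report):

        import numpy as np

        dt = 1.0
        F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition
        H = np.array([[1.0, 0.0]])               # we observe position only
        Q = 0.01 * np.eye(2)                     # process noise covariance
        R = np.array([[4.0]])                    # measurement noise covariance

        rng = np.random.default_rng(0)
        x_true = np.array([0.0, 1.0])
        x, P = np.zeros(2), np.eye(2)            # filter state and covariance
        for _ in range(50):
            x_true = F @ x_true + rng.multivariate_normal([0.0, 0.0], Q)
            z = H @ x_true + rng.normal(0.0, 2.0)
            x, P = F @ x, F @ P @ F.T + Q                  # predict
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
            x = x + K @ (z - H @ x)                        # update state
            P = (np.eye(2) - K @ H) @ P                    # update covariance
        print("estimate:", x.round(2), " truth:", x_true.round(2))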

  6. A complex network approach to cloud computing

    International Nuclear Information System (INIS)

    Travieso, Gonzalo; Ruggiero, Carlos Antônio; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2016-01-01

    Cloud computing has become an important means to speed up computing. One problem heavily influencing the performance of such systems is the choice of nodes as servers responsible for executing the clients’ tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the processing performance of cloud systems underlaid by Erdős–Rényi (ER) and Barabási–Albert (BA) topologies containing two servers. Cloud networks involving two communities not necessarily of the same size are also considered in our analysis. The performance of each configuration is quantified in terms of the cost of communication between the client and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter, the ER topology provides better performance than the BA for smaller average degrees and the opposite behaviour for larger average degrees. With respect to cost, smaller values are found in the BA topology irrespective of the average degree. In addition, we also verified that it is easier to find good servers in ER than in BA networks. Surprisingly, balance and cost are not much affected by the presence of communities. However, for a well-defined community network, we found that it is important to assign each server to a different community so as to achieve better performance. (paper: interdisciplinary statistical mechanics)
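
    The cost-and-balance experiment described above can be reproduced in miniature with networkx. The sketch below places two servers at the highest-degree nodes (one simple heuristic, not necessarily the paper's placement rule) and measures mean client-to-server distance and load balance on ER and BA graphs of matched average degree:

        import networkx as nx

        def cost_and_balance(G, n_servers=2):
            # Servers at the highest-degree nodes; each client is assigned
            # to its nearest server by shortest-path distance.
            servers = sorted(G, key=G.degree, reverse=True)[:n_servers]
            dist = {s: nx.single_source_shortest_path_length(G, s)
                    for s in servers}
            load = dict.fromkeys(servers, 0)
            total = 0
            for v in G:
                s = min(servers, key=lambda u: dist[u].get(v, float("inf")))
                load[s] += 1
                total += dist[s][v]
            return (total / G.number_of_nodes(),
                    min(load.values()) / max(load.values()))

        er = nx.gnp_random_graph(500, 8 / 499, seed=1)       # ER, <k> ~ 8
        er = er.subgraph(max(nx.connected_components(er), key=len)).copy()
        ba = nx.barabasi_albert_graph(500, 4, seed=1)        # BA, <k> ~ 8
        print("ER cost/balance:", cost_and_balance(er))
        print("BA cost/balance:", cost_and_balance(ba))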

  7. Optical interconnection networks for high-performance computing systems

    International Nuclear Information System (INIS)

    Biberman, Aleksandr; Bergman, Keren

    2012-01-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in the computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining this growth in parallelism introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate the feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers. (review article)

  8. Towards a Global Monitoring System for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Bauerdick, L. A. T. [Fermilab]; Sciaba, Andrea [CERN]

    2012-01-01

    The operation of the CMS computing system requires a complex monitoring system to cover all its aspects: central services, databases, the distributed computing infrastructure, production and analysis workflows, the global overview of the CMS computing activities and the related historical information. Several tools are available to provide this information, developed both inside and outside of the collaboration and often used in common with other experiments. Although the current monitoring has allowed CMS to successfully perform its computing operations, an evolution of the system is clearly required, to adapt to the recent changes in the data and workload management tools and models and to address some shortcomings that make its usage less than optimal. Therefore, a coordinated effort was recently started in CMS, aiming to improve the entire monitoring system by identifying its weaknesses and the new requirements from the stakeholders, rationalising and streamlining existing components, and driving future software development. This contribution gives a complete overview of the CMS monitoring system and a description of all the recent activities started with the goal of providing a more integrated, modern and functional global monitoring system for computing operations.

  9. The CESR computer control system

    International Nuclear Information System (INIS)

    Helmke, R.G.; Rice, D.H.; Strohman, C.

    1986-01-01

    The control system for the Cornell Electron Storage Ring (CESR) has functioned satisfactorily since its implementation in 1979. Key characteristics are fast tuning response, almost exclusive use of FORTRAN as a programming language, and efficient coordinated ramping of CESR guide field elements. This original system has not, however, been able to keep pace with the increasing complexity of operation of CESR associated with performance upgrades. Limitations in address space, expandability, system-wide access to data, and program development impediments have prompted the undertaking of a major upgrade. The system under development accommodates up to 8 VAX computers for all applications programs. The database and communications semaphores reside in a shared multi-ported memory, and each hardware interface bus is controlled by a dedicated 32-bit microprocessor in a VME based system. (orig.)

  10. ADAM: analysis of discrete models of biological systems using computer algebra.

    Science.gov (United States)

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity of analyzing the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool.
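
    For orientation, the brute-force baseline that such algebraic methods avoid can be written in a few lines: exhaustively simulate every state of a (small) synchronous Boolean network and collect the limit cycles. The three-gene network below is hypothetical and purely illustrative.

    ```python
    from itertools import product

    def attractors(update, n):
        """Find all attractors (fixed points and limit cycles) of a synchronous
        Boolean network on n variables by exhaustive enumeration: O(2^n) states,
        which is exactly the exponential cost that algebraic tools sidestep."""
        found = set()
        for state in product((0, 1), repeat=n):
            seen = {}
            while state not in seen:          # follow the trajectory until it loops
                seen[state] = len(seen)
                state = update(state)
            start = seen[state]               # first revisited state opens the cycle
            found.add(tuple(sorted(s for s, i in seen.items() if i >= start)))
        return found

    # toy network: x0' = x1 AND x2, x1' = NOT x0, x2' = x0 OR x1
    f = lambda s: (s[1] & s[2], 1 - s[0], s[0] | s[1])
    print(attractors(f, 3))
    ```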

  11. Life system modeling and intelligent computing. Pt. II. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part II of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010, and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The papers in this volume are organized in topical sections on advanced evolutionary computing theory and algorithms; advanced neural network and fuzzy system theory and algorithms; modeling and simulation of societies and collective behavior; biomedical signal processing, imaging, and visualization; intelligent computing and control in distributed power generation systems; intelligent methods in power and energy infrastructure development; intelligent modeling, monitoring, and control of complex nonlinear systems. (orig.)

  12. Strategies and Rubrics for Teaching Complex Systems Theory to Novices (Invited)

    Science.gov (United States)

    Fichter, L. S.

    2010-12-01

    Bifurcation. Self-similarity. Fractal. Sensitive dependence. Agents. Self-organized criticality. Avalanche behavior. Power laws. Strange attractors. Emergence. The language of complexity is fundamentally different from the language of equilibrium. If students do not know these phenomena, and what they tell us about the pulse of dynamic systems, complex systems will be opaque. A complex system is a group of agents (individual interacting units, like birds in a flock, sand grains in a ripple, or individual friction units along a fault zone) existing far from equilibrium, interacting through positive and negative feedbacks, following simple rules, and forming interdependent, dynamic, evolutionary networks. Complex systems produce behaviors that cannot be predicted deductively from knowledge of the behaviors of the individual components themselves; they must be experienced. What complexity theory demonstrates is that, by following simple rules, all the agents end up coordinating their behavior, self-organizing, so that what emerges is not chaos but meaningful patterns. How can we introduce freshman, non-science, general education students to complex systems theories, in 3 to 5 classes, in a way that they really get it and can use the principles to understand real systems? Complex systems theories are not a series of unconnected or disconnected equations or models; they are developed as narratives that make sense of how all the pieces and properties are interrelated. The principles of complex systems must be taught as deliberately and systematically as the equilibrium principles normally taught; as, say, the systematic training from pre-algebra and geometry to algebra. We have developed a sequence of logically connected narratives (strategies and rubrics) that introduce complex systems principles using models that can be simulated in a computer, in class, in real time (a minimal example follows below). The learning progression has a series of 12 models (e.g. logistic system, bifurcation diagrams, genetic
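
    As an example of the real-time, in-class simulations mentioned above, the logistic system and its bifurcation diagram reduce to a few lines of code; the parameter range and iteration counts below are illustrative choices.

    ```python
    import numpy as np

    def bifurcation_points(r_values, n_transient=500, n_keep=100, x0=0.5):
        """For each control parameter r, iterate the logistic map x -> r*x*(1-x),
        discard the transient, and record the long-run values; plotting the
        (r, x) pairs reveals period doubling on the route to chaos."""
        pts = []
        for r in r_values:
            x = x0
            for _ in range(n_transient):
                x = r * x * (1 - x)
            for _ in range(n_keep):
                x = r * x * (1 - x)
                pts.append((r, x))
        return np.array(pts)

    data = bifurcation_points(np.linspace(2.8, 4.0, 600))
    ```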

  13. Treatment of human-computer interface in a decision support system

    International Nuclear Information System (INIS)

    Heger, A.S.; Duran, F.A.; Cox, R.G.

    1992-01-01

    One of the most challenging applications facing the computer community is the development of effective adaptive human-computer interfaces. This challenge stems from the complex nature of the human part of this symbiosis. The application of this discipline to environmental restoration and waste management is further complicated by the nature of environmental data. The information that is required to manage the environmental impacts of human activity is fundamentally complex. This paper will discuss the efforts at Sandia National Laboratories in developing an adaptive conceptual model manager within the constraints of environmental decision-making. A computer workstation that hosts the Conceptual Model Manager and the Sandia Environmental Decision Support System will also be discussed

  14. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  15. Computer tomography in complex diagnosis of laryngeal cancer

    International Nuclear Information System (INIS)

    Savin, A.A.

    1999-01-01

    To specify the role of computer tomography in the diagnosis of malignant tumors of the larynx, 42 patients with suspected laryngeal tumors were examined: 38 men and 4 women aged 41-68 years. X-ray examinations included traditional immediate tomography of the larynx. The main X-ray and computer tomographic symptoms of laryngeal tumors of different localizations are described. It is shown that the use of computer tomography in the complex diagnosis of laryngeal cancer permits an objective assessment of the tumor, its structure and dissemination, and of the regional lymph nodes

  16. Reliability assessment of complex electromechanical systems under epistemic uncertainty

    International Nuclear Information System (INIS)

    Mi, Jinhua; Li, Yan-Feng; Yang, Yuan-Jian; Peng, Weiwen; Huang, Hong-Zhong

    2016-01-01

    The appearance of macro-engineering and mega-projects has led to the increasing complexity of modern electromechanical systems (EMSs). The complexity of system structure and failure mechanisms makes the reliability assessment of these systems more difficult. Uncertainty, dynamics and nonlinearity always exist in engineering systems due to the complexity introduced by changing environments, lack of data and random interference. This paper presents a comprehensive study of the reliability assessment of complex systems. In view of the dynamic characteristics within the system, it makes use of the advantages of the dynamic fault tree (DFT) for characterizing system behaviors. The lifetimes of system units can be expressed as bounded closed intervals by incorporating field failures, test data and design expertise. The coefficient of variation (COV) method is then employed to estimate the parameters of life distributions. An extended probability-box (P-box) is proposed to convey the presence of epistemic uncertainty induced by incomplete information about the data. By mapping the DFT into an equivalent Bayesian network (BN), the relevant reliability parameters and indexes are calculated. Furthermore, the Monte Carlo (MC) simulation method is utilized to compute the DFT model with consideration of the system replacement policy; a toy version of this step is sketched below. The results show that this integrated approach is more flexible and effective for assessing the reliability of complex dynamic systems. - Highlights: • A comprehensive study of the reliability assessment of complex systems is presented. • An extended probability-box is proposed to convey the presence of epistemic uncertainty. • The dynamic fault tree model is built. • Bayesian network and Monte Carlo simulation methods are used. • The reliability assessment of a complex electromechanical system is performed.
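
    As a rough illustration of the Monte Carlo step referenced above, the code below estimates the unreliability of a single cold-spare dynamic fault tree gate, with an interval-valued scale parameter evaluated at its endpoints in the spirit of a P-box bound. The Weibull lifetimes, parameter values and gate choice are assumptions, not the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def spare_gate_unreliability(t, shape, scale, n_sim=200_000):
        """MC unreliability at time t of a cold-spare gate: the spare only
        starts aging once the primary fails, a sequence dependence that
        static fault trees cannot express. Weibull component lifetimes."""
        primary = scale * rng.weibull(shape, n_sim)
        spare = scale * rng.weibull(shape, n_sim)
        return np.mean(primary + spare < t)

    # interval-valued scale parameter: evaluating at both endpoints gives
    # crude p-box-like bounds on the unreliability at t = 10
    bounds = sorted(spare_gate_unreliability(10, 1.5, s) for s in (8.0, 12.0))
    print(bounds)
    ```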

  17. Assessing the impact of large-scale computing on the size and complexity of first-principles electromagnetic models

    International Nuclear Information System (INIS)

    Miller, E.K.

    1990-01-01

    There is a growing need to determine the electromagnetic performance of increasingly complex systems at ever higher frequencies. The ideal approach would be some appropriate combination of measurement, analysis, and computation so that system design and assessment can be achieved to a needed degree of accuracy at some acceptable cost. Both measurement and computation benefit from the continuing growth in computer power that, since the early 1950s, has increased by a factor of more than a million in speed and storage. For example, a CRAY2 has an effective throughput (not the clock rate) of about 10^11 floating-point operations (FLOPs) per hour, compared with the approximately 10^5 provided by the UNIVAC-1. The purpose of this discussion is to illustrate the computational complexity of modeling large (in wavelengths) electromagnetic problems. In particular the author makes the point that simply relying on faster computers for increasing the size and complexity of problems that can be modeled is less effective than might be anticipated from this raw increase in computer throughput. He suggests that rather than depending on faster computers alone, various analytical and numerical alternatives need development for reducing the overall FLOP count required to acquire the information desired. One approach is to decrease the operation count of the basic model computation itself, by reducing the order of the frequency dependence of the various numerical operations or their multiplying coefficients. Another is to decrease the number of model evaluations that are needed, an example being the number of frequency samples required to define a wideband response, by using an auxiliary model of the expected behavior. 11 refs., 5 figs., 2 tabs

  18. The Computational Complexity, Parallel Scalability, and Performance of Atmospheric Data Assimilation Algorithms

    Science.gov (United States)

    Lyster, Peter M.; Guo, J.; Clune, T.; Larson, J. W.; Atlas, Robert (Technical Monitor)

    2001-01-01

    The computational complexity of algorithms for Four Dimensional Data Assimilation (4DDA) at NASA's Data Assimilation Office (DAO) is discussed. In 4DDA, observations are assimilated with the output of a dynamical model to generate best estimates of the states of the system. It is thus a mapping problem, whereby scattered observations are converted into regular, accurate maps of wind, temperature, moisture and other variables. The DAO is developing and using 4DDA algorithms that provide these datasets, or analyses, in support of Earth System Science research. Two large-scale algorithms are discussed. The first approach, the Goddard Earth Observing System Data Assimilation System (GEOS DAS), uses an atmospheric general circulation model (GCM) and an observation-space based analysis system, the Physical-space Statistical Analysis System (PSAS). GEOS DAS is very similar to global meteorological weather forecasting data assimilation systems, but is used at NASA for climate research. Systems of this size typically run at between 1 and 20 gigaflop/s. The second approach, the Kalman filter, uses a more consistent algorithm to determine the forecast error covariance matrix than does GEOS DAS. For atmospheric assimilation, the gridded dynamical fields typically have more than 10^6 variables, therefore the full error covariance matrix may be in excess of a teraword. For the Kalman filter this problem can easily scale to petaflop/s proportions. We discuss the computational complexity of GEOS DAS and our implementation of the Kalman filter. We also discuss and quantify some of the technical issues and limitations in developing efficient, in terms of wall clock time, and scalable parallel implementations of the algorithms.

  19. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the principles of the design of a computerized psychophysiological system for universal use. The effectiveness of computer technology is shown to lie in the combination of the universal computation and control capabilities of a personal computer with problem-oriented, specialized facilities for presenting stimuli and detecting the test subject's reactions. The hardware and software configuration of the microcomputer psychophysiological system "Psychomat" is defined, its functional capabilities and basic medico-technical characteristics are described, and organizational issues in the maintenance of its full-scale production are reviewed.

  20. A Linear Time Complexity of Breadth-First Search Using P System with Membrane Division

    Directory of Open Access Journals (Sweden)

    Einallah Salehi

    2013-01-01

    One of the known methods for solving problems with exponential time complexity, such as NP-complete problems, is the use of brute-force algorithms. Recently, a new parallel computational framework called membrane computing was introduced which can be applied to brute-force algorithms. The usual way to solve problems with exponential time complexity with membrane computing techniques is by a P system with active membranes using the division rule. It creates an exponential workspace and solves problems of exponential complexity in polynomial (even linear) time. On the other hand, searching is currently one of the most used methods for finding solutions to problems in real life; blind search algorithms such as breadth-first search (BFS) are exact, but their time complexity is exponential. In this paper, we propose a new approach for the implementation of BFS using the P system with division rule technique for the first time. The theorem shows that the time complexity of BFS in this framework on random binary trees is reduced from O(2^d) to O(d).
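
    For contrast with the membrane-computing construction, the standard sequential BFS is shown below; on a binary tree the frontier at depth d holds up to 2^d nodes, all processed one by one, whereas the P system divides membranes so that each whole level is expanded in a single parallel step, which is where the O(2^d)-to-O(d) reduction comes from.

    ```python
    from collections import deque

    def bfs(adj, source):
        """Sequential breadth-first search over an adjacency dict: every
        frontier node is dequeued and expanded one at a time."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        return dist

    # small binary tree: node i has children 2i+1 and 2i+2
    adj = {i: [c for c in (2 * i + 1, 2 * i + 2) if c < 15] for i in range(15)}
    print(bfs(adj, 0))
    ```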

  1. 10th International Conference on Dependability and Complex Systems

    CERN Document Server

    Mazurkiewicz, Jacek; Sugier, Jarosław; Walkowiak, Tomasz; Kacprzyk, Janusz

    2015-01-01

    Building upon a long tradition of scientific conferences dealing with problems of reliability in technical systems, in 2006 the Department of Computer Engineering at Wrocław University of Technology established the DepCoS-RELCOMEX series of events in order to promote a comprehensive approach to the evaluation of system performability, which is now commonly called dependability. Contemporary complex systems integrate a variety of technical, information, software and human (users, administrators and management) resources. Their complexity comes not only from the technical and organizational structures involved but mainly from the complexity of the information processes that must be implemented in the specific operational environment (data processing, monitoring, management, etc.). In such cases traditional methods of reliability evaluation focused mainly on technical levels are insufficient, and more innovative, multidisciplinary methods of dependability analysis must be applied. The selection of submissions for these proceedings exemplifies di...

  2. Computing complex Airy functions by numerical quadrature

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2001-01-01

    Integral representations are considered of solutions of the Airy differential equation w'' - z w = 0 for computing Airy functions for complex values of z. In a first method, contour integral representations of the Airy functions are written as non-oscillating
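
    The paper's specific representations are not reproduced here; as an illustration of the general idea (quadrature along a contour on which the integrand does not oscillate near the saddle point), the mpmath sketch below evaluates Ai(z) on the straight line t = sqrt(z) + i*u through the saddle of exp(t^3/3 - z*t), and checks the result against mpmath's built-in Airy function.

    ```python
    import mpmath as mp

    def airy_ai(z):
        """Ai(z) = exp(-(2/3) z^(3/2)) / (2 pi) * integral over u of
        exp(-sqrt(z) u^2 - i u^3/3), obtained by shifting the contour
        through the saddle t = sqrt(z). The Gaussian factor suppresses
        oscillation; accuracy degrades as arg(z) approaches pi."""
        z = mp.mpmathify(z)
        s = mp.sqrt(z)
        f = lambda u: mp.exp(-s * u**2 - 1j * u**3 / 3)
        return mp.exp(-mp.mpf(2) / 3 * z * s) / (2 * mp.pi) * mp.quad(f, [-mp.inf, mp.inf])

    z = mp.mpc(1.5, 0.7)
    print(airy_ai(z))
    print(mp.airyai(z))   # reference value; should agree closely
    ```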

  3. Services Recommendation System based on Heterogeneous Network Analysis in Cloud Computing

    OpenAIRE

    Junping Dong; Qingyu Xiong; Junhao Wen; Peng Li

    2014-01-01

    Resources are provided mainly in the form of services in cloud computing. In the distributed environment of cloud computing, how to find the needed services efficiently and accurately is the most urgent problem. Services are the intermediary of the cloud platform: they connect large numbers of service providers and requesters, forming a complex heterogeneous network. The traditional recommendation systems only consider the functional and non-functi...

  4. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  5. Constructing optimized binary masks for reservoir computing with delay systems

    Science.gov (United States)

    Appeltant, Lennert; van der Sande, Guy; Danckaert, Jan; Fischer, Ingo

    2014-01-01

    Reservoir computing is a novel bio-inspired computing method capable of solving complex tasks in a computationally efficient way. It has recently been successfully implemented using delayed feedback systems, making it possible to reduce the hardware complexity of brain-inspired computers drastically. In this approach, the pre-processing procedure relies on the definition of a temporal mask which serves as a scaled time-multiplexing of the input. Originally, random masks had been chosen, motivated by the random connectivity in reservoirs. This random generation can sometimes fail. Moreover, for hardware implementations random generation is not ideal due to its complexity and the need for trial and error. We outline a procedure to reliably construct an optimal mask pattern in terms of multipurpose performance, derived from the concept of maximum length sequences. Not only does this ensure the creation of the shortest possible mask that leads to maximum variability in the reservoir states for the given reservoir, it also allows for an interpretation of the statistical significance of the provided training samples for the task at hand.
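
    Maximum length sequences are produced by linear-feedback shift registers with primitive feedback polynomials; the sketch below builds such a binary mask. The tap set and the scaling of the bits to {-1, +1} mask values are illustrative assumptions.

    ```python
    import numpy as np

    def lfsr_msequence(taps, nbits):
        """Fibonacci LFSR output stream. `taps` lists the register stages
        (1-indexed) of a primitive feedback polynomial, e.g. [7, 6] for
        x^7 + x^6 + 1, giving a maximum-length sequence of period 2^7 - 1."""
        state = [1] * max(taps)          # any nonzero seed works
        out = []
        for _ in range(nbits):
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[t - 1]       # XOR of the tapped stages
            state = [fb] + state[:-1]    # shift in the feedback bit
        return np.array(out)

    # one mask value per virtual node of the delay-based reservoir
    mask = 2 * lfsr_msequence([7, 6], 127) - 1
    print(mask[:16])
    ```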

  6. ACCESS TO A COMPUTER SYSTEM. BETWEEN LEGAL PROVISIONS AND TECHNICAL REALITY

    Directory of Open Access Journals (Sweden)

    Maxim DOBRINOIU

    2016-05-01

    Nowadays, amid a rise in cybersecurity incidents and a very complex IT&C environment, national legal systems must adapt in order to properly address the new and modern forms of criminality in cyberspace. Illegal access to a computer system remains one of the most important cyber-related crimes, due to its prevalence but also because it is a door opened to computer data and sometimes a vehicle for other tech crimes. At the same time, information society services have slightly changed the IT paradigm and now represent the interface between users and systems. It is true that services rely on computer systems, but accessing services now goes beyond simply accessing computer systems as commonly understood by most legislations. The article intends to explain other sides of access related to computer systems and services, with the purpose of advancing possible legal solutions to certain case scenarios.

  7. Computation of 3D form factors in complex environments

    International Nuclear Information System (INIS)

    Coulon, N.

    1989-01-01

    The calculation of radiant interchange among opaque surfaces in a complex environment poses the general problem of determining the visible and hidden parts of the environment. In many thermal engineering applications, surfaces are separated by radiatively non-participating media and may be idealized as diffuse emitters and reflectors. Consequently, the net radiant energy fluxes are intimately related to purely geometrical quantities called form factors, which take hidden parts into account: the problem is reduced to form factor evaluation. This paper presents the method developed for the computation of 3D form factors in the finite-element module of the system TRIO, which is a general computer code for thermal and fluid flow analysis. The method is derived from an algorithm devised for synthetic image generation. A comparison is performed with the standard contour integration method, also implemented and suited to convex geometries. Several illustrative examples of finite-element thermal calculations in radiating enclosures are given
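
    The hard part addressed by the paper is the hidden-part (visibility) computation; ignoring occlusion entirely, the form factor between two mutually visible patches can be estimated by plain Monte Carlo integration, as in this illustrative sketch (the geometry helpers and test configuration are assumptions).

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def form_factor_mc(p1, u1, v1, n1, p2, u2, v2, n2, samples=200_000):
        """MC estimate of the diffuse form factor F_{1->2} between planar
        parallelogram patches (corner p, edge vectors u and v, unit normal n),
        with no occluders: F = (1/A1) intg intg cos(t1) cos(t2)/(pi r^2) dA2 dA1."""
        a, b = rng.random(samples), rng.random(samples)
        x1 = p1 + a[:, None] * u1 + b[:, None] * v1
        a, b = rng.random(samples), rng.random(samples)
        x2 = p2 + a[:, None] * u2 + b[:, None] * v2
        d = x2 - x1
        r2 = np.einsum('ij,ij->i', d, d)
        cos1 = np.clip(d @ n1, 0, None) / np.sqrt(r2)
        cos2 = np.clip(-(d @ n2), 0, None) / np.sqrt(r2)
        area2 = np.linalg.norm(np.cross(u2, v2))
        return np.mean(cos1 * cos2 / (np.pi * r2)) * area2

    # two unit squares facing each other one unit apart: F ~= 0.1998
    ex, ey, ez = np.eye(3)
    print(form_factor_mc(np.zeros(3), ex, ey, ez, ez, ex, ey, -ez))
    ```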

  8. Power efficient low complexity precoding for massive MIMO systems

    KAUST Repository

    Sifaou, Houssem

    2014-12-01

    This work aims at designing a low-complexity precoding technique in the downlink of a large-scale multiple-input multiple-output (MIMO) system in which the base station (BS) is equipped with M antennas to serve K single-antenna user equipments. This is motivated by the high computational complexity required by the widely used zero-forcing or regularized zero-forcing precoding techniques, especially when K grows large. To reduce the computational burden, we adopt a precoding technique based on truncated polynomial expansion (TPE) and make use of the asymptotic analysis to compute the deterministic equivalents of its corresponding signal-to-interference-plus-noise ratios (SINRs) and transmit power. The asymptotic analysis is conducted in the regime in which M and K tend to infinity with the same pace under the assumption that imperfect channel state information is available at the BS. The results are then used to compute the TPE weights that minimize the asymptotic transmit power while meeting a set of target SINR constraints. Numerical simulations are used to validate the theoretical analysis. © 2014 IEEE.
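
    A minimal sketch of the TPE idea: replace the K x K matrix inverse of regularized zero-forcing by a short matrix polynomial so that only matrix-vector products remain. For simplicity the weights below come from a plain truncated Neumann series; the paper instead optimizes the weights through deterministic equivalents of the asymptotic SINRs, so this illustrates the structure, not the proposed algorithm.

    ```python
    import numpy as np

    def tpe_precode(H, s, J=4, alpha=0.1):
        """Approximate the RZF precoder x = H^H (H H^H + alpha I)^{-1} s by a
        J-term truncated polynomial expansion: with c >= lambda_max(A),
        A^{-1} = (1/c) * sum_l (I - A/c)^l, truncated after J terms."""
        K = H.shape[0]
        A = H @ H.conj().T + alpha * np.eye(K)
        c = np.linalg.norm(A, 2)            # spectral norm upper-bounds the spectrum
        r = s.astype(complex)
        x = np.zeros(K, dtype=complex)
        for _ in range(J):                  # accumulate (1/c) * sum_l (I - A/c)^l s
            x = x + r / c
            r = r - (A @ r) / c
        return H.conj().T @ x

    rng = np.random.default_rng(1)
    M, K = 64, 16
    H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)
    x = tpe_precode(H, rng.standard_normal(K))
    ```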

  9. Computational Fluid and Particle Dynamics in the Human Respiratory System

    CERN Document Server

    Tu, Jiyuan; Ahmadi, Goodarz

    2013-01-01

    Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

  10. 1989 lectures in complex systems

    International Nuclear Information System (INIS)

    Jen, E.

    1990-01-01

    This report contains papers on the following topics: Lectures on a Theory of Computation and Complexity over the Reals; Algorithmic Information Content, Church-Turing Thesis, Physical Entropy, and Maxwell's Demon; Physical Measures of Complexity; An Introduction to Chaos and Prediction; Hamiltonian Chaos in Nonlinear Polarized Optical Beam; Chemical Oscillators and Nonlinear Chemical Dynamics; Isotropic Navier-Stokes Turbulence. I. Qualitative Features and Basic Equations; Isotropic Navier-Stokes Turbulence. II. Statistical Approximation Methods; Lattice Gases; Data-Parallel Computation and the Connection Machine; Preimages and Forecasting for Cellular Automata; Lattice-Gas Models for Multiphase Flows and Magnetohydrodynamics; Probabilistic Cellular Automata: Some Statistical Mechanical Considerations; Complexity Due to Disorder and Frustration; Self-Organization by Simulated Evolution; Theoretical Immunology; Morphogenesis by Cell Intercalation; and Theoretical Physics Meets Experimental Neurobiology

  11. A computer-based spectrometry system for assessment of body radioactivity

    International Nuclear Information System (INIS)

    Venn, J.B.

    1985-01-01

    This paper describes a PDP-11 computer system operating under RT-11 for the acquisition and processing of pulse height spectra in the measurement of body radioactivity. SABRA (system for the assessment of body radioactivity) provides control of multiple detection systems from visual display consoles by means of a command language. A wide range of facilities is available for the display, processing and storage of acquired spectra, and complex operations may be pre-programmed by means of the SABRA MACRO language. The hardware includes a CAMAC interface to the detection systems, disc cartridge drives for mass storage of data and programs, and data-links to other computers. The software is written in assembler language and includes special features for the dynamic allocation of computer memory and for safeguarding acquired data. (orig.)

  12. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  13. The computer program system for structural design of nuclear power plants

    International Nuclear Information System (INIS)

    Aihara, S.; Atsumi, K.; Sasagawa, K.; Satoh, S.

    1979-01-01

    In recent years, the design of nuclear power plants has become more complex than in the past. In particular, the finite element method (FEM) applied to the analysis of nuclear power plants requires heavy computer use. Recent computers have made remarkable progress, so that the manpower and time necessary for analysis in design work have been reduced considerably. Instead, however, the volume of output to be arranged has increased tremendously. Therefore, a computer program system was developed for performing all of the processes, from data preparation to output arrangement and rebar evaluation. This report introduces the computer program system pertaining to the design flow of the reactor building. (orig.)

  14. Method of Computer-aided Instruction in Situation Control Systems

    Directory of Open Access Journals (Sweden)

    Anatoliy O. Kargin

    2013-01-01

    The article considers the problem of computer-aided instruction in a context-chain-motivated situation control system for complex technical system behavior. Conceptual and formal models of situation control with practical instruction are considered. The acquisition of new behavior knowledge is presented as structural changes in system memory in the form of a set of situational agents. The model and method of computer-aided instruction formalize the informal (non-strict) theories of physiologists and cognitive psychologists. The formal instruction model describes the formation of situations and reactions and their dependence on the different parameters affecting learning, such as the reinforcement value and the time between the stimulus, the action and the reinforcement. The change of contextual links between situational elements during use is formalized. Examples and results are given of computer-instruction experiments with the robot device "LEGO MINDSTORMS NXT", equipped with ultrasonic distance, touch and light sensors.

  15. Basic user guide for the radwaste treatment plant computer system

    International Nuclear Information System (INIS)

    Keel, A.

    1990-07-01

    This guide has been produced as an aid to using the Radwaste Treatment Plant computer system. It is designed to help new users to use the database menu system. Some of the forms can be used in ways different from those explained and more complex queries can be performed. (UK)

  16. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  17. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The size of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  18. Justification of computational methods to ensure information management systems

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Due to the diversity and complexity of the organizational management tasks of a large enterprise, the construction of an information management system requires the establishment of interconnected complexes of means implementing, in the most efficient way, the collection, transfer, accumulation and processing of the information needed by decision makers of different ranks in the governance process. The main trends in the construction of integrated logistics management information systems can be considered to be: the creation of integrated data processing systems by centralizing the storage and processing of data arrays; the organization of computer systems realizing time-sharing; the aggregate-block principle of integrated logistics; and the use of a wide range of peripheral devices with unified information and hardware communication. Main attention is paid to the systematic study of the complex of technical support, in particular the definition of quality criteria for the operation of the technical complex, the development of methods for analyzing the information base of management information systems, the definition of requirements for technical means, and methods for the structural synthesis of the major subsystems of integrated logistics. Thus, the aim is to study integrated logistics management information systems on the basis of a systematic approach, and to develop a number of methods of analysis and synthesis of complex logistics that are suitable for use in the practice of engineering systems design. The objective function of the complex logistics management information system is to gather, transmit and process specified amounts of information within regulated time intervals and with the required degree of accuracy, while minimizing the reduced costs of establishing and operating the technical complex. Achieving this objective function requires a certain organization of the interaction of information

  19. Using an adaptive expertise lens to understand the quality of teachers' classroom implementation of computer-supported complex systems curricula in high school science

    Science.gov (United States)

    Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric

    2015-05-01

    Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to better understand teachers' classroom practices as they attempt to navigate myriad variables in the implementation of biology units that include working with computer simulations, and learning about and teaching through complex systems ideas. Sample: Research participants were three high school biology teachers, two females and one male, ranging in teaching experience from six to 16 years. Their teaching contexts also ranged in student achievement from 14-47% advanced science proficiency. Design and methods: We used a holistic multiple case study methodology and collected data during the 2011-2012 school year. Data sources include classroom observations, teacher and student surveys, and interviews. Data analyses and trustworthiness measures were conducted through qualitative mining of data sources and triangulation of findings. Results: We illustrate the characteristics of adaptive expertise of more or less successful teaching and learning when implementing complex systems curricula. We also demonstrate differences between case study teachers in terms of particular variables associated with adaptive expertise. Conclusions: This research contributes to scholarship on practices and professional development needed to better support teachers to teach through a complex systems pedagogical and curricular approach.

  20. A Multifaceted Mathematical Approach for Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, F.; Anitescu, M.; Bell, J.; Brown, D.; Ferris, M.; Luskin, M.; Mehrotra, S.; Moser, B.; Pinar, A.; Tartakovsky, A.; Willcox, K.; Wright, S.; Zavala, V.

    2012-03-07

    Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.

  1. Statistical screening of input variables in a complex computer code

    International Nuclear Information System (INIS)

    Krieger, T.J.

    1982-01-01

    A method is presented for "statistical screening" of input variables in a complex computer code. The object is to determine the "effective" or important input variables by estimating the relative magnitudes of their associated sensitivity coefficients. This is accomplished by performing a numerical experiment consisting of a relatively small number of computer runs with the code followed by a statistical analysis of the results. A formula for estimating the sensitivity coefficients is derived. Reference is made to an earlier work in which the method was applied to a complex reactor code with good results
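
    In modern terms, a close cousin of this procedure is regression-based screening: run the code at a small random sample of input points and rank the inputs by standardized first-order coefficients. The sketch below is such an illustration with a toy model, not Krieger's exact estimation formula.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def screen_inputs(model, low, high, n_runs=50):
        """Fit y ~ b0 + sum_i b_i x_i to a small batch of code runs and return
        the standardized coefficients b_i * std(x_i) / std(y), whose magnitudes
        rank the inputs by influence (the 'effective' variables)."""
        low, high = np.asarray(low, float), np.asarray(high, float)
        X = low + rng.random((n_runs, low.size)) * (high - low)
        y = np.array([model(x) for x in X])
        A = np.column_stack([np.ones(n_runs), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef[1:] * X.std(axis=0) / y.std()

    # toy "code": x0 and x2 matter, x1 barely does
    f = lambda x: 5 * x[0] + 0.1 * x[1] + 3 * x[2] ** 2
    print(screen_inputs(f, [0, 0, 0], [1, 1, 1]))
    ```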

  2. New technique for determining unavailability of computer controlled safety systems

    International Nuclear Information System (INIS)

    Fryer, M.O.; Bruske, S.Z.

    1984-04-01

    The availability of a safety system for a fusion reactor is determined. A fusion reactor processes tritium and requires an Emergency Tritium Cleanup (ETC) system for accidental tritium releases. The ETC is computer controlled and because of its complexity, is an excellent candidate for this analysis. The ETC system unavailability, for preliminary untested software, is calculated based on different assumptions about operator response. These assumptions are: (a) the operator shuts down the system after the first indication of plant failure; (b) the operator shuts down the system after following optimized failure verification procedures; or (c) the operator is taken out of the decision process, and the computer uses the optimized failure verification procedures

  3. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  4. Forecasting of Processes in Complex Systems for Real-World Problems

    Czech Academy of Sciences Publication Activity Database

    Pelikán, Emil

    2014-01-01

    Vol. 24, No. 6 (2014), pp. 567-589. ISSN 1210-0552. Institutional support: RVO:67985807. Keywords: complex systems * data assimilation * ensemble forecasting * forecasting * global solar radiation * judgmental forecasting * multimodel forecasting * pollution. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.479, year: 2014

  5. Metaheuristics progress in complex systems optimization

    CERN Document Server

    Doerner, Karl F; Greistorfer, Peter; Gutjahr, Walter; Hartl, Richard F; Reimann, Marc

    2007-01-01

    The aim of ""Metaheuristics: Progress in Complex Systems Optimization"" is to provide several different kinds of information: a delineation of general metaheuristics methods, a number of state-of-the-art articles from a variety of well-known classical application areas as well as an outlook to modern computational methods in promising new areas. Therefore, this book may equally serve as a textbook in graduate courses for students, as a reference book for people interested in engineering or social sciences, and as a collection of new and promising avenues for researchers working in this field.

  6. High performance parallel computing of flows in complex geometries: I. Methods

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Montagnac, M; Vermorel, O; Staffelbach, G; Garcia, M; Boussuge, J-F; Gazaix, M; Poinsot, T

    2009-01-01

    Efficient numerical tools, coupled with high-performance computers, have become a key element of the design process in the fields of energy supply and transportation. However, flow phenomena that occur in complex systems such as gas turbines and aircraft are still not fully understood, mainly because of the complexity of the models that are needed. In fact, most computational fluid dynamics (CFD) predictions found in industry today focus on a reduced or simplified version of the real system (such as a periodic sector) and are usually solved with a steady-state assumption. This paper shows how to overcome such barriers and how such a new challenge can be addressed by developing flow solvers running on high-end computing platforms, using thousands of computing cores. Parallel strategies used by modern flow solvers are discussed with particular emphasis on mesh-partitioning, load balancing and communication. Two examples are used to illustrate these concepts: a multi-block structured code and an unstructured code. Parallel computing strategies used with both flow solvers are detailed and compared. This comparison indicates that mesh-partitioning and load balancing are more straightforward with unstructured grids than with multi-block structured meshes. However, the mesh-partitioning stage can be challenging for unstructured grids, mainly due to memory limitations of the newly developed massively parallel architectures. Finally, detailed investigations show that the impact of mesh-partitioning on the numerical CFD solutions, due to rounding errors and block splitting, may be of importance and should be accurately addressed before qualifying massively parallel CFD tools for routine industrial use.
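
    As a toy illustration of the load-balancing step discussed above, the sketch below splits an ordered list of per-cell work estimates into contiguous chunks of nearly equal total work; real mesh partitioners also minimize inter-partition communication, which this deliberately ignores.

    ```python
    import numpy as np

    def balance_partition(weights, nparts):
        """Prefix-sum 1-D load balancing: cut the cumulative work curve at
        equally spaced targets so each of the nparts contiguous chunks
        carries roughly the same total work."""
        cum = np.cumsum(weights)
        targets = cum[-1] * np.arange(1, nparts) / nparts
        cuts = np.searchsorted(cum, targets)
        return np.split(np.arange(len(weights)), cuts)

    parts = balance_partition(np.random.default_rng(3).random(1000), 8)
    print([len(p) for p in parts])   # near-equal work, not necessarily equal counts
    ```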

  7. A Complexity-Aware Video Adaptation Mechanism for Live Streaming Systems

    Directory of Open Access Journals (Sweden)

    Homer H. Chen

    2007-01-01

    The paradigm shift of network design from performance-centric to constraint-centric has called for new signal processing techniques to deal with various aspects of resource-constrained communication and networking. In this paper, we consider the computational constraints of a multimedia communication system and propose a video adaptation mechanism for live video streaming of multiple channels. The video adaptation mechanism includes three salient features. First, it adjusts the computational resource of the streaming server block by block to provide a fine control of the encoding complexity. Second, as far as we know, it is the first mechanism to allocate the computational resource to multiple channels. Third, it utilizes a complexity-distortion model to determine the optimal coding parameter values to achieve global optimization. These techniques constitute the basic building blocks for a successful application of wireless and Internet video to digital home, surveillance, IPTV, and online games.
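
    A sketch of the multi-channel allocation idea: given a complexity-distortion model for each channel, grant CPU cycles greedily where the marginal distortion reduction is largest. The convex toy models and the greedy rule are illustrative assumptions, not the paper's algorithm.

    ```python
    import numpy as np

    def allocate_cycles(budget, curves, step=1.0):
        """Greedy marginal allocation of a cycle budget across channels.
        curves[i](c) is the predicted distortion of channel i given c cycles;
        repeatedly award `step` cycles to the channel with the largest
        distortion drop (optimal for convex, decreasing curves)."""
        alloc = np.zeros(len(curves))
        for _ in range(int(budget / step)):
            gains = [crv(a) - crv(a + step) for crv, a in zip(curves, alloc)]
            alloc[int(np.argmax(gains))] += step
        return alloc

    # toy convex complexity-distortion models D(c) = d / (1 + a*c)
    mk = lambda d, a: (lambda c: d / (1 + a * c))
    print(allocate_cycles(100, [mk(10, 0.1), mk(4, 0.5), mk(8, 0.05)]))
    ```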


  9. Markov Renewal Methods in Restart Problems in Complex Systems

    DEFF Research Database (Denmark)

    Asmussen, Søren; Lipsky, Lester; Thompson, Stephen

    A task with ideal execution time L, such as the execution of a computer program or the transmission of a file on a data link, may fail, and the task then needs to be restarted. The task is handled by a complex system with features similar to the ones in classical reliability: failures may
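
    A small simulation of the basic restart quantity: the total completion time when every failure erases all progress. For exponential failures the mean total time is (exp(rate*L) - 1)/rate, which the sketch reproduces; the paper treats more general settings, so this is only a baseline check.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def mean_restart_time(L, rate, n_sim=100_000):
        """Simulate the total time to complete a task of ideal length L when
        each failure (exponential with the given rate) forces a restart from
        scratch, i.e. all partial progress is lost."""
        totals = np.zeros(n_sim)
        alive = np.arange(n_sim)
        while alive.size:
            fail = rng.exponential(1 / rate, alive.size)
            done = fail >= L                     # attempt finishes before failing
            totals[alive[done]] += L
            totals[alive[~done]] += fail[~done]  # lost work, then restart
            alive = alive[~done]
        return totals.mean()

    print(mean_restart_time(1.0, 1.0))   # theory: e - 1 ~= 1.71828
    ```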

  10. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    Science.gov (United States)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. The application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the concept of the model-component. A model-component is endowed with a more complicated structure than, for example, an object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to respond in a standard way to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This fact allows one, firstly, to construct fractal models of any complexity and, secondly, to implement the computational process for such constructions uniformly, by a single universal program. In addition, the proposed paradigm makes it possible to exclude imperative programming and to generate computer code with a high degree of parallelism.

  11. @neurIST: infrastructure for advanced disease management through integration of heterogeneous data, computing, and complex processing services.

    Science.gov (United States)

    Benkner, Siegfried; Arbona, Antonio; Berti, Guntram; Chiarini, Alessandro; Dunlop, Robert; Engelbrecht, Gerhard; Frangi, Alejandro F; Friedrich, Christoph M; Hanser, Susanne; Hasselmeyer, Peer; Hose, Rod D; Iavindrasana, Jimison; Köhler, Martin; Iacono, Luigi Lo; Lonsdale, Guy; Meyer, Rodolphe; Moore, Bob; Rajasekaran, Hariharan; Summers, Paul E; Wöhrer, Alexander; Wood, Steven

    2010-11-01

    The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.

  12. Complex Engineered Systems: A New Paradigm

    Science.gov (United States)

    Minai, Ali A.; Braha, Dan; Bar-Yam, Yaneer

    Human history is often seen as an inexorable march towards greater complexity — in ideas, artifacts, social, political and economic systems, technology, and in the structure of life itself. While we do not have detailed knowledge of ancient times, it is reasonable to conclude that the average resident of New York City today faces a world of much greater complexity than the average denizen of Carthage or Tikal. A careful consideration of this change, however, suggests that most of it has occurred recently, and has been driven primarily by the emergence of technology as a force in human life. In the 4000 years separating the Indus Valley Civilization from 18th century Europe, human transportation evolved from the bullock cart to the hansom, and the methods of communication used by George Washington did not differ significantly from those used by Alexander or Rameses. The world has moved radically towards greater complexity in the last two centuries. We have moved from buggies and letter couriers to airplanes and the Internet — an increase in capacity, and through its diversity also in complexity, orders of magnitude greater than that accumulated through the rest of human history. In addition to creating iconic artifacts — the airplane, the car, the computer, the television, etc. — this change has had a profound effect on the scope of experience by creating massive, connected and multi-level systems — traffic networks, power grids, markets, multinational corporations — that defy analytical understanding and seem to have a life of their own. This is where complexity truly enters our lives.

  13. Statistical mechanics of complex neural systems and high dimensional data

    International Nuclear Information System (INIS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-01-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)

  14. CAESY - COMPUTER AIDED ENGINEERING SYSTEM

    Science.gov (United States)

    Wette, M. R.

    1994-01-01

    Many developers of software and algorithms for control system design have recognized that current tools have limits in both flexibility and efficiency. Many forces drive the development of new tools including the desire to make complex system modeling design and analysis easier and the need for quicker turnaround time in analysis and design. Other considerations include the desire to make use of advanced computer architectures to help in control system design, adopt new methodologies in control, and integrate design processes (e.g., structure, control, optics). CAESY was developed to provide a means to evaluate methods for dealing with user needs in computer-aided control system design. It is an interpreter for performing engineering calculations and incorporates features of both Ada and MATLAB. It is designed to be reasonably flexible and powerful. CAESY includes internally defined functions and procedures, as well as user defined ones. Support for matrix calculations is provided in the same manner as MATLAB. However, the development of CAESY is a research project, and while it provides some features which are not found in commercially sold tools, it does not exhibit the robustness that many commercially developed tools provide. CAESY is written in C-language for use on Sun4 series computers running SunOS 4.1.1 and later. The program is designed to optionally use the LAPACK math library. The LAPACK math routines are available through anonymous ftp from research.att.com. CAESY requires 4Mb of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. CAESY was developed in 1993 and is a copyrighted work with all copyright vested in NASA.

  15. Some Comparisons of Complexity in Dictionary-Based and Linear Computational Models

    Czech Academy of Sciences Publication Activity Database

    Gnecco, G.; Kůrková, Věra; Sanguineti, M.

    2011-01-01

    Roč. 24, č. 2 (2011), s. 171-182 ISSN 0893-6080 R&D Projects: GA ČR GA201/08/1744 Grant - others: CNR - AV ČR project 2010-2012(XE) Complexity of Neural-Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords: linear approximation schemes * variable-basis approximation schemes * model complexity * worst-case errors * neural networks * kernel models Subject RIV: IN - Informatics, Computer Science Impact factor: 2.182, year: 2011

  16. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special-purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large-scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general-purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH, a framework for reducing the complexity of programming heterogeneous computer systems; 2) geophysical inversion routines which can be used to characterize physical systems; and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that
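
    As a rough illustration of the run-time autotuning idea, the sketch below times a kernel over a small space of candidate configurations and keeps the fastest. The kernel, its block-size parameter, and the candidate values are invented for illustration; this is not the CUSH interface, only a minimal stand-in for the same pattern.

```python
# Illustrative run-time autotuner: exhaustively times a kernel over a small
# configuration space and keeps the fastest setting. All names and the
# tuning space are hypothetical stand-ins.
import time
import numpy as np

def kernel(a, b, block):
    # Stand-in for a tunable computation; `block` mimics a launch parameter.
    out = np.empty_like(a)
    for i in range(0, a.size, block):
        out[i:i + block] = a[i:i + block] * b[i:i + block]
    return out

def autotune(candidates, a, b, repeats=3):
    best, best_t = None, float("inf")
    for block in candidates:
        t0 = time.perf_counter()
        for _ in range(repeats):
            kernel(a, b, block)
        t = (time.perf_counter() - t0) / repeats
        if t < best_t:
            best, best_t = block, t
    return best, best_t

a = np.random.rand(1 << 20)
b = np.random.rand(1 << 20)
print(autotune([1 << 10, 1 << 14, 1 << 18], a, b))
```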

  17. A computer-controlled conformal radiotherapy system. IV: Electronic chart

    International Nuclear Information System (INIS)

    Fraass, Benedick A.; McShan, Daniel L.; Matrone, Gwynne M.; Weaver, Tamar A.; Lewis, James D.; Kessler, Marc L.

    1995-01-01

    Purpose: The design and implementation of a system for electronically tracking relevant plan, prescription, and treatment data for computer-controlled conformal radiation therapy is described. Methods and Materials: The electronic charting system is implemented on a computer cluster coupled by high-speed networks to computer-controlled therapy machines. A methodical approach to the specification and design of an integrated solution has been used in developing the system. The electronic chart system is designed to allow identification and access of patient-specific data including treatment-planning data, treatment prescription information, and charting of doses. An in-house developed database system is used to provide an integrated approach to the database requirements of the design. A hierarchy of databases is used for both centralization and distribution of the treatment data for specific treatment machines. Results: The basic electronic database system has been implemented and has been in use since July 1993. The system has been used to download and manage treatment data on all patients treated on our first fully computer-controlled treatment machine. To date, electronic dose charting functions have not been fully implemented clinically, requiring the continued use of paper charting for dose tracking. Conclusions: The routine clinical application of complex computer-controlled conformal treatment procedures requires the management of large quantities of information for describing and tracking treatments. An integrated and comprehensive approach to this problem has led to a full electronic chart for conformal radiation therapy treatments

  18. Markov analysis of different standby computer based systems

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Mohan, Nalini; Ghadge, S.G.; Bajaj, S.S.

    2006-01-01

    In contrast to the conventional triplicated hardware systems, which generated control signals for the actuator elements by means of redundant hardwired median circuits and were employed in the early Indian PHWRs, a new approach of generating control signals in software, by a redundant system of computers, has been introduced in the advanced/current generation of Indian PHWRs. Reliability is increased by fault diagnostics and automatic switchover of all the loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior; because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability, maintainability and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states, each connected to all other states by transition rates. It then uses transition matrices to evaluate the reliability and safety of the system, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Markov analysis thus allows the analyst to model complex, dynamic, highly distributed, fault-tolerant systems that would otherwise be very difficult to model using classical techniques like the fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWRs. While such systems currently in use in Indian PHWR
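
    A minimal numerical sketch of the transition-matrix idea described above, for a single repairable unit with two states (up/down). The failure and repair rates are assumed values, not figures from the DPHS-PCS or CCTM systems.

```python
# Two-state repairable unit (0 = up, 1 = down) with failure rate lam and
# repair rate mu. The generator matrix Q drives P(t) = P(0) @ expm(Q t).
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1          # failures/h, repairs/h (assumed values)
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])  # rows of a generator matrix sum to zero

p0 = np.array([1.0, 0.0])     # start in the 'up' state
for t in (10.0, 100.0, 1000.0):
    pt = p0 @ expm(Q * t)
    print(f"t={t:7.1f} h  availability={pt[0]:.6f}")

# Closed-form steady-state availability mu/(lam + mu) for comparison
print("steady state:", mu / (lam + mu))
```

    The printed steady-state value follows from setting pQ = 0, which gives availability mu/(lam + mu) for this two-state model.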

  19. Low complexity symbol-wise beamforming for MIMO-OFDM systems

    KAUST Repository

    Lee, Hyun Ho

    2011-12-01

    In this paper, we consider a low complexity symbol-wise beamforming for MIMO-OFDM systems. We propose a non-iterative algorithm for the symbol-wise beamforming, which can provide the performance approaching that of the conventional symbol-wise beamforming based on the iterative algorithm. We demonstrate that our proposed scheme can reduce the computational complexity significantly. From our simulation results, it is evident that our proposed scheme leads to a negligible performance loss compared to the conventional symbol-wise beamforming regardless of spatial correlation or presence of co-channel interference. © 2011 IEEE.

  20. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well-known class of problems that almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to belong to this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques that can greatly reduce the effective size of a wide variety of realistic systems are explored
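
    As a hedged illustration of why decomposition pays off: series and parallel blocks admit exact closed-form reliabilities (R_series = prod R_i, R_parallel = 1 - prod(1 - R_i)), so collapsing such blocks shrinks the part of the system left for expensive general-purpose treatment. The component values below are invented.

```python
# Exact reliability of series and parallel blocks; composing these
# collapses a series-parallel network without any exponential search.
def series(*r):
    p = 1.0
    for x in r:
        p *= x           # every component in the chain must work
    return p

def parallel(*r):
    q = 1.0
    for x in r:
        q *= (1.0 - x)   # all redundant paths must fail
    return 1.0 - q

# Two redundant trains, each a series of pump and valve, feeding one sensor
r_system = series(parallel(series(0.95, 0.99), series(0.95, 0.99)), 0.999)
print(f"system reliability: {r_system:.6f}")
```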

  1. Study, design and evaluation of nuclear reactor computer control system

    International Nuclear Information System (INIS)

    Menacer, S.

    1988-01-01

    Nuclear reactor control is a complex process that varies with each reactor, and there is no universal agreement as to the best type of control system. After conventional systems had been used for a long time, attention turned towards digital techniques for reactor control systems. This interest emerged because of the difficulties faced in data handling, mainly for post-incident analysis. However, inserting a computer into a system is not by itself sufficient to solve all the data-handling problems, and the insertion of a computer into a real-time system is not without effect on the overall system. The scope of this thesis is to show the important parameters that have to be taken into account when choosing such a system and to evaluate the performance of the selected system

  2. A Concise Introduction to the Statistical Physics of Complex Systems

    CERN Document Server

    Bertin, Eric

    2012-01-01

    This concise primer (based on lectures given at summer schools on complex systems and on a master's degree course in complex systems modeling) will provide graduate students and newcomers to the field with the basic knowledge of the concepts and methods of statistical physics and its potential for application to interdisciplinary topics.  Indeed, in recent years, statistical physics has begun to attract the interest of a broad community of researchers in the field of complex system sciences, ranging from biology to the social sciences, economics and computer science. More generally, a growing number of graduate students and researchers feel the need to learn some basic concepts and questions originating in other disciplines without necessarily having to master all of the corresponding technicalities and jargon. Generally speaking, the goals of statistical physics may be summarized as follows: on the one hand to study systems composed of a large number of interacting ‘entities’, and on the other to predict...

  3. Norm estimates of complex symmetric operators applied to quantum systems

    International Nuclear Information System (INIS)

    Prodan, Emil; Garcia, Stephan R; Putinar, Mihai

    2006-01-01

    This paper communicates recent results in the theory of complex symmetric operators and shows, through two non-trivial examples, their potential usefulness in the study of Schroedinger operators. In particular, we propose a formula for computing the norm of a compact complex symmetric operator. This observation is applied to two concrete problems related to quantum mechanical systems. First, we give sharp estimates on the exponential decay of the resolvent and the single-particle density matrix for Schroedinger operators with spectral gaps. Second, we provide new ways of evaluating the resolvent norm for Schroedinger operators appearing in the complex scaling theory of resonances

  4. Low Computational Complexity Network Coding For Mobile Networks

    DEFF Research Database (Denmark)

    Heide, Janus

    2012-01-01

    Network Coding (NC) is a technique that can provide benefits in many types of networks. Some examples from wireless networks are: in relay networks, at either the physical or the data link layer, to reduce the number of transmissions; in reliable multicast, to reduce the amount of signaling and enable … This work focuses on an intra-flow coding technique. One of the key challenges of this technique is its inherent computational complexity, which can lead to high computational load and energy consumption, in particular on the mobile platforms that are the target platform in this work. To increase the coding throughput several

  5. Abstraction in artificial intelligence and complex systems

    CERN Document Server

    Saitta, Lorenza

    2013-01-01

    Abstraction is a fundamental mechanism underlying both human and artificial perception, representation of knowledge, reasoning and learning. This mechanism plays a crucial role in many disciplines, notably Computer Programming, Natural and Artificial Vision, Complex Systems, Artificial Intelligence and Machine Learning, Art, and Cognitive Sciences. This book first provides the reader with an overview of the notions of abstraction proposed in various disciplines by comparing both commonalities and differences.  After discussing the characterizing properties of abstraction, a formal model, the K

  6. The complexity of computing the MCD-estimator

    DEFF Research Database (Denmark)

    Bernholt, T.; Fischer, Paul

    2004-01-01

    In modern statistics the robust estimation of parameters is a central problem, i.e., an estimation that is not or only slightly affected by outliers in the data. The minimum covariance determinant (MCD) estimator (J. Amer. Statist. Assoc. 79 (1984) 871) is probably one of the most important robust estimators of location and scatter. The complexity of computing the MCD, however, was unknown and generally thought to be exponential even if the dimensionality of the data is fixed. Here we present a polynomial time algorithm for MCD for fixed dimension of the data. In contrast, we show that computing the MCD-estimator is NP-hard if the dimension varies. (C) 2004 Elsevier B.V. All rights reserved.
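
    To make the objective concrete, the brute-force sketch below scans all h-subsets of the data for the one whose sample covariance has minimal determinant. This naive search is exponential in the number of points and is shown only to illustrate the estimator itself, not the paper's polynomial-time algorithm for fixed dimension.

```python
# Naive illustration of the MCD objective: among all h-subsets, find the
# one whose sample covariance has minimal determinant.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 2))
X[:2] += 8.0                       # two gross outliers
h = 9                              # subset size, h > n/2

best_det, best_subset = np.inf, None
for idx in combinations(range(len(X)), h):
    S = np.cov(X[list(idx)].T)
    d = np.linalg.det(S)
    if d < best_det:
        best_det, best_subset = d, idx

print("MCD subset:", best_subset)   # the outliers 0 and 1 get excluded
print("robust location:", X[list(best_subset)].mean(axis=0))
```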

  7. [The P300-based brain-computer interface: presentation of the complex "flash + movement" stimuli].

    Science.gov (United States)

    Ganin, I P; Kaplan, A Ia

    2014-01-01

    The P300-based brain-computer interface requires the detection of the P300 wave of brain event-related potentials. Most of its users learn BCI control in several minutes, and after short classifier training they can type text on the computer screen or assemble an image from separate fragments in simple BCI-based video games. Nevertheless, insufficient attractiveness for users and the conservative organization of stimuli in this BCI may restrict its integration into the control of real information processes. At the same time, the initial movement of an object (a motion-onset stimulus) may be an independent factor that induces the P300 wave. In the current work we tested the hypothesis that complex "flash + movement" stimuli, together with a vivid and compact organization of stimuli on the computer screen, may be much more attractive for the user operating the P300 BCI. In a study of 20 subjects we showed the effectiveness of our interface. Both accuracy and P300 amplitude were higher for flashing stimuli and complex "flash + movement" stimuli than for motion-onset stimuli. N200 amplitude was maximal for flashing stimuli, while for "flash + movement" and motion-onset stimuli it was only half as large. A similar BCI with complex stimuli may be embedded in compact control systems that require a high level of user attention under negative external effects that obstruct BCI control.

  8. Use of neural networks in the analysis of complex systems

    International Nuclear Information System (INIS)

    Uhrig, R.E.

    1992-01-01

    The application of neural networks, alone or in conjunction with other advanced technologies (expert systems, fuzzy logic, and/or genetic algorithms), to some of the problems of complex engineering systems has the potential to enhance the safety, reliability and operability of these systems. The work described here deals with complex systems or parts of such systems that can be isolated from the total system. Typically, the measured variables from the systems are analog variables that must be sampled and normalized to expected peak values before they are introduced into neural networks. Often data must be processed to put it into a form more acceptable to the neural network. The neural networks are usually simulated on modern high-speed computers that carry out the calculations serially. However, it is possible to implement neural networks using specially designed microchips where the network calculations are truly carried out in parallel, thereby providing virtually instantaneous outputs for each set of inputs. Specific applications described include: Diagnostics: State of the Plant; Hybrid System for Transient Identification; Detection of Change of Mode in Complex Systems; Sensor Validation; Plant-Wide Monitoring; Monitoring of Performance and Efficiency; and Analysis of Vibrations. Although the specific examples described deal with nuclear power plants or their subsystems, the techniques described can be applied to a wide variety of complex engineering systems

  9. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as agent-based modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  10. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as agent-based modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
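
    A toy exploratory agent-based model in the spirit of the study: desktop agents power down when idle under a usage policy, and the simulation compares the fleet's energy draw with and without the policy. All classes, parameters, and wattage figures are invented for illustration; this is not the authors' model.

```python
# Minimal agent-based sketch: each Desktop agent decides its power state
# per hour; a policy switches idle machines off, cutting total energy.
import random

class Desktop:
    def __init__(self, policy_enabled):
        self.policy_enabled = policy_enabled

    def step(self):
        busy = random.random() < 0.6        # 60% chance of activity (assumed)
        on = busy or not self.policy_enabled
        return 120 if on else 5             # watts drawn this hour (assumed)

def simulate(policy, agents=500, hours=24):
    fleet = [Desktop(policy) for _ in range(agents)]
    watt_hours = sum(d.step() for d in fleet for _ in range(hours))
    return watt_hours / 1000.0              # kWh for the whole fleet

random.seed(1)
print("kWh without policy:", simulate(False))
print("kWh with policy:   ", simulate(True))
```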

  11. Experimental and Computational Evidence for the Mechanism of Intradiol Catechol Dioxygenation by Non- Heme Iron(III) Complexes

    NARCIS (Netherlands)

    Jastrzebski, Robin; Quesne, Matthew G.; Weckhuysen, Bert M.; de Visser, Sam P.; Bruijnincx, Pieter C. A.

    2014-01-01

    Catechol intradiol dioxygenation is a unique reaction catalyzed by iron-dependent enzymes and nonheme iron(III) complexes. The mechanism by which these systems activate dioxygen in this important metabolic process remains controversial. Using a combination of kinetic measurements and computational

  12. Complex data modeling and computationally intensive methods for estimation and prediction

    CERN Document Server

    Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics

    2015-01-01

    The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downside reputational risk; statistical methods for research on the human genome dynamics; inference in non-euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...

  13. Optimal non-coherent data detection for massive SIMO wireless systems: A polynomial complexity solution

    KAUST Repository

    Alshamary, Haider Ali Jasim

    2016-01-04

    © 2015 IEEE. This paper considers the joint maximum likelihood (ML) channel estimation and data detection problem for massive SIMO (single input multiple output) wireless systems. We propose efficient algorithms achieving the exact ML non-coherent data detection, for both constant-modulus constellations and nonconstant-modulus constellations. Despite a large number of unknown channel coefficients in massive SIMO systems, we show that the expected computational complexity is linear in the number of receive antennas and polynomial in channel coherence time. To the best of our knowledge, our algorithms are the first efficient algorithms to achieve the exact joint ML channel estimation and data detection performance for massive SIMO systems with general constellations. Simulation results show our algorithms achieve considerable performance gains at a low computational complexity.

  14. Optimal non-coherent data detection for massive SIMO wireless systems: A polynomial complexity solution

    KAUST Repository

    Alshamary, Haider Ali Jasim; Al-Naffouri, Tareq Y.; Zaib, Alam; Xu, Weiyu

    2016-01-01

    © 2015 IEEE. This paper considers the joint maximum likelihood (ML) channel estimation and data detection problem for massive SIMO (single input multiple output) wireless systems. We propose efficient algorithms achieving the exact ML non-coherent data detection, for both constant-modulus constellations and nonconstant-modulus constellations. Despite a large number of unknown channel coefficients in massive SIMO systems, we show that the expected computational complexity is linear in the number of receive antennas and polynomial in channel coherence time. To the best of our knowledge, our algorithms are the first efficient algorithms to achieve the exact joint ML channel estimation and data detection performance for massive SIMO systems with general constellations. Simulation results show our algorithms achieve considerable performance gains at a low computational complexity.

  15. A Low-Complexity Joint Detection-Decoding Algorithm for Nonbinary LDPC-Coded Modulation Systems

    OpenAIRE

    Wang, Xuepeng; Bai, Baoming; Ma, Xiao

    2010-01-01

    In this paper, we present a low-complexity joint detection-decoding algorithm for nonbinary LDPC coded-modulation systems. The algorithm combines hard-decision decoding using the message-passing strategy with the signal detector in an iterative manner. It requires low computational complexity, offers good system performance and has a fast rate of decoding convergence. Compared to the q-ary sum-product algorithm (QSPA), it provides an attractive candidate for practical applications of q-ary LDP...

  16. Generalized Combination Complex Synchronization for Fractional-Order Chaotic Complex Systems

    Directory of Open Access Journals (Sweden)

    Cuimei Jiang

    2015-07-01

    Based on two fractional-order chaotic complex drive systems and one fractional-order chaotic complex response system with different dimensions, we propose generalized combination complex synchronization. In this new synchronization scheme, there are two complex scaling matrices that are non-square matrices. On the basis of the stability theory of fractional-order linear systems, we design a general controller via active control. Additionally, by virtue of the two complex scaling matrices, generalized combination complex synchronization between fractional-order chaotic complex systems and real systems is investigated. Finally, three typical examples are given to demonstrate the effectiveness and feasibility of the schemes.

  17. Life cycle costs measurement of complex systems manufactured by an engineer-to-order company

    NARCIS (Netherlands)

    Öner, K.B.; Franssen, R.; Kiesmüller, G.P.; Houtum, van G.J.J.A.N.; Qui, R.G.; Russell, D.W.; Sullivan, W.G.

    2007-01-01

    Complex technical systems, such as packaging lines, computer networks, and material handling systems, are crucial for the operations at the companies (or institutions) where they are installed. Companies require high availability because their primary processes may halt when these systems are down. High

  18. Experimental and Computational Evidence for the Mechanism of Intradiol Catechol Dioxygenation by Non-Heme Iron(III) Complexes

    Science.gov (United States)

    Jastrzebski, Robin; Quesne, Matthew G; Weckhuysen, Bert M; de Visser, Sam P; Bruijnincx, Pieter C A

    2014-01-01

    Catechol intradiol dioxygenation is a unique reaction catalyzed by iron-dependent enzymes and non-heme iron(III) complexes. The mechanism by which these systems activate dioxygen in this important metabolic process remains controversial. Using a combination of kinetic measurements and computational modelling of multiple iron(III) catecholato complexes, we have elucidated the catechol cleavage mechanism and show that oxygen binds the iron center by partial dissociation of the substrate from the iron complex. The iron(III) superoxide complex that is formed subsequently attacks the carbon atom of the substrate by a rate-determining C=O bond formation step. PMID:25322920

  19. Pattern-recognition software detecting the onset of failures in complex systems

    International Nuclear Information System (INIS)

    Mott, J.; King, R.

    1987-01-01

    A very general mathematical framework for embodying learned data from a complex system and combining it with a current observation to estimate the true current state of the system has been implemented using nearly universal pattern-recognition algorithms and applied to surveillance of the EBR-II power plant. In this application the methodology can provide signal validation and replacement of faulty signals on a near-real-time basis for hundreds of plant parameters. The mathematical framework, the pattern-recognition algorithms, examples of the learning and estimating process, and plant operating decisions made using this methodology are discussed. The entire methodology has been reduced to a set of FORTRAN subroutines which are small, fast, and robust, and which are executable on a personal computer with a serial link to the system's data acquisition computer, or on the data acquisition computer itself
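
    A generic sketch of the estimation step described above: a library of learned plant states is searched for the vectors closest to the current observation, and their weighted average serves as the estimate against which a faulty channel stands out. This is a plain nearest-neighbor scheme for illustration, not the EBR-II implementation.

```python
# Nearest-neighbor state estimation for signal validation: estimate the
# true plant state from learned states, then flag channels whose residual
# against the estimate is implausibly large.
import numpy as np

def estimate_state(library, observation, k=5):
    d = np.linalg.norm(library - observation, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-9)            # closer states weigh more
    return (w[:, None] * library[nearest]).sum(axis=0) / w.sum()

rng = np.random.default_rng(2)
library = rng.normal(size=(1000, 8))         # 'learned' operating states
truth = library[42]
obs = truth.copy()
obs[3] = 99.0                                # a failed sensor channel

est = estimate_state(library, obs)
residual = np.abs(obs - est)
print("suspect channels:", np.where(residual > 3.0)[0])   # flags channel 3
```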

  20. Applications of small computers for systems control on the Tandem Mirror Experiment-Upgrade

    International Nuclear Information System (INIS)

    Bork, R.G.; Kane, R.J.; Moore, T.L.

    1983-01-01

    Desktop computers operating through a CAMAC-based interface are used to control and monitor the operation of the various subsystems on the Tandem Mirror Experiment-Upgrade (TMX-U) at Lawrence Livermore National Laboratory (LLNL). These systems include: shot sequencer/master timing, neutral beam control (four consoles), magnet power system control, ion-cyclotron resonant heating (ICRH) control, thermocouple monitoring, getter system control, gas fueling system control, and electron-cyclotron resonant heating (ECRH) monitoring. Two additional computers are used to control the TMX-U neutral beam test stand and provide computer-aided repair/test and development of CAMAC modules. These machines are usually programmed in BASIC, but some code has been translated into assembly language to increase speed. Details of the computer interfaces and system complexity are described, as well as the evolution of the systems to their present states

  1. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  2. Risk-return relationship in a complex adaptive system.

    Directory of Open Access Journals (Sweden)

    Kunyu Song

    For survival and development, autonomous agents in complex adaptive systems involving the human society must compete against or collaborate with others for sharing limited resources or wealth, by using different methods. One method is to invest, in order to obtain payoffs with risk. It is a common belief that investments with a positive risk-return relationship (namely, high risk high return and vice versa) are dominant over those with a negative risk-return relationship (i.e., high risk low return and vice versa) in the human society; the belief has a notable impact on daily investing activities of investors. Here we investigate the risk-return relationship in a model complex adaptive system, in order to study the effect of both market efficiency and closeness that exist in the human society and play an important role in helping to establish traditional finance/economics theories. We conduct a series of computer-aided human experiments, and also perform agent-based simulations and theoretical analysis to confirm the experimental observations and reveal the underlying mechanism. We report that investments with a negative risk-return relationship have dominance over those with a positive risk-return relationship instead in such complex adaptive systems. We formulate the dynamical process for the system's evolution, which helps to discover the different roles of identical and heterogeneous preferences. This work might be valuable not only to complexity science, but also to finance and economics, to management and social science, and to physics.

  3. Risk-return relationship in a complex adaptive system.

    Science.gov (United States)

    Song, Kunyu; An, Kenan; Yang, Guang; Huang, Jiping

    2012-01-01

    For survival and development, autonomous agents in complex adaptive systems involving the human society must compete against or collaborate with others for sharing limited resources or wealth, by using different methods. One method is to invest, in order to obtain payoffs with risk. It is a common belief that investments with a positive risk-return relationship (namely, high risk high return and vice versa) are dominant over those with a negative risk-return relationship (i.e., high risk low return and vice versa) in the human society; the belief has a notable impact on daily investing activities of investors. Here we investigate the risk-return relationship in a model complex adaptive system, in order to study the effect of both market efficiency and closeness that exist in the human society and play an important role in helping to establish traditional finance/economics theories. We conduct a series of computer-aided human experiments, and also perform agent-based simulations and theoretical analysis to confirm the experimental observations and reveal the underlying mechanism. We report that investments with a negative risk-return relationship have dominance over those with a positive risk-return relationship instead in such complex adaptive systems. We formulate the dynamical process for the system's evolution, which helps to discover the different roles of identical and heterogeneous preferences. This work might be valuable not only to complexity science, but also to finance and economics, to management and social science, and to physics.

  4. Interactive computer graphics and its role in control system design of large space structures

    Science.gov (United States)

    Reddy, A. S. S. R.

    1985-01-01

    This paper attempts to show the relevance of interactive computer graphics in the design of control systems that maintain the attitude and shape of large space structures to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model, such as modeling the dynamics, modal analysis, and control system design methodology, are reviewed, and the need for interactive computer graphics is demonstrated. Typical constituent parts of large space structures, such as free-free beams and free-free plates, are used to demonstrate the complexity of the control system design and the effectiveness of the interactive computer graphics.

  5. Computer program determines chemical composition of physical system at equilibrium

    Science.gov (United States)

    Kwong, S. S.

    1966-01-01

    FORTRAN 4 digital computer program calculates the equilibrium composition of complex, multiphase chemical systems. It uses a free energy minimization method, with the solution of the problem reduced to mathematical operations without concern for the chemistry involved. Certain thermodynamic properties are also determined as byproducts of the main calculations.
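
    A hedged sketch of the same free-energy-minimization idea in modern form: minimize the total Gibbs energy of an ideal-gas mixture subject to element balances. The two-species system and the dimensionless chemical potentials are invented illustrative numbers, not data from the NASA program.

```python
# Gibbs free energy minimization for an ideal-gas mixture: minimize
# G/RT = sum_i n_i (mu0_i + ln(n_i / n_total)) subject to A @ n = b,
# where A counts atoms of each element per molecule.
import numpy as np
from scipy.optimize import minimize

species = ["N2O4", "NO2"]
mu0 = np.array([-10.0, -6.0])      # mu_i^0 / RT (assumed values)
A = np.array([[2, 1],              # N atoms per molecule
              [4, 2]])             # O atoms per molecule
b = A @ np.array([1.0, 0.0])       # element totals: start from 1 mol N2O4

def gibbs(n):
    n = np.maximum(n, 1e-12)       # guard against log(0)
    return float(n @ (mu0 + np.log(n / n.sum())))

res = minimize(
    gibbs, x0=np.array([0.5, 1.0]),
    constraints=[{"type": "eq", "fun": lambda n: A @ n - b}],
    bounds=[(1e-10, None)] * 2, method="SLSQP",
)
for s, n in zip(species, res.x):
    print(f"{s}: {n:.4f} mol")
```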

  6. Personal computer control system for small size tandem accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Takayama, Hiroshi; Kawano, Kazuhiro; Shinozaki, Masataka [Nissin - High Voltage Co. Ltd., Kyoto (Japan)

    1996-12-01

    Because an analysis apparatus based on a tandem accelerator has many control parameters, the large number of control elements on the control panel makes the panel complex and hard to operate. To remedy these shortcomings, a personal-computer-based control system was designed and developed for a panel previously built mainly from conventional hardware. Its main characteristics are as follows: (1) the control panel becomes simpler and more compact, because using a personal computer as the man-machine interface reduces the hardware on the panel surface to a minimum; (2) control becomes faster, because the accelerator system is divided into blocks, each with its own local station on the sequencer network, so that sequence control is closed within each block; (3) expandability improves, because adding a new beamline requires only inserting a local sequencer station into the network and updating the computer's configuration, with little change to the existing hardware; and (4) the control system becomes cheaper, owing to the lower hardware investment and the easier programming of a personal computer. (G.K.)

  7. Understanding complex urban systems integrating multidisciplinary data in urban models

    CERN Document Server

    Gebetsroither-Geringer, Ernst; Atun, Funda; Werner, Liss

    2016-01-01

    This book is devoted to the modeling and understanding of complex urban systems. This second volume of Understanding Complex Urban Systems focuses on the challenges of the modeling tools, concerning, e.g., the quality and quantity of data and the selection of an appropriate modeling approach. It is meant to support urban decision-makers—including municipal politicians, spatial planners, and citizen groups—in choosing an appropriate modeling approach for their particular modeling requirements. The contributors to this volume are from different disciplines, but all share the same goal: optimizing the representation of complex urban systems. They present and discuss a variety of approaches for dealing with data-availability problems and finding appropriate modeling approaches—and not only in terms of computer modeling. The selection of articles featured in this volume reflects a broad variety of new and established modeling approaches such as: - An argument for using Big Data methods in conjunction with Age...

  8. Computational complexity and memory usage for multi-frontal direct solvers used in p finite element analysis

    KAUST Repository

    Calo, Victor M.; Collier, Nathan; Pardo, David; Paszyński, Maciej R.

    2011-01-01

    The multi-frontal direct solver is the state of the art for the direct solution of linear systems. This paper provides computational complexity and memory usage estimates for the application of the multi-frontal direct solver algorithm on linear systems resulting from p finite elements. Specifically we provide the estimates for systems resulting from C0 polynomial spaces spanned by B-splines. The structured grid and uniform polynomial order used in isogeometric meshes simplifies the analysis.

  9. Computational complexity and memory usage for multi-frontal direct solvers used in p finite element analysis

    KAUST Repository

    Calo, Victor M.

    2011-05-14

    The multi-frontal direct solver is the state of the art for the direct solution of linear systems. This paper provides computational complexity and memory usage estimates for the application of the multi-frontal direct solver algorithm on linear systems resulting from p finite elements. Specifically we provide the estimates for systems resulting from C0 polynomial spaces spanned by B-splines. The structured grid and uniform polynomial order used in isogeometric meshes simplifies the analysis.
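
    For orientation, the classical nested-dissection bounds for multifrontal factorization on regular grids are worth keeping in mind; these are the textbook results, quoted here only as context, not the paper's p-FEM-specific estimates.

```latex
% Classical nested-dissection complexity for multifrontal factorization
% on regular grids with N unknowns (George; Duff & Reid) -- background
% context, not the estimates derived in the paper above.
\begin{align*}
  \text{2D mesh:} \quad & \mathcal{O}(N^{3/2}) \text{ flops}, \quad
                          \mathcal{O}(N \log N) \text{ memory},\\
  \text{3D mesh:} \quad & \mathcal{O}(N^{2}) \text{ flops}, \quad
                          \mathcal{O}(N^{4/3}) \text{ memory}.
\end{align*}
```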

  10. Petascale Many Body Methods for Complex Correlated Systems

    Science.gov (United States)

    Pruschke, Thomas

    2012-02-01

    Correlated systems constitute an important class of materials in modern condensed matter physics. Correlations among electrons are at the heart of all ordering phenomena and many intriguing novel aspects, such as quantum phase transitions or topological insulators, observed in a variety of compounds. Yet, theoretically describing these phenomena is still a formidable task, even if one restricts the models used to the smallest possible set of degrees of freedom. Here, modern computer architectures play an essential role, and the joint effort to devise efficient algorithms and implement them on state-of-the-art hardware has become an extremely active field in condensed-matter research. To tackle this task single-handed is quite obviously not possible. The NSF-OISE funded PIRE collaboration ``Graduate Education and Research in Petascale Many Body Methods for Complex Correlated Systems'' is a successful initiative to bring together leading experts around the world to form a virtual international organization for addressing these emerging challenges and educate the next generation of computational condensed matter physicists. The collaboration includes research groups developing novel theoretical tools to reliably and systematically study correlated solids, experts in efficient computational algorithms needed to solve the emerging equations, and those able to use modern heterogeneous computer architectures to make them working tools for the growing community.

  11. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Contents: Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  12. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Roč. 33, September (2012), s. 160-167 ISSN 0893-6080 R&D Projects: GA ČR GAP202/11/1368 Institutional research plan: CEZ:AV0Z10300504 Institutional support: RVO:67985807 Keywords: neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units Subject RIV: IN - Informatics, Computer Science Impact factor: 1.927, year: 2012

  13. Programs for Testing Processor-in-Memory Computing Systems

    Science.gov (United States)

    Katz, Daniel S.

    2006-01-01

    The Multithreaded Microbenchmarks for Processor-In-Memory (PIM) Compilers, Simulators, and Hardware are computer programs arranged in a series for use in testing the performances of PIM computing systems, including compilers, simulators, and hardware. The programs at the beginning of the series test basic functionality; the programs at subsequent positions in the series test increasingly complex functionality. The programs are intended to be used while designing a PIM system, and can be used to verify that compilers, simulators, and hardware work correctly. The programs can also be used to enable designers of these system components to examine tradeoffs in implementation. Finally, these programs can be run on non-PIM hardware (either single-threaded or multithreaded) using the POSIX pthreads standard to verify that the benchmarks themselves operate correctly. [POSIX (Portable Operating System Interface for UNIX) is a set of standards that define how programs and operating systems interact with each other. pthreads is a library of pre-emptive thread routines that comply with one of the POSIX standards.

  14. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present, and it is the development trend of the design of complex electromechanical systems. Unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services, which run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application concerning a pure electric vehicle is tested on WebMWorks. The results of the simulation and a parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of complex electromechanical systems.

  15. Nostradamus 2014 prediction, modeling and analysis of complex systems

    CERN Document Server

    Suganthan, Ponnuthurai; Chen, Guanrong; Snasel, Vaclav; Abraham, Ajith; Rössler, Otto

    2014-01-01

    The prediction of the behavior of complex systems and the analysis and modeling of their structure are vitally important problems in engineering, economics and, generally, in science today. Examples of such systems can be seen in the world around us (including our bodies) and of course in almost every scientific discipline, including such “exotic” domains as the earth’s atmosphere, turbulent fluids, economics (exchange rates and stock markets), population growth, physics (control of plasma), information flow in social networks and its dynamics, chemistry and complex networks. To understand such complex dynamics, which often exhibit strange behavior, and to use it in research or industrial applications, it is paramount to create models of it. For this purpose there exists a rich spectrum of methods, from classical ones such as ARMA models or the Box-Jenkins method to modern ones like evolutionary computation, neural networks, fuzzy logic, geometry, and deterministic chaos, amongst others. This proceedings book is a collection of accepted ...

  16. System Testability Analysis for Complex Electronic Devices Based on Multisignal Model

    International Nuclear Information System (INIS)

    Long, B; Tian, S L; Huang, J G

    2006-01-01

    It is necessary to consider system testability problems for electronic devices during their early design phase, because modern electronic devices become smaller and more highly integrated while their function and structure grow more complex. The multisignal model, which combines the advantages of the structure model and the dependency model, is used to describe the fault dependency relationships of complex electronic devices, and the main testability indexes used to evaluate testability (including the optimal test program, fault detection rate, fault isolation rate, etc.) and the corresponding algorithms are given. The system testability analysis process is illustrated for a USB-GPIB interface circuit with the TEAMS toolbox. The experimental results show that the modeling method is simple, the computation is fast, and the method significantly improves the diagnostic capability of complex electronic devices
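
    A toy dependency-matrix computation of the two figures of merit named above. Here D[i][j] = 1 when test j observes fault i; the matrix is invented for illustration, whereas real multisignal models are derived from the system structure.

```python
# Fault detection rate (FDR) and fault isolation rate (FIR) from a
# dependency matrix: detected = covered by some test, isolable = the
# test signature is unique among faults.
import numpy as np

D = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 0, 1],
              [0, 0, 0, 0]])           # fault 3 is untestable

detected = D.any(axis=1)
fdr = detected.mean()                   # fraction of faults detected

sigs = [tuple(row) for row in D]
unique = np.array([sigs.count(s) == 1 for s in sigs])
fir = (unique & detected).sum() / max(detected.sum(), 1)

print(f"FDR = {fdr:.2f}, FIR = {fir:.2f}")
```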

  17. Prospects of a mathematical theory of human behavior in complex man-machine systems tasks. [time sharing computer analogy of automobile driving

    Science.gov (United States)

    Johannsen, G.; Rouse, W. B.

    1978-01-01

    A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man-machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.

  18. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric function theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  19. Computation of resonances by two methods involving the use of complex coordinates

    International Nuclear Information System (INIS)

    Bylicki, M.; Nicolaides, C.A.

    1993-01-01

    We have studied two different systems producing resonances, a highly excited multielectron Coulombic negative ion (the He⁻ 2s2p² ⁴P state) and a hydrogen atom in a magnetic field, via the complex-coordinate rotation (CCR) and the state-specific complex-eigenvalue Schroedinger equation (CESE) approaches. For the He⁻ 2s2p² ⁴P resonance, a series of large CCR calculations, up to 353 basis functions with explicit r_ij dependence, were carried out to serve as benchmarks. For the magnetic-field problem, the CCR results were taken from the literature. Comparison shows that the state-specific CESE theory allows the physics of the problem to be incorporated systematically while keeping the overall size of the computation tractable regardless of the number of electrons

  20. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and drawbacks of using these parallel architectures. The complete list of GPU-powered tools reviewed here is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  1. Complexity vs energy: theory of computation and theoretical physics

    International Nuclear Information System (INIS)

    Manin, Y I

    2014-01-01

    This paper is a survey based upon the talk at the satellite QQQ conference to ECM6, 3Quantum: Algebra Geometry Information, Tallinn, July 2012. It is dedicated to the analogy between the notions of complexity in theoretical computer science and energy in physics. This analogy is not metaphorical: I describe three precise mathematical contexts, suggested recently, in which mathematics related to (un)computability is inspired by and to a degree reproduces formalisms of statistical physics and quantum field theory.

  2. Fail-safe design criteria for computer-based reactor protection systems

    International Nuclear Information System (INIS)

    Keats, A.B.

    1980-01-01

    The increasing quantity and complexity of the instrumentation required in nuclear power plants provides a strong incentive for using on-line computers as the basis of the control and protection systems. On-line computers using multiplexed sampled data are already well established, but their application to nuclear reactor protection systems requires special measures to satisfy the very high reliability which is demanded in the interests of safety and availability. Some existing codes of practice relating to segregation of replicated subsystems continue to be applicable and lead to division of the computer functions into two distinct parts. The first computer, referred to as the Trip Algorithm Computer, may also control the multiplexer. Voting on each group of status inputs yielded by the trip algorithm computers is performed by the Vote Algorithm Computer. The conceptual disparities between hardwired reactor-protection systems and those employing computers also give rise to a need for some new criteria. An important objective of these criteria, minimising the need for a failure-mode-and-effect analysis of the computer software, is achieved almost entirely by 'hardware' properties of the system: the systematic use of hardwired test inputs which cause excursions of the trip algorithms into the tripped state in a uniquely ordered but easily recognisable sequence, and the use of hardwired 'pattern recognition logic' which generates a dynamic 'healthy' stimulus for the shutdown actuators only in response to the unique sequence generated by the hardwired input signal pattern. The adoption of the proposed design criteria ensures not only failure-to-safety in the hardware but also the elimination, or at least minimisation, of the dependence on the correct functioning of the computer software for the safety of the system. (auth)
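
    The voting stage described above is easy to picture concretely. Below is a minimal sketch of a generic m-out-of-n voter over replicated trip-channel status inputs (an illustration only; the paper's Vote Algorithm Computer is specified in far more detail, and the threshold and readings here are assumptions):

    ```python
    # Illustrative m-out-of-n voter over replicated trip-channel inputs.
    # True means the channel demands a trip. The threshold m and the
    # example readings are assumptions for this sketch.

    def vote(channels: list[bool], m: int) -> bool:
        """Return True (trip) if at least m of the n channels demand a trip."""
        return sum(channels) >= m

    # Classic 2-out-of-3 arrangement: a single failed channel neither
    # spuriously trips the reactor nor blocks a genuine trip demand.
    assert vote([True, True, False], m=2) is True    # demand despite one failure
    assert vote([False, True, False], m=2) is False  # single spurious channel
    ```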

  3. Computer systems a programmer's perspective

    CERN Document Server

    Bryant, Randal E

    2016-01-01

    Computer systems: A Programmer’s Perspective explains the underlying elements common among all computer systems and how they affect general application performance. Written from the programmer’s perspective, this book strives to teach readers how understanding basic elements of computer systems and executing real practice can lead them to create better programs. Spanning across computer science themes such as hardware architecture, the operating system, and systems software, the Third Edition serves as a comprehensive introduction to programming. This book strives to create programmers who understand all elements of computer systems and will be able to engage in any application of the field--from fixing faulty software, to writing more capable programs, to avoiding common flaws. It lays the groundwork for readers to delve into more intensive topics such as computer architecture, embedded systems, and cybersecurity. This book focuses on systems that execute an x86-64 machine code, and recommends th...

  4. Modelling and simulation of complex sociotechnical systems: envisioning and analysing work environments

    Science.gov (United States)

    Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter

    2015-01-01

    Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning, the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and of the potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227

  5. General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Roč. 15, č. 12 (2003), s. 2727-2778 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords: computational power * computational complexity * perceptrons * radial basis functions * spiking neurons * feedforward networks * recurrent networks * probabilistic computation * analog computation Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  6. Complex Adaptive Systems of Systems (CASOS) engineering environment.

    Energy Technology Data Exchange (ETDEWEB)

    Detry, Richard Joseph; Linebarger, John Michael; Finley, Patrick D.; Maffitt, S. Louise; Glass, Robert John, Jr.; Beyeler, Walter Eugene; Ames, Arlo Leroy

    2012-02-01

    Complex Adaptive Systems of Systems, or CASoS, are vastly complex physical-socio-technical systems which we must understand to design a secure future for the nation. The Phoenix initiative implements CASoS Engineering principles combining the bottom up Complex Systems and Complex Adaptive Systems view with the top down Systems Engineering and System-of-Systems view. CASoS Engineering theory and practice must be conducted together to develop a discipline that is grounded in reality, extends our understanding of how CASoS behave and allows us to better control the outcomes. The pull of applications (real world problems) is critical to this effort, as is the articulation of a CASoS Engineering Framework that grounds an engineering approach in the theory of complex adaptive systems of systems. Successful application of the CASoS Engineering Framework requires modeling, simulation and analysis (MS&A) capabilities and the cultivation of a CASoS Engineering Community of Practice through knowledge sharing and facilitation. The CASoS Engineering Environment, itself a complex adaptive system of systems, constitutes the two platforms that provide these capabilities.

  7. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, with a view to risk analysis. Three of the six codes are discussed, presenting their purpose, input description, calculation methods and the results obtained with each of them. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cuts and the point values of the unreliability and unavailability of the system; and STREUSL, for the calculation of the dispersion of those values around the mean. Although CRESSEX, in the version available at CNEN, uses a somewhat lengthy method to obtain the minimal cuts on an HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.) [pt
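
    To make the notion of minimal cut sets concrete, here is a minimal, self-contained sketch of top-down cut-set expansion for a small AND/OR fault tree (a MOCUS-style illustration, not the CRESSEX algorithm itself; the tree structure is an invented example):

    ```python
    # Top-down (MOCUS-style) minimal cut set expansion for a toy AND/OR
    # fault tree. The tree below is an invented example, not a system
    # from the RALLY documentation.

    TREE = {
        "TOP": ("OR",  ["G1", "C"]),
        "G1":  ("AND", ["A", "G2"]),
        "G2":  ("OR",  ["B", "C"]),
    }   # leaves A, B, C are basic events

    def cut_sets(gate):
        if gate not in TREE:                     # basic event
            return [frozenset([gate])]
        kind, children = TREE[gate]
        if kind == "OR":                         # union of children's cut sets
            return [cs for ch in children for cs in cut_sets(ch)]
        sets = [frozenset()]                     # AND: cross-product combination
        for ch in children:
            sets = [s | cs for s in sets for cs in cut_sets(ch)]
        return sets

    def minimal(sets):
        return [s for s in sets if not any(t < s for t in sets)]

    print(sorted(map(sorted, minimal(cut_sets("TOP")))))
    # -> [['A', 'B'], ['C']]  ({A, C} is absorbed by the smaller cut set {C})
    ```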

  8. Computer Operating System Maintenance.

    Science.gov (United States)

    1982-06-01

    The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on...computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  9. Experimental technique for the study of three-particle reactions in kinematically complete experiments using a two-processor complex based on the M-400 computer

    International Nuclear Information System (INIS)

    Berezin, F.N.; Kisurin, V.A.; Nemets, O.F.; Ofengenden, R.G.; Pugach, V.M.; Pavlenko, Yu.N.; Patlan', Yu.V.; Savrasov, S.S.

    1981-01-01

    An experimental technique for the investigation of three-particle nuclear reactions in kinematically complete experiments is described. The technique provides storage of one-dimensional and two-dimensional energy spectra from several detectors. A block diagram of the measuring system using this technique is presented. The measuring system consists of analog equipment for fast-slow coincidences and of a two-processor complex based on the M-400 computer with a common bus. The use of a two-processor complex, in which each computer has direct access to the memory of the other, makes it possible to separate the functions of data collection and on-line data presentation and to perform the necessary physical calculations. The software of the measuring complex, which includes programs written in ASSEMBLER language for the first computer and functional programs written in BASIC for the second computer, is considered. The software of the first computer includes the DISPETCHER dialog control program, a driver package for the control of external devices, an applied program package and system modules. The technique described was tested in an experiment investigating the d + ¹⁰B → α + α + α three-particle reaction at a deuteron energy of 13.6 MeV. A two-dimensional energy spectrum of the reaction obtained with the help of the technique described is presented [ru

  10. Propagating wave correlations in complex systems

    International Nuclear Information System (INIS)

    Creagh, Stephen C; Gradoni, Gabriele; Hartmann, Timo; Tanner, Gregor

    2017-01-01

    We describe a novel approach for computing wave correlation functions inside finite spatial domains driven by complex and statistical sources. By exploiting semiclassical approximations, we provide explicit algorithms to calculate the local mean of these correlation functions in terms of the underlying classical dynamics. By defining appropriate ensemble averages, we show that fluctuations about the mean can be characterised in terms of classical correlations. We give in particular an explicit expression relating fluctuations of diagonal contributions to those of the full wave correlation function. The methods have a wide range of applications both in quantum mechanics and for classical wave problems such as in vibro-acoustics and electromagnetism. We apply the methods here to simple quantum systems, so-called quantum maps, which model the behaviour of generic problems on Poincaré sections. Although low-dimensional, these models exhibit a chaotic classical limit and share common characteristics with wave propagation in complex structures. (paper)

  11. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components of such a hypothetical system, and their performance characteristics, can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be applied on any platform and any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  12. Counting loop diagrams: computational complexity of higher-order amplitude evaluation

    International Nuclear Information System (INIS)

    Eijk, E. van; Kleiss, R.; Lazopoulos, A.

    2004-01-01

    We discuss the computational complexity of the perturbative evaluation of scattering amplitudes, both by the Caravaglios-Moretti algorithm and by direct evaluation of the individual diagrams. For a self-interacting scalar theory, we determine the complexity as a function of the number of external legs. We describe a method for obtaining the number of topologically inequivalent Feynman graphs containing closed loops, and apply this to 1- and 2-loop amplitudes. We also compute the number of graphs weighted by their symmetry factors, thus arriving at exact and asymptotic estimates for the average symmetry factor of diagrams. We present results for the asymptotic number of diagrams up to 10 loops, and prove that the average symmetry factor approaches unity as the number of external legs becomes large. (orig.)

  13. Data based identification and prediction of nonlinear and complex dynamical systems

    Science.gov (United States)

    Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso

    2016-07-01

    The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The "inverse" problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. The combination of graph, information, and nonlinear dynamical
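
    As a concrete illustration of the delay-coordinate embedding that the abstract contrasts with newer approaches, here is a minimal sketch of the textbook construction (the series, delay and embedding dimension are invented illustrative choices):

    ```python
    # Delay-coordinate embedding of a scalar time series x(t) into vectors
    #   X(t) = [x(t), x(t - tau), ..., x(t - (m-1)*tau)].
    # The series, delay tau and embedding dimension m are illustrative.

    import numpy as np

    np.random.seed(0)
    x = np.sin(0.1 * np.arange(1000)) + 0.01 * np.random.randn(1000)
    tau, m = 15, 3                       # delay and embedding dimension

    def delay_embed(x, m, tau):
        n = len(x) - (m - 1) * tau       # number of complete delay vectors
        return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

    X = delay_embed(x, m, tau)
    print(X.shape)                       # (970, 3): reconstructed attractor points
    ```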

  14. Data based identification and prediction of nonlinear and complex dynamical systems

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Wen-Xu [School of Systems Science, Beijing Normal University, Beijing, 100875 (China); Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China); Lai, Ying-Cheng, E-mail: Ying-Cheng.Lai@asu.edu [School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, AZ 85287 (United States); Department of Physics, Arizona State University, Tempe, AZ 85287 (United States); Institute for Complex Systems and Mathematical Biology, King’s College, University of Aberdeen, Aberdeen AB24 3UE (United Kingdom); Grebogi, Celso [Institute for Complex Systems and Mathematical Biology, King’s College, University of Aberdeen, Aberdeen AB24 3UE (United Kingdom)

    2016-07-12

    The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The “inverse” problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. The combination of graph, information, and nonlinear

  15. Data based identification and prediction of nonlinear and complex dynamical systems

    International Nuclear Information System (INIS)

    Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso

    2016-01-01

    The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The “inverse” problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. The combination of graph, information, and nonlinear

  16. A novel random-pulser concept for empirical reliability studies of complex systems

    International Nuclear Information System (INIS)

    Priesmeyer, H.G.

    1985-01-01

    The concept of a computer-controlled pseudo-random pulser is described, which is able to produce pulse sequences obeying the statistical distributions used in probabilistic assessments in safety technology. It is intended for empirical investigations of the reliability of complex systems. (orig.) [de
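
    As an illustration of what such a pulser produces (a generic construction, not the paper's hardware), pulse trains whose inter-arrival times follow an exponential distribution model the random failure events assumed in most reliability studies:

    ```python
    # Generate a pulse train whose inter-pulse intervals are exponentially
    # distributed, i.e. a homogeneous Poisson process -- the usual model
    # for random failures in reliability work. Rate and duration assumed.

    import numpy as np

    rng = np.random.default_rng(seed=1)
    rate = 2.0                                   # mean pulses per second
    t_end = 10.0                                 # length of the sequence (s)

    gaps = rng.exponential(1.0 / rate, size=int(10 * rate * t_end))
    times = np.cumsum(gaps)
    times = times[times < t_end]                 # pulse timestamps in [0, t_end)

    print(len(times), "pulses; mean interval",
          round(np.diff(times).mean(), 3), "s (expected", 1 / rate, "s)")
    ```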

  17. MVPACK: a computer-aided design tool for multivariable control systems

    International Nuclear Information System (INIS)

    Mensah, S.; Frketich, G.

    1985-10-01

    The design and analysis of high-performance controllers for complex plants require a collection of interactive, powerful computer software. MVPACK, an open-ended package for the computer-aided design of control systems, has been developed in the Reactor Control Branch of the Chalk River Nuclear Laboratories. The package is fully interactive and includes a comprehensive state-of-the-art mathematical library to support development of complex, multivariable, control algorithms. Coded in RATFOR, MVPACK is portable with minimal changes. It operates with a flexible data structure which makes efficient use of minicomputer resources and provides a standard framework for program generation. The existence of a help mechanism enhances the simplicity of package utilization. This paper provides a brief tutorial overview of the package. It reviews the specifications used in the design and implementation of the package and briefly describes the database structure, supporting libraries and some design and analysis modules of MVPACK. Several application examples to illustrate the capability of the package are given. Experience with MVPACK shows that the package provides a synergistic environment for the design of control and regulation systems, and that it is a unique tool for training of control system engineers

  18. Technical Note. The Concept of a Computer System for Interpretation of Tight Rocks Using X-Ray Computed Tomography Results

    Directory of Open Access Journals (Sweden)

    Habrat Magdalena

    2017-03-01

    The article presents the concept of a computer system for interpreting unconventional oil and gas deposits with the use of X-ray computed tomography results, together with the functional principles of the proposed solution. The main goal is to design a product which is a complex and useful tool, in the form of specialist computer software, for the qualitative and quantitative interpretation of images obtained from X-ray computed tomography, devoted to the prospecting and identification of unconventional hydrocarbon deposits. The article focuses on the use of X-ray computed tomography as a basis for the analysis of tight rocks, considering especially the functional principles of the system to be developed by the authors. These principles cover graphical visualization of the rock structure, qualitative and quantitative interpretation of the model for visualizing rock samples, and, within the quantitative interpretation module, the interpretation and description of the computed parameters.

  19. Interactive computer-enhanced remote viewing system

    International Nuclear Information System (INIS)

    Tourtellott, J.A.; Wagner, J.F.

    1995-01-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This need for a task space model is most pronounced in the remediation of obsolete production facilities and underground storage tanks. Production facilities at many sites contain compact process machinery and systems that were used to produce weapons grade material. For many such systems, a complex maze of pipes (with potentially dangerous contents) must be removed, and this represents a significant D&D challenge. In an analogous way, the underground storage tanks at sites such as Hanford represent a challenge because of their limited entry and the tumbled profusion of in-tank hardware. In response to this need, the Interactive Computer-Enhanced Remote Viewing System (ICERVS) is being designed as a software system to: (1) provide a reliable geometric description of a robotic task space, and (2) enable robotic remediation to be conducted more effectively and more economically than with available techniques. A system such as ICERVS is needed because of the problems discussed below.

  20. International Symposium on Complex Computing-Networks

    CERN Document Server

    Sevgi, L; CCN2005; Complex computing networks: Brain-like and wave-oriented electrodynamic algorithms

    2006-01-01

    This book uniquely combines new advances in electromagnetic theory and circuits & systems theory. It integrates both fields regarding computational aspects of common interest. Emphasized subjects are those methods which mimic brain-like and electrodynamic behaviour; among these are cellular neural networks, chaos and chaotic dynamics, attractor-based computation and stream ciphers. The book contains carefully selected contributions from the Symposium CCN2005. Pictures from the bestowal of Honorary Doctorate degrees to Leon O. Chua and Leopold B. Felsen are included.

  1. Green IT engineering concepts, models, complex systems architectures

    CERN Document Server

    Kondratenko, Yuriy; Kacprzyk, Janusz

    2017-01-01

    This volume provides a comprehensive state-of-the-art overview of a series of advanced trends and concepts that have recently been proposed in the area of green information technologies engineering, as well as of design and development methodologies for models and complex systems architectures and their intelligent components. The contributions included in the volume have their roots in the authors' presentations, and the vivid discussions that followed the presentations, at a series of workshops and seminars held within the international TEMPUS project GreenCo in the United Kingdom, Italy, Portugal, Sweden and Ukraine during 2013-2015, and at the 1st-5th Workshops on Green and Safe Computing (GreenSCom) held in Russia, Slovakia and Ukraine. The book presents a systematic exposition of research on principles, models, components and complex systems, and a description of industry- and society-oriented aspects of green IT engineering. A chapter-oriented structure has been adopted for this book ...

  2. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    Science.gov (United States)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  3. Computer System Design System-on-Chip

    CERN Document Server

    Flynn, Michael J

    2011-01-01

    The next generation of computer system designers will be less concerned about details of processors and memories, and more concerned about the elements of a system tailored to particular applications. These designers will have a fundamental knowledge of processors and other elements in the system, but the success of their design will depend on the skills in making system-level tradeoffs that optimize the cost, performance and other attributes to meet application requirements. This book provides a new treatment of computer system design, particularly for System-on-Chip (SOC), which addresses th

  4. Computed tomography of von Meyenburg complex simulating micro-abscesses

    International Nuclear Information System (INIS)

    Sada, P.N.; Ramakrishna, B.

    1994-01-01

    A case is presented of a bile duct hamartoma in a 44 year old man being evaluated for abdominal pain. The computed tomography (CT) findings suggested micro-abscesses in the liver and a CT guided tru-cut biopsy showed von Meyenburg complex. 9 refs., 3 figs

  5. Ninth International Conference on Dependability and Complex Systems

    CERN Document Server

    Mazurkiewicz, Jacek; Sugier, Jarosław; Walkowiak, Tomasz; Kacprzyk, Janusz

    2014-01-01

    DepCoS – RELCOMEX is an annual series of conferences organized by Wrocław University of Technology to promote a comprehensive approach to evaluation of system performability which is now commonly called dependability. In contrast to classic analyses which were concentrated on reliability of technical resources and structures built from them, dependability is based on multi-disciplinary approach to theory, technology, and maintenance of a system considered to be a multifaceted amalgamation of technical, information, organization, software and human (users, administrators, supervisors, etc.) resources. Diversity of processes being realized (data processing, system management, system monitoring, etc.), their concurrency and their reliance on in-system intelligence often severely impedes construction of strict mathematical models and calls for application of intelligent and soft computing methods. This book presents the proceedings of the Ninth International Conference on Dependability and Complex Systems DepC...

  6. On system behaviour using complex networks of a compression algorithm

    Science.gov (United States)

    Walker, David M.; Correa, Debora C.; Small, Michael

    2018-01-01

    We construct complex networks of scalar time series using a data compression algorithm. The structure and statistics of the resulting networks can be used to help characterize complex systems, and one property, in particular, appears to be a useful discriminating statistic in surrogate data hypothesis tests. We demonstrate these ideas on systems with known dynamical behaviour and also show that our approach is capable of identifying behavioural transitions within electroencephalogram recordings as well as changes due to a bifurcation parameter of a chaotic system. The technique we propose is dependent on a coarse grained quantization of the original time series and therefore provides potential for a spatial scale-dependent characterization of the data. Finally the method is as computationally efficient as the underlying compression algorithm and provides a compression of the salient features of long time series.
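
    A minimal sketch of the general idea follows (an LZ78-style parse feeding a transition graph; the authors' specific compression algorithm and network definition may differ, and the series and quantization level are invented):

    ```python
    # Build a small "compression network" from a scalar time series:
    # 1) coarse-grain the series into symbols, 2) parse the symbol stream
    # into dictionary phrases (LZ78-style), 3) link phrases that occur
    # consecutively. All choices below are illustrative assumptions.

    import numpy as np
    from collections import defaultdict

    np.random.seed(0)
    x = np.sin(np.linspace(0, 60, 3000)) + 0.1 * np.random.randn(3000)
    symbols = "".join("abcd"[int(s)] for s in
                      np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75])))

    edges = defaultdict(int)
    seen, cur, prev = set(), "", None
    for ch in symbols:                  # LZ78-style incremental parsing
        cur += ch
        if cur not in seen:             # new phrase = node of the network
            seen.add(cur)
            if prev is not None:
                edges[(prev, cur)] += 1 # consecutive phrases share an edge
            prev, cur = cur, ""

    print(len(seen), "nodes,", len(edges), "edges")
    ```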

  7. MENTAL SHIFT TOWARDS SYSTEMS THINKING SKILLS IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    MILDEOVÁ, Stanislava

    2012-03-01

    When seeking solutions to current problems in the field of computer science – and other fields – we encounter situations where traditional approaches no longer bring the desired results, and our cognitive skills limit reliable mental simulation even within a basic set of relations. The world around us is becoming more complex and mutually interdependent, and this is reflected in the demands on computer support. Thus, education and science in the field of computer science, as in all other disciplines and areas of life, need to address the issue of a paradigm shift, a need generally accepted by experts. The goal of the paper is to present systems thinking, which facilitates and extends the understanding of the world through relations and linkages. The paper introduces the essence of systems thinking and the possibilities of achieving a mental shift towards systems thinking skills; at the same time, the link between systems thinking and functional literacy is presented. From the variety of systems thinking tests that allow people to assess their understanding of basic systemic concepts, we adopted the "Bathtub Test" in order to assess the level of systems thinking. University students (potential information managers) were the subjects of an examination of systems thinking that was conducted over a longer time period and whose aim was to determine the status of systems thinking. The paper demonstrates that some pedagogical concepts and activities, in our case the subject of System Dynamics, lead to the appropriate integration of systems thinking in education. There is some evidence that basic knowledge of system dynamics and systems thinking principles will affect students, and that their thinking will contribute to an improved approach to solving problems of computer science both in theory and practice.

  8. Computer controlled vacuum control system for synchrotron radiation beam lines

    International Nuclear Information System (INIS)

    Goldberg, S.M.; Wang, C.; Yang, J.

    1983-01-01

    The increasing number and complexity of vacuum control systems at the Stanford Synchrotron Radiation Laboratory has resulted in the need to computerize their operation in order to lower costs and increase operating efficiency. Status signals are transmitted through digital and analog serial data links which use microprocessors to monitor vacuum status continuously. Each microprocessor has a unique address, and up to 256 can be connected to the host computer over a single RS232 data line. A FORTRAN program on the host computer requests status messages and sends control messages via only one RS232 line per beam line, signals the operator when a fault condition occurs, takes automatic corrective actions, warns of impending valve failure, and keeps a running log of all changes in vacuum status for later recall. Wiring costs are thus greatly reduced, and more status conditions can be monitored without adding excessively to the complexity of the system. Operators can then obtain status reports quickly at various locations in the lab without having to read a large number of meters and LEDs.
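
    The polling pattern described (one host, many addressed monitors on a shared serial line) can be sketched as follows; the addresses, framing and fault logic here are invented for illustration and are not from the SSRL system:

    ```python
    # Illustrative host-side poll loop for addressed devices sharing one
    # serial line. Addresses, message framing and the simulated replies
    # are assumptions for this sketch; the SSRL system used FORTRAN and
    # its own protocol.

    import time

    def query(address: int) -> str:
        """Stand-in for a serial write/read; returns a status string."""
        return "OK" if address != 0x17 else "FAULT:ION_PUMP"   # fake fault

    log = []
    for address in range(0x00, 0x20):            # poll each monitor in turn
        status = query(address)
        if status != "OK":
            log.append((time.time(), address, status))
            print(f"operator alert: device 0x{address:02X} reports {status}")

    print(len(log), "fault(s) logged this sweep")
    ```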

  9. Accurate Computed Enthalpies of Spin Crossover in Iron and Cobalt Complexes

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta; Cirera, J

    2009-01-01

    Despite their importance in many chemical processes, the relative energies of spin states of transition metal complexes have so far been haunted by large computational errors. By the use of six functionals, B3LYP, BP86, TPSS, TPSSh, M06L, and M06L, this work studies nine complexes (seven with iron...

  10. Efficient physical embedding of topologically complex information processing networks in brains and computer circuits.

    Directory of Open Access Journals (Sweden)

    Danielle S Bassett

    2010-04-01

    Nervous systems are information processing networks that evolved by natural selection, whereas very large scale integrated (VLSI) computer circuits have evolved by commercially driven technology development. Here we follow historic intuition that all physical information processing systems will share key organizational properties, such as modularity, that generally confer adaptivity of function. It has long been observed that modular VLSI circuits demonstrate an isometric scaling relationship between the number of processing elements and the number of connections, known as Rent's rule, which is related to the dimensionality of the circuit's interconnect topology and its logical capacity. We show that human brain structural networks, and the nervous system of the nematode C. elegans, also obey Rent's rule, and exhibit some degree of hierarchical modularity. We further show that the estimated Rent exponent of human brain networks, derived from MRI data, can explain the allometric scaling relations between gray and white matter volumes across a wide range of mammalian species, again suggesting that these principles of nervous system design are highly conserved. For each of these fractal modular networks, the dimensionality of the interconnect topology was greater than the 2 or 3 Euclidean dimensions of the space in which it was embedded. This relatively high complexity entailed extra cost in physical wiring: although all networks were economically or cost-efficiently wired they did not strictly minimize wiring costs. Artificial and biological information processing systems both may evolve to optimize a trade-off between physical cost and topological complexity, resulting in the emergence of homologous principles of economical, fractal and modular design across many different kinds of nervous and computational networks.
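
    Rent's rule relates the number of external connections T of a sub-circuit to the number of processing elements N it contains, T = t·N^p. A minimal sketch of estimating the Rent exponent p from partition statistics follows (the numbers are synthetic illustrations, not data from the study):

    ```python
    # Estimate the Rent exponent p in T = t * N**p by a log-log linear
    # fit. N (elements per partition) and T (external terminals) below
    # are synthetic illustrative data, not measurements from the paper.

    import numpy as np

    N = np.array([16, 32, 64, 128, 256, 512])
    T = np.array([21, 35, 55, 90, 145, 235])      # made-up terminal counts

    p, log_t = np.polyfit(np.log(N), np.log(T), 1)
    print(f"Rent exponent p = {p:.2f}, prefactor t = {np.exp(log_t):.2f}")
    ```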

  11. Supporting Privacy of Computations in Mobile Big Data Systems

    Directory of Open Access Journals (Sweden)

    Sriram Nandha Premnath

    2016-05-01

    Cloud computing systems enable clients to rent and share the computing resources of third-party platforms, and have gained widespread use in recent years. Numerous varieties of mobile, small-scale devices, such as smartphones and e-health devices, across users, are connected to one another through the massive internetwork of vastly powerful servers on the cloud. While mobile devices store "private information" of users, such as location, payment and health data, they may also contribute "semi-public information" (which may include crowdsourced data such as transit, traffic and nearby points of interest) for data analytics. In such a scenario, a mobile device may seek to obtain the result of a computation which may depend on its private inputs, crowdsourced data from other mobile devices, and/or "public inputs" from other servers on the Internet. We demonstrate a new method of delegating real-world computations of resource-constrained mobile clients using an encrypted program known as the garbled circuit. Using the garbled version of a mobile client's inputs, a server in the cloud executes the garbled circuit and returns the resulting garbled outputs. Our system assures privacy of the mobile client's input data and of the output of the computation, and also enables the client to verify that the evaluator actually performed the computation. We analyze the complexity of our system. We measure the time taken to construct the garbled circuit as well as to evaluate it for a varying number of servers. Using real-world data, we evaluate our system for a practical, privacy-preserving search application that locates the nearest point of interest for the mobile client, to demonstrate feasibility.
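
    To give a flavour of the garbled-circuit primitive the abstract relies on, here is a toy garbling of a single AND gate (a didactic construction only; real systems add point-and-permute, free-XOR and other optimizations, and the paper's protocol involves considerably more machinery):

    ```python
    # Toy garbling of a single AND gate. The garbler assigns two random
    # labels per wire (for bit 0 and bit 1) and publishes a table keyed
    # by a hash of the input labels; the evaluator, holding one label per
    # input wire, recovers only the matching output label. Didactic
    # sketch, not the protocol of the paper.

    import os, hashlib

    def H(*labels):
        return hashlib.sha256(b"".join(labels)).digest()

    def xor(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    wa = [os.urandom(32) for _ in range(2)]   # labels for input wire a
    wb = [os.urandom(32) for _ in range(2)]   # labels for input wire b
    wc = [os.urandom(32) for _ in range(2)]   # labels for output wire c

    table = {H(wa[a], wb[b])[:8]: xor(H(wa[a], wb[b]), wc[a & b])
             for a in (0, 1) for b in (0, 1)}          # garbled AND table

    def evaluate(la, lb):
        """Evaluator sees only one label per wire, never the bits."""
        return xor(H(la, lb), table[H(la, lb)[:8]])

    assert evaluate(wa[1], wb[1]) == wc[1]    # 1 AND 1 -> label for 1
    assert evaluate(wa[1], wb[0]) == wc[0]    # 1 AND 0 -> label for 0
    ```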

  12. Three-dimensional coupled Monte Carlo-discrete ordinates computational scheme for shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Chen, Y.; Fischer, U.

    2005-01-01

    Shielding calculations of advanced nuclear facilities such as accelerator-based neutron sources or fusion devices of the tokamak type are complicated due to their complex geometries and their large dimensions, including bulk shields of several meters thickness. While the complexity of the geometry in the shielding calculation can hardly be handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport technique. This work proposes a dedicated computational scheme for coupled Monte Carlo-discrete ordinates transport calculations to handle this kind of shielding problem. The Monte Carlo technique is used to simulate the particle generation and transport in the target region, with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep penetration problem in the bulk shield. The coupling scheme has been implemented in a program system by loosely integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT and a newly developed coupling interface program for the mapping process. Test calculations were performed with comparison to MCNP solutions, and satisfactory agreement was obtained between the two approaches. The program system has been chosen to treat the complicated shielding problem of the accelerator-based IFMIF neutron source. The successful application demonstrates that the coupling scheme with the program system is a useful computational tool for the shielding analysis of complex and large nuclear facilities. (authors)

  13. A stand alone computer system to aid the development of mirror fusion test facility RF heating systems

    International Nuclear Information System (INIS)

    Thomas, R.A.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with a LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/Floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during the equipment development will be useful to the SCDS programmers in the integration phase

  14. Modeling Complex Systems

    CERN Document Server

    Boccara, Nino

    2010-01-01

    Modeling Complex Systems, 2nd Edition, explores the process of modeling complex systems, providing examples from such diverse fields as ecology, epidemiology, sociology, seismology, and economics. It illustrates how models of complex systems are built and provides indispensable mathematical tools for studying their dynamics. This vital introductory text is useful for advanced undergraduate students in various scientific disciplines, and serves as an important reference book for graduate students and young researchers. This enhanced second edition includes: recent research results and bibliographic references; extra footnotes which provide biographical information on cited scientists who have made significant contributions to the field; new and improved worked-out examples to aid a student's comprehension of the content; and exercises to challenge the reader and complement the material. Nino Boccara is also the author of Essentials of Mathematica: With Applications to Mathematics and Physics (Springer, 2007).

  15. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real-time, military, banking, and wearable health care systems. It describes design solutions for a new computer system: an evolving reconfigurable architecture (ERA) that is free from drawbacks inherent in current ICT and related engineering models, and it pursues the simplicity, reliability and scalability principles of design, implemented through redundancy and re-configurability; targeted for energy-,...

  16. Theory and simulation of cavity quantum electro-dynamics in multi-partite quantum complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Alidoosty Shahraki, Moslem; Khorasani, Sina; Aram, Mohammad Hasan [Sharif University of Technology, School of Electrical Engineering, Tehran (Iran, Islamic Republic of)

    2014-05-15

    The cavity quantum electrodynamics of various complex systems is here analyzed using a general versatile code developed in this research. Such quantum multi-partite systems normally consist of an arbitrary number of quantum dots in interaction with an arbitrary number of cavity modes. As an example, a nine-partition system, consisting of eight emitters interacting with one cavity mode, is simulated under different coupling regimes. Two-level emitters (e.g. quantum dots) are assumed to be arranged in a linear chain, which defines the mutual dipole-dipole interactions. It was observed that plotting the system trajectory in the phase space reveals chaotic behavior in the so-called ultrastrong-coupling regime. This result is confirmed mathematically by a detailed calculation of the Kolmogorov entropy as a measure of chaotic behavior. In order to study the computational complexity of our code, various multi-partite systems consisting of one to eight quantum dots in interaction with one cavity mode were solved individually. Computation run times and the allocated memory for each system were measured. (orig.)

  17. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Computer systems are a critical component of human society in the 21st century. The economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructures depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of human society, their vulnerability and exposure to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  18. Extending Life Concepts to Complex Systems

    Directory of Open Access Journals (Sweden)

    Jean Le Fur

    2013-01-01

    There is still no consensus definition of complex systems. This article explores, as a heuristic approach, the possibility of using notions associated with life as transversal concepts for defining complex systems. This approach is developed within a general classification of systems, with complex systems considered as a general ‘living things’ category and living organisms as a specialised class within this category. Concepts associated with life are first explored in the context of complex systems: birth, death and lifetime, adaptation, ontogeny and growth, reproduction. Thereafter, a refutation approach is used to test the proposed classification against a set of diverse systems, including a reference case, edge cases and immaterial complex systems. The summary of this analysis is then used to generate a definition of complex systems, based on the proposal, and within the background of cybernetics, complex adaptive systems and biology. Using notions such as ‘birth’ or ‘lifespan’ as transversal concepts may be of heuristic value for the generic characterization of complex systems, opening up new lines of research for improving their definition.

  19. Regimes of data output from an automated scanning system into a computer

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Shaislamov, P.T.

    1984-01-01

    A method is described for implementing a rather complex algorithm for the transmission of various coordinate and service data from the devices of an automated scanning system to the monitoring computer in an automated system for processing images from bubble chambers. The adopted data output algorithm and the equipment developed for it enable data to be transmitted both as separate words and as word arrays.

  20. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    This paper describes a SWOT analysis and shows its use in analysing the strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is, furthermore, to show the strengths, weaknesses, opportunities and threats of using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, the implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, and the protection of the existing data in the database and of the data that will be collected during diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on examples from abroad. The intent of cloud computing in medicine is for the patient's data to be sent to the doctor directly, instead of the patient delivering it himself/herself.

  1. Increasing complexity with quantum physics.

    Science.gov (United States)

    Anders, Janet; Wiesner, Karoline

    2011-09-01

    We argue that complex systems science and the rules of quantum physics are intricately related. We discuss a range of quantum phenomena, such as cryptography, computation and quantum phases, and the rules responsible for their complexity. We identify correlations as a central concept connecting quantum information and complex systems science. We present two examples for the power of correlations: using quantum resources to simulate the correlations of a stochastic process and to implement a classically impossible computational task.

  2. The Evolution of Software and Its Impact on Complex System Design in Robotic Spacecraft Embedded Systems

    Science.gov (United States)

    Butler, Roy

    2013-01-01

    The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.

  3. Single photon emission computed tomography in AIDS dementia complex

    International Nuclear Information System (INIS)

    Pohl, P.; Vogl, G.; Fill, H.; Roessler, H.Z.; Zangerle, R.; Gerstenbrand, F.

    1988-01-01

    Single photon emission computed tomography (SPECT) studies were performed in AIDS dementia complex using IMP in 12 patients (and HM-PAO in four of these same patients). In all patients, SPECT revealed either multiple or focal uptake defects, the latter corresponding with focal signs or symptoms in all but one case. Computerized tomography showed diffuse cerebral atrophy in eight of 12 patients, while magnetic resonance imaging exhibited changes such as atrophy and/or leukoencephalopathy in two of five cases. Our data indicate that both disturbance of cerebral amine metabolism and alteration of local perfusion share in the pathogenesis of AIDS dementia complex. SPECT is an important aid in the diagnosis of AIDS dementia complex and contributes to the understanding of the pathophysiological mechanisms of this disorder.

  4. Lukasiewicz-Moisil Many-Valued Logic Algebra of Highly-Complex Systems

    Directory of Open Access Journals (Sweden)

    James F. Glazebrook

    2010-06-01

    Full Text Available The fundamentals of Lukasiewicz-Moisil logic algebras and their applications to complex genetic network dynamics and highly complex systems are presented in the context of a categorical ontology theory of levels, Medical Bioinformatics and self-organizing, highly complex systems. Quantum Automata were defined in refs.[2] and [3] as generalized, probabilistic automata with quantum state spaces [1]. Their next-state functions operate through transitions between quantum states defined by the quantum equations of motions in the SchrÄodinger representation, with both initial and boundary conditions in space-time. A new theorem is proven which states that the category of quantum automata and automata-homomorphisms has both limits and colimits. Therefore, both categories of quantum automata and classical automata (sequential machines are bicomplete. A second new theorem establishes that the standard automata category is a subcategory of the quantum automata category. The quantum automata category has a faithful representation in the category of Generalized (M,R-Systems which are open, dynamic biosystem networks [4] with de¯ned biological relations that represent physiological functions of primordial(s, single cells and the simpler organisms. A new category of quantum computers is also defined in terms of reversible quantum automata with quantum state spaces represented by topological groupoids that admit a local characterization through unique, quantum Lie algebroids. On the other hand, the category of n-Lukasiewicz algebras has a subcategory of centered n-Lukasiewicz algebras (as proven in ref. [2] which can be employed to design and construct subcategories of quantum automata based on n-Lukasiewicz diagrams of existing VLSI. Furthermore, as shown in ref. [2] the category of centered n-Lukasiewicz algebras and the category of Boolean algebras are naturally equivalent. A `no-go' conjecture is also proposed which states that Generalized (M,R-Systems

  5. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new system concepts made possible by the hardware development are explained. (orig./AK) [de

  6. Coupling a system code with computational fluid dynamics for the simulation of complex coolant reactivity effects

    International Nuclear Information System (INIS)

    Bertolotto, D.

    2011-11-01

    The current doctoral research is focused on the development and validation of a coupled computational tool, to combine the advantages of computational fluid dynamics (CFD) in analyzing complex flow fields and of state-of-the-art system codes employed for nuclear power plant (NPP) simulations. Such a tool can considerably enhance the analysis of NPP transient behavior, e.g. in the case of pressurized water reactor (PWR) accident scenarios such as Main Steam Line Break (MSLB) and boron dilution, in which strong coolant flow asymmetries and multi-dimensional mixing effects strongly influence the reactivity of the reactor core, as described in Chap. 1. To start with, a literature review on code coupling is presented in Chap. 2, together with the corresponding ongoing projects in the international community. Special reference is made to the framework in which this research has been carried out, i.e. the Paul Scherrer Institute's (PSI) project STARS (Steady-state and Transient Analysis Research for the Swiss reactors). In particular, the codes chosen for the coupling, i.e. the CFD code ANSYS CFX V11.0 and the system code US-NRC TRACE V5.0, are part of the STARS codes system. Their main features are also described in Chap. 2. The development of the coupled tool, named CFX/TRACE from the names of the two constitutive codes, has proven to be a complex and broad-based task, and therefore constraints had to be put on the target requirements, while keeping in mind a certain modularity to allow future extensions to be made with minimal efforts. After careful consideration, the coupling was defined to be on-line, parallel and with non-overlapping domains connected by an interface, which was developed through the Parallel Virtual Machines (PVM) software, as described in Chap. 3. Moreover, two numerical coupling schemes were implemented and tested: a sequential explicit scheme and a sequential semi-implicit scheme. Finally, it was decided that the coupling would be single

  7. Development of distributed computer systems for future nuclear power plants

    International Nuclear Information System (INIS)

    Yan, G.; L'Archeveque, J.V.R.

    1978-01-01

    Dual computers have been used for direct digital control in CANDU power reactors since 1963. However, as reactor plants have grown in size and complexity, some drawbacks of centralized control have appeared, such as the surprisingly large amount of cabling required for information transmission. Dramatic changes in component costs and a desire to improve system performance have stimulated a broad-based research and development effort in distributed systems. This paper outlines work in this area.

  8. TVENT: a computer program for analysis of tornado-induced transients in ventilation systems

    International Nuclear Information System (INIS)

    Duerre, K.H.; Andrae, R.W.; Gregory, W.S.

    1978-07-01

    The report describes TVENT, a portable FORTRAN computer program for predicting flows and pressures in a ventilation system subjected to a tornado. The pressure and flow values calculated by TVENT can be used as a basis for structural analysis. TVENT is a one-dimensional, lumped-parameter model with incompressible flow augmented by fluid storage. The theoretical basis for the mathematical modeling and analysis is presented, and a description of the input for the computer code is provided. Modeling techniques specific to ventilation systems are described. Sample problems illustrate the use of TVENT in analyzing ventilation systems; other sample problems illustrate modeling techniques used in reducing complex systems.
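
    To make the lumped-parameter idea concrete, the following is a minimal sketch (in Python, not the FORTRAN TVENT code itself): nodes carry pressures, branches carry linearized flows Q = dP/R, and fluid storage acts as a capacitance C, so that C dP/dt equals the net inflow. The three-branch topology, resistances, capacitances and the tornado pressure pulse are all illustrative assumptions.

        # Two interior nodes in series between ambient and a tornado-depressurized
        # outlet. Branch flow is linearized as Q = dP / R; interior nodes have
        # fluid storage, so C * dP/dt = Q_in - Q_out (explicit Euler integration).
        import numpy as np

        R = [50.0, 80.0, 60.0]   # branch resistances (illustrative units)
        C = [2e-3, 2e-3]         # storage capacitances of the interior nodes
        p = np.zeros(2)          # interior node pressures relative to ambient

        def outlet_pressure(t):
            # Idealized tornado transient: sharp depressurization, then recovery.
            return -3000.0 * np.exp(-((t - 2.0) / 0.5) ** 2)

        dt, T = 1e-3, 5.0
        for step in range(int(T / dt)):
            t = step * dt
            q01 = (0.0 - p[0]) / R[0]                  # ambient -> node 1
            q12 = (p[0] - p[1]) / R[1]                 # node 1 -> node 2
            q2o = (p[1] - outlet_pressure(t)) / R[2]   # node 2 -> outlet
            p = p + dt * np.array([(q01 - q12) / C[0], (q12 - q2o) / C[1]])

        print("final interior pressures:", p)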

  9. A Symbolic and Graphical Computer Representation of Dynamical Systems

    Science.gov (United States)

    Gould, Laurence I.

    2005-04-01

    AUTONO is a Macsyma/Maxima program, designed at the University of Hartford, for solving autonomous systems of differential equations as well as for relating Lagrangians and Hamiltonians to their associated dynamical equations. AUTONO can be used in a number of fields to decipher a variety of complex dynamical systems with ease, producing their Lagrangian and Hamiltonian equations in seconds. These equations can then be incorporated into VisSim, a modeling and simulation program, which yields graphical representations of motion in a given system through easily chosen input parameters. The program, along with the VisSim differential-equations graphical package, allows for the resolution and easy understanding of complex problems in a relatively short time, thus enabling quicker and more advanced computing of dynamical systems on any number of platforms, from a network of sensors on a space probe, to the behavior of neural networks, to the effects of an electromagnetic field on components in a dynamical system. A flowchart of AUTONO, along with some simple applications and VisSim output, will be shown.
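
    The Euler-Lagrange step that AUTONO automates in Macsyma/Maxima can be reproduced in any computer algebra system; the sketch below uses SymPy on a one-dimensional harmonic oscillator purely as an illustration of the idea.

        # Derive the equation of motion from a Lagrangian with a CAS.
        import sympy as sp

        t = sp.symbols('t')
        m, k = sp.symbols('m k', positive=True)
        x = sp.Function('x')(t)

        # L = T - V for a harmonic oscillator
        L = sp.Rational(1, 2) * m * sp.diff(x, t)**2 - sp.Rational(1, 2) * k * x**2

        # Euler-Lagrange equation: d/dt (dL/d(xdot)) - dL/dx = 0
        eom = sp.diff(sp.diff(L, sp.diff(x, t)), t) - sp.diff(L, x)
        print(sp.simplify(eom))   # -> k*x(t) + m*Derivative(x(t), (t, 2))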

  10. Belle computing system

    International Nuclear Information System (INIS)

    Adachi, Ichiro; Hibino, Taisuke; Hinz, Luc; Itoh, Ryosuke; Katayama, Nobu; Nishida, Shohei; Ronga, Frederic; Tsukamoto, Toshifumi; Yokoyama, Masahiko

    2004-01-01

    We describe the present status of the computing system of the Belle experiment at the KEKB e+e- asymmetric-energy collider. So far, we have logged more than 160 fb^-1 of data, corresponding to the world's largest data sample of 170M BB-bar pairs at the Υ(4S) energy region. A large amount of event data has to be processed to produce analysis event samples in a timely fashion, and Monte Carlo events have to be created to control systematic errors accurately. This requires stable and efficient usage of computing resources. Here we review our computing model and then describe how we efficiently carry out DST/MC production in our system.

  11. Characterizing chemical systems with on-line computers and graphics

    International Nuclear Information System (INIS)

    Frazer, J.W.; Rigdon, L.P.; Brand, H.R.; Pomernacki, C.L.

    1979-01-01

    Incorporating computers and graphics on-line to chemical experiments and processes opens up new opportunities for the study and control of complex systems. Systems having many variables can be characterized even when the variable interactions are nonlinear and the system cannot a priori be represented by numerical methods and models. That is, large sets of accurate data can be rapidly acquired, and modeling and graphic techniques can then be used to obtain partial interpretation and to design further experimentation. The experimenter can thus iterate comparatively quickly between experimentation and modeling to obtain a final solution. We have designed and characterized a versatile computer-controlled apparatus for chemical research, which incorporates on-line instrumentation and graphics. It can be used to determine the mechanism of enzyme-induced reactions or to optimize analytical methods. The apparatus can also be operated as a pilot plant to design control strategies. On-line graphics were used to display both conventional plots used by biochemists and three-dimensional response-surface plots.

  12. Complex Systems: An Introduction

    Indian Academy of Sciences (India)

    Complex Systems: An Introduction - Anthropic Principle, Terrestrial Complexity, Complex Materials. V. K. Wadhawan. General Article, Resonance – Journal of Science Education, Volume 14, Issue 9, September 2009, pp. 894-906.

  13. Petascale Computational Systems

    OpenAIRE

    Bell, Gordon; Gray, Jim; Szalay, Alex

    2007-01-01

    Computational science is becoming data intensive. Supercomputers must be balanced systems: not just CPU farms but also petascale IO and networking arrays. Anyone building cyberinfrastructure should allocate resources to support a balanced Tier-1 through Tier-3 design.

  14. Complex Systems and Dependability

    CERN Document Server

    Zamojski, Wojciech; Sugier, Jaroslaw

    2012-01-01

    A typical contemporary complex system is a multifaceted amalgamation of technical, information, organization, software and human (users, administrators and management) resources. The complexity of such a system comes not only from its involved technical and organizational structure but mainly from the complexity of the information processes that must be implemented in the operational environment (data processing, monitoring, management, etc.). In such cases, traditional methods of reliability analysis, focused mainly on the technical level, are usually insufficient for performance evaluation, and more innovative methods are needed.

  15. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits are described. The methods are implemented in integrated programs for computing short-circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited online memory capacity (M = 4030 for the computer).

  16. Computationally Efficient Power Allocation Algorithm in Multicarrier-Based Cognitive Radio Networks: OFDM and FBMC Systems

    Directory of Open Access Journals (Sweden)

    Shaat Musbah

    2010-01-01

    Cognitive radio (CR) systems have been proposed to increase spectrum utilization by opportunistically accessing unused spectrum. Multicarrier communication systems are promising candidates for CR systems. Due to its high spectral efficiency, filter bank multicarrier (FBMC) can be considered an alternative to conventional orthogonal frequency division multiplexing (OFDM) for transmission over CR networks. This paper addresses the problem of resource allocation in multicarrier-based CR networks. The objective is to maximize the downlink capacity of the network under constraints on both the total power and the interference introduced to the primary users (PUs). The optimal solution has high computational complexity, which makes it unsuitable for practical applications, and hence a low-complexity suboptimal solution is proposed. The proposed algorithm utilizes the spectrum holes in PU bands as well as active PU bands. The performance of the proposed algorithm is investigated for OFDM- and FBMC-based CR systems. Simulation results illustrate that the proposed resource allocation algorithm, with low computational complexity, achieves near-optimal performance and demonstrates the efficiency of using FBMC in the CR context.
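
    The paper's own algorithm is not reproduced here, but the flavor of such low-complexity allocators can be conveyed with a capped water-filling sketch, in which per-subcarrier power caps stand in for the interference limits toward the PUs; all gains, caps and budgets below are made-up values.

        # Capped water-filling: maximize sum(log2(1 + p*g)) subject to
        # sum(p) <= p_total and 0 <= p <= p_cap, by bisecting on the water level.
        import numpy as np

        def capped_waterfill(gain, p_cap, p_total, iters=60):
            lo, hi = 0.0, (1.0 / gain + p_cap).max() + p_total
            for _ in range(iters):
                mu = 0.5 * (lo + hi)                      # trial water level
                p = np.clip(mu - 1.0 / gain, 0.0, p_cap)
                if p.sum() > p_total:
                    hi = mu
                else:
                    lo = mu
            return p

        gain = np.array([2.0, 0.5, 1.2, 3.0])    # channel gains (illustrative)
        p_cap = np.array([0.4, 1.0, 0.1, 0.6])   # interference-driven caps
        p = capped_waterfill(gain, p_cap, p_total=1.0)
        print(p, p.sum())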

  17. Expert-systems and computer-based industrial systems

    International Nuclear Information System (INIS)

    Terrien, J.F.

    1987-01-01

    Framatome makes wide use of expert systems, computer-assisted engineering, production management and personnel training. It has set up separate business units and subsidiaries and also participates in other companies which have the relevant expertise. Five examples of the products and services available in these are discussed. These are in the field of applied artificial intelligence and expert-systems, in integrated computer-aid design and engineering, structural analysis, computer-related products and services and document management systems. The structure of the companies involved and the work they are doing is discussed. (UK)

  18. Evolution of perturbed dynamical systems: analytical computation with time independent accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Gurzadyan, A.V. [Russian-Armenian (Slavonic) University, Department of Mathematics and Mathematical Modelling, Yerevan (Armenia); Kocharyan, A.A. [Monash University, School of Physics and Astronomy, Clayton (Australia)

    2016-12-15

    An analytical method for investigation of the evolution of dynamical systems with independent on time accuracy is developed for perturbed Hamiltonian systems. The error-free estimation using of computer algebra enables the application of the method to complex multi-dimensional Hamiltonian and dissipative systems. It also opens principal opportunities for the qualitative study of chaotic trajectories. The performance of the method is demonstrated on perturbed two-oscillator systems. It can be applied to various non-linear physical and astrophysical systems, e.g. to long-term planetary dynamics. (orig.)

  19. Wireless Mobile Computing and its Links to Descriptive Complexity

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří; Pardubská, D.

    2008-01-01

    Vol. 19, No. 4 (2008), pp. 887-913 ISSN 0129-0541 R&D Projects: GA AV ČR 1ET100300517 Institutional research plan: CEZ:AV0Z10300504 Keywords: alternating Turing machine * simulation * simultaneous time-space complexity * wireless parallel Turing machine Subject RIV: IN - Informatics, Computer Science Impact factor: 0.554, year: 2008

  20. Adaptive generalized combination complex synchronization of uncertain real and complex nonlinear systems

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shi-bing, E-mail: wang-shibing@dlut.edu.cn, E-mail: wangxy@dlut.edu.cn [School of Computer and Information Engineering, Fuyang Normal University, Fuyang 236041 (China); Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian 116024 (China); Wang, Xing-yuan, E-mail: wang-shibing@dlut.edu.cn, E-mail: wangxy@dlut.edu.cn [Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian 116024 (China); Wang, Xiu-you [School of Computer and Information Engineering, Fuyang Normal University, Fuyang 236041 (China); Zhou, Yu-fei [College of Electrical Engineering and Automation, Anhui University, Hefei 230601 (China)

    2016-04-15

    With comprehensive consideration of generalized synchronization, combination synchronization and adaptive control, this paper investigates a novel adaptive generalized combination complex synchronization (AGCCS) scheme for different real and complex nonlinear systems with unknown parameters. On the basis of Lyapunov stability theory and adaptive control, an AGCCS controller and parameter update laws are derived to achieve synchronization and parameter identification of two real drive systems and a complex response system, as well as two complex drive systems and a real response system. Two simulation examples, namely, AGCCS for chaotic real Lorenz and Chen systems driving a hyperchaotic complex system, and hyperchaotic complex Lorenz and Chen systems driving a real chaotic Lü system, are presented to verify the feasibility and effectiveness of the proposed scheme.
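
    A minimal sketch of the underlying drive-response idea, using two real Lorenz systems and an active controller that cancels the nonlinearity mismatch so that the error obeys e' = -K e; the paper's AGCCS scheme adds complex states, scaling functions and adaptive parameter-update laws on top of this.

        # Drive-response synchronization by active control (illustrative only).
        import numpy as np

        def lorenz(s, sigma=10.0, rho=28.0, beta=8.0/3.0):
            x, y, z = s
            return np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])

        dt, K = 1e-3, 5.0
        x = np.array([1.0, 1.0, 1.0])     # drive state
        y = np.array([-8.0, 7.0, 25.0])   # response state

        for _ in range(20000):
            # Controller cancels f(y) - f(x) and adds linear error feedback.
            u = lorenz(x) - lorenz(y) - K * (y - x)
            x = x + dt * lorenz(x)
            y = y + dt * (lorenz(y) + u)

        print("synchronization error:", np.abs(y - x).max())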

  1. Nonlinear Dynamics, Chaotic and Complex Systems

    Science.gov (United States)

    Infeld, E.; Zelazny, R.; Galkowski, A.

    2011-04-01

    Part I. Dynamic Systems Bifurcation Theory and Chaos: 1. Chaos in random dynamical systems V. M. Gunldach; 2. Controlling chaos using embedded unstable periodic orbits: the problem of optimal periodic orbits B. R. Hunt and E. Ott; 3. Chaotic tracer dynamics in open hydrodynamical flows G. Karolyi, A. Pentek, T. Tel and Z. Toroczkai; 4. Homoclinic chaos L. P. Shilnikov; Part II. Spatially Extended Systems: 5. Hydrodynamics of relativistic probability flows I. Bialynicki-Birula; 6. Waves in ionic reaction-diffusion-migration systems P. Hasal, V. Nevoral, I. Schreiber, H. Sevcikova, D. Snita, and M. Marek; 7. Anomalous scaling in turbulence: a field theoretical approach V. Lvov and I. Procaccia; 8. Abelian sandpile cellular automata M. Markosova; 9. Transport in an incompletely chaotic magnetic field F. Spineanu; Part III. Dynamical Chaos Quantum Physics and Foundations Of Statistical Mechanics: 10. Non-equilibrium statistical mechanics and ergodic theory L. A. Bunimovich; 11. Pseudochaos in statistical physics B. Chirikov; 12. Foundations of non-equilibrium statistical mechanics J. P. Dougherty; 13. Thermomechanical particle simulations W. G. Hoover, H. A. Posch, C. H. Dellago, O. Kum, C. G. Hoover, A. J. De Groot and B. L. Holian; 14. Quantum dynamics on a Markov background and irreversibility B. Pavlov; 15. Time chaos and the laws of nature I. Prigogine and D. J. Driebe; 16. Evolutionary Q and cognitive systems: dynamic entropies and predictability of evolutionary processes W. Ebeling; 17. Spatiotemporal chaos information processing in neural networks H. Szu; 18. Phase transitions and learning in neural networks C. Van den Broeck; 19. Synthesis of chaos A. Vanecek and S. Celikovsky; 20. Computational complexity of continuous problems H. Wozniakowski; Part IV. Complex Systems As An Interface Between Natural Sciences and Environmental Social and Economic Sciences: 21. Stochastic differential geometry in finance studies V. G. Makhankov; Part V. Conference Banquet

  2. Management of complex dynamical systems

    Science.gov (United States)

    MacKay, R. S.

    2018-02-01

    Complex dynamical systems are systems with many interdependent components which evolve in time. One might wish to control their trajectories, but a more practical alternative is to control just their statistical behaviour. In many contexts this would be both sufficient and a more realistic goal, e.g. climate and socio-economic systems. I refer to it as ‘management’ of complex dynamical systems. In this paper, some mathematics for management of complex dynamical systems is developed in the weakly dependent regime, and questions are posed for the strongly dependent regime.

  3. 40-Gb/s PAM4 with low-complexity equalizers for next-generation PON systems

    Science.gov (United States)

    Tang, Xizi; Zhou, Ji; Guo, Mengqi; Qi, Jia; Hu, Fan; Qiao, Yaojun; Lu, Yueming

    2018-01-01

    In this paper, we demonstrate 40-Gb/s four-level pulse amplitude modulation (PAM4) transmission with 10 GHz devices and low-complexity equalizers for next-generation passive optical network (PON) systems. A simple feed-forward equalizer (FFE) and decision feedback equalizer (DFE) enable 20 km fiber transmission, while a high-complexity Volterra algorithm in combination with FFE and DFE can extend the transmission distance to 40 km. A simplified Volterra algorithm is proposed to reduce computational complexity. Simulation results show that the simplified Volterra algorithm removes up to ∼75% of the computational complexity at the relatively low cost of only 0.4 dB of power budget. At a forward error correction (FEC) threshold of 10^-3, we achieve 31.2 dB and 30.8 dB power budget over 40 km fiber transmission using the traditional FFE-DFE-Volterra and our simplified FFE-DFE-Volterra, respectively.
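
    As an illustration of the FFE/DFE building blocks (not the paper's exact receiver), here is an LMS-adapted FFE+DFE trained with known symbols on a toy PAM4 link with artificial intersymbol interference; the channel, tap counts and step size are arbitrary assumptions.

        # Data-aided LMS adaptation of a feed-forward + decision-feedback equalizer.
        import numpy as np

        rng = np.random.default_rng(0)
        levels = np.array([-3.0, -1.0, 1.0, 3.0])         # PAM4 alphabet
        sym = rng.choice(levels, size=20000)
        chan = np.array([1.0, 0.45, 0.2])                 # toy ISI channel
        rx = np.convolve(sym, chan)[:sym.size] + 0.05 * rng.standard_normal(sym.size)

        n_ff, n_fb, mu = 7, 3, 1e-3
        w_ff, w_fb = np.zeros(n_ff), np.zeros(n_fb)
        w_ff[0] = 1.0
        err2 = []
        for n in range(n_ff, sym.size):
            x = rx[n - n_ff + 1:n + 1][::-1]   # FFE input, most recent first
            past = sym[n - n_fb:n][::-1]       # fed-back (training) symbols
            y = w_ff @ x - w_fb @ past         # equalizer output
            e = sym[n] - y                     # training error
            w_ff += mu * e * x                 # LMS updates
            w_fb -= mu * e * past
            err2.append(e * e)

        print("late-stage MSE:", np.mean(err2[-2000:]))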

  4. Computational Design and Experimental Validation of New Thermal Barrier Systems

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Shengmin; Yang, Shizhong; Khosravi, Ebrahim

    2011-12-31

    This project (10/01/2010-9/30/2013), "Computational Design and Experimental Validation of New Thermal Barrier Systems", originates from the Louisiana State University (LSU) Mechanical Engineering Department and the Southern University (SU) Department of Computer Science. It directly supports the technical goals specified in DE-FOA-0000248, Topic Area 3: Turbine Materials, by addressing key technologies needed to enable the development of advanced turbines and turbine-based systems that will operate safely and efficiently using coal-derived synthesis gases. We will develop novel molecular dynamics methods to improve the efficiency of simulation of novel thermal barrier coating (TBC) materials; we will perform high performance computing (HPC) on complex TBC structures to screen the most promising TBC compositions; we will perform material characterizations and oxidation/corrosion tests; and we will demonstrate our new TBC systems experimentally under integrated gasification combined cycle (IGCC) environments. The durability of the coating will be examined using the proposed high temperature/high pressure durability test rig under real syngas product compositions.

  5. Computer-assisted learning and simulation systems in dentistry--a challenge to society.

    Science.gov (United States)

    Welk, A; Splieth, Ch; Wierinck, E; Gilpatrick, R O; Meyer, G

    2006-07-01

    Computer technology is increasingly used in practical training at universities. However, in spite of their potential, computer-assisted learning (CAL) and computer-assisted simulation (CAS) systems still appear to be underutilized in dental education. Advantages, challenges, problems, and solutions of computer-assisted learning and simulation in dentistry are discussed by means of MEDLINE, open Internet platform searches, and key results of a study among German dental schools. The advantages of computer-assisted learning are seen for example in self-paced and self-directed learning and increased motivation. It is useful for both objective theoretical and practical tests and for training students to handle complex cases. CAL can lead to more structured learning and can support training in evidence-based decision-making. The reasons for the still relatively rare implementation of CAL/CAS systems in dental education include an inability to finance, lack of studies of CAL/CAS, and too much effort required to integrate CAL/CAS systems into the curriculum. To overcome the reasons for the relative low degree of computer technology use, we should strive for multicenter research and development projects monitored by the appropriate national and international scientific societies, so that the potential of computer technology can be fully realized in graduate, postgraduate, and continuing dental education.

  6. Usage of super high speed computer for clarification of complex phenomena

    International Nuclear Information System (INIS)

    Sekiguchi, Tomotsugu; Sato, Mitsuhisa; Nakata, Hideki; Tatebe, Osami; Takagi, Hiromitsu

    1999-01-01

    This study aims at constructing an efficient application environment for super-high-speed computers, suited to parallel distributed systems and easily portable across computer systems of different types and scales, by conducting research and development on the super-high-speed computing technology required for the elucidation of complicated phenomena in the nuclear power field by computational science methods. To realize such an environment, the Electrotechnical Laboratory has developed Ninf, a network numerical information library. The Ninf system can provide a global network infrastructure for worldwide computing with high performance over a wide-area distributed network. (G.K.)

  7. Estimation of accuracy of the dimensional characteristics of complex objects with kinect systems

    Directory of Open Access Journals (Sweden)

    Елена Юрьевна Мураховская-Печенежская

    2015-03-01

    The article deals with methods for determining the size of objects with complex shapes using Kinect computer vision systems. The developed methods make it possible to determine the real shapes and sizes of arbitrary sections based on analysis of the point fields obtained by Kinect systems. Experimental studies on determining the shape of the human body showed differences not exceeding 2.5%.

  8. Optimal sensor configuration for complex systems

    DEFF Research Database (Denmark)

    Sadegh, Payman; Spall, J. C.

    1998-01-01

    Considers the problem of sensor configuration for complex systems. Our approach involves definition of an appropriate optimality criterion or performance measure, and description of an efficient and practical algorithm for achieving the optimality objective. The criterion for optimal sensor configuration is based on maximizing the overall sensor response while minimizing the correlation among the sensor outputs. The procedure for sensor configuration is based on simultaneous perturbation stochastic approximation (SPSA). SPSA avoids the need for detailed modeling of the sensor response by simply relying on observed responses as obtained by limited experimentation with test sensor configurations. We illustrate the approach with the optimal placement of acoustic sensors for signal detection in structures. This includes both a computer simulation study for an aluminum plate, and real...
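
    A bare-bones SPSA loop shows why the method suits this setting: each iteration needs only two noisy evaluations of the performance measure, regardless of the number of configuration parameters. The quadratic objective below is a toy stand-in for a real sensor-placement measure, and the gain constants are illustrative.

        # Simultaneous perturbation stochastic approximation (minimization).
        import numpy as np

        def spsa(loss, theta, a=0.1, c=0.1, alpha=0.602, gamma=0.101,
                 n_iter=500, rng=np.random.default_rng(1)):
            for k in range(1, n_iter + 1):
                ak, ck = a / k**alpha, c / k**gamma
                delta = rng.choice([-1.0, 1.0], size=theta.size)  # Bernoulli +/-1
                # Two-sided gradient estimate from just two loss evaluations.
                g_hat = (loss(theta + ck*delta) - loss(theta - ck*delta)) / (2.0 * ck * delta)
                theta = theta - ak * g_hat
            return theta

        rng = np.random.default_rng(2)
        loss = lambda th: np.sum((th - 1.5)**2) + 0.01 * rng.standard_normal()
        print(spsa(loss, theta=np.zeros(4)))   # converges near [1.5, 1.5, 1.5, 1.5]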

  9. Complexity and Dynamical Depth

    Directory of Open Access Journals (Sweden)

    Terrence Deacon

    2014-07-01

    We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information theoretic, or thermodynamic conceptions of structural complexity. What we call a system's dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.

  10. Reconfigurable Computing Platforms and Target System Architectures for Automatic HW/SW Compilation

    OpenAIRE

    Lange, Holger

    2011-01-01

    Embedded systems found their way into all areas of technology and everyday life, from transport systems, facility management, health care, to hand-held computers and cell phones as well as television sets and electric cookers. Modern fabrication techniques enable the integration of such complex sophisticated systems on a single chip (System-on-Chip, SoC). In many cases, a high processing power is required at predetermined, often limited energy budgets. To adjust the processing power even more...

  11. Control of complex systems

    CERN Document Server

    Albertos, Pedro; Blanke, Mogens; Isidori, Alberto; Schaufelberger, Walter; Sanz, Ricardo

    2001-01-01

    The world of artificial systems is reaching complexity levels that escape human understanding. Surface traffic, electricity distribution, airplanes, mobile communications, etc., are examples that demonstrate that we are running into problems that are beyond classical scientific or engineering knowledge. There is an ongoing world-wide effort to understand these systems and develop models that can capture their behavior. The reason for this work is clear: if our lack of understanding deepens, we will lose our capability to control these systems and make them behave as we want. Researchers from many different fields are trying to understand and develop theories for complex man-made systems. This book presents research from the perspective of control and systems theory. The book has grown out of activities in the research program Control of Complex Systems (COSY). The program has been sponsored by the European Science Foundation (ESF) which for 25 years has been one of the leading players in stimula...

  12. Synchronization in node of complex networks consist of complex chaotic system

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Qiang, E-mail: qiangweibeihua@163.com [Beihua University computer and technology College, BeiHua University, Jilin, 132021, Jilin (China); Digital Images Processing Institute of Beihua University, BeiHua University, Jilin, 132011, Jilin (China); Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024 (China); Xie, Cheng-jun [Beihua University computer and technology College, BeiHua University, Jilin, 132021, Jilin (China); Digital Images Processing Institute of Beihua University, BeiHua University, Jilin, 132011, Jilin (China); Liu, Hong-jun [School of Information Engineering, Weifang Vocational College, Weifang, 261041 (China); Li, Yan-hui [The Library, Weifang Vocational College, Weifang, 261041 (China)

    2014-07-15

    A new synchronization method is investigated for nodes of complex networks consisting of complex chaotic systems. When complex networks achieve synchronization, different components of the complex state variable synchronize up to different complex scaling functions under a designed complex feedback controller. This paper extends the synchronization scaling function from the real field to the complex field for synchronization in nodes of complex networks with complex chaotic systems. Synchronization in complex networks with constant coupling delay and with time-varying coupling delay is investigated, respectively. Numerical simulations are provided to show the effectiveness of the proposed method.

  13. Cloud Computing Platform for an Online Model Library System

    Directory of Open Access Journals (Sweden)

    Mingang Chen

    2013-01-01

    The rapid development of the digital content industry calls for online model libraries. For efficiency, user experience, and reliability of the model library, this paper designs a Web 3D model library system based on a cloud computing platform. To handle complex models, which cause difficulties in real-time 3D interaction, we adopt model simplification and size-adaptive adjustment methods to make interaction with the system more efficient. Meanwhile, a cloud-based architecture is developed to ensure the reliability and scalability of the system. The 3D model library system is intended to be accessible by online users with a good interactive experience. The feasibility of the solution has been tested by experiments.

  14. Models, methods and software tools for building complex adaptive traffic systems

    International Nuclear Information System (INIS)

    Alyushin, S.A.

    2011-01-01

    The paper studies modern methods and tools for simulating the behavior of complex adaptive systems (CAS) and the existing traffic-modeling systems in simulators and their characteristics, and proposes requirements for assessing the suitability of a system to simulate CAS behavior in simulators. The author has developed a model of adaptive agent representation and its functioning environment to meet the requirements set above, and presents methods for agents' interactions and for conflict resolution in simulated traffic situations. A simulation system realizing computer modeling of CAS behavior in traffic situations has been created [ru

  15. Power, autonomy, utopia new approaches toward complex systems

    CERN Document Server

    1986-01-01

    The "world" is becoming more and more intractable. We have learned to discern "systems" in it, we have developed a highly sophisticated math­ ematical apparatus to "model'" them, large computer simulation programs handle thousands of equations with zillions of parameters. But how ade­ quate are these efforts? Part One of this volume is a discussion containing some proposals for eliminating the constraints we encounter when approaching complex systems with our models: Is it possible, at all, to design a political or econom­ ic system without considering killing, torture, and oppression? Can we adequately model the present state of affairs while ignoring their often symbolic and paradoxical nature? Is it possible to explain teleological concepts such as "means" and "ends" in terms of basically 17th century Newtonian mechanics? Can we really make appropriate use of the vast a­ mount of systems concepts without exploring their relations, without de­ veloping a "system of systems concepts"? And why do more th...

  16. Using multi-criteria analysis of simulation models to understand complex biological systems

    Science.gov (United States)

    Maureen C. Kennedy; E. David. Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...

  17. Computation of Ground-State Properties in Molecular Systems: Back-Propagation with Auxiliary-Field Quantum Monte Carlo.

    Science.gov (United States)

    Motta, Mario; Zhang, Shiwei

    2017-11-14

    We address the computation of ground-state properties of chemical systems and realistic materials within the auxiliary-field quantum Monte Carlo method. The phase constraint to control the Fermion phase problem requires the random walks in Slater determinant space to be open-ended with branching. This in turn makes it necessary to use back-propagation (BP) to compute averages and correlation functions of operators that do not commute with the Hamiltonian. Several BP schemes are investigated, and their optimization with respect to the phaseless constraint is considered. We propose a modified BP method for the computation of observables in electronic systems, discuss its numerical stability and computational complexity, and assess its performance by computing ground-state properties in several molecular systems, including small organic molecules.
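
    Schematically, in standard AFQMC notation (a sketch of the usual definitions, not taken verbatim from the paper): the mixed estimator is unbiased only for observables commuting with the Hamiltonian, which is what motivates back-propagation,

        \langle \hat{O} \rangle_{\mathrm{mixed}} = \frac{\langle \Psi_T | \hat{O} | \Psi \rangle}{\langle \Psi_T | \Psi \rangle},
        \qquad
        \langle \hat{O} \rangle_{\mathrm{BP}} \approx
        \frac{\sum_w W_w \, \langle \Psi_T | \hat{B}_w(\tau_{\mathrm{BP}}) \, \hat{O} | \phi_w \rangle}
             {\sum_w W_w \, \langle \Psi_T | \hat{B}_w(\tau_{\mathrm{BP}}) | \phi_w \rangle},

    where |\phi_w\rangle is walker w's Slater determinant, W_w its weight, and \hat{B}_w(\tau_{\mathrm{BP}}) the product of sampled auxiliary-field propagators accumulated over the back-propagation interval \tau_{\mathrm{BP}}.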

  18. Forecasting in Complex Systems

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2014-12-01

    Complex nonlinear systems are typically characterized by many degrees of freedom, as well as interactions between the elements. Interesting examples can be found in the areas of earthquakes and finance. In these two systems, fat tails play an important role in the statistical dynamics. For earthquake systems, the Gutenberg-Richter magnitude-frequency relation applies, whereas daily returns of securities in the financial markets are known to be characterized by leptokurtic statistics in which the tails are power law. Very large fluctuations are present in both systems. In earthquake systems, one has the example of great earthquakes such as the M9.1, March 11, 2011 Tohoku event. In financial systems, one has the example of the market crash of October 19, 1987. Both were largely unexpected events that severely impacted the earth and financial systems systemically. Other examples include the M9.3 Andaman earthquake of December 26, 2004, and the Great Recession, which began with the fall of the Lehman Brothers investment bank on September 15, 2008. Forecasting the occurrence of these damaging events has great societal importance. In recent years, national funding agencies in a variety of countries have emphasized the importance of societal relevance in research, and in particular, the goal of improved forecasting technology. Previous work has shown that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model. These metastable systems are characterized by fat tail statistics near the classical spinodal. Correlations in these systems can grow and recede, but do not imply causation, a common source of misunderstanding. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In this talk, we describe the basic phenomenology of these systems and emphasize their similarities and differences. We also consider the problem of forecast validation and verification.
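
    For reference, the Gutenberg-Richter relation mentioned here states that event counts decay exponentially with magnitude (so energy release is fat-tailed), with b \approx 1 for most catalogs:

        \log_{10} N(\geq M) = a - b M,

    where N(\geq M) is the number of events with magnitude at least M.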

  19. Information and Self-Organization A Macroscopic Approach to Complex Systems

    CERN Document Server

    Haken, Hermann

    2006-01-01

    This book presents the concepts needed to deal with self-organizing complex systems from a unifying point of view that uses macroscopic data. The various meanings of the concept "information" are discussed and a general formulation of the maximum information (entropy) principle is used. With the aid of results from synergetics, adequate objective constraints for a large class of self-organizing systems are formulated and examples are given from physics, life and computer science. The relationship to chaos theory is examined and it is further shown that, based on possibly scarce and noisy data, unbiased guesses about processes of complex systems can be made and the underlying deterministic and random forces determined. This allows for probabilistic predictions of processes, with applications to numerous fields in science, technology, medicine and economics. The extensions of the third edition are essentially devoted to an introduction to the meaning of information in the quantum context. Indeed, quantum inform...
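
    The maximum information entropy principle used throughout the book has the standard form: among all distributions consistent with the measured macroscopic constraints \sum_i p_i f_k(i) = \langle f_k \rangle and \sum_i p_i = 1, choose the one maximizing S = -\sum_i p_i \ln p_i, which yields the Gibbs-type distribution

        p_i = \frac{1}{Z} \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big),
        \qquad
        Z = \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big),

    with the Lagrange multipliers \lambda_k fixed by the constraints.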

  20. Robust and High Order Computational Method for Parachute and Air Delivery and MAV System

    Science.gov (United States)

    2017-11-01

    numerical algorithms and develop a computational platform for the study of the dynamic system involving a highly complex geometric interface immersed in... students in their summer internship. Results dissemination: our research project has produced two publications in the Journal of Fluids and Structures, one publication in the AIAA Journal, one in Communications in Computational Physics, along with several related publications in other journals. Two other

  1. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  2. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  3. Reduction of Subjective and Objective System Complexity

    Science.gov (United States)

    Watson, Michael D.

    2015-01-01

    Occam's razor is often used in science to define the minimum criteria to establish a physical or philosophical idea or relationship. Albert Einstein is attributed the saying "everything should be made as simple as possible, but not simpler". These heuristic ideas are based on a belief that there is a minimum state or set of states for a given system or phenomenon. In looking at system complexity, these heuristics point us to the idea that complexity can be reduced to a minimum. How, then, do we approach a reduction in complexity? Complexity has been described as both a subjective concept and an objective measure of a system. Subjective complexity is based on human cognitive comprehension of the functions and interrelationships of a system, and is defined by the ability to fully comprehend the system. Simplifying complexity, in a subjective sense, is thus gaining a deeper understanding of the system. As Apple's Jonathan Ive has stated, "It's not just minimalism or the absence of clutter. It involves digging through the depth of complexity. To be truly simple, you have to go really deep". Simplicity is not the absence of complexity but a deeper understanding of it. Subjective complexity, based on this human comprehension, cannot then be discerned from the sociological concept of ignorance. The inability to comprehend a system can be either a lack of knowledge, an inability to understand the intricacies of the system, or both. Reduction in this sense is based purely on a cognitive ability to understand the system, and no system then may be truly complex. From this view, education and experience seem to be the keys to reducing or eliminating complexity. Objective complexity is the measure of the system's functions and interrelationships which exist independent of human comprehension. Jonathan Ive's statement does not say that complexity is removed, only that it is understood. From this standpoint, reduction of complexity can be approached

  4. Computation of infinite dilute activity coefficients of binary liquid alloys using complex formation model

    Energy Technology Data Exchange (ETDEWEB)

    Awe, O.E., E-mail: draweoe2004@yahoo.com; Oshakuade, O.M.

    2016-04-15

    A new method for calculating infinite dilution activity coefficients (γ^∞s) of binary liquid alloys has been developed. The method computes γ^∞s from experimental thermodynamic integral free energy of mixing data using the complex formation model. The new method was first used to theoretically compute the γ^∞s of 10 binary alloys whose γ^∞s have been determined by experiments. The significant agreement between the computed values and the available experimental values served as impetus for applying the new method to 22 selected binary liquid alloys whose γ^∞s are either nonexistent or incomplete. In order to verify the reliability of the computed γ^∞s of the 22 selected alloys, we recomputed the γ^∞s using three other existing methods of computing or estimating γ^∞s and then used the γ^∞s obtained from each of the four methods (the new method inclusive) to compute the thermodynamic activities of the components of each binary system. The computed activities were compared with available experimental activities. It is observed that the results from the proposed method, in most of the selected alloys, show better agreement with the experimental activity data. Thus, the new method is an alternative and, in certain instances, more reliable approach for computing γ^∞s of binary liquid alloys.
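
    For orientation, the standard thermodynamic definitions involved (independent of the paper's complex formation model): the activity coefficient links activity to composition, and its infinite dilution value is the limit as the component vanishes,

        a_i = \gamma_i x_i,
        \qquad
        RT \ln \gamma_i = \left. \frac{\partial (n G^E)}{\partial n_i} \right|_{T,P,n_{j \neq i}},
        \qquad
        \gamma_i^{\infty} = \lim_{x_i \to 0} \gamma_i,

    where the excess Gibbs energy G^E = \Delta G_{\mathrm{mix}} - RT \sum_i x_i \ln x_i is what the experimental integral free energy of mixing data provide.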

  5. Computer system operation

    International Nuclear Information System (INIS)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A.

    1993-12-01

    The report describes the operation and troubleshooting of the main computers and KAERINet. The results of the project are as follows: 1. Operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. Operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. Development of applications: the Electronic Document Approval and Delivery System, and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  6. Optimized design of embedded DSP system hardware supporting complex algorithms

    Science.gov (United States)

    Li, Yanhua; Wang, Xiangjun; Zhou, Xinling

    2003-09-01

    The paper presents an optimized design method for a flexible and economical embedded DSP system that can implement complex processing algorithms such as biometric recognition and real-time image processing. It consists of a floating-point DSP, 512 Kbytes of data RAM, 1 Mbyte of FLASH program memory, a CPLD for flexible logic control of the input channel, and an RS-485 transceiver for local network communication. Because the design employs a DSP with a high performance-price ratio (TMS320C6712) and a large FLASH, the system permits loading and performing complex algorithms with little algorithm optimization and code reduction. The CPLD provides flexible logic control for the whole DSP board, especially the input channel, and allows convenient interfacing between different sensors and the DSP system. The transceiver circuit transfers data between the DSP and a host computer. The paper also introduces some key technologies that make the whole system work efficiently. Owing to the characteristics referred to above, the hardware is a perfect platform for multi-channel data collection, image processing, and other signal processing with high performance and adaptability. The application section of the paper presents how this hardware is adapted for a biometric identification system with high identification precision. The results reveal that this hardware easily interfaces with a CMOS imager and is capable of carrying out complex biometric identification algorithms, which require real-time processing.

  7. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  8. Increase of Organization in Complex Systems

    OpenAIRE

    Georgiev, Georgi Yordanov; Daly, Michael; Gombos, Erin; Vinod, Amrit; Hoonjan, Gajinder

    2013-01-01

    Measures of complexity and entropy have not converged to a single quantitative description of levels of organization of complex systems. The need for such a measure is increasingly necessary in all disciplines studying complex systems. To address this problem, starting from the most fundamental principle in Physics, here a new measure for quantity of organization and rate of self-organization in complex systems based on the principle of least (stationary) action is applied to a model system -...

  9. Quality plan and configuration management in complex systems

    International Nuclear Information System (INIS)

    Gonzalez Junto, J.; Merchan Teyssiere

    1993-01-01

    Since the Second World War, the philosophy behind the quality systems of industries and service companies has evolved to embrace the whole life cycle of the product, system or service. In this evolution, quality has become a strategic factor in the survival of enterprises. The first steps in establishing quality systems were taken for the armed forces, followed by space, aeronautical and nuclear projects, whose products were increasingly complex and sophisticated. These systems were established by means of quality plans or programmes, and their basic objective was to guarantee a high safety level for the user and/or the general population. In later years, the main concern was to reach a given quality level not only in one phase of the product's life, but over the complete life cycle of the final product. Today a new goal is established and pursued: better quality of the product, service or system life cycle at a lower cost. Methods of improving the quality of systems and processes are the subject of numerous initiatives and studies aimed at better availability and maintainability of complex equipment and installations with extended useful lives and greater requirements. Experience in the performance of complex projects shows that higher quality may be obtained by designing a comprehensive quality plan which pays special attention to information management and to modifications of the original design. Obtaining a high reliability level for an installation (equipment, systems, etc.), increasing its availability and rationalizing its maintenance may be little short of fanciful without a deep knowledge of the installation, of its activities and of its current status in day-to-day operation, which shows the importance of reliable information that is available to operators and corresponds exactly to their needs. In this spirit, a quality plan comprising a configuration management system for information and documents constitutes the basic support tool for

  10. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  11. Planning and Building Qualifiable Embedded Systems: Safety and Risk Properties Assessment for a Large and Complex System with Embedded Subsystems

    Science.gov (United States)

    Silva, N.; Lopes, R.; Barbosa, R.

    2012-01-01

    Systems based on embedded components and applications are today used in all markets. They are planned and developed by all types of institutions with different types of background experience, multidisciplinary teams and all types of capability and maturity levels. Organisational and engineering maturity has an impact on all aspects of the engineering of large and complex systems. An embedded system is a specific computer system designed to perform one or more dedicated functions, usually with real-time constraints. It is generally integrated as part of a more complex device typically composed of specific hardware such as sensors and actuators. This article presents a tried-and-tested technique to evaluate the organisation, processes, system and software engineering practices, methods, tools and the planned/produced artefacts themselves, leading towards certification/qualification. The safety and risk assessment of such core and complex systems is explained and described in a step-by-step manner, along with the main results and conclusions of applying the technique to a real case study.

  12. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  13. Anti-synchronization between different chaotic complex systems

    International Nuclear Information System (INIS)

    Liu Ping; Liu Shutang

    2011-01-01

    Many studies on the anti-synchronization of nonlinear real dynamic systems have been carried out, whereas the anti-synchronization of chaotic complex systems has not been studied extensively. In this work, the anti-synchronization between a new chaotic complex system and a complex Lorenz system, and that between a new chaotic complex system and a complex Lü system, were separately investigated by active control and nonlinear control methods, and explicit expressions were derived for the controllers used to achieve the anti-synchronization of the chaotic complex systems. These expressions were tested numerically and excellent agreement was found. Concerning the new chaotic complex system, we discuss its dynamical properties, including dissipation, chaotic behavior, fixed points, and their stability and invariance.

  14. Complexity of Economical Systems

    Directory of Open Access Journals (Sweden)

    G. P. Pavlos

    2015-01-01

    In this study new theoretical concepts are described concerning the interpretation of economic complex dynamics. In addition, a summary of an extended algorithm of nonlinear time series analysis is provided, which is applied not only to economic time series but also to other physical complex systems (e.g. [22, 24]). In general, an economy is a vast and complicated set of arrangements and actions wherein agents—consumers, firms, banks, investors, government agencies—buy and sell, speculate, trade, oversee, bring products into being, offer services, invest in companies, strategize, explore, forecast, compete, learn, innovate, and adapt. As a result, economic and financial variables such as foreign exchange rates, gross domestic product, interest rates, production, stock market prices and unemployment exhibit the large-amplitude and aperiodic fluctuations evident in complex systems. Thus, economics can be considered a spatially distributed non-equilibrium complex system, for which new theoretical concepts, such as Tsallis non-extensive statistical mechanics and strange dynamics, percolation, non-Gaussian, multifractal and multiscale dynamics related to fractional Langevin equations, can be used for modeling and understanding economic complexity locally or globally.
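
    The Tsallis non-extensive entropy invoked here generalizes the Boltzmann-Gibbs form through an index q that captures long-range correlations and fat tails, recovering the usual entropy as q -> 1:

        S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
        \qquad
        \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i .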

  16. Experiment Dashboard for Monitoring of the LHC Distributed Computing Systems

    International Nuclear Information System (INIS)

    Andreeva, J; Campos, M Devesas; Cros, J Tarragon; Gaidioz, B; Karavakis, E; Kokoszkiewicz, L; Lanciotti, E; Maier, G; Ollivier, W; Nowotka, M; Rocha, R; Sadykov, T; Saiz, P; Sargsyan, L; Sidorova, I; Tuckett, D

    2011-01-01

    The LHC experiments are currently taking collision data. The distributed computing model chosen by the four main LHC experiments allows physicists to benefit from resources spread all over the world. The distributed model and the scale of LHC computing activities increase the complexity of the middleware, and also the chances of failures or inefficiencies in the components involved. In order to ensure the required performance and functionality of the LHC computing system, monitoring the status of the distributed sites and services, as well as the LHC computing activities themselves, is a key factor. Over the last years, the Experiment Dashboard team has been working on a number of applications that facilitate the monitoring of different activities, including the follow-up of jobs and transfers and the availability of sites and services. This presentation describes the Experiment Dashboard applications used by the LHC experiments and the experience gained during the first months of data taking.

  17. Process computer system for the prototype ATR 'Fugen'

    International Nuclear Information System (INIS)

    Oteru, Shigeru

    1979-01-01

    In recent nuclear power plants, computers are regarded as standard plant equipment, and data processing, plant monitoring and performance calculation tend to be carried out by a single on-line computer. As plants become larger and more complex, and operational conditions become more stringent, systems are being introduced that perform performance calculations and immediately reflect the results in operation. In the process computer for the prototype ATR ''Fugen'', a prediction function was provided, in addition to the functions of data processing, plant monitoring and detailed performance calculation, to simulate the post-operation state before any operation that changes core reactivity, such as control rod movement or the control of liquid poison during operation. The core periodic monitoring program, core operational aid program, core any-time data collecting program and core periodic data collecting program, and their application programs, are explained. The core performance calculation comprises the calculation of the thermal output distribution in the core, the various accompanying characteristics, and the monitoring of thermal limiting values. The computer used is a Hitachi HIDIC-500 control computer, to which typewriters, a process color display, an operating console and other peripheral equipment are connected. (Kako, I.)

  18. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

    This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view and one would have expected more about the famous Domany-Kinzel model (and more accurate citation!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an updated edition in, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  19. Model-Based Development and Evaluation of Control for Complex Multi-Domain Systems

    DEFF Research Database (Denmark)

    Grujic, Ivan; Nilsson, Rene

    A Cyber-Physical System (CPS) incorporates sensing, actuating, computing and communicative capabilities, which are often combined to control the system. The development of CPSs poses a challenge, since the complexity of the physical system dynamics must be taken into account when designing...... Unmanned Aerial Vehicle (UAV) has been constructed and used to develop an attitude controller based on Model Predictive Control (MPC). The MPC controller has been compared to an existing open source Proportional Integral Derivative (PID) attitude controller. This thesis contributes to the discipline...
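
    The record is truncated, but the PID baseline it mentions is standard. Below is a minimal sketch of a single-axis PID attitude loop acting on a double-integrator rigid-body model; the gains and model are illustrative assumptions, not values from the thesis.

        # Single-axis PID attitude regulation on a double-integrator model (sketch).
        dt, steps = 0.01, 500
        Kp, Ki, Kd = 8.0, 0.5, 4.0        # hand-tuned gains (assumed)
        theta, omega = 0.3, 0.0           # initial attitude error [rad] and rate
        integral, prev_err = 0.0, -theta

        for _ in range(steps):
            err = 0.0 - theta             # regulate attitude to zero
            integral += err * dt
            deriv = (err - prev_err) / dt
            torque = Kp * err + Ki * integral + Kd * deriv
            prev_err = err
            omega += torque * dt          # torque -> angular acceleration (J = 1)
            theta += omega * dt

        print(f"final attitude error: {theta:.4f} rad")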

  20. Optimal control of complex atomic quantum systems.

    Science.gov (United States)

    van Frank, S; Bonneau, M; Schmiedmayer, J; Hild, S; Gross, C; Cheneau, M; Bloch, I; Pichler, T; Negretti, A; Calarco, T; Montangero, S

    2016-10-11

    Quantum technologies will ultimately require manipulating many-body quantum systems with high precision. Cold atom experiments represent a stepping stone in that direction: a high degree of control has been achieved on systems of increasing complexity. However, this control is still sub-optimal. In many scenarios, achieving a fast transformation is crucial to fight against decoherence and imperfection effects. Optimal control theory is believed to be the ideal candidate to bridge the gap between early stage proof-of-principle demonstrations and experimental protocols suitable for practical applications. Indeed, it can engineer protocols at the quantum speed limit - the fastest achievable timescale of the transformation. Here, we demonstrate such potential by computing theoretically and verifying experimentally the optimal transformations in two very different interacting systems: the coherent manipulation of motional states of an atomic Bose-Einstein condensate and the crossing of a quantum phase transition in small systems of cold atoms in optical lattices. We also show that such processes are robust with respect to perturbations, including temperature and atom number fluctuations.

  1. A new efficient algorithm for computing the imprecise reliability of monotone systems

    International Nuclear Information System (INIS)

    Utkin, Lev V.

    2004-01-01

    Reliability analysis of complex systems under partial information about the reliability of components, and under different conditions of component independence, may be carried out by means of imprecise probability theory, which provides a unified framework (natural extension, lower and upper previsions) for computing system reliability. However, the application of imprecise probabilities to reliability analysis is hindered by the complexity of the optimization problems that must be solved to obtain the system reliability measures. Therefore, an efficient simplified algorithm to solve and decompose these optimization problems is proposed in the paper. This algorithm makes it practical to carry out reliability analysis of monotone systems under partial and heterogeneous information about the reliability of components, and under conditions of component independence or a lack of information about independence. A numerical example illustrates the algorithm
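
    A hedged illustration of why monotone structure helps (this is not Utkin's algorithm): for a coherent system with independent components, the system reliability function is nondecreasing in each component reliability, so interval-valued component data propagate endpoint-wise.

        # Endpoint propagation of interval reliabilities through a monotone system.
        def series(*ps):
            out = 1.0
            for p in ps:
                out *= p
            return out

        def parallel(*ps):
            out = 1.0
            for p in ps:
                out *= 1.0 - p
            return 1.0 - out

        lo = {1: 0.90, 2: 0.70, 3: 0.60}   # lower component reliabilities (assumed)
        hi = {1: 0.99, 2: 0.85, 3: 0.80}   # upper component reliabilities (assumed)

        # structure: component 1 in series with (2 parallel 3)
        h = lambda p: series(p[1], parallel(p[2], p[3]))
        print("system reliability in [%.4f, %.4f]" % (h(lo), h(hi)))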

  2. Complex logistics audit system

    Directory of Open Access Journals (Sweden)

    Zuzana Marková

    2010-02-01

    A complex logistics audit system is a tool for carrying out a logistics audit in a company. Current methods for logistics audits are based on ad hoc analysis of the logistics system. This paper describes a system for a complex logistics audit: a global diagnostics of the logistics processes and functions of an enterprise. The goal of a logistics audit is to provide management with comparative documentation about the state of logistics in the company and to show the potential of logistics changes for achieving more effective company performance.

  3. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words 'systems analysis' are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst techniques developed in the artificial intelligence field have considerable relevance to constructing automata able to adapt to change in the environment, the study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  4. Equation-free and variable-free modeling for complex/multiscale systems. Coarse-grained computation in science and engineering using fine-grained models

    Energy Technology Data Exchange (ETDEWEB)

    Kevrekidis, Ioannis G. [Princeton Univ., NJ (United States)

    2017-02-01

    The work explored the linking of modern developing machine learning techniques (manifold learning and in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments - in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
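
    For flavor, here is a minimal diffusion-map sketch on toy data (a noisy circle); the kernel bandwidth and data are assumptions, and production use would involve careful scaling and sparse eigensolvers.

        # Diffusion maps in a few lines: Gaussian kernel -> Markov matrix -> spectrum.
        import numpy as np

        rng = np.random.default_rng(0)
        t = rng.uniform(0, 2 * np.pi, 300)
        X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.standard_normal((300, 2))

        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
        P = np.exp(-d2 / 0.1)                                # kernel, eps = 0.1 (assumed)
        P /= P.sum(axis=1, keepdims=True)                    # row-stochastic Markov matrix
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        psi = vecs.real[:, order[1:3]]       # first nontrivial diffusion coordinates
        print("leading eigenvalues:", vals.real[order[:3]].round(4))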

  5. Reduced-Complexity Direction of Arrival Estimation Using Real-Valued Computation with Arbitrary Array Configurations

    Directory of Open Access Journals (Sweden)

    Feng-Gang Yan

    2018-01-01

    A low-complexity algorithm is presented that dramatically reduces the complexity of the multiple signal classification (MUSIC) algorithm for direction of arrival (DOA) estimation, in which both the eigenvalue decomposition (EVD) and the spectral search are implemented with efficient real-valued computations, leading to about 75% complexity reduction compared to standard MUSIC. Furthermore, the proposed technique has no dependence on array configuration and is hence suitable for arbitrary array geometries, a significant implementation advantage over most state-of-the-art unitary estimators, including unitary MUSIC (U-MUSIC). Numerical simulations over a wide range of scenarios show the performance of the new technique, demonstrating that, with a significantly reduced computational complexity, the new approach provides accuracy close to that of standard MUSIC.
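
    For orientation, the sketch below implements the standard complex-valued MUSIC spectrum for a uniform linear array; the paper's contribution is a real-valued reformulation of exactly these EVD and spectral-search steps. Array size, noise level and source angles are assumptions.

        # Standard MUSIC pseudospectrum for a ULA (complex-valued reference version).
        import numpy as np

        M, d, snap = 8, 0.5, 200             # sensors, spacing (wavelengths), snapshots
        doa = np.deg2rad([-20.0, 35.0])      # true source angles (assumed)
        rng = np.random.default_rng(1)

        steer = lambda ang: np.exp(-2j * np.pi * d * np.arange(M)[:, None] * np.sin(ang))
        S = rng.standard_normal((2, snap)) + 1j * rng.standard_normal((2, snap))
        N = 0.1 * (rng.standard_normal((M, snap)) + 1j * rng.standard_normal((M, snap)))
        X = steer(doa) @ S + N

        R = X @ X.conj().T / snap            # sample covariance
        _, V = np.linalg.eigh(R)
        En = V[:, :-2]                       # noise subspace (2 sources assumed known)

        grid = np.deg2rad(np.linspace(-90, 90, 721))
        P = 1.0 / (np.abs(En.conj().T @ steer(grid)) ** 2).sum(axis=0)
        pk = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1
        print(np.sort(np.rad2deg(grid[pk[np.argsort(P[pk])[-2:]]])))  # ~[-20, 35]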

  6. European Conference on Complex Systems

    CERN Document Server

    Pellegrini, Francesco; Caldarelli, Guido; Merelli, Emanuela

    2016-01-01

    This work contains a rigorous selection of extended contributions presented at the 2014 meeting and its satellite meetings, reflecting the scope, diversity and richness of research areas in the field, both fundamental and applied. The ECCS meeting, held under the patronage of the Complex Systems Society, is an annual event that has become the leading European conference devoted to complexity science. It offers cutting-edge research and unique opportunities to study novel scientific approaches in a multitude of application areas. ECCS'14, its eleventh edition, took place in Lucca, Italy. It gathered some 650 scholars representing a wide range of topics relating to complex systems research, with emphasis on interdisciplinary approaches. The editors are among the best specialists in the area. The book is of great interest to scientists, researchers and graduate students in complexity, complex systems and networks.

  7. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists the major directions of research pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large aspect ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high-speed, high-sensitivity digital imaging in conjunction with voltage-sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large-scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure
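
    A toy illustration of the homology idea (not the project's software): the zeroth Betti number of a superlevel set of a convection-like pattern, computed as a count of connected components.

        # beta_0 (number of connected components) of a thresholded pattern.
        import numpy as np
        from scipy import ndimage

        x = np.linspace(0, 4 * np.pi, 200)
        X, Y = np.meshgrid(x, x)
        rng = np.random.default_rng(2)
        field = np.sin(X) * np.sin(Y) + 0.1 * rng.standard_normal(X.shape)

        hot = field > 0.5                    # superlevel set of the field
        _, beta0 = ndimage.label(hot)        # label connected components
        print("beta_0 of the superlevel set:", beta0)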

  9. MVPACK: a package for the computer-aided design of multivariable control systems

    International Nuclear Information System (INIS)

    Mensah, S.

    1984-01-01

    The design and analysis of high-performance controllers for large, complex plants require a collection of powerful, interactive computer software. MVPACK, an open-ended package for the computer-aided design of control systems, has been developed in the Reactor Control Branch of the Chalk River Nuclear Laboratories. The package is fully interactive and includes a comprehensive, state-of-the-art mathematical library to support the development of complex multivariable control algorithms. Coded in RATFOR, MVPACK operates with a flexible data structure which makes efficient use of minicomputer resources and provides a standard framework for program generation. A help mechanism enhances the simplicity of package utilization. This report provides the technical description of the package. It reviews the specifications used in the design and implementation of the package. The database structure, the supporting libraries and the design and analysis modules of MVPACK are described. The report includes several application examples to illustrate the capability of the package. Experience with MVPACK shows that the package provides a synergistic environment for control and regulation system design, and that it is a unique tool in the training of control system engineers
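
    MVPACK itself is a RATFOR minicomputer package, but the flavor of multivariable state-feedback design it supported can be sketched with modern tools; the plant model and pole locations below are assumptions, not examples from the report.

        # Two-input, three-state pole placement as a taste of multivariable design.
        import numpy as np
        from scipy.signal import place_poles

        A = np.array([[0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0],
                      [-1.0, -2.0, -3.0]])
        B = np.array([[0.0, 0.0],
                      [1.0, 0.0],
                      [0.0, 1.0]])
        K = place_poles(A, B, [-2.0, -3.0, -4.0]).gain_matrix  # u = -K x
        print(np.sort(np.linalg.eigvals(A - B @ K).real))      # -> [-4, -3, -2]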

  10. Sixth International Conference on Complex Systems

    CERN Document Server

    Minai, Ali; Bar-Yam, Yaneer; Unifying Themes in Complex Systems

    2008-01-01

    The International Conference on Complex Systems (ICCS) creates a unique atmosphere for scientists of all fields, engineers, physicians, executives, and a host of other professionals to explore the common themes and applications of complex systems science. In June 2006, 500 participants convened in Boston for the sixth ICCS, exploring an array of topics, including networks, systems biology, evolution and ecology, nonlinear dynamics and pattern formation, as well as neural, psychological, psycho-social, socio-economic, and global systems. This volume selects 77 papers from over 300 presented at the conference. With this new volume, Unifying Themes in Complex Systems continues to build common ground between the wide-ranging domains of complex systems science.

  11. CIMS: A Context-Based Intelligent Multimedia System for Ubiquitous Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abhilash Sreeramaneni

    2015-06-01

    Mobile users spend a tremendous amount of time surfing multimedia contents over the Internet to pursue their interests. A resource-constrained smart device demands more intensive computing tasks and lessens the battery life. To address the resource limitations of mobile devices (i.e., memory, maintenance cost, ease of access, and computing capacity), mobile cloud computing is needed. Several approaches have been proposed to confront the challenges of mobile cloud computing, but difficulties still remain. In the coming years, collecting and processing context and interchanging the results over a heavily loaded network will entail vast computation and reduce battery life in mobiles. In this paper, we propose a "context-based intelligent multimedia system" (CIMS) for ubiquitous cloud computing. The main goal of this research is to lessen the computing load, storage complexity, and battery drain for mobile users by using pervasive cloud computing. Moreover, to reduce the computing and storage burden on mobiles, the cloud server collects groups of user profiles with similarities by executing K-means clustering on users' data (context and multimedia contents). The distribution process conveys real-time notifications to smartphone users according to what is stated in their profiles. We considered a mobile cloud offloading system, which decides the offloading actions to/from cloud servers. Context-aware decision-making (CAD) customizes the mobile device performance with different specifications such as short response time and low energy consumption. The analysis shows that our CIMS takes advantage of cost-effective features to produce high-quality information for mobile (or smart device) users in real time. Moreover, our CIMS lessens the computation and storage complexities for mobile users as well as cloud servers. Simulation analysis suggests that our approach is more efficient than existing domains.
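
    A hedged sketch of the profile-grouping step described above, using K-means over simple numeric context features; the feature columns and data are hypothetical, not from the paper.

        # Grouping user profiles with K-means (toy context features).
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        # columns: [avg session minutes, video fraction, news fraction] (assumed)
        profiles = np.vstack([
            rng.normal([30.0, 0.8, 0.1], [5.0, 0.05, 0.05], (50, 3)),
            rng.normal([10.0, 0.2, 0.7], [5.0, 0.05, 0.05], (50, 3)),
        ])
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
        print(km.cluster_centers_.round(2))   # one center per profile group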

  12. Multiaxis, Lightweight, Computer-Controlled Exercise System

    Science.gov (United States)

    Haynes, Leonard; Bachrach, Benjamin; Harvey, William

    2006-01-01

    The multipurpose, multiaxial, isokinetic dynamometer (MMID) is a computer-controlled system of exercise machinery that can serve as a means for quantitatively assessing a subject's muscle coordination, range of motion, strength, and overall physical condition with respect to a wide variety of forces, motions, and exercise regimens. The MMID is easily reconfigurable and compactly stowable and, in comparison with prior computer-controlled exercise systems, it weighs less, costs less, and offers more capabilities. Whereas a typical prior isokinetic exercise machine is limited to operation in only one plane, the MMID can operate along any path. In addition, the MMID is not limited to the isokinetic (constant-speed) mode of operation. The MMID provides for control and/or measurement of position, force, and/or speed of exertion in as many as six degrees of freedom simultaneously; hence, it can accommodate more complex, more nearly natural combinations of motions and, in so doing, offers greater capabilities for physical conditioning and evaluation. The MMID includes as many as eight active modules, each of which can be anchored to a floor, wall, ceiling, or other fixed object. A cable is paid out from a reel in each module to a bar or other suitable object that is gripped and manipulated by the subject. The reel is driven by a DC brushless motor or other suitable electric motor via a gear reduction unit. The motor can be made to function as either a driver or an electromagnetic brake, depending on the required nature of the interaction with the subject. The module includes a force sensor and a displacement sensor for real-time monitoring of the tension in and displacement of the cable, respectively. In response to commands from a control computer, the motor can be operated to generate a required tension in the cable, to displace the cable a required distance, or to reel the cable in or out at a required speed. The computer can be programmed, either locally or via

  13. Computer simulations of dendrimer-polyelectrolyte complexes.

    Science.gov (United States)

    Pandav, Gunja; Ganesan, Venkat

    2014-08-28

    We carry out a systematic analysis of static properties of the clusters formed by complexation between charged dendrimers and linear polyelectrolyte (LPE) chains in a dilute solution under good solvent conditions. We use single chain in mean-field simulations and analyze the structure of the clusters through radial distribution functions of the dendrimer, cluster size, and charge distributions. The effects of LPE length, charge ratio between LPE and dendrimer, the influence of salt concentration, and the dendrimer generation number are examined. Systems with short LPEs showed a reduced propensity for aggregation with dendrimers, leading to formation of smaller clusters. In contrast, larger dendrimers and longer LPEs lead to larger clusters with significant bridging. Increasing salt concentration was seen to reduce aggregation between dendrimers as a result of screening of electrostatic interactions. Generally, maximum complexation was observed in systems with an equal amount of net dendrimer and LPE charges, whereas either excess LPE or dendrimer concentrations resulted in reduced clustering between dendrimers.
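
    The radial distribution functions central to this analysis are straightforward to compute; below is a toy g(r) for uncorrelated particles in a periodic box (particle count, box size and binning are assumptions).

        # Radial distribution function g(r) with minimum-image periodic boundaries.
        import numpy as np

        rng = np.random.default_rng(4)
        N, L = 500, 10.0
        pos = rng.uniform(0, L, (N, 3))

        diff = pos[:, None, :] - pos[None, :, :]
        diff -= L * np.round(diff / L)                  # minimum-image convention
        r = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(N, 1)]

        hist, edges = np.histogram(r, bins=np.linspace(0.01, L / 2, 60))
        shell = 4 * np.pi / 3 * (edges[1:] ** 3 - edges[:-1] ** 3)
        ideal = shell * (N * (N - 1) / 2) / L ** 3      # expected ideal-gas counts
        print((hist / ideal)[:5].round(2))              # ~1 for an uncorrelated gas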

  14. Real-life applications with membrane computing

    CERN Document Server

    Zhang, Gexiang; Gheorghe, Marian

    2017-01-01

    This book thoroughly investigates the underlying theoretical basis of membrane computing models and reveals their latest applications. To date there have been no illustrative case studies or complex real-life applications that capitalize on the full potential of the sophisticated computational apparatus of membrane systems; this book remedies that gap. By studying various complex applications – including engineering optimization, power systems fault diagnosis, mobile robot controller design, and complex biological systems involving data modeling and process interactions – the book also extends the capabilities of membrane systems models with features such as formal verification techniques, evolutionary approaches, and fuzzy reasoning methods. As such, the book offers a comprehensive and up-to-date guide for all researchers, PhD and undergraduate students in the fields of computer science, engineering and the bio-sciences who are interested in the applications of natural computing models.

  15. On a computational method for modelling complex ecosystems by superposition procedure

    International Nuclear Information System (INIS)

    He Shanyu.

    1986-12-01

    In this paper, the Superposition Procedure is concisely described, and a computational method for modelling a complex ecosystem is proposed. With this method, the information contained in acceptable submodels and observed data can be utilized to the maximum degree. (author). 1 ref

  16. A design of a computer complex including vector processors

    International Nuclear Information System (INIS)

    Asai, Kiyoshi

    1982-12-01

    We, members of the Computing Center of the Japan Atomic Energy Research Institute, have been engaged for the past six years in research on the adaptability of vector processing to large-scale nuclear codes. The research has been done in collaboration with researchers and engineers of JAERI and a computer manufacturer. In this research, forty large-scale nuclear codes were investigated from the viewpoint of vectorization. Among them, twenty-six codes were actually vectorized and executed. As a result of the investigation, it is now estimated that about seventy percent of nuclear codes, accounting for about seventy percent of the total CPU time at JAERI, are highly vectorizable. Based on the data obtained by the investigation, the report discusses (1) currently vectorizable CPU time, (2) the necessary number of vector processors, (3) the manpower required for vectorization of nuclear codes, (4) the computing speed, memory size, number of parallel I/O paths, and size and speed of the I/O buffer of a vector processor suitable for our applications, and (5) the necessary software and operational policy for the use of vector processors, and finally (6) presents a computer complex including vector processors. (author)

  17. Construction and clinical application of complex utility programs in the SEGAMS-80 system

    International Nuclear Information System (INIS)

    Mate, E.; Csirik, J.; Csernay, L.; Makay, A.

    1981-01-01

    SEGAMS-80 is a system that lets physicians process isotope-diagnostic images easily and safely. The functions built into the system form a tree structure. In certain stages of processing, tables compiled from the medical point of view show the identification and a short description of the functions that can currently be performed. The available functions allow the interactive performance of diagnostic processes for different purposes. Interactivity is undesirable when processing routine examinations, since the functions to be performed, their sequence and their parameters could be identical in all cases. SEGAMS-80 makes it possible to construct complex programs, to put them into the system and to execute them. During a complex program the desired functions are executed automatically. The operator's intervention is needed only where the author of the complex program has decided that it is necessary on medical grounds. Experience gained with several SEGAMS-80 systems has shown that they can be used successfully in isotope diagnostics, without requiring any training in computing techniques from the physicians. A schematic description is given of the structure of SEGAMS-80, together with a detailed account of how to construct complex utility programs. (author)

  18. Philosophy of complex systems

    CERN Document Server

    2011-01-01

    The domain of nonlinear dynamical systems and its mathematical underpinnings has been developing exponentially for a century, the last 35 years seeing an outpouring of new ideas and applications and a concomitant confluence with ideas of complex systems and their applications from irreversible thermodynamics. A few examples are in meteorology, ecological dynamics, and social and economic dynamics. These new ideas have profound implications for our understanding and practice in domains involving complexity, predictability and determinism, equilibrium, control, planning, individuality, responsibility and so on. Our intention is to draw together in this volume, we believe for the first time, a comprehensive picture of the manifold philosophically interesting impacts of recent developments in understanding nonlinear systems and the unique aspects of their complexity. The book will focus specifically on the philosophical concepts, principles, judgments and problems distinctly raised by work in the domain of comple...

  19. Computer-aided system design

    Science.gov (United States)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  20. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  1. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  2. Advanced topics in security computer system design

    International Nuclear Information System (INIS)

    Stachniak, D.E.; Lamb, W.R.

    1989-01-01

    The capability, performance, and speed of contemporary computer processors, plus the associated performance capability of the operating systems accommodating the processors, have enormously expanded the scope of possibilities for designers of nuclear power plant security computer systems. This paper addresses the choices that could be made by a designer of security computer systems working with contemporary computers and describes the improvement in functionality of contemporary security computer systems based on an optimally chosen design. Primary initial considerations concern the selection of (a) the computer hardware and (b) the operating system. Considerations for hardware selection concern processor and memory word length, memory capacity, and numerous processor features

  3. 5th International Conference on Complex Systems

    CERN Document Server

    Braha, Dan; Bar-Yam, Yaneer

    2011-01-01

    The International Conference on Complex Systems (ICCS) creates a unique atmosphere for scientists of all fields, engineers, physicians, executives, and a host of other professionals to explore common themes and applications of complex system science. With this new volume, Unifying Themes in Complex Systems continues to build common ground between the wide-ranging domains of complex system science.

  4. 7th International Conference on Complex Systems

    CERN Document Server

    Braha, Dan; Bar-Yam, Yaneer

    2012-01-01

    The International Conference on Complex Systems (ICCS) creates a unique atmosphere for scientists of all fields, engineers, physicians, executives, and a host of other professionals to explore common themes and applications of complex system science. With this new volume, Unifying Themes in Complex Systems continues to build common ground between the wide-ranging domains of complex system science.

  5. Single neuron computation

    CERN Document Server

    McKenna, Thomas M; Zornetzer, Steven F

    1992-01-01

    This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding a single neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real n
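
    As a minimal code example of single-neuron computation, here is a leaky integrate-and-fire unit driven by a constant current; this is far simpler than the detailed cellular models the book covers, and all parameters are textbook-style assumptions.

        # Leaky integrate-and-fire neuron with constant current drive.
        dt, T = 0.1, 100.0                     # time step and duration [ms]
        tau, v_rest, v_th, v_reset = 10.0, -65.0, -50.0, -65.0
        R, I = 10.0, 2.0                       # resistance [MOhm], input [nA]

        v, spikes = v_rest, []
        for step in range(int(T / dt)):
            v += dt / tau * (-(v - v_rest) + R * I)   # leaky integration
            if v >= v_th:                             # threshold crossing -> spike
                spikes.append(step * dt)
                v = v_reset
        print(f"{len(spikes)} spikes in {T:.0f} ms")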

  6. Investigation of anticancer properties of caffeinated complexes via computational chemistry methods

    Science.gov (United States)

    Sayin, Koray; Üngördü, Ayhan

    2018-03-01

    Computational investigations were performed for 1,3,7-trimethylpurine-2,6-dione, 3,7-dimethylpurine-2,6-dione and their Ru(II) and Os(III) complexes. The B3LYP/6-311++G(d,p) (LANL2DZ) level was used in the numerical calculations. Geometric parameters and the IR, 1H-, 13C- and 15N-NMR spectra were examined in detail. Additionally, contour diagrams of the frontier molecular orbitals (FMOs), molecular electrostatic potential (MEP) maps, MEP contours and some quantum chemical descriptors were used to determine reactivity rankings and active sites. The electron density on the surface was similar in the studied complexes. The quantum chemical descriptors indicate that the anticancer activities of the complexes exceed those of cisplatin and of their ligands. Additionally, molecular docking calculations in water were performed between the complexes and a protein (ID: 3WZE). The Os complex showed the strongest interaction, with an interaction energy of 342.9 kJ/mol.
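
    The global reactivity descriptors used in such studies follow directly from the frontier-orbital energies (a conceptual-DFT convention); the orbital energies below are made up for illustration, not values from the paper.

        # Electronegativity, hardness and electrophilicity from HOMO/LUMO energies.
        def reactivity(e_homo, e_lumo):
            chi = -(e_homo + e_lumo) / 2      # electronegativity
            eta = (e_lumo - e_homo) / 2       # chemical hardness
            omega = chi ** 2 / (2 * eta)      # electrophilicity index
            return chi, eta, omega

        # hypothetical orbital energies in eV
        for name, (eh, el) in {"ligand": (-6.1, -1.4), "complex": (-5.3, -2.0)}.items():
            chi, eta, om = reactivity(eh, el)
            print(f"{name}: chi={chi:.2f} eV, eta={eta:.2f} eV, omega={om:.2f} eV")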

  7. Reducing the Computational Complexity of Reconstruction in Compressed Sensing Nonuniform Sampling

    DEFF Research Database (Denmark)

    Grigoryan, Ruben; Jensen, Tobias Lindstrøm; Arildsen, Thomas

    2013-01-01

    sparse signals, but requires computationally expensive reconstruction algorithms. This can be an obstacle for real-time applications. The reduction of complexity is achieved by applying a multi-coset sampling procedure. This proposed method reduces the size of the dictionary matrix, the size...

  8. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  9. Secure computing on reconfigurable systems

    OpenAIRE

    Fernandes Chaves, R.J.

    2007-01-01

    This thesis proposes a Secure Computing Module (SCM) for reconfigurable computing systems. The SCM provides a protected and reliable computational environment, where data security and protection against malicious attacks on the system are assured. The SCM is strongly based on encryption algorithms and on the attestation of the executed functions. The use of the SCM on reconfigurable devices has the advantage of being highly adaptable to the application and the user requirements, while providing high performa...

  10. QA Issues for Computer-Controlled Treatment Delivery: This Is Not Your Old R/V System Any More!

    International Nuclear Information System (INIS)

    Fraass, Benedick A.

    2008-01-01

    State-of-the-art radiotherapy treatment delivery has changed dramatically during the past decade, moving from manual individual field setup and treatment to automated computer-controlled delivery of complex treatments, including intensity-modulated radiotherapy and other similarly complex delivery strategies. However, the quality assurance methods typically used to ensure treatment is performed precisely and correctly have not evolved in a similarly dramatic way. This paper reviews the old manual treatment process and use of record-and-verify systems, and describes differences with modern computer-controlled treatment delivery. The process and technology used for computer-controlled treatment delivery are analyzed in terms of potential (and actual) problems, as well as relevant published guidance on quality assurance. The potential for improved quality assurance for computer-controlled delivery is discussed

  11. Reliability of large and complex systems

    CERN Document Server

    Kolowrocki, Krzysztof

    2014-01-01

    Reliability of Large and Complex Systems, previously titled Reliability of Large Systems, is an innovative guide to the current state and reliability of large and complex systems. In addition to revised and updated content on the complexity and safety of large and complex mechanisms, this new edition looks at the reliability of nanosystems, a key research topic in nanotechnology science. The author discusses the importance of safety investigation of critical infrastructures that have aged or have been exposed to varying operational conditions. This reference provides an asympt

  12. Platinum Group Thiophenoxyimine Complexes: Syntheses,Crystallographic and Computational Studies of Structural Properties

    Energy Technology Data Exchange (ETDEWEB)

    Krinsky, Jamin L.; Arnold, John; Bergman, Robert G.

    2006-10-03

    Monomeric thiosalicylaldiminate complexes of rhodium(I) and iridium(I) were prepared by ligand transfer from the homoleptic zinc(II) species. In the presence of strongly donating ligands, the iridium complexes undergo insertion of the metal into the imine carbon-hydrogen bond. Thiophenoxyketimines were prepared by non-templated reaction of o-mercaptoacetophenone with anilines, and were complexed with rhodium(I), iridium(I), nickel(II) and platinum(II). X-ray crystallographic studies showed that while the thiosalicylaldiminate complexes display planar ligand conformations, those of the thiophenoxyketiminates are strongly distorted. Results of a computational study were consistent with a steric-strain interpretation of the difference in preferred ligand geometries.

  13. Complexity-aware high efficiency video coding

    CERN Document Server

    Correa, Guilherme; Agostini, Luciano; Cruz, Luis A da Silva

    2016-01-01

    This book discusses computational complexity of High Efficiency Video Coding (HEVC) encoders with coverage extending from the analysis of HEVC compression efficiency and computational complexity to the reduction and scaling of its encoding complexity. After an introduction to the topic and a review of the state-of-the-art research in the field, the authors provide a detailed analysis of the HEVC encoding tools compression efficiency and computational complexity.  Readers will benefit from a set of algorithms for scaling the computational complexity of HEVC encoders, all of which take advantage from the flexibility of the frame partitioning structures allowed by the standard.  The authors also provide a set of early termination methods based on data mining and machine learning techniques, which are able to reduce the computational complexity required to find the best frame partitioning structures. The applicability of the proposed methods is finally exemplified with an encoding time control system that emplo...

  14. Impact of new computing systems on finite element computations

    International Nuclear Information System (INIS)

    Noor, A.K.; Fulton, R.E.; Storaasi, O.O.

    1983-01-01

    Recent advances in computer technology that are likely to impact finite element computations are reviewed. The characteristics of supersystems, highly parallel systems, and small systems (mini- and microcomputers) are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario is presented for the future hardware/software environment and finite element systems. A number of research areas which have high potential for improving the effectiveness of finite element analysis in the new environment are identified

  15. Geographical National Condition and Complex System

    Directory of Open Access Journals (Sweden)

    WANG Jiayao

    2016-01-01

    The significance of studying the complex system of geographical national conditions lies in rationally expressing the complex relationships of the "resources-environment-ecology-economy-society" system. Addressing the problems faced by the statistical analysis of geographical national conditions, including the disunity of research contents, the inconsistency of scope and the uncertainty of goals, the present paper conducts a range of discussions from the perspectives of concept, theory and method, and designs solutions based on complex system theory and coordination degree analysis methods. By analyzing the concepts of geographical national conditions, the geographical national conditions survey and geographical national conditions statistical analysis, as well as the relationships between them, the statistical contents and the analytical scope of geographical national conditions are clarified and defined. This investigation also clarifies the goals of the statistical analysis by analyzing the basic characteristics of geographical national conditions and of the complex system, and the consistency between coordination degree analysis and statistical analyses. It outlines their goals, proposes a concept for the complex system of geographical national conditions, and describes that concept. Complex system theory provides new theoretical guidance for the statistical analysis of geographical national conditions. The degree of coordination offers new approaches to the analysis, based on the measurement method and the decision-making analysis scheme upon which the complex system of geographical national conditions is based. It analyzes the overall trend via the degree of coordination of the complex system on a macro level, and determines the direction of remediation on a micro level based on the degree of coordination among various subsystems and within single systems. These results establish

  16. Retrofitting of NPP Computer systems

    International Nuclear Information System (INIS)

    Pettersen, G.

    1994-01-01

    Retrofitting of nuclear power plant control rooms is a continuing process for most utilities. This involves introducing and/or extending computer-based solutions for surveillance and control, as well as improving the human-computer interface. The paper describes typical requirements when retrofitting NPP process computer systems, and focuses on the activities of the Institutt for energiteknikk, OECD Halden Reactor Project, with respect to such retrofitting, using examples from actual delivery projects. In particular, a project carried out for Forsmarksverket in Sweden, comprising an upgrade of the operator system in the control rooms of units 1 and 2, is described. As many of the problems of retrofitting NPP process computer systems are similar to those in other kinds of process industries, an example from a non-nuclear application area is also given

  17. Epidemic modeling in complex realities.

    Science.gov (United States)

    Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro

    2007-04-01

    In our global world, the increasing complexity of social relations and transport infrastructures are key factors in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data quantifying the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress integrating complex systems and network analysis with epidemic modelling, and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
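
    To make the role of network heterogeneity concrete, here is a toy discrete-time stochastic SIR simulation on a random contact graph; all parameters are assumed, whereas the studies reviewed use data-driven networks and metapopulation models.

        # Stochastic SIR epidemic on an Erdos-Renyi contact network (sketch).
        import numpy as np

        rng = np.random.default_rng(5)
        N, p_edge, beta, gamma = 500, 0.02, 0.05, 0.1
        adj = np.triu(rng.random((N, N)) < p_edge, 1)
        adj = adj | adj.T                           # symmetric contact graph

        state = np.zeros(N, dtype=int)              # 0 = S, 1 = I, 2 = R
        state[rng.choice(N, 5, replace=False)] = 1  # seed infections
        for _ in range(200):
            infected = state == 1
            pressure = adj[:, infected].sum(axis=1)             # infectious neighbors
            new_inf = (state == 0) & (rng.random(N) < 1 - (1 - beta) ** pressure)
            recov = infected & (rng.random(N) < gamma)
            state[new_inf], state[recov] = 1, 2
        print("final attack rate:", (state == 2).mean())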

  18. The method of measurement and synchronization control for large-scale complex loading system

    International Nuclear Information System (INIS)

    Liao Min; Li Pengyuan; Hou Binglin; Chi Chengfang; Zhang Bo

    2012-01-01

    With the development of modern industrial technology, measurement and control systems have come to be widely used in high-precision, complex industrial control equipment and large-tonnage loading devices. A measurement and control system is often used to analyze the distribution of stress and displacement in a complex bearing load or in the complex mechanical structure itself. In the ITER GS mock-up with 5 flexible plates, for each load combination it is necessary to detect and measure potential slippage between the central flexible plate and the neighboring spacers, as well as potential slippage between each pre-stressing bar and its neighboring plate. The measurement and control system consists of seven sets of EDC controllers and boards, a computer system, a 16-channel quasi-dynamic strain gauge, 25 sets of displacement sensors, and 7 sets of load and displacement sensors in the cylinders. This paper demonstrates the principles and methods by which the EDC220 digital controller achieves synchronization control, and the R and D process of the multi-channel loading control software and measurement software. (authors)

  19. Stochastic dynamics of complex systems: from glasses to evolution (series on complexity science)

    CERN Document Server

    Sibani, Paolo

    2013-01-01

    Dynamical evolution over long time scales is a prominent feature of all the systems we intuitively think of as complex - for example, ecosystems, the brain or the economy. In physics, the term ageing is used for this type of slow change, occurring over time scales much longer than the patience, or indeed the lifetime, of the observer. The main focus of this book is on the stochastic processes which cause ageing, and the surprising fact that the ageing dynamics of systems which are very different at the microscopic level can be treated in similar ways. The first part of this book provides the necessary mathematical and computational tools and the second part describes the intuition needed to deal with these systems. Some of the first few chapters have been covered in several other books, but the emphasis and selection of the topics reflect both the authors' interests and the overall theme of the book. The second part contains an introduction to the scientific literature and deals in some detail with the desc...

  20. Complexity hints for economic policy

    CERN Document Server

    Salzano, Massimo

    2007-01-01

    This volume extends the complexity approach to economics. The complexity approach is not a completely new way of doing economics, nor a replacement for existing economics, but rather the integration of some new analytic and computational techniques into economists' bag of tools. It provides alternative pattern generators, which can supplement existing approaches by offering a way of finding patterns other than the traditional scientific approach. From these, new kinds of policy hints can be obtained. The reason the complexity approach is taking hold in economics now is that computing technology has advanced. This advance allows consideration of analytical systems that could not previously be considered by economists. Consideration of these systems suggested that the results of the "control-based" models might not extend easily to more complicated systems, and that we now have a method—piggybacking computer-assisted analysis onto analytic methods—to start gen...

  1. A Pharmacy Computer System

    OpenAIRE

    Claudia CIULCA-VLADAIA; Călin MUNTEAN

    2009-01-01

    Objective: describing an evaluation model, seen from a customer's point of view, for the currently needed pharmacy computer system. Data Sources: literature research, ATTOFARM, WINFARM P.N.S., NETFARM, Info World - PHARMACY MANAGER and HIPOCRATE FARMACIE. Study Selection: five pharmacy computer systems were selected due to their high implementation rates at a national level. We used the new criteria recommended by the EUROREC Institute in EHR that modify the model of data exchanges between the E

  2. Computable Types for Dynamic Systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter); K. Ambos-Spies; B. Loewe; W. Merkle

    2009-01-01

    In this paper, we develop a theory of computable types suitable for the study of dynamic systems in discrete and continuous time. The theory uses type-two effectivity as the underlying computational model, but we quickly develop a type system which can be manipulated abstractly, but for

  3. Effects of complex feedback on computer-assisted modular instruction

    NARCIS (Netherlands)

    Gordijn, Jan; Nijhof, W.J.

    2002-01-01

    The aim of this study is to determine the effects of two versions of Computer-Based Feedback within a prevocational system of modularized education in The Netherlands. The implementation and integration of Computer-Based Feedback (CBF) in Installation Technology modules in all schools (n=60) in The

  4. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.

  5. MITT writer and MITT writer advanced development: Developing authoring and training systems for complex technical domains

    Science.gov (United States)

    Wiederholt, Bradley J.; Browning, Elica J.; Norton, Jeffrey E.; Johnson, William B.

    1991-01-01

    MITT Writer is a software system for developing computer based training for complex technical domains. A training system produced by MITT Writer allows a student to learn and practice troubleshooting and diagnostic skills. The MITT (Microcomputer Intelligence for Technical Training) architecture is a reasonable approach to simulation based diagnostic training. MITT delivers training on available computing equipment, delivers challenging training and simulation scenarios, and has economical development and maintenance costs. A 15 month effort was undertaken in which the MITT Writer system was developed. A workshop was also conducted to train instructors in how to use MITT Writer. Earlier versions were used to develop an Intelligent Tutoring System for troubleshooting the Minuteman Missile Message Processing System.

  6. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an intelligent computer vision system applied to the recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (feature number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: the combination of feature generation techniques; the application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and the use of a suitable NN design and learning method.
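
    The pipeline shape described here (feature preprocessing with PCA, then a neural-network classifier) can be sketched with scikit-learn on a stand-in dataset; the original system used its own cork-tile features and a GLPτS-trained network rather than backpropagation alone.

        # PCA preprocessing followed by a small neural-network classifier.
        from sklearn.datasets import load_digits
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline

        X, y = load_digits(return_X_y=True)        # stand-in for cork-tile features
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        clf = make_pipeline(
            PCA(n_components=20),
            MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
        )
        clf.fit(Xtr, ytr)
        print("test accuracy:", round(clf.score(Xte, yte), 3))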

  7. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  8. Intelligent computational systems for space applications

    Science.gov (United States)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as adaptive computational systems integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. These systems will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within their databases and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions such as Space Station Freedom do not have the computational power to meet the challenges of advanced automation and robotics systems envisioned for the year 2000 era. Research issues which must be addressed to achieve greater than giga-flop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  9. Operational Complexity of Supplier-Customer Systems Measured by Entropy—Case Studies

    Directory of Open Access Journals (Sweden)

    Ladislav Lukáš

    2016-04-01

    Full Text Available This paper discusses a unified entropy-based approach for the quantitative measurement of the operational complexity of company supplier-customer relations. Classical Shannon entropy is utilized. Beside this quantification tool, we also explore the relations between Shannon entropy and (c,d)-entropy in more detail. An analytic description of so-called iso-quant curves is given, too. We present five case studies, albeit in an anonymous setting, describing various details of general procedures for measuring the operational complexity of supplier-customer systems. In general, we assume a problem-oriented database exists, which contains detailed records of all product forecasts, orders and deliveries, both in quantity and time, scheduled and realized. Data processing detects important flow variations both in volumes and times, e.g., order vs. forecast, delivery vs. order, and actual vs. scheduled production. The unifying quantity used for entropy computation is the time gap between the actual delivery time and the order issue time, which is simply the lead time in inventory control models. After data consistency checks, histograms and empirical distribution functions are constructed. Finally, the entropy, an information-theoretic measure of supplier-customer operational complexity, is calculated. Basic steps of the algorithm are described briefly, too. Results of supplier-customer system analysis from selected Czech small and medium-sized enterprises (SMEs) are presented in various computational and managerial decision-making details. An enterprise is classified as an SME if it has at most 250 employees and its turnover does not exceed 50 million USD per year, or, alternatively, its balance sheet total does not exceed 43 million USD per year.
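
    The core computation lends itself to a few lines of code. The sketch below, with invented lead times standing in for the SME order/delivery records, builds the histogram and evaluates the Shannon entropy H = -Σ p_i log2 p_i as the complexity measure:

        import numpy as np

        # Illustrative only: invented lead times (delivery date minus order
        # issue date, in days) stand in for the case studies' SME records.
        lead_times = np.array([5, 7, 5, 6, 9, 5, 12, 7, 6, 5, 8, 7, 6, 30, 5, 6])

        counts, _ = np.histogram(lead_times, bins="auto")
        p = counts[counts > 0] / counts.sum()   # empirical distribution
        H = -np.sum(p * np.log2(p))             # Shannon entropy in bits
        print(f"operational complexity: {H:.3f} bits")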

  10. High performance computing system in the framework of the Higgs boson studies

    CERN Document Server

    Belyaev, Nikita; The ATLAS collaboration

    2017-01-01

    The Higgs boson physics is one of the most important and promising fields of study in modern High Energy Physics. To perform precision measurements of the Higgs boson properties, fast and efficient instruments of Monte Carlo event simulation are required. Due to the increasing amount of data and the growing complexity of the simulation software tools, the computing resources currently available for Monte Carlo simulation on the LHC GRID are not sufficient. One possibility to address this shortfall of computing resources is the use of institutes' computer clusters, commercial computing resources and supercomputers. In this paper, a brief description of the Higgs boson physics, the Monte Carlo generation and event simulation techniques is presented. A description of modern high performance computing systems and tests of their performance are also discussed. These studies have been performed on the Worldwide LHC Computing Grid and Kurchatov Institute Data Processing Center, including Tier...

  11. Solving the Coupled System Improves Computational Efficiency of the Bidomain Equations

    KAUST Repository

    Southern, J.A.

    2009-10-01

    The bidomain equations are frequently used to model the propagation of cardiac action potentials across cardiac tissue. At the whole organ level, the size of the computational mesh required makes their solution a significant computational challenge. As the accuracy of the numerical solution cannot be compromised, efficiency of the solution technique is important to ensure that the results of the simulation can be obtained in a reasonable time while still encapsulating the complexities of the system. In an attempt to increase efficiency of the solver, the bidomain equations are often decoupled into one parabolic equation that is computationally very cheap to solve and an elliptic equation that is much more expensive to solve. In this study, the performance of this uncoupled solution method is compared with an alternative strategy in which the bidomain equations are solved as a coupled system. This seems counterintuitive, as the alternative method requires the solution of a much larger linear system at each time step. However, in tests on two 3-D rabbit ventricle benchmarks, it is shown that the coupled method is up to 80% faster than the conventional uncoupled method, and that parallel performance is better for the larger coupled problem.
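
    As a toy illustration of the two strategies (a structural sketch, not a cardiac model): solve the split equations one after the other, or assemble and solve a single larger block system. The matrices below are simple stand-ins for the discretised bidomain operators; with the one-way coupling used here the two approaches agree exactly, whereas in the real bidomain system the coupling is two-way, which is where the methods diverge.

        # Sketch only: random-ish SPD stand-ins for the FEM operators,
        # one time step of a parabolic/elliptic pair.
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 2000
        A = sp.eye(n) + 0.01 * sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))  # parabolic (cheap)
        K = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n)) + 1e-3 * sp.eye(n)  # elliptic (expensive)
        B = 0.01 * sp.eye(n)                                                    # coupling term
        f, g = np.ones(n), np.zeros(n)

        # Uncoupled strategy: solve A v = f, then K u = g - B v
        v = spla.spsolve(A.tocsc(), f)
        u = spla.spsolve(K.tocsc(), g - B @ v)

        # Coupled strategy: one 2n x 2n block solve [[A, 0], [B, K]] [v; u] = [f; g]
        M = sp.bmat([[A, None], [B, K]]).tocsc()
        vu = spla.spsolve(M, np.concatenate([f, g]))
        print(np.allclose(vu[:n], v), np.allclose(vu[n:], u))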

  12. Solving the Coupled System Improves Computational Efficiency of the Bidomain Equations

    KAUST Repository

    Southern, J.A.; Plank, G.; Vigmond, E.J.; Whiteley, J.P.

    2009-01-01

    The bidomain equations are frequently used to model the propagation of cardiac action potentials across cardiac tissue. At the whole organ level, the size of the computational mesh required makes their solution a significant computational challenge. As the accuracy of the numerical solution cannot be compromised, efficiency of the solution technique is important to ensure that the results of the simulation can be obtained in a reasonable time while still encapsulating the complexities of the system. In an attempt to increase efficiency of the solver, the bidomain equations are often decoupled into one parabolic equation that is computationally very cheap to solve and an elliptic equation that is much more expensive to solve. In this study, the performance of this uncoupled solution method is compared with an alternative strategy in which the bidomain equations are solved as a coupled system. This seems counterintuitive, as the alternative method requires the solution of a much larger linear system at each time step. However, in tests on two 3-D rabbit ventricle benchmarks, it is shown that the coupled method is up to 80% faster than the conventional uncoupled method, and that parallel performance is better for the larger coupled problem.

  13. Efficient computation of argumentation semantics

    CERN Document Server

    Liao, Beishui

    2013-01-01

    Efficient Computation of Argumentation Semantics addresses argumentation semantics and systems, introducing readers to cutting-edge decomposition methods that drive increasingly efficient logic computation in AI and intelligent systems. Such complex and distributed systems are increasingly used in the automation and transportation systems field, and particularly autonomous systems, as well as more generic intelligent computation research. The Series in Intelligent Systems publishes titles that cover state-of-the-art knowledge and the latest advances in research and development in intelligen

  14. A new VLSI complex integer multiplier which uses a quadratic-polynomial residue system with Fermat numbers

    Science.gov (United States)

    Shyu, H. C.; Reed, I. S.; Truong, T. K.; Hsu, I. S.; Chang, J. J.

    1987-01-01

    A quadratic-polynomial Fermat residue number system (QFNS) has been used to compute complex integer multiplications. The advantage of such a QFNS is that a complex integer multiplication requires only two integer multiplications. In this article, a new type of Fermat number multiplier is developed which eliminates the initialization condition of the previous method. It is shown that the new complex multiplier can be implemented on a single VLSI chip. Such a chip is designed and fabricated in CMOS-Pw technology.
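
    To see why two multiplications suffice, here is a sketch of the underlying quadratic-residue trick (my illustration of the standard idea, not the paper's VLSI design): modulo a Fermat prime F = 2^(2^t) + 1 there is an element j with j^2 = -1, so the map a + bi -> (a + jb, a - jb) diagonalises complex multiplication into two independent modular products.

        # Illustration only; F, j and the test values are chosen for clarity.
        F = 257                     # Fermat prime F_3 = 2^8 + 1
        j = 16                      # 2^4, since 16^2 = 256 = -1 (mod 257)
        inv2 = pow(2, F - 2, F)     # multiplicative inverse of 2 mod F
        inv2j = pow(2 * j, F - 2, F)

        def complex_mul_qfns(a, b, c, d):
            """(a + bi)(c + di) mod F using only two full multiplications."""
            A, B = (a + j * b) % F, (a - j * b) % F   # forward map
            C, D = (c + j * d) % F, (c - j * d) % F
            P, Q = (A * C) % F, (B * D) % F           # the two full multiplications
            re = (P + Q) * inv2 % F                   # inverse map; multiplications
            im = (P - Q) * inv2j % F                  # by j-constants are shifts in hardware
            return re, im

        # (3 + 4i)(1 + 2i) = -5 + 10i, and -5 = 252 (mod 257)
        print(complex_mul_qfns(3, 4, 1, 2))  # (252, 10)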

  15. Computational Genetic Regulatory Networks Evolvable, Self-organizing Systems

    CERN Document Server

    Knabe, Johannes F

    2013-01-01

    Genetic Regulatory Networks (GRNs) in biological organisms are primary engines for cells to enact their engagements with environments, via incessant, continually active coupling. In differentiated multicellular organisms, tremendous complexity has arisen in the course of evolution of life on earth. Engineering and science have so far achieved no working system that can compare with this complexity, depth and scope of organization. Abstracting the dynamics of genetic regulatory control to a computational framework in which artificial GRNs in artificial simulated cells differentiate while connected in a changing topology, it is possible to apply Darwinian evolution in silico to study the capacity of such developmental/differentiated GRNs to evolve. In this volume an evolutionary GRN paradigm is investigated for its evolvability and robustness in models of biological clocks, in simple differentiated multicellularity, and in evolving artificial developing 'organisms' which grow and express an ontogeny starting fr...

  16. LT^2C^2: A language of thought with Turing-computable Kolmogorov complexity

    Directory of Open Access Journals (Sweden)

    Santiago Figueira

    2013-03-01

    Full Text Available In this paper, we present a theoretical effort to connect the theory of program size to psychology by implementing a concrete language of thought with Turing-computable Kolmogorov complexity (LT^2C^2) satisfying the following requirements: (1) to be simple enough so that the complexity of any given finite binary sequence can be computed; (2) to be based on tangible operations of human reasoning (printing, repeating, ...); (3) to be sufficiently powerful to generate all possible sequences but not too powerful as to identify regularities which would be invisible to humans. We first formalize LT^2C^2, giving its syntax and semantics, and defining an adequate notion of program size. Our setting leads to a Kolmogorov complexity function relative to LT^2C^2 which is computable in polynomial time, and it also induces a prediction algorithm in the spirit of Solomonoff's inductive inference theory. We then prove the efficacy of this language by investigating regularities in strings produced by participants attempting to generate random strings. Participants had a profound understanding of randomness and hence avoided typical misconceptions such as exaggerating the number of alternations. We reasoned that remaining regularities would express the algorithmic nature of human thoughts, revealed in the form of specific patterns. Kolmogorov complexity relative to LT^2C^2 passed the three expected tests examined here: (1) human sequences were less complex than control PRNG sequences; (2) human sequences were not stationary, showing decreasing values of complexity resulting from fatigue; (3) each individual showed traces of algorithmic stability, since fitting of partial data was more effective to predict subsequent data than average fits. This work extends previous efforts to combine notions of Kolmogorov complexity theory and algorithmic information theory to psychology, by explicitly proposing a language which may describe the patterns of human thoughts.

  17. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  18. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test

  19. Computer information systems framework

    International Nuclear Information System (INIS)

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. The new information technology has caused management to expect more from computers. The process of supplying information follows a well-defined procedure. MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for many organizations which use computers. (A.B.)

  20. Computer science approach to quantum control

    International Nuclear Information System (INIS)

    Janzing, D.

    2006-01-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics has therefore two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  1. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...

  2. Evaluating a computational support tool for set-based configuration of production systems : Results from an industrial case

    NARCIS (Netherlands)

    Unglert, Johannes; Hoekstra, Sipke; Jauregui Becker, Juan Manuel

    2017-01-01

    This paper describes research conducted in the context of an industrial case dealing with the design of reconfigurable cellular manufacturing systems. Reconfiguring such systems represents a complex task due to the interdependences between the constituent subsystems. A novel computational tool was

  3. Heterogeneity and Self-Organization of Complex Systems Through an Application to Financial Market with Multiagent Systems

    Science.gov (United States)

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2017-12-01

    Multiagent systems (MAS) provide a useful tool for exploring the complex dynamics and behavior of financial markets, and the MAS approach is now widely implemented and documented in the empirical literature. This paper introduces the implementation of an innovative multi-scale mathematical model for a computational agent-based financial market. The paper develops a method to quantify the degree of self-organization which emerges in the system and shows that the capacity for self-organization is maximized when the agent behaviors are heterogeneous. Numerical results are presented and analyzed, showing how the global market behavior emerges from the interactions of specific individual behaviors.

  4. Analyzing the Implicit Computational Complexity of object-oriented programs

    OpenAIRE

    Marion , Jean-Yves; Péchoux , Romain

    2008-01-01

    A sup-interpretation is a tool which provides upper bounds on the size of the values computed by the function symbols of a program. Sup-interpretations have proved useful for dealing with the complexity of first-order functional programs. This paper is an attempt to adapt the framework of sup-interpretations to a fragment of object-oriented programs, including loop and while constructs and methods with side effects. We give a criterion, called the brotherly criterion, w...

  5. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication in numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  6. Computer-aided protective system (CAPS)

    International Nuclear Information System (INIS)

    Squire, R.K.

    1988-01-01

    A method of improving the security of materials in transit is described. The system provides a continuously monitored position location system for the transport vehicle, an internal computer-based geographic delimiter that makes continuous comparisons of actual positions with the preplanned routing and schedule, and a tamper detection/reaction system. The position comparison is utilized to institute preprogrammed reactive measures if the carrier is taken off course or schedule, penetrated, or otherwise interfered with. The geographic locator could be an independent internal platform or an external signal-dependent system utilizing GPS, Loran or a similar source of geographic information; a small (micro) computer could provide adequate memory and computational capacity; ensuring the integrity of the system indicates the need for a tamper-proof container and built-in intrusion sensors. A variant of the system could provide real-time transmission of the vehicle position and condition to a central control point; such transmission could be encrypted to preclude spoofing

  7. Influence of the chelator structures on the stability of Re and Tc Tricarbonyl complexes: a computational study

    International Nuclear Information System (INIS)

    Hernández Valdés, Daniel; Rodríguez Riera, Zalua; Jáuregui Haza, Ulises; Díaz García, Alicia; Benoist, Eric

    2016-01-01

    The development of novel radiopharmaceuticals in nuclear medicine based on M(CO)3 (M = Tc, Re) complexes has attracted great attention. The versatility of this core and the easy production of the fac-[M(CO)3(H2O)3]+ precursor could explain this interest. The main characteristics of these tricarbonyl complexes are a high substitution stability of the three CO ligands and a corresponding lability of the coordinated water molecules, yielding, via easy exchange of a variety of mono-, bi-, and tridentate ligands, complexes of very high kinetic stability. A computational study of different tricarbonyl complexes of Re(I) and Tc(I) has been performed using density functional theory. The solvent effect was simulated using the polarizable continuum model. The fully optimized complexes show geometries that compare favorably with the X-ray data. These structures were used as a starting point to investigate the relative stability of tricarbonyl complexes with various tridentate ligands. They comprise an iminodiacetic acid unit for tridentate coordination to the fac-[M(CO)3]+ moiety (M = Re, Tc), an aromatic ring system bearing a functional group (NO2-, NH2- or Cl-) as a linking-site model, and a tethering moiety (a methylene, ethylene, propylene, butylene or pentylene bridge) between the linking and coordinating sites. In general, Re complexes are more stable than the corresponding Tc complexes. Furthermore, the NH2 functional group, a medium-length carbon chain, and meta substitution increase the stability of the complexes. The correlation of these results with the available experimental data on these systems brings some understanding of the chemistry of tricarbonyl complexes. (author)

  8. SABRE: a computer-based system for the assessment of body radioactivity by photon spectrometry. Part 4

    International Nuclear Information System (INIS)

    Venn, J.B.

    1982-02-01

    A PDP-11/10 computer system is described for the acquisition and processing of pulse height spectra from detectors used for the measurement of body radioactivity. Version 4 of SABRE (System for the Assessment of Body Radioactivity) provides control of multiple detection systems from visual display consoles by means of a command language. A wide range of facilities is available for the display, processing and storage of acquired spectra and complex operations may be pre-programmed by means of the SABRE MACRO language. The hardware includes a CAMAC interface to the detection systems, disc cartridge drives for mass storage of data and programs, and data-links to other computers. The software is written in assembler language and includes special features for the dynamic allocation of computer memory and for safeguarding acquired data. (author)

  9. GAM-HEAT -- a computer code to compute heat transfer in complex enclosures

    International Nuclear Information System (INIS)

    Cooper, R.E.; Taylor, J.R.; Kielpinski, A.L.; Steimke, J.L.

    1991-02-01

    The GAM-HEAT code was developed for heat transfer analyses associated with postulated Double Ended Guillotine Break Loss Of Coolant Accidents (DEGB LOCA) resulting in a drained reactor vessel. In these analyses the gamma radiation resulting from fission product decay constitutes the primary source of energy as a function of time. This energy is deposited into the various reactor components and is re-radiated as thermal energy. The code accounts for all radiant heat exchanges within and leaving the reactor enclosure. The SRS reactors constitute complex radiant exchange enclosures, since there are many assemblies of various types within the primary enclosure and most of the assemblies themselves constitute enclosures. GAM-HEAT accounts for this complexity by processing externally generated view factors and connectivity matrices, and also accounts for convective, conductive, and advective heat exchanges. The code is applicable to many situations involving heat exchange between surfaces within a radiatively passive medium. The GAM-HEAT code has been exercised extensively for computing transient temperatures in SRS reactors with specific charges and control components. Results from these computations have been used to establish the need for and to evaluate hardware modifications designed to mitigate the results of postulated accident scenarios, and to assist in the specification of safe reactor operating power limits. The code handles temperature-dependent material properties. The efficiency of the code has been enhanced by the use of an iterative equation solver. Verification of the code to date consists of comparisons with parallel efforts at Los Alamos National Laboratory and with similar efforts at Westinghouse Science and Technology Center in Pittsburgh, PA, and benchmarking against problems with known analytical or iterated solutions. All comparisons and tests yield results that indicate the GAM-HEAT code performs as intended
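
    For readers unfamiliar with view-factor methods, here is a minimal gray-body enclosure computation in the same spirit (an illustration of the standard radiosity formulation, not GAM-HEAT's actual algorithm): given a view-factor matrix F, surface emissivities, and temperatures, solve (I - diag(1 - eps) F) J = eps sigma T^4 for the radiosities J and recover the net fluxes.

        # Sketch only: a hypothetical 3-surface enclosure with invented
        # view factors, emissivities, and temperatures.
        import numpy as np

        sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

        F = np.array([[0.0, 0.6, 0.4],   # rows sum to 1 for a closed enclosure
                      [0.3, 0.0, 0.7],
                      [0.2, 0.7, 0.1]])
        eps = np.array([0.8, 0.9, 0.6])
        T = np.array([900.0, 600.0, 400.0])  # K

        # Radiosity system: (I - diag(1 - eps) @ F) J = eps * sigma * T^4
        J = np.linalg.solve(np.eye(3) - (1.0 - eps)[:, None] * F,
                            eps * sigma * T**4)

        # Net radiative flux leaving each surface, W/m^2
        q = eps / (1.0 - eps) * (sigma * T**4 - J)
        print(q)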

  10. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    Science.gov (United States)

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  11. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    Directory of Open Access Journals (Sweden)

    David K Brown

    Full Text Available Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  12. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    Science.gov (United States)

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  13. Configuring a computer-controlled bar system

    OpenAIRE

    Šuštaršič, Nejc

    2010-01-01

    The principal goal of my diploma thesis is to create an application for configuring computer-controlled beverage dispensing systems. In the preamble of the thesis I present the theoretical background of point-of-sale systems and beverage dispensing systems, which is required for understanding the target problem. As in many other fields, computer technologies entered the field of managing bars and restaurants quite some time ago. Basic components of every bar or restaurant a...

  14. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  15. Multi-agent evolutionary systems for the generation of complex virtual worlds

    Directory of Open Access Journals (Sweden)

    J. Kruse

    2016-01-01

    Full Text Available Modern films, games and virtual reality applications are dependent on convincing computer graphics. Highly complex models are a requirement for the successful delivery of many scenes and environments. While workflows such as rendering, compositing and animation have been streamlined to accommodate increasing demands, modelling complex models is still a laborious task. This paper introduces the computational benefits of an Interactive Genetic Algorithm (IGA) to computer graphics modelling while compensating for the effects of user fatigue, a common issue with Interactive Evolutionary Computation. An intelligent agent is used in conjunction with an IGA that offers the potential to reduce the effects of user fatigue by learning from the choices made by the human designer and directing the search accordingly. This workflow accelerates the layout and distribution of basic elements to form complex models. It captures the designer's intent through interaction, and encourages playful discovery.
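
    A minimal sketch of the idea, under stated assumptions: since the human-in-the-loop cannot be scripted, a surrogate preference score (the fitness function below, standing in for both the designer's picks and the paper's learning agent) drives an otherwise ordinary genetic loop over placement vectors. All names and parameters are invented for illustration.

        # Sketch only: a latent "taste" vector replaces interactive user choices.
        import random

        random.seed(1)
        GENES, POP, GENERATIONS = 16, 20, 30
        target = [random.random() for _ in range(GENES)]   # latent designer taste

        def fitness(ind):
            # Agent's learned preference: closeness to the latent taste
            return -sum((a - b) ** 2 for a, b in zip(ind, target))

        def mutate(ind, rate=0.15):
            return [g + random.gauss(0, 0.1) if random.random() < rate else g
                    for g in ind]

        def crossover(a, b):
            cut = random.randrange(1, GENES)
            return a[:cut] + b[cut:]

        pop = [[random.random() for _ in range(GENES)] for _ in range(POP)]
        for _ in range(GENERATIONS):
            pop.sort(key=fitness, reverse=True)
            elite = pop[: POP // 4]            # stands in for designer-picked candidates
            pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                           for _ in range(POP - len(elite))]

        best = max(pop, key=fitness)
        print(f"best agent score: {fitness(best):.4f}")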

  16. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  17. Computer-Based Decision Support for Railroad Transportation Systems: an Investment Case Study

    Directory of Open Access Journals (Sweden)

    Luminita DUTA

    2009-01-01

    Full Text Available In the last decade the development of economic and social life has increased the complexity of transportation systems. In this context, the role of Decision Support Systems (DSS) became more and more important. The paper presents the characteristics, necessity, and usage of DSS in transportation and describes a practical application in the railroad field. To compute the optimal transportation capacity and flow on a certain railroad, specialized decision-support software available on the market was used.
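
    To make the optimisation step concrete, here is a hedged sketch of the kind of capacity/flow computation such a DSS performs, cast as a maximum-flow problem with networkx; the stations, capacities, and units are invented, since the paper's network and software are not public.

        # Sketch only: a toy rail network with per-line capacities (trains/day).
        import networkx as nx

        G = nx.DiGraph()
        edges = [
            ("Depot", "A", 12), ("Depot", "B", 8),
            ("A", "Hub", 10), ("B", "Hub", 8),
            ("A", "B", 4), ("Hub", "Terminal", 15),
        ]
        for u, v, cap in edges:
            G.add_edge(u, v, capacity=cap)

        flow_value, flow_dict = nx.maximum_flow(G, "Depot", "Terminal")
        print(f"optimal daily capacity: {flow_value} trains")
        for u, targets in flow_dict.items():
            for v, f in targets.items():
                if f > 0:
                    print(f"  {u} -> {v}: {f}")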

  18. The semiotics of control and modeling relations in complex systems.

    Science.gov (United States)

    Joslyn, C

    2001-01-01

    We provide a conceptual analysis of ideas and principles from the systems theory discourse which underlie Pattee's semantic or semiotic closure, which is itself foundational for a school of theoretical biology derived from systems theory and cybernetics, and is now being related to biological semiotics and explicated in the relational biological school of Rashevsky and Rosen. Atomic control systems and models are described as the canonical forms of semiotic organization, sharing measurement relations, but differing topologically in that control systems are circularly and models linearly related to their environments. Computation in control systems is introduced, motivating hierarchical decomposition, hybrid modeling and control systems, and anticipatory or model-based control. The semiotic relations in complex control systems are described in terms of relational constraints, and rules and laws are distinguished as contingent and necessary functional entailments, respectively. Finally, selection as a meta-level of constraint is introduced as the necessary condition for semantic relations in control systems and models.

  19. Context-aware computing and self-managing systems

    CERN Document Server

    Dargie, Waltenegus

    2009-01-01

    Bringing together an extensively researched area with an emerging research issue, Context-Aware Computing and Self-Managing Systems presents the core contributions of context-aware computing in the development of self-managing systems, including devices, applications, middleware, and networks. The expert contributors reveal the usefulness of context-aware computing in developing autonomous systems that have practical application in the real world.The first chapter of the book identifies features that are common to both context-aware computing and autonomous computing. It offers a basic definit

  20. Computational modelling of oxygenation processes in enzymes and biomimetic model complexes

    OpenAIRE

    de Visser, Sam P.; Quesne, Matthew G.; Martin, Bodo; Comba, Peter; Ryde, Ulf

    2014-01-01

    With computational resources becoming more efficient and more powerful and at the same time cheaper, computational methods have become more and more popular for studies on biochemical and biomimetic systems. Although large efforts from the scientific community have gone into exploring the possibilities of computational methods on large biochemical systems, such studies are not without pitfalls and often cannot be routinely done but require expert execution. In this review we summarize and hig...

  1. What Is a Complex Innovation System?

    Science.gov (United States)

    Katz, J. Sylvan

    2016-01-01

    Innovation systems are sometimes referred to as complex systems, something that is intuitively understood but poorly defined. A complex system dynamically evolves in non-linear ways, giving it unique properties that distinguish it from other systems. In particular, a common signature of complex systems is scale-invariant emergent properties. A scale-invariant property can be identified because it is solely described by a power law function, f(x) = kx^α, where the exponent, α, is a measure of scale-invariance. The focus of this paper is to describe and illustrate that innovation systems have properties of a complex adaptive system, in particular scale-invariant emergent properties that are indicative of their complex nature and that can be quantified and used to inform public policy. The global research system is an example of an innovation system. Peer-reviewed publications containing knowledge are a characteristic output. Citations or references to these articles are an indirect measure of the impact the knowledge has on the research community. Peer-reviewed papers indexed in Scopus and in the Web of Science were used as data sources to produce measures of sizes and impact. These measures are used to illustrate how scale-invariant properties can be identified and quantified. It is demonstrated that the distribution of impact has a reasonable likelihood of being scale-invariant, with scaling exponents that tended toward a value of less than 3.0 with the passage of time and decreasing group sizes. Scale-invariant correlations are shown between the evolution of impact and size with time, and between field impact and sizes at points in time. The recursive or self-similar nature of scale-invariance suggests that any smaller innovation system within the global research system is likely to be complex, with scale-invariant properties too. PMID:27258040
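
    A quick sketch of how such a scaling exponent can be estimated (synthetic citation counts stand in for the Scopus/Web of Science data; a least-squares fit in log-log space is the simplest estimator, though a maximum-likelihood fit such as Clauset-Shalizi-Newman is preferable for real distributions):

        # Sketch only: fit the exponent of f(x) = k * x^alpha on synthetic data.
        import numpy as np

        rng = np.random.default_rng(42)
        citations = np.round(rng.pareto(a=1.8, size=5000) + 1).astype(int)

        values, counts = np.unique(citations, return_counts=True)
        logx, logy = np.log(values), np.log(counts / counts.sum())
        slope, intercept = np.polyfit(logx, logy, 1)   # line in log-log space
        print(f"estimated scaling exponent: {-slope:.2f}")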

  2. What Is a Complex Innovation System?

    Directory of Open Access Journals (Sweden)

    J Sylvan Katz

    Full Text Available Innovation systems are sometimes referred to as complex systems, something that is intuitively understood but poorly defined. A complex system dynamically evolves in non-linear ways, giving it unique properties that distinguish it from other systems. In particular, a common signature of complex systems is scale-invariant emergent properties. A scale-invariant property can be identified because it is solely described by a power law function, f(x) = kx^α, where the exponent, α, is a measure of scale-invariance. The focus of this paper is to describe and illustrate that innovation systems have properties of a complex adaptive system, in particular scale-invariant emergent properties that are indicative of their complex nature and that can be quantified and used to inform public policy. The global research system is an example of an innovation system. Peer-reviewed publications containing knowledge are a characteristic output. Citations or references to these articles are an indirect measure of the impact the knowledge has on the research community. Peer-reviewed papers indexed in Scopus and in the Web of Science were used as data sources to produce measures of sizes and impact. These measures are used to illustrate how scale-invariant properties can be identified and quantified. It is demonstrated that the distribution of impact has a reasonable likelihood of being scale-invariant, with scaling exponents that tended toward a value of less than 3.0 with the passage of time and decreasing group sizes. Scale-invariant correlations are shown between the evolution of impact and size with time, and between field impact and sizes at points in time. The recursive or self-similar nature of scale-invariance suggests that any smaller innovation system within the global research system is likely to be complex, with scale-invariant properties too.

  3. Synchronization and emergence in complex systems

    Indian Academy of Sciences (India)

    ... complex systems. Fatihcan M Atay. Synchronization, Coupled Systems and Networks Volume 77 Issue 5 November 2011 pp 855-863 ... We show how novel behaviour can emerge in complex systems at the global level through synchronization of the activities of their constituent units. Two mechanisms are suggested for ...

  4. Infinite Particle Systems: Complex Systems III

    Directory of Open Access Journals (Sweden)

    Editorial Board

    2008-06-01

    Full Text Available In the years 2002-2005, a group of German and Polish mathematicians worked under DFG research project No 436 POL 113/98/0-1, entitled "Methods of stochastic analysis in the theory of collective phenomena: Gibbs states and statistical hydrodynamics". The results of their study were summarized at a German-Polish conference, which took place in Poland in October 2005. The venue of the conference was Kazimierz Dolny upon Vistula - a lovely town and a popular place for various cultural, scientific, and even political events of international significance. The conference was also attended by scientists from France, Italy, Portugal, UK, Ukraine, and USA, which gave it an international character. Since that time, the conference, entitled "Infinite Particle Systems: Complex Systems", has become an annual international event, attended by leading scientists from Germany, Poland and many other countries. The present volume of "Condensed Matter Physics" contains the proceedings of the conference "Infinite Particle Systems: Complex Systems III", which took place in June 2007.

  5. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But even if computational bio-models themselves are represented precisely in terms of mathematical expressions, their full meaning is not yet formally specified and is only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: on the one hand, it is a mathematical expression which can be used in computational simulations (intrinsic meaning); on the other hand, the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework, we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: first, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  6. Complex biological and bio-inspired systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    The understanding and characterization of the fundamental processes of the function of biological systems underpins many of the important challenges facing American society, from the pathology of infectious disease and the efficacy of vaccines to the development of materials that mimic biological functionality and deliver exceptional and novel structural and dynamic properties. These problems are fundamentally complex, involving many interacting components and poorly understood bio-chemical kinetics. We use the basic science of statistical physics, kinetic theory, cellular bio-chemistry, soft-matter physics, and information science to develop cell-level models and explore the use of biomimetic materials. This project seeks to determine how cell-level processes, such as response to mechanical stresses, chemical constituents and related gradients, and other cell signaling mechanisms, integrate and combine to create a functioning organism. The research focuses on the basic physical processes that take place at different levels of the biological organism: the basic role of molecular and chemical interactions is investigated, the dynamics of the DNA molecule and its phylogenetic role are examined, and the regulatory networks of complex biochemical processes are modeled. These efforts may lead to early-warning algorithms for pathogen outbreaks and new bio-sensors to detect hazards ranging from pathomic viruses to chemical contaminants. Other potential applications include the development of efficient bio-fuel alternative-energy processes and the exploration of novel materials for energy uses. Finally, we use the notion of 'coarse-graining', a method for averaging over less important degrees of freedom, to develop computational models to predict cell function and systems-level response to disease, chemical stress, or biological pathomic agents. This project supports Energy Security, Threat Reduction, and the missions of the DOE Office of Science through its efforts to

  7. Incorporating Parallel Computing into the Goddard Earth Observing System Data Assimilation System (GEOS DAS)

    Science.gov (United States)

    Larson, Jay W.

    1998-01-01

    Atmospheric data assimilation is a method of combining actual observations with model forecasts to produce a more accurate description of the earth system than the observations or forecast alone can provide. The outputs of data assimilation, sometimes called the analysis, are regular, gridded datasets of observed and unobserved variables. Analysis plays a key role in numerical weather prediction and is becoming increasingly important for climate research. These applications, and the need for timely validation of scientific enhancements to the data assimilation system, pose computational demands that are best met by distributed parallel software. The mission of the NASA Data Assimilation Office (DAO) is to provide datasets for climate research and to support NASA satellite and aircraft missions. The system used to create these datasets is the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The core components of the GEOS DAS are: the GEOS General Circulation Model (GCM), the Physical-space Statistical Analysis System (PSAS), the Observer, the on-line Quality Control (QC) system, the Coupler (which feeds analysis increments back to the GCM), and an I/O package for processing the large amounts of data the system produces (which will be described in another presentation in this session). The discussion will center on the following issues: the computational complexity of the whole GEOS DAS, assessment of the performance of the individual elements of GEOS DAS, and the parallelization strategy for some of the components of the system.

  8. From System Complexity to Emergent Properties

    CERN Document Server

    Aziz-Alaoui, M. A

    2009-01-01

    Emergence and complexity refer to the appearance of higher-level properties and behaviours of a system that arise from the collective dynamics of that system's components. These properties are not directly deducible from the lower-level motion of the system. Emergent properties are properties of the "whole" that are not possessed by any of the individual parts making up that whole. Such phenomena exist in various domains and can be described using complexity concepts and domain knowledge. This book highlights complexity modelling through dynamical or behavioural systems. The pluridisciplinary approaches developed across the chapters make it possible to draw links between a wide range of fundamental and applied sciences. Developing such links - instead of focusing on specific and narrow research topics - is characteristic of the science of complexity that we try to promote with this contribution.

  9. Complex Systems Design & Management : Proceedings of the Third International Conference on Complex Systems Design & Management

    CERN Document Server

    Caseau, Yves; Krob, Daniel; Rauzy, Antoine

    2013-01-01

    This book contains all refereed papers that were accepted to the third edition of the « Complex Systems Design & Management » (CSD&M 2012) international conference that took place in Paris (France) from December 12-14, 2012. (Website: http://www.csdm2012.csdm.fr)  These proceedings cover the most recent trends in the emerging field of complex systems sciences & practices from an industrial and academic perspective, including the main industrial domains (transport, defense & security, electronics, energy & environment, e-services), scientific & technical topics (systems fundamentals, systems architecture & engineering, systems metrics & quality, systemic tools) and system types (transportation systems, embedded systems, software & information systems, systems of systems, artificial ecosystems). The CSD&M 2012 conference is organized under the guidance of the CESAMES non-profit organization (http://www.cesames.net).

  10. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  11. Theory of computational complexity

    CERN Document Server

    Du, Ding-Zhu

    2011-01-01

    DING-ZHU DU, PhD, is a professor in the Department of Computer Science at the University of Minnesota. KER-I KO, PhD, is a professor in the Department of Computer Science at the State University of New York at Stony Brook.

  12. Essentials and Perspectives of Computational Modelling Assistance for CNS-oriented Nanoparticle-based Drug Delivery Systems.

    Science.gov (United States)

    Kisała, Joanna; Heclik, Kinga I; Pogocki, Krzysztof; Pogocki, Dariusz

    2018-05-16

    The blood-brain barrier (BBB) is a complex system controlling two-way traffic of substances between the circulatory (cardiovascular) system and the central nervous system (CNS). It is almost perfectly crafted to regulate brain homeostasis and to permit selective transport of molecules that are essential for brain function. For potential drug candidates, whether CNS-oriented neuropharmaceuticals or drugs whose primary targets lie in the periphery, the extent to which a substance in the circulation gains access to the CNS is crucial. With the advent of nanopharmacology, the problem of BBB permeability for drug nano-carriers gains new significance. Compared to some other fields of medicinal chemistry, the computational science of nanodelivery is still too immature to offer black-box solutions, especially for the BBB case. However, even this enormous complexity can be spelled out in terms of physical principles and, as such, subjected to computation. A basic understanding of the various physico-chemical parameters describing brain uptake is required to take advantage of them for BBB nanodelivery. This mini-review provides a brief introduction to the essential concepts needed to apply computational simulation to the design of BBB nanodelivery.

  13. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains how a large number of computers can participate in a single computation and describes a test of the functionality of the system. (Author/LRW)

  14. A computer program to reduce the time of diagnosis in complex systems

    International Nuclear Information System (INIS)

    Arellano-Gomez, Juan; Romero-Rubio, Omar U.

    2006-01-01

    In Nuclear Power Plants (NPPs), the time that some systems are allowed to be down is frequently very limited. Thus, when one of these systems fails, diagnosis and repair must be performed quickly in order to get the system back to an operative state. Unfortunately, in complex systems, a considerable amount of the total repair time can be spent on diagnosis. For this reason, it is very useful to provide maintenance personnel with a systematic approach to system failure diagnosis, capable of minimizing the time required to effectively identify the causes of system malfunction. In this context, expert systems technology has been widely applied in several disciplines to successfully develop diagnostic systems. An important input to develop these expert systems is, of course, knowledge; this knowledge includes both formal knowledge and experience on what faults could occur, how these faults occur, what their effects are, what can be inferred from symptoms, etc. Due to their logical nature, the fault trees developed by expert analysts during risk studies can also be used as the source of knowledge of diagnostic expert systems (DES); however, these fault trees must be expanded to include symptoms because, typically, diagnosis is performed by inferring the causes of system malfunction from symptoms. This paper presents SANA (Symptom Analyzer), a new software package specially designed to develop diagnostic expert systems. The main feature of this software is that it exploits the knowledge stored in fault trees (in particular, expanded fault trees) to generate very efficient diagnostic strategies. These strategies guide diagnostic activities, seeking to minimize the time required to identify the components responsible for the system failure. In addition, the generated strategies 'emulate' the way experienced technicians proceed in order to diagnose the causes of system failure (i.e. by recognizing categories of
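
    The core idea above, turning a fault tree into a time-efficient diagnostic strategy, can be illustrated with a short sketch. The Python fragment below is not SANA itself: the tree, failure probabilities and test times are hypothetical, and the ranking heuristic (failure probability per unit test time) is just one plausible way to order checks.

      # A minimal sketch of fault-tree-guided diagnosis (not the SANA tool itself).
      # The tree is a dict of gates; cut sets are expanded recursively (a
      # MOCUS-style expansion, minimization step omitted), and candidate checks
      # are ranked by failure probability per minute of testing. All gate names,
      # probabilities and test times are hypothetical.
      TREE = {
          "system_fails": ("OR", ["pump_branch", "valve_stuck"]),
          "pump_branch": ("AND", ["pump_motor", "backup_pump"]),
      }
      P = {"pump_motor": 0.05, "backup_pump": 0.10, "valve_stuck": 0.02}  # failure prob.
      T = {"pump_motor": 30.0, "backup_pump": 10.0, "valve_stuck": 5.0}   # minutes to test

      def cut_sets(event):
          """Expand an event into cut sets (sets of basic events)."""
          if event not in TREE:                    # basic event
              return [frozenset([event])]
          kind, children = TREE[event]
          child_sets = [cut_sets(c) for c in children]
          if kind == "OR":                         # union of the children's cut sets
              return [cs for sets in child_sets for cs in sets]
          result = [frozenset()]                   # AND: cross-product of cut sets
          for sets in child_sets:
              result = [a | b for a in result for b in sets]
          return result

      def ranked_checks(top="system_fails"):
          """Order basic-event checks by failure probability per minute of testing."""
          candidates = {e for cs in cut_sets(top) for e in cs}
          return sorted(candidates, key=lambda e: P[e] / T[e], reverse=True)

      print(ranked_checks())  # ['backup_pump', 'valve_stuck', 'pump_motor']

    A real diagnostic strategy would also update the ranking as test results come in; the sketch only produces the initial ordering.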

  15. The Meaning of System: Towards a Complexity Orientation in Systems Thinking

    DEFF Research Database (Denmark)

    Leleur, Steen

    2014-01-01

    This article reviews the generic meaning of 'system' and complements more conventional system notions with a system perception based on recent complexity theory. With system as the core concept of systems theory, its actual meaning is not just of theoretical interest but is highly relevant also when applied to complex real-world problems. As regards systems practice, it is found that selective use and combination of five presented research approaches (functionalist, interpretive, emancipatory, postmodern and complexity), which function as different but complementing 'epistemic lenses' in a process ... for systems practice. It is argued that complexity theory and thinking, with reference to Luhmann among others, ought to be recognised and paid attention to by the systems community. Overall, it is found that a complexity orientation may contribute to extending and enriching the explanatory power of current systems theory.

  16. 'Micro-8' micro-computer system

    International Nuclear Information System (INIS)

    Yagi, Hideyuki; Nakahara, Yoshinori; Yamada, Takayuki; Takeuchi, Norio; Koyama, Kinji

    1978-08-01

    The micro-computer Micro-8 system has been developed to organize a data exchange network between various instruments and a computer group including a large computer system. Used for packet exchangers and terminal controllers, the system consists of ten kinds of standard boards, including a CPU board with an INTEL-8080 single-chip processor. The CPU architecture, BUS architecture, interrupt control and standard-board functions are explained with circuit block diagrams. Operations of the basic I/O device, digital I/O board and communication adapter are described with definitions of the interrupt ramp status, I/O command, I/O mask, data register, etc. The appendices contain circuit drawings, INTEL-8080 micro-processor specifications, BUS connections, I/O address mappings, jumper connections for address selection, and interface connections. (author)

  17. Data integration, systems approach and multilevel description of complex biosystems

    International Nuclear Information System (INIS)

    Hernández-Lemus, Enrique

    2013-01-01

    Recent years have witnessed the development of new quantitative approaches and theoretical tenets in the biological sciences. The advent of high-throughput experiments in genomics, proteomics and electrophysiology (to cite just a few examples) has provided researchers with unprecedented amounts of data to be analyzed. Large datasets, however, cannot by themselves provide a complete understanding of the underlying biological phenomena, unless they are supported by a solid theoretical framework and proper analytical tools. It is now widely accepted that, by using and extending some of the paradigmatic principles of what has been called complex systems theory, some degree of advance in this direction can be attained. We present ways in which data integration techniques (linear, non-linear, combinatorial, graphical), multidimensional multilevel descriptions (multifractal modeling, dimensionality reduction, computational learning) and an approach based on systems theory (interaction maps, probabilistic graphical models, non-equilibrium physics) have allowed us to better understand some problems at the interface of Statistical Physics and Computational Biology.

  18. Cyber Security on Nuclear Power Plant's Computer Systems

    International Nuclear Information System (INIS)

    Shin, Ick Hyun

    2010-01-01

    Computer systems are used in many different fields of industry, and most of us benefit greatly from them. Because of their effectiveness and performance, we have become heavily dependent on computers; and the more dependent we become, the greater the risk we face when a computer system is unavailable, inaccessible or uncontrollable. SCADA (Supervisory Control And Data Acquisition) systems are broadly used in critical infrastructure such as transportation, electricity and water management, and if a SCADA system is vulnerable to cyber attack, the result could be a national disaster. In particular, if a nuclear power plant's main control systems were attacked by cyber terrorists, the consequences could be enormous: the release of radioactive material could be achieved without the use of physical force. In this paper, different types of cyber attacks are described, and a possible structure of an NPP's computer network system is presented. The paper also discusses possible ways in which the NPP's computer system could be compromised, along with some suggestions for protection against cyber attacks.

  19. Potential of Cognitive Computing and Cognitive Systems

    Science.gov (United States)

    Noor, Ahmed K.

    2015-01-01

    Cognitive computing and cognitive technologies are game changers for future engineering systems, as well as for engineering practice and training. They are major drivers for knowledge automation work and for the creation of cognitive products with higher levels of intelligence than current smart products. This paper gives a brief review of cognitive computing and some of the cognitive engineering systems activities. The potential of cognitive technologies is outlined, along with a brief description of future cognitive environments incorporating cognitive assistants - specialized proactive intelligent software agents designed to follow and interact with humans and other cognitive assistants across the environments. The cognitive assistants engage, individually or collectively, with humans through a combination of adaptive multimodal interfaces and advanced visualization and navigation techniques. The realization of future cognitive environments requires the development of a cognitive innovation ecosystem for the engineering workforce. The continuously expanding major components of the ecosystem include integrated knowledge discovery and exploitation facilities (incorporating predictive and prescriptive big data analytics); novel cognitive modeling and visual simulation facilities; cognitive multimodal interfaces; and cognitive mobile and wearable devices. The ecosystem will provide timely, engaging, personalized/collaborative learning and effective decision making. It will stimulate creativity and innovation, and prepare the participants to work in future cognitive enterprises and develop new cognitive products of increasing complexity. http://www.aee.odu.edu/cognitivecomp

  20. MOLNs: A Cloud Platform for Interactive, Reproducible, and Scalable Spatial Stochastic Computational Experiments in Systems Biology Using PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
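
    The Monte Carlo workflows that MOLNs distributes are built from many independent stochastic simulation runs. As a hedged illustration (this is not the PyURDME API; PyURDME adds a spatial, mesh-based layer on top of this kind of kinetics), here is a minimal well-mixed Gillespie direct-method trajectory in Python, with an illustrative birth-death model:

      # Gillespie's direct method for a well-mixed system; model and rate
      # constants below are illustrative only.
      import random

      def gillespie(x, rates, stoich, t_end):
          """One trajectory; x is the vector of species counts."""
          t, traj = 0.0, [(0.0, list(x))]
          while t < t_end:
              a = [r(x) for r in rates]                  # reaction propensities
              a0 = sum(a)
              if a0 == 0.0:
                  break                                  # no reaction can fire
              t += random.expovariate(a0)                # exponential waiting time
              u, k = random.uniform(0.0, a0), 0
              while k < len(a) - 1 and u > a[k]:         # pick reaction k w.p. a_k/a0
                  u -= a[k]
                  k += 1
              x = [xi + d for xi, d in zip(x, stoich[k])]
              traj.append((t, list(x)))
          return traj

      # Birth-death model: 0 -> A at rate 5.0; A -> 0 at rate 0.1 per molecule.
      rates = [lambda x: 5.0, lambda x: 0.1 * x[0]]
      stoich = [(+1,), (-1,)]
      print(gillespie([0], rates, stoich, t_end=100.0)[-1])  # count near 5.0/0.1 = 50

    Because each trajectory is independent, scaling out over a cluster, which is what MOLNs automates, amounts to running this loop many times with different random seeds.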

  1. Embracing uncertainty, managing complexity: applying complexity thinking principles to transformation efforts in healthcare systems.

    Science.gov (United States)

    Khan, Sobia; Vandermorris, Ashley; Shepherd, John; Begun, James W; Lanham, Holly Jordan; Uhl-Bien, Mary; Berta, Whitney

    2018-03-21

    Complexity thinking is increasingly being embraced in healthcare, which is often described as a complex adaptive system (CAS). Applying CAS to healthcare as an explanatory model for understanding the nature of the system, and to stimulate changes and transformations within the system, is valuable. A seminar series on systems and complexity thinking hosted at the University of Toronto in 2016 offered a number of insights on applications of CAS perspectives to healthcare that we explore here. We synthesized topics from this series into a set of six insights on how complexity thinking fosters a deeper understanding of accepted ideas in healthcare, applications of CAS to actors within the system, and paradoxes in applications of complexity thinking that may require further debate: 1) a complexity lens helps us better understand the nebulous term "context"; 2) concepts of CAS may be applied differently when actors are cognizant of the system in which they operate; 3) actor responses to uncertainty within a CAS are a mechanism for emergent and intentional adaptation; 4) acknowledging complexity supports patient-centred intersectional approaches to patient care; 5) complexity perspectives can support ways that leaders manage change (and transformation) in healthcare; and 6) complexity demands different ways of implementing ideas and assessing the system. To enhance our exploration of key insights, we augmented the knowledge gleaned from the series with key articles on complexity in the literature. Ultimately, complexity thinking acknowledges the "messiness" that we seek to control in healthcare and encourages us to embrace it. This means seeing challenges as opportunities for adaptation, stimulating innovative solutions to ensure positive adaptation, leveraging the social system to enable ideas to emerge and spread across the system, and, even more importantly, acknowledging that these adaptive actions are part of system behaviour just as much as periods of stability are. By

  2. Universal blind quantum computation for hybrid system

    Science.gov (United States)

    Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang

    2017-08-01

    As progress on building quantum computers continues, first-generation practical quantum computers will become available to ordinary users in the cloud, much like IBM's Quantum Experience today. Clients will be able to access the quantum servers remotely using simple devices. In such a situation, it is of prime importance to protect the security of the client's information. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for individual quantum systems. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible route to scalable blind quantum computation.

  3. A general method for computing the total solar radiation force on complex spacecraft structures

    Science.gov (United States)

    Chan, F. K.

    1981-01-01

    The method circumvents many of the existing difficulties in computational logic presently encountered in the direct analytical or numerical evaluation of the appropriate surface integral. It may be applied to complex spacecraft structures for computing the total force arising from either specular or diffuse reflection or even from non-Lambertian reflection and re-radiation.
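
    Per facet, the surface integral reduces to the standard flat-plate radiation-force model, and the total force on a structure is approximated by summing over facets. A hedged sketch of that model in Python (the panel areas, normals and reflectivities are hypothetical, and self-shadowing between facets is ignored):

      # Standard per-facet solar radiation force model (absorbed + specular +
      # diffuse components); the paper's contribution is evaluating the full
      # surface integral, which a facet loop only approximates.
      import numpy as np

      P_SUN = 4.56e-6  # solar radiation pressure at 1 AU, N/m^2

      def facet_force(area, normal, sun_dir, rho_spec, rho_diff):
          """Radiation force on one flat facet; sun_dir points from facet to Sun."""
          n = normal / np.linalg.norm(normal)
          s = sun_dir / np.linalg.norm(sun_dir)
          cos_t = np.dot(n, s)
          if cos_t <= 0.0:                 # facet faces away from the Sun
              return np.zeros(3)
          return -P_SUN * area * cos_t * (
              (1.0 - rho_spec) * s + 2.0 * (rho_spec * cos_t + rho_diff / 3.0) * n
          )

      # Total force on a structure = sum over its facets (two hypothetical panels).
      panels = [(2.0, np.array([0.0, 0.0, 1.0]), 0.3, 0.1),
                (1.5, np.array([0.0, 1.0, 1.0]), 0.0, 0.4)]
      sun = np.array([0.0, 0.0, 1.0])
      total = sum(facet_force(a, n, sun, rs, rd) for a, n, rs, rd in panels)
      print(total)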

  4. Systems Approach to Tourism: A Methodology for Defining Complex Tourism System

    Directory of Open Access Journals (Sweden)

    Jere Jakulin Tadeja

    2017-08-01

    Background and Purpose: This paper discusses the complexity of the tourism system and its modelling in the framework of system dynamics. The phenomenon of tourism, which possesses the typical properties of global and local organisations, is presented as an open complex system with all its elements, together with an optimal methodology for explaining the relations among them. The approach presented here is, owing to its transparency, an excellent tool for finding systemic solutions and also serves as a strategic decision-making assessment. We present systems complexity and develop three models of a complex tourism system: the first presents tourism as an open complex system whose elements operate inside a tourism market area. The elements of this system constitute subsystems, whose relations and interdependencies are explained with two models: a causal-loop diagram and a simulation model in the framework of system dynamics.
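
    To make the stock-and-flow idea concrete, a single-stock sketch is given below. It is purely hypothetical and far simpler than the paper's models: one "annual visitors" stock, a reinforcing word-of-mouth inflow, and a balancing crowding feedback against a fixed destination capacity, integrated with simple Euler steps:

      # One-stock system-dynamics sketch: "visitors" grows by word of mouth
      # (reinforcing loop) and is limited by crowding relative to destination
      # capacity (balancing loop). All parameters are hypothetical.
      def simulate(years=30.0, dt=0.25):
          visitors, capacity, growth = 10_000.0, 100_000.0, 0.15
          history = []
          for _ in range(int(years / dt)):
              crowding = visitors / capacity           # balancing feedback, 0..1
              inflow = growth * visitors * (1.0 - crowding)
              visitors += inflow * dt                  # Euler integration step
              history.append(visitors)
          return history

      traj = simulate()
      print(f"year 10: {traj[int(10 / 0.25) - 1]:,.0f} visitors")
      print(f"year 30: {traj[-1]:,.0f} visitors (approaching capacity)")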

  5. An LISP interpreter for a PDP-15 computer system

    International Nuclear Information System (INIS)

    Gonzalez Navidad, R.

    1974-01-01

    LISP processor for a PDP-15 computer system. The object of this work is to use the LISP language as an integral part of a PDP-15 computer system. LISP is a programming language designed for processing lists, from which it gets its name (LISt Processing). The most important aspects of LISP are as follows: (1) it is a formal mathematical language based on the theory of recursive functions; (2) it is geared to processing symbolic rather than numerical data; (3) programs written in LISP are themselves lists. The techniques employed for this application were based on the characteristics and capabilities of the PDP-15 system. The interpreter offers a basic minimum set of LISP functions from which more complex functions can be constructed. A detailed description of each of the units comprising the interpreter provides the reader with a useful introduction to the LISP language. Among the results obtained with this interpreter are the construction of a set of auxiliary functions for the system and the simulation of a Turing machine based on this set of auxiliary functions written in LISP, with which a beginning has been made in setting up a function library, exploiting the capacity of LISP as a self-generating function language. Hence the LISP processor can be used to solve problems of algebraic manipulation, system simulation, group theory, modern algebra, data classification and processing, and other common logical problems. (author)
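
    The heart of any such interpreter is the recursive eval/apply cycle over list structures. As an illustration only (the PDP-15 implementation described above follows the same principle but is not reproduced here), a minimal S-expression evaluator in the classic McCarthy style can be sketched in Python:

      # A minimal sketch of the eval/apply core at the heart of a LISP
      # interpreter; illustrative only, not the PDP-15 implementation.
      import operator as op

      def tokenize(src):
          return src.replace("(", " ( ").replace(")", " ) ").split()

      def parse(tokens):
          tok = tokens.pop(0)
          if tok == "(":
              lst = []
              while tokens[0] != ")":
                  lst.append(parse(tokens))
              tokens.pop(0)                      # drop ')'
              return lst
          try:
              return int(tok)
          except ValueError:
              return tok                         # a symbol

      ENV = {"+": op.add, "-": op.sub, "*": op.mul, "car": lambda x: x[0],
             "cdr": lambda x: x[1:], "cons": lambda a, d: [a] + d}

      def evaluate(x, env=ENV):
          if isinstance(x, str):                 # symbol lookup
              return env[x]
          if not isinstance(x, list):            # literal number
              return x
          if x[0] == "quote":
              return x[1]
          if x[0] == "lambda":                   # (lambda (params) body)
              _, params, body = x
              return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
          f, *args = [evaluate(e, env) for e in x]
          return f(*args)

      print(evaluate(parse(tokenize("((lambda (n) (* n (+ n 1))) 6)"))))  # 42

    With car, cdr and cons in the environment, the same evaluator already supports the kind of list-manipulating auxiliary functions mentioned above.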

  6. Torness computer system turns round data

    International Nuclear Information System (INIS)

    Dowler, E.; Hamilton, J.

    1989-01-01

    The Torness nuclear power station has two advanced gas-cooled reactors. A key feature is the distributed computer system, which covers both data processing and automatic control. The complete computer system has over 80 processors with 45000 digital and 22000 analogue input signals. The on-line control and monitoring systems include operating systems, plant data acquisition and processing, alarm and event detection, communications software, process management systems and database management software. Some features of the system are described. (UK)

  7. Application of computational intelligence techniques for load shedding in power systems: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Bakar, A.H.A.; Mohamad, Hasmaini

    2013-01-01

    Highlights: • The power system blackout history of the last two decades is presented. • Conventional load shedding techniques, their types and limitations are presented. • Applications of intelligent techniques in load shedding are presented. • Intelligent techniques include ANN, fuzzy logic, ANFIS, genetic algorithms and PSO. • A discussion and comparison of these techniques is provided. - Abstract: Recent blackouts around the world question the reliability of conventional and adaptive load shedding techniques in avoiding such power outages. To address this issue, reliable techniques are required to provide fast and accurate load shedding to prevent collapse in the power system. Computational intelligence techniques, due to their robustness and flexibility in dealing with complex non-linear systems, could be an option in addressing this problem. Computational intelligence includes techniques like artificial neural networks, genetic algorithms, fuzzy logic control, adaptive neuro-fuzzy inference systems, and particle swarm optimization. Research in these techniques is being undertaken in order to discover means for more efficient and reliable load shedding. This paper provides an overview of these techniques as applied to load shedding in a power system. This paper also compares the advantages of computational intelligence techniques over conventional load shedding techniques. Finally, this paper discusses the limitations of computational intelligence techniques, which restrict their usage in real-time load shedding.
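
    For context, the conventional baseline that these intelligent techniques aim to improve on can be sketched in a few lines: a staged under-frequency load shedding (UFLS) scheme that trips fixed blocks of load at fixed frequency thresholds. The thresholds and shed fractions below are purely illustrative, not taken from any grid code:

      # Staged under-frequency load shedding: each stage trips once when the
      # measured frequency falls to its threshold. Thresholds and fractions
      # below are illustrative only.
      STAGES = [(59.3, 0.10), (58.9, 0.15), (58.5, 0.20)]  # (Hz, load fraction)

      def run_ufls(freq_trace):
          tripped, shed = set(), 0.0
          for f in freq_trace:
              for threshold, fraction in STAGES:
                  if f <= threshold and threshold not in tripped:
                      tripped.add(threshold)       # a stage trips only once
                      shed += fraction
                      print(f"f={f} Hz: stage {threshold} trips, total shed {shed:.0%}")
          return shed

      run_ufls([59.8, 59.2, 58.7, 59.1])   # dips through the first two stages

    Adaptive and computational intelligence schemes replace these fixed thresholds and fractions with quantities estimated online (for example, from the rate of change of frequency or from a trained model), which is precisely what the reviewed techniques automate.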

  8. Complexity: Outline of the NWO strategic theme Dynamics of complex systems

    NARCIS (Netherlands)

    Burgers, G.; Doelman, A.; Frenken, K.; Hogeweg, P.; Hommes, C.; van der Maas, H.; Mulder, B.; Stam, K.; van Steen, M.; Zandee, L.

    2008-01-01

    Dynamics of complex systems is one of the program 5 themes in the NWO (Netherlands Organisation for Scientific Research) strategy for the years 2007-2011. The ambition of the current proposal is to initiate integrated activities in the field of complex systems within the Netherlands, to provide

  10. Computing Preferred Extensions for Argumentation Systems with Sets of Attacking Arguments

    DEFF Research Database (Denmark)

    Nielsen, Søren Holbech; Parsons, Simon

    2006-01-01

    The hitherto most abstract, and hence most general, argumentation system is the one described by Dung in a paper from 1995. This framework does not allow for joint attacks on arguments, but in a recent paper we adapted it to support such attacks, and proved that this adapted framework enjoys the same formal properties as that of Dung. One problem posed by Dung's original framework, which was neglected for some time, is how to compute the preferred extensions of the argumentation systems. In 2001, in a paper by Doutre and Mengin, a procedure was given for enumerating preferred extensions for these systems. In this paper we propose a method for enumerating preferred extensions of the potentially more complex systems where joint attacks are allowed. The method is inspired by the one given by Doutre and Mengin.
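
    For small frameworks, preferred extensions can be enumerated directly from the definitions (conflict-freeness, defence, admissibility, maximality). The brute-force baseline below operates on a hypothetical three-argument framework with ordinary, not joint, attacks; it is neither the Doutre-Mengin procedure nor the joint-attack method proposed in the paper, only the definitions applied naively:

      # Brute-force enumeration of preferred extensions of a standard Dung
      # framework; the framework itself is hypothetical.
      from itertools import combinations

      ARGS = {"a", "b", "c"}
      ATTACKS = {("a", "b"), ("b", "a"), ("b", "c")}

      def conflict_free(s):
          return not any((x, y) in ATTACKS for x in s for y in s)

      def defends(s, arg):
          """s defends arg if every attacker of arg is attacked by some member of s."""
          attackers = {x for (x, y) in ATTACKS if y == arg}
          return all(any((z, x) in ATTACKS for z in s) for x in attackers)

      def admissible(s):
          return conflict_free(s) and all(defends(s, a) for a in s)

      def preferred():
          """Preferred extensions = maximal (w.r.t. set inclusion) admissible sets."""
          sets = [frozenset(c) for r in range(len(ARGS) + 1)
                  for c in combinations(sorted(ARGS), r) if admissible(frozenset(c))]
          return [s for s in sets if not any(s < t for t in sets)]

      print([sorted(s) for s in preferred()])   # [['b'], ['a', 'c']]

    The exponential enumeration is exactly what dedicated procedures such as Doutre and Mengin's avoid exploring in full.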

  11. A computer-based training system combining virtual reality and multimedia

    International Nuclear Information System (INIS)

    Stansfield, S.A.

    1993-01-01

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment: The 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system

  12. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
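
    The advantage of a hyperedge over the binary edge of a conventional graph or relational schema is that one relationship can bind an arbitrary number of entities at once. The minimal Python sketch below illustrates the data model only; the node names, labels and query method are hypothetical and are not the framework's actual API:

      # A minimal hypergraph store: one hyperedge can relate any number of
      # nodes at once. All names and records here are hypothetical.
      from collections import defaultdict

      class Hypergraph:
          def __init__(self):
              self.edges = {}                        # edge id -> (label, node set)
              self.incidence = defaultdict(set)      # node -> edge ids

          def add_edge(self, edge_id, label, nodes):
              self.edges[edge_id] = (label, frozenset(nodes))
              for n in nodes:
                  self.incidence[n].add(edge_id)

          def neighbors(self, node, label=None):
              """All nodes co-occurring with `node` in (optionally labeled) edges."""
              out = set()
              for eid in self.incidence[node]:
                  lab, nodes = self.edges[eid]
                  if label is None or lab == label:
                      out |= nodes - {node}
              return out

      hg = Hypergraph()
      hg.add_edge("e1", "clinical_trial", {"patient_42", "drugX", "mutation_BRAF"})
      hg.add_edge("e2", "pathway", {"drugX", "gene_EGFR", "gene_KRAS"})
      print(hg.neighbors("drugX"))                   # everything linked to drugX
      print(hg.neighbors("drugX", label="pathway"))  # {'gene_EGFR', 'gene_KRAS'}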

  13. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. The present paper gives a brief review of the functions that these systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchical systems using redundant control computers becomes clear when one considers the control systems applied in the Canadian nuclear power plants, which were among the first to be equipped with process computers. The control system now under development for the large Soviet WWER-type reactors will also be based on the use of control computers. That part of the system concerned with controlling the reactor assembly is described in detail.

  14. Integration of process computer systems to Cofrentes NPP

    International Nuclear Information System (INIS)

    Saettone Justo, A.; Pindado Andres, R.; Buedo Jimenez, J.L.; Jimenez Fernandez-Sesma, A.; Delgado Muelas, J.A.

    1997-01-01

    The existence of three different process computer systems at Cofrentes NPP, and the ageing of two of them, have led to the need to integrate them into a single real-time computer system, known as the Integrated ERIS-Computer System (SIEC), which covers the functionality of the three systems: the Process Computer (PC), the Emergency Response Information System (ERIS) and the Nuclear Calculation Computer (OCN). The paper describes the integration project, which essentially consisted of merging the PC, ERIS and OCN databases into a single database, migrating programs from the old process computer to the new SIEC hardware/software platform, and installing a communications program to transmit all data needed by the OCN programs from the SIEC computer, which in the new configuration is responsible for managing the databases of the whole system. (Author)

  15. A chimera grid scheme. [multiple overset body-conforming mesh system for finite difference adaptation to complex aircraft configurations

    Science.gov (United States)

    Steger, J. L.; Dougherty, F. C.; Benek, J. A.

    1983-01-01

    A mesh system composed of multiple overset body-conforming grids is described for adapting finite-difference procedures to complex aircraft configurations. In this so-called 'chimera mesh,' a major grid is generated about a main component of the configuration and overset minor grids are used to resolve all other features. Methods for connecting overset multiple grids and modifications of flow-simulation algorithms are discussed. Computational tests in two dimensions indicate that the use of multiple overset grids can simplify the task of grid generation without an adverse effect on flow-field algorithms and computer code complexity.
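
    The elementary connectivity operation in such a scheme is interpolating flow values from a donor cell of one grid to a fringe point of an overlapping grid. A hedged sketch of that single step (a uniform Cartesian donor grid with bilinear interpolation; real chimera codes add donor search on curvilinear grids and hole cutting):

      # Bilinear interpolation of a field from a uniform "major" grid to a
      # fringe point of an overset "minor" grid: the elementary connectivity
      # operation in a chimera scheme. Donor grid assumed uniform Cartesian.
      import numpy as np

      def donor_interpolate(field, x0, y0, dx, dy, xp, yp):
          """Interpolate field[i, j] (value at x0+i*dx, y0+j*dy) to (xp, yp)."""
          fi, fj = (xp - x0) / dx, (yp - y0) / dy
          i, j = int(fi), int(fj)                # lower-left donor cell corner
          u, v = fi - i, fj - j                  # local coordinates in [0, 1)
          return ((1 - u) * (1 - v) * field[i, j] + u * (1 - v) * field[i + 1, j]
                  + (1 - u) * v * field[i, j + 1] + u * v * field[i + 1, j + 1])

      # Donor grid sampling f(x, y) = x + 2y; the fringe point lands mid-cell.
      x = np.linspace(0.0, 1.0, 11)
      f = x[:, None] + 2.0 * x[None, :]
      print(donor_interpolate(f, 0.0, 0.0, 0.1, 0.1, 0.37, 0.52))  # 0.37 + 2*0.52 = 1.41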

  16. Gaussian random-matrix process and universal parametric correlations in complex systems

    International Nuclear Information System (INIS)

    Attias, H.; Alhassid, Y.

    1995-01-01

    We introduce the framework of the Gaussian random-matrix process as an extension of Dyson's Gaussian ensembles and use it to discuss the statistical properties of complex quantum systems that depend on an external parameter. We classify the Gaussian processes according to the short-distance diffusive behavior of their energy levels and demonstrate that all parametric correlation functions become universal upon the appropriate scaling of the parameter. The class of differentiable Gaussian processes is identified as the relevant one for most physical systems. We reproduce the known spectral correlators and compute eigenfunction correlators in their universal form. Numerical evidence from both a chaotic model and a weakly disordered model confirms our predictions.
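
    The static starting point of this construction is Dyson's ensemble at a single parameter value. As a hedged numerical illustration (one matrix, a crude unfolding by rescaling bulk spacings to unit mean, ensemble averaging omitted), the sketch below samples a GOE matrix and compares its nearest-neighbour spacing histogram with the Wigner surmise p(s) = (pi/2) s exp(-pi s^2/4):

      # Sample one GOE matrix and compare its (crudely unfolded) level-spacing
      # histogram with the Wigner surmise; illustrative, not a careful study.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 400
      A = rng.normal(size=(N, N))
      H = (A + A.T) / 2.0                    # real symmetric: GOE up to normalization

      levels = np.linalg.eigvalsh(H)
      bulk = levels[N // 4: 3 * N // 4]      # stay in the bulk of the spectrum
      s = np.diff(bulk)
      s /= s.mean()                          # crude unfolding to unit mean spacing

      hist, bins = np.histogram(s, bins=20, range=(0, 3), density=True)
      mid = (bins[:-1] + bins[1:]) / 2
      wigner = (np.pi / 2) * mid * np.exp(-np.pi * mid**2 / 4)
      for m, h, w in zip(mid, hist, wigner):
          print(f"s={m:.2f}  empirical={h:.2f}  surmise={w:.2f}")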

  17. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  18. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To fully utilize digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. The current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as an integral part of the solving system to process large amounts of data, to implement a control law and even to produce decisions. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and a big computer are used; why economic process-computers will be allocated to nuclear plants in the future; and why the super-computer should be demonstrated at once. (Mori, K.)

  19. Integrated Computer System of Management in Logistics

    Science.gov (United States)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper presents a concept of an integrated computer system for management in logistics, particularly in supply and distribution chains. The paper covers the basic idea of computer-based management in logistics and the components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates these computer-aided management systems is electronic data interchange.

  20. Complex systems fractionality, time-delay and synchronization

    CERN Document Server

    Sun, Jian-Qiao

    2012-01-01

    "Complex Systems: Fractionality, Time-delay and Synchronization" covers the most recent developments and advances in the theory and application of complex systems in these areas. Each chapter was written by scientists highly active in the field of complex systems. The book discusses a new treatise on fractional dynamics and control, as well as the new methods for differential delay systems and control. Lastly, a theoretical framework for the complexity and synchronization of complex system is presented. The book is intended for researchers in the field of nonlinear dynamics in mathematics, physics and engineering. It can also serve as a reference book for graduate students in physics, applied mathematics and engineering. Dr. Albert C.J. Luo is a Professor at Southern Illinois University Edwardsville, USA. Dr. Jian-Qiao Sun is a Professor at the University of California, Merced, USA.