WorldWideScience

Sample records for analysis of Cu(II) 8-hydroxyquinoline complex

  1. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

The guiding principle of this presentation of "Classical Complex Analysis" is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and, as a particular highlight, the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  2. Invitation to complex analysis

    CERN Document Server

    Boas, Ralph P

    2010-01-01

    Ideal for a first course in complex analysis, this book can be used either as a classroom text or for independent study. Written at a level accessible to advanced undergraduates and beginning graduate students, the book is suitable for readers acquainted with advanced calculus or introductory real analysis. The treatment goes beyond the standard material of power series, Cauchy's theorem, residues, conformal mapping, and harmonic functions by including accessible discussions of intriguing topics that are uncommon in a book at this level. The flexibility afforded by the supplementary topics and applications makes the book adaptable either to a short, one-term course or to a comprehensive, full-year course. Detailed solutions of the exercises both serve as models for students and facilitate independent study. Supplementary exercises, not solved in the book, provide an additional teaching tool. This second edition has been painstakingly revised by the author's son, himself an award-winning mathematical expositor...

  3. Complex analysis and geometry

    CERN Document Server

    Silva, Alessandro

    1993-01-01

    The papers in this wide-ranging collection report on the results of investigations from a number of linked disciplines, including complex algebraic geometry, complex analytic geometry of manifolds and spaces, and complex differential geometry.

  4. Real and complex analysis

    CERN Document Server

    Apelian, Christopher; Taft, Earl; Nashed, Zuhair

    2009-01-01

The Spaces R, Rk, and C: The Real Numbers R; The Real Spaces Rk; The Complex Numbers C. Point-Set Topology: Bounded Sets; Classification of Points; Open and Closed Sets; Nested Intervals and the Bolzano-Weierstrass Theorem; Compactness and Connectedness. Limits and Convergence: Definitions and First Properties; Convergence Results for Sequences; Topological Results for Sequences; Properties of Infinite Series; Manipulations of Series in R. Functions: Definitions and Limits: Definitions; Functions as Mappings; Some Elementary Complex Functions; Limits of Functions. Functions: Continuity and Convergence: Continuity; Uniform Continuity; Sequences and Series of Functions. The Derivative: The Derivative for f: D1 → R; The Derivative for f: Dk → R; The Derivative for f: Dk → Rp; The Derivative for f: D → C; The Inverse and Implicit Function Theorems. Real Integration: The Integral of f: [a, b] → R; Properties of the Riemann Integral; Further Development of Integration Theory; Vector-Valued and Line Integrals. Complex Integration: Introduction to Complex Integrals; Fu...

  5. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...
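The time-bound idea in this record can be illustrated with a small sketch (hypothetical code, not taken from the system the abstract describes): instrument a recursive function to count evaluation steps, then check the count against a closed-form bound obtained by solving the step-count recurrence.

```python
def fib(n, counter):
    """Naive Fibonacci; counter[0] tallies one step per call."""
    counter[0] += 1
    if n < 2:
        return n
    return fib(n - 1, counter) + fib(n - 2, counter)

def fib_value(n):
    """Iterative Fibonacci F(n) with F(1) = F(2) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def derived_bound(n):
    # Solving the recurrence C(n) = 1 + C(n-1) + C(n-2), C(0) = C(1) = 1,
    # gives the exact step count C(n) = 2*F(n+1) - 1.
    return 2 * fib_value(n + 1) - 1

for n in range(10):
    counter = [0]
    fib(n, counter)
    assert counter[0] == derived_bound(n)
print("step counts match the derived bound")
```

An automatic analyser would derive the recurrence (and, where possible, its closed form) symbolically from the program text rather than by running it; the sketch only illustrates what such a time bound function asserts.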

  6. Communication Analysis of Information Complexes.

    Science.gov (United States)

    Malik, M. F.

    Communication analysis is a tool for perceptual assessment of existing or projected information complexes, i.e., an established reality perceived by one or many humans. An information complex could be of a physical nature, such as a building, landscape, city street; or of a pure informational nature, such as a film, television program,…

  7. From real to complex analysis

    CERN Document Server

    Dyer, R H

    2014-01-01

    The purpose of this book is to provide an integrated course in real and complex analysis for those who have already taken a preliminary course in real analysis. It particularly emphasises the interplay between analysis and topology. Beginning with the theory of the Riemann integral (and its improper extension) on the real line, the fundamentals of metric spaces are then developed, with special attention being paid to connectedness, simple connectedness and various forms of homotopy. The final chapter develops the theory of complex analysis, in which emphasis is placed on the argument, the winding number, and a general (homology) version of Cauchy's theorem which is proved using the approach due to Dixon. Special features are the inclusion of proofs of Montel's theorem, the Riemann mapping theorem and the Jordan curve theorem that arise naturally from the earlier development. Extensive exercises are included in each of the chapters, detailed solutions of the majority of which are given at the end. From Real to...

  8. Complex Wavelet Based Modulation Analysis

    DEFF Research Database (Denmark)

    Luneau, Jean-Marc; Lebrun, Jérôme; Jensen, Søren Holdt

    2008-01-01

Low-frequency modulation of sound carries important information for speech and music. The modulation spectrum is commonly obtained by spectral analysis of the sole temporal envelopes of the sub-bands out of a time-frequency analysis. Processing in this domain usually creates undesirable distortions...... polynomial trends. Moreover an analytic Hilbert-like transform is possible with complex wavelets implemented as an orthogonal filter bank. By working in an alternative transform domain coined as “Modulation Subbands”, this transform shows very promising denoising capabilities and suggests new approaches for joint...

  9. Music analysis and Kolmogorov complexity

    DEFF Research Database (Denmark)

    Meredith, David

    The goal of music analysis is to find the most satisfying explanations for musical works. It is proposed that this can best be achieved by attempting to write computer programs that are as short as possible and that generate representations that are as detailed as possible of the music...... that the program represents. If such an effective measure of analysis quality can be found, it could be used in a system that automatically finds the optimal analysis for any passage of music. Measuring program length in terms of number of source-code characters is shown to be problematic and an expression...... is proposed that overcomes some but not all of these problems. It is suggested that the solutions to the remaining problems may lie either in the field of concrete Kolmogorov complexity or in the design of languages specialized for expressing musical structure....
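Since Kolmogorov complexity itself is uncomputable, a common computable stand-in for "shortest program length" is compressed length; the sketch below (a generic illustration of that idea, not the author's proposal) shows how a highly structured string admits a much shorter description than an irregular one of the same length.

```python
import random
import string
import zlib

def compression_length(s: str) -> int:
    """Length in bytes of the zlib-compressed encoding of s: a crude but
    computable stand-in for the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# A highly repetitive string (strong internal structure, like a motif
# repeated through a piece) compresses far better than pseudo-random data.
repetitive = "abcd" * 250                                    # 1000 characters
rng = random.Random(0)
irregular = "".join(rng.choice(string.ascii_letters) for _ in range(1000))

assert compression_length(repetitive) < compression_length(irregular)
```

Measuring description length this way inherits the abstract's problem: the number depends on the chosen encoder (here zlib), which is one reason the record points toward concrete Kolmogorov complexity and specialized languages for musical structure.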

  10. Structural Analysis of Complex Networks

    CERN Document Server

    Dehmer, Matthias

    2011-01-01

    Filling a gap in literature, this self-contained book presents theoretical and application-oriented results that allow for a structural exploration of complex networks. The work focuses not only on classical graph-theoretic methods, but also demonstrates the usefulness of structural graph theory as a tool for solving interdisciplinary problems. Applications to biology, chemistry, linguistics, and data analysis are emphasized. The book is suitable for a broad, interdisciplinary readership of researchers, practitioners, and graduate students in discrete mathematics, statistics, computer science,

  11. Elementary real and complex analysis

    CERN Document Server

    Shilov, Georgi E

    1996-01-01

In this book the renowned Russian mathematician Georgi E. Shilov brings his unique perspective to real and complex analysis, an area of perennial interest in mathematics. Although there are many books available on the topic, the present work is specially designed for undergraduates in mathematics, science and engineering. A high level of mathematical sophistication is not required. The book begins with a systematic study of real numbers, understood to be a set of objects satisfying certain definite axioms. The concepts of a mathematical structure and an isomorphism are introduced in Chapter 2,

  12. Multifractal analysis of complex networks

    International Nuclear Information System (INIS)

    Wang Dan-Ling; Yu Zu-Guo; Anh V

    2012-01-01

Complex networks have recently attracted much attention in diverse areas of science and technology. Many networks such as the WWW and biological networks are known to display spatial heterogeneity which can be characterized by their fractal dimensions. Multifractal analysis is a useful way to systematically describe the spatial heterogeneity of both theoretical and experimental fractal patterns. In this paper, we introduce a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is used to calculate the generalized fractal dimensions D_q of some theoretical networks, namely scale-free networks, small world networks, and random networks, and one kind of real network, namely protein-protein interaction networks of different species. Our numerical results indicate the existence of multifractality in scale-free networks and protein-protein interaction networks, while the multifractal behavior is not clear-cut for small world networks and random networks. The possible variation of D_q due to changes in the parameters of the theoretical network models is also discussed. (general)
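The box-covering step behind such generalized dimensions can be sketched in a few lines. The code below is a minimal illustration (a deterministic greedy covering by shortest-path balls, not the authors' algorithm): it covers a network with boxes of a given radius and forms the partition sum whose scaling with box size yields the D_q.

```python
from collections import deque

def bfs_ball(adj, center, radius):
    """Nodes within the given shortest-path distance of center."""
    dist = {center: 0}
    queue = deque([center])
    while queue:
        u = queue.popleft()
        if dist[u] == radius:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return set(dist)

def greedy_box_cover(adj, radius):
    """Cover all nodes with balls of the given radius (greedy, in sorted
    node order for determinism); returns the list of boxes (node sets)."""
    uncovered = set(adj)
    boxes = []
    for center in sorted(adj):
        if center not in uncovered:
            continue
        box = bfs_ball(adj, center, radius) & uncovered
        uncovered -= box
        boxes.append(box)
    return boxes

def partition_sum(boxes, n_nodes, q):
    """Z(q) = sum over boxes of (box mass fraction)^q; its scaling with
    box size is what defines the generalized dimensions D_q."""
    return sum((len(box) / n_nodes) ** q for box in boxes)

# On a star graph, a single radius-1 box centered on the hub covers all nodes.
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
print(len(greedy_box_cover(star, 1)))   # 1
```

Estimating D_q in practice means fitting log Z(q) against log of the box size over a range of radii; the sketch stops at the covering and partition sum.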

  13. Complex analysis and CR geometry

    CERN Document Server

    Zampieri, Giuseppe

    2008-01-01

Cauchy-Riemann (CR) geometry is the study of manifolds equipped with a system of CR-type equations. Compared to the early days when the purpose of CR geometry was to supply tools for the analysis of the existence and regularity of solutions to the \bar\partial-Neumann problem, it has rapidly acquired a life of its own and has become an important topic in differential geometry and the study of non-linear partial differential equations. A full understanding of modern CR geometry requires knowledge of various topics such as real/complex differential and symplectic geometry, foliation theory, the geometric theory of PDEs, and microlocal analysis. Nowadays, the subject of CR geometry is very rich in results, and the amount of material required to reach competence is daunting to graduate students who wish to learn it. However, the present book does not aim at introducing all the topics of current interest in CR geometry. Instead, an attempt is made to be friendly to the novice by moving, in a fairly relaxed way, f...

  14. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, thereby eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every desired shape change. The program can perform a design change in a markedly reduced amount of time, compared with the traditional process in which the designer returns to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  15. Complex ray analysis for plasmas

    International Nuclear Information System (INIS)

    Connor, K.A.

    1980-01-01

An extension of ray tracing techniques is considered for a variety of cases in which the dispersion relation of the plasma medium is complex. The ray trajectories are permitted to begin in and/or travel through complex space-time; the wave propagation process so characterized becomes significant only where the rays intersect real space-time. It is found that rules and guidelines can be established for limited application of this idea.

  16. Analysis of nanoparticle biomolecule complexes.

    Science.gov (United States)

    Gunnarsson, Stefán B; Bernfur, Katja; Mikkelsen, Anders; Cedervall, Tommy

    2018-03-01

Nanoparticles exposed to biological fluids adsorb biomolecules on their surface forming a biomolecular corona. This corona determines, on a molecular level, the interactions and impact the newly formed complex has on cells and organisms. The corona formation as well as the physiological and toxicological relevance are commonly investigated. However, an acknowledged but rarely addressed problem in many fields of nanobiotechnology is aggregation and broadened size distribution of nanoparticles following their interactions with the molecules of biological fluids. In blood serum, TiO2 nanoparticles form complexes with a size distribution from 30 nm to more than 500 nm. In this study we have separated these complexes, with good resolution, using preparative centrifugation in a sucrose gradient. Two main apparent size populations were obtained, a fast sedimenting population of complexes that formed a pellet in the preparative centrifugation tube, and a slow sedimenting complex population still suspended in the gradient after centrifugation. Concentration and surface area dependent differences are found in the biomolecular corona between the slow and fast sedimenting fractions. There are more immunoglobulins, lipid binding proteins, and lipid-rich complexes at higher serum concentrations. Sedimentation rate and the biomolecular corona are important factors for evaluating any experiment including nanoparticle exposure. Our results show that traditional description of nanoparticles in biological fluids is an oversimplification and that more thorough characterisations are needed.
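The strong dependence of sedimentation rate on complex size can be illustrated with Stokes' law for a small sphere settling in a viscous fluid. The values below are illustrative only (TiO2 bulk density, water at room temperature), not parameters from the study, and real serum complexes are neither spherical nor of uniform density.

```python
def stokes_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Terminal sedimentation velocity (m/s) of a sphere of diameter d (m)
    and density rho_p (kg/m^3) in a fluid of density rho_f (kg/m^3) and
    dynamic viscosity mu (Pa*s): v = d^2 (rho_p - rho_f) g / (18 mu)."""
    return d ** 2 * (rho_p - rho_f) * g / (18 * mu)

# Illustrative values: TiO2 density ~4230 kg/m^3, water at ~20 C.
v_30  = stokes_velocity(30e-9,  4230, 998, 1.0e-3)
v_500 = stokes_velocity(500e-9, 4230, 998, 1.0e-3)

# Velocity scales with d^2, so a ~17x size ratio gives a ~278x rate ratio,
# which is why a 30-500 nm distribution separates so strongly by size.
assert abs(v_500 / v_30 - (500 / 30) ** 2) < 1e-9
```

In a centrifuge, g is replaced by the much larger centripetal acceleration, but the quadratic size dependence, and hence the separation into fast and slow sedimenting fractions, is the same.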

  17. An introduction to complex analysis and geometry

    CERN Document Server

    D'Angelo, John P

    2010-01-01

    An Introduction to Complex Analysis and Geometry provides the reader with a deep appreciation of complex analysis and how this subject fits into mathematics. The book developed from courses given in the Campus Honors Program at the University of Illinois Urbana-Champaign. These courses aimed to share with students the way many mathematics and physics problems magically simplify when viewed from the perspective of complex analysis. The book begins at an elementary level but also contains advanced material. The first four chapters provide an introduction to complex analysis with many elementary

  18. Complexity in White Noise Analysis

    Science.gov (United States)

    Hida, Takeyuki

We restrict our attention to random complex systems and discuss their degree of complexity based on white noise. The white noise is realized as the time derivative of a Brownian motion B(t), denoted by Ḃ(t). The collection {Ḃ(t)} is a system of idealized elementary variables and at the same time a stochastic representation of the time t; in other words, it is time-oriented. Having expressed the given evolutional random phenomena in question in terms of the Ḃ(t), we introduce the notion of spectral multiplicity, which describes how complex the phenomena are. The multiplicity is the number of cyclic subspaces that are spanned by the given random phenomena. Each cyclic subspace has further structure. A typical property is the multiple Markov property, although this property appears only in particular cases. As a related property, in fact as a characteristic of a complex system, one can speak of the time reversibility and irreversibility of certain random phenomena in terms of the white noise. We expect that an irreversible random complex system may be decomposed into reversible systems.

  19. Harmonic and complex analysis in several variables

    CERN Document Server

    Krantz, Steven G

    2017-01-01

Authored by a ranking authority in harmonic analysis of several complex variables, this book embodies a state-of-the-art entrée at the intersection of two important fields of research: complex analysis and harmonic analysis. Written with the graduate student in mind, it is assumed that the reader has familiarity with the basics of complex analysis of one and several complex variables as well as with real and functional analysis. The monograph is largely self-contained and develops the harmonic analysis of several complex variables from first principles. The text includes copious examples, explanations, an exhaustive bibliography for further reading, and figures that illustrate the geometric nature of the subject. Each chapter ends with an exercise set. Additionally, each chapter begins with a prologue, introducing the reader to the subject matter that follows; capsules presented in each section give perspective and a spirited launch to the segment; preludes help put ideas into context. Mathematicians and...

  20. A complex analysis problem book

    CERN Document Server

    Alpay, Daniel

    2016-01-01

This second edition presents a collection of exercises on the theory of analytic functions, including completed and detailed solutions. It introduces students to various applications and aspects of the theory of analytic functions not always touched on in a first course, while also addressing topics of interest to electrical engineering students (e.g., the realization of rational functions and its connections to the theory of linear systems and state space representations of such systems). It provides examples of important Hilbert spaces of analytic functions (in particular the Hardy space and the Fock space), and also includes a section reviewing essential aspects of topology, functional analysis and Lebesgue integration. Benefits of the 2nd edition: rational functions are now covered in a separate chapter. Further, the section on conformal mappings has been expanded.

  1. Revitalizing Complex Analysis: A Transition-to-Proof Course Centered on Complex Topics

    Science.gov (United States)

    Sachs, Robert

    2017-01-01

    A new transition course centered on complex topics would help in revitalizing complex analysis in two ways: first, provide early exposure to complex functions, sparking greater interest in the complex analysis course; second, create extra time in the complex analysis course by eliminating the "complex precalculus" part of the course. In…

  2. Visualization and Analysis of Complex Covert Networks

    DEFF Research Database (Denmark)

    Memon, Bisharat

This report discusses and summarizes the results of my work so far in relation to my Ph.D. project entitled "Visualization and Analysis of Complex Covert Networks". The focus of my research is primarily on development of methods and supporting tools for visualization and analysis of networked systems that are covert and hence inherently complex. My Ph.D. is positioned within the wider framework of the CrimeFighter project. The framework envisions a number of key knowledge management processes that are involved in the workflow, and the toolbox provides supporting tools to assist human end-users (intelligence analysts) in harvesting, filtering, storing, managing, structuring, mining, analyzing, interpreting, and visualizing data about offensive networks. The methods and tools proposed and discussed in this work can also be applied to analysis of more generic complex networks.

  3. Complex harmonic modal analysis of rotor systems

    International Nuclear Information System (INIS)

    Han, Dong Ju

    2015-01-01

Complex harmonic analysis for rotor systems is proposed, developed from strict complex modal analysis based upon Floquet theory. In this process the harmonic balance method is adopted, effectively associated with conventional eigenvalue analysis. Also, the harmonic coefficients equivalent to dFRFs in harmonic modes have been derived in practice. The modes are classified by identifying the modal characteristics, and the adoption of the harmonic balance method has been proven by comparing the results of the stability analyses from Floquet theory and the eigen analysis. The modal features of each critical speed are depicted quantitatively and qualitatively by showing that the strengths of each component of the harmonic coefficients are estimated from an order-of-magnitude analysis according to their harmonic patterns. This effectiveness has been verified by comparing with the numerical solutions.

  4. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science; this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  5. An equipment complex for instrumental elementary analysis

    International Nuclear Information System (INIS)

    Borisov, G.I.; Komkov, M.M.; Kuz'michev, V.A.; Leonov, V.F.; Tarasov, Y.F.

    1986-01-01

The competitiveness of elementary analysis on a research reactor depends, in many respects, on the provision of equipment which would permit the realization of the advantages of the analytical method. This paper describes the IR-8 reactor at the I.V. Kurchatov Institute of Atomic Energy. In the design of the complex, considerable attention was focused on automation and the radiation safety of the operations, which is of particular importance in the light of the large volume of analyses.

  6. Integrative Genomic Analysis of Complex traits

    DEFF Research Database (Denmark)

    Ehsani, Ali Reza

In the last decade rapid development in biotechnologies has made it possible to extract extensive information about practically all levels of biological organization. An ever-increasing number of studies are reporting multilayered datasets on the entire DNA sequence, transcription, protein...... expression, and metabolite abundance of more and more populations in a multitude of environments. However, a solid model for including all of this complex information in one analysis, to disentangle genetic variation and the underlying genetic architecture of complex traits and diseases, has not yet been...

  7. Intentional risk management through complex networks analysis

    CERN Document Server

    Chapela, Victor; Moral, Santiago; Romance, Miguel

    2015-01-01

This book combines game theory and complex networks to examine intentional technological risk through modeling. As information security risks are in constant evolution, the methodologies and tools to manage them must evolve to an ever-changing environment. A formal global methodology is explained in this book, which is able to analyze risks in cyber security based on complex network models and ideas extracted from the Nash equilibrium. A risk management methodology for IT critical infrastructures is introduced which provides guidance and analysis on decision making models and real situations. This model manages the risk of succumbing to a digital attack and assesses an attack from the following three variables: income obtained, expense needed to carry out an attack, and the potential consequences for an attack. Graduate students and researchers interested in cyber security, complex network applications and intentional risk will find this book useful as it is filled with a number of models, methodologies a...

  8. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  9. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving -- we call these aggressive abstractions -- and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1.) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2.) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex networks analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.
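The flavor of finite state abstraction can be shown with a toy example (an illustration of the general idea only, not the paper's construction): a chaotic scalar dynamic with an uncountable state space is collapsed onto two symbols by thresholding, and the symbol-to-symbol transitions that actually occur form a small finite state system.

```python
def logistic_step(x, r=3.7):
    """One step of a chaotic scalar dynamic on [0, 1] (uncountable states)."""
    return r * x * (1 - x)

def abstract_state(x):
    """Finite-state abstraction: collapse the continuum onto two symbols
    by thresholding at 1/2."""
    return "L" if x < 0.5 else "H"

def observed_transitions(x0, steps=2000):
    """Empirically record which symbol-to-symbol transitions occur along
    an orbit; the result is the edge set of a finite abstract system."""
    seen = set()
    x = x0
    for _ in range(steps):
        nxt = logistic_step(x)
        seen.add((abstract_state(x), abstract_state(nxt)))
        x = nxt
    return seen

trans = observed_transitions(0.123)
print(sorted(trans))
```

A property-preserving abstraction in the paper's sense must do more than record observed transitions: it has to guarantee that conclusions drawn on the finite system (e.g. reachability) remain valid for the original dynamics, which is what the rigorous construction provides.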

  10. Symmetry analysis in parametrisation of complex systems

    International Nuclear Information System (INIS)

    Sikora, W; Malinowski, J

    2010-01-01

The symmetry analysis method based on the theory of group representations is used in this work to describe complex systems and their behavior. A first trial of using symmetry analysis in modeling the behavior of a complex social system is presented. Scenarios for the evacuation of large buildings are discussed as transitions from chaotic to ordered states, described as movements of individuals according to displacement fields calculated for the given scenario. The symmetry of the evacuation space is taken into account in the calculation of the displacement field: the displacements related to every point of this space are presented in the coordinate frame best adapted to the given symmetry space group, which is the set of basis vectors of an irreducible representation of the given symmetry group. The results obtained using the symmetry considerations are compared with corresponding results calculated under the assumption of the shortest way to exits (Voronoi assumption).

  11. Symmetry analysis in parametrisation of complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Sikora, W; Malinowski, J, E-mail: sikora@novell.ftj.agh.edu.p [Faculty of Physics and Applied Computer Science, AGH - University of Science and Technology, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2010-03-01

The symmetry analysis method based on the theory of group representations is used in this work to describe complex systems and their behavior. A first trial of using symmetry analysis in modeling the behavior of a complex social system is presented. Scenarios for the evacuation of large buildings are discussed as transitions from chaotic to ordered states, described as movements of individuals according to displacement fields calculated for the given scenario. The symmetry of the evacuation space is taken into account in the calculation of the displacement field: the displacements related to every point of this space are presented in the coordinate frame best adapted to the given symmetry space group, which is the set of basis vectors of an irreducible representation of the given symmetry group. The results obtained using the symmetry considerations are compared with corresponding results calculated under the assumption of the shortest way to exits (Voronoi assumption).

  12. Complex Network Analysis of Guangzhou Metro

    OpenAIRE

    Yasir Tariq Mohmand; Fahad Mehmood; Fahd Amjad; Nedim Makarevic

    2015-01-01

The structure and properties of public transportation networks can provide suggestions for urban planning and public policies. This study contributes a complex network analysis of the Guangzhou metro. The metro network has 236 kilometers of track and is the 6th busiest metro system in the world. In this paper, the topological properties of the network are explored. We observed that the network displays small world properties and is assortative in nature. The network possesses a high average degree...
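The assortativity reported here is typically measured as Newman's degree assortativity: the Pearson correlation of the degrees at the two ends of each edge. A minimal self-contained sketch (generic formula on a toy graph, not the study's data):

```python
import math

def degree_assortativity(edges):
    """Pearson correlation of the degrees at the two endpoints of each
    edge (Newman's degree assortativity), each edge counted both ways."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

# A star is maximally disassortative: the hub connects only to leaves.
star = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(degree_assortativity(star))   # -1.0
```

A positive coefficient, as observed for the metro, means high-degree stations tend to link to other high-degree stations; the formula is undefined for regular graphs, where the degree variance is zero.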

  13. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

The present application of optimum design appears to be restricted to components of the structure rather than to the total structural system. Since design normally involves many analyses of the system, any improvement in the efficiency of the basic methods of analysis will allow more complicated systems to be designed by optimum methods. The evaluation of the risk and reliability of a structural system can be extremely important. Reliability studies have been made of many non-structural systems for which the individual components have been extensively tested and the service environment is known. For such systems the reliability studies are valid. For most structural systems, however, the properties of the components can only be estimated and statistical data associated with the potential loads is often minimal. Also, a potentially critical loading condition may be completely neglected in the study. For these reasons, and the previous problems associated with the reliability of both linear and nonlinear analysis computer programs, it appears to be premature to place a significant value on such studies for complex structures. With these comments as background, the purpose of this paper is to discuss the following: the relationship of analysis to design; new methods of analysis; new or improved finite elements; the effect of minicomputers on structural analysis methods; the use of systems of microprocessors for nonlinear structural analysis; and the role of interacting graphics systems in future analysis and design. This discussion will focus on the impact of new, inexpensive computer hardware on design and analysis methods.

  14. Complex surveys analysis of categorical data

    CERN Document Server

    Mukhopadhyay, Parimal

    2016-01-01

    The primary objective of this book is to study some of the research topics in the area of analysis of complex surveys which have not been covered in any book yet. It discusses the analysis of categorical data using three models: a full model, a log-linear model and a logistic regression model. It is a valuable resource for survey statisticians and practitioners in the fields of sociology, biology, economics, psychology and other areas who have to use these procedures in their day-to-day work. It is also useful for courses on sampling and complex surveys at the upper-undergraduate and graduate levels. The importance of sample surveys today cannot be overstated. From voters’ behaviour to fields such as industry, agriculture, economics, sociology and psychology, investigators generally resort to survey sampling to obtain an assessment of the behaviour of the population they are interested in. Many large-scale sample surveys collect data using complex survey designs like multistage stratified cluster designs. The o...

  15. Mathematical analysis of complex cellular activity

    CERN Document Server

    Bertram, Richard; Teka, Wondimu; Vo, Theodore; Wechselberger, Martin; Kirk, Vivien; Sneyd, James

    2015-01-01

    This book contains two review articles on mathematical physiology that deal with closely related topics but were written and can be read independently. The first article reviews the basic theory of calcium oscillations (common to almost all cell types), including spatio-temporal behaviors such as waves. The second article uses, and expands on, much of this basic theory to show how the interaction of cytosolic calcium oscillators with membrane ion channels can result in highly complex patterns of electrical spiking. Through these examples one can see clearly how multiple oscillatory processes interact within a cell, and how mathematical methods can be used to understand such interactions better. The two reviews provide excellent examples of how mathematics and physiology can learn from each other, and work jointly towards a better understanding of complex cellular processes. Review 1: Richard Bertram, Joel Tabak, Wondimu Teka, Theodore Vo, Martin Wechselberger: Geometric Singular Perturbation Analysis of Burst...

  16. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  17. A Complexity Analysis of Functional Interpretations

    DEFF Research Database (Denmark)

    Hernest, Mircea-Dan; Kohlenbach, Ulrich

    2003-01-01

    We give a quantitative analysis of Gödel’s functional interpretation and its monotone variant. The two have been used for the extraction of programs and numerical bounds as well as for conservation results. They apply both to (semi-)intuitionistic as well as (combined with negative translation...... of the verifying proof. In all cases terms of size linear in the size of the proof at input can be extracted and the corresponding extraction algorithms have cubic worst-case time complexity. The verifying proofs have depth linear in the depth of the proof at input and the maximal size of a formula of this proof....

  18. Tandem mass spectrometry: analysis of complex mixtures

    International Nuclear Information System (INIS)

    Singleton, K.E.

    1985-01-01

    Applications of tandem mass spectrometry (MS/MS) to the analysis of complex mixtures result in increased specificity and selectivity by using a variety of reagent gases in both negative and positive ion modes. Natural isotopic abundance ratios were examined in both simple and complex mixtures using parent, daughter and neutral loss scans. MS/MS was also used to discover new compounds. Daughter scans were used to identify seven new alkaloids in a cactus species. Three of these alkaloids were novel compounds, and included the first simple, fully aromatic isoquinoline alkaloids reported in Cactaceae. MS/MS was used to characterize the chemical reaction products of coal in studies designed to probe its macromolecular structure. Negative ion chemical ionization was utilized to study reaction products resulting from the oxidation of coal. Possible structural units in the precursor coal were predicted based on the reaction products identified, aliphatic and aromatic acids and their anhydrides. The MS/MS method was also used to characterize reaction products resulting from coal liquefaction and/or extraction. These studies illustrate the types of problems for which MS/MS is useful. Emphasis has been placed on characterization of complex mixtures by selecting experimental parameters which enhance the information obtained. The value of using MS/MS in conjunction with other analytical techniques as well as chemical pretreatment is demonstrated.

  19. Central Waste Complex (CWC) Waste Analysis Plan

    International Nuclear Information System (INIS)

    ELLEFSON, M.D.

    2000-01-01

    The purpose of this waste analysis plan (WAP) is to document the waste acceptance process, sampling methodologies, analytical techniques, and overall processes that are undertaken for waste accepted for storage at the Central Waste Complex (CWC), which is located in the 200 West Area of the Hanford Facility, Richland, Washington. Because dangerous waste does not include the source, special nuclear, and by-product material components of mixed waste, radionuclides are not within the scope of this document. The information on radionuclides is provided only for general knowledge. This document has been revised to meet the interim status waste analysis plan requirements of Washington Administrative Code (WAC) 173-303-300(5). When the final status permit is issued, permit conditions will be incorporated and this document will be revised accordingly.

  20. Exergy Analysis of Complex Ship Energy Systems

    Directory of Open Access Journals (Sweden)

    Pierre Marty

    2016-04-01

    With multiple primary and secondary energy converters (diesel engines, steam turbines, waste heat recovery (WHR), oil-fired boilers, etc.) and extensive energy networks (steam, cooling water, exhaust gases, etc.), ships may be considered complex energy systems. Understanding and optimizing such systems requires advanced holistic energy modeling. This modeling can be done in two ways: the simpler approach focuses on energy flows, and has already been tested, approved and presented; a new, more complicated approach, focusing on energy quality, i.e., exergy, is presented in this paper. Exergy analysis has rarely been applied to ships, and, as a general rule, the shipping industry is not familiar with this tool. This paper tries to fill that gap. We start by giving a short reminder of what exergy is and describe the principles of exergy modeling to explain what kind of results should be expected from such an analysis. We then apply these principles to the analysis of a large two-stroke diesel engine with its cooling and exhaust systems. Simulation results are then presented along with the exergy analysis. Finally, we propose solutions for energy and exergy saving which could be applied to marine engines and ships in general.

  1. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  2. Analysis of complex systems using neural networks

    International Nuclear Information System (INIS)

    Uhrig, R.E.

    1992-01-01

    The application of neural networks, alone or in conjunction with other advanced technologies (expert systems, fuzzy logic, and/or genetic algorithms), to some of the problems of complex engineering systems has the potential to enhance the safety, reliability, and operability of these systems. Typically, the measured variables from the systems are analog variables that must be sampled and normalized to expected peak values before they are introduced into neural networks. Often data must be processed to put it into a form more acceptable to the neural network (e.g., a fast Fourier transformation of the time-series data to produce a spectral plot of the data). Specific applications described include: (1) Diagnostics: State of the Plant (2) Hybrid System for Transient Identification, (3) Sensor Validation, (4) Plant-Wide Monitoring, (5) Monitoring of Performance and Efficiency, and (6) Analysis of Vibrations. Although specific examples described deal with nuclear power plants or their subsystems, the techniques described can be applied to a wide variety of complex engineering systems

  3. Weighted Complex Network Analysis of Pakistan Highways

    Directory of Open Access Journals (Sweden)

    Yasir Tariq Mohmand

    2013-01-01

    The structure and properties of public transportation networks have great implications in urban planning, public policies, and infectious disease control. This study contributes a weighted complex network analysis of travel routes on the national highway network of Pakistan. The network is responsible for handling 75 percent of the road traffic yet is largely inadequate, poor, and unreliable. The highway network displays small world properties and is assortative in nature. Based on the betweenness centrality of the nodes, the most important cities are identified as this could help in identifying the potential congestion points in the network. Keeping in view the strategic location of Pakistan, such a study is of practical importance and could provide opportunities for policy makers to improve the performance of the highway network.

  4. Complex Network Analysis of Guangzhou Metro

    Directory of Open Access Journals (Sweden)

    Yasir Tariq Mohmand

    2015-11-01

    The structure and properties of public transportation networks can provide suggestions for urban planning and public policies. This study contributes a complex network analysis of the Guangzhou metro. The metro network has 236 kilometers of track and is the 6th busiest metro system in the world. In this paper, the topological properties of the network are explored. We observed that the network displays small-world properties and is assortative in nature. The network possesses a high average degree of 17.5 with a small diameter of 5. Furthermore, we also identified the most important metro stations based on betweenness and closeness centralities. These could help in identifying the probable congestion points in the metro system and provide policy makers with an opportunity to improve the performance of the metro system.
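
Metrics like the average degree and diameter quoted in the abstract follow directly from an edge list; the sketch below uses the standard definitions on a small hypothetical station graph (illustrative only, not the actual Guangzhou data):

```python
from collections import deque

# Hypothetical toy "metro" graph: nodes are stations, edges are direct
# track connections (illustrative only, not the real Guangzhou network).
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"), ("E", "F"), ("C", "F")]

adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Average degree of an undirected graph: 2|E| / |V|.
avg_degree = 2 * len(edges) / len(adj)

def bfs_dists(src):
    """Shortest-path lengths (in hops) from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

# Diameter: the largest shortest-path distance over all pairs of nodes.
diameter = max(max(bfs_dists(n).values()) for n in adj)

print(avg_degree, diameter)  # 2.0 3
```

The betweenness and closeness centralities used in the paper to rank stations can be built on the same per-node BFS distances.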

  5. On the complexity of numerical analysis

    DEFF Research Database (Denmark)

    Miltersen, Peter Bro; Allender, Eric; Burgisser, Peter

    2009-01-01

    We study two quite different approaches to understanding the complexity of fundamental problems in numerical analysis: • The Blum-Shub-Smale model of computation over the reals. • A problem we call the “Generic Task of Numerical Computation,” which captures an aspect of doing numerical computation... an integer N, decide whether N>0. • In the Blum-Shub-Smale model, polynomial time computation over the reals (on discrete inputs) is polynomial-time equivalent to PosSLP, when there are only algebraic constants. We conjecture that using transcendental constants provides no additional power, beyond nonuniform reductions to PosSLP, and we present some preliminary results supporting this conjecture. • The Generic Task of Numerical Computation is also polynomial-time equivalent to PosSLP. We prove that PosSLP lies in the counting hierarchy. Combining this with work of Tiwari, we obtain that the Euclidean Traveling...

  6. [Technologies for Complex Intelligent Clinical Data Analysis].

    Science.gov (United States)

    Baranov, A A; Namazova-Baranova, L S; Smirnov, I V; Devyatkin, D A; Shelmanov, A O; Vishneva, E A; Antonova, E V; Smirnov, V I

    2016-01-01

    The paper presents a system for intelligent analysis of clinical information. The authors describe methods implemented in the system for clinical information retrieval, intelligent diagnostics of chronic diseases, assessment of the importance of patient features, and detection of hidden dependencies between features. Results of the experimental evaluation of these methods are also presented. Healthcare facilities generate a large flow of both structured and unstructured data which contain important information about patients. Test results are usually retained as structured data, but some data are retained in the form of natural-language texts (medical history, the results of physical examination, and the results of other examinations, such as ultrasound, ECG or X-ray studies). Many tasks arising in clinical practice can be automated by applying methods for intelligent analysis of the accumulated structured and unstructured data, which leads to improvement of healthcare quality. The objective: creation of a complex system for intelligent data analysis in a multi-disciplinary pediatric center. The authors propose methods for information extraction from clinical texts in Russian. The methods are carried out on the basis of deep linguistic analysis. They retrieve terms for diseases, symptoms, areas of the body and drugs. The methods can recognize additional attributes such as "negation" (indicates that the disease is absent), "no patient" (indicates that the disease refers to the patient's family member, but not to the patient), "severity of illness", "disease course", and "body region to which the disease refers". The authors use a set of hand-drawn templates and various techniques based on machine learning to retrieve information using a medical thesaurus. The extracted information is used to solve the problem of automatic diagnosis of chronic diseases. A machine learning method for classification of patients with similar nosology and the method for determining the most informative patients' features are

  7. The resilience analysis grid: taming complexity?

    NARCIS (Netherlands)

    Gallis, R.

    2012-01-01

    In this workshop we will look at the next step in safety: using resilience engineering as a tool for dealing with complexity. Present concepts such as bow-ties use linear cause-and-effect relationships. More and more we see that companies are faced with a complex world of nonlinear systems. This

  8. Stability analysis of impulsive parabolic complex networks

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jinliang, E-mail: wangjinliang1984@yahoo.com.cn [Science and Technology on Aircraft Control Laboratory, School of Automation Science and Electrical Engineering, Beihang University, XueYuan Road, No. 37, HaiDian District, Beijing 100191 (China); Wu Huaining [Science and Technology on Aircraft Control Laboratory, School of Automation Science and Electrical Engineering, Beihang University, XueYuan Road, No. 37, HaiDian District, Beijing 100191 (China)

    2011-11-15

    Highlights: > Two impulsive parabolic complex network models are proposed. > The global exponential stability of impulsive parabolic complex networks is considered. > The robust global exponential stability of impulsive parabolic complex networks is considered. - Abstract: In the present paper, two kinds of impulsive parabolic complex networks (IPCNs) are considered. In the first one, all nodes have the same time-varying delay. In the second one, different nodes have different time-varying delays. Using the Lyapunov functional method combined with inequality techniques, some global exponential stability criteria are derived for the IPCNs. Furthermore, several robust global exponential stability conditions are proposed to take uncertainties in the parameters of the IPCNs into account. Finally, numerical simulations are presented to illustrate the effectiveness of the results obtained here.

  9. Stability analysis of impulsive parabolic complex networks

    International Nuclear Information System (INIS)

    Wang Jinliang; Wu Huaining

    2011-01-01

    Highlights: → Two impulsive parabolic complex network models are proposed. → The global exponential stability of impulsive parabolic complex networks is considered. → The robust global exponential stability of impulsive parabolic complex networks is considered. - Abstract: In the present paper, two kinds of impulsive parabolic complex networks (IPCNs) are considered. In the first one, all nodes have the same time-varying delay. In the second one, different nodes have different time-varying delays. Using the Lyapunov functional method combined with inequality techniques, some global exponential stability criteria are derived for the IPCNs. Furthermore, several robust global exponential stability conditions are proposed to take uncertainties in the parameters of the IPCNs into account. Finally, numerical simulations are presented to illustrate the effectiveness of the results obtained here.

  10. Advances in real and complex analysis with applications

    CERN Document Server

    Cho, Yeol; Agarwal, Praveen; Area, Iván

    2017-01-01

    This book discusses a variety of topics in mathematics and engineering as well as their applications, clearly explaining the mathematical concepts in the simplest possible way and illustrating them with a number of solved examples. The topics include real and complex analysis, special functions and analytic number theory, q-series, Ramanujan’s mathematics, fractional calculus, Clifford and harmonic analysis, graph theory, complex analysis, complex dynamical systems, complex function spaces and operator theory, geometric analysis of complex manifolds, geometric function theory, Riemannian surfaces, Teichmüller spaces and Kleinian groups, engineering applications of complex analytic methods, nonlinear analysis, inequality theory, potential theory, partial differential equations, numerical analysis, fixed-point theory, variational inequality, equilibrium problems, optimization problems, stability of functional equations, and mathematical physics. It includes papers presented at the 24th International Confe...

  11. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    This paper discusses the following: 1. The relationship of analysis to design. 2. New methods of analysis. 3. Improved finite elements. 4. Effect of minicomputer on structural analysis methods. 5. The use of system of microprocessors for nonlinear structural analysis. 6. The role of interacting graphics systems in future analysis and design. The discussion focusses on the impact of new inexpensive computer hardware on design and analysis methods. (Auth.)

  12. Complex analysis a modern first course in function theory

    CERN Document Server

    Muir, Jerry R

    2015-01-01

    A thorough introduction to the theory of complex functions emphasizing the beauty, power, and counterintuitive nature of the subject Written with a reader-friendly approach, Complex Analysis: A Modern First Course in Function Theory features a self-contained, concise development of the fundamental principles of complex analysis. After laying groundwork on complex numbers and the calculus and geometric mapping properties of functions of a complex variable, the author uses power series as a unifying theme to define and study the many rich and occasionally surprising properties of analytic fun

  13. Complex Visual Data Analysis, Uncertainty, and Representation

    National Research Council Canada - National Science Library

    Schunn, Christian D; Saner, Lelyn D; Kirschenbaum, Susan K; Trafton, J. G; Littleton, Eliza B

    2007-01-01

    ... (weather forecasting, submarine target motion analysis, and fMRI data analysis). Internal spatial representations are coded from spontaneous gestures made during cued-recall summaries of problem solving activities...

  14. Analysis and Design of Complex Network Environments

    Science.gov (United States)

    2012-03-01

    and J. Lowe, “The myths and facts behind cyber security risks for industrial control systems,” in the Proceedings of the VDE Kongress...questions about 1) how to model them, 2) the design of experiments necessary to discover their structure (and thus adapt system inputs to optimize the...theoretical work that clarifies fundamental limitations of complex networks with network engineering and systems biology to implement specific designs and

  15. Multivariate Complexity Analysis of Swap Bribery

    Science.gov (United States)

    Dorn, Britta; Schlotter, Ildikó

    We consider the computational complexity of a problem modeling bribery in the context of voting systems. In the scenario of Swap Bribery, each voter assigns a certain price for swapping the positions of two consecutive candidates in his preference ranking. The question is whether it is possible, without exceeding a given budget, to bribe the voters in a way that the preferred candidate wins in the election.
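
The decision problem described above can be made concrete with a brute-force sketch (hypothetical toy encoding; exponential in the number of voters and candidates, for illustration only): Dijkstra's algorithm gives each voter's cheapest sequence of adjacent swaps to any ranking, and we then search all combinations within the budget for one in which the preferred candidate is the unique plurality winner.

```python
import heapq
from itertools import product

def reachable(ranking, price):
    """Minimum bribery cost to turn `ranking` into every other ranking, where
    one adjacent swap of candidates a, b costs price[frozenset((a, b))]."""
    start = tuple(ranking)
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, r = heapq.heappop(heap)
        if d > dist[r]:
            continue  # stale heap entry
        for i in range(len(r) - 1):
            nr = list(r)
            nr[i], nr[i + 1] = nr[i + 1], nr[i]
            nr = tuple(nr)
            nd = d + price[frozenset((r[i], r[i + 1]))]
            if nd < dist.get(nr, float("inf")):
                dist[nr] = nd
                heapq.heappush(heap, (nd, nr))
    return dist

def swap_bribery(votes, prices, preferred, budget):
    """True iff some bribery of total cost <= budget makes `preferred` the
    unique plurality winner (brute force over all reachable rankings)."""
    options = [list(reachable(v, p).items()) for v, p in zip(votes, prices)]
    for combo in product(*options):
        if sum(cost for _, cost in combo) > budget:
            continue
        tally = {}
        for ranking, _ in combo:
            tally[ranking[0]] = tally.get(ranking[0], 0) + 1
        winners = [c for c, t in tally.items() if t == max(tally.values())]
        if winners == [preferred]:
            return True
    return False

# Three voters over candidates a, b, c; identical (hypothetical) swap prices.
p = {frozenset("ab"): 1, frozenset("ac"): 2, frozenset("bc"): 1}
votes = [("a", "b", "c"), ("b", "a", "c"), ("c", "b", "a")]
print(swap_bribery(votes, [p, p, p], "a", budget=1))  # True
```

Here a single unit-cost swap in the second vote promotes a to the top, breaking the three-way tie; with budget 0 no bribery is possible and the answer is False.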

  16. Complexity analysis in particulate matter measurements

    Directory of Open Access Journals (Sweden)

    Luciano Telesca

    2011-09-01

    We investigated the complex temporal fluctuations of particulate matter data recorded in the London area by using the Fisher-Shannon (FS) information plane. In the FS plane, the PM10 and PM2.5 data aggregate in two different clusters, characterized by different degrees of order and organization. These results could be related to different sources of the particulate matter.
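
The Fisher-Shannon plane places each signal at coordinates given by its Shannon entropy and Fisher information. A minimal sketch of binned estimators follows (a simplified discretization of my own, not necessarily the estimator used in the paper):

```python
import math
import random

# Synthetic stand-in for a particulate-matter time series (illustrative only).
random.seed(0)
series = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Histogram estimate of the signal's probability distribution.
bins = 50
lo, hi = min(series), max(series)
width = (hi - lo) / bins
counts = [0] * bins
for x in series:
    counts[min(int((x - lo) / width), bins - 1)] += 1
p = [c / len(series) for c in counts]

# Discrete Fisher information measure: sensitivity of the distribution to
# small shifts; large values indicate a sharply organized distribution.
fim = sum((p[i + 1] - p[i]) ** 2 / p[i] for i in range(bins - 1) if p[i] > 0)

# Shannon entropy of the binned distribution (in nats); large values
# indicate disorder. The pair (entropy, fim) is the signal's FS-plane point.
entropy = -sum(q * math.log(q) for q in p if q > 0)

print(fim, entropy)
```

Plotting one such point per monitoring station would reproduce the kind of clustering the abstract reports for PM10 versus PM2.5.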

  17. Complex analysis conformal inequalities and the Bieberbach conjecture

    CERN Document Server

    Kythe, Prem K

    2015-01-01

    Complex Analysis: Conformal Inequalities and the Bieberbach Conjecture discusses the mathematical analysis created around the Bieberbach conjecture, which is responsible for the development of many beautiful aspects of complex analysis, especially in the geometric-function theory of univalent functions. Assuming basic knowledge of complex analysis and differential equations, the book is suitable for graduate students engaged in analytical research on the topics and researchers working on related areas of complex analysis in one or more complex variables. The author first reviews the theory of analytic functions, univalent functions, and conformal mapping before covering various theorems related to the area principle and discussing Löwner theory. He then presents Schiffer’s variation method, the bounds for the fourth and higher-order coefficients, various subclasses of univalent functions, generalized convexity and the class of α-convex functions, and numerical estimates of the coefficient problem. The boo...

  18. Analysis and control of complex dynamical systems robust bifurcation, dynamic attractors, and network complexity

    CERN Document Server

    Imura, Jun-ichi; Ueta, Tetsushi

    2015-01-01

    This book is the first to report on theoretical breakthroughs in the control of complex dynamical systems developed by collaborative researchers in the two fields of dynamical systems theory and control theory. Its basic point of view encompasses three kinds of complexity: bifurcation phenomena subject to model uncertainty, complex behavior including periodic/quasi-periodic orbits as well as chaotic orbits, and network complexity emerging from dynamical interactions between subsystems. Analysis and Control of Complex Dynamical Systems offers a valuable resource for mathematicians, physicists, and biophysicists, as well as for researchers in nonlinear science and control engineering, allowing them to develop a better fundamental understanding of the analysis and control synthesis of such complex systems.

  19. On the Complexity of Numerical Analysis

    DEFF Research Database (Denmark)

    Allender, Eric; Bürgisser, Peter; Kjeldgaard-Pedersen, Johan

    2005-01-01

    an integer N, decide whether N>0. We show that PosSLP lies in the counting hierarchy, and we show that if A is any language in the Boolean part of Polynomial-time over the Reals accepted by a machine whose machine constants are algebraic real numbers, then A is in P^PosSLP. Combining our results with work of Tiwari, we show that the Euclidean Traveling Salesman Problem lies in the counting hierarchy -- the previous best upper bound for this important problem (in terms of classical complexity classes) being PSPACE.
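
PosSLP -- given a division-free straight-line program computing an integer N, decide whether N > 0 -- is easy to state; a naive evaluator makes it concrete (hypothetical encoding, workable only for tiny programs, since repeated squaring lets values grow doubly exponentially in the program length, which is why the problem's true complexity is subtle):

```python
# A straight-line program (SLP) here is a list of steps (op, i, j): each step
# combines the results of earlier steps i and j with +, -, or *. Step 0 is
# the constant 1. PosSLP asks whether the final value is positive.
def pos_slp(program):
    vals = [1]  # v0 = 1
    for op, i, j in program:
        a, b = vals[i], vals[j]
        vals.append(a + b if op == "+" else a - b if op == "-" else a * b)
    return vals[-1] > 0

# v1 = v0 + v0 = 2,  v2 = v1 * v1 = 4,  v3 = v2 - v0 = 3  ->  positive
prog = [("+", 0, 0), ("*", 1, 1), ("-", 2, 0)]
print(pos_slp(prog))  # True
```

Direct evaluation with bignums takes exponential time and space in the worst case; the counting-hierarchy upper bound in the abstract is precisely about avoiding that blow-up.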

  20. Genomic Analysis of Complex Microbial Communities in Wounds

    Science.gov (United States)

    2012-01-01

    Permutation Multivariate Analysis of Variance (PerMANOVA). We used PerMANOVA to test the null-hypothesis of no... permutation-based version of the multivariate analysis of variance (MANOVA). PerMANOVA uses the distances between samples to partition variance and...coli. Keywords: antibiotics, bacteria, community analysis, diabetes, pyrosequencing, wound, wound therapy, 16S rRNA gene.

  1. Multidimensional approach to complex system resilience analysis

    International Nuclear Information System (INIS)

    Gama Dessavre, Dante; Ramirez-Marquez, Jose E.; Barker, Kash

    2016-01-01

    Recent works have attempted to formally define a general metric quantifying the resilience of complex systems as a relationship between system performance and time. The technical content of the proposed work introduces a new model that allows, for the first time, comparison of resilience among systems (or among different modifications to a system) by introducing a new dimension to system resilience models, called stress, to mimic the definition of resilience in materials science. The applicability and usefulness of the model are shown with a new heat-map visualization proposed in this work, and the model is applied to a simulated network resilience case to exemplify its potential benefits. - Highlights: • We analyzed two of the main current metrics of resilience. • We created a new model that relates events with the effects they have. • We developed a novel heat-map visualization to compare system resilience. • We showed the usefulness of the model and visualization in a simulated case.

  2. Analysis of remote synchronization in complex networks

    Science.gov (United States)

    Gambuzza, Lucia Valentina; Cardillo, Alessio; Fiasconaro, Alessandro; Fortuna, Luigi; Gómez-Gardeñes, Jesus; Frasca, Mattia

    2013-12-01

    A novel regime of synchronization, called remote synchronization, where the peripheral nodes form a phase synchronized cluster not including the hub, was recently observed in star motifs [Bergner et al., Phys. Rev. E 85, 026208 (2012)]. We show the existence of a more general dynamical state of remote synchronization in arbitrary networks of coupled oscillators. This state is characterized by the synchronization of pairs of nodes that are not directly connected via a physical link or any sequence of synchronized nodes. This phenomenon is almost negligible in networks of phase oscillators as its underlying mechanism is the modulation of the amplitude of those intermediary nodes between the remotely synchronized units. Our findings thus show the ubiquity and robustness of these states and bridge the gap from their recent observation in simple toy graphs to complex networks.

  3. NEXCADE: perturbation analysis for complex networks.

    Directory of Open Access Journals (Sweden)

    Gitanjali Yadav

    Recent advances in network theory have led to considerable progress in our understanding of complex real-world systems and their behavior in response to external threats or fluctuations. Much of this research has been invigorated by demonstration of the 'robust, yet fragile' nature of cellular and large-scale systems transcending biology, sociology, and ecology, through application of network theory to diverse interactions observed in nature, such as plant-pollinator, seed-dispersal agent and host-parasite relationships. In this work, we report the development of NEXCADE, an automated and interactive program for inducing disturbances into complex systems defined by networks, focusing on the changes in global network topology and connectivity as a function of the perturbation. NEXCADE uses a graph-theoretical approach to simulate perturbations in a user-defined manner, singly, in clusters, or sequentially. To demonstrate the promise it holds for broader adoption by the research community, we provide pre-simulated examples from diverse real-world networks, including eukaryotic protein-protein interaction networks, fungal biochemical networks, a variety of ecological food webs in nature, as well as social networks. NEXCADE not only enables network visualization at every step of the targeted attacks, but also allows risk assessment, i.e., identification of nodes critical for the robustness of the system of interest, in order to devise and implement context-based strategies for restructuring a network, or to achieve resilience against link or node failures. Source code and license for the software, designed to work on a Linux-based operating system (OS), can be downloaded at http://www.nipgr.res.in/nexcade_download.html. In addition, we have developed NEXCADE as an OS-independent online web server freely available to the scientific community without any login requirement at http://www.nipgr.res.in/nexcade.html.
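
The kind of targeted-attack simulation such tools automate can be sketched in a few lines (toy graph and hub-first strategy of my own choosing, not NEXCADE's actual code): repeatedly delete the highest-degree remaining node and track how the largest connected component shrinks.

```python
from collections import deque

# Hypothetical toy network (illustrative only): a hub-and-spoke-ish graph.
edges = [(0, 1), (0, 2), (0, 3), (0, 4), (1, 2), (3, 4), (4, 5), (5, 6)]

adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def largest_component(nodes, adj):
    """Size of the largest connected component induced by `nodes` (BFS)."""
    seen, best = set(), 0
    for s in nodes:
        if s in seen:
            continue
        size, queue = 0, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for w in adj[u]:
                if w in nodes and w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

# Targeted attack: record connectivity, then delete the current hub.
nodes = set(adj)
sizes = []
while nodes:
    sizes.append(largest_component(nodes, adj))
    hub = max(nodes, key=lambda n: len(adj[n] & nodes))
    nodes.remove(hub)
print(sizes)
```

The sharp drop after the first removal (the degree-4 hub) is the 'robust, yet fragile' signature the abstract refers to; random node removal would degrade the same graph far more gradually.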

  4. Waste analysis plan for T Plant Complex

    International Nuclear Information System (INIS)

    Williams, J.F.

    1996-01-01

    Washington Administrative Code 173-303-300 requires that a waste analysis plan (WAP) be provided by a treatment, storage, and/or disposal (TSD) unit to confirm its knowledge about a dangerous and/or mixed waste and to ensure that the waste is managed properly. The specific objectives of the WAP are as follows: Ensure safe management of waste during treatment and storage; Ensure that waste generated during operational activities is properly designated in accordance with regulatory requirements; Provide chemical and physical analysis of representative samples of the waste stored for characterization and/or verification before the waste is transferred to another TSD unit; Ensure compliance with land disposal restriction (LDR) requirements for treated waste; and Provide a basis for work plans that describe waste analysis for development of new treatment technologies.

  5. Fourier analysis in several complex variables

    CERN Document Server

    Ehrenpreis, Leon

    2006-01-01

    Suitable for advanced undergraduates and graduate students, this text develops comparison theorems to establish the fundamentals of Fourier analysis and to illustrate their applications to partial differential equations.The three-part treatment begins by establishing the quotient structure theorem or fundamental principle of Fourier analysis. Topics include the geometric structure of ideals and modules, quantitative estimates, and examples in which the theory can be applied. The second part focuses on applications to partial differential equations and covers the solution of homogeneous and inh

  6. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. Only recently have information systems (IS) studies also shown a keen interest in discourse (Wynn et al, 2002). The IS field has grown significantly in its multiplicity that is echoed in the

  7. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

    This thesis is an interdisciplinary work under the heading of complexity science which focuses on an arguably common "hard" problem across physics, finance and biology [1], to quantify and mimic the macroscopic "emergent phenomenon" in large-scale systems consisting of many interacting "particles" governed by microscopic rules. In contrast to traditional statistical physics, we are interested in systems whose dynamics are subject to feedback, evolution, adaption, openness, etc. Global financial markets, like the stock market and currency market, are ideal candidate systems for such a complexity study: there exists a vast amount of accurate data, which is the aggregate output of many autonomous agents continuously competing with each other. We started by examining the ultrafast "mini flash crash (MFC)" events in the US stock market. An abrupt system-wide composition transition from a mixed human-machine phase to a new all-machine phase is uncovered, and a novel theory developed to explain this observation. Then, in the study of the FX market, we found an unexpected variation in the synchronicity of price changes in different market subsections as a function of the overall trading activity. Several survival models have been tested in analyzing the distribution of waiting times to the next price change. In the region of long waiting times, the distribution for each currency pair exhibits a power law with exponent in the vicinity of 3.5. By contrast, for short waiting times only, the market activity can be mimicked by the fluctuations emerging from a finite resource competition model containing multiple agents with limited rationality (the so-called El Farol model). Switching to the biomedical domain, we present a minimal mathematical model built around a co-evolving resource network and cell population, yielding good agreement with primary tumors in mouse experiments and with clinical metastasis data. 
In the quest to understand contagion phenomena in systems where social group
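    The power-law tail of the waiting-time distribution described above can be estimated with the standard maximum-likelihood (Hill) estimator, alpha = 1 + n / sum(ln(x_i / x_min)). A sketch on synthetic data standing in for real inter-quote waiting times (the estimator is standard; the data here are simulated, not from the thesis):

```python
# Fit a power-law tail exponent by maximum likelihood on synthetic data.
import math
import random

def powerlaw_sample(alpha, xmin, n, rng):
    """Inverse-transform sampling from p(x) ~ x**(-alpha) for x >= xmin."""
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def hill_estimate(xs, xmin):
    """ML (Hill) estimate of the power-law exponent for the tail x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

rng = random.Random(42)
waits = powerlaw_sample(3.5, 1.0, 50_000, rng)  # true exponent 3.5
alpha_hat = hill_estimate(waits, 1.0)
```

    With 50,000 samples the estimator's standard error is roughly (alpha - 1)/sqrt(n), about 0.01 here, so the recovered exponent sits close to the true 3.5.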

  8. A complexity analysis of functional interpretations

    DEFF Research Database (Denmark)

    Hernest, Mircea-Dan; Kohlenbach, Ulrich

    2005-01-01

    Summary: We give a quantitative analysis of Gödel's functional interpretation and its monotone variant. The two have been used for the extraction of programs and numerical bounds as well as for conservation results. They apply both to (semi-)intuitionistic as well as (combined with negative translation) classical proofs. The proofs may be formalized in systems ranging from weak base systems to arithmetic and analysis (and numerous fragments of these). We give upper bounds in basic proof data on the depth, size, maximal type degree and maximal type arity...

  9. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  10. Analysis and design of complex impedance transforming Marchand baluns

    DEFF Research Database (Denmark)

    Michaelsen, Rasmus Schandorph; Johansen, Tom Keinicke; Tamborg, Kjeld M.

    2014-01-01

    A new type of Marchand balun is presented in this paper, which has the property of complex impedance transformation. To allow the Marchand balun to transform between arbitrary complex impedances, three reactances should be added to the circuit. A detailed analysis of the circuit gives the governing...

  11. Widening the scope of incident analysis in complex work environments

    NARCIS (Netherlands)

    Vuuren, van W.; Kanse, L.; Manser, T.

    2003-01-01

    Incident analysis is commonly used as a tool to provide information on how to improve health and safety in complex work environments. The effectiveness of this tool, however, depends on how the analysis is actually carried out. The traditional view on incident analysis highlights the role of human

  12. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  13. Complex eigenvalue analysis of railway wheel/rail squeal

    African Journals Online (AJOL)

    Squeal noise from wheel/rail and brake disc/pad frictional contact is typical in railways. ... squeal noise by multibody simulation of a rail car running on rigid rails. ... system, traditional complex eigenvalue analysis by finite element was used.

  14. An Alternative Front End Analysis Strategy for Complex Systems

    Science.gov (United States)

    2014-12-01

    Research Report 1981: An Alternative Front End Analysis Strategy for Complex Systems, by M. Glenn Cobb, U.S. Army Research Institute (contract number W5J9CQ11D0003, program element number 633007). The report uses the Patriot anti-ballistic missile (ABM) system as its example: Patriot is employed in the field through a battalion echelon organizational structure, and the line battery is the basic building...

  15. Coping with Complexity Model Reduction and Data Analysis

    CERN Document Server

    Gorban, Alexander N

    2011-01-01

    This volume contains the extended version of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in e.g., fluid dynamics and chemical kinetics.

  16. Uranium complexes with macrocyclic polyethers. Synthesis and structural chemical analysis

    International Nuclear Information System (INIS)

    Elbasyouny, A.

    1983-01-01

    This dissertation reports about studies on the chemical coordination behaviour of uranium of oxidation states IV and VI with regard to twelve different macrocyclic ligands. For the preparation of the complexes, a different method has been developed for each system. The elementary analysis of the various complexes including the uranium had been done by X-ray fluorescence analysis, and the structural characterization proceeded via vibrational, UV-Vis and emission spectroscopy as well as 1H-NMR and 13C spin-lattice relaxation time studies. Conformational analysis of the polyethers used allowed the structural changes in the complexes to be observed. The structural analysis of the hydrous uranium(VI) crown ether complexes yielded information on characteristic features of these types of complexes. The first coordination sphere of the uranyl ion with covalently bonded anion remains unchanged. As to the water content, there is a certain range. Depending upon the solvent used, the complexes have two or four H2O molecules per formula unit. (orig./EF) [de

  17. Complex analysis with applications to flows and fields

    CERN Document Server

    Braga da Costa Campos, Luis Manuel

    2012-01-01

    Complex Analysis with Applications to Flows and Fields presents the theory of functions of a complex variable, from the complex plane to the calculus of residues to power series to conformal mapping. The book explores numerous physical and engineering applications concerning potential flows, the gravity field, electro- and magnetostatics, steady heat conduction, and other problems. It provides the mathematical results to sufficiently justify the solution of these problems, eliminating the need to consult external references.The book is conveniently divided into four parts. In each part, the ma

  18. Complex of the equipment for instrumental element analysis

    International Nuclear Information System (INIS)

    Borisov, G.I.; Komkov, M.M.; Kuz'michev, V.A.

    1986-01-01

    A complex of equipment for instrumental element analysis at the IR-8 reactor has been designed, fabricated and taken into operation. The complex is provided with a multichannel system of vacuum pneumatic transport with radiation positions in the reactor horizontal tangential channel for neutron-activation analysis using short-lived isotopes; specialized dry vertical channels in a beryllium reflector of the reactor and a remote system of radioactive sample replacement for neutron-activation analysis using long-lived isotopes; a specialized horizontal tangential channel for neutron beam extraction by means of a beryllium converter and a remote device for replacing studied samples under radiation and measuring prompt γ-radiation for neutron-radiation analysis; and a measuring center using minicomputers for experimental data accumulation, processing and analysis control

  19. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric functions theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  20. Protein complex prediction via dense subgraphs and false positive analysis.

    Directory of Open Access Journals (Sweden)

    Cecilia Hernandez

    Full Text Available Many proteins work together with others in groups called complexes in order to achieve a specific function. Discovering protein complexes is important for understanding biological processes and predicting protein functions in living organisms. Large-scale, high-throughput techniques have made it possible to compile protein-protein interaction networks (PPI networks, which have been used in several computational approaches for detecting protein complexes. Those predictions might guide future biological experimental research. Some approaches are topology-based, where highly connected proteins are predicted to be complexes; some propose different clustering algorithms using partitioning, with overlaps among clusters, for networks modeled with unweighted or weighted graphs; and others use density of clusters and information based on protein functionality. However, some schemes still require much processing time or the quality of their results can be improved. Furthermore, most of the results obtained with computational tools are not accompanied by an analysis of false positives. We propose an effective and efficient mining algorithm for discovering highly connected subgraphs, which is our basis for defining protein complexes. Our representation is based on transforming the PPI network into a directed acyclic graph that reduces the number of represented edges and the search space for discovering subgraphs. Our approach considers weighted and unweighted PPI networks. We compare our best alternative using PPI networks from Saccharomyces cerevisiae (yeast and Homo sapiens (human with state-of-the-art approaches in terms of clustering, biological metrics and execution times, as well as three gold standards for yeast and two for human. Furthermore, we analyze false positive predicted complexes by searching the PDBe (Protein Data Bank in Europe database in order to identify matching protein complexes that have been purified and structurally characterized. 
Our analysis shows
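    A simple baseline for the "highly connected subgraph" step is Charikar's greedy peeling for the densest subgraph: repeatedly drop the minimum-degree node and keep the intermediate subgraph with the highest average degree. This is a generic stand-in for illustration, not the DAG-based algorithm of the paper above; the toy "PPI" network is invented.

```python
# Greedy densest-subgraph peeling (2-approximation) on an adjacency dict.

def densest_subgraph(adj):
    """Return (node set, density) maximizing edges/nodes over peeling steps."""
    adj = {u: set(vs) for u, vs in adj.items()}
    edges = sum(len(vs) for vs in adj.values()) // 2
    best_density, best_nodes = 0.0, set(adj)
    while adj:
        density = edges / len(adj)
        if density > best_density:
            best_density, best_nodes = density, set(adj)
        u = min(adj, key=lambda x: len(adj[x]))  # peel the min-degree node
        edges -= len(adj[u])
        for v in adj.pop(u):
            adj[v].discard(u)
    return best_nodes, best_density

# Toy "PPI" network: a 4-clique (a,b,c,d) plus a pendant chain d-e-f.
pairs = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("b", "d"),
         ("c", "d"), ("d", "e"), ("e", "f")]
adj = {}
for u, v in pairs:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)
nodes, density = densest_subgraph(adj)
```

    On the toy network the clique is recovered as the densest subgraph, mirroring the intuition that complexes appear as highly connected regions of the PPI graph.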

  1. Proteomic Analysis of the Mediator Complex Interactome in Saccharomyces cerevisiae.

    Science.gov (United States)

    Uthe, Henriette; Vanselow, Jens T; Schlosser, Andreas

    2017-02-27

    Here we present the most comprehensive analysis of the yeast Mediator complex interactome to date. Particularly gentle cell lysis and co-immunopurification conditions allowed us to preserve even transient protein-protein interactions and to comprehensively probe the molecular environment of the Mediator complex in the cell. Metabolic 15N-labeling thereby enabled stringent discrimination between bona fide interaction partners and nonspecifically captured proteins. Our data indicates a functional role for Mediator beyond transcription initiation. We identified a large number of Mediator-interacting proteins and protein complexes, such as RNA polymerase II, general transcription factors, a large number of transcriptional activators, the SAGA complex, chromatin remodeling complexes, histone chaperones, highly acetylated histones, as well as proteins playing a role in co-transcriptional processes, such as splicing, mRNA decapping and mRNA decay. Moreover, our data provides clear evidence that the Mediator complex interacts not only with RNA polymerase II, but also with RNA polymerases I and III, and indicates a functional role of the Mediator complex in rRNA processing and ribosome biogenesis.

  2. Correlation analysis of the Taurus molecular cloud complex

    International Nuclear Information System (INIS)

    Kleiner, S.C.

    1985-01-01

    Autocorrelation and power spectrum methods were applied to the analysis of the density and velocity structure of the Taurus Complex and Heiles Cloud 2 as traced out by 13CO J = 1 → 0 molecular line observations obtained with the 14m antenna of the Five College Radio Astronomy Observatory. Statistically significant correlations in the spacing of density fluctuations within the Taurus Complex and Heiles 2 were uncovered. The length scales of the observed correlations correspond in magnitude to the Jeans wavelengths characterizing gravitational instabilities with (i) interstellar atomic hydrogen gas for the case of the Taurus complex, and (ii) molecular hydrogen for Heiles 2. The observed correlations may be the signatures of past and current gravitational instabilities frozen into the structure of the molecular gas. The appendices provide a comprehensive description of the analytical and numerical methods developed for the correlation analysis of molecular clouds
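    In one dimension, the kind of correlation analysis described above reduces to locating a secondary peak in the autocorrelation function of a density tracer; the lag of that peak is the characteristic spacing of the fluctuations. A minimal sketch on a synthetic profile (the 25-cell spacing is invented for illustration, not taken from the observations):

```python
# Recover a characteristic spacing from the autocorrelation of a 1-D profile.
import math

def autocorr(x):
    """Normalized autocorrelation of x for lags 0 .. len(x)//2 - 1."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x)
    return [sum((x[i] - mu) * (x[i + k] - mu) for i in range(n - k)) / var
            for k in range(n // 2)]

# Synthetic "density profile": a fluctuation repeating every 25 cells.
profile = [1.0 + 0.5 * math.cos(2 * math.pi * i / 25) for i in range(400)]
ac = autocorr(profile)
# The first secondary peak (skipping the trivial lag-0 maximum).
peak = max(range(10, len(ac)), key=lambda k: ac[k])
```

    The lag of the secondary maximum recovers the imposed 25-cell spacing, the 1-D analogue of reading a Jeans-like length scale off the correlation function.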

  3. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  4. Twenty-one lectures on complex analysis a first course

    CERN Document Server

    Isaev, Alexander

    2017-01-01

    At its core, this concise textbook presents standard material for a first course in complex analysis at the advanced undergraduate level. This distinctive text will prove most rewarding for students who have a genuine passion for mathematics as well as certain mathematical maturity. Primarily aimed at undergraduates with working knowledge of real analysis and metric spaces, this book can also be used to instruct a graduate course. The text uses a conversational style with topics purposefully apportioned into 21 lectures, providing a suitable format for either independent study or lecture-based teaching. Instructors are invited to rearrange the order of topics according to their own vision. A clear and rigorous exposition is supported by engaging examples and exercises unique to each lecture; a large number of exercises contain useful calculation problems. Hints are given for a selection of the more difficult exercises. This text furnishes the reader with a means of learning complex analysis as well as a subtl...

  5. System for decision analysis support on complex waste management issues

    International Nuclear Information System (INIS)

    Shropshire, D.E.

    1997-01-01

    A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts to waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives. Objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years

  6. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

    Full Text Available Complex informational spectrum analysis for protein sequences (CISAPS and its web-based server are developed and presented. As recent studies show, use of the absolute spectrum alone in the analysis of protein sequences using informational spectrum analysis has proven to be insufficient. Therefore, CISAPS is developed to consider and provide results in three forms: absolute, real, and imaginary spectrum. Biologically relevant features in the analysis of influenza A subtypes, presented here as a case study, can also appear individually in either the real or the imaginary spectrum. As the results show, protein classes can present similarities or differences according to the features extracted from the CISAPS web server. These associations are likely related to the protein property that the specific amino acid index represents. In addition, various technical issues such as zero-padding and windowing that may affect the analysis are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, where each one represents a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply and include complex informational spectrum analysis in their work.
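    The informational spectrum idea in miniature: map each residue to a numerical amino-acid index value, take the discrete Fourier transform, and report absolute, real, and imaginary spectra, as CISAPS does. The index values below are hypothetical placeholders, not any of the 611 published scales:

```python
# Toy informational spectrum: residue -> index value -> one-sided DFT.
import cmath

# Hypothetical amino-acid index (placeholder values for illustration only).
INDEX = {"A": 0.37, "C": 0.83, "D": 1.13, "E": 0.01, "G": 0.00,
         "K": 0.37, "L": 0.00, "R": 0.96, "S": 0.83, "T": 0.94}

def spectra(seq):
    """Return (absolute, real, imaginary) one-sided spectra of a sequence."""
    x = [INDEX[a] for a in seq]
    n = len(x)
    dft = [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n)
               for t in range(n))
           for f in range(n // 2)]
    return ([abs(z) for z in dft],      # absolute spectrum
            [z.real for z in dft],      # real spectrum
            [z.imag for z in dft])      # imaginary spectrum

absolute, real, imag = spectra("ACDEGKLRST")
```

    Keeping the real and imaginary parts separately, rather than only the magnitude, is precisely the point the abstract makes: features can show up in one component and not the other.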

  7. Operator Semigroups meet Complex Analysis, Harmonic Analysis and Mathematical Physics

    CERN Document Server

    Chill, Ralph; Tomilov, Yuri

    2015-01-01

    This proceedings volume originates from a conference held in Herrnhut in June 2013. It provides unique insights into the power of abstract methods and techniques in dealing successfully with numerous applications stemming from classical analysis and mathematical physics. The book features diverse topics in the area of operator semigroups, including partial differential equations, martingale and Hilbert transforms, Banach and von Neumann algebras, Schrödinger operators, maximal regularity and Fourier multipliers, interpolation, operator-theoretical problems (concerning generation, perturbation and dilation, for example), and various qualitative and quantitative Tauberian theorems with a focus on transfinite induction and magics of Cantor. The last fifteen years have seen the dawn of a new era for semigroup theory with the emphasis on applications of abstract results, often unexpected and far removed from traditional ones. The aim of the conference was to bring together prominent experts in the field of modern...

  8. High frequency vibration analysis by the complex envelope vectorization.

    Science.gov (United States)

    Giannini, O; Carcaterra, A; Sestieri, A

    2007-06-01

    The complex envelope displacement analysis (CEDA) is a procedure to solve high frequency vibration and vibro-acoustic problems, providing the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content and has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties, so a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, highlighting merits and limits of the procedure, and a set of applications to vibration and vibro-acoustic problems of increasing complexity are presented.
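    The variable transformation at the heart of the complex-envelope idea can be illustrated on a signal rather than on the equations of motion: demodulate a rapidly oscillating signal by its carrier frequency and low-pass the result, leaving a slowly varying complex envelope. A minimal sketch of that principle (not the full CEDA/CEV formulation; the carrier frequency and window length are invented for illustration):

```python
# Extract the slowly varying envelope of a high-frequency signal by
# demodulation and moving-average smoothing.
import cmath
import math

def complex_envelope(signal, dt, carrier_hz, window=50):
    """Demodulate at carrier_hz, then smooth with a centered moving average."""
    demod = [s * cmath.exp(-2j * math.pi * carrier_hz * k * dt)
             for k, s in enumerate(signal)]
    out = []
    for k in range(len(demod)):
        lo, hi = max(0, k - window), min(len(demod), k + window + 1)
        out.append(sum(demod[lo:hi]) / (hi - lo))
    return out

dt = 1e-4
t = [k * dt for k in range(5000)]
# 200 Hz carrier with slowly varying amplitude 1 + 0.5*sin(2*pi*2*t).
sig = [(1 + 0.5 * math.sin(2 * math.pi * 2 * tk)) *
       math.cos(2 * math.pi * 200 * tk) for tk in t]
env = [2 * abs(z) for z in complex_envelope(sig, dt, 200.0)]
```

    The envelope varies at 2 Hz while the raw signal oscillates at 200 Hz, so the envelope can be resolved on a grid a hundred times coarser, which is the computational payoff the abstract alludes to.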

  9. Cumulative trauma and symptom complexity in children: a path analysis.

    Science.gov (United States)

    Hodges, Monica; Godbout, Natacha; Briere, John; Lanktree, Cheryl; Gilbert, Alicia; Kletzka, Nicole Taylor

    2013-11-01

    Multiple trauma exposures during childhood are associated with a range of psychological symptoms later in life. In this study, we examined whether the total number of different types of trauma experienced by children (cumulative trauma) is associated with the complexity of their subsequent symptomatology, where complexity is defined as the number of different symptom clusters simultaneously elevated into the clinical range. Children's symptoms in six different trauma-related areas (e.g., depression, anger, posttraumatic stress) were reported both by child clients and their caretakers in a clinical sample of 318 children. Path analysis revealed that accumulated exposure to multiple different trauma types predicts symptom complexity as reported by both children and their caretakers. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Nostradamus 2014 prediction, modeling and analysis of complex systems

    CERN Document Server

    Suganthan, Ponnuthurai; Chen, Guanrong; Snasel, Vaclav; Abraham, Ajith; Rössler, Otto

    2014-01-01

    The prediction of the behavior of complex systems, and the analysis and modeling of their structure, is a vitally important problem in engineering, economics and science generally today. Examples of such systems can be seen in the world around us (including our bodies) and of course in almost every scientific discipline, including such “exotic” domains as the earth’s atmosphere, turbulent fluids, economics (exchange rates and stock markets), population growth, physics (control of plasma), information flow in social networks and its dynamics, chemistry and complex networks. To understand such complex dynamics, which often exhibit strange behavior, and to use them in research or industrial applications, it is paramount to create models of them. For this purpose there exists a rich spectrum of methods, from classical ones such as ARMA models or the Box-Jenkins method to modern ones like evolutionary computation, neural networks, fuzzy logic, geometry, deterministic chaos amongst others. This proceedings book is a collection of accepted ...

  11. Myocardial ischemia analysis based on electrocardiogram QRS complex

    International Nuclear Information System (INIS)

    Song, J.; Yan, H.; Xu, Z.; Yu, X.; Zhu, R.

    2011-01-01

    Full text: Electrocardiogram (ECG) is an economic, convenient, and non-invasive detecting tool in myocardial ischemia (MI), and its clinical appearance is mainly exhibited by the changes in the ST-T complex. Recently, QRS complex characters have been proposed by more and more researchers for analyzing MI. In this paper, various QRS complex characters were extracted from ECG signals, and their relationship was analyzed systematically. As a result, these characters were divided into two groups: there was a good relationship among the characters within each group, but a poor relationship between the groups. These QRS complex characters were then applied for statistical analysis of MI, and five characters showed significant differences after ECG recording verification, which were: QRS upward and downward slopes, transient heart rate, angle R and angle Q. On the other hand, these QRS complex characters were analyzed in the frequency domain. Experimental results showed that the frequency features of the RR interval series (Heart Rate Variability, HRV) and the QRS barycenter sequence had significant differences between MI states and normal states. Moreover, the QRS barycenter sequence performed better. (author)

  12. Outlier-resilient complexity analysis of heartbeat dynamics

    Science.gov (United States)

    Lo, Men-Tzung; Chang, Yi-Chung; Lin, Chen; Young, Hsu-Wen Vincent; Lin, Yen-Hung; Ho, Yi-Lwun; Peng, Chung-Kang; Hu, Kun

    2015-03-01

    Complexity in physiological outputs is believed to be a hallmark of healthy physiological control. How to accurately quantify the degree of complexity in physiological signals with outliers remains a major barrier for translating this novel concept of nonlinear dynamic theory to clinical practice. Here we propose a new approach to estimate the complexity in a signal by analyzing the irregularity of the sign time series of its coarse-grained time series at different time scales. Using surrogate data, we show that the method can reliably assess the complexity in noisy data while being highly resilient to outliers. We further apply this method to the analysis of human heartbeat recordings. Without removing any outliers due to ectopic beats, the method is able to detect a degradation of cardiac control in patients with congestive heart failure and a further degradation in critically ill patients whose survival relies on an extracorporeal membrane oxygenator (ECMO). Moreover, the derived complexity measures can predict the mortality of ECMO patients. These results indicate that the proposed method may serve as a promising tool for monitoring cardiac function of patients in clinical settings.
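    The sign-series idea above can be sketched in a few lines: coarse-grain the signal at a given scale, reduce each increment to its sign, and score irregularity over short sign "words". The paper's exact irregularity measure may differ; Shannon entropy of three-symbol words is used here as a stand-in, and the test signals are synthetic:

```python
# Irregularity of the sign series of a coarse-grained signal.
import math
import random

def sign_irregularity(x, scale, word=3):
    """Entropy (bits) of length-`word` sign patterns of the coarse-grained
    increment series at the given scale."""
    # Coarse-grain: non-overlapping averages of length `scale`.
    cg = [sum(x[i:i + scale]) / scale
          for i in range(0, len(x) - scale + 1, scale)]
    # Reduce each increment to its sign; magnitudes (and outliers) drop out.
    signs = ["+" if b > a else "-" for a, b in zip(cg, cg[1:])]
    counts = {}
    for i in range(len(signs) - word + 1):
        w = "".join(signs[i:i + word])
        counts[w] = counts.get(w, 0) + 1
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(3000)]     # irregular signal
regular = [math.sin(i / 50) for i in range(3000)]  # highly regular signal
h_noise = sign_irregularity(noise, scale=2)
h_regular = sign_irregularity(regular, scale=2)
```

    Because only the sign of each increment is kept, a single extreme outlier changes at most a couple of symbols, which is the source of the outlier resilience the abstract emphasizes.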

  13. Complex finite element sensitivity method for creep analysis

    International Nuclear Information System (INIS)

    Gomez-Farias, Armando; Montoya, Arturo; Millwater, Harry

    2015-01-01

    The complex finite element method (ZFEM) has been extended to perform sensitivity analysis for mechanical and structural systems undergoing creep deformation. ZFEM uses a complex finite element formulation to provide shape, material, and loading derivatives of the system response, providing an insight into the essential factors which control the behavior of the system as a function of time. A complex variable-based quadrilateral user element (UEL) subroutine implementing the power law creep constitutive formulation was incorporated within the Abaqus commercial finite element software. The results of the complex finite element computations were verified by comparing them to the reference solution for the steady-state creep problem of a thick-walled cylinder in the power law creep range. A practical application of the ZFEM implementation to creep deformation analysis is the calculation of the skeletal point of a notched bar test from a single ZFEM run. In contrast, the standard finite element procedure requires multiple runs. The value of the skeletal point is that it identifies the location where the stress state is accurate, regardless of the certainty of the creep material properties. - Highlights: • A novel finite element sensitivity method (ZFEM) for creep was introduced. • ZFEM has the capability to calculate accurate partial derivatives. • ZFEM can be used for identification of the skeletal point of creep structures. • ZFEM can be easily implemented in a commercial software, e.g. Abaqus. • ZFEM results were shown to be in excellent agreement with analytical solutions
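    ZFEM builds on complex-step differentiation: perturbing an input by a tiny imaginary step i*h makes Im f(x + i*h)/h a first derivative accurate to machine precision, with no subtractive cancellation. A scalar sketch of that principle, applied to a power-law creep rate of the kind mentioned above (ZFEM embeds the same idea inside the finite element formulation; the material constants here are invented):

```python
# Complex-step differentiation of a scalar power-law creep rate.

def complex_step_derivative(f, x, h=1e-30):
    """df/dx via the complex-step method; h can be tiny with no cancellation."""
    return f(x + 1j * h).imag / h

# Norton-Bailey power-law creep rate, strain_rate = A * sigma**n
# (A and n are illustrative values, not from the paper).
A, n = 2.0e-9, 4.5
creep_rate = lambda sigma: A * sigma ** n

sigma = 80.0
d_exact = A * n * sigma ** (n - 1)              # analytical sensitivity
d_cs = complex_step_derivative(creep_rate, sigma)
```

    Unlike a finite-difference step, the imaginary step can be made arbitrarily small, which is why a single complex run yields sensitivities without the multiple perturbed runs the standard procedure requires.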

  14. Complexity, Analysis and Control of Singular Biological Systems

    CERN Document Server

    Zhang, Qingling; Zhang, Xue

    2012-01-01

    Complexity, Analysis and Control of Singular Biological Systems follows the control of real-world biological systems at both ecological and phyisological levels concentrating on the application of now-extensively-investigated singular system theory. Much effort has recently been dedicated to the modelling and analysis of developing bioeconomic systems and the text establishes singular examples of these, showing how proper control can help to maintain sustainable economic development of biological resources. The book begins from the essentials of singular systems theory and bifurcations before tackling  the use of various forms of control in singular biological systems using examples including predator-prey relationships and viral vaccination and quarantine control. Researchers and graduate students studying the control of complex biological systems are shown how a variety of methods can be brought to bear and practitioners working with the economics of biological systems and their control will also find the ...

  15. Modeling data irregularities and structural complexities in data envelopment analysis

    CERN Document Server

    Zhu, Joe

    2007-01-01

    In a relatively short period of time, Data Envelopment Analysis (DEA) has grown into a powerful quantitative, analytical tool for measuring and evaluating performance. It has been successfully applied to a whole variety of problems in many different contexts worldwide. This book deals with the micro aspects of handling and modeling data issues in modeling DEA problems. DEA's use has grown with its capability of dealing with complex "service industry" and the "public service domain" types of problems that require modeling of both qualitative and quantitative data. This handbook treatment deals with specific data problems including: imprecise or inaccurate data; missing data; qualitative data; outliers; undesirable outputs; quality data; statistical analysis; software and other data aspects of modeling complex DEA problems. In addition, the book will demonstrate how to visualize DEA results when the data is more than 3-dimensional, and how to identify efficiency units quickly and accurately.

  16. Procedure for the analysis of americium in complex matrices

    International Nuclear Information System (INIS)

    Knab, D.

    1978-02-01

    A radioanalytical procedure for the analysis of americium in complex matrices has been developed. Clean separations of americium can be obtained from up to 100 g of sample ash, regardless of the starting material. The ability to analyze large masses of material provides the increased sensitivity necessary to detect americium in many environmental samples. The procedure adequately decontaminates from rare earth elements and natural radioactive nuclides that interfere with the alpha spectrometric measurements.

  17. Complexity analysis based on generalized deviation for financial markets

    Science.gov (United States)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method, complexity analysis based on generalized deviation, is proposed as a measure to investigate the correlation between past price and future volatility in financial time series. In comparison with the former retarded volatility model, the new approach is both simple and computationally efficient. The method, based on the generalized deviation function, provides an exhaustive way of quantifying the rules of the financial market. Robustness of the method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series but also provides a comprehensive way of distinguishing the characteristics of stock indices from those of individual stocks. Exponential functions can be used to fit the volatility curves and quantify the changes of complexity in stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile and calm periods. After analyzing the experimental model, we find that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.

  18. Analysis of costs-benefits tradeoffs of complex security systems

    International Nuclear Information System (INIS)

    Hicks, M.J.

    1996-01-01

    Essential to a systems approach to the design of security systems is an analysis of the cost effectiveness of alternative designs. While the concept of analyzing costs and benefits is straightforward, implementation can be tedious at best and, for complex designs and alternatives, can become nearly intractable without the help of structured analysis tools. PACAIT (Performance and Cost Analysis Integrated Tools) is a prototype tool. The performance side of the analysis collates and reduces data from ASSESS, an existing DOE PC-based security-system performance analysis tool. The cost side of the analysis uses ACE, an existing DOD PC-based cost analysis tool. Costs are reported over the full life cycle of the system, that is, the costs to procure, operate, maintain, and retire the system and all of its components. Results are collected in Microsoft® Excel workbooks and are readily available to analysts and decision makers in both tabular and graphical formats and at both the system and path-element levels.

  19. Economic development and wage inequality: A complex system analysis.

    Directory of Open Access Journals (Sweden)

    Angelica Sbardella

    Full Text Available Adapting methods from complex system analysis, this paper analyzes the features of the complex relationship between wage inequality and the development and industrialization of a country. Development is understood as a combination of a monetary index, GDP per capita, and a recently introduced measure of a country's economic complexity: Fitness. Initially the paper looks at wage inequality on a global scale, over the time period 1990-2008. Our empirical results show that globally the movement of wage inequality along with the ongoing industrialization of countries has followed a longitudinally persistent pattern comparable to the one theorized by Kuznets in the fifties: countries with an average level of development suffer the highest levels of wage inequality. Next, the study narrows its focus on wage inequality within the United States. By using data on wages and employment in the approximately 3100 US counties over the time interval 1990-2014, it generalizes the Fitness-Complexity metric for geographic units and industrial sectors, and then investigates wage inequality between NAICS industries. The empirical time and scale dependencies are consistent with a relation between wage inequality and development driven by institutional factors comparing countries, and by change in the structural compositions of sectors in a homogeneous institutional environment, such as the counties of the United States.

  20. Economic development and wage inequality: A complex system analysis

    Science.gov (United States)

    Pugliese, Emanuele; Pietronero, Luciano

    2017-01-01

    Adapting methods from complex system analysis, this paper analyzes the features of the complex relationship between wage inequality and the development and industrialization of a country. Development is understood as a combination of a monetary index, GDP per capita, and a recently introduced measure of a country’s economic complexity: Fitness. Initially the paper looks at wage inequality on a global scale, over the time period 1990–2008. Our empirical results show that globally the movement of wage inequality along with the ongoing industrialization of countries has followed a longitudinally persistent pattern comparable to the one theorized by Kuznets in the fifties: countries with an average level of development suffer the highest levels of wage inequality. Next, the study narrows its focus on wage inequality within the United States. By using data on wages and employment in the approximately 3100 US counties over the time interval 1990–2014, it generalizes the Fitness-Complexity metric for geographic units and industrial sectors, and then investigates wage inequality between NAICS industries. The empirical time and scale dependencies are consistent with a relation between wage inequality and development driven by institutional factors comparing countries, and by change in the structural compositions of sectors in a homogeneous institutional environment, such as the counties of the United States. PMID:28926577

  1. Economic development and wage inequality: A complex system analysis.

    Science.gov (United States)

    Sbardella, Angelica; Pugliese, Emanuele; Pietronero, Luciano

    2017-01-01

    Adapting methods from complex system analysis, this paper analyzes the features of the complex relationship between wage inequality and the development and industrialization of a country. Development is understood as a combination of a monetary index, GDP per capita, and a recently introduced measure of a country's economic complexity: Fitness. Initially the paper looks at wage inequality on a global scale, over the time period 1990-2008. Our empirical results show that globally the movement of wage inequality along with the ongoing industrialization of countries has followed a longitudinally persistent pattern comparable to the one theorized by Kuznets in the fifties: countries with an average level of development suffer the highest levels of wage inequality. Next, the study narrows its focus on wage inequality within the United States. By using data on wages and employment in the approximately 3100 US counties over the time interval 1990-2014, it generalizes the Fitness-Complexity metric for geographic units and industrial sectors, and then investigates wage inequality between NAICS industries. The empirical time and scale dependencies are consistent with a relation between wage inequality and development driven by institutional factors comparing countries, and by change in the structural compositions of sectors in a homogeneous institutional environment, such as the counties of the United States.
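    The Fitness-Complexity metric that these papers generalize is defined by a non-linear iterative map over a binary country-product matrix. A minimal sketch of the standard iteration as commonly stated in the economic-complexity literature (the county/NAICS adaptation in the paper itself may differ in detail):

```python
def fitness_complexity(M, n_iter=100):
    """Iterate the Fitness-Complexity map on a binary country x product
    matrix M (M[c][p] = 1 if country c exports product p).
    Each step: fitness sums the complexities of a country's products;
    complexity is penalized by the least-fit producers; both vectors are
    renormalized by their means so the map has a fixed point."""
    n_c, n_p = len(M), len(M[0])
    F = [1.0] * n_c  # country fitness
    Q = [1.0] * n_p  # product complexity
    for _ in range(n_iter):
        F_new = [sum(M[c][p] * Q[p] for p in range(n_p)) for c in range(n_c)]
        Q_new = [1.0 / sum(M[c][p] / F[c] for c in range(n_c)) for p in range(n_p)]
        mF = sum(F_new) / n_c
        mQ = sum(Q_new) / n_p
        F = [f / mF for f in F_new]
        Q = [q / mQ for q in Q_new]
    return F, Q
```

    On a nested 3x3 matrix, the fully diversified country receives the highest fitness, and the product made only by that country receives the highest complexity, which is the qualitative behavior the metric is designed for.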

  2. Use of neural networks in the analysis of complex systems

    International Nuclear Information System (INIS)

    Uhrig, R.E.

    1992-01-01

    The application of neural networks, alone or in conjunction with other advanced technologies (expert systems, fuzzy logic, and/or genetic algorithms), to some of the problems of complex engineering systems has the potential to enhance the safety, reliability, and operability of these systems. The work described here deals with complex systems or parts of such systems that can be isolated from the total system. Typically, the measured variables from the systems are analog variables that must be sampled and normalized to expected peak values before they are introduced into neural networks. Often the data must be processed further into a form more acceptable to the neural network. The neural networks are usually simulated on modern high-speed computers that carry out the calculations serially. However, it is possible to implement neural networks using specially designed microchips where the network calculations are truly carried out in parallel, thereby providing virtually instantaneous outputs for each set of inputs. Specific applications described include: Diagnostics: State of the Plant; Hybrid System for Transient Identification; Detection of Change of Mode in Complex Systems; Sensor Validation; Plant-Wide Monitoring; Monitoring of Performance and Efficiency; and Analysis of Vibrations. Although the specific examples described deal with nuclear power plants or their subsystems, the techniques can be applied to a wide variety of complex engineering systems.
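    The preprocessing step described above, scaling sampled analog variables by their expected peak values before they reach the network, can be sketched as follows (the clipping of out-of-range readings is an added assumption, not something stated in the record):

```python
def normalize_to_peak(samples, expected_peak):
    """Scale sampled analog sensor readings into [-1, 1] using the
    expected peak value of the signal. Readings beyond the expected
    peak are clipped (an assumption for this sketch)."""
    out = []
    for s in samples:
        x = s / expected_peak
        out.append(max(-1.0, min(1.0, x)))
    return out
```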

  3. State analysis requirements database for engineering complex embedded systems

    Science.gov (United States)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which captures system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.

  4. Fire hazard analysis for Plutonium Finishing Plant complex

    International Nuclear Information System (INIS)

    MCKINNIS, D.L.

    1999-01-01

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  5. Fire hazard analysis for Plutonium Finishing Plant complex

    Energy Technology Data Exchange (ETDEWEB)

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  6. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    Science.gov (United States)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, showing that our method distinguishes different dynamics as well as traditional recurrence analysis does. The proposed methodology therefore characterizes the recurrence matrix adequately while using a reduced set of points from the original recurrence plots.
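    The generic recurrence-plot-to-network construction underlying this family of methods can be sketched as follows; this illustrates the basic idea only, not the RDE-CN reduction step itself (the threshold `eps` and the use of scalar values rather than embedded state vectors are simplifying assumptions):

```python
def recurrence_network(series, eps):
    """Build an unweighted recurrence network from a scalar time series:
    two time points are linked when their values lie within eps of each
    other; the diagonal (self-recurrence) is removed, as is usual when
    the recurrence matrix is read as an adjacency matrix."""
    n = len(series)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j and abs(series[i] - series[j]) <= eps:
                adj[i][j] = 1
    return adj

def degrees(adj):
    """Node degrees, one of the simplest network statistics."""
    return [sum(row) for row in adj]
```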

  7. Waste Sampling and Characterization Facility (WSCF) Complex Safety Analysis

    International Nuclear Information System (INIS)

    MELOY, R.T.

    2003-01-01

    The Waste Sampling and Characterization Facility (WSCF) is an analytical laboratory complex on the Hanford Site that was constructed to perform chemical and low-level radiological analyses on a variety of sample media in support of Hanford Site customer needs. The complex is located in the 600 area of the Hanford Site, east of the 200 West Area. Customers include effluent treatment facilities, waste disposal and storage facilities, and remediation projects. Customers primarily need analysis results for process control and to comply with federal, Washington State, and U.S. Department of Energy (DOE) environmental or industrial hygiene requirements. This document was prepared to analyze the facility for safety consequences and includes the following steps: determine radionuclide and highly hazardous chemical inventories; compare these inventories to the appropriate regulatory limits; document the compliance status with respect to these limits; and identify the administrative controls necessary to maintain this status.

  8. Complex segregation analysis of craniomandibular osteopathy in Deutsch Drahthaar dogs.

    Science.gov (United States)

    Vagt, J; Distl, O

    2018-01-01

    This study investigated familial relationships among Deutsch Drahthaar dogs with craniomandibular osteopathy and examined the most likely mode of inheritance. Sixteen Deutsch Drahthaar dogs with craniomandibular osteopathy were diagnosed using clinical findings, radiography or computed tomography. All 16 dogs with craniomandibular osteopathy had one common ancestor. Complex segregation analyses rejected models explaining the segregation of craniomandibular osteopathy through random environmental variation, monogenic inheritance or an additive sex effect. Polygenic and mixed major gene models sufficiently explained the segregation of craniomandibular osteopathy in the pedigree analysis and offered the most likely hypotheses. The SLC37A2:c.1332C>T variant was not found in a sample of Deutsch Drahthaar dogs with craniomandibular osteopathy, nor in healthy controls. Craniomandibular osteopathy is an inherited condition in Deutsch Drahthaar dogs and the inheritance seems to be more complex than a simple Mendelian model. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Fault Correspondence Analysis in Complex Electric Power Systems

    Directory of Open Access Journals (Sweden)

    WANG, C.

    2015-02-01

    Full Text Available Wide area measurement system (WAMS) mainly serves the requirement of time synchronization in complex electric power systems. The analysis and control of a power system depend largely on measurements of state variables, and WAMS provides the basis for dynamic monitoring of the power system through these measurements, which can also satisfy the demands of observability, controllability, real-time analysis and decision-making, self-adaptation, etc. requested by the smart grid. In this paper, based on the principles of fault correspondence analysis, by calculating the row characteristic, which represents nodal electrical information, and the column characteristic, which represents acquisition-time information, we conduct intensive research on fault detection. The results indicate that the fault location is determined by the first dimensional variable, while the occurrence time of the fault is determined by the second dimensional variable. The research in this paper will contribute to the development of the future smart grid.

  10. Readability, complexity, and suitability analysis of online lymphedema resources.

    Science.gov (United States)

    Tran, Bao Ngoc N; Singh, Mansher; Lee, Bernard T; Rudd, Rima; Singhal, Dhruv

    2017-06-01

    Over 72% of Americans use online health information to assist in health care decision-making. Previous studies of the lymphedema literature have focused only on the reading level of patient-oriented materials online; findings indicate they are too advanced for most patients to comprehend. This more comprehensive study expands the previous analysis beyond readability to include critical elements of health materials, using assessment tools to report on the complexity and density of data as well as text design, vocabulary, and organization. The top 10 highest-ranked websites on lymphedema were identified using the most popular search engine (Google). Website content was analyzed for readability, complexity, and suitability using the Simple Measure of Gobbledygook (SMOG), PMOSE/iKIRSCH, and the Suitability Assessment of Materials (SAM), respectively. PMOSE/iKIRSCH and SAM were performed by two independent raters, and Fleiss' kappa was calculated to ensure inter-rater reliability. Online lymphedema literature had a reading grade level of 14.0 (SMOG). The overall complexity score was 6.7 (PMOSE/iKIRSCH), corresponding to "low" complexity and requiring an 8th-12th grade education; Fleiss' kappa was 80% (P = 0.04, "substantial" agreement). The overall suitability score was 45% (SAM), correlating to the lowest level of "adequate" suitability; Fleiss' kappa was 76% (P = 0.06, "substantial" agreement). Online resources for lymphedema are above the recommended levels for readability and complexity, and their suitability is barely adequate for the intended audience. Overall, these materials are too sophisticated for the average American adult, whose literacy skills are well documented. Further efforts to revise these materials are needed to improve patient comprehension and understanding. Copyright © 2017 Elsevier Inc. All rights reserved.
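    The SMOG measure used in the study has a published closed form: grade = 1.0430 · sqrt(polysyllables · 30 / sentences) + 3.1291, where "polysyllables" counts words of three or more syllables. A sketch (the syllable counter here is a rough vowel-group heuristic, not the validated counting procedure):

```python
import math
import re

def smog_grade(n_polysyllables, n_sentences):
    """McLaughlin's SMOG readability grade from counts of 3+-syllable
    words and sentences in the sampled text."""
    return 1.0430 * math.sqrt(n_polysyllables * 30.0 / n_sentences) + 3.1291

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels (y included).
    Real SMOG scoring uses dictionary-grade syllabification."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))
```

    For example, a 30-sentence sample containing 30 polysyllabic words scores roughly grade 8.8, far below the grade 14.0 the study reports for online lymphedema materials.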

  11. Spontaneous brain network activity: Analysis of its temporal complexity

    Directory of Open Access Journals (Sweden)

    Mangor Pedersen

    2017-06-01

    Full Text Available The brain operates in a complex way. The temporal complexity underlying macroscopic and spontaneous brain network activity is still to be understood. In this study, we explored the brain's complexity by combining functional connectivity, graph theory, and entropy analyses in 25 healthy people using task-free functional magnetic resonance imaging. We calculated the pairwise instantaneous phase synchrony between 8,192 brain nodes for a total of 200 time points. This resulted in graphs for which time series of clustering coefficients (the "cliquiness" of a node) and participation coefficients (the between-module connectivity of a node) were estimated. For these two network metrics, sample entropy was calculated. The procedure produced a number of results: (1) entropy is higher for the participation coefficient than for the clustering coefficient; (2) the average clustering coefficient is negatively related to its associated entropy, whereas the average participation coefficient is positively related to its associated entropy; (3) the level of entropy is network-specific for the participation coefficient, but not for the clustering coefficient. High entropy for the participation coefficient was observed in the default-mode, visual, and motor networks. These results were further validated using an independent replication dataset. Our work confirms that brain networks are temporally complex. Entropy is a good candidate metric to explore temporal network alterations in diseases with paroxysmal brain disruptions, including schizophrenia and epilepsy. In recent years, connectomics has provided significant insights into the topological complexity of brain networks. However, the temporal complexity of brain networks still remains somewhat poorly understood. In this study we used entropy analysis to demonstrate that the properties of network segregation (the clustering coefficient) and integration (the participation coefficient) are temporally complex.
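    Sample entropy, the metric the study applies to the network time series, admits a compact definition: SampEn = −ln(A/B), where B counts pairs of length-m templates matching within tolerance r (Chebyshev distance) and A counts length-(m+1) matches, self-matches excluded. A minimal sketch (the parameter defaults are illustrative assumptions, not the study's settings):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a scalar series: -ln(A/B). Low values indicate
    regular, predictable dynamics; higher values indicate complexity."""
    n = len(x)

    def count(mm):
        # Count template pairs of length mm within tolerance r,
        # excluding self-matches (j > i).
        c = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c

    B = count(m)
    A = count(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")
```

    A strictly alternating series such as 0, 1, 0, 1, ... is highly regular and yields an entropy close to zero.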

  12. Complex experimental analysis of rifle-shooter interaction

    Directory of Open Access Journals (Sweden)

    Michał Taraszewski, M.ScEng, PhD. candidate

    2017-10-01

    Full Text Available In this study, a complex analysis of the man-weapon interaction, based on experimental work, is presented. Attention is focused on how the shooter can influence the rifle, as opposed to the rifle's impact on the shooter, which is what the literature generally considers. It is shown, using the kbk AKM weapon, that each support point of the rifle has a substantial impact on the system. Identifying the human reactions to the weapon may make it possible to describe the gun's movement and may thus be applied to determining weapon accuracy.

  13. CFD three dimensional wake analysis in complex terrain

    Science.gov (United States)

    Castellani, F.; Astolfi, D.; Terzi, L.

    2017-11-01

    Even if wind energy technology is nowadays fully developed, the use of wind energy in very complex terrain is still challenging. In particular, it is challenging to characterize the combined effects of wind flow over complex terrain and of wake interactions between nearby turbines, and this has practical relevance too, from the perspective of mitigating anomalous vibrations and loads as well as improving farm efficiency. In this work, a very complex terrain site has been analyzed through a Reynolds-averaged CFD (Computational Fluid Dynamics) numerical wind field model; in the simulation, the influence of wakes has been included through the Actuator Disk (AD) approach. In particular, the upstream turbine of a cluster of 4 wind turbines having 2.3 MW of rated power is studied. The objective of this study is to investigate the full three-dimensional wind field and the impact of three-dimensionality on the evolution of the waked area between nearby turbines. A post-processing method for the output of the CFD simulation is developed, which allows the wake lateral deviation and the wake width to be estimated. The reliability of the numerical approach is inspired by and cross-checked against the analysis of the operational SCADA (Supervisory Control and Data Acquisition) data of the cluster of interest.

  14. A low complexity visualization tool that helps to perform complex systems analysis

    International Nuclear Information System (INIS)

    Beiro, M G; Alvarez-Hamelin, J I; Busch, J R

    2008-01-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large-scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. Whereas in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; this property is useful, e.g., to measure robustness or quality-of-service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.

  15. A low complexity visualization tool that helps to perform complex systems analysis

    Science.gov (United States)

    Beiró, M. G.; Alvarez-Hamelin, J. I.; Busch, J. R.

    2008-12-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large-scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. Whereas in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; this property is useful, e.g., to measure robustness or quality-of-service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.
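    The k-core decomposition on which LaNet-vi is built can be computed by iteratively peeling minimum-degree vertices; a minimal sketch of the decomposition itself (not of LaNet-vi's layout algorithm):

```python
from collections import defaultdict

def k_core_decomposition(edges):
    """Assign each vertex its coreness: the largest k such that the
    vertex belongs to the k-core (the maximal subgraph in which every
    vertex has degree at least k). Computed by repeatedly removing a
    minimum-degree vertex and recording the running peel level."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    degree = {v: len(nb) for v, nb in adj.items()}
    core = {}
    remaining = set(adj)
    k = 0
    while remaining:
        v = min(remaining, key=lambda u: degree[u])
        k = max(k, degree[v])  # peel level never decreases
        core[v] = k
        remaining.remove(v)
        for w in adj[v]:
            if w in remaining:
                degree[w] -= 1
    return core
```

    A triangle with a pendant vertex illustrates the result: the pendant has coreness 1, while the three triangle vertices form the 2-core.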

  16. Optimal fatigue analysis of structures during complex loadings

    Directory of Open Access Journals (Sweden)

    Karaouni Habib

    2016-01-01

    Full Text Available A new framework for high-cycle fatigue analysis of metallic structures under complex multi-parameter loadings is developed here. It reduces the analysis to a 2-D window with a characterized one-parameter cyclic loading, thanks to an equivalence rule, relative to damage, between any two loadings. The simplified inelastic analysis introduced by J. Zarka [J. Zarka et al. 1990. A new approach in inelastic analysis of structures. CADLM] was used to find the limit state of the structure. New design rules for fatigue analysis using automatic learning systems were successfully developed. A database was built by coupling numerical simulations and experimental results on several welded specimens, which are treated as general structures in the proposed approach. This was made possible by an intelligent description of a general fatigue case based on current theories and models. A software package, FATPRO [M.I. Systems, FatPro, available at http://www.mzintsys.com/our_products_fatpro.html], based on this work has been developed at MZ Intelligent Systems.

  17. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and into performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment, and decision making for radioactive waste isolation problems are discussed.
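    The code-level derivative calculation that tools such as GRESS and EXAP automate for FORTRAN can be illustrated in miniature with forward-mode dual numbers; this sketch shows the principle (derivatives propagated alongside values, with no finite differencing), not the GRESS mechanism itself:

```python
class Dual:
    """Forward-mode automatic differentiation value: a primal value
    plus its derivative with respect to a chosen input."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagated automatically.
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def sensitivity(f, x):
    """d f / d x at x, exact to machine precision for + and *."""
    return f(Dual(x, 1.0)).der
```

    For f(x) = x² + 3x, the sensitivity at x = 2 is 2x + 3 = 7, computed without perturbing the input.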

  18. Discrete Fourier Transform Analysis in a Complex Vector Space

    Science.gov (United States)

    Dean, Bruce H.

    2009-01-01

    Alternative computational strategies for the Discrete Fourier Transform (DFT) have been developed using analysis of geometric manifolds. This approach provides a general framework for performing DFT calculations, and suggests a more efficient implementation of the DFT for applications using iterative transform methods, particularly phase retrieval. The DFT can thus be implemented using fewer operations than its usual counterpart; the software decreases the run time of the DFT in applications such as phase retrieval that iteratively call the DFT function. The algorithm exploits a special computational approach based on analysis of the DFT as a transformation in a complex vector space. As such, this approach has the potential to realize a DFT computation that approaches N operations, versus N log(N) operations for the equivalent Fast Fourier Transform (FFT) calculation.
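    For reference, the DFT viewed as a linear map on a complex vector space is X[k] = Σₙ x[n]·exp(−2πikn/N); its direct evaluation costs O(N²), which the FFT reduces to O(N log N) and the record's geometric approach seeks to push further:

```python
import cmath

def dft(x):
    """Direct O(N^2) evaluation of the Discrete Fourier Transform:
    each output X[k] is the inner product of the input with one
    basis vector of complex exponentials."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n) for m in range(n))
            for k in range(n)]
```

    Standard sanity checks: a unit impulse transforms to a flat spectrum, and a constant signal concentrates all its energy in the zero-frequency bin.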

  19. Objective high Resolution Analysis over Complex Terrain with VERA

    Science.gov (United States)

    Mayer, D.; Steinacker, R.; Steiner, A.

    2012-04-01

    VERA (Vienna Enhanced Resolution Analysis) is a model-independent, high-resolution objective analysis of meteorological fields over complex terrain. The system consists of a specially developed quality control procedure and a combination of an interpolation and a downscaling technique. Whereas the so-called VERA-QC is presented at this conference in the contribution titled "VERA-QC, an approved Data Quality Control based on Self-Consistency" by Andrea Steiner, this presentation will focus on the method and the characteristics of the VERA interpolation scheme, which enables one to compute grid-point values of a meteorological field based on irregularly distributed observations and topography-related a priori knowledge. Over a complex topography, meteorological fields are generally not smooth. The roughness induced by the topography can be explained physically. Knowledge of this behavior is used to define so-called Fingerprints (e.g. a thermal Fingerprint reproducing heating or cooling over mountainous terrain, or a dynamical Fingerprint reproducing a positive pressure perturbation on the windward side of a ridge) under idealized conditions. If the VERA algorithm recognizes patterns of one or more Fingerprints at a few observation points, the corresponding patterns are used to downscale the meteorological information over a larger surrounding area. This technique makes it possible to achieve an analysis with a resolution much higher than that of the observational network. The interpolation of irregularly distributed stations to a regular grid (in space and time) is based on a variational principle applied to first- and second-order spatial and temporal derivatives. Mathematically, this can be formulated as a cost function equivalent to the penalty function of a thin-plate smoothing spline. After the analysis field has been divided into the Fingerprint components and the unexplained part, respectively, the requirement of a smooth distribution is applied to the
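    The variational principle described, penalizing first- and second-order derivatives in the manner of a thin-plate smoothing spline, has the schematic one-dimensional form below (the notation and weights are assumptions for illustration, not taken from the VERA papers):

```latex
J[f] \;=\; \sum_i \bigl(f(x_i) - o_i\bigr)^2
\;+\; \lambda_1 \int \left(\frac{\partial f}{\partial x}\right)^{\!2} dx
\;+\; \lambda_2 \int \left(\frac{\partial^2 f}{\partial x^2}\right)^{\!2} dx
\;\longrightarrow\; \min,
```

    where o_i are the station observations and the λ weights balance smoothness of the analysis field against fidelity to the data; VERA applies the analogous penalty in space and time.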

  20. Complexity and Vulnerability Analysis of Critical Infrastructures: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Yongliang Deng

    2017-01-01

    Full Text Available Vulnerability analysis of network models has been widely adopted to explore the potential impacts of random disturbances, deliberate attacks, and natural disasters. However, almost all these models are based on a fixed topological structure, in which the physical properties of infrastructure components and their interrelationships are not well captured. In this paper, a new research framework is put forward to quantitatively explore and assess the complexity and vulnerability of critical infrastructure systems. A case study is then presented to demonstrate the feasibility and validity of the proposed framework. After constructing the metro physical network (MPN), Pajek is employed to analyze its topological properties, including degree, betweenness, average path length, network diameter, and clustering coefficient. A comprehensive understanding of the complexity of the MPN can help the metro system prevent near-misses or accidents and support decision-making in emergency situations. Moreover, the analysis of two simulation protocols for system component failure shows that the MPN is vulnerable when high-degree nodes or high-betweenness edges are attacked. These findings will be conducive to offering recommendations and proposals for robust design, risk-based decision-making, and prioritization of risk-reduction investment.
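The degree-targeted attack protocol described above can be sketched in pure Python on a toy rail graph. The station layout and names are hypothetical, not the metro network of the study:

```python
from collections import deque

# Hypothetical toy metro graph: hub "A" plus four radial branches (adjacency list).
metro = {
    "A": ["B", "C", "D", "E"],
    "B": ["A", "F"], "C": ["A", "G"], "D": ["A", "H"], "E": ["A", "I"],
    "F": ["B"], "G": ["C"], "H": ["D"], "I": ["E"],
}

def largest_cluster(adj, removed=frozenset()):
    """Size of the largest connected cluster after deleting `removed` nodes (BFS)."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best

degree = {v: len(nbrs) for v, nbrs in metro.items()}
hub = max(degree, key=degree.get)            # highest-degree station ("A")
baseline = largest_cluster(metro)            # intact network: 9 stations
targeted = largest_cluster(metro, {hub})     # attack the hub
peripheral = largest_cluster(metro, {"F"})   # remove a low-degree terminus
```

Removing the hub fragments the toy network into four two-station clusters, while removing a terminus barely matters, mirroring the degree-targeted vulnerability reported above.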

  1. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. The system has been applied to internal audit, integrating the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a high-importance private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative ones through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to identify points that must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities in its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the objectives of the organization.
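The quali-quantitative step, turning linguistic ratings into numbers and intersecting them, can be sketched with a standard fuzzy-AND (minimum) aggregation. The membership values, audit areas, and criteria below are hypothetical, not taken from the study:

```python
# Hypothetical mapping of linguistic terms to membership degrees in [0, 1].
TERMS = {"low": 0.2, "medium": 0.5, "high": 0.8}

# Hypothetical audit areas rated on two strategic criteria (risk, strategic impact).
ratings = {
    "tuition billing": ("high", "high"),
    "payroll":         ("high", "medium"),
    "fixed assets":    ("low", "medium"),
}

def priority(labels):
    """Fuzzy intersection (AND = min) of the criterion memberships."""
    return min(TERMS[label] for label in labels)

scores = {area: priority(labels) for area, labels in ratings.items()}
ranked = sorted(scores, key=scores.get, reverse=True)  # audit priorities, highest first
```

The min operator is the classical fuzzy conjunction; an area is only a top priority when it rates highly on every criterion at once.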

  2. Stochastic analysis of complex reaction networks using binomial moment equations.

    Science.gov (United States)

    Barzel, Baruch; Biham, Ofer

    2012-09-01

    The stochastic analysis of complex reaction networks is a difficult problem because the number of microscopic states in such systems increases exponentially with the number of reactive species. Direct integration of the master equation is thus infeasible and is most often replaced by Monte Carlo simulations. While Monte Carlo simulations are a highly effective tool, equation-based formulations are more amenable to analytical treatment and may provide deeper insight into the dynamics of the network. Here, we present a highly efficient equation-based method for the analysis of stochastic reaction networks. The method is based on the recently introduced binomial moment equations [Barzel and Biham, Phys. Rev. Lett. 106, 150602 (2011)]. The binomial moments are linear combinations of the ordinary moments of the probability distribution function of the population sizes of the interacting species. They capture the essential combinatorics of the reaction processes reflecting their stoichiometric structure. This leads to a simple and transparent form of the equations, and allows a highly efficient and surprisingly simple truncation scheme. Unlike ordinary moment equations, in which the inclusion of high order moments is prohibitively complicated, the binomial moment equations can be easily constructed up to any desired order. The result is a set of equations that enables the stochastic analysis of complex reaction networks under a broad range of conditions. The number of equations is dramatically reduced from the exponential proliferation of the master equation to a polynomial (and often quadratic) dependence on the number of reactive species in the binomial moment equations. The aim of this paper is twofold: to present a complete derivation of the binomial moment equations; to demonstrate the applicability of the moment equations for a representative set of example networks, in which stochastic effects play an important role.
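The binomial moments mentioned above are expectations of binomial coefficients of the population size, B_k = <C(n, k)>, i.e. linear combinations of the ordinary moments. A minimal numerical sketch, checked against a Poisson distribution (for which the k-th factorial moment is lambda**k, so B_k = lambda**k / k!); this is a generic illustration, not code from the paper:

```python
from math import comb, exp, factorial

def binomial_moment(pmf, k):
    """B_k = <C(n, k)> for a distribution given as pmf[n] = P(n)."""
    return sum(comb(n, k) * p for n, p in enumerate(pmf))

# Truncated Poisson(lam) pmf; the tail beyond n = 60 is negligible for lam = 2.
lam = 2.0
pmf = [exp(-lam) * lam**n / factorial(n) for n in range(60)]

b2 = binomial_moment(pmf, 2)   # expect lam**2 / 2! = 2.0
b3 = binomial_moment(pmf, 3)   # expect lam**3 / 3! = 4/3
```

The same quantity for a multi-species network is a moment of the joint distribution of population sizes, which is what the truncation scheme in the paper exploits.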

  3. Analysis and interpretation of diffraction data from complex, anisotropic materials

    Science.gov (United States)

    Tutuncu, Goknur

    Most materials are elastically anisotropic and exhibit additional anisotropy beyond elastic deformation. For instance, in ferroelectric materials the main inelastic deformation mode is via domains, which are highly anisotropic crystallographic features. To quantify this anisotropy of ferroelectrics, advanced X-ray and neutron diffraction methods were employed. Extensive sets of data were collected from tetragonal BaTiO3, PZT and other ferroelectric ceramics. Data analysis was challenging due to the complex constitutive behavior of these materials. To quantify the elastic strain and texture evolution in ferroelectrics under loading, a number of data analysis techniques such as the single-peak and Rietveld methods were used and their advantages and disadvantages compared. It was observed that the single-peak analysis fails at low peak intensities, especially after domain switching, while the Rietveld method does not account for lattice strain anisotropy although it overcomes the low-intensity problem via whole-pattern analysis. To better account for strain anisotropy, the constant-stress (Reuss) approximation was employed within the Rietveld method and new formulations to estimate lattice strain were proposed. Along the way, new approaches for handling highly anisotropic lattice strain data were also developed and applied. All of the ceramics studied exhibited significant changes in their crystallographic texture after loading, indicating non-180° domain switching. For a full interpretation of domain switching the spherical harmonics method was employed in Rietveld. A procedure for simultaneous refinement of multiple data sets was established for a complete texture analysis. To further interpret the diffraction data, a solid mechanics model based on the self-consistent approach was used to calculate lattice strain and texture evolution during the loading of a polycrystalline ferroelectric. The model estimates both the macroscopic average response of a specimen and its hkl

  4. Spectroscopic analysis of the powdery complex chitosan-iodine

    Science.gov (United States)

    Gegel, Natalia O.; Babicheva, Tatyana S.; Belyakova, Olga A.; Lugovitskaya, Tatyana N.; Shipovskaya, Anna B.

    2018-04-01

    A chitosan-iodine complex was obtained by modification of polymer powder in the vapor of an iodine-containing sorbate and studied by electron and IR spectroscopy and optical rotatory dispersion. It was found that the electronic spectra of an aqueous solution of the modified chitosan (the original sample and one stored for a year) showed intense absorption bands of triiodide and iodate ions, and also of polyiodide ions bound to the macromolecule by exciton bonding with charge transfer. Analysis of the IR spectra shows destruction of the network of intramolecular and intermolecular hydrogen bonds in the iodinated chitosan powder in comparison with the original polymer, and the formation of a new chemical substance. For example, the absorption band of deformation vibrations of the hydroxyl group disappears in the modified sample, and that of the protonated amino group shifts toward shorter wavelengths. The intensity of the stretching-vibration band of the glucopyranose ring atoms is significantly reduced. Heating the modified sample at a temperature below the thermal degradation point of the polymer leads to stabilization of the chitosan-iodine complex. Based on these studies, the hydroxyl and amino groups of the aminopolysaccharide have been identified as the centers of retention of polyiodide chains in the chitosan matrix.

  5. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  6. Complex network analysis of state spaces for random Boolean networks

    Energy Technology Data Exchange (ETDEWEB)

    Shreim, Amer [Complexity Science Group, Department of Physics and Astronomy, University of Calgary, Calgary, AB, T2N 1N4 (Canada); Berdahl, Andrew [Complexity Science Group, Department of Physics and Astronomy, University of Calgary, Calgary, AB, T2N 1N4 (Canada); Sood, Vishal [Complexity Science Group, Department of Physics and Astronomy, University of Calgary, Calgary, AB, T2N 1N4 (Canada); Grassberger, Peter [Complexity Science Group, Department of Physics and Astronomy, University of Calgary, Calgary, AB, T2N 1N4 (Canada); Paczuski, Maya [Complexity Science Group, Department of Physics and Astronomy, University of Calgary, Calgary, AB, T2N 1N4 (Canada)

    2008-01-15

    We apply complex network analysis to the state spaces of random Boolean networks (RBNs). An RBN contains N Boolean elements each with K inputs. A directed state space network (SSN) is constructed by linking each dynamical state, represented as a node, to its temporal successor. We study the heterogeneity of these SSNs at both local and global scales, as well as sample-to-sample fluctuations within an ensemble of SSNs. We use in-degrees of nodes as a local topological measure, and the path diversity (Shreim A et al 2007 Phys. Rev. Lett. 98 198701) of an SSN as a global topological measure. RBNs with 2 ≤ K ≤ 5 exhibit non-trivial fluctuations at both local and global scales, while K = 2 exhibits the largest sample-to-sample (possibly non-self-averaging) fluctuations. We interpret the observed 'multi-scale' fluctuations in the SSNs as indicative of the criticality and complexity of K = 2 RBNs. 'Garden of Eden' (GoE) states are nodes on an SSN that have in-degree zero. While in-degrees of non-GoE nodes for K > 1 SSNs can assume any integer value between 0 and 2^N, for K = 1 all the non-GoE nodes in a given SSN have the same in-degree, which is always a power of two.

  7. Complex network analysis of state spaces for random Boolean networks

    International Nuclear Information System (INIS)

    Shreim, Amer; Berdahl, Andrew; Sood, Vishal; Grassberger, Peter; Paczuski, Maya

    2008-01-01

    We apply complex network analysis to the state spaces of random Boolean networks (RBNs). An RBN contains N Boolean elements each with K inputs. A directed state space network (SSN) is constructed by linking each dynamical state, represented as a node, to its temporal successor. We study the heterogeneity of these SSNs at both local and global scales, as well as sample-to-sample fluctuations within an ensemble of SSNs. We use in-degrees of nodes as a local topological measure, and the path diversity (Shreim A et al 2007 Phys. Rev. Lett. 98 198701) of an SSN as a global topological measure. RBNs with 2 ≤ K ≤ 5 exhibit non-trivial fluctuations at both local and global scales, while K = 2 exhibits the largest sample-to-sample (possibly non-self-averaging) fluctuations. We interpret the observed 'multi-scale' fluctuations in the SSNs as indicative of the criticality and complexity of K = 2 RBNs. 'Garden of Eden' (GoE) states are nodes on an SSN that have in-degree zero. While in-degrees of non-GoE nodes for K > 1 SSNs can assume any integer value between 0 and 2^N, for K = 1 all the non-GoE nodes in a given SSN have the same in-degree, which is always a power of two.
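The construction described in the two records above, linking each of the 2^N states to its unique successor and reading off in-degrees and Garden-of-Eden states, is easy to sketch for a small RBN. The values of N, K, and the random seed are arbitrary choices for illustration:

```python
import itertools
import random
from collections import Counter

random.seed(1)
N, K = 4, 2

# Each element i reads K input elements through a random Boolean truth table.
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    """Synchronous update: every element applies its truth table to its inputs."""
    return tuple(
        tables[i][sum(state[inputs[i][j]] << j for j in range(K))]
        for i in range(N)
    )

# Directed state space network (SSN): node -> unique temporal successor.
ssn = {s: step(s) for s in itertools.product((0, 1), repeat=N)}

in_degree = Counter(ssn.values())
goe_states = [s for s in ssn if in_degree[s] == 0]   # Garden-of-Eden states
```

Because every state has exactly one successor, the SSN always has out-degree one; all the heterogeneity studied in the papers lives in the in-degree distribution.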

  8. Method for analysis the complex grounding cables system

    International Nuclear Information System (INIS)

    Ackovski, R.; Acevski, N.

    2002-01-01

    A new iterative method for the analysis of the performance of complex grounding systems (GS) in underground cable power networks with coated and/or uncoated metal-sheathed cables is proposed in this paper. The analyzed grounding system consists of the grounding grid of a high-voltage (HV) supplying transformer station (TS), middle-voltage/low-voltage (MV/LV) consumer TSs, and an arbitrary number of power cables connecting them. The derived method takes into consideration the voltage drops in the cable sheaths and the mutual influence among all earthing electrodes due to resistive coupling through the soil. By means of the presented method it is possible to calculate the main grounding-system performance characteristics, such as earth electrode potentials under short-circuit fault-to-ground conditions, earth fault current distribution in the whole complex grounding system, step and touch voltages in the vicinity of the earthing electrodes dissipating the fault current in the earth, impedances (resistances) to ground of all possible fault locations, apparent shield impedances to ground of all power cables, etc. The proposed method is based on the admittance summation method [1] and is appropriately extended so that it takes into account resistive coupling between the elements that form the GS. (Author)

  9. Weighted Complex Network Analysis of Shanghai Rail Transit System

    Directory of Open Access Journals (Sweden)

    Yingying Xing

    2016-01-01

    Full Text Available With increasing passenger flows and construction scale, the Shanghai rail transit system (RTS) has entered a new era of networked operation. The structure and properties of the RTS network have great implications for urban traffic planning, design, and management. Thus, it is necessary to understand its network properties and their impacts. In this paper, the Shanghai RTS, as well as its passenger flows, is investigated using complex network theory. Both the topological and dynamic properties of the RTS network are analyzed, and the largest connected cluster is introduced to assess the reliability and robustness of the RTS network. Simulation results show that the distribution of node strength exhibits power-law behavior and that the Shanghai RTS network shows a strong weighted rich-club effect. This study also indicates that intentional attacks are more detrimental to the RTS network than to a random weighted network, whereas random attacks cause slightly more damage to the random weighted network than to the RTS network. Our results provide a richer view of complex weighted networks in the real world and open possibilities of risk analysis and policy decisions for the RTS operating department.

  10. Complexity Analysis of Industrial Organizations Based on a Perspective of Systems Engineering Analysts

    Directory of Open Access Journals (Sweden)

    I. H. Garbie

    2011-12-01

    Full Text Available Complexity in industrial organizations has become more difficult to analyze and resolve, and it needs more attention from academics and practitioners. For these reasons, complexity in industrial organizations represents a major challenge for the coming decades. The analysis of industrial organization complexity remains a research topic of immense international interest, and such organizations require a reduction of their complexity. In this paper, the analysis of complexity in industrial organizations is presented from the perspective of a systems engineering analyst. In this perspective, the analysis was divided into different levels, defined as complexity levels. A framework for analyzing these levels was proposed based on the complexity in industrial organizations. The analysis addresses four main issues: industrial system vision, industrial system structure, industrial system operation, and industrial system evaluation. It shows that the complexity of industrial organizations is still an ill-structured, multi-dimensional problem.

  11. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    Belman-Flores, J.M.; Riesco-Avila, J.M.; Gallegos-Munoz, A.; Navarro-Esbri, J.; Aceves, S.M.

    2009-01-01

    We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
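The loop count invoked above follows from elementary graph topology: the number of independent loops of a flow diagram is E - V + C (edges minus vertices plus connected components), the dimension of the graph's cycle space. A minimal sketch with a hypothetical flow graph, not the authors' matrix formulation:

```python
def independent_loops(nodes, edges):
    """Cycle-space dimension: E - V + (number of connected components)."""
    parent = {n: n for n in nodes}

    def find(n):                     # union-find with path compression
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n

    for a, b in edges:
        parent[find(a)] = find(b)

    components = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + components

# Hypothetical two-loop diagram: components A..D in a ring plus a bypass A-C.
nodes = ["A", "B", "C", "D"]
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("A", "C")]
loops = independent_loops(nodes, edges)   # 5 - 4 + 1 = 2
```

In the paper's setting each independent loop supplies one extra equation needed to close the otherwise redundant balance system.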

  12. Fractal dimension analysis of complexity in Ligeti piano pieces

    Science.gov (United States)

    Bader, Rolf

    2005-04-01

    Fractal correlation dimension analysis has been performed on whole solo piano pieces by György Ligeti at every 50 ms interval of the pieces. The resulting curves of the development of complexity, represented by the fractal dimension, showed a very reasonable correlation with the perceptual density of events during these pieces. The seventh piece of Ligeti's ``Musica ricercata'' was used as a test case. Here, each new part of the piece was followed by an increase of the fractal dimension because of the increase of information at the part changes. The second piece used was ``Galamb borong,'' number seven of the piano Etudes, because Ligeti wrote these Etudes after studying fractal geometry. Although the piece is not fractal in the strict mathematical sense, the overall structure of the psychoacoustic event density as well as the detailed event development is represented by the fractal dimension plot.
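Fractal correlation dimension analysis of a signal is typically based on the Grassberger-Procaccia correlation sum over a delay embedding. The sketch below uses a synthetic sine signal and illustrative embedding parameters, not Ligeti's audio or the author's code:

```python
import math

def correlation_sum(points, r):
    """Fraction of point pairs closer than r: the Grassberger-Procaccia C(r)."""
    n, close = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                close += 1
    return 2.0 * close / (n * (n - 1))

# Delay embedding (dimension 2, delay 1) of a synthetic signal.
signal = [math.sin(0.1 * i) for i in range(300)]
points = [(signal[i], signal[i + 1]) for i in range(len(signal) - 1)]

radii = [0.05, 0.1, 0.2, 0.4]
cs = [correlation_sum(points, r) for r in radii]
# The correlation dimension estimate is the log-log slope of C(r) vs r
# over the scaling range of radii.
slope = math.log(cs[2] / cs[1]) / math.log(radii[2] / radii[1])
```

For windowed audio, C(r) is computed per window and the slope per window gives the complexity curve described in the abstract.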

  13. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  14. Lempel-Ziv complexity analysis of one dimensional cellular automata.

    Science.gov (United States)

    Estevez-Rams, E; Lora-Serrano, R; Nunes, C A J; Aragón-Fernández, B

    2015-12-01

    The Lempel-Ziv complexity measure has been used to estimate the entropy density of a string. It is defined as the number of factors in a production factorization of the string. In this contribution, we show that its use can be extended, by using the normalized information distance, to study the spatiotemporal evolution of random initial configurations under cellular automata rules. In particular, the transfer of information between time-consecutive configurations is studied, as well as the sensitivity to perturbed initial conditions. The behavior of the cellular automata rules can be grouped into different classes, but no single grouping captures the whole nature of the rules involved. The analysis carried out is particularly appropriate for studying the computational processing capabilities of cellular automata rules.
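The factor-counting complexity described above (the LZ76 exhaustive-history variant) can be sketched in a few lines. This is a generic implementation, not the authors' code:

```python
def lz_complexity(s):
    """LZ76 complexity: number of phrases in the exhaustive-history
    factorization, where each new phrase extends, by one symbol, the longest
    factor already occurring as a substring of the preceding text."""
    n, i, phrases = len(s), 0, 0
    while i < n:
        length = 1
        # Grow the phrase while it still occurs in the text before its last symbol.
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

# Classic example from Lempel and Ziv (1976): parses as 0|001|10|100|1000|101.
c = lz_complexity("0001101001000101")   # 6
```

Applied to a cellular-automaton row encoded as a 0/1 string, this count (normalized by n / log n) estimates the entropy density the abstract refers to.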

  15. Multifractal analysis of forest fires in complex regions

    Science.gov (United States)

    Vega Orozco, C. D.; Kanevski, M.; Golay, J.; Tonini, M.; Conedera, M.

    2012-04-01

    Forest fires can be studied as point processes, where the ignition points represent the set of locations of the observed events in a defined study region. Their spatial and temporal patterns can be characterized by their fractal properties, which quantify the global aspect of the geometry of the support data. However, a monofractal dimension cannot completely describe the pattern structure and the related scaling properties. Developments in fractal theory have produced the multifractal concept, which describes measures from which interlinked fractal sets can be retrieved and characterized by their fractal dimension and singularity strength [1, 2]. The spatial variability of forest fires is conditioned by an intermixture of human, topographic, meteorological and vegetation factors. This heterogeneity makes fire patterns complex scale-invariant processes that are difficult to describe at a single scale. Therefore, this study proposes an exploratory data analysis through a multifractal formalism to characterize and quantify the multiscaling behaviour of the spatial distribution pattern of this phenomenon in a complex region like the Swiss Alps. The studied dataset consists of 2,401 georeferenced forest fire ignition points in canton Ticino, Switzerland, over a 40-year period from 1969 to 2008. Three multifractal analyses are performed: the first assesses the multiscaling behaviour of the fire occurrence probability of the support data (raw data) and of four random patterns simulated within three different support domains; the second studies the multifractal behaviour of patterns from anthropogenic and naturally ignited fires (arson-, accident- and lightning-caused fires); and the third aims at detecting scale dependency of the size of the burned area. To calculate the generalized dimensions, Dq, a generalization of the box-counting method is carried out based on the Rényi information of the qth-order moment of the probability distribution. For q > 0, Dq
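The generalized (Rényi) dimensions come from the qth-moment partition sum over boxes of size eps: D_q = (1/(q-1)) * log(sum_i p_i**q) / log(eps), with the q -> 1 limit D_1 = (sum_i p_i log p_i) / log(eps). A single-scale sketch with an illustrative box-probability vector; a real analysis, like the study's, regresses the partition sums over many scales eps:

```python
import math

def renyi_dimension(p, q, eps):
    """Generalized dimension D_q at a single box size eps,
    for box probabilities p summing to 1."""
    if abs(q - 1.0) < 1e-12:   # information dimension, the q -> 1 limit
        return sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(eps)
    return ((1.0 / (q - 1.0))
            * math.log(sum(pi ** q for pi in p if pi > 0)) / math.log(eps))

# Illustrative non-uniform measure on 4 boxes of size eps = 1/4.
p, eps = [0.5, 0.25, 0.125, 0.125], 0.25

d0 = renyi_dimension(p, 0, eps)   # capacity (box-counting) dimension: 1.0
d1 = renyi_dimension(p, 1, eps)   # information dimension: 0.875
d2 = renyi_dimension(p, 2, eps)   # correlation dimension
```

For a multifractal measure the D_q are non-increasing in q, so D_0 >= D_1 >= D_2, with equality only for a uniform measure; the spread between them quantifies the heterogeneity the abstract describes.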

  16. Fractal analysis reveals reduced complexity of retinal vessels in CADASIL.

    Directory of Open Access Journals (Sweden)

    Michele Cavallari

    2011-04-01

    Full Text Available The Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL) affects mainly small cerebral arteries and leads to disability and dementia. The relationship between the clinical expression of the disease and the progression of the microvessel pathology is, however, uncertain, as we lack tools for imaging brain vessels in vivo. Ophthalmoscopy is regarded as a window into the cerebral microcirculation. In this study we carried out an ophthalmoscopic examination in subjects with CADASIL. Specifically, we performed fractal analysis of digital retinal photographs. Data are expressed as mean fractal dimension (mean-D), a parameter that reflects the complexity of retinal vessel branching. Ten subjects with a genetically confirmed diagnosis of CADASIL and 10 sex- and age-matched control subjects were enrolled. Fractal analysis of retinal digital images was performed by means of a computer-based program, and the data expressed as mean-D. Brain MRI lesion volume in FLAIR and T1-weighted images was assessed using MIPAV software. A paired t-test was used to disclose differences in mean-D between the CADASIL and control groups. Spearman rank analysis was performed to evaluate potential associations between mean-D values and both disease duration and disease severity, the latter expressed as brain MRI lesion volumes, in the subjects with CADASIL. The results showed that the mean-D value of the patients (1.42±0.05; mean±SD) was lower than that of the controls (1.50±0.04; p = 0.002). Mean-D did not correlate with disease duration or with the MRI lesion volumes of the subjects with CADASIL. The findings suggest that fractal analysis is a sensitive tool to assess changes in retinal vessel branching, likely reflecting early brain microvessel alterations, in CADASIL patients.
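A fractal dimension such as mean-D is usually obtained from a segmented vessel image by box counting: count the boxes of side s containing foreground pixels, then take the slope of log N(s) against log(1/s). A minimal sketch on a synthetic filled image, for which the dimension must come out as 2; this is not the study's retinal data or software:

```python
import math

def box_count(img, s):
    """Number of s-by-s boxes containing at least one foreground pixel."""
    rows, cols = len(img), len(img[0])
    count = 0
    for r0 in range(0, rows, s):
        for c0 in range(0, cols, s):
            if any(img[r][c]
                   for r in range(r0, min(r0 + s, rows))
                   for c in range(c0, min(c0 + s, cols))):
                count += 1
    return count

def box_dimension(img, sizes):
    """Least-squares slope of log N(s) versus log(1/s)."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(img, s)) for s in sizes]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic test image: a filled 64x64 square (a plane-filling set, dimension 2).
img = [[1] * 64 for _ in range(64)]
d = box_dimension(img, [1, 2, 4, 8, 16])
```

A sparse branching pattern like a retinal vessel tree fills boxes less completely at fine scales, which is why its dimension drops below 2 (around 1.4-1.5 in the study).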

  17. Functional analysis for complex systems of nuclear fusion plant

    International Nuclear Information System (INIS)

    Pinna, Tonio; Dongiovanni, Danilo Nicola; Iannone, Francesco

    2016-01-01

    Highlights: • Functional analysis for complex systems. • Functional Flow Block Diagrams (FFBD). • IDEFØ diagrams. • Petri net algorithms. Abstract: In the system engineering context, functional analysis is the systematic process of identifying, describing and correlating the functions a system must perform in order to be successful in any foreseen life-cycle phase or operational state/mode. By focusing on what the system must do while disregarding the implementation, functional analysis supports an unbiased system requirement allocation analysis. The system function architecture is defined in terms of process, protection (interlock) or nuclear safety functions. The system functions are then analyzed from several points of view in order to highlight the various pieces of information defining the way the system is designed to accomplish its mission as defined in the system requirement documents. The process functional flow is identified and represented by Functional Flow Block Diagrams (FFBD), while the system function interfaces are identified and represented by IDEFØ diagrams. Function interfaces are defined as relationships across the identified functions in terms of function inputs (from other functions or requirements), outputs (added value or outcome of the function), controls (from other functions or systems) and mechanisms necessary to fulfill the function. The function architecture is further detailed by considering, for each function: a) the phase of application, b) the actions performed, c) the controlled variables and control actions to be foreseen in the implementation of the function, d) the system involved in the control action, e) the equipment involved in the function, and f) the requirements allocated to the function. The methodology presented here is suggested for the design of fusion facilities and reactors from the first phases of the pre-conceptual design, as is now the case for DEMO.

  18. Synaptonemal complex analysis of X-7 translocations in male mice

    Energy Technology Data Exchange (ETDEWEB)

    Ashley, T. (Univ. of Tennessee, Knoxville); Russell, L.B.; Cacheiro, N.L.A.

    1982-01-01

    The synaptonemal complexes of surface-spread spermatocytes of mice heterozygous for one of two reciprocal translocations (R3 and R5) between the X and chromosome 7 have been examined by light and electron microscopy (EM). The break points of R3 were determined to be at 70% of chromosome 7, as measured from the centromere, and at 22% of the X. Translocation quadrivalents were formed almost exclusively. The break points of R5 were at 21% of chromosome 7, as measured from the centromere, and at 83% of the X. There was little indication that the break in the X interfered with sex-chromosome synapsis between the 7X and Y. Univalent Y's were not observed in R3, and only seldom observed (8-14%) in R5. However, in contrast to R3, R5 formed quadrivalents relatively rarely (20% in the EM study of 100 nuclei), and heteromorphic bivalents of 7X-Y and X7-7 quite frequently (72%). Possible causes of this high bivalent frequency are discussed. Light-microscope (LM) analysis alone was found to be inadequate for interpreting synaptic configurations (quadrivalents vs. bivalents) in R5. The LM analysis was further complicated by the occurrence of nonhomologous synapsis in the heteromorphic bivalents of R5, a phenomenon easily recognized and interpreted in the EM portion of the study.

  19. Functional complexity of the axonal growth cone: a proteomic analysis.

    Directory of Open Access Journals (Sweden)

    Adriana Estrada-Bernal

    Full Text Available The growth cone, the tip of the emerging neurite, plays a crucial role in establishing the wiring of the developing nervous system. We performed an extensive proteomic analysis of axonal growth cones isolated from the brains of fetal Sprague-Dawley rats. Approximately 2000 proteins were identified at the ≥99% confidence level. Using informatics, including functional annotation clustering and KEGG pathway analysis, we found great diversity of proteins involved in axonal pathfinding, cytoskeletal remodeling, vesicular traffic and carbohydrate metabolism, as expected. We also found a large and complex array of proteins involved in translation, protein folding, posttranslational processing, and proteasome/ubiquitination-dependent degradation. Immunofluorescence studies performed on hippocampal neurons in culture confirmed the presence in the axonal growth cone of proteins representative of these processes. These analyses also provide evidence for the rough endoplasmic reticulum and reveal a reticular structure equipped with Golgi-like functions in the axonal growth cone. Furthermore, Western blotting revealed the enrichment in the growth cone, relative to fetal brain homogenate, of some of the proteins involved in protein synthesis, folding and catabolism. Our study provides a resource for further research and amplifies the relatively recently developed concept that the axonal growth cone is equipped with proteins capable of performing a highly diverse range of functions.

  20. Linkage analysis: Inadequate for detecting susceptibility loci in complex disorders?

    Energy Technology Data Exchange (ETDEWEB)

    Field, L.L.; Nagatomi, J. [Univ. of Calgary, Alberta (Canada)

    1994-09-01

    Insulin-dependent diabetes mellitus (IDDM) may provide valuable clues about approaches to detecting susceptibility loci in other oligogenic disorders. Numerous studies have demonstrated significant association between IDDM and a VNTR in the 5′ flanking region of the insulin (INS) gene. Paradoxically, all attempts to demonstrate linkage of IDDM to this VNTR have failed. Lack of linkage has been attributed to insufficient marker locus information, genetic heterogeneity, or high frequency of the IDDM-predisposing allele in the general population. Tyrosine hydroxylase (TH) is located 2.7 kb from INS on the 5′ side of the VNTR and shows linkage disequilibrium with INS region loci. We typed a highly polymorphic microsatellite within TH in 176 multiplex families, and performed parametric (lod score) linkage analysis using various intermediate reduced penetrance models for IDDM (including rare and common disease allele frequencies), as well as non-parametric (affected sib pair) linkage analysis. The lod scores significantly reject linkage for recombination values of .05 or less, excluding the entire 19 kb region containing TH, the 5′ VNTR, the INS gene, and IGF2 on the 3′ side of INS. Non-parametric linkage analysis also provided no significant evidence for linkage (mean TH allele sharing 52.5%, P = .12). These results have important implications for efforts to locate genes predisposing to complex disorders, strongly suggesting that regions which are significantly excluded by linkage methods may nevertheless contain predisposing genes readily detectable by association methods. We advocate that investigators routinely perform association analyses in addition to linkage analyses.
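The non-parametric result quoted above (mean TH allele sharing 52.5%, P = .12) comes from an affected-sib-pair test of excess identity-by-descent (IBD) sharing. The sketch below is a minimal, hedged illustration of such a test using a normal approximation; the function name and interface are my own choices, not the authors' software.

```python
import math

def affected_sib_pair_test(ibd_counts):
    """One-sided mean test for excess allele sharing in affected sib pairs.

    ibd_counts: list of alleles shared IBD per pair (0, 1, or 2).
    Under the null (no linkage) the expected sharing proportion is 0.5,
    with per-pair variance 1/8 (IBD probabilities 1/4, 1/2, 1/4).
    Returns (mean sharing proportion, z statistic, one-sided p-value).
    """
    n = len(ibd_counts)
    p_hat = sum(ibd_counts) / (2.0 * n)          # proportion of alleles shared
    se = math.sqrt(0.125 / n)                    # sd of per-pair proportion is sqrt(1/8)
    z = (p_hat - 0.5) / se
    p_value = 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail normal probability
    return p_hat, z, p_value
```

A mean sharing of exactly 0.5 gives z = 0 and p = 0.5, i.e. no evidence for linkage, which is the situation the abstract reports for TH.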

  1. Analysis of a complex shape chain plate using Transmission Photoelasticity

    Directory of Open Access Journals (Sweden)

    Dasari N.

    2010-06-01

    Full Text Available Most chains are an assembly [1] of five parts, namely the outer plate, inner plate, bush, pin and roller. Two inner plates are press fitted with two bushes to form an inner block assembly. The outer plates are press fitted with pins after the pins are passed through the assembled bushes of the inner block. The roller is a rotating member placed over the bush during inner block assembly. The inner block assembly is the load transfer member from the sprocket tooth. The outer block assembly helps in holding, and also in pulling, the inner block over the sprocket teeth. If a chain length is an odd number of pitches, an offset plate, as shown in Figure 1, is required to connect the two ends of the chain together to make the chain endless. When the chain is assembled with an offset plate, the chain fatigue life was observed to be only 20 to 25% of the life of a chain assembled without an offset plate. The holes in the offset plate are of the same size as those in the outer and inner plates, and it is a chain plate of complex shape. An inbuilt thinning zone at the centre of the plate, as shown in Figure 1, is unavoidable. The stresses and their distribution in this complex chain plate geometry play a critical role in the fatigue life performance of a chain assembly. However, it is difficult to identify the stress distribution and stress concentration zones precisely using only conventional, industry-friendly tools such as routine quality control tests, breaking load tests and numerical computations. In this context the transmission photoelastic technique has made it possible to identify the stress distribution and its concentration, and also to quantify the stress and strain [2-3] at any point in the chain plate. This paper explains how the transmission photoelastic technique is used to estimate the stress distribution and its concentration zones in a complex chain plate when it is loaded.
An epoxy chain plate model was made through the casting method using a Perspex mould [2

  2. Complexity Management - A multiple case study analysis on control and reduction of complexity costs

    DEFF Research Database (Denmark)

    Myrodia, Anna

    Complexity tends to be arguably the biggest challenge of manufacturing companies. The motivation for further studying complexity is a combination of the existing literature and practical experiences from industry. Following the latest trend, companies are trying to supply a growing mix of products, with features more custom-made to cover individual needs, regarding both product characteristics and support services. This necessity leads to a considerable increase of complexity in the company, which affects the product portfolio, production and supply chain, market segments, IT systems, and business processes. In order to identify and eliminate complexity, several approaches are used, both by researchers and practitioners. The purpose of this thesis is to contribute to the existing knowledge of complexity management theory. This research focuses on the relationship between…

  3. Basic complex analysis a comprehensive course in analysis, part 2a

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2A is devoted to basic complex analysis. It interweaves three analytic threads associated with Cauchy, Riemann, and Weierstrass, respectively. Cauchy's view focuses on th

  4. Advanced complex analysis a comprehensive course in analysis, part 2b

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2B provides a comprehensive look at a number of subjects of complex analysis not included in Part 2A. Presented in this volume are the theory of conformal metrics (includ

  5. Identification and analysis of multi-protein complexes in placenta.

    Directory of Open Access Journals (Sweden)

    Fuqiang Wang

    Full Text Available Placental malfunction induces pregnancy disorders which contribute to life-threatening complications for both the mother and the fetus. Identification and characterization of placental multi-protein complexes is an important step towards an integrated understanding of the protein-protein interaction networks in placenta which determine placental function. In this study, blue native/sodium dodecyl sulfate polyacrylamide gel electrophoresis (BN/SDS-PAGE) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) were used to screen the multi-protein complexes in placenta. 733 unique proteins and 34 known and novel heterooligomeric multi-protein complexes, including mitochondrial respiratory chain complexes, integrin complexes, proteasome complexes, a histone complex, and heat shock protein complexes, were identified. A novel protein complex, which involves clathrin and small conductance calcium-activated potassium (SK) channel protein 2, was identified and validated by antibody-based gel shift assay, co-immunoprecipitation and immunofluorescence staining. These results suggest that BN/SDS-PAGE, when integrated with LC-MS/MS, is a very powerful and versatile tool for the investigation of placental protein complexes. This work paves the way for deeper functional characterization of the placental protein complexes associated with pregnancy disorders.

  6. Complexity of possibly gapped histogram and analysis of histogram

    Science.gov (United States)

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.
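As a loose illustration of the construction described above (not the authors' two-layer Ising formulation): for 1-D data, single-linkage hierarchical clustering reduces to cutting the widest gaps between consecutive sorted points, so each resulting block is a candidate histogram piece and the cuts mark the possible gaps. The helper below is a hypothetical sketch under that simplification.

```python
def gapped_bins(values, n_blocks):
    """Split sorted 1-D data at its largest adjacent gaps.

    Equivalent to single-linkage hierarchical clustering in 1-D: cutting
    the (n_blocks - 1) widest gaps between consecutive sorted points
    yields n_blocks contiguous blocks separated by the candidate gaps.
    Returns a list of (low, high) block ranges.
    """
    xs = sorted(values)
    gaps = [(xs[i + 1] - xs[i], i) for i in range(len(xs) - 1)]
    cuts = sorted(i for _, i in sorted(gaps, reverse=True)[: n_blocks - 1])
    blocks, start = [], 0
    for i in cuts:
        blocks.append((xs[start], xs[i]))
        start = i + 1
    blocks.append((xs[start], xs[-1]))
    return blocks
```

Choosing the number of blocks is exactly where the paper's coding-length/decoding-error Hamiltonian comes in; the sketch simply takes it as an input.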

  7. A microfluidic dialysis device for complex biological mixture SERS analysis

    KAUST Repository

    Perozziello, Gerardo

    2015-08-01

    In this paper, we present a microfluidic device fabricated with a simple and inexpensive process allowing rapid filtering of peptides from a complex mixture. The polymer microfluidic device can be used for sample preparation in biological applications. The device is fabricated by micromilling and solvent assisted bonding, in which a microdialysis membrane (cut-off of 12-14 kDa) is sandwiched in between an upper and a bottom microfluidic chamber. An external frame connects the microfluidic device to external tubes, microvalves and syringe pumps. Bonding strength and interface sealing are pneumatically tested. Microfluidic protocols are also described by using the presented device to filter a sample composed of specific peptides (MW 1553.73 Da, at a concentration of 1.0 ng/μl) derived from the BRCA1 protein, a tumor-suppressor molecule which plays a pivotal role in the development of breast cancer, and albumin (MW 66.5 kDa, at a concentration of 35 μg/μl), the most represented protein in human plasma. The filtered samples coming out from the microfluidic device were subsequently deposited on a SERS (surface enhanced Raman scattering) substrate for further analysis by Raman spectroscopy. By using this approach, we were able to sort the small peptides from the bigger and highly concentrated protein albumin and to detect them by using a label-free technique at a resolution down to 1.0 ng/μl.

  9. Absorption spectra analysis of hydrated uranium(III) complex chlorides

    Science.gov (United States)

    Karbowiak, M.; Gajek, Z.; Drożdżyński, J.

    2000-11-01

    Absorption spectra of powdered samples of hydrated uranium(III) complex chlorides of the formulas NH₄UCl₄·4H₂O and CsUCl₄·3H₂O have been recorded at 4.2 K in the 4000–26 000 cm⁻¹ range. The analysis of the spectra enabled the determination of crystal-field parameters and the assignment of 83 and 77 crystal-field levels for the tetrahydrate and trihydrate, respectively. The energies of the levels were computed by applying a simplified angular overlap model (AOM) as well as a semiempirical Hamiltonian representing the combined atomic and crystal-field interactions. Ab initio calculations enabled the application of a simplified parameterization and the determination of the starting values of the AOM parameters. The results obtained prove that the AOM approach can predict quite well both the structure of the ground multiplet and the positions of the crystal-field levels in the 17 000–25 000 cm⁻¹ range, which is usually obscured by strong f-d bands.

  10. Complex responsibilities : An empirical analysis of responsibilities and technological complexity in Dutch immigration policies

    NARCIS (Netherlands)

    Meijer, A.J.

    2009-01-01

    Complex patterns of (international) co-operation between public and private actors are facilitated by new information and communication technologies. New technological practices challenge current systems of political, public management and frontline staff responsibilities since these

  11. IR and UV spectroscopic analysis of TBP complexes

    International Nuclear Information System (INIS)

    Azzouz, A.; Berrak, A.; Seridi, L.; Attou, M.

    1985-06-01

    The complexity of the TBP molecule and the limited number of references stimulated the preparation of this report. Spectroscopic study of TBP and its complexes in the IR and UV ranges made it possible to elucidate or confirm certain aspects of the solvation phenomenon. In IR spectroscopy, only the stretching band of the P→O bond is characteristic of the complex formed. The position of this band gives sufficient information about the kind and stability of a complex. The TBP electronic spectra are characterized by two bands, band 1 (200-220 nm) and band 2 (268-290 nm), whose intensity ratio (2/1) is about 0.13. The nature of the solvent seems to influence the positions of these bands and that of the inflexion point. Band 2 disappears when TBP is complexed, and the position and intensity of band 1 depend upon the nature of the complex.

  12. Complex assembly, crystallization and preliminary X-ray crystallographic analysis of the human Rod–Zwilch–ZW10 (RZZ) complex

    Energy Technology Data Exchange (ETDEWEB)

    Altenfeld, Anika; Wohlgemuth, Sabine [Max Planck Institute of Molecular Physiology, Otto Hahn Strasse 11, 44227 Dortmund (Germany); Wehenkel, Annemarie [Institut Curie, CNRS UMR 3348/INSERM U1005, Bâtiment 110, Centre Universitaire, 91405 Orsay CEDEX (France); Vetter, Ingrid R. [Max Planck Institute of Molecular Physiology, Otto Hahn Strasse 11, 44227 Dortmund (Germany); Musacchio, Andrea, E-mail: andrea.musacchio@mpi-dortmund.mpg.de [Max Planck Institute of Molecular Physiology, Otto Hahn Strasse 11, 44227 Dortmund (Germany); University of Duisburg-Essen, Universitätstrasse 1, 45141 Essen (Germany)

    2015-03-20

    The 800 kDa complex of the human Rod, Zwilch and ZW10 proteins (the RZZ complex) was reconstituted in insect cells, purified, crystallized and subjected to preliminary X-ray diffraction analysis. The spindle-assembly checkpoint (SAC) monitors kinetochore–microtubule attachment during mitosis. In metazoans, the three-subunit Rod–Zwilch–ZW10 (RZZ) complex is a crucial SAC component that interacts with additional SAC-activating and SAC-silencing components, including the Mad1–Mad2 complex and cytoplasmic dynein. The RZZ complex contains two copies of each subunit and has a predicted molecular mass of ∼800 kDa. Given the low abundance of the RZZ complex in natural sources, its recombinant reconstitution was attempted by co-expression of its subunits in insect cells. The RZZ complex was purified to homogeneity and subjected to systematic crystallization attempts. Initial crystals containing the entire RZZ complex were obtained using the sitting-drop method and were subjected to optimization to improve the diffraction resolution limit. The crystals belonged to space group P3₁ (No. 144) or P3₂ (No. 145), with unit-cell parameters a = b = 215.45, c = 458.7 Å, α = β = 90.0, γ = 120.0°.

  13. Characterization of complex networks : Application to robustness analysis

    NARCIS (Netherlands)

    Jamakovic, A.

    2008-01-01

    This thesis focuses on the topological characterization of complex networks. It specifically focuses on those elementary graph measures that are of interest when quantifying topology-related aspects of the robustness of complex networks. This thesis makes the following contributions to the field of

  14. Vibrational spectroscopy and structural analysis of complex uranium compounds (review)

    International Nuclear Information System (INIS)

    Umreiko, D.S.; Nikanovich, M.V.

    1985-01-01

    The paper reports on the combined application of experimental and theoretical methods of vibrational spectroscopy, together with low-temperature luminescence data, to determine the characteristic features of the formation and structure of complex systems that not only contain ligands directly coordinated to the central uranium atom, but are also associated with extraspherical, polyatomic, electrically charged particles: organic cations. These include uranyl complexes with heterocyclic amines. Compounds of tetrahalouranylates with pyridine and its derivatives were studied, as well as with dipyridyl, quinoline and phenanthroline. Structural schemes are also proposed for other uranyl complexes with protonated heterocyclic amines of more complicated composition, which correctly reflect their spectroscopic properties.

  15. Mass spectral analysis of cationic and neutral technetium complexes

    International Nuclear Information System (INIS)

    Unger, S.E.; McCormick, T.J.; Nunn, A.N.; Treher, E.N.

    1986-01-01

    Cationic and neutral technetium compounds have been characterized by mass spectrometry using a variety of ionization methods. These compounds include octahedral cationic complexes containing phosphorus and arsenic ligands such as DIPHOS and DIARS, and neutral complexes containing PnAO and dimethylglyoxime (DMG) or cyclohexanedione dioxime (CDO) ligands. Boronate esters incorporating methyl and butyl derivatives of the DMG and CDO dioximes represent a new class of seven-coordinate Tc radiopharmaceuticals whose characterization by mass spectrometry has not previously been described. These complexes show promise as myocardial imaging agents. (author)

  16. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-01-01

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a

  17. Challenges in the analysis of complex systems: introduction and overview

    Science.gov (United States)

    Hastings, Harold M.; Davidsen, Jörn; Leung, Henry

    2017-12-01

    One of the main challenges of modern physics is to provide a systematic understanding of systems far from equilibrium exhibiting emergent behavior. Prominent examples of such complex systems include, but are not limited to the cardiac electrical system, the brain, the power grid, social systems, material failure and earthquakes, and the climate system. Due to the technological advances over the last decade, the amount of observations and data available to characterize complex systems and their dynamics, as well as the capability to process that data, has increased substantially. The present issue discusses a cross section of the current research on complex systems, with a focus on novel experimental and data-driven approaches to complex systems that provide the necessary platform to model the behavior of such systems.

  18. A microfluidic dialysis device for complex biological mixture SERS analysis

    KAUST Repository

    Perozziello, Gerardo; Candeloro, Patrizio; Gentile, Francesco T.; Coluccio, Maria Laura; Tallerico, Marco; De Grazia, Antonio; Nicastri, Annalisa; Perri, Angela Mena; Parrotta, Elvira; Pardeo, Francesca; Catalano, Rossella; Cuda, Giovanni; Di Fabrizio, Enzo M.

    2015-01-01

    In this paper, we present a microfluidic device fabricated with a simple and inexpensive process allowing rapid filtering of peptides from a complex mixture. The polymer microfluidic device can be used for sample preparation in biological

  19. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui

    2013-01-01

    We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained
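Both measures compared in this record are straightforward to compute. The sketch below gives compact, simplified versions: a linear-detrend DFA, and an LZ78-style dictionary parse standing in for the usual LZ76 complexity (function names and defaults are mine, not the paper's implementation).

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))                          # integrated profile
    flucts = []
    for n in scales:
        n_seg = len(y) // n
        f2 = []
        for s in range(n_seg):
            seg = y[s * n:(s + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(np.array(scales)), np.log(flucts), 1)
    return slope

def lz_complexity(x):
    """Normalized complexity of the median-binarized series (LZ78-style parse)."""
    med = np.median(x)
    s = ''.join('1' if v > med else '0' for v in x)
    phrases, i, k = set(), 0, 1
    while i + k <= len(s):
        if s[i:i + k] in phrases:
            k += 1
        else:
            phrases.add(s[i:i + k])
            i, k = i + k, 1
    n = len(s)
    return len(phrases) * np.log2(n) / n                   # normalize by n / log2(n)
```

For white noise the DFA exponent is near 0.5, and a periodic series parses into far fewer phrases than a random one, reproducing the qualitative ordering discussed in the abstract.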

  20. Dynamics of vortices in complex wakes: Modeling, analysis, and experiments

    Science.gov (United States)

    Basu, Saikat

    The thesis develops singly-periodic mathematical models for complex laminar wakes which are formed behind vortex-shedding bluff bodies. These wake structures exhibit a variety of patterns as the bodies oscillate or are in close proximity of one another. The most well-known formation comprises two counter-rotating vortices in each shedding cycle and is popularly known as the von Karman vortex street. Of the more complex configurations, as a specific example, this thesis investigates one of the most commonly occurring wake arrangements, which consists of two pairs of vortices in each shedding period. The paired vortices are, in general, counter-rotating and belong to a more general definition of the 2P mode, which involves periodic release of four vortices into the flow. The 2P arrangement can, primarily, be sub-classed into two types: one with a symmetric orientation of the two vortex pairs about the streamwise direction in a periodic domain and the other in which the two vortex pairs per period are placed in a staggered geometry about the wake centerline. The thesis explores the governing dynamics of such wakes and characterizes the corresponding relative vortex motion. In general, for both the symmetric as well as the staggered four vortex periodic arrangements, the thesis develops two-dimensional potential flow models (consisting of an integrable Hamiltonian system of point vortices) that consider spatially periodic arrays of four vortices with their strengths being ±Γ₁ and ±Γ₂. Vortex formations observed in the experiments inspire the assumed spatial symmetry. The models demonstrate a number of dynamic modes that are classified using a bifurcation analysis of the phase space topology, consisting of level curves of the Hamiltonian. Despite the vortex strengths in each pair being unequal in magnitude, some initial conditions lead to relative equilibrium when the vortex configuration moves with invariant size and shape. The scaled comparisons of the
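The Hamiltonian point-vortex models mentioned above rest on the classical Biot-Savart sums. The sketch below shows the free-space form with an RK4 integrator; note that the singly periodic wake models of the thesis replace the 1/(z_k - z_j) kernel with a cotangent, which is not implemented here, so this is only an illustrative baseline.

```python
import numpy as np

def vortex_velocities(z, gamma):
    """Velocity of each point vortex induced by all the others.

    z: complex positions; gamma: circulation strengths. The conjugate
    velocity is dz*/dt = (1/2πi) Σ_{j≠k} Γ_j / (z_k − z_j) in free space.
    """
    dz = z[:, None] - z[None, :]
    np.fill_diagonal(dz, 1.0)            # placeholder to avoid 0/0
    q = gamma[None, :] / dz
    np.fill_diagonal(q, 0.0)             # remove self-interaction
    w_conj = q.sum(axis=1) / (2j * np.pi)
    return np.conj(w_conj)

def step_rk4(z, gamma, dt):
    """One fourth-order Runge-Kutta step of the point-vortex system."""
    k1 = vortex_velocities(z, gamma)
    k2 = vortex_velocities(z + 0.5 * dt * k1, gamma)
    k3 = vortex_velocities(z + 0.5 * dt * k2, gamma)
    k4 = vortex_velocities(z + dt * k3, gamma)
    return z + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
```

A counter-rotating pair with strengths ±Γ and separation d translates rigidly at speed Γ/(2πd), and the circulation-weighted impulse Σ Γ_k z_k is conserved, which serves as a quick sanity check on the integrator.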

  1. Nonlinear complexity analysis of brain FMRI signals in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Moses O Sokunbi

    Full Text Available We investigated the differences in brain fMRI signal complexity in patients with schizophrenia while performing the Cyberball social exclusion task, using measures of sample entropy and the Hurst exponent (H). Thirteen patients meeting Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) criteria for schizophrenia and 16 healthy controls underwent fMRI scanning at 1.5 T. The fMRI data of both groups of participants were pre-processed, the entropy characterized and the Hurst exponent extracted. Whole-brain entropy and H maps of the groups were generated and analysed. The results, after adjusting for age and sex differences, together show that patients with schizophrenia exhibited higher complexity than healthy controls, at mean whole-brain and regional levels. Also, both sample entropy and the Hurst exponent agree that patients with schizophrenia have more complex fMRI signals than healthy controls. These results suggest that schizophrenia is associated with more complex signal patterns when compared to healthy controls, supporting the increase-in-complexity hypothesis, where system complexity increases with age or disease, and is also consistent with the notion that schizophrenia is characterised by a dysregulation of the nonlinear dynamics of underlying neuronal systems.
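Sample entropy, one of the two complexity measures used in this study, can be computed directly from its definition. Below is a minimal sketch; the default tolerance (0.2 times the standard deviation) and the vectorization are common implementation choices of mine, not details taken from the paper.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B counts template
    pairs of length m within Chebyshev tolerance r, and A the pairs that
    still match when the templates are extended to length m + 1.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)                  # common default tolerance
    n = len(x)

    def count_matches(mm):
        # n - m templates for both lengths, as in the standard definition
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += np.sum(d <= r)
        return c

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A regular signal such as a sine wave yields a much lower sample entropy than white noise, which is the ordering the complexity comparisons in the abstract rely on.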

  2. Quantitative analysis of complexes in electron irradiated CZ silicon

    International Nuclear Information System (INIS)

    Inoue, N.; Ohyama, H.; Goto, Y.; Sugiyama, T.

    2007-01-01

    Complexes in helium- or electron-irradiated silicon are quantitatively analyzed by highly sensitive and accurate infrared (IR) absorption spectroscopy. The carbon concentration (1×10¹⁵–1×10¹⁷ cm⁻³) and the helium dose (5×10¹²–5×10¹³ cm⁻²) or electron dose (1×10¹⁵–1×10¹⁷ cm⁻²) are varied by two orders of magnitude, in a relatively low regime compared to previous works. It is demonstrated that the carbon-related complex in commercial-grade silicon with low carbon concentration and low electron dose can be detected clearly. The concentration of these complexes is estimated. It is clarified that the complex configuration and thermal behavior in low-carbon, low-dose samples are simple and almost confined within the individual complex family, compared to those in high-concentration, high-dose samples. The well-established complex behavior in the electron-irradiated samples is compared to that in the He-irradiated samples, obtained by deep level transient spectroscopy (DLTS) or cathodoluminescence (CL), which has a close relation to Si power device performance

  3. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis
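Quantitative common cause failure analysis is often illustrated with the beta-factor model, in which a fraction β of each component's failure probability is assumed to be a shared common cause that defeats all redundant channels at once. This is a standard textbook parameterization offered here for illustration, not necessarily the method of the paper.

```python
def redundant_system_unavailability(q, beta, n):
    """Beta-factor model for an n-fold redundant system.

    q: failure probability of one component; beta: fraction of q that is
    a common cause shared by all n components.

    Independent contribution: all n channels fail independently,
        ((1 - beta) * q) ** n
    Common cause contribution: the shared failure, beta * q.
    The system fails if either occurs.
    """
    independent = ((1 - beta) * q) ** n
    common_cause = beta * q
    return independent + common_cause - independent * common_cause
```

Even a small β quickly dominates: for q = 0.01 and n = 2, the independent term is 10⁻⁴ at β = 0 but the common cause term alone is 10⁻³ at β = 0.1, which is why common cause failures limit the benefit of redundancy.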

  4. Complexity Studies and Security in the Complex World: An Epistemological Framework of Analysis

    Science.gov (United States)

    Mesjasz, Czeslaw

    The impact of systems thinking can be found in numerous security-oriented research works: beginning from the early works on the international system (Pitirim Sorokin, Quincy Wright) and the first models of military conflict and war (Frederick Lanchester, Lewis F. Richardson), through national and military security (the origins of the RAND Corporation), the development of game-theory-based conflict studies, International Relations, and the classical security studies of Morton A. Kaplan and Karl W. Deutsch [Mesjasz 1988], and ending with contemporary ideas of broadened concepts of security proposed by the Copenhagen School [Buzan et al 1998]. At present it may even be stated that the new military and non-military threats to contemporary complex society, such as low-intensity conflicts, regional conflicts, terrorism, environmental disturbances, etc., cannot be grasped without ideas taken from modern complex systems studies.

  5. The complex Langevin analysis of spontaneous symmetry breaking induced by complex fermion determinant

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Yuta [KEK Theory Center, High Energy Accelerator Research Organization,1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Nishimura, Jun [KEK Theory Center, High Energy Accelerator Research Organization,1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Graduate University for Advanced Studies (SOKENDAI),1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2016-12-05

    In many interesting physical systems, the determinant which appears from integrating out fermions becomes complex, and its phase plays a crucial role in the determination of the vacuum. An example of this is QCD at low temperature and high density, where various exotic fermion condensates are conjectured to form. Another example is the Euclidean version of the type IIB matrix model for 10d superstring theory, where spontaneous breaking of the SO(10) rotational symmetry down to SO(4) is expected to occur. When one applies the complex Langevin method to these systems, one encounters the singular-drift problem associated with the appearance of nearly zero eigenvalues of the Dirac operator. Here we propose to avoid this problem by deforming the action with a fermion bilinear term. The results for the original system are obtained by extrapolations with respect to the deformation parameter. We demonstrate the power of this approach by applying it to a simple matrix model, in which spontaneous symmetry breaking from SO(4) to SO(2) is expected to occur due to the phase of the complex fermion determinant. Unlike previous work based on a reweighting-type method, we are able to determine the true vacuum by calculating the order parameters, which agree with the prediction by the Gaussian expansion method.

  6. Complexity analysis of accelerated MCMC methods for Bayesian inversion

    International Nuclear Information System (INIS)

    Hoang, Viet Ha; Schwab, Christoph; Stuart, Andrew M

    2013-01-01

    The Bayesian approach to inverse problems, in which the posterior probability distribution on an unknown field is sampled for the purposes of computing posterior expectations of quantities of interest, is starting to become computationally feasible for partial differential equation (PDE) inverse problems. Balancing the sources of error arising from finite-dimensional approximation of the unknown field, the PDE forward solution map and the sampling of the probability space under the posterior distribution are essential for the design of efficient computational Bayesian methods for PDE inverse problems. We study Bayesian inversion for a model elliptic PDE with an unknown diffusion coefficient. We provide complexity analyses of several Markov chain Monte Carlo (MCMC) methods for the efficient numerical evaluation of expectations under the Bayesian posterior distribution, given data δ. Particular attention is given to bounds on the overall work required to achieve a prescribed error level ε. Specifically, we first bound the computational complexity of ‘plain’ MCMC, based on combining MCMC sampling with linear complexity multi-level solvers for elliptic PDE. Our (new) work versus accuracy bounds show that the complexity of this approach can be quite prohibitive. Two strategies for reducing the computational complexity are then proposed and analyzed: first, a sparse, parametric and deterministic generalized polynomial chaos (gpc) ‘surrogate’ representation of the forward response map of the PDE over the entire parameter space, and, second, a novel multi-level Markov chain Monte Carlo strategy which utilizes sampling from a multi-level discretization of the posterior and the forward PDE. For both of these strategies, we derive asymptotic bounds on work versus accuracy, and hence asymptotic bounds on the computational complexity of the algorithms. In particular, we provide sufficient conditions on the regularity of the unknown coefficients of the PDE and on the
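The "plain" MCMC referred to above is typically a prior-reversible random-walk sampler. The sketch below shows the preconditioned Crank-Nicolson (pCN) variant commonly used for Gaussian-prior Bayesian inverse problems, because its acceptance ratio involves only the data misfit Φ and is robust to the discretization dimension; the function interface and defaults are illustrative choices of mine.

```python
import numpy as np

def pcn_mcmc(phi, dim, n_samples, beta=0.3, rng=None):
    """Preconditioned Crank-Nicolson MCMC for posteriors of the form
    exp(-phi(u)) relative to a standard Gaussian prior N(0, I).

    The pCN proposal v = sqrt(1 - beta^2) u + beta xi is reversible with
    respect to the prior, so the accept/reject step depends only on the
    misfit phi, not on the prior density.
    """
    rng = rng or np.random.default_rng(0)
    u = np.zeros(dim)
    chain = np.empty((n_samples, dim))
    for k in range(n_samples):
        v = np.sqrt(1 - beta**2) * u + beta * rng.standard_normal(dim)
        if np.log(rng.uniform()) < phi(u) - phi(v):   # accept with min(1, e^{phi(u)-phi(v)})
            u = v
        chain[k] = u
    return chain
```

As a sanity check, with prior N(0, 1) and misfit Φ(u) = ½(u − 1)² (a single noisy observation y = 1 with unit noise variance) the posterior is N(0.5, 0.5), and the chain mean should settle near 0.5.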

  7. Lability of nanoparticulate metal complexes in electrochemical speciation analysis

    DEFF Research Database (Denmark)

    van Leeuwen, Herman P.; Town, Raewyn M.

    2016-01-01

    Lability concepts are elaborated for metal complexes with soft (3D) and hard (2D) aqueous nanoparticles. In the presence of a non-equilibrium sensor, e.g. a voltammetric electrode, the notion of lability for nanoparticulate metal complexes, M-NP, reflects the ability of the M-NP to maintain...... equilibrium with the reduced concentration of the electroactive free M2+ in its diffusion layer. Since the metal ion binding sites are confined to the NP body, the conventional reaction layer in the form of a layer adjacent to the electrode surface is immaterial. Instead an intraparticulate reaction zone may...... of the electrochemical technique is crucial in the lability towards the electrode surface. In contrast, for nanoparticulate complexes it is the dynamics of the exchange of the electroactive metal ion with the surrounding medium that governs the effective lability towards the electrode surface....

  8. Qualitative analysis and control of complex neural networks with delays

    CERN Document Server

    Wang, Zhanshan; Zheng, Chengde

    2016-01-01

    This book focuses on the stability of the dynamical neural system, synchronization of the coupling neural system and their applications in automation control and electrical engineering. The redefined concept of stability, synchronization and consensus are adopted to provide a better explanation of the complex neural network. Researchers in the fields of dynamical systems, computer science, electrical engineering and mathematics will benefit from the discussions on complex systems. The book will also help readers to better understand the theory behind the control technique and its design.

  9. Simulation and Analysis of Complex Biological Processes: an Organisation Modelling Perspective

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2005-01-01

    This paper explores how the dynamics of complex biological processes can be modelled and simulated as an organisation of multiple agents. This modelling perspective identifies organisational structure occurring in complex decentralised processes and handles complexity of the analysis of the dynamics

  10. Complexity and Entropy Analysis of DNMT1 Gene

    Science.gov (United States)

    Background: The application of complexity information on DNA sequence and protein in biological processes are well established in this study. Available sequences for DNMT1 gene, which is a maintenance methyltransferase is responsible for copying DNA methylation patterns to the daughter strands durin...

  11. Phenylketonuria and Complex Spatial Visualization: An Analysis of Information Processing.

    Science.gov (United States)

    Brunner, Robert L.; And Others

    1987-01-01

    The study of the ability of 16 early treated phenylketonuric (PKU) patients (ages 6-23 years) to solve complex spatial problems suggested that choice of problem-solving strategy, attention span, and accuracy of mental representation may be affected in PKU patients, despite efforts to maintain well-controlled phenylalanine concentrations in the…

  12. Application of functional derivatives to analysis of complex systems

    Czech Academy of Sciences Publication Activity Database

    Beran, Zdeněk; Čelikovský, Sergej

    2013-01-01

    Roč. 350, č. 10 (2013), s. 2982-2993 ISSN 0016-0032 R&D Projects: GA ČR GA13-20433S Institutional support: RVO:67985556 Keywords : complex systems * linear equation * modeling Subject RIV: BC - Control Systems Theory Impact factor: 2.260, year: 2013 http://library.utia.cas.cz/separaty/2013/TR/beran-0398123.pdf

  13. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
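
    The two key measures named above, the Lempel-Ziv phrase count behind the Kolmogorov complexity estimate and the Bandt-Pompe permutation entropy, can be sketched compactly. The snippet is a minimal illustration (short made-up series, median binarization, no length normalization of the phrase count), not the authors' exact pipeline.

```python
import math

def lempel_ziv_complexity(bits):
    """Number of distinct phrases in the LZ76 parsing of a binary string,
    the counting step behind the Kolmogorov complexity estimator."""
    i, c, n = 0, 0, len(bits)
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the preceding text
        while i + l <= n and bits[i:i + l] in bits[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def permutation_entropy(series, order=3):
    """Normalized Shannon entropy of ordinal patterns (Bandt-Pompe)."""
    counts = {}
    for i in range(len(series) - order + 1):
        pattern = tuple(sorted(range(order), key=lambda k: series[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

# Binarize a (made-up) monthly flow series around its median, as is
# commonly done before Lempel-Ziv counting.
flow = [4, 8, 6, 1, 9, 2, 7, 3, 5, 8, 2, 6]
median = sorted(flow)[len(flow) // 2]
bits = ''.join('1' if x >= median else '0' for x in flow)
c = lempel_ziv_complexity(bits)
pe = permutation_entropy(flow)
```

    On real river records the phrase count is normalized by its expected value for a random sequence of the same length, which is what makes the lower/upper KL values comparable across series.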

  14. Native Liquid Extraction Surface Analysis Mass Spectrometry: Analysis of Noncovalent Protein Complexes Directly from Dried Substrates

    Science.gov (United States)

    Martin, Nicholas J.; Griffiths, Rian L.; Edwards, Rebecca L.; Cooper, Helen J.

    2015-08-01

    Liquid extraction surface analysis (LESA) mass spectrometry is a promising tool for the analysis of intact proteins from biological substrates. Here, we demonstrate native LESA mass spectrometry of noncovalent protein complexes of myoglobin and hemoglobin from a range of surfaces. Holomyoglobin, in which apomyoglobin is noncovalently bound to the prosthetic heme group, was observed following LESA mass spectrometry of myoglobin dried onto glass and polyvinylidene fluoride surfaces. Tetrameric hemoglobin [(αβ)2 4H] was observed following LESA mass spectrometry of hemoglobin dried onto glass and polyvinylidene fluoride (PVDF) surfaces, and from dried blood spots (DBS) on filter paper. Heme-bound dimers and monomers were also observed. The `contact' LESA approach was particularly suitable for the analysis of hemoglobin tetramers from DBS.

  15. Special Year on Complex Analysis held at the University of Maryland

    CERN Document Server

    1987-01-01

    The past several years have witnessed a striking number of important developments in Complex Analysis. One of the characteristics of these developments has been to bridge the gap existing between the theory of functions of one and of several complex variables. The Special Year in Complex Analysis at the University of Maryland, and these proceedings, were conceived as a forum where these new developments could be presented and where specialists in different areas of complex analysis could exchange ideas. These proceedings contain both surveys of different subjects covered during the year as well as many new results and insights. The manuscripts are accessible not only to specialists but to a broader audience. Among the subjects touched upon are Nevanlinna theory in one and several variables, interpolation problems in Cn, estimations and integral representations of the solutions of the Cauchy-Riemann equations, the complex Monge-Ampère equation, geometric problems in complex analysis in Cn, applications of com...

  16. Application of «Sensor signal analysis network» complex for distributed, time synchronized analysis of electromagnetic radiation

    Science.gov (United States)

    Mochalov, Vladimir; Mochalova, Anastasia

    2017-10-01

    The paper considers a developing software-hardware complex «Sensor signal analysis network» for distributed and time-synchronized analysis of electromagnetic radiation. The areas of application and the main features of the complex are described. An example of applying the complex to monitor natural electromagnetic radiation sources is considered, based on data recorded in the VLF range. A generalized functional scheme of stream analysis of signals by a complex functional node is suggested, and its application to the stream detection of atmospherics, whistlers and tweeks is considered.

  17. Variable structure control of complex systems analysis and design

    CERN Document Server

    Yan, Xing-Gang; Edwards, Christopher

    2017-01-01

    This book systematizes recent research work on variable-structure control. It is self-contained, presenting necessary mathematical preliminaries so that the theoretical developments can be easily understood by a broad readership. The text begins with an introduction to the fundamental ideas of variable-structure control pertinent to their application in complex nonlinear systems. In the core of the book, the authors lay out an approach, suitable for a large class of systems, that deals with system uncertainties with nonlinear bounds. Its treatment of complex systems in which limited measurement information is available makes the results developed convenient to implement. Various case-study applications are described, from aerospace, through power systems to river pollution control with supporting simulations to aid the transition from mathematical theory to engineering practicalities. The book addresses systems with nonlinearities, time delays and interconnections and considers issues such as stabilization, o...

  18. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-02-28

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.

  19. Analysis of Linux kernel as a complex network

    International Nuclear Information System (INIS)

    Gao, Yichao; Zheng, Zheng; Qin, Fangyun

    2014-01-01

    An operating system (OS) acts as an intermediary between software and hardware in computer-based systems. In this paper, we analyze the core of the typical Linux OS, the Linux kernel, as a complex network to investigate its underlying design principles. It is found that the Linux Kernel Network (LKN) is a directed network whose out-degree follows an exponential distribution while the in-degree follows a power-law distribution. The correlation between topology and functions is also explored, by which we find that LKN is a highly modularized network with 12 key communities. Moreover, we investigate the robustness of LKN under random failures and intentional attacks. The result shows that the failure of large in-degree nodes providing basic services will do more damage to the whole system. Our work may shed some light on the design of complex software systems
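
    The degree statistics described can be computed from any directed call graph. The sketch below uses a tiny hypothetical graph whose function names are illustrative only, not extracted from the kernel.

```python
from collections import Counter

# A toy directed "call graph": edge u -> v means function u calls v.
# (Hypothetical names; the paper builds this graph from the Linux kernel.)
calls = {
    "sys_read":   ["vfs_read"],
    "sys_write":  ["vfs_write"],
    "vfs_read":   ["kmalloc", "mutex_lock"],
    "vfs_write":  ["kmalloc", "mutex_lock"],
    "do_fork":    ["kmalloc"],
    "kmalloc":    [],
    "mutex_lock": [],
}

out_degree = {u: len(vs) for u, vs in calls.items()}
in_degree = Counter(v for vs in calls.values() for v in vs)

# Distribution P(k): fraction of nodes with in-degree k. In the paper the
# in-degree follows a power law (a few heavily reused "basic service"
# nodes, like kmalloc here), while the out-degree falls off exponentially.
nodes = list(calls)
in_dist = Counter(in_degree.get(u, 0) for u in nodes)
p_in = {k: cnt / len(nodes) for k, cnt in in_dist.items()}
```

    The robustness experiments in the paper then amount to deleting the highest in-degree nodes first and measuring how the reachable structure degrades, versus deleting nodes uniformly at random.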

  20. The analysis of a complex fire event using multispaceborne observations

    Directory of Open Access Journals (Sweden)

    Andrei Simona

    2018-01-01

    Full Text Available This study documents a complex fire event that occurred in October 2016 in a belligerent area of the Middle East. Two fire outbreaks were detected by different spaceborne monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  1. The analysis of a complex fire event using multispaceborne observations

    Science.gov (United States)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a belligerent area of the Middle East. Two fire outbreaks were detected by different spaceborne monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  2. POWER ANALYSIS FOR COMPLEX MEDIATIONAL DESIGNS USING MONTE CARLO METHODS

    OpenAIRE

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2010-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been provided in the literature. We describe a general framework for power analyses for complex mediational models. The approach is based on the well-known technique of generating a large number of samples in a Monte Carlo study, and estimating power...
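
    The Monte Carlo power framework the authors describe can be sketched for the simplest mediation model X -> M -> Y. The joint-significance test and all parameter values below are illustrative choices for the sketch, not the paper's exact procedure.

```python
import math
import random

def z_slope(x, y):
    """z statistic of the OLS slope of y on x (mean-centered simple regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx
    resid = [yi - my - beta * (xi - mx) for xi, yi in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return beta / se

def simulate_power(a, b, n, reps=500, seed=1):
    """Monte Carlo power for the indirect effect a*b in the mediation
    model M = a*X + e1, Y = b*M + e2, using the joint-significance test
    (both paths individually significant at alpha = .05, two-sided)."""
    rng = random.Random(seed)
    z_crit = 1.959964  # normal approximation, adequate for large n
    hits = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n)]
        m = [a * xi + rng.gauss(0, 1) for xi in x]
        y = [b * mi + rng.gauss(0, 1) for mi in m]
        if abs(z_slope(x, m)) > z_crit and abs(z_slope(m, y)) > z_crit:
            hits += 1
    return hits / reps

power = simulate_power(a=0.4, b=0.4, n=100)
```

    The same loop generalizes to latent variable or growth curve mediation by swapping the data-generating step and the fitted model, which is the point of the framework: power is estimated as the proportion of replications in which the mediation effect is detected.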

  3. STATISTICAL ANALYSIS OF RAW SUGAR MATERIAL FOR SUGAR PRODUCER COMPLEX

    OpenAIRE

    A. A. Gromkovskii; O. I. Sherstyuk

    2015-01-01

    Summary. The article examines statistical data on the development of the average weight and average sugar content of sugar beet roots. Successfully forecasting these raw-material indices is essential for solving control problems in the sugar-producing complex. By calculating the autocorrelation function, it is demonstrated that the trend component predominates in the raw-material characteristics. To construct the prediction model, it is proposed to use an autoregressive fir...

  4. Economical and geographical analysis of the publishing and printing complex of Ukraine. N. O. Podorozhko

    Directory of Open Access Journals (Sweden)

    Podorozhko N.O.

    2009-08-01

    Full Text Available The relevance and theoretical foundations of the geographical analysis of the publishing and printing complex of Ukraine are substantiated. The main indicators of the publishing industry are analyzed in dynamics for 2000-2007. The state of the complex in 2007 is presented in detail, both for Ukraine as a whole and for individual regions.

  5. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
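
    The refinement described above is small but concrete: instead of averaging entropies over coarse-graining offsets (CMSE), RCMSE accumulates the template match counts over all offsets first and takes a single logarithm. The sketch below is a minimal illustration with an absolute tolerance r, not the authors' full parameterization.

```python
import math

def _counts(series, m, r):
    """Counts of matching template pairs of lengths m and m+1
    (Chebyshev distance < r), as in sample entropy."""
    n = len(series)
    n_m = n_m1 = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(series[i + k] - series[j + k]) for k in range(m)) < r:
                n_m += 1
                if abs(series[i + m] - series[j + m]) < r:
                    n_m1 += 1
    return n_m, n_m1

def rcmse(series, scale, m=2, r=0.15):
    """Refined composite multiscale entropy at one scale: accumulate the
    match counts over all `scale` coarse-graining offsets, then take one
    logarithm, which avoids the undefined values plain MSE can produce."""
    total_m = total_m1 = 0
    for offset in range(scale):
        coarse = [sum(series[i:i + scale]) / scale
                  for i in range(offset, len(series) - scale + 1, scale)]
        n_m, n_m1 = _counts(coarse, m, r)
        total_m += n_m
        total_m1 += n_m1
    if total_m == 0 or total_m1 == 0:
        return float('nan')  # undefined only if no matches at *any* offset
    return -math.log(total_m1 / total_m)

# A perfectly periodic signal is fully predictable: its entropy is zero.
sig = [0, 1] * 30
e1 = rcmse(sig, scale=1)
e2 = rcmse(sig, scale=2)
```

    Plain MSE returns an undefined value as soon as the single coarse-grained series at a large scale has no template matches; pooling the counts across all offsets makes that event far less likely for short series, which is the accuracy gain the abstract reports.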

  6. Complex analysis and dynamical systems new trends and open problems

    CERN Document Server

    Golberg, Anatoly; Jacobzon, Fiana; Shoikhet, David; Zalcman, Lawrence

    2018-01-01

    This book focuses on developments in complex dynamical systems and geometric function theory over the past decade, showing strong links with other areas of mathematics and the natural sciences. Traditional methods and approaches surface in physics and in the life and engineering sciences with increasing frequency – the Schramm‐Loewner evolution, Laplacian growth, and quadratic differentials are just a few typical examples. This book provides a representative overview of these processes and collects open problems in the various areas, while at the same time showing where and how each particular topic evolves. This volume is dedicated to the memory of Alexander Vasiliev.

  7. Preparation and analysis of multilayer composites based on polyelectrolyte complexes

    Energy Technology Data Exchange (ETDEWEB)

    Petrova, V. A. [Russian Academy of Sciences, Institute of Macromolecular Compounds (Russian Federation); Orekhov, A. S. [Russian Academy of Sciences, Shubnikov Institute of Crystallography, Federal Scientific Research Centre “Crystallography and Photonics” (Russian Federation); Chernyakov, D. D. [St. Petersburg State Chemical Pharmaceutical Academy (Russian Federation); Baklagina, Yu. G. [Russian Academy of Sciences, Institute of Macromolecular Compounds (Russian Federation); Romanov, D. P. [Russian Academy of Sciences, Grebenshchikov Institute of Silicate Chemistry (Russian Federation); Kononova, S. V. [Russian Academy of Sciences, Institute of Macromolecular Compounds (Russian Federation); Volod’ko, A. V.; Ermak, I. M. [Russian Academy of Sciences, Elyakov Pacific Institute of Bioorganic Chemistry, Far-Eastern Branch (Russian Federation); Klechkovskaya, V. V., E-mail: klechvv@ns.crys.ras.ru [Russian Academy of Sciences, Shubnikov Institute of Crystallography, Federal Scientific Research Centre “Crystallography and Photonics” (Russian Federation); Skorik, Yu. A., E-mail: yury-skorik@mail.ru [Russian Academy of Sciences, Institute of Macromolecular Compounds (Russian Federation)

    2016-11-15

    A method for preparing multilayer film composites based on chitosan has been developed by the example of polymer pairs: chitosan–hyaluronic acid, chitosan–alginic acid, and chitosan–carrageenan. The structure of the composite films is characterized by X-ray diffractometry and scanning electron microscopy. It is shown that the deposition of a solution of hyaluronic acid, alginic acid, or carrageenan on a chitosan gel film leads to the formation of a polyelectrolyte complex layer at the interface, which is accompanied by the ordering of chitosan chains in the surface region; the microstructure of this layer depends on the nature of contacting polymer pairs.

  8. Adaptive Analysis of Locally Complex Systems in a Globally Complex World

    Directory of Open Access Journals (Sweden)

    Timothy Lynam

    1999-12-01

    Full Text Available Zambezi Valley agro-ecosystems are environmentally, economically, and institutionally variable. This variability means that it is not possible to measure everything necessary to develop a predictive understanding of them. In particular, because people and their environments are constantly changing, what was measured yesterday may change by tomorrow. Here, I describe elements of the approach that I have developed to address this problem. Called DAAWN, for Detail as and When Needed, the approach advocates an iterative and multiscaled methodology in which we first capture as broad an understanding of the system as possible and then use awareness developed at this scale to identify where to focus subsequent, more detailed, investigations. Because we cannot hope to measure or monitor everything in these complex and adaptive agro-ecosystems, the approach requires us to make judicious use of all available knowledge about the agro-ecosystem. The DAAWN approach is rooted in systems theory, but is tempered by systems and problems where boundaries are not clearly defined, where nonlinearities are the norm, and where structural and functional change is the order of the day. I describe a few of the most important data collection tools and methods that were developed to record the knowledge of local people and to observe, monitor, and measure changes in their resources. Of particular importance is the tool that I call a "spidergram." This tool, which I used extensively with village informants, symbolizes the DAAWN approach and was a major stimulus for its development. Simulation models provide another very important tool; here, I offer some examples of spatially explicit, multi-agent models. Some key findings of the research on Zambezi Valley agro-ecosystems are also briefly presented.

  9. Vibrational spectroscopy and structural analysis of uranium complexes

    Energy Technology Data Exchange (ETDEWEB)

    Umrejko, D.S.; Nikanovich, M.V.

    1984-12-01

    On the basis of experimental and theoretical studies of vibrational spectra for halides, sulfates, phosphates and uranyl (and uranium) oxalates, as well as for more complicated complex systems, reliable spectroscopic criteria have been established for estimating their structural features by simpler and more accessible (than direct) methods. Coordination of ligands to a central U(VI) (U(IV)) ion produces a geometry variation specific to each mode of addition, with a concomitant redistribution of the force interactions in the system, which directly affects the frequency characteristics and vibration modes. On this basis, stable indications of particular types of coordination for mono- and polyatomic groups (including bridging coordination, characteristic of polymeric structures) are pointed out in the IR absorption and Raman spectra. In the investigated structures, the predominant effect of coordination on the spectral properties of the complexes, as compared with other factors (for example, outer-sphere binding), is established. The presence of water molecules in the interlayer space does not essentially affect the state of polyatomic ligands whose donor atoms are all bound to the central uranium atom (particularly in binary uranyl phosphates). In the presence of free oxygen atoms, the H2O effect leads only to some shift of the maxima of separate bands and their additional weak splitting (in uranyl sulfates).

  10. Analysis and application of classification methods of complex carbonate reservoirs

    Science.gov (United States)

    Li, Xiongyan; Qin, Ruibao; Ping, Haitao; Wei, Dan; Liu, Xiaomei

    2018-06-01

    There are abundant carbonate reservoirs from the Cenozoic to Mesozoic era in the Middle East. Due to variations in the sedimentary environment and diagenetic processes of carbonate reservoirs, several porosity types coexist in them. As a result, because of the complex lithologies and pore types as well as the impact of microfractures, the pore structure is very complicated, and it is difficult to calculate reservoir parameters accurately. In order to evaluate carbonate reservoirs accurately, classification methods based on capillary pressure curves and on flow units are analyzed, building on pore-structure evaluation. Although carbonate reservoirs can be classified on the basis of capillary pressure curves, the resulting porosity-permeability relationship within each class is not ideal. On the basis of flow units, by contrast, a high-precision functional relationship between porosity and permeability can be established for each class, so carbonate reservoirs can be quantitatively evaluated from a classification into flow units. In the dolomite reservoirs, the average absolute error of the calculated permeability decreases from 15.13 to 7.44 mD; similarly, for the limestone reservoirs it is reduced from 20.33 to 7.37 mD. Only by accurately characterizing pore structures and classifying reservoir types can reservoir parameters be calculated accurately, which makes both steps essential to the evaluation of complex carbonate reservoirs in the Middle East.
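
    One common way to define flow units, though not necessarily the authors' exact workflow, is the flow zone indicator (FZI) of Amaefule et al.: samples with similar FZI share a porosity-permeability trend, so a per-unit fit is far tighter than a single global one. The core data below are hypothetical.

```python
import math

def flow_zone_indicator(phi, k_md):
    """FZI (in micrometres) from fractional porosity phi and permeability
    in mD: RQI = 0.0314*sqrt(k/phi), phi_z = phi/(1-phi), FZI = RQI/phi_z."""
    rqi = 0.0314 * math.sqrt(k_md / phi)
    phi_z = phi / (1.0 - phi)
    return rqi / phi_z

# Hypothetical core plugs: (porosity fraction, permeability in mD).
samples = [(0.08, 0.5), (0.12, 2.0), (0.15, 40.0), (0.22, 300.0)]
fzi = [flow_zone_indicator(p, k) for p, k in samples]

# Group samples into flow units by order of magnitude of FZI; within one
# unit the porosity-permeability relation is close to a single trend,
# which is what makes the per-unit permeability fits accurate.
units = [int(math.floor(math.log10(f) + 1)) for f in fzi]
```

    Permeability for an uncored interval is then predicted from log-derived porosity using the fit of the flow unit the interval is assigned to, which is the mechanism behind the error reductions reported in the abstract.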

  11. Proposal for analysis of human talent from complex thought

    Directory of Open Access Journals (Sweden)

    Abel Del Río Cortina

    2015-12-01

    Full Text Available This paper proposes a scheme for visualizing the actions of the individual in the context of the productive world*, considering a series of manifestations framed in the learning process with respect to the complexity of human development derived from the interactions of the individual, the family, the community, the labor environment, and society in general, from the perspective of the volitional, cognitive, and procedural dimensions. The proposed visualization is conceived as a relational map that includes six pillars of human interaction immersed in the above dimensions: knowing how to be, knowing how to know, knowing how to live together, knowing how to create, knowing how to manage, and knowing how to communicate, all reflected as a synergic structure manifested in know-how, through the interplay of values, emotional (soft) skills, attitudes, knowledge, and ways of proceeding. All of the above is intended to generate an approach to the relational complexity of human talent development in society. * The productive world, in this document, is conceived as the one in which people come together in order to live in families, communities, entrepreneurial organizations, diverse kinds of institutions, and society in general.

  12. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% systems-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas both in the government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it?, What does it do?, What does it cost?, What else will do the task?, and What would that cost? Using logic and a disciplined approach, the result of the Value Analysis performs the necessary functions at a high quality and the lowest overall cost

  13. Risk Analysis for Road Tunnels – A Metamodel to Efficiently Integrate Complex Fire Scenarios

    DEFF Research Database (Denmark)

    Berchtold, Florian; Knaust, Christian; Arnold, Lukas

    2018-01-01

    Fires in road tunnels constitute complex scenarios with interactions between the fire, tunnel users and safety measures. More and more methodologies for risk analysis quantify the consequences of these scenarios with complex models. Examples for complex models are the computational fluid dynamics...... complex scenarios in risk analysis. To face this challenge, we improved the metamodel used in the methodology for risk analysis presented on ISTSS 2016. In general, a metamodel quickly interpolates the consequences of few scenarios simulated with the complex models to a large number of arbitrary scenarios...... used in risk analysis. Now, our metamodel consists of the projection array-based design, the moving least squares method, and the prediction interval to quantify the metamodel uncertainty. Additionally, we adapted the projection array-based design in two ways: the focus of the sequential refinement...

  14. Entropy and complexity analysis of hydrogenic Rydberg atoms

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Rosa, S. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071-Granada (Spain); Departamento de Fisica Aplicada II, Universidad de Sevilla, 41012-Sevilla (Spain); Toranzo, I. V.; Dehesa, J. S. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071-Granada (Spain); Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, 18071-Granada (Spain); Sanchez-Moreno, P. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071-Granada (Spain); Departamento de Matematica Aplicada, Universidad de Granada, 18071-Granada (Spain)

    2013-05-15

    The internal disorder of hydrogenic Rydberg atoms as contained in their position and momentum probability densities is examined by means of the following information-theoretic spreading quantities: the radial and logarithmic expectation values, the Shannon entropy, and the Fisher information. As well, the complexity measures of Cramér-Rao, Fisher-Shannon, and López-Ruiz-Mancini-Calvet types are investigated in both reciprocal spaces. The leading term of these quantities is rigorously calculated by use of the asymptotic properties of the concomitant entropic functionals of the Laguerre and Gegenbauer orthogonal polynomials which control the wavefunctions of the Rydberg states in both position and momentum spaces. The associated generalized Heisenberg-like, logarithmic and entropic uncertainty relations are also given. Finally, application to linear (l = 0), circular (l = n-1), and quasicircular (l = n-2) states is explicitly done.

  15. Complex analysis fundamentals of the classical theory of functions

    CERN Document Server

    Stalker, John

    1998-01-01

    This clear, concise introduction to the classical theory of one complex variable is based on the premise that "anything worth doing is worth doing with interesting examples." The content is driven by techniques and examples rather than definitions and theorems. This self-contained monograph is an excellent resource for a self-study guide and should appeal to a broad audience. The only prerequisite is a standard calculus course. The first chapter deals with a beautiful presentation of special functions. . . . The third chapter covers elliptic and modular functions. . . in much more detail, and from a different point of view, than one can find in standard introductory books. . . . For [the] subjects that are omitted, the author has suggested some excellent references for the reader who wants to go through these topics. The book is read easily and with great interest. It can be recommended to both students as a textbook and to mathematicians and physicists as a useful reference. ---Mathematical Reviews Mainly or...

  16. Entropy and complexity analysis of hydrogenic Rydberg atoms

    International Nuclear Information System (INIS)

    López-Rosa, S.; Toranzo, I. V.; Dehesa, J. S.; Sánchez-Moreno, P.

    2013-01-01

    The internal disorder of hydrogenic Rydberg atoms as contained in their position and momentum probability densities is examined by means of the following information-theoretic spreading quantities: the radial and logarithmic expectation values, the Shannon entropy, and the Fisher information. As well, the complexity measures of Cramér-Rao, Fisher-Shannon, and López-Ruiz-Mancini-Calvet types are investigated in both reciprocal spaces. The leading term of these quantities is rigorously calculated by use of the asymptotic properties of the concomitant entropic functionals of the Laguerre and Gegenbauer orthogonal polynomials which control the wavefunctions of the Rydberg states in both position and momentum spaces. The associated generalized Heisenberg-like, logarithmic and entropic uncertainty relations are also given. Finally, application to linear (l = 0), circular (l = n-1), and quasicircular (l = n-2) states is explicitly done.

  17. FADES: A tool for automated fault analysis of complex systems

    International Nuclear Information System (INIS)

    Wood, C.

    1990-01-01

    FADES is an expert system for performing fault analyses on complex connected systems. Using a graphical editor to draw components and link them together, the analyst describes a given system to FADES. The knowledge base so created is used to qualitatively simulate the system's behaviour. By inducing all possible component failures in the system and determining their effects, a set of facts is built up. These facts are then used to create fault trees or FMEA tables. The facts may also be used to generate explanations and diagnostic rules, allowing system instrumentation to be optimised. The prototype system has been built and is presently undergoing testing by users. Comments from these trials will be used to tailor the system to the requirements of the users, so that the end product performs the exact task required.

  18. Complexity Analysis of Precedence Terminating Infinite Graph Rewrite Systems

    Directory of Open Access Journals (Sweden)

    Naohi Eguchi

    2015-05-01

    Full Text Available The general form of safe recursion (or ramified recurrence can be expressed by an infinite graph rewrite system including unfolding graph rewrite rules introduced by Dal Lago, Martini and Zorzi, in which the size of every normal form by innermost rewriting is polynomially bounded. Every unfolding graph rewrite rule is precedence terminating in the sense of Middeldorp, Ohsaki and Zantema. Although precedence terminating infinite rewrite systems cover all the primitive recursive functions, in this paper we consider graph rewrite systems precedence terminating with argument separation, which form a subclass of precedence terminating graph rewrite systems. We show that for any precedence terminating infinite graph rewrite system G with a specific argument separation, both the runtime complexity of G and the size of every normal form in G can be polynomially bounded. As a corollary, we obtain an alternative proof of the original result by Dal Lago et al.

  19. Risk and sustainability analysis of complex hydrogen infrastructures

    DEFF Research Database (Denmark)

    Markert, Frank; Marangon, A.; Carcassi, M.

    2017-01-01

    Building a network of hydrogen refuelling stations is essential to develop the hydrogen economy within transport. Additionally, hydrogen is regarded as a likely key component for storing and converting back excess electrical power, to secure future energy supply and to improve the quality of biomass-based fuels. Therefore, future hydrogen supply and distribution chains will have to address several objectives. Such complexity is a challenge for the risk assessment and risk management of these chains because of the increasing interactions. Improved methods are needed to assess the supply chain as a whole. The method of "Functional modelling" is discussed in this paper, and it is shown how it could be a basis for other decision support methods for comprehensive risk and sustainability assessments.

  20. Analysis of Semantic Networks using Complex Networks Concepts

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2013-01-01

    In this paper we perform a preliminary analysis of semantic networks to determine the most important terms that could be used to optimize a summarization task. In our experiments, we measure how the properties of a semantic network change when the terms in the network are removed. Our preliminary...
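The removal experiment described above can be sketched on a toy term network. The graph below and the impact measure (drop in average degree when a term is removed) are invented for illustration; the paper's actual networks, terms, and property set are not specified here.

```python
# Toy semantic network as a set of undirected term-term edges.
edges = {("network", "semantic"), ("network", "analysis"),
         ("network", "graph"), ("network", "summary"),
         ("semantic", "analysis"), ("graph", "summary")}

def avg_degree(edge_set):
    # Average degree over terms that still carry at least one edge;
    # isolated terms drop out of the average in this simple sketch.
    nodes = {n for e in edge_set for n in e}
    return 2 * len(edge_set) / len(nodes) if nodes else 0.0

terms = {n for e in edges for n in e}
baseline = avg_degree(edges)
impact = {}
for t in terms:
    remaining = {e for e in edges if t not in e}
    impact[t] = baseline - avg_degree(remaining)

# Terms whose removal changes the network most are candidate key terms.
ranked = sorted(impact, key=impact.get, reverse=True)
print(ranked[0], impact)
```

On this toy graph the hub term "network" dominates the ranking, which is the kind of signal a summarizer could exploit.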

  1. Genomic Analysis of Complex Microbial Communities in Wounds

    Science.gov (United States)

    2009-07-01

    software for common statistical methods for ecological and biodiversity studies. World Agroforestry Centre (ICRAF), Nairobi. Lane DJ. 1991. 16S/23S rRNA... Roberts DW (2007) labdsv: Ordination and Multivariate Analysis for Ecology. R package version 1.3-1

  2. Design and preliminary biomechanical analysis of artificial cervical joint complex.

    Science.gov (United States)

    Jian, Yu; Lan-Tao, Liu; Zhao, Jian-ning

    2013-06-01

    To design an artificial cervical joint complex (ACJC) prosthesis for non-fusion reconstruction after cervical subtotal corpectomy, and to evaluate the biomechanical stability, preservation of segment movements and influence on adjacent inter-vertebral movements of this prosthesis. The prosthesis is composed of three parts: the upper/lower joint heads and the middle artificial vertebra, made of cobalt-chromium-molybdenum (Co-Cr-Mo) alloy and polyethylene, with a ball-and-socket joint design resembling the multi-axial movement of normal inter-vertebral spaces. Biomechanical tests of the intact spine (control), the Orion locking plate system and the ACJC prosthesis were performed on formalin-fixed cervical spine specimens from 21 cadavers to compare stability, range of motion (ROM) of the surgical segment and ROM of adjacent inter-vertebral spaces. As for stability of the whole lower cervical spine, there were no significant differences in flexion, extension, lateral bending and torsion between the intact spine group and the ACJC prosthesis group. As for segment movements, the differences in flexion, lateral bending and torsion between the ACJC prosthesis group and the control group were not statistically significant, while the ACJC prosthesis group showed an increase in extension (P < 0.05). The adjacent inter-vertebral ROM of the ACJC prosthesis group was not statistically significantly different from that of the control group. After cervical subtotal corpectomy, reconstruction with the ACJC prosthesis not only provided immediate stability, but also effectively preserved segmental motion, without abnormal gain of mobility at adjacent inter-vertebral spaces.

  3. Analysis of a Mouse Skin Model of Tuberous Sclerosis Complex.

    Directory of Open Access Journals (Sweden)

    Yanan Guo

    Full Text Available Tuberous Sclerosis Complex (TSC is an autosomal dominant tumor suppressor gene syndrome in which patients develop several types of tumors, including facial angiofibroma, subungual fibroma, Shagreen patch, angiomyolipoma, and lymphangioleiomyomatosis. It is due to inactivating mutations in TSC1 or TSC2. We sought to generate a mouse model of one or more of these tumor types by targeting deletion of the Tsc1 gene to fibroblasts using the Fsp-Cre allele. Mutant Tsc1ccFsp-Cre+ mice survived a median of nearly a year, and developed tumors in multiple sites but did not develop angiomyolipoma or lymphangioleiomyomatosis. They did develop a prominent skin phenotype, with marked thickening of the dermis and accumulation of mast cells, that was minimally responsive to systemic rapamycin therapy and was quite different from the pathology seen in human TSC skin lesions. Recombination and loss of Tsc1 was demonstrated in skin fibroblasts in vivo and in cultured skin fibroblasts. Loss of Tsc1 in fibroblasts in mice does not lead to a model of angiomyolipoma or lymphangioleiomyomatosis.

  4. Analysis and Reduction of Complex Networks Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger G [University of Southern California

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of: 1) methodology and algorithms to address the eigenvalue problem, a problem of significance for the stability of networks under stochastic perturbations; 2) methodology and algorithms to characterize probability measures on graph structures with random flows, an important problem in characterizing random demand (encountered in smart grids) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov chains (with ubiquitous relevance); and 3) methodology and algorithms for treating inequalities in uncertain systems, an important problem in the context of models for material failure and network flows under uncertainty, where conditions of failure or flow are described in the form of inequalities between the state variables.

  5. Complex Analysis of Financial State and Performance of Construction Enterprises

    Directory of Open Access Journals (Sweden)

    Algirdas Krivka

    2015-12-01

    Full Text Available The paper analyses the financial state and performance of large construction enterprises by applying financial indicators. As there is no single decisive financial indicator enabling objective assessment of enterprise performance, multi-criteria decision making (MCDM) methods are applied, with four groups of financial ratios (profitability, liquidity, solvency and asset turnover) acting as evaluation criteria; the alternatives assessed are two enterprises, compared throughout a reference period of three years, together with the average indicator values of the whole construction sector. The weights of the criteria have been estimated by involving competent experts, with the chi-square test employed to check the degree of agreement of the expert estimates. The research methodology contributes to the issue of complex evaluation of enterprise financial state and performance, while the result of the multi-criteria assessment - the ranking of the enterprises and the sector average with respect to financial state and performance - could be considered worth the attention of business owners, potential investors, customers and other possible stakeholders.
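The weighted multi-criteria ranking idea can be sketched with simple additive weighting (SAW), one common MCDM method; the paper does not specify its exact method here, and the ratio values and expert weights below are invented for illustration only.

```python
# Four criteria groups as in the abstract; weights assumed expert-derived.
criteria = ["profitability", "liquidity", "solvency", "turnover"]
weights = [0.4, 0.2, 0.2, 0.2]          # sum to 1

# Hypothetical ratio values per alternative (all treated as benefit criteria).
ratios = {
    "Enterprise A":   [0.12, 1.8, 0.55, 1.1],
    "Enterprise B":   [0.08, 2.1, 0.40, 0.9],
    "Sector average": [0.10, 1.5, 0.50, 1.0],
}

# Normalise each criterion by its column maximum, then take the weighted sum.
maxima = [max(v[j] for v in ratios.values()) for j in range(len(criteria))]
scores = {name: sum(weights[j] * vals[j] / maxima[j]
                    for j in range(len(criteria)))
          for name, vals in ratios.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking, scores)
```

The column-maximum normalisation keeps every criterion on a comparable 0-1 scale, so the weights alone encode the experts' priorities.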

  6. Seismic Hazard Analysis on a Complex, Interconnected Fault Network

    Science.gov (United States)

    Page, M. T.; Field, E. H.; Milner, K. R.

    2017-12-01

    In California, seismic hazard models have evolved from simple, segmented prescriptive models to much more complex representations of multi-fault and multi-segment earthquakes on an interconnected fault network. During the development of the 3rd Uniform California Earthquake Rupture Forecast (UCERF3), the prevalence of multi-fault ruptures in the modeling was controversial. Yet recent earthquakes, for example the Kaikōura earthquake - as well as new research on the potential of multi-fault ruptures (e.g., Nissen et al., 2016; Sahakian et al., 2017) - have validated this approach. For large crustal earthquakes, multi-fault ruptures may be the norm rather than the exception. As datasets improve and we can view the rupture process at a finer scale, the interconnected, fractal nature of faults is revealed even by individual earthquakes. What is the proper way to model earthquakes on a fractal fault network? We show multiple lines of evidence that connectivity even in modern models such as UCERF3 may be underestimated, although clustering in UCERF3 mitigates some modeling simplifications. We need a methodology that can be applied equally well where the fault network is well-mapped and where it is not - an extendable methodology that allows us to "fill in" gaps in the fault network and in our knowledge.

  7. Income inequality: A complex network analysis of US states

    Science.gov (United States)

    Gogas, Periklis; Gupta, Rangan; Miller, Stephen M.; Papadimitriou, Theophilos; Sarantitis, Georgios Antonios

    2017-10-01

    This study performs a long-run, inter-temporal analysis of income inequality in the US spanning the period 1916-2012. We employ both descriptive analysis and the Threshold-Minimum Dominating Set methodology from Graph Theory to examine the evolution of inequality through time. In doing so, we use two alternative measures of inequality: the Top 1% share of income and the Gini coefficient. This provides new insight into the literature on income inequality across the US states. Several empirical findings emerge. First, a heterogeneous evolution of inequality exists across the four focal sub-periods. Second, the results differ between the inequality measures examined. Finally, we identify groups of similarly behaving states in terms of inequality. The US authorities can use these findings to identify inequality trends and innovations, and to investigate the causes of inequality within the US and implement appropriate policies.
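One of the two inequality measures used above, the Gini coefficient, has a compact closed form from sorted values: G = Σᵢ (2i − n − 1)xᵢ / (n² · mean), with i ranking the sorted incomes. A minimal sketch (the income vectors are toy data, not the study's state series):

```python
def gini(incomes):
    """Gini coefficient from the rank formula on sorted values."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # sum((2i - n - 1) * x_i) over 1-based ranks i of the sorted values
    cum = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
    return cum / (n * total)  # n * total == n^2 * mean

# Perfect equality -> 0; one person holds everything -> (n-1)/n = 0.75 here.
print(gini([1, 1, 1, 1]), gini([0, 0, 0, 1]))
```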

  8. Analysis of complex networks from biology to linguistics

    CERN Document Server

    Dehmer, Matthias

    2009-01-01

    Mathematical problems such as graph theory problems are of increasing importance for the analysis of modelling data in biomedical research such as in systems biology, neuronal network modelling etc. This book follows a new approach of including graph theory from a mathematical perspective with specific applications of graph theory in biomedical and computational sciences. The book is written by renowned experts in the field and offers valuable background information for a wide audience.

  9. Optical analysis of dust complexes in spiral galaxies

    International Nuclear Information System (INIS)

    Elmegreen, D.A.M.

    1979-01-01

    A method for quantitatively investigating properties of dust regions in external galaxies is presented. The technique involves matching radiative transfer models (with absorption plus scattering) to multicolor photographic and photometric observations. Dust features in each galaxy are modeled with two configurations; one is rectangular with a Gaussian distribution perpendicular to the plane of the galaxy, and the other is a uniform oblate spheroid with an arbitrary height above the midplane. It is found that it is possible to determine the intrinsic opacities in the clouds and in the nearby comparison regions, and that a differentiation can be made between high-opacity low-lying clouds and low-opacity clouds above the midplane. This technique was used to study dust complexes in the late-type spiral galaxies NGC 628 (M74), NGC 5194 (M51), NGC 5457 (M101), and NGC 7793. Most of the features in the prominent dust lanes were found to have internal visual extinctions corresponding to 10 to 15 mag kpc⁻¹, while the adjacent comparison regions typically contained 4 mag kpc⁻¹. Thus the opacity through a dust lane is about 1.5 mag greater than the 0.5 to 1.0 mag of extinction through a comparison region. A noticeable deviation from this result was found for all of the dust lanes that occurred on the inner edges of the spiral arm branches. These features had internal densities that were approx. 10 times larger than in their comparison regions, in contrast to the normal dust lanes, which had density enhancements of a factor of approx. 3. Dust features on the outer sides of spiral arms appeared to be no different from the main inner dust lane features.

  10. Dynamic analysis of complex tube systems in heat exchangers

    International Nuclear Information System (INIS)

    Kouba, J.; Dvorak, P.

    1985-01-01

    Using a computation model, a dynamic analysis was made of tube assemblies of heat exchanger bundles by the finite element method. The algorithm is presented for determining the frequency mode properties, based on the Sturm sequences combined with inverse vector iteration. The results obtained using the method are compared with those obtained by analytical solution and by the transfer matrix method, this for the cases of both eigenvibrations and resonance vibrations. The results are in very good agreement. For the first four eigenfrequencies, the calculation error is less than 1.5% as against the analytical solution. (J.B.). 4 tabs., 8 figs., 5 refs
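The inverse-vector-iteration step named above can be sketched on a tiny stand-in system. The 3-DOF stiffness-like matrix below is a toy example (with an identity mass matrix implied), not one of the paper's heat exchanger bundle models; its smallest eigenvalue is analytically 2 − √2.

```python
import numpy as np

# Toy "stiffness" matrix of a 3-DOF chain; eigenvalues are 2 - sqrt(2), 2, 2 + sqrt(2).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

x = np.ones(3)                     # starting vector with a component along mode 1
for _ in range(100):
    y = np.linalg.solve(K, x)      # one inverse-iteration step: K y = x
    x = y / np.linalg.norm(y)      # renormalise to avoid overflow

lam = x @ K @ x                    # Rayleigh quotient -> smallest eigenvalue
print(lam)
```

Each solve amplifies the eigenvector of the smallest eigenvalue the most, which is why the iteration converges to the lowest mode; in the paper this is combined with Sturm-sequence counting to make sure no eigenfrequencies are missed.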

  11. Accommodating complexity and human behaviors in decision analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Siirola, John Daniel; Schoenwald, David Alan; Strip, David R.; Hirsch, Gary B.; Bastian, Mark S.; Braithwaite, Karl R.; Homer, Jack [Homer Consulting

    2007-11-01

    This is the final report for a LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them became the illustrative example for assessment.

  12. Analysis of complex wetland ecological system: Effect of harvesting

    Directory of Open Access Journals (Sweden)

    Nilesh Kumar Thakur

    2017-12-01

    Full Text Available In this paper, we have studied the interaction among diffusive phytoplankton, zooplankton and fish populations, with a Beddington-DeAngelis type functional response for the zooplankton and a Holling type III response for the fish. The stability of the model system with and without diffusion has been analyzed. The conditions for maximum sustainable yield and an optimal harvesting policy for the non-spatial model have been discussed. Our study may help to improve and manage the ecosystem services provided by wetlands on agricultural landscapes, including fisheries, water conservation, climate change mitigation and many more.
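A non-spatial model of this shape can be sketched with forward Euler. The equations below use a Beddington-DeAngelis grazing term aPZ/(1 + bP + cZ) and a Holling type III predation term βZ²F/(d² + Z²) with harvesting qEF on fish; the exact equations and all parameter values are assumptions for illustration, not the paper's calibrated model.

```python
# Assumed parameters (illustrative only).
r, K = 1.0, 1.0                  # phytoplankton growth rate / carrying capacity
a, b, c = 0.5, 0.3, 0.2          # Beddington-DeAngelis grazing
e1, m1 = 0.6, 0.1                # zooplankton conversion efficiency / mortality
beta, d = 0.4, 0.5               # Holling type III predation by fish
e2, m2, qE = 0.5, 0.15, 0.05     # fish conversion, mortality, harvesting effort

P, Z, F = 0.5, 0.3, 0.2          # initial densities
dt, steps = 0.01, 10000
for _ in range(steps):
    graze = a * P * Z / (1.0 + b * P + c * Z)        # zooplankton grazing
    pred = beta * Z**2 * F / (d**2 + Z**2)           # fish predation (type III)
    dP = r * P * (1.0 - P / K) - graze
    dZ = e1 * graze - m1 * Z - pred
    dF = e2 * pred - m2 * F - qE * F                 # qE*F is the harvest term
    P, Z, F = P + dt * dP, Z + dt * dZ, F + dt * dF

print(P, Z, F)
```

With these mild rates the explicit Euler step keeps all three populations positive and bounded, which is enough to explore how the harvest term qE shifts the long-run fish stock.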

  13. Prospective Safety Analysis and the Complex Aviation System

    Science.gov (United States)

    Smith, Brian E.

    2013-01-01

    Fatal accident rates in commercial passenger aviation are at historic lows yet have plateaued and are not showing evidence of further safety advances. Modern aircraft accidents reflect both historic causal factors and new unexpected "Black Swan" events. The ever-increasing complexity of the aviation system, along with its associated technology and organizational relationships, provides fertile ground for fresh problems. It is important to take a proactive approach to aviation safety by working to identify novel causation mechanisms for future aviation accidents before they happen. Progress has been made in using historic data to identify the telltale signals preceding aviation accidents and incidents, using the large repositories of discrete and continuous data on aircraft and air traffic control performance and information reported by front-line personnel. Nevertheless, the aviation community is increasingly embracing predictive approaches to aviation safety. The "prospective workshop" early assessment tool described in this paper represents an approach toward this prospective mindset, one that attempts to identify the future vectors of aviation and asks the question: "What haven't we considered in our current safety assessments?" New causation mechanisms threatening aviation safety will arise in the future because new (or revised) systems and procedures will have to be used under future contextual conditions that have not been properly anticipated. Many simulation models exist for demonstrating the safety cases of new operational concepts and technologies. However, the results from such models can only be as valid as the accuracy and completeness of assumptions made about the future context in which the new operational concepts and/or technologies will be immersed. Of course that future has not happened yet. What is needed is a reasonably high-confidence description of the future operational context, capturing critical contextual characteristics that modulate

  14. Preview of the Mission Assurance Analysis Protocol (MAAP): Assessing Risk and Opportunity in Complex Environments

    National Research Council Canada - National Science Library

    Alberts, Christopher; Dorofee, Audrey; Marino, Lisa

    2008-01-01

    .... A MAAP assessment provides a systematic, in-depth analysis of the potential for success in distributed, complex, and uncertain environments and can be applied across the life cycle and throughout the supply chain...

  15. AMPLIFIED FRAGMENT LENGTH POLYMORPHISM ANALYSIS OF MYCOBACTERIUM AVIUM COMPLEX ISOLATES RECOVERED FROM SOUTHERN CALIFORNIA

    Science.gov (United States)

    Fine-scale genotyping methods are necessary in order to identify possible sources of human exposure to opportunistic pathogens belonging to the Mycobacterium avium complex (MAC). In this study, amplified fragment length polymorphism (AFLP) analysis was evaluated for fingerprinting...

  16. Using visual information analysis to explore complex patterns in the activity of designers

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2014-01-01

    The analysis of complex interlinked datasets poses a significant problem for design researchers. This is addressed by proposing an information visualisation method for analysing patterns of design activity, qualitatively and quantitatively, with respect to time. This method visualises the temporal...

  17. An analysis methodology for impact of new technology in complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2013-11-01

    Full Text Available in support of Systems Engineering efforts, which is difficult with complex Sociotechnical Systems. Cognitive Work Analysis and System Dynamics are two complementary approaches that can be applied within this context. The products of these methods assist...

  18. Methods of Approximation Theory in Complex Analysis and Mathematical Physics

    CERN Document Server

    Saff, Edward

    1993-01-01

    The book incorporates research papers and surveys written by participants of an International Scientific Programme on Approximation Theory jointly supervised by the Institute for Constructive Mathematics of the University of South Florida at Tampa, USA, and the Euler International Mathematical Institute at St. Petersburg, Russia. The aim of the Programme was to present new developments in Constructive Approximation Theory. The topics of the papers are: asymptotic behaviour of orthogonal polynomials, rational approximation of classical functions, quadrature formulas, theory of n-widths, nonlinear approximation in Hardy algebras, numerical results on best polynomial approximations, and wavelet analysis. FROM THE CONTENTS: E.A. Rakhmanov: Strong asymptotics for orthogonal polynomials associated with exponential weights on R.- A.L. Levin, E.B. Saff: Exact Convergence Rates for Best Lp Rational Approximation to the Signum Function and for Optimal Quadrature in Hp.- H. Stahl: Uniform Rational Approximation of x .- M. Rahman, S.K. ...

  19. Overall analysis of meteorological information in the Daeduk nuclear complex

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byung Woo; Lee, Young Bok; Han, Moon Hee; Kim, Eun Han; Suh, Kyung Suk; Hwang, Won Tae; Hong, Suk Boong [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)

    1992-12-01

    Troubleshooting of the tower structure, sensor installation, earthing, and cabling has been carried out, together with integrated field tests, establishment of the data acquisition system, and instrument calibration, since the completion of the main tower construction this year. A procedure guide was also prepared for effective management, covering instrument operation, calibration and repair. Actual measurement has been carried out for two months since this October, after full integration of the equipment. Analysis of the measured data, which well represented the seasonal and regional characteristics of the site, showed the occurrence of nocturnal inversion layers, fogging, and frequently stable atmospheric conditions. Data are transmitted wirelessly to MIPS (Meteorological Information Processing System) after collection in the DAS (data acquisition system), where environmental assessment can be performed by the developed simulation programs for both normal operation and emergency. (Author).

  20. Problem analysis of geotechnical well drilling in complex environment

    International Nuclear Information System (INIS)

    Kasenov, A K; Biletskiy, M T; Ratov, B T; Korotchenko, T V

    2015-01-01

    The article examines the primary causes of problems occurring during the drilling of geotechnical wells (injection, production and monitoring wells) for in-situ leaching to extract uranium in South Kazakhstan. Hole caving, a drilling problem caused by various chemical and physical factors (hydraulic, mechanical, etc.), has been thoroughly investigated. The analysis of packing causes has revealed that this problem usually occurs because of an insufficient amount of drilling mud, associated with a small cross-section downward flow and a relatively large cross-section upward flow. This is explained by the fact that when spear bores are used to drill clay rocks, the cutting size is usually rather big and there is a risk of clay particles coagulating.

  1. Exergetic analysis of autonomous power complex for drilling rig

    Science.gov (United States)

    Lebedev, V. A.; Karabuta, V. S.

    2017-10-01

    The article considers the issue of increasing the energy efficiency of the power equipment of a drilling rig. At present, diverse types of power plants are used in power supply systems. When designing and choosing a power plant, one of the main criteria is its energy efficiency, the main indicator being the effective efficiency factor calculated by the method of thermal balances. In the article it is suggested to use the exergy method to determine energy efficiency, which allows the degree of thermodynamic perfection of a system to be estimated, as shown for a gas turbine plant: a relative estimate (the exergetic efficiency factor) and an absolute one. An exergetic analysis of a gas turbine plant operating in a simple cycle was carried out using the program WaterSteamPro. Exergy losses in the equipment elements are calculated.
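The exergetic-efficiency idea can be sketched with the standard result that heat Q supplied at temperature T, with ambient temperature T0, carries exergy Ex = Q(1 − T0/T), while shaft work is pure exergy. All numbers below are illustrative assumptions, not values from the paper's gas turbine analysis.

```python
def heat_exergy(q_kw, t_hot_k, t0_k=293.15):
    """Exergy of a heat flow q_kw supplied at t_hot_k, ambient t0_k (Kelvin)."""
    return q_kw * (1.0 - t0_k / t_hot_k)

q_fuel = 1000.0          # kW of heat supplied in the combustor (assumed)
t_gas = 1200.0           # K, mean thermodynamic temperature of heat supply (assumed)
w_net = 400.0            # kW of net shaft work delivered (assumed)

ex_in = heat_exergy(q_fuel, t_gas)
eta_ex = w_net / ex_in   # exergetic (second-law) efficiency
print(ex_in, eta_ex)
```

Note how the exergetic efficiency (about 0.53 here) exceeds the first-law ratio w_net/q_fuel = 0.40, because the denominator counts only the work-convertible part of the heat input.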

  2. Environmental management policy analysis using complex system simulation

    International Nuclear Information System (INIS)

    Van Eeckhout, E.; Roberts, D.; Oakes, R.; Shieh, A.; Hardie, W.; Pope, P.

    1999-01-01

    The two primary modules of Envirosim (the model of Los Alamos TA-55 and the WIPP transport/storage model) have been combined into one application, with the simulated waste generated by TA-55 operations being fed to storage, packaging, and transport simulation entities. Three simulation scenarios were executed which demonstrate the usefulness of Envirosim as a policy analysis tool for use in planning shipments to WIPP. A graphical user interface (GUI) has been implemented using IDL (Interactive Data Language) which allows the analyst to easily view simulation results. While IDL is not necessarily the graphics interface that would be selected for a production version of Envirosim, it does provide some powerful data manipulation capabilities, and it runs on a variety of platforms

  3. The use of network analysis to study complex animal communication systems: a study on nightingale song

    OpenAIRE

    Weiss, Michael; Hultsch, Henrike; Adam, Iris; Scharff, Constance; Kipper, Silke

    2014-01-01

    The singing of song birds can form complex signal systems comprised of numerous subunits sung with distinct combinatorial properties that have been described as syntax-like. This complexity has inspired inquiries into similarities of bird song to human language; but the quantitative analysis and description of song sequences is a challenging task. In this study, we analysed song sequences of common nightingales (Luscinia megarhynchos) by means of a network analysis. We translated long nocturnal...
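The sequence-to-network step can be sketched by building a first-order transition graph (bigram counts and transition probabilities) from a song-type string. The sequence below is invented for illustration; the study's actual song-type alphabets and network measures are not reproduced here.

```python
from collections import Counter, defaultdict

sequence = "ABCABDAB"                        # toy song-type sequence
bigrams = Counter(zip(sequence, sequence[1:]))  # directed edge counts

# Convert counts to per-source transition probabilities.
out_totals = defaultdict(int)
for (src, _), n in bigrams.items():
    out_totals[src] += n
probs = {(s, t): n / out_totals[s] for (s, t), n in bigrams.items()}
print(probs)
```

In this toy sequence song type A is always followed by B, while B splits evenly between C and D; on real recordings, such a weighted directed graph is what network measures (degree, path structure, motifs) are then computed on.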

  4. Synthesis, analysis and radiolysis of the cobalt III 8 hydroxyquinolinate complex

    International Nuclear Information System (INIS)

    Mestnik, S.A.C.; Silva, C.P.G. da.

    1981-11-01

    The cobalt(III) 8-hydroxyquinolinate complex was synthesized from a solution of cobalt(II). The compound was analysed by IR absorption spectroscopy, elemental analysis and determination of the number of ligands. The radiolytic degradation was verified by spectrophotometry after submitting samples of 10⁻³ M complex in ethanolic solution to different doses of gamma radiation from a ⁶⁰Co source. The change of the maximum absorbance of the complex with different doses of gamma radiation and its UV-VIS absorption spectra are presented. The complex in the solid state was also irradiated with 6.9 Mrad of gamma radiation but did not show degradation. (Author) [pt

  5. The application of HP-GFC chromatographic method for the analysis of oligosaccharides in bioactive complexes

    Directory of Open Access Journals (Sweden)

    Savić Ivan

    2009-01-01

    Full Text Available The aim of this work was to optimize a GFC method for the analysis of bioactive metal (Cu, Co and Fe) complexes with oligosaccharides (dextran and pullulan). The bioactive metal complexes with oligosaccharides were synthesized by an original procedure. GFC was used to study the molecular weight distribution and the polymerization degree of the oligosaccharides and bioactive metal complexes. The metal binding in the complexes depends on the ligand polymerization degree and the presence of OH groups in the coordination sphere of the central metal ion. The interactions between oligosaccharides and metal ions are very important in veterinary medicine, agriculture, pharmacy and medicine.

  6. The Pomatocalpa maculosum Complex (Orchidaceae Resolved by Multivariate Morphometric Analysis

    Directory of Open Access Journals (Sweden)

    Santi Watthana

    2006-03-01

    Full Text Available Principal components analysis (PCA) was employed to analyse the morphological variation among 63 herbarium specimens tentatively identified as Pomatocalpa andamanicum (Hook.f.) J. J. Sm., P. koordersii (Rolfe) J. J. Sm., P. latifolium (Lindl.) J. J. Sm., P. linearifolium Seidenf., P. maculosum (Lindl.) J. J. Sm., P. marsupiale (Kraenzl.) J. J. Sm., P. naevatum J. J. Sm., or P. siamense (Rolfe ex Downie) Summerh. Thirty-seven quantitative and 5 binary characters were included in the analyses. Taxa were delimited according to the observed clustering of specimens in the PCA plots, diagnostic characters were identified, and the correct nomenclature was established through examination of type material. Four species could be recognized, viz. P. diffusum Breda (syn. P. latifolium), P. fuscum (Lindl.) J. J. Sm. (syn. P. latifolium), P. marsupiale (syn. P. koordersii) and P. maculosum. For the latter species, two subspecies could be recognized, viz. P. maculosum (Lindl.) J. J. Sm. subsp. maculosum (syn. P. maculosum, P. naevatum p.p.) and P. maculosum (Lindl.) J. J. Sm. subsp. andamanicum (Hook.f.) S. Watthana (syn. P. andamanicum, P. linearifolium, P. siamense, P. naevatum p.p.). An identification key and a taxonomic synopsis are provided.
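The PCA step used above can be sketched via SVD of a centred specimens-by-characters matrix. The tiny morphometric matrix below is invented (six "specimens", three measurements dominated by a shared size axis), only to show how explained-variance ratios fall out of the decomposition.

```python
import numpy as np

# Hypothetical specimens x characters matrix (nearly rank 1 by construction).
X = np.array([[2.0, 4.0,  6.0],
              [3.0, 6.0,  9.0],
              [4.0, 8.0, 12.0],
              [2.1, 4.2,  5.9],
              [3.2, 6.1,  9.1],
              [4.1, 7.9, 12.2]])

Xc = X - X.mean(axis=0)                 # centre each character
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # PC scores used for ordination plots
explained = s**2 / np.sum(s**2)         # variance explained per component
print(explained)
```

In a morphometric study the 2D scatter of `scores[:, :2]` is the PCA plot in which specimen clusters, and hence candidate taxa, are delimited.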

  7. Complexity and corporate governance: an analysis of companies listed on the BM&FBOVESPA

    Directory of Open Access Journals (Sweden)

    Renata Rouquayrol Assunção

    2017-03-01

    Full Text Available In light of the need to develop mechanisms of control, protection, and transparency in the relationships between principal and agent, and with the aim of eliminating or reducing the agency problem, corporate governance has emerged. According to Agency Theory, the separation of ownership and control of activities derives from the complexity of organizations. In this context, this study aims to analyze the relationship between dimensions of complexity and corporate governance in companies listed on the São Paulo Stock, Commodities, and Futures Exchange (BM&FBOVESPA), in which contingency factors might influence organizational characteristics. The investigation gathers data from a sample of 162 companies listed on the BM&FBOVESPA. The following statistical tests were used in the data analysis: factor analysis, multiple linear regression, correspondence analysis, and correlation analysis. For measuring complexity, contingency variables such as age, size, diversification, and internationalization were adopted; to assess corporate governance, an index representing the adoption of good governance practices was used. The results show that organizational complexity is explained by the size and diversification variables, whereas operational complexity is explained by the size, diversification, and internationalization variables. It was observed that in the two dimensions of complexity - organizational and operational - corporate governance was influenced by the diversification, internationalization, and age variables, the latter with an inverse relationship. It is concluded that companies displaying more complexity, in both its dimensions, record a higher level of corporate governance, which confirms the research hypothesis.
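The multiple-linear-regression step can be sketched as fitting a governance index on the four complexity proxies. The data and coefficients below are synthetic, constructed so the least-squares fit is exact; they are not the study's 162-company sample, though the negative age coefficient mirrors the inverse relationship reported above.

```python
import numpy as np

# Hypothetical complexity proxies: age, size, diversification, internationalization.
X = np.array([[10.0, 5.0, 2.0, 1.0],
              [20.0, 7.0, 3.0, 0.0],
              [15.0, 6.0, 1.0, 1.0],
              [30.0, 9.0, 4.0, 1.0],
              [25.0, 4.0, 2.0, 0.0],
              [12.0, 8.0, 3.5, 1.0]])
true_beta = np.array([-0.1, 2.0, 1.5, 3.0])   # note the inverse age effect
y = X @ true_beta + 4.0                        # governance index, intercept 4.0

A = np.column_stack([np.ones(len(X)), X])      # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # OLS fit: [intercept, betas]
print(coef)
```

Because the synthetic response lies exactly in the column space of the design matrix, the fit recovers the intercept and all four coefficients; on real data the same call returns the OLS estimates with nonzero residuals.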

  8. Molecular and biochemical analysis of symbiotic plant receptor kinase complexes

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Douglas R; Riely, Brendan K

    2010-09-01

    -localize (i.e., the flotillin FLOT4) with symbiotic receptor-like proteins. As controls for TAP tag analysis we have generated protein isoforms that carry fluorescent domains (translational fusions to GFP) and these have been used to establish the subcellular location and dynamics of two symbiotic receptors, LYK3 and DMI2. Both proteins localize to membrane microdomains, or putative lipid rafts, and display dynamic behavior following elicitation with the Nod factor ligand. Finally, mass spectrometry of interacting proteins is yielding lists of candidate proteins that we are poised to test using semi-high throughput RNAi technology and Tnt1 knockout collections in Medicago truncatula.

  9. Architectural Analysis of Complex Evolving Systems of Systems

    Science.gov (United States)

    Lindvall, Mikael; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Ackemann, Chris; Yonkwa, Lyly; Ganesan, Dharma

    2009-01-01

    The goal of this collaborative project between FC-MD, APL, and GSFC, supported by the NASA IV&V Software Assurance Research Program (SARP), was to develop a tool, Dynamic SAVE (Dyn-SAVE for short), for analyzing architectures of systems of systems. The project team comprised the principal investigator (PI) from FC-MD, four other FC-MD scientists (part time), several FC-MD students (full time), two APL software architects (part time), and one NASA POC (part time). The PI and FC-MD scientists, together with the APL architects, were responsible for requirements analysis and for applying and evaluating the Dyn-SAVE tool and method. The PI and a group of FC-MD scientists were responsible for improving the method and conducting outreach activities, while another group of FC-MD scientists was responsible for development and improvement of the tool. Oversight and reporting were conducted by the PI and NASA POC. The project team produced many results, including several prototypes of the Dyn-SAVE tool and method, several case studies documenting how the tool and method were applied to APL's software systems, and several papers published in highly respected conferences and journals. Dyn-SAVE, as developed and enhanced throughout this research period, is a software tool intended for software developers and architects, software integration testers, and anyone who needs to analyze a software system from the point of view of how it communicates with other systems. Using the tool, the user specifies the planned communication behavior of the system, modeled as a sequence diagram. The user then captures and imports the actual communication behavior of the system, which is converted and visualized as a sequence diagram by Dyn-SAVE. After mapping the planned to the actual behavior and specifying parameter and timing constraints, Dyn-SAVE detects and highlights deviations between the planned and the actual behavior. Requirements based on the need to analyze two inter

  10. A Stylistic Analysis of Complexity in William Faulkner's "A Rose for Emily"

    Science.gov (United States)

    Abdurrahman, Israa' Burhanuddin

    2016-01-01

    Applying a stylistic analysis to certain texts refers to the identification of patterns of usage in writing. However, such an analysis is not restricted to describing the formal characteristics of texts; it also tries to elucidate their functional significance for the interpretation of the text. This paper highlights complexity as a…

  11. Synthesis and physicochemical analysis of Sm (II, III) acetylacetone chelate complexes

    International Nuclear Information System (INIS)

    Kostyuk, N.N.; Dik, T.A.; Trebnikov, A.G.

    2004-01-01

    Sm (II, III) acetylacetone chelate complexes were synthesized by an electrochemical method. It was shown that anodic dissolution of metallic samarium in acetylacetone leads to formation of the Sm (II, III) chelate complexes xSm(acac)2 · ySm(acac)3 · zH(acac). The factors x, y and z depend on the quantity of electricity that flowed through the electrolysis cell. The compositions of the obtained substances were confirmed by physicochemical analysis (elemental analysis, IR and mass spectrometry, and thermal analysis (thermogravimetry, isothermal warming-up, and differential scanning calorimetry)). (Authors)

  12. Balancing the Quantitative and Qualitative Aspects of Social Network Analysis to Study Complex Social Systems

    OpenAIRE

    Schipper, Danny; Spekkink, Wouter

    2015-01-01

    Social Network Analysis (SNA) can be used to investigate complex social systems. SNA is typically applied as a quantitative method, which has important limitations. First, quantitative methods are capable of capturing the form of relationships (e.g. strength and frequency), but they are less suitable for capturing the content of relationships (e.g. interests and motivations). Second, while complex social systems are highly dynamic, the representations that SNA creates of such systems are ofte...

  13. Analysis and systematization of experience of state administration of Russian Federation military-industrial complex

    OpenAIRE

    O. F. Salnikova; H. P. Sytnik

    2014-01-01

    The article analyses the systems of development of the military-industrial complex of the Russian Federation and schematically represents its system of state control. Russia is one of the largest exporters of armaments; the most popular types of armaments are airplanes, air-defence systems, helicopters, infantry fighting vehicles, and small arms. For today Russia actively masters new on ...

  14. Detecting coordinated regulation of multi-protein complexes using logic analysis of gene expression

    Directory of Open Access Journals (Sweden)

    Yeates Todd O

    2009-12-01

    Full Text Available Abstract Background Many of the functional units in cells are multi-protein complexes such as RNA polymerase, the ribosome, and the proteasome. For such units to work together, one might expect a high level of regulation to enable co-appearance or repression of sets of complexes at the required time. However, this type of coordinated regulation between whole complexes is difficult to detect by existing methods for analyzing mRNA co-expression. We propose a new methodology that is able to detect such higher order relationships. Results We detect coordinated regulation of multiple protein complexes using logic analysis of gene expression data. Specifically, we identify gene triplets composed of genes whose expression profiles are found to be related by various types of logic functions. In order to focus on complexes, we associate the members of a gene triplet with the distinct protein complexes to which they belong. In this way, we identify complexes related by specific kinds of regulatory relationships. For example, we may find that the transcription of complex C is increased only if the transcription of both complex A AND complex B is repressed. We identify hundreds of examples of coordinated regulation among complexes under various stress conditions. Many of these examples involve the ribosome. Some of our examples have been previously identified in the literature, while others are novel. One notable example is the relationship between the transcription of the ribosome, RNA polymerase and mannosyltransferase II, which is involved in N-linked glycan processing in the Golgi. Conclusions The analysis proposed here focuses on relationships among triplets of genes that are not evident when genes are examined in a pairwise fashion as in typical clustering methods. By grouping gene triplets, we are able to decipher coordinated regulation among sets of three complexes. 
Moreover, using all triplets that involve coordinated regulation with the ribosome
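    The triplet search described above can be illustrated on toy data: binarize each gene's expression profile across conditions, then test whether one profile equals a Boolean function of the other two. The gene names and profiles below are invented for illustration, not the paper's dataset:

```python
import numpy as np
from itertools import combinations

# Toy binarized expression (1 = up-regulated) over 8 conditions;
# names and values are illustrative stand-ins only.
expr = {
    "rpoA": np.array([1, 1, 0, 0, 1, 0, 1, 0]),   # RNA polymerase-like
    "rplB": np.array([1, 1, 0, 0, 1, 0, 1, 0]),   # ribosomal protein-like
    "alg2": np.array([0, 0, 1, 1, 0, 1, 0, 1]),   # mannosyltransferase-like
}

def matches_logic(a, b, c):
    """Return the name of a two-input logic function f with c == f(a, b), if any."""
    funcs = {
        "AND": a & b,
        "OR": a | b,
        "NOR": 1 - (a | b),   # c high only when both inputs are repressed
        "XOR": a ^ b,
    }
    for name, out in funcs.items():
        if np.array_equal(out, c):
            return name
    return None

triplets = []
for ga, gb, gc in combinations(expr, 3):
    f = matches_logic(expr[ga], expr[gb], expr[gc])
    if f:
        triplets.append((ga, gb, gc, f))
print(triplets)   # the NOR hit mirrors "C up only if A AND B are repressed"
```

    Mapping each gene in a hit back to the protein complex it belongs to then yields complex-level relationships, as in the paper.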

  15. Distinguishing PTSD, Complex PTSD, and Borderline Personality Disorder: A latent class analysis

    Directory of Open Access Journals (Sweden)

    Marylène Cloitre

    2014-09-01

    Full Text Available Background: There has been debate regarding whether Complex Posttraumatic Stress Disorder (Complex PTSD) is distinct from Borderline Personality Disorder (BPD) when the latter is comorbid with PTSD. Objective: To determine whether the patterns of symptoms endorsed by women seeking treatment for childhood abuse form classes that are consistent with diagnostic criteria for PTSD, Complex PTSD, and BPD. Method: A latent class analysis (LCA) was conducted on an archival dataset of 280 women with histories of childhood abuse assessed for enrollment in a clinical trial for PTSD. Results: The LCA revealed four distinct classes of individuals: a Low Symptom class characterized by low endorsements on all symptoms; a PTSD class characterized by elevated symptoms of PTSD but low endorsement of symptoms that define the Complex PTSD and BPD diagnoses; a Complex PTSD class characterized by elevated symptoms of PTSD and self-organization symptoms that defined the Complex PTSD diagnosis but low on the symptoms of BPD; and a BPD class characterized by symptoms of BPD. Four BPD symptoms were found to greatly increase the odds of being in the BPD compared to the Complex PTSD class: frantic efforts to avoid abandonment, unstable sense of self, unstable and intense interpersonal relationships, and impulsiveness. Conclusions: Findings supported the construct validity of Complex PTSD as distinguishable from BPD. Key symptoms that distinguished between the disorders were identified, which may aid in differential diagnosis and treatment planning.
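    A latent class analysis of binary symptom indicators can be sketched as an EM fit of a Bernoulli mixture. Everything below is a simulated stand-in (280 respondents, 4 symptoms, 2 classes), not the clinical dataset, and the hand-rolled EM is a minimal illustration of the model family rather than the study's software:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate two latent classes with distinct symptom-endorsement profiles.
true_p = np.array([[0.9, 0.8, 0.1, 0.1],    # class 0: PTSD-like pattern
                   [0.2, 0.3, 0.9, 0.8]])   # class 1: BPD-like pattern
z = rng.integers(0, 2, size=280)
X = (rng.random((280, 4)) < true_p[z]).astype(float)

K, eps = 2, 1e-9
pi = np.full(K, 1.0 / K)                     # class weights
p = rng.uniform(0.25, 0.75, size=(K, 4))     # endorsement probabilities

for _ in range(200):
    # E-step: posterior class responsibilities for each respondent.
    logw = np.log(pi) + X @ np.log(p + eps).T + (1 - X) @ np.log(1 - p + eps).T
    logw -= logw.max(axis=1, keepdims=True)
    r = np.exp(logw)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update class weights and per-class symptom probabilities.
    pi = r.mean(axis=0)
    p = (r.T @ X) / r.sum(axis=0)[:, None]

classes = r.argmax(axis=1)                   # hard class assignments
print(pi.round(2), p.round(2))
```

    Comparing the recovered profiles `p` against diagnostic criteria is the analogue of the paper's interpretation step.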

  16. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance in novel situations have made cognitive reliability an urgent and essential aspect of system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving which are difficult to recover under the task constraints imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other hand, become the focal points of the proposed framework, which is illustrated in an analysis of a simulated emergency

  17. GEOMETRIC COMPLEXITY ANALYSIS IN AN INTEGRATIVE TECHNOLOGY EVALUATION MODEL (ITEM) FOR SELECTIVE LASER MELTING (SLM)

    Directory of Open Access Journals (Sweden)

    S. Merkt

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Selective laser melting (SLM) is becoming an economically viable choice for manufacturing complex serial parts. This paper focuses on a geometric complexity analysis as part of the integrative technology evaluation model (ITEM) presented here. In contrast to conventional evaluation methodologies, the ITEM considers interactions between product and process innovations generated by SLM. The evaluation of manufacturing processes that compete with SLM is the main goal of ITEM. The paper includes a complexity analysis of a test part from Festo AG. The paper closes with a discussion of how the expanded design freedom of SLM can be used to improve company operations, and how the complexity analysis presented here can be seen as a starting point for feature-based complexity analysis.

    AFRIKAANS ABSTRACT (translated): Selective laser melting is gradually becoming a viable economic choice for the manufacture of complex series parts. The research focuses on the analysis of geometric complexity as part of an integrative technology evaluation model. Compared with conventional evaluation models, the method addresses interactions between the product and process innovations that are generated. The research covers a complexity analysis of a test part from the firm Festo AG. The result shows how complexity analysis can be used as the starting point for feature-based analysis.

  18. Analysis of Proteins, Protein Complexes, and Organellar Proteomes Using Sheathless Capillary Zone Electrophoresis - Native Mass Spectrometry

    Science.gov (United States)

    Belov, Arseniy M.; Viner, Rosa; Santos, Marcia R.; Horn, David M.; Bern, Marshall; Karger, Barry L.; Ivanov, Alexander R.

    2017-12-01

    Native mass spectrometry (MS) is a rapidly advancing field in the analysis of proteins, protein complexes, and macromolecular species of various types. The majority of native MS experiments reported to date have been conducted using direct infusion of purified analytes into a mass spectrometer. In this study, capillary zone electrophoresis (CZE) was coupled online to Orbitrap mass spectrometers using a commercial sheathless interface to enable high-performance separation, identification, and structural characterization of limited amounts of purified proteins and protein complexes, the latter with preserved non-covalent associations under native conditions. The performance of both bare-fused silica and polyacrylamide-coated capillaries was assessed using mixtures of protein standards known to form non-covalent protein-protein and protein-ligand complexes. High-efficiency separation of native complexes is demonstrated using both capillary types, while the polyacrylamide neutral-coated capillary showed better reproducibility and higher efficiency for more complex samples. The platform was then evaluated for the determination of monoclonal antibody aggregation and for analysis of proteomes of limited complexity using a ribosomal isolate from E. coli. Native CZE-MS, using accurate single stage and tandem-MS measurements, enabled identification of proteoforms and non-covalent complexes at femtomole levels. This study demonstrates that native CZE-MS can serve as an orthogonal and complementary technique to conventional native MS methodologies with the advantages of low sample consumption, minimal sample processing and losses, and high throughput and sensitivity. This study presents a novel platform for analysis of ribosomes and other macromolecular complexes and organelles, with the potential for discovery of novel structural features defining cellular phenotypes (e.g., specialized ribosomes).

  19. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    Science.gov (United States)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
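    The point about correlated failures can be made concrete with a small Monte Carlo sketch. The failure probabilities and the common-cause model below are illustrative assumptions, not numbers or methods from SMART:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000             # simulated missions
p_indep = 0.10          # independent failure probability of each string
p_common = 0.05         # shared (common-cause) failure probability

# A common-cause event takes down both strings at once; otherwise
# each string fails independently.
common = rng.random(n) < p_common
fail_a = common | (rng.random(n) < p_indep)
fail_b = common | (rng.random(n) < p_indep)

p_single = fail_a.mean()                 # one string: ~0.145
p_redundant = (fail_a & fail_b).mean()   # both strings lost: ~0.0595
print(p_single, p_redundant)
```

    Under independence the redundant system would fail with probability 0.145² ≈ 0.021; the common-cause term raises that to roughly 0.06, showing how correlation erodes the benefit of a backup.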

  20. Simultaneous analysis of qualitative parameters of solid fuel using complex neutron gamma method

    International Nuclear Information System (INIS)

    Dombrovskij, V.P.; Ajtsev, N.I.; Ryashchikov, V.I.; Frolov, V.K.

    1983-01-01

    A study was made of a complex neutron-gamma method for the simultaneous analysis of the carbon content, ash content, and humidity of solid fuel, based on the gamma radiation from inelastic fast-neutron scattering and from radiative capture of thermal neutrons. Metrological characteristics of pulsed and stationary neutron-gamma methods for determining qualitative solid fuel parameters were analyzed, taking coke breeze as an example. Optimal energy ranges for gamma-radiation detection (2-8 MeV) were determined. The advantages of using a pulsed neutron generator for the complex analysis of the qualitative parameters of solid fuel in large masses were shown.

  1. Training Revising Based Traversability Analysis of Complex Terrains for Mobile Robot

    Directory of Open Access Journals (Sweden)

    Rui Song

    2014-05-01

    Full Text Available Traversability analysis is one of the core issues in autonomous navigation for mobile robots: identifying the accessible area from the information provided by the robot's on-board sensors. This paper proposes a model for analyzing the traversability of complex terrains based on rough sets and training revising. The model describes traversability for mobile robots by a traversability cost. Experiments lead to the conclusion that the traversability analysis model based on rough sets and training revising can be used where terrain features are rich and complex, can effectively handle unstructured environments, and can provide reliable and effective decision rules for the autonomous navigation of mobile robots.
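    The rough-set machinery behind such a model can be illustrated with lower and upper approximations of a "traversable" concept. The discretized terrain attributes and outcomes below are toy values, not the paper's data or rules:

```python
from collections import defaultdict

# Hypothetical terrain cells: (slope, roughness) -> observed outcome.
samples = [
    (("low", "smooth"), "traversable"),
    (("low", "smooth"), "traversable"),
    (("low", "rough"), "traversable"),
    (("low", "rough"), "blocked"),     # inconsistent pair: boundary region
    (("high", "rough"), "blocked"),
]

# Indiscernibility classes: cells sharing the same attribute vector.
groups = defaultdict(list)
for attrs, outcome in samples:
    groups[attrs].append(outcome)

target = "traversable"
# Lower approximation: classes whose members are all traversable (certain).
lower = [a for a, outs in groups.items() if all(o == target for o in outs)]
# Upper approximation: classes with any traversable member (possible).
upper = [a for a, outs in groups.items() if any(o == target for o in outs)]
print(lower, upper)
```

    Decision rules derived from the lower approximation are certain; the gap between upper and lower is the boundary region that training revising would aim to shrink.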

  2. Analysis of Dynamic Complexity of the Cyber Security Ecosystem of Colombia

    Directory of Open Access Journals (Sweden)

    Angélica Flórez

    2016-07-01

    Full Text Available This paper presents two proposals for the analysis of the complexity of the Cyber security Ecosystem of Colombia (CEC). This analysis shows the available knowledge about entities engaged in cyber security in Colombia and the relationships between them, which allow an understanding of the synergy between the different existing components. The complexity of the CEC is detailed from the view of the Influence Diagram of System Dynamics and the Domain Diagram of Software Engineering. The resulting model makes cyber security evident as a strategic component of national security.

  3. Bio-Signal Complexity Analysis in Epileptic Seizure Monitoring: A Topic Review

    Directory of Open Access Journals (Sweden)

    Zhenning Mei

    2018-05-01

    Full Text Available Complexity science has provided new perspectives and opportunities for understanding a variety of complex natural or social phenomena, including brain dysfunctions like epilepsy. By delving into the complexity of electrophysiological signals and neuroimaging, new insights have emerged, revealing that complexity is a fundamental aspect of physiological processes. The inherent nonlinearity and non-stationarity of physiological processes limit methods based on simpler underlying assumptions, and point the way to a more comprehensive understanding of their behavior and their relation to certain diseases. The perspective of complexity may benefit both research and clinical practice by providing novel data-analytics tools devoted to the understanding of, and intervention in, epilepsies. This review aims to provide a brief overview of the methods, drawn from different disciplines, that address the complexity of bio-signals in the field of epilepsy monitoring. Although the complexity of bio-signals is still not fully understood, many new insights have already been obtained. Despite promising results in epileptic seizure detection and prediction through offline analysis, we still lack robust, tried-and-true real-time applications. Multidisciplinary collaborations and more high-quality data accessible to the whole community are needed for reproducible research and the development of such applications.
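    One complexity measure widely applied to EEG in this literature is sample entropy. The following is a plain textbook implementation applied to synthetic signals, not code from any of the reviewed studies; a regular signal should score lower than noise:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy with template length m and tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    r *= x.std()

    def count(mm):
        # Embed the series into overlapping templates of length mm,
        # then count ordered pairs (i != j) within Chebyshev distance r.
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        return np.sum(d <= r) - len(emb)   # drop self-matches

    b, a = count(m), count(m + 1)
    return -np.log(a / b)

rng = np.random.default_rng(0)
noise = rng.normal(size=500)              # irregular signal
t = np.arange(500)
regular = np.sin(2 * np.pi * t / 25)      # highly regular signal
print(sample_entropy(regular), sample_entropy(noise))
```

    In seizure-monitoring studies, windowed measures of this kind are tracked over time and changes are related to ictal and interictal states.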

  4. Generating functional analysis of complex formation and dissociation in large protein interaction networks

    International Nuclear Information System (INIS)

    Coolen, A C C; Rabello, S

    2009-01-01

    We analyze large systems of interacting proteins, using techniques from the non-equilibrium statistical mechanics of disordered many-particle systems. Apart from protein production and removal, the most relevant microscopic processes in the proteome are complex formation and dissociation, and the microscopic degrees of freedom are the evolving concentrations of unbound proteins (in multiple post-translational states) and of protein complexes. Here we only include dimer-complexes, for mathematical simplicity, and we draw the network that describes which proteins are reaction partners from an ensemble of random graphs with an arbitrary degree distribution. We show how generating functional analysis methods can be used successfully to derive closed equations for dynamical order parameters, representing an exact macroscopic description of the complex formation and dissociation dynamics in the infinite system limit. We end this paper with a discussion of the possible routes towards solving the nontrivial order parameter equations, either exactly (in specific limits) or approximately.
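    The microscopic processes named in the abstract (production, removal, dimer formation and dissociation) can be illustrated with a deterministic mass-action sketch for a single protein pair. The rate constants and the forward-Euler integrator are illustrative choices; the paper's generating-functional treatment handles whole random networks of such reactions, not one pair:

```python
# Rates (illustrative): binding, unbinding, production, removal.
k_on, k_off = 1.0, 0.2
k_prod, k_deg = 0.5, 0.1
a, b, ab = 1.0, 1.0, 0.0      # free A, free B, dimer AB concentrations
dt, steps = 0.01, 300_000     # forward-Euler integration to t = 3000

for _ in range(steps):
    form = k_on * a * b - k_off * ab   # net dimer-formation flux
    a += dt * (k_prod - k_deg * a - form)
    b += dt * (k_prod - k_deg * b - form)
    ab += dt * form

# At steady state: k_prod = k_deg * a  and  k_on * a * b = k_off * ab,
# so a = b = k_prod / k_deg = 5 and ab = k_on * a * b / k_off = 125.
print(round(a, 3), round(b, 3), round(ab, 2))
```

    The dynamical order parameters of the paper are, roughly, the macroscopic analogues of these concentration trajectories averaged over the random interaction graph.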

  5. Progress of studies on traditional chinese medicine based on complex network analysis

    Directory of Open Access Journals (Sweden)

    Qian-Ru Zhang

    2017-01-01

    Full Text Available Traditional Chinese medicine (TCM) is a distinct medical system that deals with the life–health–disease–environment relationship using holistic, dynamic, and dialectical thinking. However, reductionism has often restricted the conventional studies on TCM, and these studies did not investigate the central concepts of TCM theory about the multiple relationships among life, health, disease, and environment. Complex network analysis describes a wide variety of complex systems in the real world, and it has the potential to bridge the gap between TCM and modern science owing to the holism of TCM theory. This article summarizes the current research involving TCM network analysis and highlights the computational tools and analysis methods involved in this research. Finally, to inspire a new approach, the article discusses the potential problems underlying the application of TCM network analysis.

  6. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    Science.gov (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis, with an emphasis on the detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such an ensemble-based approach could also be effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have a significantly wider application scope, ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  7. Analysis on complex structure stability under different bar angle with BIM technology

    Directory of Open Access Journals (Sweden)

    Wang Xiongjue

    2016-03-01

    Full Text Available Sun Valley, the landmark building of the World Expo in Shanghai, is a typical complex structure: a free-form surface carried by a single-layer reticulated shell. An integrated CAD/CAM information system was used to design this complex structure, but that process is too demanding to be used widely, and the relevant technology of the Sun Valley is not currently open to the public; we therefore try to use BIM technology to model the Sun Valley, including architectural modelling and structural analysis. The analysis of the Sun Valley structure by this method shows that the modelling problems can be solved by writing script code in the Rhino software, and that the stability of the model can also be analyzed. The new approach, combining software such as Rhino, Revit, and Midas, proves viable and effective for modelling and calculating structures with complex free-form surfaces.

  8. Joint Intelligence Analysis Complex: DOD Needs to Fully Incorporate Best Practices into Future Cost Estimates

    Science.gov (United States)

    2016-11-01

    GAO-17-29, "Joint Intelligence Analysis Complex (JIAC) Cost Estimate Compared to Best Practices" (February 2015). Staff of the House Permanent Select Committee on Intelligence conducted a review of the JIAC consolidation and compared locating the JIAC at RAF ... Committee on Intelligence conducted an evaluation of DOD's decision to consolidate the JIAC at RAF Croughton and developed a business case analysis

  9. Beam model for seismic analysis of complex shear wall structure based on the strain energy equivalence

    International Nuclear Information System (INIS)

    Reddy, G.R.; Mahajan, S.C.; Suzuki, Kohei

    1997-01-01

    A nuclear reactor building structure consists of shear walls with complex geometry, beams, and columns. The complexity of the structure is explained in the section Introduction. Seismic analysis of the complex reactor building structure using the continuum mechanics approach may produce good results, but this method is very difficult to apply. Hence, the finite element approach is found to be a useful technique for solving the dynamic equations of the reactor building structure. In this approach, a model which uses finite elements such as brick, plate, and shell elements may produce accurate results. However, this model also poses some difficulties, which are explained in the section Modeling Techniques. Therefore, seismic analysis of complex structures is generally carried out using a lumped-mass beam model. This model is preferred because of its simplicity and economy. Nevertheless, mathematical modeling of a shear wall structure as a beam requires specialized skill and a thorough understanding of the structure. For accurate seismic analysis, it is necessary to model the stiffness, mass, and damping realistically. In linear seismic analysis, modeling of the mass and damping poses fewer problems than modeling of the stiffness. When used to represent a complex structure, the stiffness of the beam is directly related to the shear wall section properties such as area, shear area, and moment of inertia. Various beam models, classified by the method of stiffness evaluation, are also explained in the section Modeling Techniques. In the section Case Studies the accuracy and simplicity of the beam models are explained. Among the various beam models, the one which evaluates the stiffness using strain energy equivalence proves to be the simplest and most accurate for modeling the complex shear wall structure. (author)

  10. Using multi-criteria analysis of simulation models to understand complex biological systems

    Science.gov (United States)

    Maureen C. Kennedy; E. David. Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
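    Multi-criteria comparison with Pareto optimality reduces to filtering out model runs that are dominated on every criterion. The sketch below assumes two assessment criteria to be minimized (e.g. error against two different observed data sets), with made-up error values:

```python
import numpy as np

def pareto_front(points):
    """Indices of points not dominated on any objective (minimization)."""
    pts = np.asarray(points, dtype=float)
    front = []
    for i, p in enumerate(pts):
        # q dominates p if q is no worse everywhere and strictly better somewhere.
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            front.append(i)
    return front

# Each row: (error vs. data set 1, error vs. data set 2) for one model run.
errors = [(0.10, 0.90),
          (0.40, 0.40),
          (0.90, 0.10),
          (0.50, 0.50),   # dominated by (0.40, 0.40)
          (0.30, 0.80)]
print(pareto_front(errors))
```

    The surviving runs are the trade-off set: no run on the front can be improved on one criterion without worsening another, which is the comparison the authors advocate over collapsing outputs into a single score.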

  11. Functional analytic methods in complex analysis and applications to partial differential equations

    International Nuclear Information System (INIS)

    Mshimba, A.S.A.; Tutschke, W.

    1990-01-01

    The volume contains 24 lectures given at the Workshop on Functional Analytic Methods in Complex Analysis and Applications to Partial Differential Equations held in Trieste, Italy, between 8-19 February 1988, at the ICTP. A separate abstract was prepared for each of these lectures. Refs and figs

  12. Complex and unstable simple elbow dislocations: a review and quantitative analysis of individual patient data

    NARCIS (Netherlands)

    de Haan, Jeroen; Schep, Niels; Tuinebreijer, Wim; den Hartog, Dennis

    2010-01-01

    The primary objective of this review of the literature with quantitative analysis of individual patient data was to identify the results of available treatments for complex elbow dislocations and unstable simple elbow dislocations. The secondary objective was to compare the results of patients with

  13. A new high-throughput LC-MS method for the analysis of complex fructan mixtures

    DEFF Research Database (Denmark)

    Verspreet, Joran; Hansen, Anders Holmgaard; Dornez, Emmie

    2014-01-01

    In this paper, a new liquid chromatography-mass spectrometry (LC-MS) method for the analysis of complex fructan mixtures is presented. In this method, columns with a trifunctional C18 alkyl stationary phase (T3) were used and their performance compared with that of a porous graphitized carbon (PGC...

  14. Complexities of sibling analysis when exposures and outcomes change with time and birth order

    NARCIS (Netherlands)

    Sudan, Madhuri; Kheifets, Leeka I.; Arah, Onyebuchi A.; Divan, Hozefa A.; Olsen, Jørn

    2014-01-01

    In this study, we demonstrate the complexities of performing a sibling analysis with a re-examination of associations between cell phone exposures and behavioral problems observed previously in the Danish National Birth Cohort. Children (52,680; including 5441 siblings) followed up to age 7 were

  15. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  16. Analysis and Control of Epidemics: A survey of spreading processes on complex networks

    OpenAIRE

    Nowzari, Cameron; Preciado, Victor M.; Pappas, George J.

    2015-01-01

    This article reviews and presents various solved and open problems in the development, analysis, and control of epidemic models. We are interested in presenting a relatively concise report for new engineers looking to enter the field of spreading processes on complex networks.

  17. International Conference on Finite or Infinite Dimensional Complex Analysis and Applications

    CERN Document Server

    Tutschke, W; Yang, C

    2004-01-01

    There is almost no field in Mathematics which does not use Mathematical Analysis. Computer methods in Applied Mathematics, too, are often based on statements and procedures of Mathematical Analysis. An important part of Mathematical Analysis is Complex Analysis because it has many applications in various branches of Mathematics. Since the field of Complex Analysis and its applications is a focal point in the Vietnamese research programme, the Hanoi University of Technology organized an International Conference on Finite or Infinite Dimensional Complex Analysis and Applications which took place in Hanoi from August 8 - 12, 2001. This conference was the 9th in a series of conferences which take place alternately in China, Japan, Korea and Vietnam each year. The first one took place at Pusan University in Korea in 1993. The preceding 8th conference was held in Shandong in China in August 2000. The 9th conference of the above mentioned series was the first one which took place in Vietnam....

  18. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].

  19. Meta-analysis, complexity, and heterogeneity: a qualitative interview study of researchers’ methodological values and practices

    Directory of Open Access Journals (Sweden)

    Theo Lorenc

    2016-11-01

    Abstract Background Complex or heterogeneous data pose challenges for systematic review and meta-analysis. In recent years, a number of new methods have been developed to meet these challenges. This qualitative interview study aimed to understand researchers’ understanding of complexity and heterogeneity and the factors which may influence the choices researchers make in synthesising complex data. Methods We conducted interviews with a purposive sample of researchers (N = 19) working in systematic review or meta-analysis across a range of disciplines. We analysed data thematically using a framework approach. Results Participants reported using a broader range of methods and data types in complex reviews than in traditional reviews. A range of techniques are used to explore heterogeneity, but there is some debate about their validity, particularly when applied post hoc. Conclusions Technical considerations of how to synthesise complex evidence cannot be isolated from questions of the goals and contexts of research. However, decisions about how to analyse data appear to be made in a largely informal way, drawing on tacit expertise, and their relation to these broader questions remains unclear.

  20. Meta-analysis, complexity, and heterogeneity: a qualitative interview study of researchers' methodological values and practices.

    Science.gov (United States)

    Lorenc, Theo; Felix, Lambert; Petticrew, Mark; Melendez-Torres, G J; Thomas, James; Thomas, Sian; O'Mara-Eves, Alison; Richardson, Michelle

    2016-11-16

    Complex or heterogeneous data pose challenges for systematic review and meta-analysis. In recent years, a number of new methods have been developed to meet these challenges. This qualitative interview study aimed to understand researchers' understanding of complexity and heterogeneity and the factors which may influence the choices researchers make in synthesising complex data. We conducted interviews with a purposive sample of researchers (N = 19) working in systematic review or meta-analysis across a range of disciplines. We analysed data thematically using a framework approach. Participants reported using a broader range of methods and data types in complex reviews than in traditional reviews. A range of techniques are used to explore heterogeneity, but there is some debate about their validity, particularly when applied post hoc. Technical considerations of how to synthesise complex evidence cannot be isolated from questions of the goals and contexts of research. However, decisions about how to analyse data appear to be made in a largely informal way, drawing on tacit expertise, and their relation to these broader questions remains unclear.

  1. A meta-analysis of crop pest and natural enemy response to landscape complexity.

    Science.gov (United States)

    Chaplin-Kramer, Rebecca; O'Rourke, Megan E; Blitzer, Eleanor J; Kremen, Claire

    2011-09-01

    Many studies in recent years have investigated the relationship between landscape complexity and pests, natural enemies and/or pest control. However, no quantitative synthesis of this literature beyond simple vote-count methods yet exists. We conducted a meta-analysis of 46 landscape-level studies, and found that natural enemies have a strong positive response to landscape complexity. Generalist enemies show consistent positive responses to landscape complexity across all scales measured, while specialist enemies respond more strongly to landscape complexity at smaller scales. Generalist enemy response to natural habitat also tends to occur at larger spatial scales than for specialist enemies, suggesting that land management strategies to enhance natural pest control should differ depending on whether the dominant enemies are generalists or specialists. The positive response of natural enemies does not necessarily translate into pest control, since pest abundances show no significant response to landscape complexity. Very few landscape-scale studies have estimated enemy impact on pest populations, however, limiting our understanding of the effects of landscape on pest control. We suggest focusing future research efforts on measuring population dynamics rather than static counts to better characterise the relationship between landscape complexity and pest control services from natural enemies. © 2011 Blackwell Publishing Ltd/CNRS.

  2. Multivariate Multi-Scale Permutation Entropy for Complexity Analysis of Alzheimer’s Disease EEG

    Directory of Open Access Journals (Sweden)

    Isabella Palamara

    2012-07-01

    An original multivariate multi-scale methodology for assessing the complexity of physiological signals is proposed. The technique is able to incorporate the simultaneous analysis of multi-channel data as a unique block within a multi-scale framework. The basic complexity measure is computed using Permutation Entropy, a methodology for time series processing based on ordinal analysis. Permutation Entropy is conceptually simple, structurally robust to noise and artifacts, and computationally very fast, which is relevant for designing portable diagnostics. Since time series derived from biological systems show structures on multiple spatial-temporal scales, the proposed technique can be useful for other types of biomedical signal analysis. In this work, the possibility of distinguishing the brain states of Alzheimer’s disease patients and Mild Cognitive Impairment subjects from those of normal healthy elderly is checked on a real, although quite limited, experimental database.
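The ordinal-analysis idea behind Permutation Entropy can be sketched briefly. This is a minimal single-channel illustration of the Bandt-Pompe measure, not the paper's multivariate multi-scale extension; the function name and defaults are ours:

```python
from collections import Counter
from math import log, factorial

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy (Bandt & Pompe) of a 1-D series.

    Each length-`order` window is mapped to the ordinal pattern given
    by the argsort of its values; the Shannon entropy of the pattern
    distribution is normalized by log(order!) to lie in [0, 1].
    """
    patterns = Counter()
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # ordinal pattern: the index permutation that sorts the window
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(order))
```

A monotonic series produces a single ordinal pattern and hence zero entropy, while richer signals move toward 1.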

  3. Solution XAS Analysis for Exploring the Active Species in Homogeneous Vanadium Complex Catalysis

    Science.gov (United States)

    Nomura, Kotohiro; Mitsudome, Takato; Tsutsumi, Ken; Yamazoe, Seiji

    2018-06-01

    Selected examples of V K-edge X-ray Absorption Near Edge Structure (XANES) analysis of a series of vanadium complexes containing imido ligands (possessing a metal-nitrogen double bond) in toluene solution are introduced; their pre-edge and edge features are affected by their structures and the nature of the ligands. Selected results in exploring the oxidation states of the active species in ethylene dimerization/polymerization using homogeneous vanadium catalysts [consisting of (imido)vanadium(V) complexes and Al cocatalysts] by X-ray absorption spectroscopy (XAS) analyses are also presented. It has been demonstrated that the method provides clearer information concerning the active species in situ, especially in combination with other methods (NMR and ESR spectra, X-ray crystallographic analysis, and reaction chemistry), and should be a powerful tool for the study of catalysis mechanisms as well as for structural analysis in solution.

  4. On the complex analysis of the reliability, safety, and economic efficiency of atomic electric power stations

    International Nuclear Information System (INIS)

    Emel'yanov, I.Ya.; Klemin, A.I.; Polyakov, E.F.

    1977-01-01

    The problem is posed of effectively increasing the engineering performance of nuclear electric power stations (APS). The principal components of the engineering performance of modern large APS are considered: economic efficiency, radiation safety, reliability, and their interrelationship. A nomenclature is proposed for the quantitative indices which most completely characterize the enumerated properties and are convenient for the analysis of the engineering performance. The urgent problem of developing a methodology for the complex analysis and optimization of the principal performance components is considered; this methodology is designed to increase the efficiency of the work on high-performance competitive APS. The principle of complex optimization of the reliability, safety, and economic-efficiency indices is formulated; specific recommendations are made for the practical realization of this principle. The structure of the complex quantitative analysis of the enumerated performance components is given. The urgency and promise of the complex approach to solving the problem of APS optimization is demonstrated, i.e., the solution of the problem of creating optimally reliable, fairly safe, and maximally economically efficient stations.

  5. Complex Signal Kurtosis and Independent Component Analysis for Wideband Radio Frequency Interference Detection

    Science.gov (United States)

    Schoenwald, Adam; Mohammed, Priscilla; Bradley, Damon; Piepmeier, Jeffrey; Wong, Englin; Gholian, Armen

    2016-01-01

    Radio-frequency interference (RFI) has negatively impacted scientific measurements across a wide variety of passive remote sensing satellites. This has been observed in the L-band radiometers SMOS, Aquarius and, more recently, SMAP [1, 2]. RFI has also been observed at higher frequencies such as K band [3]. Improvements in technology have allowed wider-bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements [4]. This work explores the use of ICA (Independent Component Analysis) as a blind source separation technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
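The kurtosis-based detection principle can be illustrated with a toy sketch: for circularly symmetric complex Gaussian noise the normalized fourth moment E[|z|^4] / E[|z|^2]^2 equals 2, and a significant deviation flags possible non-Gaussian contamination such as RFI. The threshold and function names below are illustrative assumptions, not those of the cited detector [4]:

```python
import numpy as np

def complex_kurtosis(z):
    """Normalized fourth moment E[|z|^4] / E[|z|^2]^2 of complex samples.

    For circular complex Gaussian noise this ratio is 2; a deviation
    suggests a non-Gaussian component such as RFI.
    """
    p = np.abs(z) ** 2
    return (p ** 2).mean() / p.mean() ** 2

def rfi_flag(z, threshold=0.3):
    """Flag a sample block whose kurtosis deviates from the Gaussian
    expectation of 2 by more than `threshold` (illustrative value)."""
    return abs(complex_kurtosis(z) - 2.0) > threshold
```

A pure continuous-wave interferer has constant |z| and kurtosis 1, so it is flagged, while a long Gaussian-noise block is not.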

  6. Analysis and Perspective from the Complex Aerospace Systems Exchange (CASE) 2013

    Science.gov (United States)

    Jones, Kennie H.; Parker, Peter A.; Detweiler, Kurt N.; McGowan, Anna-Maria R.; Dress, David A.; Kimmel, William M.

    2014-01-01

    NASA Langley Research Center embedded four rapporteurs at the Complex Aerospace Systems Exchange (CASE) held in August 2013 with the objective of capturing the essence of the conference presentations and discussions. CASE was established to provide a discussion forum among chief engineers, program managers, and systems engineers on challenges in the engineering of complex aerospace systems. The meeting consists of invited presentations and panels from industry, academia, and government followed by discussions among attendees. This report presents the major and recurring themes captured throughout the meeting and provides analysis and insights to further the CASE mission.

  7. A symbolic dynamics approach for the complexity analysis of chaotic pseudo-random sequences

    International Nuclear Information System (INIS)

    Xiao Fanghong

    2004-01-01

    By considering a chaotic pseudo-random sequence as a symbolic sequence, the authors present a symbolic dynamics approach for the complexity analysis of chaotic pseudo-random sequences. The method is applied to the cases of the Logistic map and a one-way coupled map lattice to demonstrate how it works, and a comparison is made between it and the approximate entropy method. The results show that this method is applicable for distinguishing the complexities of different chaotic pseudo-random sequences, and that it is superior to the approximate entropy method.
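The general workflow described above — symbolize a chaotic orbit, then score the symbol string — can be sketched with the Logistic map and Lempel-Ziv (1976) phrase counting, a standard symbolic-dynamics complexity measure. This is an assumed stand-in, not necessarily the authors' own measure:

```python
def logistic_symbols(x0, n, r=4.0):
    """Binary symbolization of a logistic-map orbit: '1' if x > 0.5."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append('1' if x > 0.5 else '0')
    return ''.join(out)

def lz76_complexity(s):
    """Lempel-Ziv (1976) complexity: the number of distinct phrases
    parsed left-to-right from the symbol string `s`."""
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the phrase while it already occurs earlier in the string
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c
```

A constant symbol string parses into just two phrases, whereas a chaotic orbit yields many, so the count separates low- from high-complexity sequences.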

  8. Analysis and modeling of complex data in behavioral and social sciences

    CERN Document Server

    Okada, Akinori; Ragozini, Giancarlo; Weihs, Claus

    2014-01-01

    This volume presents theoretical developments, applications and computational methods for the analysis and modeling in behavioral and social sciences where data are usually complex to explore and investigate. The challenging proposals provide a connection between statistical methodology and the social domain with particular attention to computational issues in order to effectively address complicated data analysis problems. The papers in this volume stem from contributions initially presented at the joint international meeting JCS-CLADAG held in Anacapri (Italy) where the Japanese Classification Society and the Classification and Data Analysis Group of the Italian Statistical Society had a stimulating scientific discussion and exchange.

  9. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    Science.gov (United States)

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

    The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses, and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation of state development for the complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far, the most widely used analytical technique we have used with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oils streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.

  10. Intraparticulate Metal Speciation Analysis of Soft Complexing Nanoparticles. The Intrinsic Chemical Heterogeneity of Metal-Humic Acid Complexes

    DEFF Research Database (Denmark)

    Town, R. M.; van Leeuwen, Herman P.

    2016-01-01

    ion condensation potential for higher valency counterions within the intraparticulate double layer zone of the soft NP. The approach offers new insights into the intrinsic heterogeneity of the HA complexes, as revealed by the intraparticulate speciation as a function of the true degree of inner-sphere complexation, theta(M). The ensuing intrinsic heterogeneity parameters, Gamma, for CdHA and CuHA complexes are in very good agreement with those obtained from dynamic electrochemical stripping chronopotentiometric measurements. The overall intraparticulate metal ion speciation is found to depend on theta

  11. Iodine K-edge EXAFS analysis of iodide ion-cyclodextrin inclusion complexes in aqueous solution

    International Nuclear Information System (INIS)

    Kaneko, T; Ueda, M; Nagamatsu, S; Konishi, T; Fujikawa, T; Mizumaki, M

    2009-01-01

    We study the structure of inclusion complexes of α-, β-, and γ-cyclodextrin with mono-iodide ions in aqueous solution by means of iodine K-edge EXAFS spectroscopy. The analysis is based on the assumption that two kinds of iodide ions exist in KI-cyclodextrin aqueous solution, i.e., hydrated mono-iodide ions and one-to-one mono-iodide-cyclodextrin inclusion complexes. In the KI-α-cyclodextrin system, iodine K-edge EXAFS analyses show that the average coordination number of the oxygen atoms in water molecules in the first hydration shell decreases as the fraction of included ions increases. This result suggests that a dehydration process accompanies the formation of the inclusion complex. This is not found in the case of β-cyclodextrin, indicating that in this case the iodide ions are included together with the whole first hydration shell.

  12. System Testability Analysis for Complex Electronic Devices Based on Multisignal Model

    International Nuclear Information System (INIS)

    Long, B; Tian, S L; Huang, J G

    2006-01-01

    It is necessary to consider system testability problems for electronic devices during their early design phase because modern electronic devices become smaller and more highly integrated while their function and structure grow more complex. The multisignal model, combining the advantages of the structure model and the dependency model, is used to describe the fault dependency relationships of complex electronic devices, and the main testability indexes (including optimal test program, fault detection rate, fault isolation rate, etc.) used to evaluate testability, together with the corresponding algorithms, are given. The system testability analysis process is illustrated for a USB-GPIB interface circuit with the TEAMS toolbox. The experimental results show that the modelling method is simple and the computation speed rapid, and that this method is of significant value for improving the diagnostic capability of complex electronic devices.

  13. SACS2: Dynamic and Formal Safety Analysis Method for Complex Safety Critical System

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2009-01-01

    Fault tree analysis (FTA) is one of the most widely used safety analysis techniques in the development of safety critical systems. However, over the years, several drawbacks of conventional FTA have become apparent. One major drawback is that conventional FTA uses only static gates and hence cannot capture dynamic behaviors of a complex system precisely. Although several attempts, such as dynamic fault trees (DFT), PANDORA, formal fault trees (FFT) and so on, have been made to overcome this problem, they still cannot do absolute or actual time modeling because they adopt a relative time concept and can capture only sequential behaviors of the system. The second drawback of conventional FTA is its lack of rigorous semantics. Because it is informal in nature, safety analysis results heavily depend on an analyst's ability and are error-prone. Finally, the reasoning process which checks whether basic events really cause top events is done manually and hence is very labor-intensive and time-consuming for complex systems. In this paper, we propose a new safety analysis method for complex safety critical systems in a qualitative manner. We introduce several temporal gates based on timed computational tree logic (TCTL) which can represent a quantitative notion of time. Then, we translate the information of the fault trees into the UPPAAL query language and the reasoning process is automatically done by UPPAAL, the model checker for time-critical systems.

  14. Analysis of the quantitative dermatoglyphics of the digito-palmar complex in patients with multiple sclerosis.

    Science.gov (United States)

    Supe, S; Milicić, J; Pavićević, R

    1997-06-01

    Recent studies on the etiopathogenesis of multiple sclerosis (MS) all point out that there is a polygenetic predisposition for this illness. The so-called "MS trait" determines the reactivity of the immunological system to ecological factors. The development of the glyphological science and the study of the characteristics of the digito-palmar dermatoglyphic complex (which have been established to be polygenetically determined characteristics) enable a better insight into genetic development during early embryogenesis. The aim of this study was to estimate certain differences in the dermatoglyphics of digito-palmar complexes between a group with multiple sclerosis and comparable, phenotypically healthy groups of both sexes. This study is based on the analysis of 18 quantitative characteristics of the digito-palmar complex in 125 patients with multiple sclerosis (41 males and 84 females) in comparison to a group of 400 phenotypically healthy subjects (200 males and 200 females). The analysis pointed towards a statistically significant decrease in the number of digital and palmar ridges, as well as lower values of atd angles, in the group of MS patients of both sexes. The main discriminators were the characteristic palmar dermatoglyphics, with the discriminant analysis correctly classifying over 80% of the examinees, which exceeds statistical significance. The results of this study suggest a possible discrimination of patients with MS from the phenotypically healthy population through analysis of the dermatoglyphic status, and therefore the possibility that multiple sclerosis is a genetically predisposed disease.

  15. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counts for quantitative analysis and builds a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with a better dynamic range.
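One widely used spectral-count scheme of the kind referred to above is the Normalized Spectral Abundance Factor (NSAF), shown here as a minimal sketch; freeQuant's own weighting of shared peptides and ion intensity is more elaborate than this:

```python
def nsaf(spectral_counts, lengths):
    """Normalized Spectral Abundance Factor per protein:
    (SpC / L) / sum over all proteins of (SpC / L).

    `spectral_counts` are MS/MS spectral counts and `lengths` the
    protein sequence lengths; dividing by length corrects for the
    fact that longer proteins yield more peptides.
    """
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]
```

With equal counts, the shorter protein receives the larger normalized abundance, as intended.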

  16. 'Fractional recovery' analysis of a presynaptic synaptotagmin 1-anchored endocytic protein complex.

    Directory of Open Access Journals (Sweden)

    Rajesh Khanna

    BACKGROUND: The integral synaptic vesicle protein and putative calcium sensor, synaptotagmin 1 (STG), has also been implicated in synaptic vesicle (SV) recovery. However, the proteins with which STG interacts during SV endocytosis remain poorly understood. We have isolated an STG-associated endocytic complex (SAE) from presynaptic nerve terminals and have used a novel fractional recovery (FR) assay based on electrostatic dissociation to identify SAE components and map the complex structure. The location of the SAE in the presynaptic terminal was determined by high-resolution quantitative immunocytochemistry at the chick ciliary ganglion giant calyx-type synapse. METHODOLOGY/PRINCIPAL FINDINGS: The first step in FR analysis was to immunoprecipitate (IP) the complex with an antibody against one protein component (the IP-protein). The immobilized complex was then exposed to a high salt (1150 mM) stress-test that caused shedding of co-immunoprecipitated proteins (co-IP-proteins). A Fractional Recovery ratio (FR: recovery after high salt/recovery with control salt, as assayed by Western blot) was calculated for each co-IP-protein. These FR values reflect complex structure, since an easily dissociated protein, with a low FR value, cannot be intermediary between the IP-protein and a salt-resistant protein. The structure of the complex was mapped and a blueprint generated with a pair of FR analyses using two different IP-proteins. The blueprint of the SAE contains an AP180/X/STG/stonin 2/intersectin/epsin core (X is unknown and epsin is hypothesized), and an AP2 adaptor, H-/L-clathrin coat and dynamin scission protein perimeter. The quantitative immunocytochemistry (ICA/ICQ) method at an isolated calyx-type presynaptic terminal indicates that this complex is associated with STG at the presynaptic transmitter release face but not with STG on intracellular synaptic vesicles. CONCLUSIONS/SIGNIFICANCE: We hypothesize that the SAE serves as a recognition site and also as a

  17. Evidence for proposed ICD-11 PTSD and complex PTSD: a latent profile analysis

    Directory of Open Access Journals (Sweden)

    Marylène Cloitre

    2013-05-01

    Background: The WHO International Classification of Diseases, 11th version (ICD-11), has proposed two related diagnoses, posttraumatic stress disorder (PTSD) and complex PTSD, within the spectrum of trauma and stress-related disorders. Objective: To use latent profile analysis (LPA) to determine whether there are classes of individuals that are distinguishable according to the PTSD and complex PTSD symptom profiles, and to identify potential differences in the type of stressor and severity of impairment associated with each profile. Method: An LPA and related analyses were conducted on 302 individuals who had sought treatment for interpersonal traumas ranging from chronic trauma (e.g., childhood abuse) to single-incident events (e.g., exposure to the 9/11 attacks). Results: The LPA revealed three classes of individuals: (1) a complex PTSD class defined by elevated PTSD symptoms as well as disturbances in three domains of self-organization: affective dysregulation, negative self-concept, and interpersonal problems; (2) a PTSD class defined by elevated PTSD symptoms but low scores on the three self-organization symptom domains; and (3) a low symptom class defined by low scores on all symptoms and problems. Chronic trauma was more strongly predictive of complex PTSD than PTSD and, conversely, single-event trauma was more strongly predictive of PTSD. In addition, complex PTSD was associated with greater impairment than PTSD. The LPA was completed both with and without individuals with borderline personality disorder (BPD), yielding identical results, suggesting the stability of these classes regardless of BPD comorbidity. Conclusion: Preliminary data support the proposed ICD-11 distinction between PTSD and complex PTSD and support the value of testing the clinical utility of this distinction in field trials. Replication of results is necessary.

  18. Hurst Exponent Analysis of Resting-State fMRI Signal Complexity across the Adult Lifespan

    Directory of Open Access Journals (Sweden)

    Jianxin Dong

    2018-02-01

    Exploring functional information among various brain regions across time enables understanding of the healthy aging process and holds great promise for age-related brain disease diagnosis. This paper proposes a method to explore the fractal complexity of the resting-state functional magnetic resonance imaging (rs-fMRI) signal in the human brain across the adult lifespan using the Hurst exponent (HE). We used rs-fMRI data from 116 adults 19 to 85 years of age (44.3 ± 19.4 years, 49 females) from the NKI/Rockland sample. Region-wise and voxel-wise analyses were performed to investigate the effects of age, gender, and their interaction on complexity. In the region-wise analysis, we found that healthy aging is accompanied by a loss of complexity in the frontal and parietal lobes and increased complexity in the insula, limbic, and temporal lobes. Meanwhile, differences in HE between genders were found to be significant in the parietal lobe (p = 0.04, corrected). However, there was no interaction between gender and age. In the voxel-wise analysis, a significant complexity decrease with aging was found in the frontal and parietal lobes, and a complexity increase with aging was found in the insula, limbic lobe, occipital lobe, and temporal lobe. Meanwhile, differences in HE between genders were found to be significant in the frontal, parietal, and limbic lobes. Furthermore, we found an age and sex interaction in the right parahippocampal gyrus (p = 0.04, corrected). Our findings reveal HE variations of the rs-fMRI signal across the human adult lifespan and show that HE may serve as a new parameter to assess the healthy aging process.
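The Hurst exponent can be estimated several ways; a compact rescaled-range (R/S) sketch (the study's exact estimator is not specified here) regresses log(R/S) against log(window size):

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series by rescaled-range
    (R/S) analysis: the slope of log(R/S) versus log(window size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_per_chunk = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation
            r = dev.max() - dev.min()               # range
            s = chunk.std()                         # standard deviation
            if s > 0:
                rs_per_chunk.append(r / s)
        if rs_per_chunk:
            sizes.append(size)
            rs_vals.append(np.mean(rs_per_chunk))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope
```

White noise should give an estimate near 0.5 (with a known small-sample upward bias), while its cumulative sum, a Brownian path, gives a clearly larger exponent.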

  19. Design Analysis Method for Multidisciplinary Complex Product using SysML

    Directory of Open Access Journals (Sweden)

    Liu Jihong

    2017-01-01

    In the design of multidisciplinary complex products, model-based systems engineering methods are widely used. However, these methodologies contain only a modeling order and simple analysis steps, and lack integrated design analysis methods supporting the whole process. In order to solve this problem, a conceptual design analysis method integrating modern design methods has been proposed. First, based on the requirement analysis of the quantization matrix, the user’s needs are quantitatively evaluated and translated into system requirements. Then, by function decomposition against the function knowledge base, the total function is semi-automatically decomposed into predefined atomic functions. Each function is matched to a predefined structure through the behaviour layer using function-structure mapping based on interface matching. Finally, based on the design structure matrix (DSM), the structure reorganization is completed. The process of analysis is implemented with SysML, and illustrated through an aircraft air conditioning system for system validation.

  20. Managing Complexity in Evidence Analysis: A Worked Example in Pediatric Weight Management.

    Science.gov (United States)

    Parrott, James Scott; Henry, Beverly; Thompson, Kyle L; Ziegler, Jane; Handu, Deepa

    2018-05-02

    Nutrition interventions are often complex and multicomponent. Typical approaches to meta-analysis that focus on individual causal relationships to provide guideline recommendations are not sufficient to capture this complexity. The objective of this study is to describe the method of meta-analysis used for the Pediatric Weight Management (PWM) Guidelines update and provide a worked example that can be applied in other areas of dietetics practice. The effects of PWM interventions were examined for body mass index (BMI), body mass index z-score (BMIZ), and waist circumference at four different time periods. For intervention-level effects, intervention types were identified empirically using multiple correspondence analysis paired with cluster analysis. Pooled effects of identified types were examined using random-effects meta-analysis models. Differences in effects among types were examined using meta-regression. Context-level effects were examined using qualitative comparative analysis. Three distinct types (or families) of PWM interventions were identified: medical nutrition, behavioral, and missing components. Medical nutrition and behavioral types showed statistically significant improvements in BMIZ across all time points. Results were less consistent for BMI and waist circumference, although four distinct patterns of weight status change were identified. These varied by intervention type as well as outcome measure. Meta-regression indicated statistically significant differences between the medical nutrition and behavioral types vs the missing component type for both BMIZ and BMI, although the pattern varied by time period and intervention type. Qualitative comparative analysis identified distinct configurations of context characteristics at each time point that were consistent with positive outcomes among the intervention types. Although analysis of individual causal relationships is invaluable, this approach is inadequate to capture the complexity of dietetics
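The random-effects pooling step described above is commonly done with the DerSimonian-Laird estimator. A minimal sketch with hypothetical study-level effect sizes (not data from the PWM guideline update):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with the DerSimonian-Laird tau^2."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * effects) / np.sum(w)          # fixed-effect estimate
    q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # between-study variance
    w_star = 1.0 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical BMIZ changes and their sampling variances
pooled, se, tau2 = dersimonian_laird([-0.30, -0.50, -0.10], [0.04, 0.09, 0.02])
print(round(pooled, 3), round(se, 3))
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau^2 is truncated at zero and the result coincides with the fixed-effect estimate.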

  1. Multi-scale complexity analysis of muscle coactivation during gait in children with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Wen eTao

    2015-07-01

    Full Text Available The objective of this study is to characterize the complexity of lower-extremity muscle coactivation and coordination during gait in children with cerebral palsy (CP), children with typical development (TD), and healthy adults, by applying recently developed multivariate multi-scale entropy (MMSE) analysis to surface EMG signals. Eleven CP children (CP group), eight TD children, and seven healthy adults (considered together as the control group) were asked to walk while surface EMG signals were collected from 5 thigh muscles and 3 lower-leg muscles on each leg (16 EMG channels in total). The 16-channel surface EMG data, recorded during a series of consecutive gait cycles, were simultaneously processed by multivariate empirical mode decomposition (MEMD) to generate fully aligned data scales for subsequent MMSE analysis. To examine muscle coactivation complexity extensively using the MEMD-enhanced MMSE, 14 data analysis schemes were designed by varying partial muscle combinations and time durations of data segments. Both TD children and healthy adults showed almost consistent MMSE curves over multiple scales for all 14 schemes, without any significant difference (p > 0.09). However, considerable diversity in MMSE curves was observed in the CP group when compared with the control group. There appear to be diverse neuropathological processes in CP that may affect the dynamical complexity of muscle coactivation and coordination during gait. The abnormal complexity patterns emerging in the CP group can be attributed to factors such as motor control impairments, loss of muscle couplings, and spasticity or paralysis in individual muscles. These findings expand our knowledge of the neuropathology of CP from a novel point of view of muscle coactivation complexity, and indicate the potential to derive a quantitative index for assessing muscle activation characteristics as well as motor function in CP.
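The entropy half of this pipeline can be sketched in its univariate form: coarse-grain the signal at each scale, then compute sample entropy with a tolerance fixed from the original series. This is a generic multiscale-entropy sketch, not the MEMD-enhanced multivariate MMSE used in the study.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive, non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=None):
    """SampEn: negative log of the conditional probability that sequences
    matching for m points (within tolerance r) also match for m + 1."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()

    def matches(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(emb)) / 2.0   # exclude self-matches
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
noise = rng.normal(size=600)
# tolerance fixed from the original series, as in the usual MSE convention,
# so white-noise entropy decreases across scales
r = 0.2 * noise.std()
mse = [sample_entropy(coarse_grain(noise, s), r=r) for s in (1, 2, 3)]
print([round(e, 2) for e in mse])
```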

  2. Data Analysis with the Morse-Smale Complex: The msr Package for R

    KAUST Repository

    Gerber, Samuel

    2012-01-01

    In many areas, scientists deal with increasingly high-dimensional data sets. An important aspect for these scientists is to gain a qualitative understanding of the process or system from which the data is gathered. Often, both input variables and an outcome are observed and the data can be characterized as a sample from a high-dimensional scalar function. This work presents the R package msr for exploratory data analysis of multivariate scalar functions based on the Morse-Smale complex. The Morse-Smale complex provides a topologically meaningful decomposition of the domain. The msr package implements a discrete approximation of the Morse-Smale complex for data sets. In previous work this approximation has been exploited for visualization and partition-based regression, which are both supported in the msr package. The visualization combines the Morse-Smale complex with dimension-reduction techniques for a visual summary representation that serves as a guide for interactive exploration of the high-dimensional function. In a similar fashion, the regression employs a combination of linear models based on the Morse-Smale decomposition of the domain. This regression approach yields topologically accurate estimates and facilitates interpretation of general trends and statistical comparisons between partitions. In this manner, the msr package supports high-dimensional data understanding and exploration through the Morse-Smale complex.
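The discrete approximation the package implements can be sketched as follows: each sample follows steepest ascent over a k-nearest-neighbour graph to a local maximum, giving the ascending part of a Morse-Smale partition. A minimal Python analogue of the idea (the msr package itself is R and also tracks descending manifolds):

```python
import numpy as np

def ascend_partition(X, y, k=8):
    """Partition samples by the local maximum reached via steepest ascent
    over the k-nearest-neighbour graph, i.e. the 'ascending' half of a
    discrete Morse-Smale decomposition."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    nbrs = np.argsort(d, axis=1)[:, 1:k + 1]
    succ = np.arange(n)                      # steepest-ascent successor
    for i in range(n):
        best = nbrs[i][np.argmax(y[nbrs[i]])]
        if y[best] > y[i]:
            succ[i] = best
    labels = np.empty(n, dtype=int)
    for i in range(n):
        j = i
        while succ[j] != j:                  # walk uphill to a local maximum
            j = succ[j]
        labels[i] = j
    return labels

# Synthetic scalar function with two peaks on a 2-D grid
xs = np.linspace(0.0, 1.0, 13)
X = np.array([[a, b] for a in xs for b in xs])
y = (np.exp(-np.sum((X - [0.25, 0.5]) ** 2, axis=1) / 0.02)
     + np.exp(-np.sum((X - [0.75, 0.5]) ** 2, axis=1) / 0.02))
labels = ascend_partition(X, y)
print(len(set(labels)))    # number of ascending partitions found
```

Each resulting partition is a candidate domain for the local linear models that the msr regression approach fits.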

  3. Analysis of the landscape complexity and heterogeneity of the Pantanal wetland.

    Science.gov (United States)

    Miranda, C S; Gamarra, R M; Mioto, C L; Silva, N M; Conceição Filho, A P; Pott, A

    2018-05-01

    This is the first report on analysis of habitat complexity and heterogeneity of the Pantanal wetland. The Pantanal encompasses a peculiar mosaic of environments, being important to evaluate and monitor this area concerning conservation of biodiversity. Our objective was to indirectly measure the habitat complexity and heterogeneity of the mosaic forming the sub-regions of the Pantanal, by means of remote sensing. We obtained free images of Normalized Difference Vegetation Index (NDVI) from the sensor MODIS and calculated the mean value (complexity) and standard deviation (heterogeneity) for each sub-region in the years 2000, 2008 and 2015. The sub-regions of Poconé, Canoeira, Paraguai and Aquidauana presented the highest values of complexity (mean NDVI), between 0.69 and 0.64 in the evaluated years. The highest horizontal heterogeneity (NDVI standard deviation) was observed in the sub-region of Tuiuiú, with values of 0.19 in the years 2000 and 2015, and 0.21 in the year 2008. We concluded that the use of NDVI to estimate landscape parameters is an efficient tool for assessment and monitoring of the complexity and heterogeneity of the Pantanal habitats, applicable in other regions.
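The two landscape metrics reduce to a mean (complexity) and a standard deviation (heterogeneity) of NDVI values per sub-region. A sketch with a synthetic raster and label map; real inputs would be MODIS NDVI tiles and sub-region polygons.

```python
import numpy as np

# Hypothetical NDVI raster and sub-region label map of the same shape
rng = np.random.default_rng(1)
ndvi = np.clip(rng.normal(0.6, 0.1, size=(100, 100)), -1.0, 1.0)
regions = np.zeros((100, 100), dtype=int)
regions[:, 50:] = 1                      # two toy sub-regions

for r in np.unique(regions):
    values = ndvi[regions == r]
    complexity = values.mean()           # mean NDVI -> habitat complexity
    heterogeneity = values.std()         # NDVI std  -> horizontal heterogeneity
    print(f"sub-region {r}: complexity={complexity:.2f}, "
          f"heterogeneity={heterogeneity:.2f}")
```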

  4. Data Analysis with the Morse-Smale Complex: The msr Package for R

    Directory of Open Access Journals (Sweden)

    Samuel Gerber

    2012-07-01

    Full Text Available In many areas, scientists deal with increasingly high-dimensional data sets. An important aspect for these scientists is to gain a qualitative understanding of the process or system from which the data is gathered. Often, both input variables and an outcome are observed and the data can be characterized as a sample from a high-dimensional scalar function. This work presents the R package msr for exploratory data analysis of multivariate scalar functions based on the Morse-Smale complex. The Morse-Smale complex provides a topologically meaningful decomposition of the domain. The msr package implements a discrete approximation of the Morse-Smale complex for data sets. In previous work this approximation has been exploited for visualization and partition-based regression, which are both supported in the msr package. The visualization combines the Morse-Smale complex with dimension-reduction techniques for a visual summary representation that serves as a guide for interactive exploration of the high-dimensional function. In a similar fashion, the regression employs a combination of linear models based on the Morse-Smale decomposition of the domain. This regression approach yields topologically accurate estimates and facilitates interpretation of general trends and statistical comparisons between partitions. In this manner, the msr package supports high-dimensional data understanding and exploration through the Morse-Smale complex.

  5. Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks.

    Science.gov (United States)

    Schiff, Rachel; Katan, Pesia

    2014-01-01

    Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.
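Topological entropy of a regular grammar can be computed as the log of the spectral radius of its state-transition matrix. A sketch using the classic "golden mean" grammar as a stand-in; the paper's matrix-lift-action method automates building such matrices from AGL charts.

```python
import numpy as np

# Transition matrix of a toy grammar: entry (i, j) = 1 if state j may follow
# state i. This is the 'golden mean' grammar (no two consecutive 1-symbols).
A = np.array([[1, 1],
              [1, 0]], dtype=float)

# Topological entropy = log of the spectral radius (largest |eigenvalue|)
te = np.log(max(abs(np.linalg.eigvals(A))))
print(round(te, 4))    # log of the golden ratio, about 0.4812
```

Grammars with larger TE admit more distinct strings per unit length, which is the complexity dimension the meta-regression correlates with task performance.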

  6. Nonlinear analysis of gas-water/oil-water two-phase flow in complex networks

    CERN Document Server

    Gao, Zhong-Ke; Wang, Wen-Xu

    2014-01-01

    Understanding the dynamics of multi-phase flows has been a challenge in the fields of nonlinear dynamics and fluid mechanics. This chapter reviews our work on two-phase flow dynamics in combination with complex network theory. We systematically carried out gas-water/oil-water two-phase flow experiments to measure time series of flow signals, which were then studied in terms of mappings from time series to complex networks. Three network mapping methods were proposed for the analysis and identification of flow patterns: Flow Pattern Complex Network (FPCN), Fluid Dynamic Complex Network (FDCN), and Fluid Structure Complex Network (FSCN). By detecting the community structure of FPCN based on K-means clustering, distinct flow patterns can be successfully distinguished and identified. A number of FDCNs under different flow conditions were constructed in order to reveal the dynamical characteristics of two-phase flows. The FDCNs exhibit universal power-law degree distributions. The power-law exponent ...

  7. [Ultraviolet-visible spectrometry analysis of insoluble xanthate heavy metal complexes].

    Science.gov (United States)

    Qiu, Bo; Liu, Jin-Feng; Liu, Yao-Chi; Yang, Zhao-Guang; Li, Hai-Pu

    2014-11-01

    An ultraviolet-visible spectrometry method for determining insoluble xanthate heavy metal complexes in flotation wastewater is put forward for the first time. In this work, the changes in the ultraviolet-visible spectra of xanthate solution after the addition of various heavy metal ions were investigated first. It was found that Pb2+ and Cu2+ can form insoluble complexes with xanthate, while Fe2+, Zn2+, and Mn2+ have little effect on the ultraviolet absorption of xanthate solution. The removal efficiencies of filter membranes with different pore sizes were then compared, and the 0.22 μm membrane was found to be effective in separating copper xanthate or lead xanthate from the filtrate. Furthermore, the study of the reaction of sodium sulfide with insoluble xanthate heavy metal complexes showed that S(2-) can release the xanthate ion quantitatively from insoluble complexes into solution. Based on the above research, it was concluded that the amount of insoluble xanthate heavy metal complexes in water samples can be obtained from the increase in free xanthate in the filtrate after the addition of sodium sulfide. Finally, the feasibility of this method was verified by application to the analysis of flotation wastewater from three ore-dressing plants in the Thirty-six Coves in Chenzhou.

  8. The use of network analysis to study complex animal communication systems: a study on nightingale song.

    Science.gov (United States)

    Weiss, Michael; Hultsch, Henrike; Adam, Iris; Scharff, Constance; Kipper, Silke

    2014-06-22

    The singing of song birds can form complex signal systems composed of numerous subunits sung with distinct combinatorial properties that have been described as syntax-like. This complexity has inspired inquiries into similarities of bird song to human language; but the quantitative analysis and description of song sequences is a challenging task. In this study, we analysed song sequences of common nightingales (Luscinia megarhynchos) by means of a network analysis. We translated long nocturnal song sequences into networks of song types with song transitions as connectors. As network measures, we calculated shortest path length and transitivity and identified the 'small-world' character of nightingale song networks. Besides comparing network measures with conventional measures of song complexity, we also found a correlation between network measures and the age of birds. Furthermore, we determined the numbers of in-coming and out-going edges of each song type, characterizing transition patterns. These transition patterns were shared across males for certain song types. Playbacks with different transition patterns provided the first evidence that these patterns are responded to differently and thus play a role in singing interactions. We discuss potential functions of the network properties of song sequences in the framework of vocal leadership. Network approaches provide biologically meaningful parameters to describe the song structure of species with extremely large repertoires and complex rules of song retrieval.
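The two network measures used here, average shortest path length and transitivity (the ingredients of the "small-world" diagnosis), can be computed directly from a song-type transition graph. A self-contained sketch with a hypothetical song sequence of types A to D:

```python
from collections import defaultdict, deque
from itertools import combinations

def song_network(sequence):
    """Undirected network of song types; an edge marks an observed transition."""
    adj = defaultdict(set)
    for a, b in zip(sequence, sequence[1:]):
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    return dict(adj)

def avg_shortest_path(adj):
    """Mean breadth-first-search distance over all reachable node pairs."""
    total = pairs = 0
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for t, d in dist.items() if t != s)
        pairs += len(dist) - 1
    return total / pairs

def transitivity(adj):
    """Fraction of connected triples that close into triangles."""
    closed = triples = 0
    for u in adj:
        for v, w in combinations(adj[u], 2):
            triples += 1
            closed += w in adj[v]
    return closed / triples if triples else 0.0

adj = song_network(list("ABCADBCA"))   # hypothetical nocturnal sequence
print(avg_shortest_path(adj), transitivity(adj))
```

Short average paths combined with high transitivity relative to a random graph of the same size is the usual operational definition of small-world structure.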

  9. Stein manifolds and holomorphic mappings the homotopy principle in complex analysis

    CERN Document Server

    Forstnerič, Franc

    2017-01-01

    This book, now in a carefully revised second edition, provides an up-to-date account of Oka theory, including the classical Oka-Grauert theory and the wide array of applications to the geometry of Stein manifolds. Oka theory is the field of complex analysis dealing with global problems on Stein manifolds which admit analytic solutions in the absence of topological obstructions. The exposition in the present volume focuses on the notion of an Oka manifold introduced by the author in 2009. It explores connections with elliptic complex geometry initiated by Gromov in 1989, with the Andersén-Lempert theory of holomorphic automorphisms of complex Euclidean spaces and of Stein manifolds with the density property, and with topological methods such as homotopy theory and the Seiberg-Witten theory. Researchers and graduate students interested in the homotopy principle in complex analysis will find this book particularly useful. It is currently the only work that offers a comprehensive introduction to both the Oka t...

  10. Developing a framework for qualitative engineering: Research in design and analysis of complex structural systems

    Science.gov (United States)

    Franck, Bruno M.

    1990-01-01

    The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.

  11. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    Science.gov (United States)

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and
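The core idea of a timed parameter perturbation can be illustrated on a toy ODE. The sketch below (plain forward Euler on logistic growth, not PeTTSy's MATLAB machinery) perturbs a parameter only inside a user-chosen time window and measures the trajectory's response:

```python
import numpy as np

def simulate(r_base, perturb=0.0, window=(20.0, 30.0), dt=0.01, t_end=60.0):
    """Forward-Euler logistic growth dx/dt = r*x*(1 - x/K), with the growth
    rate r temporarily shifted by `perturb` inside `window` (a toy stand-in
    for timed parameter perturbations of an ODE model)."""
    k = 10.0                               # carrying capacity
    x = 0.1
    ts = np.arange(0.0, t_end, dt)
    out = np.empty_like(ts)
    for i, t in enumerate(ts):
        r = r_base + (perturb if window[0] <= t < window[1] else 0.0)
        x += dt * r * x * (1.0 - x / k)
        out[i] = x
    return ts, out

ts, ref = simulate(0.3)
_, pert = simulate(0.3, perturb=-0.2)      # slow growth between t=20 and t=30
effect = np.max(np.abs(pert - ref))        # trajectory sensitivity to the pulse
print(round(effect, 3))
```

Repeating this over many parameters, windows, and perturbation shapes, and summarising the output differences (period, phase, solution norms), is the kind of sweep such toolboxes automate.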

  12. Mobility and Position Error Analysis of a Complex Planar Mechanism with Redundant Constraints

    Science.gov (United States)

    Sun, Qipeng; Li, Gangyan

    2018-03-01

    Nowadays mechanisms with redundant constraints have been created and have attracted much attention for their merits. The role of redundant constraints in a mechanical system is analyzed in this paper. An analysis method for planar linkages with repetitive structure is proposed to determine the number and type of constraints. According to differences in application and constraint characteristics, redundant constraints are divided into theoretical planar redundant constraints and space-planar redundant constraints. A calculation formula for the number of redundant constraints and a method for judging their type are derived. A complex mechanism with redundant constraints is then analyzed to assess their influence on mechanical performance. Combining theoretical derivation and simulation, an analysis method is put forward for the position error of complex mechanisms with redundant constraints. This indicates how to eliminate or reduce the influence of redundant constraints.

  13. Use of a russian software and hardware complex for quantitative analysis of coronary angiograms

    International Nuclear Information System (INIS)

    Savchenko, A.P.; Pavlov, N.A.; Myasnikova, A.L.

    1996-01-01

    The software and hardware complex was developed by the Cardiology Research Center, Russian Academy of Medical Sciences, jointly with the Technomash Research Production Association, on the basis of an IBM 386DX personal computer equipped with a VS-100 video controller and a DS P31 VS signal processor board. Testing indicated that it provides a qualitative image and a quantitative analysis of both phantoms and real coronary angiogram images, with greater accuracy in the analysis of images obtained from a film projector. Clinical tests have shown that the software and hardware complex yields a high-quality image and calculates the required vessel diameter virtually without prolonging the time of intervention. 4 refs.; 3 figs. 1 tab

  14. Analysis of a hysteresis motor on asynchronous speed using complex permeability

    International Nuclear Information System (INIS)

    Horii, T.; Yuge, N.; Wakui, G.

    1994-01-01

    Although hysteresis motors have a comparatively small output for their mechanical dimensions compared with other types of motor, they offer the advantages of extremely low vibration and noise levels, and so are widely used as driving motors in acoustic equipment and uranium gas centrifuges. This paper deals with a method for determining the complex permeability in the analysis of hysteresis motors. The method assumes that the magnetic intensity distribution is sinusoidal in the direction of rotation. Analysis of a hysteresis motor at asynchronous speed is then performed in cylindrical coordinates, using modified Bessel functions. The results of calculations are in good agreement with experimental results, confirming the effectiveness of the proposed model and method for determining the complex permeability.

  15. A pragmatic approach to including complex natural modes of vibration in aeroelastic analysis

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2015-09-01

    Full Text Available This presentation, delivered by Louw van Zyl at the International Aerospace Symposium of South Africa (14 to 16 September 2015, Stellenbosch, South Africa), outlines a pragmatic approach to including complex natural modes of vibration in aeroelastic analysis. The problem statement starts from the structural equations of motion, [M]{ẍ} + [C]{ẋ} + [K]{x} = {f}, whose undamped free-vibration form [M]{x}s² + [K]{x} = 0 yields eigenvalues equal to the squares of the angular frequencies in radians per second; the corresponding eigenvectors are real.

  16. Regular Functions with Values in Ternary Number System on the Complex Clifford Analysis

    Directory of Open Access Journals (Sweden)

    Ji Eun Kim

    2013-01-01

    Full Text Available We define a new modified basis î, which is an association of two bases, e1 and e2. We give an expression of the form z = x0 + î z̄0, where x0 is a real number and z̄0 is a complex number, on the three-dimensional real skew field. We then study the properties of regular functions with values in the ternary field and in the reduced quaternions by Clifford analysis.

  17. Fast-track to a solid dispersion formulation using multi-way analysis of complex interactions

    DEFF Research Database (Denmark)

    Wu, Jian-Xiong; Den Berg, Frans Van; Søgaard, Søren Vinter

    2013-01-01

    Several factors with complex interactions influence the physical stability of solid dispersions, highlighting the need for efficient experimental design together with a robust and simple multivariate model. Design of Experiments together with an ANalysis Of VAriance (ANOVA) model is one of the ce... (e.g., an entire spectral data set), model uniqueness, and curve resolution abilities. © 2012 Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci 102:904-914, 2013.

  18. Complexities of sibling analysis when exposures and outcomes change with time and birth order

    OpenAIRE

    Sudan, M; Kheifets, LI; Arah, OA; Divan, HA; Olsen, J

    2014-01-01

    In this study, we demonstrate the complexities of performing a sibling analysis with a re-examination of associations between cell phone exposures and behavioral problems observed previously in the Danish National Birth Cohort. Children (52,680; including 5441 siblings) followed up to age 7 were included. We examined differences in exposures and behavioral problems between siblings and non-siblings and by birth order and birth year. We estimated associations between cell phone exposures and b...

  19. Fatigue analysis of assembled marine floating platform for special purposes under complex water environments

    Science.gov (United States)

    Ma, Guang-ying; Yao, Yun-long

    2018-03-01

    In this paper, the fatigue lives of a new type of assembled marine floating platform for special purposes were studied. First, the hydrodynamic model of the platform was established using ANSYS AQWA software. Second, the structural stresses under alternating loads were calculated for complex water environments involving wind, wave, current, and ice. The minimum fatigue lives were obtained under different working conditions. The analysis results showed that the fatigue life of the platform structure can meet the requirements.

  20. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
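A first-order (Sobol-type) sensitivity index, the basic quantity in variance-based GSA, is S_i = Var(E[y|x_i]) / Var(y), and can be estimated by binning. A sketch on a toy additive model with a known analytic answer (for y = x1 + 0.5*x2 with uniform inputs, S1 = 0.8 and S2 = 0.2):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.uniform(size=n)
x2 = rng.uniform(size=n)
y = x1 + 0.5 * x2              # toy model, not an environmental model

def first_order_index(x, y, bins=50):
    """Estimate S_i = Var(E[y|x_i]) / Var(y) by binning x_i on quantile
    edges and averaging y within each bin."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond / y.var()

s1 = first_order_index(x1, y)
s2 = first_order_index(x2, y)
print(round(s1, 2), round(s2, 2))   # approximately 0.80 and 0.20
```

Tracking how such indices shift as new components are added is one concrete way to quantify the interaction effects the trilemma framework is concerned with.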

  1. Atmospheric pressure chemical ionization Fourier transform ion cyclotron resonance mass spectrometry for complex thiophenic mixture analysis

    KAUST Repository

    Hourani, Nadim

    2013-10-01

    Rationale: Polycyclic aromatic sulfur heterocycles (PASHs) are detrimental species for refining processes in the petroleum industry. Current mass spectrometric methods that determine their composition are often preceded by derivatization and dopant addition approaches. Different ionization methods have different impacts on the molecular assignment of complex PASHs. The analysis of such species under atmospheric pressure chemical ionization (APCI) is still considered limited due to uncontrolled ion generation with low- and high-mass PASHs. Methods: The ionization behavior of a model mixture of five selected PASH standards was investigated using an APCI source with nitrogen as the reagent gas. A complex thiophenic fraction was separated from a vacuum gas oil (VGO) and injected using the same method. The samples were analyzed using Fourier transform ion cyclotron resonance mass spectrometry (FTICR MS). Results: PASH model analytes were successfully ionized, producing mainly [M + H]+ ions. The same ionization pattern was observed for the real thiophenic sample. S1 class species were found to be the major sulfur-containing species in the VGO sample, indicating the presence of alkylated benzothiophenic (BT), dibenzothiophenic (DBT), and benzonaphthothiophenic (BNT) series detected by APCI-FTICR MS. Conclusions: This study provides an established APCI-FTICR MS method for the analysis of complex PASHs. PASHs were detected without derivatization and without fragmentation. The method can be used for the analysis of S-containing crude oil samples. © 2013 John Wiley & Sons, Ltd.

  2. Fuzzy approximate entropy analysis of chaotic and natural complex systems: detecting muscle fatigue using electromyography signals.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Jing-Yi; Zheng, Yong-Ping

    2010-04-01

    In the present contribution, a complexity measure is proposed to assess surface electromyography (EMG) in the study of muscle fatigue during sustained, isometric muscle contractions. Approximate entropy (ApEn) is believed to provide quantitative information about the complexity of experimental data that is often corrupted with noise, short data length, and in many cases, has inherent dynamics that exhibit both deterministic and stochastic behaviors. We developed an improved ApEn measure, fuzzy approximate entropy (fApEn), which utilizes a fuzzy membership function to define vector similarity. Tests were conducted on independent, identically distributed (i.i.d.) Gaussian and uniform noises, a chirp signal, MIX processes, the Rössler equation, and the Hénon map. Compared with the standard ApEn, the fApEn showed better monotonicity, relative consistency, and greater robustness to noise when characterizing signals with different complexities. Performance analysis on experimental EMG signals demonstrated that the fApEn significantly decreased during the development of muscle fatigue, a trend similar to that of the mean frequency (MNF) of the EMG signal, while the standard ApEn failed to detect this change. Moreover, the fApEn of EMG demonstrated better robustness to the length of the analysis window than the MNF of EMG. The results suggest that the fApEn of an EMG signal may potentially become a new reliable method for muscle fatigue assessment and be applicable to other short noisy physiological signal analyses.
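The fApEn idea is to replace ApEn's hard similarity threshold with a graded fuzzy membership. A minimal sketch under common parameter choices (m = 2, tolerance r = 0.2 times the signal SD, Gaussian-like membership exp(-(d/r)^2)); this is a generic illustration, not the authors' exact implementation:

```python
import numpy as np

def fuzzy_apen(x, m=2, r_factor=0.2, power=2):
    """Fuzzy approximate entropy: like ApEn, but vector similarity is graded
    by the fuzzy membership exp(-(d/r)^power) instead of a 0/1 threshold."""
    x = np.asarray(x, float)
    r = r_factor * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])
        emb = emb - emb.mean(axis=1, keepdims=True)   # remove local baseline
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        sim = np.exp(-(d / r) ** power)               # fuzzy similarity
        np.fill_diagonal(sim, 0.0)                    # exclude self-matches
        return sim.sum() / (n * (n - 1))
    return np.log(phi(m)) - np.log(phi(m + 1))

rng = np.random.default_rng(0)
noise = rng.normal(size=300)                 # irregular: high fApEn
tone = np.sin(np.linspace(0, 30 * np.pi, 300))   # regular: low fApEn
print(round(fuzzy_apen(noise), 2), round(fuzzy_apen(tone), 2))
```

Because the membership is continuous, small shifts of a vector across the tolerance boundary change the statistic smoothly, which is the source of the improved consistency reported in the abstract.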

  3. Integrative analysis for finding genes and networks involved in diabetes and other complex diseases

    DEFF Research Database (Denmark)

    Bergholdt, R.; Størling, Zenia, Marian; Hansen, Kasper Lage

    2007-01-01

    We have developed an integrative analysis method combining genetic interactions, identified using type 1 diabetes genome scan data, and a high-confidence human protein interaction network. Resulting networks were ranked by the significance of the enrichment of proteins from interacting regions. We identified a number of new protein network modules and novel candidate genes/proteins for type 1 diabetes. We propose this type of integrative analysis as a general method for the elucidation of genes and networks involved in diabetes and other complex diseases.

  4. Foundations of complex analysis in non locally convex spaces function theory without convexity condition

    CERN Document Server

    Bayoumi, A

    2003-01-01

    All the existing books in Infinite Dimensional Complex Analysis focus on the problems of locally convex spaces. However, the theory without convexity condition is covered for the first time in this book. This shows that we are really working with a new, important and interesting field. Theory of functions and nonlinear analysis problems are widespread in the mathematical modeling of real world systems in a very broad range of applications. During the past three decades many new results from the author have helped to solve multiextreme problems arising from important situations, non-convex and

  5. The instrumental neutron-activation analysis of granites from the Bushveld Complex

    International Nuclear Information System (INIS)

    Watterson, J.I.W.

    1978-01-01

    Three methods of instrumental neutron-activation analysis, 14 MeV, reactor thermal, and reactor epithermal, are compared for the analysis of granites from the Bushveld Complex. A total of 34 elements can be determined in the granites by these methods. Samples from the Zaaiplaats area were analysed by thermal neutron activation, and 22 elements were determined in all of them. These elements were used to distinguish between the mineralized Bobbejaankop and Lease granites and the Main granite by the use of multivariate statistics. The Bobbejaankop granite appears to be a more-differentiated rock, carrying greater amounts of the incompatible elements than does the Main granite.

  6. Ensemble Sensitivity Analysis of a Severe Downslope Windstorm in Complex Terrain: Implications for Forecast Predictability Scales and Targeted Observing Networks

    Science.gov (United States)

    2013-09-01

    … observations, linear regression finds the straight line that explains the linear relationship of the sample. This line is given by the equation y = mx + b …
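In ensemble sensitivity analysis, that member-wise straight-line fit is applied state variable by state variable: the sensitivity of a forecast metric J to an initial-state variable x is the regression slope cov(J, x)/var(x) computed across ensemble members. A minimal sketch (function and variable names are ours, not the report's):

```python
import numpy as np

def ensemble_sensitivity(J, X):
    """Slope m of the member-wise fit J ~ m*x + b for each state variable.

    J: (n_members,) forecast metric, one value per ensemble member
    X: (n_members, n_state) initial-state values across the ensemble
    """
    Xa = X - X.mean(axis=0)   # anomalies about the ensemble mean
    Ja = J - J.mean()
    # least-squares slope per state variable: cov(J, x) / var(x)
    return (Xa * Ja[:, None]).sum(axis=0) / (Xa ** 2).sum(axis=0)
```

With a perfectly linear relationship J = 3x + 1 across members, the recovered sensitivity is exactly 3 for that variable.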

  7. Analysis of Instantaneous Linear, Nonlinear and Complex Cardiovascular Dynamics from Videophotoplethysmography.

    Science.gov (United States)

    Valenza, Gaetano; Iozzia, Luca; Cerina, Luca; Mainardi, Luca; Barbieri, Riccardo

    2018-05-01

    There is a fast growing interest in the use of non-contact devices for health and performance assessment in humans. In particular, the use of non-contact videophotoplethysmography (vPPG) has recently been demonstrated as a feasible way to extract cardiovascular information. Nevertheless, proper validation of vPPG-derived heartbeat dynamics is still missing. We aim at an in-depth validation of time-varying, linear and nonlinear/complex dynamics of the pulse rate variability extracted from vPPG. We apply inhomogeneous point-process nonlinear models to assess instantaneous measures defined in the time, frequency, and bispectral domains as estimated through vPPG and standard ECG. Instantaneous complexity measures, such as the instantaneous Lyapunov exponents and the recently defined inhomogeneous point-process approximate and sample entropy, were estimated as well. Video recordings were processed using our recently proposed method based on zero-phase principal component analysis. Experimental data were gathered from 60 young healthy subjects (age: 24±3 years) undergoing postural changes (rest-to-stand maneuver). Group-averaged results show an overall agreement between linear and nonlinear/complexity indices computed from ECG and vPPG during resting-state conditions. However, important differences are found, particularly in the bispectral and complexity domains, in recordings where the subjects were instructed to stand up. Although significant differences exist between cardiovascular estimates from vPPG and ECG, it is very promising that instantaneous sympathovagal changes, as well as time-varying complex dynamics, were correctly identified, especially during resting state. In addition to further improvement of the video signal quality, more research is advocated towards a more precise estimation of cardiovascular dynamics by a comprehensive nonlinear/complex paradigm specifically tailored to non-contact quantification. Schattauer GmbH.

  8. Electroencephalogram complexity analysis in children with attention-deficit/hyperactivity disorder during a visual cognitive task.

    Science.gov (United States)

    Zarafshan, Hadi; Khaleghi, Ali; Mohammadi, Mohammad Reza; Moeini, Mahdi; Malmir, Nastaran

    2016-01-01

    The aim of this study was to investigate electroencephalogram (EEG) dynamics using complexity analysis in children with attention-deficit/hyperactivity disorder (ADHD) compared with healthy control children when performing a cognitive task. Thirty 7-12-year-old children meeting Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition (DSM-5) criteria for ADHD and 30 healthy control children underwent an EEG evaluation during a cognitive task, and Lempel-Ziv complexity (LZC) values were computed. There were no significant differences between the ADHD and control groups in age and gender. The mean LZC of the ADHD children was significantly larger than that of the healthy children over the right anterior and right posterior regions during the cognitive performance. In the ADHD group, the complexity of the right hemisphere was higher than that of the left hemisphere, whereas in the normal group the complexity of the left hemisphere was higher than that of the right. Although fronto-striatal dysfunction is considered conclusive evidence for the pathophysiology of ADHD, our mental arithmetic task has provided evidence of structural and functional changes in the posterior regions and probably the cerebellum in ADHD.
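LZC as used in EEG studies is typically computed by binarizing the signal about its median and counting the phrases in a Lempel-Ziv (LZ76-style) parsing. The sketch below uses those common choices as assumptions; it is not necessarily the exact variant used in this study.

```python
import numpy as np

def lz_phrases(s):
    """Count phrases in a simple LZ76-style parsing: each new phrase is
    the shortest string starting at position i not seen earlier."""
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the preceding text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def lzc(signal):
    # binarize about the median, then count Lempel-Ziv phrases
    med = np.median(signal)
    bits = ''.join('1' if v > med else '0' for v in signal)
    return lz_phrases(bits)
```

A strictly periodic bit string parses into very few phrases, while a random signal of the same length yields many more, which is what makes the count usable as a complexity index.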

  9. Comprehensive analysis of the transcriptional profile of the Mediator complex across human cancer types.

    Science.gov (United States)

    Syring, Isabella; Klümper, Niklas; Offermann, Anne; Braun, Martin; Deng, Mario; Boehm, Diana; Queisser, Angela; von Mässenhausen, Anne; Brägelmann, Johannes; Vogel, Wenzel; Schmidt, Doris; Majores, Michael; Schindler, Anne; Kristiansen, Glen; Müller, Stefan C; Ellinger, Jörg; Shaikhibrahim, Zaki; Perner, Sven

    2016-04-26

    The Mediator complex is a key regulator of gene transcription, and several studies have demonstrated altered expression of particular subunits in diverse human diseases, especially cancer. However, a systematic study deciphering the transcriptional expression of the Mediator across different cancer entities is still lacking. We therefore performed a comprehensive in silico cancer vs. benign analysis of the Mediator complex subunits (MEDs) for 20 tumor entities using Oncomine datasets. The transcriptional expression profiles across almost all cancer entities showed differentially expressed MEDs as compared to benign tissue. Differential expression of MED8 in renal cell carcinoma (RCC) and MED12 in lung cancer (LCa) was validated and further investigated by immunohistochemical staining on tissue microarrays containing large numbers of specimens. MED8 in clear cell RCC (ccRCC) was associated with shorter survival and advanced TNM stage and showed higher expression in metastatic than primary tumors. In vitro, siRNA-mediated MED8 knockdown significantly impaired proliferation and motility in ccRCC cell lines, hinting at a role for MED8 as a novel therapeutic target in ccRCC. Taken together, our Mediator complex transcriptome proved to be a valid tool for identifying cancer-related shifts in Mediator complex composition, revealing that MEDs do exhibit cancer-specific transcriptional expression profiles.

  10. Quantum-chemical analysis of formation reactions of Co2+ complexes

    Directory of Open Access Journals (Sweden)

    Viktor F. Vargalyuk

    2017-11-01

    Full Text Available Based on the analysis of quantum-chemical calculation results (GAMESS, density functional theory, B3LYP method) for coordination compounds of Co2+ ions with H2O, NH3, OH–, F–, Cl–, Br–, I–, CN–, Ac–, Ak–, generally given by [Co(H2O)6–nLn]2+nx, it has been demonstrated that within the selected series of ligands there is no correlation between the formation energy of monosubstituted cobalt aqua complexes (∆E) and pK1, just as there is none between the effective nuclear charge of the central atom (z*Co) and pK1. According to the behavior of ∆E and z*Co, we identified two groups of ligands. The first group (OH–, F–, Ac–, Ak–, CN–, NH3) demonstrates a logical decrease of ∆E with the growth of z*Co. On the contrary, the second group (Cl–, Br–, I–) demonstrates an increase of ∆E with the growth of z*Co. This phenomenon is explained by the change in electronegativity and polarizability of donor atoms across groups and periods of the periodic table. It is established that linear correlations of the form lgK = A + B·z*Co hold only for complexes whose ligands have similar donor atoms. Referring to the literature on stepwise complex formation of hydroxide, amine and chloride cobalt complexes, in combination with the z*Co calculation results, we determined the A and B constants of the lgK,z*Co correlations for the donor atoms oxygen (30.2, –17.7), nitrogen (125.4, –69.9) and chlorine (–6.3, 5.8). The existence of the detected correlation series enables us to use the lgK,z*M dependence parameters for a fixed donor atom to determine Kn values for various complexes, including those with complex ligands, from calculations and z*M data. This applies to complexes having central atoms of the same nature as well as simple monodentate ligands. The mentioned approach was used to calculate the stability constants of acrylate cobalt complexes (lgK1 = 1.2 and lgK2 = 4.3), which are not covered in the literature.
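The reported linear correlations lgK = A + B·z*(Co) can be evaluated directly once a donor atom is fixed. The snippet below simply tabulates the (A, B) pairs quoted in the abstract and plugs in a value of z*(Co); it is our illustration, and the example z* value is arbitrary, not taken from the paper.

```python
# (A, B) constants of lgK = A + B * z*(Co), as reported per donor atom
DONOR_CONSTANTS = {
    "O": (30.2, -17.7),
    "N": (125.4, -69.9),
    "Cl": (-6.3, 5.8),
}

def lg_k(donor, z_eff):
    """Stepwise stability constant lgK predicted from the effective
    nuclear charge z*(Co) for the given donor atom."""
    a, b = DONOR_CONSTANTS[donor]
    return a + b * z_eff
```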

  11. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    Energy Technology Data Exchange (ETDEWEB)

    Torralba, B.; Martinez-Arias, R.

    2007-07-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in HRA of Probabilistic Safety Assessment (PSA) models has a crucial importance. There is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results are presented in Performance Shaping Factors Data Collection and Analysis of the Task Complexity Experiment 2003/2004 (HWR-810). This report describes the analysis of the comments about PSFs provided by operators on the PSF Questionnaire; the comments for each PSF in the scenarios have been summarised using a content analysis technique. (Author)

  12. Analysis of Operators Comments on the PSF Questionnaire of the Task Complexity Experiment 2003/2004

    International Nuclear Information System (INIS)

    Torralba, B.; Martinez-Arias, R.

    2007-01-01

    Human Reliability Analysis (HRA) methods usually take into account the effect of Performance Shaping Factors (PSF). Therefore, the adequate treatment of PSFs in HRA of Probabilistic Safety Assessment (PSA) models has a crucial importance. There is an important need for collecting PSF data based on simulator experiments. During the task complexity experiment 2003-2004, carried out in the BWR simulator of the Halden Man-Machine Laboratory (HAMMLAB), data on PSFs were collected by means of a PSF Questionnaire. Seven crews (each composed of a shift supervisor, reactor operator and turbine operator) from Swedish Nuclear Power Plants participated in the experiment. The PSF Questionnaire collected data on the factors: procedures, training and experience, indications, controls, team management, team communication, individual work practice, available time for the tasks, number of tasks or information load, masking and seriousness. The main statistically significant results are presented in Performance Shaping Factors Data Collection and Analysis of the Task Complexity Experiment 2003/2004 (HWR-810). This report describes the analysis of the comments about PSFs provided by operators on the PSF Questionnaire; the comments for each PSF in the scenarios have been summarised using a content analysis technique. (Author)

  13. Using qualitative comparative analysis in a systematic review of a complex intervention.

    Science.gov (United States)

    Kahwati, Leila; Jacobs, Sara; Kane, Heather; Lewis, Megan; Viswanathan, Meera; Golin, Carol E

    2016-05-04

    Systematic reviews evaluating complex interventions often encounter substantial clinical heterogeneity in intervention components and implementation features making synthesis challenging. Qualitative comparative analysis (QCA) is a non-probabilistic method that uses mathematical set theory to study complex phenomena; it has been proposed as a potential method to complement traditional evidence synthesis in reviews of complex interventions to identify key intervention components or implementation features that might explain effectiveness or ineffectiveness. The objective of this study was to describe our approach in detail and examine the suitability of using QCA within the context of a systematic review. We used data from a completed systematic review of behavioral interventions to improve medication adherence to conduct two substantive analyses using QCA. The first analysis sought to identify combinations of nine behavior change techniques/components (BCTs) found among effective interventions, and the second analysis sought to identify combinations of five implementation features (e.g., agent, target, mode, time span, exposure) found among effective interventions. For each substantive analysis, we reframed the review's research questions to be designed for use with QCA, calibrated sets (i.e., transformed raw data into data used in analysis), and identified the necessary and/or sufficient combinations of BCTs and implementation features found in effective interventions. Our application of QCA for each substantive analysis is described in detail. We extended the original review findings by identifying seven combinations of BCTs and four combinations of implementation features that were sufficient for improving adherence. We found reasonable alignment between several systematic review steps and processes used in QCA except that typical approaches to study abstraction for some intervention components and features did not support a robust calibration for QCA. 
QCA was
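The set-theoretic core of QCA can be illustrated with a crisp-set consistency measure: for a candidate combination of conditions X, consistency(X → outcome) = |X ∩ outcome| / |X|, and the combination is deemed sufficient when its consistency clears a chosen threshold. A minimal sketch with invented condition names for illustration (not the review's actual calibrated BCT data):

```python
def consistency(cases, conditions, outcome):
    """Share of cases exhibiting the condition combination that also
    exhibit the outcome (crisp-set sufficiency consistency)."""
    combo = [c for c in cases if all(c[k] == v for k, v in conditions.items())]
    if not combo:
        return None  # combination unobserved in the data
    return sum(c[outcome] for c in combo) / len(combo)

# toy calibrated data: 1 = feature present / intervention effective
cases = [
    {"self_monitoring": 1, "feedback": 1, "effective": 1},
    {"self_monitoring": 1, "feedback": 1, "effective": 1},
    {"self_monitoring": 1, "feedback": 0, "effective": 0},
    {"self_monitoring": 0, "feedback": 1, "effective": 1},
]
```

Here the combination self_monitoring AND feedback has consistency 1.0 and so would count as sufficient for effectiveness in this toy data, while self_monitoring alone would not.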

  14. Quantitative analysis of ribosome–mRNA complexes at different translation stages

    Science.gov (United States)

    Shirokikh, Nikolay E.; Alkalaeva, Elena Z.; Vassilenko, Konstantin S.; Afonina, Zhanna A.; Alekhina, Olga M.; Kisselev, Lev L.; Spirin, Alexander S.

    2010-01-01

    Inhibition of primer extension by ribosome–mRNA complexes (toeprinting) is a proven and powerful technique for studying mechanisms of mRNA translation. Here we have assayed an advanced toeprinting approach that employs fluorescently labeled DNA primers, followed by capillary electrophoresis utilizing standard instruments for sequencing and fragment analysis. We demonstrate that this improved technique is not merely fast and cost-effective, but also brings the primer extension inhibition method up to the next level. The electrophoretic pattern of the primer extension reaction can be characterized with a precision unattainable by the common toeprint analysis utilizing radioactive isotopes. This method allows us to detect and quantify stable ribosomal complexes at all stages of translation, including initiation, elongation and termination, generated during the complete translation process in both the in vitro reconstituted translation system and the cell lysate. We also point out the unique advantages of this new methodology, including the ability to assay sites of the ribosomal complex assembly on several mRNA species in the same reaction mixture. PMID:19910372

  15. Using value-based analysis to influence outcomes in complex surgical systems.

    Science.gov (United States)

    Kirkpatrick, John R; Marks, Stanley; Slane, Michele; Kim, Donald; Cohen, Lance; Cortelli, Michael; Plate, Juan; Perryman, Richard; Zapas, John

    2015-04-01

    Value-based analysis (VBA) is a management strategy used to determine changes in value (quality/cost) when a usual practice (UP) is replaced by a best practice (BP). Previously validated in clinical initiatives, its usefulness in complex systems is unknown. To answer this question, we used VBA to correct deficiencies in cardiac surgery at Memorial Healthcare System. Cardiac surgery is a complex surgical system that lends itself to VBA because outcomes metrics provided by the Society of Thoracic Surgeons provide an estimate of quality; cost is available from Centers for Medicare and Medicaid Services and other contemporary sources; the UP can be determined; and the BP can be established. Analysis of the UP at Memorial Healthcare System revealed considerable deficiencies in selection of patients for surgery; the surgery itself, including choice of procedure and outcomes; after care; follow-up; and control of expenditures. To correct these deficiencies, each UP was replaced with a BP. Changes included replacement of most of the cardiac surgeons; conversion to an employed physician model; restructuring of a heart surgery unit; recruitment of cardiac anesthesiologists; introduction of an interactive educational program; eliminating unsafe practices; and reducing cost. There was a significant improvement in value (quality/cost) in this complex surgical system. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Fracture mechanics and residual fatigue life analysis for complex stress fields. Technical report

    International Nuclear Information System (INIS)

    Besuner, P.M.

    1975-07-01

    This report reviews the development and application of an influence function (IF) method for calculating stress intensity factors and residual fatigue life for two- and three-dimensional structures with complex stress fields and geometries. Through elastic superposition, the method properly accounts for redistribution of stress as the crack grows through the structure. The analytical methods used and the computer programs necessary for computation and application of load-independent influence functions are presented. A new exact solution is obtained for the buried elliptical crack, under an arbitrary Mode I stress field, for stress intensity factors at four positions around the crack front. The IF method is then applied to two fracture mechanics problems with complex stress fields and geometries. These problems are of current interest to the electric power generating industry and include (1) the fatigue analysis of a crack in a pipe weld under nominal and residual stresses and (2) fatigue analysis of a reactor pressure vessel nozzle corner crack under a complex bivariate stress field.
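The central idea of the influence (weight) function method is that, by elastic superposition, the stress intensity factor follows from integrating the uncracked-body stress against a load-independent weight function, K = ∫ σ(x) h(x) dx over the crack line. A schematic one-dimensional sketch; the weight function itself is problem-specific and is left as an input here, so this is only a quadrature of the superposition integral, not the report's method.

```python
import numpy as np

def stress_intensity(sigma, h, x):
    """K = integral of sigma(x) * h(x) dx, by trapezoidal quadrature.

    sigma: uncracked-body stress sampled at points x
    h:     load-independent influence (weight) function at the same points
    """
    f = sigma * h
    # trapezoidal rule on a (possibly non-uniform) grid
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))
```

Because h is load-independent, the same tabulated weight function can be reused for any stress distribution, including the nominal-plus-residual pipe-weld case above.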

  17. Complexity Analysis of Reed-Solomon Decoding over GF(2^m) without Using Syndromes

    Directory of Open Access Journals (Sweden)

    Zhiyuan Yan

    2008-06-01

    Full Text Available There has been renewed interest in decoding Reed-Solomon (RS) codes without using syndromes recently. In this paper, we investigate the complexity of syndromeless decoding, and compare it to that of syndrome-based decoding. Aiming to provide guidelines to practical applications, our complexity analysis focuses on RS codes over characteristic-2 fields, for which some multiplicative FFT techniques are not applicable. Due to moderate block lengths of RS codes in practice, our analysis is complete, without big O notation. In addition to fast implementation using additive FFT techniques, we also consider direct implementation, which is still relevant for RS codes with moderate lengths. For high-rate RS codes, when compared to syndrome-based decoding algorithms, not only do syndromeless decoding algorithms require more field operations regardless of implementation, but decoder architectures based on their direct implementations also have higher hardware costs and lower throughput. We also derive tighter bounds on the complexities of fast polynomial multiplications based on Cantor's approach and the fast extended Euclidean algorithm.
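The characteristic-2 field arithmetic whose operation counts are being compared can be made concrete with a textbook GF(2^8) multiplication (shift-and-add with modular reduction). The reduction polynomial below is the widely used 0x11B (x^8 + x^4 + x^3 + x + 1, the AES polynomial), an illustrative choice rather than anything fixed by the paper:

```python
def gf256_mul(a, b, poly=0x11B):
    """Multiply in GF(2^8): carry-less shift-and-add, reducing modulo
    the degree-8 polynomial x^8 + x^4 + x^3 + x + 1 (0x11B)."""
    r = 0
    while b:
        if b & 1:
            r ^= a          # "addition" is XOR in characteristic 2
        a <<= 1
        if a & 0x100:
            a ^= poly       # reduce whenever a overflows 8 bits
        b >>= 1
    return r
```

Each such multiplication costs a handful of shifts and XORs, which is the unit in which syndromeless vs. syndrome-based operation counts are ultimately compared.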

  18. Engineering Complex Embedded Systems with State Analysis and the Mission Data System

    Science.gov (United States)

    Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.

  19. Complexity Analysis of Reed-Solomon Decoding over GF(2^m) without Using Syndromes

    Directory of Open Access Journals (Sweden)

    Chen Ning

    2008-01-01

    Full Text Available There has been renewed interest in decoding Reed-Solomon (RS) codes without using syndromes recently. In this paper, we investigate the complexity of syndromeless decoding, and compare it to that of syndrome-based decoding. Aiming to provide guidelines to practical applications, our complexity analysis focuses on RS codes over characteristic-2 fields, for which some multiplicative FFT techniques are not applicable. Due to moderate block lengths of RS codes in practice, our analysis is complete, without big O notation. In addition to fast implementation using additive FFT techniques, we also consider direct implementation, which is still relevant for RS codes with moderate lengths. For high-rate RS codes, when compared to syndrome-based decoding algorithms, not only do syndromeless decoding algorithms require more field operations regardless of implementation, but decoder architectures based on their direct implementations also have higher hardware costs and lower throughput. We also derive tighter bounds on the complexities of fast polynomial multiplications based on Cantor's approach and the fast extended Euclidean algorithm.

  20. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  1. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewer, telecommunication, and electrical and gas networks often form highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network systems under catastrophic events such as earthquakes, and to provide proper emergency management actions in such situations, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for the two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions, and is applied to a benchmark gas transmission pipe network in Memphis, TN, to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.
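The essence of a non-simulation reliability calculation can be shown on a toy two-terminal problem: condition on every edge's up/down state and sum the probabilities of states in which the source still reaches the terminal. RDA organizes this recursion far more efficiently via disjoint path and cut sets; the brute-force enumeration below is only a conceptual sketch with names of our choosing.

```python
import itertools

def two_terminal_reliability(edges, probs, s, t):
    """Exact s-t connectivity reliability by enumerating edge states.

    edges: list of (u, v) undirected links
    probs: list of survival probabilities aligned with edges
    """
    def connected(up):
        # depth-first search over surviving edges only
        adj = {}
        for u, v in up:
            adj.setdefault(u, []).append(v)
            adj.setdefault(v, []).append(u)
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for w in adj.get(u, []):
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return t in seen

    r = 0.0
    for states in itertools.product((0, 1), repeat=len(edges)):
        p = 1.0
        for st, q in zip(states, probs):
            p *= q if st else 1.0 - q
        if connected([e for st, e in zip(states, edges) if st]):
            r += p  # add probability of every surviving configuration
    return r
```

Two edges in series with survival probability 0.9 each give 0.81, and two in parallel give 0.99, matching the familiar closed forms; enumeration is exponential in edge count, which is exactly why decomposition methods like RDA matter for real networks.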

  2. Flavonoids as matrices for MALDI-TOF mass spectrometric analysis of transition metal complexes

    Science.gov (United States)

    Petkovic, Marijana; Petrovic, Biljana; Savic, Jasmina; Bugarcic, Zivadin D.; Dimitric-Markovic, Jasmina; Momic, Tatjana; Vasic, Vesna

    2010-02-01

    Matrix-assisted laser desorption and ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a suitable method for the analysis of inorganic and organic compounds and biomolecules. This makes MALDI-TOF MS convenient for monitoring the interaction of metallo-drugs with biomolecules. Results presented in this manuscript demonstrate that flavonoids such as apigenin, kaempferol and luteolin are suitable for MALDI-TOF MS analysis of Pt(II), Pd(II), Pt(IV) and Ru(III) complexes, giving different signal-to-noise ratios of the analyte peak. The MALDI-TOF mass spectra of inorganic complexes acquired with these flavonoid matrices are easy to interpret and have some advantages over the application of other commonly used matrices: a low number of matrix peaks are detectable and the coordinative metal-ligand bond is, in most cases, preserved. On the other hand, flavonoids do not act as typical matrices, as their excess is not required for the acquisition of MALDI-TOF mass spectra of inorganic complexes.

  3. Genomic analysis of organismal complexity in the multicellular green alga Volvox carteri

    Energy Technology Data Exchange (ETDEWEB)

    Prochnik, Simon E.; Umen, James; Nedelcu, Aurora; Hallmann, Armin; Miller, Stephen M.; Nishii, Ichiro; Ferris, Patrick; Kuo, Alan; Mitros, Therese; Fritz-Laylin, Lillian K.; Hellsten, Uffe; Chapman, Jarrod; Simakov, Oleg; Rensing, Stefan A.; Terry, Astrid; Pangilinan, Jasmyn; Kapitonov, Vladimir; Jurka, Jerzy; Salamov, Asaf; Shapiro, Harris; Schmutz, Jeremy; Grimwood, Jane; Lindquist, Erika; Lucas, Susan; Grigoriev, Igor V.; Schmitt, Rudiger; Kirk, David; Rokhsar, Daniel S.

    2010-07-01

    Analysis of the Volvox carteri genome reveals that this green alga's increased organismal complexity and multicellularity are associated with modifications in protein families shared with its unicellular ancestor, and not with large-scale innovations in protein coding capacity. The multicellular green alga Volvox carteri and its morphologically diverse close relatives (the volvocine algae) are uniquely suited for investigating the evolution of multicellularity and development. We sequenced the 138 Mb genome of V. carteri and compared its {approx}14,500 predicted proteins to those of its unicellular relative, Chlamydomonas reinhardtii. Despite fundamental differences in organismal complexity and life history, the two species have similar protein-coding potentials, and few species-specific protein-coding gene predictions. Interestingly, volvocine algal-specific proteins are enriched in Volvox, including those associated with an expanded and highly compartmentalized extracellular matrix. Our analysis shows that increases in organismal complexity can be associated with modifications of lineage-specific proteins rather than large-scale invention of protein-coding capacity.

  4. Direct power comparisons between simple LOD scores and NPL scores for linkage analysis in complex diseases.

    Science.gov (United States)

    Abreu, P C; Greenberg, D A; Hodge, S E

    1999-09-01

    Several methods have been proposed for linkage analysis of complex traits with unknown mode of inheritance. These methods include the LOD score maximized over disease models (MMLS) and the "nonparametric" linkage (NPL) statistic. In previous work, we evaluated the increase of type I error when maximizing over two or more genetic models, and we compared the power of MMLS to detect linkage, in a number of complex modes of inheritance, with analysis assuming the true model. In the present study, we compare MMLS and NPL directly. We simulated 100 data sets with 20 families each, using 26 generating models: (1) 4 intermediate models (penetrance of the heterozygote between those of the two homozygotes); (2) 6 two-locus additive models; and (3) 16 two-locus heterogeneity models (admixture alpha = 1.0, 0.7, 0.5, and 0.3; alpha = 1.0 replicates simple Mendelian models). For LOD scores, we assumed dominant and recessive inheritance with 50% penetrance. We took the higher of the two maximum LOD scores and subtracted 0.3 to correct for multiple tests (MMLS-C). We compared expected maximum LOD scores and power, using MMLS-C and NPL as well as the true model. Since NPL uses only the affected family members, we also performed an affecteds-only analysis using MMLS-C. The MMLS-C was uniformly more powerful than NPL in most cases we examined, except when linkage information was low, and was close to the results for the true model under locus heterogeneity. We also found better power for the MMLS-C compared with NPL in affecteds-only analysis. The results show that use of two simple modes of inheritance at a fixed penetrance can have more power than NPL when the trait mode of inheritance is complex and when there is heterogeneity in the data set.
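The two ingredients of MMLS-C are a standard two-point LOD score and the 0.3 penalty for maximizing over the two assumed models. A simplified phase-known sketch: real analyses maximize full pedigree likelihoods over the recombination fraction and the assumed penetrance model, which this toy version omits.

```python
import math

def lod(theta, n_rec, n_nonrec):
    """Two-point LOD score for phase-known meioses:
    log10 L(theta) - log10 L(theta = 0.5)."""
    if theta <= 0.0:
        # theta = 0 is impossible once any recombinant is observed
        return float("-inf") if n_rec > 0 else n_nonrec * math.log10(2.0)
    n = n_rec + n_nonrec
    log_l = n_rec * math.log10(theta) + n_nonrec * math.log10(1.0 - theta)
    return log_l - n * math.log10(0.5)

def mmls_c(lod_dominant, lod_recessive, penalty=0.3):
    """Take the larger of the dominant/recessive maximum LOD scores and
    subtract 0.3 to correct for testing two models (MMLS-C)."""
    return max(lod_dominant, lod_recessive) - penalty
```

With 1 recombinant in 10 informative meioses, lod(0.1, 1, 9) is about 1.60; if the dominant and recessive maximized LODs were 2.0 and 1.5, the MMLS-C would be 1.7.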

  5. Radioactive waste management complex low-level waste radiological composite analysis

    Energy Technology Data Exchange (ETDEWEB)

    McCarthy, J.M.; Becker, B.H.; Magnuson, S.O.; Keck, K.N.; Honeycutt, T.K.

    1998-05-01

    The composite analysis estimates the projected cumulative impacts to future members of the public from the disposal of low-level radioactive waste (LLW) at the Idaho National Engineering and Environmental Laboratory (INEEL) Radioactive Waste Management Complex (RWMC) and all other sources of radioactive contamination at the INEEL that could interact with the LLW disposal facility to affect the radiological dose. Based upon the composite analysis evaluation, waste buried in the Subsurface Disposal Area (SDA) at the RWMC is the only source at the INEEL that will significantly interact with the LLW facility. The source term used in the composite analysis consists of all historical SDA subsurface disposals of radionuclides as well as the authorized LLW subsurface disposal inventory and projected LLW subsurface disposal inventory. Exposure scenarios evaluated in the composite analysis include the all-pathways and groundwater protection scenarios. The projected dose of 58 mrem/yr exceeds the composite analysis guidance dose constraint of 30 mrem/yr; therefore, an options analysis was conducted to determine the feasibility of reducing the projected annual dose. Three options for creating such a reduction were considered: (1) lowering infiltration of precipitation through the waste by providing a better cover, (2) maintaining control over the RWMC and portions of the INEEL indefinitely, and (3) extending the period of institutional control beyond the 100 years assumed in the composite analysis. Of the three options investigated, maintaining control over the RWMC and a small part of the present INEEL appears to be feasible and cost effective.

  6. Radioactive waste management complex low-level waste radiological composite analysis

    International Nuclear Information System (INIS)

    McCarthy, J.M.; Becker, B.H.; Magnuson, S.O.; Keck, K.N.; Honeycutt, T.K.

    1998-05-01

    The composite analysis estimates the projected cumulative impacts to future members of the public from the disposal of low-level radioactive waste (LLW) at the Idaho National Engineering and Environmental Laboratory (INEEL) Radioactive Waste Management Complex (RWMC) and all other sources of radioactive contamination at the INEEL that could interact with the LLW disposal facility to affect the radiological dose. Based upon the composite analysis evaluation, waste buried in the Subsurface Disposal Area (SDA) at the RWMC is the only source at the INEEL that will significantly interact with the LLW facility. The source term used in the composite analysis consists of all historical SDA subsurface disposals of radionuclides as well as the authorized LLW subsurface disposal inventory and projected LLW subsurface disposal inventory. Exposure scenarios evaluated in the composite analysis include the all-pathways and groundwater protection scenarios. The projected dose of 58 mrem/yr exceeds the composite analysis guidance dose constraint of 30 mrem/yr; therefore, an options analysis was conducted to determine the feasibility of reducing the projected annual dose. Three options for creating such a reduction were considered: (1) lowering infiltration of precipitation through the waste by providing a better cover, (2) maintaining control over the RWMC and portions of the INEEL indefinitely, and (3) extending the period of institutional control beyond the 100 years assumed in the composite analysis. Of the three options investigated, maintaining control over the RWMC and a small part of the present INEEL appears to be feasible and cost effective.

  7. Interaction proteomics analysis of polycomb proteins defines distinct PRC1 complexes in mammalian cells

    DEFF Research Database (Denmark)

    Vandamme, Julien; Völkel, Pamela; Rosnoblet, Claire

    2011-01-01

    Polycomb group (PcG) proteins maintain transcriptional repression of hundreds of genes involved in development, signaling or cancer using chromatin-based epigenetic mechanisms. Biochemical studies in Drosophila have revealed that PcG proteins associate in at least two classes of protein complexes...... known as Polycomb repressive complexes 1 and 2 (PRC1 and PRC2). Drosophila core PRC1 is composed of four subunits, Polycomb (Pc), Sex combs extra (Sce), Polyhomeotic (Ph), and Posterior sex combs (Psc). Each of these proteins has multiple orthologs in vertebrates classified respectively as the CBX, RING...... in order to identify interacting partners of CBX family proteins under the same experimental conditions. Our analysis identified with high confidence about 20 proteins co-eluted with CBX2 and CBX7 tagged proteins, about 40 with CBX4, and around 60 with CBX6 and CBX8. We provide evidence that the CBX...

  8. Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis

    Science.gov (United States)

    Zhou, J.

    2018-06-01

    The paper presents a novel frequency-domain interpretation of Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and it dispenses with open-loop pole distribution, contour/locus orientation and prior frequency sweeping. Exploiting the technique to alternatively reveal positive realness of transfer functions, the re-interpretation of the Popov criteria is explicated. More specifically, the suggested frequency-domain stability conditions are conformable in both the scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without it; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.

  9. Investigating size effects of complex nanostructures through Young-Laplace equation and finite element analysis

    International Nuclear Information System (INIS)

    Lu, Dingjie; Xie, Yi Min; Huang, Xiaodong; Zhou, Shiwei; Li, Qing

    2015-01-01

    Analytical studies on the size effects of a simply-shaped beam fixed at both ends have successfully explained the sudden changes in effective Young's modulus as its diameter decreases below 100 nm. Yet they are invalid for the complex nanostructures that exist ubiquitously in nature. In accordance with a generalized Young-Laplace equation, one of the representative size effects is transferred to a non-uniformly distributed pressure against the external surface due to the imbalance of inward and outward loads. Because the magnitude of the pressure depends on the principal curvatures, iterative steps have to be adopted to gradually stabilize the structure in finite element analysis. Computational results are in good agreement with both experimental data and theoretical predictions. Furthermore, the investigation of strengthened and softened Young's modulus for two complex nanostructures demonstrates that the proposed computational method provides a general and effective approach to analyzing the size effects of nanostructures of arbitrary shape.

  10. Integrative Analysis of Complex Cancer Genomics and Clinical Profiles Using the cBioPortal

    Science.gov (United States)

    Gao, Jianjiong; Aksoy, Bülent Arman; Dogrusoz, Ugur; Dresdner, Gideon; Gross, Benjamin; Sumer, S. Onur; Sun, Yichao; Jacobsen, Anders; Sinha, Rileen; Larsson, Erik; Cerami, Ethan; Sander, Chris; Schultz, Nikolaus

    2014-01-01

    The cBioPortal for Cancer Genomics (http://cbioportal.org) provides a Web resource for exploring, visualizing, and analyzing multidimensional cancer genomics data. The portal reduces molecular profiling data from cancer tissues and cell lines into readily understandable genetic, epigenetic, gene expression, and proteomic events. The query interface combined with customized data storage enables researchers to interactively explore genetic alterations across samples, genes, and pathways and, when available in the underlying data, to link these to clinical outcomes. The portal provides graphical summaries of gene-level data from multiple platforms, network visualization and analysis, survival analysis, patient-centric queries, and software programmatic access. The intuitive Web interface of the portal makes complex cancer genomics profiles accessible to researchers and clinicians without requiring bioinformatics expertise, thus facilitating biological discoveries. Here, we provide a practical guide to the analysis and visualization features of the cBioPortal for Cancer Genomics. PMID:23550210

  11. Heteroprotein Complex Formation of Bovine Lactoferrin and Pea Protein Isolate: A Multiscale Structural Analysis.

    Science.gov (United States)

    Adal, Eda; Sadeghpour, Amin; Connell, Simon; Rappolt, Michael; Ibanoglu, Esra; Sarkar, Anwesha

    2017-02-13

    Associative electrostatic interactions between two oppositely charged globular proteins, lactoferrin (LF) and pea protein isolate (PPI), the latter being a mixture of vicilin, legumin, and convicilin, were studied at a specific PPI/LF molar ratio at room temperature. Structural aspects of the electrostatic complexes probed at different length scales were investigated as a function of pH by means of complementary techniques, namely dynamic light scattering, small-angle X-ray scattering (SAXS), turbidity measurements, and atomic force microscopy (AFM). Irrespective of the applied technique, the results consistently showed that complexation between LF and PPI did occur. In a narrow optimum range of pH 5.0-5.8, a viscous liquid phase of complex coacervate was obtained upon mild centrifugation of the turbid LF-PPI mixture, with the maximum hydrodynamic radius (Rh) and turbidity, and a ζ-potential close to zero, observed at pH 5.4. In particular, the SAXS data demonstrated that the coacervates were densely assembled with a roughly spherical size distribution exhibiting a maximum extension of ∼80 nm at pH 5.4. Equally, AFM image analysis showed size distributions containing most frequent cluster sizes around 40-80 nm with spherical to elliptical shapes (axis aspect ratio ≤ 2) as well as less frequent elongated to chainlike structures. We identify the most frequently observed compact complexes as mainly responsible for LF-PPI coacervation, whereas for the less frequent chain-like aggregates we hypothesize that additional PPI-PPI facilitated complexes exist.

  12. Multi-signal sedimentation velocity analysis with mass conservation for determining the stoichiometry of protein complexes.

    Directory of Open Access Journals (Sweden)

    Chad A Brautigam

    Full Text Available Multi-signal sedimentation velocity analytical ultracentrifugation (MSSV) is a powerful tool for the determination of the number, stoichiometry, and hydrodynamic shape of reversible protein complexes in two- and three-component systems. In this method, the evolution of sedimentation profiles of macromolecular mixtures is recorded simultaneously using multiple absorbance and refractive index signals and globally transformed into both spectrally and diffusion-deconvoluted component sedimentation coefficient distributions. For reactions with complex lifetimes comparable to the time-scale of sedimentation, MSSV reveals the number and stoichiometry of co-existing complexes. For systems with short complex lifetimes, MSSV reveals the composition of the reaction boundary of the coupled reaction/migration process, which we show here may be used to directly determine an association constant. A prerequisite for MSSV is that the interacting components are spectrally distinguishable, which may be a result, for example, of extrinsic chromophores or of different abundances of aromatic amino acids contributing to the UV absorbance. For interacting components that are spectrally poorly resolved, here we introduce a method for additional regularization of the spectral deconvolution by exploiting approximate knowledge of the total loading concentrations. While this novel mass conservation principle does not discriminate contributions to different species, it can be effectively combined with constraints in the sedimentation coefficient range of uncomplexed species. We show in theory, computer simulations, and experiment, how mass conservation MSSV as implemented in SEDPHAT can enhance or even substitute for the spectral discrimination of components. This should broaden the applicability of MSSV to the analysis of the composition of reversible macromolecular complexes.

  13. Auditable safety analysis for the surveillance and maintenance of the REDOX complex

    International Nuclear Information System (INIS)

    Cuneo, V.J.

    1997-02-01

    The Reduction-Oxidation (REDOX) Complex is an inactive surplus facility that contains two former fuel processing facilities (the 202-S Canyon Building and the 233-S Plutonium Concentration Facility) and a number of ancillary support structures. Deactivation started in 1967 and was completed in 1969 when the plant was transferred to surveillance and maintenance (S&M). This document provides the auditable safety analysis (ASA) for the post-deactivation, long-term S&M phase of the above grade structures of the REDOX Complex. The S&M phase is conducted for the following reasons: (1) Maintain confinement of residual inventories of radioactive materials and other contaminants until the facility is ultimately dispositioned, (2) Prevent deterioration of confinement structures, (3) Respond to potential accident conditions requiring response and mitigation, (4) Provide for the safety of workers involved in the S&M phase, and (5) Provide the basis for evaluation and selection of ultimate disposal alternatives. The ability of the existing facilities to withstand the effects of natural phenomena hazard events is evaluated and the active support systems used to maintain ventilation and/or prevent the spread of contamination are described. This auditable safety analysis document evaluates the routinely required S&M activities (i.e., the S&M of facility barriers, equipment, structures, and postings [including repair and upgrade]; measures to identify, remove, or repair damaged asbestos; measures to identify, remove, or appropriately manage existing containers of hazardous substances; and the performance of spill response measures as needed). For the REDOX Complex, the movement of cell cover blocks is also evaluated, as the D-cell cover block was removed a number of years ago and should be replaced. The type and nature of the hazards presented by the REDOX Complex and the REDOX-specific controls required to maintain these

  14. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    Science.gov (United States)

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  15. Differential network analysis reveals evolutionary complexity in secondary metabolism of Rauvolfia serpentina over Catharanthus roseus

    Directory of Open Access Journals (Sweden)

    Shivalika Pathania

    2016-08-01

    Full Text Available Comparative co-expression analysis of multiple species using high-throughput data is an integrative approach to determine the uniformity as well as the diversification of biological processes. Rauvolfia serpentina and Catharanthus roseus, both members of the Apocynaceae family, are reported to have remedial properties against multiple diseases. Despite sharing the upstream portion of the terpenoid indole alkaloid pathway, there is significant diversity in the tissue-specific synthesis and accumulation of specialized metabolites in these plants. This led us to implement comparative co-expression network analysis to investigate the modules and genes responsible for differential tissue-specific expression as well as species-specific synthesis of metabolites. Towards these goals, differential network analysis was implemented to identify candidate genes responsible for the diversification of metabolite profiles. Three genes were identified with significant differences in connectivity leading to differential regulatory behavior between these plants. These mechanisms may be responsible for the diversification of secondary metabolism, and thereby for species-specific metabolite synthesis. The network robustness of R. serpentina, determined from topological properties, was complemented by a comparison of the gene-metabolite networks of both plants, and suggests that R. serpentina may have evolved more complex metabolic mechanisms than C. roseus under the influence of various stimuli. This study reveals the evolution of complexity in the secondary metabolism of Rauvolfia serpentina, and key genes that contribute towards the diversification of specific metabolites.

  16. Differential Network Analysis Reveals Evolutionary Complexity in Secondary Metabolism of Rauvolfia serpentina over Catharanthus roseus.

    Science.gov (United States)

    Pathania, Shivalika; Bagler, Ganesh; Ahuja, Paramvir S

    2016-01-01

    Comparative co-expression analysis of multiple species using high-throughput data is an integrative approach to determine the uniformity as well as the diversification of biological processes. Rauvolfia serpentina and Catharanthus roseus, both members of the Apocynaceae family, are reported to have remedial properties against multiple diseases. Despite sharing the upstream portion of the terpenoid indole alkaloid pathway, there is significant diversity in the tissue-specific synthesis and accumulation of specialized metabolites in these plants. This led us to implement comparative co-expression network analysis to investigate the modules and genes responsible for differential tissue-specific expression as well as species-specific synthesis of metabolites. Toward these goals, differential network analysis was implemented to identify candidate genes responsible for the diversification of metabolite profiles. Three genes were identified with significant differences in connectivity leading to differential regulatory behavior between these plants. These genes may be responsible for the diversification of secondary metabolism, and thereby for species-specific metabolite synthesis. The network robustness of R. serpentina, determined from topological properties, was complemented by a comparison of the gene-metabolite networks of both plants, and suggests that R. serpentina may have evolved more complex metabolic mechanisms than C. roseus under the influence of various stimuli. This study reveals the evolution of complexity in the secondary metabolism of R. serpentina, and key genes that contribute toward the diversification of specific metabolites.

  17. Dynamics and causalities of atmospheric and oceanic data identified by complex networks and Granger causality analysis

    Science.gov (United States)

    Charakopoulos, A. K.; Katsouli, G. A.; Karakasidis, T. E.

    2018-04-01

    Understanding the underlying processes and extracting detailed characteristics of the spatiotemporal dynamics of the ocean and atmosphere, as well as their interaction, is of significant interest and has not been thoroughly established. The purpose of this study was to examine the performance of two complementary methodologies for the identification of spatiotemporal underlying dynamic characteristics and patterns among atmospheric and oceanic variables from Seawatch buoys in the Aegean and Ionian Seas, provided by the Hellenic Center for Marine Research (HCMR). The first approach involves cross-correlation analysis to investigate time-lagged relationships; further, to identify the direction of interactions between the variables, we applied the Granger causality method. In the second approach, the time series are converted into complex networks, and then the main topological network properties, such as degree distribution, average path length, diameter, modularity and clustering coefficient, are evaluated. Our results show that the proposed complex network analysis of time series can lead to the extraction of hidden spatiotemporal characteristics. Our findings also indicate high levels of positive and negative correlations and causalities among variables, both from the same buoy and between buoys at different stations, which cannot be determined using simple statistical measures.
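
The time-lagged relationships mentioned in the first approach can be probed with a Pearson cross-correlation computed over a range of lags; a minimal sketch (the synthetic signals and lag range here are invented for illustration, not taken from the study's buoy data):

```python
import numpy as np

def lagged_cross_correlation(x, y, max_lag):
    """Pearson correlation of x[t] with y[t + lag] for each lag in
    [-max_lag, max_lag]; a peak at a nonzero lag suggests a
    time-lagged relationship between the two series."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    corr = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            a, b = x[:-lag], y[lag:]
        elif lag < 0:
            a, b = x[-lag:], y[:lag]
        else:
            a, b = x, y
        corr[lag] = np.corrcoef(a, b)[0, 1]
    return corr

# Synthetic example: y echoes x three steps later, plus noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.roll(x, 3) + 0.1 * rng.standard_normal(500)
corr = lagged_cross_correlation(x, y, max_lag=5)
best_lag = max(corr, key=corr.get)  # expected to peak near lag 3
```

Granger causality then goes one step further than this symmetric measure, by testing whether past values of one series improve prediction of the other.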

  18. Combining biophysical methods for the analysis of protein complex stoichiometry and affinity in SEDPHAT

    International Nuclear Information System (INIS)

    Zhao, Huaying; Schuck, Peter

    2015-01-01

    Global multi-method analysis for protein interactions (GMMA) can increase the precision and complexity of binding studies for the determination of the stoichiometry, affinity and cooperativity of multi-site interactions. The principles and recent developments of biophysical solution methods implemented for GMMA in the software SEDPHAT are reviewed, their complementarity in GMMA is described and a new GMMA simulation tool set in SEDPHAT is presented. Reversible macromolecular interactions are ubiquitous in signal transduction pathways, often forming dynamic multi-protein complexes with three or more components. Multivalent binding and cooperativity in these complexes are often key motifs of their biological mechanisms. Traditional solution biophysical techniques for characterizing the binding and cooperativity are very limited in the number of states that can be resolved. A global multi-method analysis (GMMA) approach has recently been introduced that can leverage the strengths and the different observables of different techniques to improve the accuracy of the resulting binding parameters and to facilitate the study of multi-component systems and multi-site interactions. Here, GMMA is described in the software SEDPHAT for the analysis of data from isothermal titration calorimetry, surface plasmon resonance or other biosensing, analytical ultracentrifugation, fluorescence anisotropy and various other spectroscopic and thermodynamic techniques. The basic principles of these techniques are reviewed and recent advances in view of their particular strengths in the context of GMMA are described. Furthermore, a new feature in SEDPHAT is introduced for the simulation of multi-method data. In combination with specific statistical tools for GMMA in SEDPHAT, simulations can be a valuable step in experimental design.

  19. [Social-professional status, identity, social participation and media utilization. Analysis of a complex dynamics].

    Science.gov (United States)

    Laflamme, Simon; Roggero, Pascal; Southcott, Chris

    2010-08-01

    This article examines the link between the domain and level of occupation, on the one hand, and use of media, including internet, on the other. It adds to this investigation an analysis of identity in its relation to media use and accessibility. It challenges the hypothesis of a strong correlation between level of occupation and use and accessibility to media. It reveals complex phenomena of social homogenization and differentiation. Data is extracted from a sample of workers who completed a questionnaire which focused on use of media.

  20. Complex Analysis of 700-Year-Old Skeletal Remains found in an Unusual Grave: Case Report

    Czech Academy of Sciences Publication Activity Database

    Vaněk, D.; Brzobohatá, Hana; Šilerová, M.; Horák, Z.; Nývltová Fišáková, Miriam; Vašinová Galiová, M.; Zedníková Malá, P.; Urbanová, V.; Dobisíková, M.; Beran, M.; Brestovanský, P.

    2015-01-01

    Roč. 2, č. 5 (2015) ISSN 2332-0915 R&D Projects: GA ČR GB14-36938G Grant - others:GA MŠk(CZ) ED1.1.00/02.0068 Program:ED Institutional support: RVO:67985912 ; RVO:68081758 Keywords : mass spectrometry * genealogical * physical anomalies * anthropological Subject RIV: AC - Archeology, Anthropology, Ethnology http://www.omicsonline.org/open-access/complex-analysis-of-700yearold-skeletal-remains-found-in-an-unusualgravecase-report-2332-0915-1000138.pdf

  1. Cognitive human reliability analysis for an assessment of the safety significance of complex transients

    International Nuclear Information System (INIS)

    Amico, P.J.; Hsu, C.J.; Youngblood, R.W.; Fitzpatrick, R.G.

    1989-01-01

    As part of a probabilistic assessment of the safety significance of complex transients at certain PWR power plants, it was necessary to perform a cognitive human reliability analysis. To increase confidence in the results, it was desirable to make use of actual observations of operator response which were available for the assessment. An approach was developed which incorporated these observations into the human cognitive reliability (HCR) modeling approach. The results obtained provided additional insights over what would have been found using other approaches. These insights were supported by the observations, and it is suggested that this approach be considered for use in future probabilistic safety assessments

  2. Osmium tetroxide complexes as versatile tools for structure probing and electrochemical analysis of biopolymers

    Czech Academy of Sciences Publication Activity Database

    Fojta, Miroslav; Kostečka, Pavel; Pivoňková, Hana; Horáková Brázdilová, Petra; Havran, Luděk

    2011-01-01

    Roč. 7, č. 1 (2011), s. 35-50 ISSN 1573-4110 R&D Projects: GA AV ČR(CZ) IAA400040901; GA AV ČR(CZ) IAA400040903; GA ČR(CZ) GP203/08/P598; GA MŠk(CZ) LC06035 Institutional research plan: CEZ:AV0Z50040507; CEZ:AV0Z50040702 Keywords : osmium complexes * DNA labelling * electrochemical analysis Subject RIV: BO - Biophysics Impact factor: 1.000, year: 2011

  3. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of its Matlab®-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex...... system at a high level of functional abstraction, analyze single and multiple fault scenarios and automatically generate parity relations for diagnosis for the system in normal and impaired conditions. User interface and algorithmic details are presented....

  4. Music video shot segmentation using independent component analysis and keyframe extraction based on image complexity

    Science.gov (United States)

    Li, Wei; Chen, Ting; Zhang, Wenjun; Shi, Yunyu; Li, Jun

    2012-04-01

    In recent years, music video data has been increasing at an astonishing speed. Shot segmentation and keyframe extraction are fundamental steps in organizing, indexing, and retrieving video content. In this paper a unified framework is proposed to detect shot boundaries and extract the keyframe of a shot. A music video is first segmented into shots using an illumination-invariant chromaticity histogram in independent component (IC) analysis feature space. We then present a new metric, image complexity, computed from the ICs, to extract the keyframe of a shot. Experimental results show that the framework is effective and performs well.

  5. Approximate analytical solutions in the analysis of elastic structures of complex geometry

    Science.gov (United States)

    Goloskokov, Dmitriy P.; Matrosov, Alexander V.

    2018-05-01

    A method of analytical decomposition for the analysis of plane structures of complex configuration is presented. For each rectangular part of the structure, all the components of the stress-strain state are constructed by the superposition method. The method is based on two solutions derived in the form of trigonometric series with unknown coefficients using the method of initial functions. The coefficients are determined from the system of linear algebraic equations obtained by satisfying the boundary conditions and the conditions for joining the structure parts. The components of the stress-strain state of a bent plate with holes are calculated using the analytical decomposition method.

  6. Structural analysis of the Rubjerg Knude Glaciotectonic Complex, Vendsyssel, Northern Denmark

    Directory of Open Access Journals (Sweden)

    Pedersen, Stig A. Schack

    2005-12-01

    Full Text Available The Rubjerg Knude Glaciotectonic Complex is a thin-skinned thrust-fault complex that was formed during the advance of the Scandinavian Ice Sheet (30 000 – 26 000 B.P.); it is well exposed in a 6 km long coastal profile bordering the North Sea in northern Denmark. The glaciotectonic thrust-fault deformation revealed by this cliff section has been subjected to detailed structural analysis based on photogrammetric measurement and construction of a balanced cross-section. Thirteen sections are differentiated, characterising the distal to proximal structural development of the complex. The deformation affected three stratigraphic units: the Middle Weichselian arctic marine Stortorn Formation, the mainly glaciolacustrine Lønstrup Klint Formation and the dominantly fluvial Rubjerg Knude Formation; these three formations are formally defined herein, together with the Skærumhede Group which includes the Stortorn and Lønstrup Klint Formations. The Rubjerg Knude Formation was deposited on a regional unconformity that caps the Lønstrup Klint Formation and separates pre-tectonic deposits below from syntectonic deposits above. In the distal part of the complex, the thrust-fault architecture is characterised by thin flat-lying thrust sheets displaced over the footwall flat of the foreland for a distance of more than 500 m. Towards the proximal part of the complex, the dip of the thrust faults increases, and over long stretches they are over-steepened to an upright position. The lowest décollement zone is about 40 m below sea level in the proximal part of the system, and shows a systematic step-wise change to higher levels in a distal (southwards) direction. The structural elements are ramps and flats related to hanging-wall and footwall positions. Above upper ramp-hinges, hanging-wall anticlines developed; footwall synclines are typically related to growth-fault sedimentation in syntectonic piggyback basins, represented by the Rubjerg Knude Formation. Blocks

  7. Experimental design for optimizing MALDI-TOF-MS analysis of palladium complexes

    Directory of Open Access Journals (Sweden)

    Rakić-Kostić Tijana M.

    2017-01-01

    Full Text Available This paper presents optimization of matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) mass spectrometer (MS) instrumental parameters for the analysis of the chloro(2,2′:6′,2″-terpyridine)palladium(II) chloride dihydrate complex, applying design of experiments (DoE) methodology. This complex is of interest for potential use in cancer therapy. DoE methodology has proven successful in the optimization of many complex analytical problems; however, it has rarely been used for MALDI-TOF-MS optimization up to now. Theoretical mathematical relationships explaining the influence of the important experimental factors (laser energy, grid voltage and number of laser shots) on the selected responses (signal-to-noise ratio, S/N, and resolution, R, of the leading peak) are established. The optimal instrumental settings providing maximal S/N and R are identified and experimentally verified. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. 172052 and Grant no. 172011]

  8. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been used intensively in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far, but many of them lack a meaningful interpretation, e.g., it is unclear which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
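As an illustration of the partition-based paradigm, here is a minimal sketch of an entropy measure in the spirit of the topological information content. The classical measure partitions vertices by automorphism-group orbits; this sketch substitutes the coarser degree partition, so it is a simplified stand-in, not the measure studied in the paper.

```python
import math
from collections import Counter

def partition_information_content(degrees):
    """Shannon entropy (bits) of the vertex partition given by degree
    classes. The classical topological information content uses orbits of
    the automorphism group; the degree partition here is a coarser
    surrogate chosen for brevity (an assumption of this sketch)."""
    n = len(degrees)
    counts = Counter(degrees)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Adjacency list of a small "chemical" graph: a 4-atom path (n-butane skeleton)
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
degs = [len(adj[v]) for v in adj]  # [1, 2, 2, 1]
print(round(partition_information_content(degs), 3))  # two equal classes -> 1.0 bit
```

A graph whose vertices all fall into one class (e.g. a cycle, all degrees 2) scores 0 bits, which is the sense in which such measures quantify structural heterogeneity.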

  9. Patterns of precipitation and soil moisture extremes in Texas, US: A complex network analysis

    Science.gov (United States)

    Sun, Alexander Y.; Xia, Youlong; Caldwell, Todd G.; Hao, Zengchao

    2018-02-01

    Understanding of the spatial and temporal dynamics of extreme precipitation not only improves prediction skills, but also helps to prioritize hazard mitigation efforts. This study seeks to enhance the understanding of spatiotemporal covariation patterns embedded in precipitation (P) and soil moisture (SM) by using an event-based, complex-network-theoretic approach. Event concurrences are quantified using a nonparametric event synchronization measure, and spatial patterns of hydroclimate variables are analyzed by using several network measures and a community detection algorithm. SM-P coupling is examined using a directional event coincidence analysis measure that takes the order of event occurrences into account. The complex network approach is demonstrated for Texas, US, a region that possesses a rich set of hydroclimate features and is frequented by catastrophic flooding. Gridded daily observed P data and simulated SM data are used to create complex networks of P and SM extremes. The uncovered high degree centrality regions and community structures are qualitatively in agreement with the overall existing knowledge of hydroclimate extremes in the study region. Our analyses provide new visual insights into the propagation, connectivity, and synchronicity of P extremes, as well as the SM-P coupling, in this flood-prone region, and can readily be used as a basis for event-driven predictive analytics for other regions.
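The directional event coincidence idea can be sketched as a toy precursor-rate calculation; the inputs are invented and the full method also tests significance against a Poisson null model, which is omitted here.

```python
def event_coincidence_rate(events_a, events_b, delta_t):
    """Directional event coincidence: the fraction of events in series A
    that are followed within delta_t (order-preserving) by at least one
    event in series B. A minimal sketch of the precursor coincidence rate
    used in event coincidence analysis; significance testing is omitted."""
    if not events_a:
        return 0.0
    hits = sum(
        1 for ta in events_a
        if any(0 <= tb - ta <= delta_t for tb in events_b)
    )
    return hits / len(events_a)

# Toy daily indices: precipitation extremes (A) preceding soil-moisture extremes (B)
precip_days = [3, 10, 25]
sm_days = [4, 11, 40]
print(event_coincidence_rate(precip_days, sm_days, delta_t=2))  # 2 of 3 P events are followed by an SM event
```

Because the measure is directional, swapping the two series generally gives a different rate, which is what lets the analysis distinguish P-then-SM coupling from the reverse ordering.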

  10. Analysis Thermal Comfort Condition in Complex Residential Building, Case Study: Chiangmai, Thailand

    Science.gov (United States)

    Juangjandee, Warangkana

    2017-10-01

    Due to the increasing need for complex residential buildings, people migrate into high-density urban areas because infrastructural facilities are readily available in modern metropolitan areas. Such rapid urbanization creates congested residential buildings that obstruct solar radiation and wind flow, while most urban residents spend 80-90% of their time indoors. Furthermore, the buildings were mostly built with average materials and construction detailing. This causes high indoor humidity for tenants, which can promote mould growth. This study aims to analyse the thermal comfort condition in a complex residential building in Thailand in order to find passive solutions that improve indoor air quality and respond to local conditions. The research methodology is twofold: 1) a survey of the case study, and 2) analysis to identify passive solutions for reducing indoor humidity. The survey indicated that the building needs passive solutions to its humidity problem. These fall into two categories: raising ventilation and raising indoor temperature. The former includes increasing wind-driven and stack-driven ventilation through improved building design; the latter, raising the mean radiant temperature, can be achieved with daylighting, which reduces humidity and enhances indoor illumination simultaneously.

  11. Analysis of Pilot Feedback Regarding the Use of State Awareness Technologies During Complex Situations

    Science.gov (United States)

    Evans, Emory; Young, Steven D.; Daniels, Taumi; Santiago-Espada, Yamira; Etherington, Tim

    2016-01-01

    A flight simulation study was conducted at NASA Langley Research Center to evaluate flight deck systems that (1) predict aircraft energy state and/or autoflight configuration, (2) present the current state and expected future state of automated systems, and/or (3) show the state of flight-critical data systems in use by automated systems and primary flight instruments. Four new technology concepts were evaluated vis-à-vis current state-of-the-art flight deck systems and indicators. This human-in-the-loop study was conducted using commercial airline crews. Scenarios spanned a range of complex conditions and several emulated causal factors and complexity in recent accidents involving loss of state awareness by pilots (e.g. energy state, automation state, and/or system state). Data were collected via questionnaires administered after each flight, audio/video recordings, physiological data, head and eye tracking data, pilot control inputs, and researcher observations. This paper strictly focuses on findings derived from the questionnaire responses. It includes analysis of pilot subjective measures of complexity, decision making, workload, situation awareness, usability, and acceptability.

  12. Microarray R-based analysis of complex lysate experiments with MIRACLE.

    Science.gov (United States)

    List, Markus; Block, Ines; Pedersen, Marlene Lemvig; Christiansen, Helle; Schmidt, Steffen; Thomassen, Mads; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan

    2014-09-01

    Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. Typical challenges involved in this technology are antibody selection, sample preparation and optimization of staining conditions. The issue of combining effective sample management and data analysis, however, has been widely neglected. This motivated us to develop MIRACLE, a comprehensive and user-friendly web application bridging the gap between spotting and array analysis by conveniently keeping track of sample information. Data processing includes correction of staining bias, estimation of protein concentration from response curves, normalization for total protein amount per sample and statistical evaluation. Established analysis methods have been integrated with MIRACLE, offering experimental scientists an end-to-end solution for sample management and for carrying out data analysis. In addition, experienced users have the possibility to export data to R for more complex analyses. MIRACLE thus has the potential to further spread utilization of RPPAs as an emerging technology for high-throughput protein analysis. Project URL: http://www.nanocan.org/miracle/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
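One of the processing steps listed, normalization for total protein amount per sample, can be sketched as a simple per-sample division. This is a hypothetical minimal example, not MIRACLE's actual pipeline, which also corrects staining bias and fits response curves.

```python
def normalize_rppa(signals, total_protein):
    """Normalize per-sample antibody staining intensity by relative total
    protein loading, so differences in spotted amount do not masquerade
    as differences in target abundance. A minimal sketch of one step in
    an RPPA workflow (other MIRACLE steps are omitted)."""
    return [s / t for s, t in zip(signals, total_protein)]

raw = [200.0, 450.0, 300.0]   # antibody staining intensity per sample
loading = [1.0, 1.5, 1.0]     # relative total protein per sample
print(normalize_rppa(raw, loading))  # [200.0, 300.0, 300.0]
```

After normalization the second sample's apparent 2.25-fold signal becomes a 1.5-fold difference, illustrating why loading correction precedes any statistical evaluation.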

  13. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    Science.gov (United States)

    Pasten, D.

    2017-12-01

    Earthquake complex networks have been shown to detect specific features in seismic data sets. In space, these networks have shown a scale-free behavior of the probability distribution of connectivity for directed networks, and a small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large earthquake Mw 8.2 in the north of Chile (near Iquique) in April, 2014. An earthquake complex network is made by dividing the three-dimensional space into cubic cells; if one of these cells contains a hypocenter, we label that cell as a node. The connections between nodes are generated in time: we follow the time sequence of seismic events and make connections between the corresponding nodes. We thus have two different networks: a directed and an undirected network. The directed network takes into consideration the time-direction of the connections, which is very important for the connectivity of the network: we consider the connectivity ki of the i-th node to be the number of connections going out of node i plus the self-connections (if two seismic events occurred successively in time in the same cubic cell, we have a self-connection). The undirected network is made by removing the direction of the connections and the self-connections from the directed network. For undirected networks, we consider only whether two nodes are connected or not. We have built a directed complex network and an undirected complex network, before and after the large earthquake in Iquique. We have used magnitudes greater than Mw = 1.0 and Mw = 3.0. We found that this method can recognize the influence of the small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake on this complex system. 
This method also shows a difference in the values of the critical exponent γ (for the probability
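The cell-based network construction described in the abstract can be sketched in a few lines; the cell size and hypocenter coordinates below are illustrative assumptions.

```python
from collections import defaultdict

def build_directed_network(hypocenters, cell_size):
    """Build the directed earthquake network described in the abstract:
    space is divided into cubic cells, each cell holding a hypocenter
    becomes a node, and consecutive events in time connect their cells
    (an event in the same cell as its predecessor is a self-connection).
    Hypocenters are (x, y, z) in km, already ordered in time."""
    cell = lambda p: tuple(int(c // cell_size) for c in p)
    edges = defaultdict(int)
    for prev, curr in zip(hypocenters, hypocenters[1:]):
        edges[(cell(prev), cell(curr))] += 1
    # connectivity k_i = connections going out of node i (self-loops included)
    k = defaultdict(int)
    for (src, dst), w in edges.items():
        k[src] += w
    return dict(edges), dict(k)

# Four toy events; the 1st, 2nd and 4th fall in the same 2 km cell
quakes = [(0.5, 0.5, 5.0), (0.7, 0.4, 5.2), (9.0, 9.0, 30.0), (0.6, 0.5, 5.1)]
edges, k = build_directed_network(quakes, cell_size=2.0)
print(k)
```

The undirected variant of the abstract is obtained from `edges` by dropping edge direction and self-loops and keeping only whether a pair of cells is connected.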

  14. Molecular signature of complex regional pain syndrome (CRPS) and its analysis.

    Science.gov (United States)

    König, Simone; Schlereth, Tanja; Birklein, Frank

    2017-10-01

    Complex Regional Pain Syndrome (CRPS) is a rare, but often disabling pain disease. Biomarkers are lacking, but several inflammatory substances have been associated with the pathophysiology. This review outlines the current knowledge with respect to target biomolecules and the analytical tools available to measure them. Areas covered: Targets include cytokines, neuropeptides and resolvins; analysis strategies are thus needed for different classes of substances such as proteins, peptides, lipids and small molecules. Traditional methods like immunoassays are of importance next to state-of-the art high-resolution mass spectrometry techniques and 'omics' approaches. Expert commentary: Future biomarker studies need larger cohorts, which improve subgrouping of patients due to their presumed pathophysiology, and highly standardized workflows from sampling to analysis.

  15. A Variable Stiffness Analysis Model for Large Complex Thin-Walled Guide Rail

    Directory of Open Access Journals (Sweden)

    Wang Xiaolong

    2016-01-01

    Full Text Available A large complex thin-walled guide rail has a complicated structure and non-uniform, low rigidity. Traditional cutting simulations are time consuming due to the huge computation involved, especially for large workpieces. To solve these problems, a more efficient variable stiffness analysis model is proposed, which can obtain quantitative stiffness values of the machined surface. By applying simulated cutting forces at sampling points using the finite element analysis software ABAQUS, the single-direction variable stiffness rule can be obtained. A variable stiffness matrix is then proposed by analyzing the multi-directional coupled variable stiffness rules. Combined with the cutting force values in the three directions, the reasonability of existing processing parameters can be verified and optimized cutting parameters can be designed.

  16. The Schrödinger–Robinson inequality from stochastic analysis on a complex Hilbert space

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2013-01-01

    We explored stochastic analysis on a complex Hilbert space to show that one of the cornerstones of quantum mechanics (QM), namely Heisenberg's uncertainty relation, can be derived in the classical probabilistic framework. We created a new mathematical representation of quantum averages: as averages with respect to classical random fields. The existence of a classical stochastic model matching Heisenberg's uncertainty relation makes the connection between classical and quantum probabilistic models essentially closer. In real physical situations, random fields are valued in the L²-space. Hence, although we model QM and not QFT, the classical systems under consideration have an infinite number of degrees of freedom, and in our modeling infinite-dimensional stochastic analysis is the basic mathematical tool. (comment)
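For reference, the inequality in the title (more commonly written Schrödinger–Robertson), of which Heisenberg's relation for position and momentum is the special case:

```latex
\sigma_A^2\,\sigma_B^2 \;\ge\;
\left| \tfrac{1}{2}\langle \{\hat{A},\hat{B}\} \rangle
     - \langle \hat{A}\rangle \langle \hat{B}\rangle \right|^2
+ \left| \tfrac{1}{2i}\langle [\hat{A},\hat{B}] \rangle \right|^2,
\qquad
[\hat{x},\hat{p}] = i\hbar \;\Rightarrow\; \sigma_x\,\sigma_p \ge \tfrac{\hbar}{2}.
```

Dropping the anticommutator (covariance) term on the right recovers the weaker Robertson form, and specializing to position and momentum gives Heisenberg's relation.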

  17. Investigation on Law and Economics Based on Complex Network and Time Series Analysis

    Science.gov (United States)

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing. PMID:26076460

  18. Characterizing scaling properties of complex signals with missed data segments using the multifractal analysis

    Science.gov (United States)

    Pavlov, A. N.; Pavlova, O. N.; Abdurashitov, A. S.; Sindeeva, O. A.; Semyachkina-Glushkovskaya, O. V.; Kurths, J.

    2018-01-01

    The scaling properties of complex processes may be highly influenced by the presence of various artifacts in experimental recordings. Their removal produces changes in the singularity spectra and the Hölder exponents as compared with the original artifacts-free data, and these changes are significantly different for positively correlated and anti-correlated signals. While signals with power-law correlations are nearly insensitive to the loss of significant parts of data, the removal of fragments of anti-correlated signals is more crucial for further data analysis. In this work, we study the ability of characterizing scaling features of chaotic and stochastic processes with distinct correlation properties using a wavelet-based multifractal analysis, and discuss differences between the effect of missed data for synchronous and asynchronous oscillatory regimes. We show that even an extreme data loss allows characterizing physiological processes such as the cerebral blood flow dynamics.

  19. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  20. Time series analysis of embodied interaction: Movement variability and complexity matching as dyadic properties

    Directory of Open Access Journals (Sweden)

    Leonardo Zapata-Fonseca

    2016-12-01

    Full Text Available There is a growing consensus that a fuller understanding of social cognition depends on more systematic studies of real-time social interaction. Such studies require methods that can deal with the complex dynamics taking place at multiple interdependent temporal and spatial scales, spanning sub-personal, personal, and dyadic levels of analysis. We demonstrate the value of adopting an extended multi-scale approach by re-analyzing movement time series generated in a study of embodied dyadic interaction in a minimal virtual reality environment (a perceptual crossing experiment. Reduced movement variability revealed an interdependence between social awareness and social coordination that cannot be accounted for by either subjective or objective factors alone: it picks out interactions in which subjective and objective conditions are convergent (i.e. elevated coordination is perceived as clearly social, and impaired coordination is perceived as socially ambiguous. This finding is consistent with the claim that interpersonal interaction can be partially constitutive of direct social perception. Clustering statistics (Allan Factor of salient events revealed fractal scaling. Complexity matching defined as the similarity between these scaling laws was significantly more pronounced in pairs of participants as compared to surrogate dyads. This further highlights the multi-scale and distributed character of social interaction and extends previous complexity matching results from dyadic conversation to nonverbal social interaction dynamics. Trials with successful joint interaction were also associated with an increase in local coordination. Consequently, a local coordination pattern emerges on the background of complex dyadic interactions in the PCE task and makes joint successful performance possible.
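The Allan Factor statistic used for the clustering analysis can be sketched for a single counting window as follows. This is a minimal version: the study evaluates it across many window sizes to extract scaling laws, and complexity matching then compares those scaling laws between partners.

```python
def allan_factor(event_times, window):
    """Allan Factor of a point process at one counting window T:
    AF(T) = <(N_{i+1} - N_i)^2> / (2 <N_i>), where N_i is the number of
    events in the i-th window of length T. AF(T) stays near 1 for a
    Poisson stream and grows as a power law for fractal (clustered)
    event streams, whose log-log slope is the scaling exponent."""
    t_max = max(event_times)
    n_windows = int(t_max // window) + 1
    counts = [0] * n_windows
    for t in event_times:
        counts[int(t // window)] += 1
    diffs = [(b - a) ** 2 for a, b in zip(counts, counts[1:])]
    mean_count = sum(counts) / len(counts)
    return sum(diffs) / len(diffs) / (2 * mean_count)

# A perfectly regular stream has zero count variability at this scale
print(allan_factor([i + 0.5 for i in range(10)], 1.0))  # 0.0
```

Complexity matching in the abstract's sense would compare the vector of AF values over a range of windows for each participant, rather than a single AF(T).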

  1. Meta-analysis of mismatch negativity to simple versus complex deviants in schizophrenia.

    Science.gov (United States)

    Avissar, Michael; Xie, Shanghong; Vail, Blair; Lopez-Calderon, Javier; Wang, Yuanjia; Javitt, Daniel C

    2018-01-01

    Mismatch negativity (MMN) deficits in schizophrenia (SCZ) have been studied extensively since the early 1990s, with the vast majority of studies using simple auditory oddball task deviants that vary in a single acoustic dimension such as pitch or duration. There has been a growing interest in using more complex deviants that violate more abstract rules to probe higher order cognitive deficits. It is still unclear how sensory processing deficits compare to and contribute to higher order cognitive dysfunction, which can be investigated with later attention-dependent auditory event-related potential (ERP) components such as a subcomponent of P300, P3b. In this meta-analysis, we compared MMN deficits in SCZ using simple deviants to more complex deviants. We also pooled studies that measured MMN and P3b in the same study sample and examined the relationship between MMN and P3b deficits within study samples. Our analysis reveals that, to date, studies using simple deviants demonstrate larger deficits than those using complex deviants, with effect sizes in the range of moderate to large. The difference in effect sizes between deviant types was reduced significantly when accounting for magnitude of MMN measured in healthy controls. P3b deficits, while large, were only modestly greater than MMN deficits (d=0.21). Taken together, our findings suggest that MMN to simple deviants may still be optimal as a biomarker for SCZ and that sensory processing dysfunction contributes significantly to MMN deficit and disease pathophysiology. Copyright © 2017 Elsevier B.V. All rights reserved.
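The effect-size comparison underlying such a meta-analysis can be illustrated with a standard Cohen's d and a naive sample-size-weighted pooling. All numbers below are hypothetical, and published meta-analyses use inverse-variance weights and heterogeneity models rather than this sketch.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d with pooled standard deviation, the usual
    between-group effect size for a two-group comparison."""
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                   / (n1 + n2 - 2))
    return (mean1 - mean2) / sp

def pooled_effect(ds, ns):
    """Sample-size-weighted mean of per-study effect sizes. A
    simplification: proper fixed-effect pooling weights each d by the
    inverse of its sampling variance."""
    return sum(d * n for d, n in zip(ds, ns)) / sum(ns)

# Hypothetical MMN amplitudes (uV) in controls vs. SCZ for two studies
d1 = cohens_d(-3.0, 1.0, 20, -2.0, 1.0, 20)  # controls show larger (more negative) MMN
d2 = cohens_d(-3.5, 1.0, 30, -2.4, 1.0, 30)
print(round(pooled_effect([d1, d2], [40, 60]), 2))  # -1.06
```

The negative sign reflects the convention that a larger (more negative) MMN in controls yields a deficit in patients; the abstract's "moderate to large" range corresponds to |d| around 0.5 to 0.8 and above.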

  2. Methods for the analysis of complex fluorescence decays: sum of Becquerel functions versus sum of exponentials

    International Nuclear Information System (INIS)

    Menezes, Filipe; Fedorov, Alexander; Baleizão, Carlos; Berberan-Santos, Mário N; Valeur, Bernard

    2013-01-01

    Ensemble fluorescence decays are usually analyzed with a sum of exponentials. However, broad continuous distributions of lifetimes, either unimodal or multimodal, occur in many situations. A simple and flexible fitting function for these cases that encompasses the exponential is the Becquerel function. In this work, the applicability of the Becquerel function for the analysis of complex decays of several kinds is tested. For this purpose, decays of mixtures of four different fluorescence standards (binary, ternary and quaternary mixtures) are measured and analyzed. For binary and ternary mixtures, the expected sum of narrow distributions is well recovered from the Becquerel functions analysis, if the correct number of components is used. For ternary mixtures, however, satisfactory fits are also obtained with a number of Becquerel functions smaller than the true number of fluorophores in the mixture, at the expense of broadening the lifetime distributions of the fictitious components. The quaternary mixture studied is well fitted with both a sum of three exponentials and a sum of two Becquerel functions, showing the inevitable loss of information when the number of components is large. Decays of a fluorophore in a heterogeneous environment, known to be represented by unimodal and broad continuous distributions (as previously obtained by the maximum entropy method), are also measured and analyzed. It is concluded that these distributions can be recovered by the Becquerel function method with an accuracy similar to that of the much more complex maximum entropy method. It is also shown that the polar (or phasor) plot is not always helpful for ascertaining the degree (and kind) of complexity of a fluorescence decay. (paper)
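A sketch of the fitting function discussed, in one common parameterization of the Becquerel (compressed-hyperbola) function; the exact form and normalization used in the paper may differ.

```python
import math

def becquerel(t, tau, beta):
    """Becquerel (compressed hyperbola) decay in one common
    parameterization: I(t) = [1 + (1 - beta) t / tau]^(-1/(1 - beta)),
    0 < beta <= 1. As beta -> 1 it reduces to exp(-t / tau), which is
    why a sum of Becquerel functions generalizes a sum of exponentials
    while also covering broad lifetime distributions."""
    if abs(1.0 - beta) < 1e-12:
        return math.exp(-t / tau)
    return (1.0 + (1.0 - beta) * t / tau) ** (-1.0 / (1.0 - beta))

def decay_sum(t, components):
    """Weighted sum of Becquerel components: [(amplitude, tau, beta), ...]."""
    return sum(a * becquerel(t, tau, b) for a, tau, b in components)

# beta close to 1 recovers a plain exponential decay
print(abs(becquerel(2.0, 1.0, 0.999999) - math.exp(-2.0)) < 1e-4)  # True
```

In a fit, each component contributes three parameters (amplitude, tau, beta) instead of an exponential's two, which is the flexibility that lets fewer Becquerel components absorb broadened lifetime distributions, as the abstract observes for the ternary mixtures.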

  3. DNA Barcode Analysis of Thrips (Thysanoptera) Diversity in Pakistan Reveals Cryptic Species Complexes.

    Science.gov (United States)

    Iftikhar, Romana; Ashfaq, Muhammad; Rasool, Akhtar; Hebert, Paul D N

    2016-01-01

    Although thrips are globally important crop pests and vectors of viral disease, species identifications are difficult because of their small size and inconspicuous morphological differences. Sequence variation in the mitochondrial COI-5' (DNA barcode) region has proven effective for the identification of species in many groups of insect pests. We analyzed barcode sequence variation among 471 thrips from various plant hosts in north-central Pakistan. The Barcode Index Number (BIN) system assigned these sequences to 55 BINs, while the Automatic Barcode Gap Discovery detected 56 partitions, a count that coincided with the number of monophyletic lineages recognized by Neighbor-Joining analysis and Bayesian inference. Congeneric species showed an average of 19% sequence divergence (range = 5.6% - 27%) at COI, while intraspecific distances averaged 0.6% (range = 0.0% - 7.6%). BIN analysis suggested that all intraspecific divergence >3.0% actually involved a species complex. In fact, sequences for three major pest species (Haplothrips reuteri, Thrips palmi, Thrips tabaci), and one predatory thrips (Aeolothrips intermedius) showed deep intraspecific divergences, providing evidence that each is a cryptic species complex. The study compiles the first barcode reference library for the thrips of Pakistan, and examines global haplotype diversity in four important pest thrips.
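The >3% divergence heuristic for flagging possible cryptic complexes can be illustrated with an uncorrected p-distance on aligned sequences. The 20-bp fragments below are invented, and the study itself relied on BIN and ABGD partitions rather than a fixed threshold alone.

```python
def p_distance(seq1, seq2):
    """Uncorrected pairwise distance between two aligned barcode
    sequences: the fraction of differing sites, skipping positions
    where either sequence has a gap or ambiguity code."""
    pairs = [(a, b) for a, b in zip(seq1, seq2)
             if a in "ACGT" and b in "ACGT"]
    if not pairs:
        return 0.0
    return sum(a != b for a, b in pairs) / len(pairs)

# Hypothetical COI fragments from two conspecific samples
a = "ACGTACGTACGTACGTACGT"
b = "ACGTACGTTCGTACGTACGA"
d = p_distance(a, b)
print(d, d > 0.03)  # 2 of 20 sites differ: 0.1, above the 3% flag
```

Real analyses use full-length (~650 bp) COI barcodes and distance corrections (e.g. Kimura 2-parameter), but the thresholding logic is the same.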

  4. LC-MS/MS signal suppression effects in the analysis of pesticides in complex environmental matrices.

    Science.gov (United States)

    Choi, B K; Hercules, D M; Gusev, A I

    2001-02-01

    The application of LC separation and mobile phase additives in addressing LC-MS/MS matrix signal suppression effects for the analysis of pesticides in a complex environmental matrix was investigated. It was shown that signal suppression is most significant for analytes eluting early in the LC-MS analysis. Introduction of different buffers (e.g. ammonium formate, ammonium hydroxide, formic acid) into the LC mobile phase was effective in improving signal correlation between the matrix and standard samples. The signal improvement is dependent on buffer concentration as well as LC separation of the matrix components. The application of LC separation alone was not effective in addressing suppression effects when characterizing complex matrix samples. Overloading of the LC column by matrix components was found to significantly contribute to analyte-matrix co-elution and suppression of signal. This signal suppression effect can be efficiently compensated by 2D LC (LC-LC) separation techniques. The effectiveness of buffers and LC separation in improving signal correlation between standard and matrix samples is discussed.
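Matrix suppression is commonly reported as a percent matrix effect computed from signals in matrix versus pure solvent. A minimal sketch with invented numbers (the paper itself addresses suppression via mobile-phase buffers and LC-LC separation rather than this post-hoc metric):

```python
def matrix_effect_percent(signal_in_matrix, signal_in_solvent):
    """Percent matrix effect as commonly defined in LC-MS/MS method
    validation: 100 * (matrix signal / solvent signal). Values below
    100 indicate ion suppression; above 100, enhancement."""
    return 100.0 * signal_in_matrix / signal_in_solvent

# Hypothetical peak areas: early-eluting pesticides co-elute with more
# matrix components and so suffer stronger suppression
print(matrix_effect_percent(4.2e4, 1.0e5))  # 42.0 -> 58% suppression
print(matrix_effect_percent(9.1e4, 1.0e5))  # 91.0 -> mild suppression
```

Tracking this number per analyte before and after adding a buffer (or a second LC dimension) is one way to quantify the signal-correlation improvements the abstract describes.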

  5. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    Science.gov (United States)

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and has accomplished an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo.

  6. Agent-based financial dynamics model from stochastic interacting epidemic system and complexity analysis

    International Nuclear Information System (INIS)

    Lu, Yunfan; Wang, Jun; Niu, Hongli

    2015-01-01

    An agent-based financial stock price model is developed and investigated by a stochastic interacting epidemic system, which is one of the statistical physics systems and has been used to model the spread of an epidemic or a forest fire. Numerical and statistical analysis are performed on the simulated returns of the proposed financial model. Complexity properties of the financial time series are explored by calculating the correlation dimension and using the modified multiscale entropy method. In order to verify the rationality of the financial model, the real stock market indexes, Shanghai Composite Index and Shenzhen Component Index, are studied in comparison with the simulation data of the proposed model for the different infectiousness parameters. The empirical research reveals that this financial model can reproduce some important features of the real stock markets. - Highlights: • A new agent-based financial price model is developed by stochastic interacting epidemic system. • The structure of the proposed model allows to simulate the financial dynamics. • Correlation dimension and MMSE are applied to complexity analysis of financial time series. • Empirical results show the rationality of the proposed financial model
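The idea of an epidemic process driving price dynamics can be caricatured in a few lines. Everything here (the contact rule, the parameter values, the return definition) is an illustrative assumption, not the stochastic interacting epidemic system actually specified in the paper.

```python
import random

def simulate_returns(n_agents=500, n_steps=200, p_infect=0.3,
                     p_recover=0.1, seed=7):
    """Toy epidemic-driven return series: 'infected' agents are active
    traders, infection spreads by random contact, and the return at each
    step is taken to be proportional to the signed net change in the
    number of active traders. All parameters are illustrative."""
    rng = random.Random(seed)
    infected = max(1, n_agents // 50)
    returns = []
    for _ in range(n_steps):
        new_inf = sum(rng.random() < p_infect * infected / n_agents
                      for _ in range(n_agents - infected))
        recovered = sum(rng.random() < p_recover for _ in range(infected))
        prev = infected
        infected = max(1, infected + new_inf - recovered)
        sign = 1 if rng.random() < 0.5 else -1  # buyers vs. sellers dominate
        returns.append(sign * (infected - prev) / n_agents)
    return returns

r = simulate_returns()
print(len(r))  # 200
```

The series produced this way could then be fed to the complexity measures the abstract mentions (correlation dimension, modified multiscale entropy) for comparison with real index data.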

  7. Macroscopic Spatial Complexity of the Game of Life Cellular Automaton: A Simple Data Analysis

    Science.gov (United States)

    Hernández-Montoya, A. R.; Coronel-Brizio, H. F.; Rodríguez-Achach, M. E.

    In this chapter we present a simple data analysis of an ensemble of 20 time series, generated by averaging the spatial positions of the living cells for each state of the Game of Life cellular automaton (GoL). We show that complexity properties of GoL are also present at the macroscopic level described by these time series, and that the following emergent properties, typical of data extracted from complex systems such as financial or economic ones, come out: variations of the generated time series follow an asymptotic power-law distribution; large fluctuations tend to be followed by large fluctuations, and small fluctuations by small ones; and linear correlations decay fast, whereas the correlations of the absolute variations exhibit long-range memory. Finally, a Detrended Fluctuation Analysis (DFA) of the generated time series indicates that the GoL spatial macro-states described by the time series are neither completely ordered nor random, in a measurable and very interesting way.
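A bare-bones version of the DFA procedure mentioned above, assuming a one-dimensional series; the scale choices and the white-noise check are illustrative.

```python
import math
import random

def dfa_alpha(series, scales=(8, 16, 32, 64)):
    """Bare-bones DFA: integrate the demeaned series, detrend linearly in
    non-overlapping windows of each scale n, compute the RMS fluctuation
    F(n), and return the log-log slope of F(n) vs. n. alpha ~ 0.5 for
    white noise, ~1.5 for Brownian motion; intermediate values signal
    partial ordering of the kind the chapter discusses."""
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for x in series:
        s += x - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in scales:
        rss, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            window = profile[start:start + n]
            xs = range(n)
            mx, my = (n - 1) / 2, sum(window) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, window))
            var = sum((x - mx) ** 2 for x in xs)
            slope = cov / var
            intercept = my - slope * mx
            rss += sum((y - (slope * x + intercept)) ** 2
                       for x, y in zip(xs, window))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(rss / count))
    mx = sum(log_n) / len(log_n)
    my = sum(log_f) / len(log_f)
    num = sum((x - mx) * (y - my) for x, y in zip(log_n, log_f))
    den = sum((x - mx) ** 2 for x in log_n)
    return num / den

random.seed(1)
white = [random.gauss(0, 1) for _ in range(2048)]
print(0.3 < dfa_alpha(white) < 0.7)  # True for uncorrelated noise
```

Applying the same estimator to the GoL centroid series is what lets the chapter place them between the ordered (alpha near 1.5) and random (alpha near 0.5) extremes.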

  8. SPICE: exploration and analysis of post-cytometric complex multivariate datasets.

    Science.gov (United States)

    Roederer, Mario; Nozzi, Joshua L; Nason, Martha C

    2011-02-01

    Polychromatic flow cytometry results in complex, multivariate datasets. To date, tools for the aggregate analysis of these datasets across multiple specimens grouped by different categorical variables, such as demographic information, have not been optimized. Often, the exploration of such datasets is accomplished by visualization of patterns with pie charts or bar charts, without easy access to statistical comparisons of measurements that comprise multiple components. Here we report on algorithms and a graphical interface we developed for these purposes. In particular, we discuss thresholding necessary for accurate representation of data in pie charts, the implications for display and comparison of normalized versus unnormalized data, and the effects of averaging when samples with significant background noise are present. Finally, we define a statistic for the nonparametric comparison of complex distributions to test for difference between groups of samples based on multi-component measurements. While originally developed to support the analysis of T cell functional profiles, these techniques are amenable to a broad range of datatypes. Published 2011 Wiley-Liss, Inc.
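
    The thresholding issue discussed above can be illustrated with a small sketch: components below a cutoff are merged into a single slice so that noise-dominated fractions do not distort the chart. The function name and the subset fractions are invented for illustration; SPICE itself is a GUI tool, not this code:

    ```python
    def threshold_pie(fractions, threshold=0.01):
        """Merge pie-chart components below `threshold` into one 'Other'
        slice; input fractions are assumed to sum to 1."""
        kept = {k: v for k, v in fractions.items() if v >= threshold}
        other = sum(v for v in fractions.values() if v < threshold)
        if other > 0:
            kept["Other"] = kept.get("Other", 0.0) + other
        return kept

    # Hypothetical T cell functional profile (fractions of total)
    profile = {"IFNg+": 0.42, "IL2+": 0.35, "TNF+": 0.20,
               "IFNg+IL2+": 0.02, "triple+": 0.01}
    slices = threshold_pie(profile, threshold=0.05)
    ```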

  9. Structural insights into the mycobacteria transcription initiation complex from analysis of X-ray crystal structures

    Energy Technology Data Exchange (ETDEWEB)

    Hubin, Elizabeth A.; Lilic, Mirjana; Darst, Seth A.; Campbell, Elizabeth A.

    2017-07-13

    The mycobacteria RNA polymerase (RNAP) is a target for antimicrobials against tuberculosis, motivating structure/function studies. Here we report a 3.2 Å-resolution crystal structure of a Mycobacterium smegmatis (Msm) open promoter complex (RPo), along with structural analysis of the Msm RPo and a previously reported 2.76 Å-resolution crystal structure of an Msm transcription initiation complex with a promoter DNA fragment. We observe the interaction of the Msm RNAP α-subunit C-terminal domain (αCTD) with DNA, and we provide evidence that the αCTD may play a role in Mtb transcription regulation. Our results reveal the structure of an Actinobacteria-unique insert of the RNAP β' subunit. Finally, our analysis reveals the disposition of the N-terminal segment of Msm σA, which may comprise an intrinsically disordered protein domain unique to mycobacteria. The clade-specific features of the mycobacteria RNAP provide clues to the profound instability of mycobacteria RPo compared with E. coli.

  10. Agent-based financial dynamics model from stochastic interacting epidemic system and complexity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Yunfan, E-mail: yunfanlu@yeah.net; Wang, Jun; Niu, Hongli

    2015-06-12

    An agent-based financial stock price model is developed and investigated via a stochastic interacting epidemic system, one of the statistical physics systems that has been used to model the spread of an epidemic or a forest fire. Numerical and statistical analyses are performed on the simulated returns of the proposed financial model. Complexity properties of the financial time series are explored by calculating the correlation dimension and applying the modified multiscale entropy method. To verify the rationality of the financial model, the real stock market indexes, the Shanghai Composite Index and the Shenzhen Component Index, are compared with the simulation data of the proposed model for different infectiousness parameters. The empirical research reveals that this financial model can reproduce some important features of the real stock markets. - Highlights: • A new agent-based financial price model is developed from a stochastic interacting epidemic system. • The structure of the proposed model allows the financial dynamics to be simulated. • Correlation dimension and MMSE are applied to the complexity analysis of financial time series. • Empirical results show the rationality of the proposed financial model.
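
    The general mechanism — an "infection" of trading sentiment spreading through an agent population and driving returns — can be sketched roughly as below. All parameter names and values are assumptions for illustration, and the sequential contact process here is a simplification; it is not the paper's statistical-physics model:

    ```python
    import random

    def epidemic_returns(n_agents=1000, steps=250, p_infect=0.05,
                         p_recover=0.1, impact=0.01, seed=7):
        """Toy epidemic-style price model: a trading mood (+1 buy, -1 sell)
        spreads through the population; each period's return is proportional
        to the net infected sentiment."""
        rng = random.Random(seed)
        sentiment = [0] * n_agents          # 0 = susceptible, +1/-1 = infected
        returns = []
        for _ in range(steps):
            n_inf = sum(1 for s in sentiment if s != 0)
            for i in range(n_agents):
                if sentiment[i] == 0:
                    # contact with a random agent may transmit its mood
                    j = rng.randrange(n_agents)
                    if sentiment[j] != 0 and rng.random() < p_infect:
                        sentiment[i] = sentiment[j]
                    elif n_inf == 0 and rng.random() < p_infect / 100:
                        sentiment[i] = rng.choice((-1, 1))  # spontaneous seed
                elif rng.random() < p_recover:
                    sentiment[i] = 0
            returns.append(impact * sum(sentiment) / n_agents)
        return returns

    r = epidemic_returns()
    ```

    The resulting `r` series is the object on which correlation-dimension and multiscale-entropy analyses would then be performed.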

  11. Tools for Accurate and Efficient Analysis of Complex Evolutionary Mechanisms in Microbial Genomes. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Nakhleh, Luay

    2014-03-12

    I proposed to develop computationally efficient tools for accurate detection and reconstruction of microbes' complex evolutionary mechanisms, thus enabling rapid and accurate annotation, analysis and understanding of their genomes. To achieve this goal, I proposed to address three aspects. (1) Mathematical modeling. A major challenge facing the accurate detection of HGT is that of distinguishing between these two events on the one hand and other events that have similar "effects." I proposed to develop a novel mathematical approach for distinguishing among these events. Further, I proposed to develop a set of novel optimization criteria for the evolutionary analysis of microbial genomes in the presence of these complex evolutionary events. (2) Algorithm design. In this aspect of the project, I proposed to develop an array of efficient and accurate algorithms for analyzing microbial genomes based on the formulated optimization criteria. Further, I proposed to test the viability of the criteria and the accuracy of the algorithms in an experimental setting using both synthetic as well as biological data. (3) Software development. I proposed the final outcome to be a suite of software tools which implements the mathematical models as well as the algorithms developed.

  12. Complex network analysis of resting-state fMRI of the brain.

    Science.gov (United States)

    Anwar, Abdul Rauf; Hashmy, Muhammad Yousaf; Imran, Bilal; Riaz, Muhammad Hussnain; Mehdi, Sabtain Muhammad Muntazir; Muthalib, Makii; Perrey, Stephane; Deuschl, Gunther; Groppa, Sergiu; Muthuraman, Muthuraman

    2016-08-01

    Because brain activity hardly ever ceases in healthy individuals, analysis of the resting state functionality of the brain seems pertinent. Various resting state networks are active inside the idle brain at any time. Based on various neuro-imaging studies, it is understood that structurally distant regions of the brain can be functionally connected. Regions of the brain that are functionally connected during rest constitute a resting state network. In the present study, we employed complex network measures to estimate the presence of community structures within a network; this estimate is termed modularity. Instead of using a traditional correlation matrix, we used a coherence matrix obtained from the causality measure between different nodes. Our results show that in prolonged resting state the modularity starts to decrease. This decrease was observed in all the resting state networks and on both sides of the brain. Our study highlights the use of a coherence matrix instead of a correlation matrix for complex network analysis.
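
    Newman's modularity, the community-structure estimate used in this kind of study, can be computed from any adjacency matrix. A minimal sketch on a toy 0/1 network (in the study the adjacency would instead come from thresholding the coherence matrix between fMRI nodes):

    ```python
    import numpy as np

    def modularity(adj, communities):
        """Newman modularity Q of an undirected, unweighted network.
        `adj` is a symmetric 0/1 matrix, `communities[i]` the label of node i."""
        adj = np.asarray(adj, dtype=float)
        k = adj.sum(axis=1)                 # node degrees
        two_m = adj.sum()                   # 2 * number of edges
        labels = np.asarray(communities)
        same = labels[:, None] == labels[None, :]
        return ((adj - np.outer(k, k) / two_m) * same).sum() / two_m

    # Two disconnected triangles: the matching partition gives Q = 0.5.
    A = np.zeros((6, 6))
    for a, b in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
        A[a, b] = A[b, a] = 1
    q = modularity(A, [0, 0, 0, 1, 1, 1])
    ```

    A decrease of Q over time, as reported in the abstract, would correspond to the community blocks of the coherence-derived adjacency becoming less pronounced.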

  13. Scale-free crystallization of two-dimensional complex plasmas: Domain analysis using Minkowski tensors

    Science.gov (United States)

    Böbel, A.; Knapek, C. A.; Räth, C.

    2018-05-01

    Experiments of the recrystallization processes in two-dimensional complex plasmas are analyzed to rigorously test a recently developed scale-free phase transition theory. The "fractal-domain-structure" (FDS) theory is based on the kinetic theory of Frenkel. It assumes the formation of homogeneous domains, separated by defect lines, during crystallization and a fractal relationship between domain area and boundary length. For the defect number fraction and system energy a scale-free power-law relation is predicted. The long-range scaling behavior of the bond-order correlation function shows clearly that the complex plasma phase transitions are not of the Kosterlitz, Thouless, Halperin, Nelson, and Young type. Previous preliminary results obtained by counting the number of dislocations and applying a bond-order metric for structural analysis are reproduced. These findings are supplemented by extending the use of the bond-order metric to measure the defect number fraction and furthermore applying state-of-the-art analysis methods, allowing a systematic testing of the FDS theory with unprecedented scrutiny: A morphological analysis of lattice structure is performed via Minkowski tensor methods. Minkowski tensors form a complete family of additive, motion covariant and continuous morphological measures that are sensitive to nonlinear properties. The FDS theory is rigorously confirmed and predictions of the theory are reproduced extremely well. The predicted scale-free power-law relation between defect fraction number and system energy is verified for one more order of magnitude at high energies compared to the inherently discontinuous bond-order metric. It is found that the fractal relation between crystalline domain area and circumference is independent of the experiment, the particular Minkowski tensor method, and the particular choice of parameters. Thus, the fractal relationship seems to be inherent to two-dimensional phase transitions in complex plasmas. 
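
    The scale-free power-law relations tested in this kind of analysis are typically estimated as a slope in log-log coordinates. A minimal sketch on synthetic data (the exponent and variable names are illustrative, not the experimental values):

    ```python
    import numpy as np

    def loglog_slope(x, y):
        """Estimate the exponent of an assumed power law y ~ x**alpha
        by least squares in log-log coordinates."""
        slope, _ = np.polyfit(np.log(x), np.log(y), 1)
        return slope

    # Synthetic check: a defect fraction falling as energy**-2
    energy = np.linspace(1.0, 100.0, 50)
    defects = 3.0 * energy ** -2.0
    alpha = loglog_slope(energy, defects)
    ```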

  14. Hyperplane distance neighbor clustering based on local discriminant analysis for complex chemical processes monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chunhong; Xiao, Shaoqing; Gu, Xiaofeng [Jiangnan University, Wuxi (China)

    2014-11-15

    The collected training data often include both normal and faulty samples for complex chemical processes. However, some monitoring methods, such as partial least squares (PLS), principal component analysis (PCA), independent component analysis (ICA) and Fisher discriminant analysis (FDA), require fault-free data to build the normal operation model. These techniques are applicable after the preliminary step of data clustering is applied. We here propose a novel hyperplane distance neighbor clustering (HDNC) based on the local discriminant analysis (LDA) for chemical process monitoring. First, faulty samples are separated from normal ones using the HDNC method. Then, the optimal subspace for fault detection and classification can be obtained using the LDA approach. The proposed method takes the multimodality within the faulty data into account, and thus improves the capability of process monitoring significantly. The HDNC-LDA monitoring approach is applied to two simulation processes and then compared with the conventional FDA based on the K-nearest neighbor (KNN-FDA) method. The results obtained in two different scenarios demonstrate the superiority of the HDNC-LDA approach in terms of fault detection and classification accuracy.
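
    The discriminant-analysis step at the core of such approaches is the classical Fisher discriminant. A two-class sketch on simulated data (the HDNC clustering stage and the multi-class extension are omitted; all data and parameters here are invented):

    ```python
    import numpy as np

    def fisher_direction(X0, X1):
        """Fisher discriminant direction w ~ Sw^-1 (m1 - m0) separating
        two classes of samples (rows = observations)."""
        m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
        Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
        w = np.linalg.solve(Sw, m1 - m0)
        return w / np.linalg.norm(w)

    rng = np.random.default_rng(0)
    normal = rng.normal(loc=[0, 0], scale=0.5, size=(100, 2))   # fault-free
    faulty = rng.normal(loc=[4, 4], scale=0.5, size=(100, 2))   # one fault mode
    w = fisher_direction(normal, faulty)
    scores_n = normal @ w        # projections onto the discriminant axis
    scores_f = faulty @ w
    ```

    In a monitoring setting, a threshold on these projection scores would separate fault-free operation from the fault mode.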

  15. Complex network analysis of conventional and Islamic stock market in Indonesia

    Science.gov (United States)

    Rahmadhani, Andri; Purqon, Acep; Kim, Sehyun; Kim, Soo Yong

    2015-09-01

    The rising popularity of Islamic financial products in Indonesia has recently become an interesting new topic of analysis. We introduce a complex network analysis to compare the conventional and Islamic stock markets in Indonesia, with Random Matrix Theory (RMT) added as a reference to expand the analysis of the results. Both approaches are based on the cross correlation matrix of logarithmic price returns. Closing price data taken from June 2011 to July 2012 are used to construct the logarithmic price returns. We also introduce a threshold value using a winner-take-all approach to obtain the scale-free property of the network: nodes whose cross correlation coefficient falls below the threshold are not connected by an edge. As a result, we obtain 0.5 as the threshold value for both stock markets. From the RMT analysis, we find only a market-wide effect in both stock markets and no clustering effect. From the network analysis, both stock market networks are dominated by the mining sector. The time series of closing price data must be extended to obtain more valuable results and possibly to reveal different behaviors of the system.
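
    The winner-take-all construction — linking stocks only when the cross correlation of their returns exceeds the threshold — can be sketched as follows. The 0.5 threshold matches the abstract; the synthetic returns with a common "market mode" and everything else are illustrative:

    ```python
    import numpy as np

    def threshold_network(returns, threshold=0.5):
        """Adjacency matrix from the cross correlation of returns:
        stocks are linked only when |corr| exceeds the threshold
        (winner-take-all), leaving a sparse network core."""
        corr = np.corrcoef(returns)            # rows = stocks
        adj = (np.abs(corr) > threshold).astype(int)
        np.fill_diagonal(adj, 0)               # no self-loops
        return adj

    rng = np.random.default_rng(42)
    market = rng.normal(size=250)              # common market-wide mode
    stocks = np.array([market + 0.5 * rng.normal(size=250) for _ in range(10)])
    A = threshold_network(stocks, threshold=0.5)
    degrees = A.sum(axis=1)
    ```

    Because every synthetic stock shares the market mode, all pairwise correlations sit well above 0.5 here, mirroring the "market-wide effect only" finding of the RMT analysis.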

  16. Hyperplane distance neighbor clustering based on local discriminant analysis for complex chemical processes monitoring

    International Nuclear Information System (INIS)

    Lu, Chunhong; Xiao, Shaoqing; Gu, Xiaofeng

    2014-01-01

    The collected training data often include both normal and faulty samples for complex chemical processes. However, some monitoring methods, such as partial least squares (PLS), principal component analysis (PCA), independent component analysis (ICA) and Fisher discriminant analysis (FDA), require fault-free data to build the normal operation model. These techniques are applicable after the preliminary step of data clustering is applied. We here propose a novel hyperplane distance neighbor clustering (HDNC) based on the local discriminant analysis (LDA) for chemical process monitoring. First, faulty samples are separated from normal ones using the HDNC method. Then, the optimal subspace for fault detection and classification can be obtained using the LDA approach. The proposed method takes the multimodality within the faulty data into account, and thus improves the capability of process monitoring significantly. The HDNC-LDA monitoring approach is applied to two simulation processes and then compared with the conventional FDA based on the K-nearest neighbor (KNN-FDA) method. The results obtained in two different scenarios demonstrate the superiority of the HDNC-LDA approach in terms of fault detection and classification accuracy.

  17. The complexity of the Rhipicephalus (Boophilus) microplus genome characterised through detailed analysis of two BAC clones

    Directory of Open Access Journals (Sweden)

    Valle Manuel

    2011-07-01

    Full Text Available Abstract Background Rhipicephalus (Boophilus) microplus (Rmi), a major cattle ectoparasite and tick-borne disease vector, impacts on animal welfare and industry productivity. In arthropod research there is an absence of a complete Chelicerate genome, which includes ticks, mites, spiders, scorpions and crustaceans. Model arthropod genomes such as Drosophila and Anopheles are too taxonomically distant to serve as a reference in tick genomic sequence analysis. This study focuses on the de novo assembly of two R. microplus BAC sequences from the understudied R. microplus genome. Based on available R. microplus sequence resources and comparative analysis, tick genomic structure and functional predictions identify complex gene structures and genomic targets expressed during the tick-cattle interaction. Results In our BAC analyses we have assembled, using the correct positioning of BAC end sequences and transcript sequences, two challenging genomic regions. Cot DNA fractions compared to the BAC sequences confirmed a highly repetitive BAC sequence, BM-012-E08, and a low-repetitive, gene-rich BAC sequence, BM-005-G14, which contained short interspersed elements (SINEs). Based directly on the BAC and Cot data comparisons, the genome-wide frequency of the SINE Ruka element was estimated. Using a conservative approach to the assembly of the highly repetitive BM-012-E08, the sequence was de-convoluted into three repeat units, each unit containing an 18S, 5.8S and 28S ribosomal RNA (rRNA) encoding gene sequence (rDNA), a related internal transcribed spacer and a complex intergenic region. In the low-repetitive BM-005-G14, a novel gene complex was found between two genes on the same strand: nested in the second intron of a large 9 kb papilin gene was a helicase gene. This helicase overlapped in two exonic regions with the papilin. Both genes were shown to be expressed in different tick life stages important in ectoparasite interaction with the host. Tick specific sequence

  18. Analysis of co-occurrence toponyms in web pages based on complex networks

    Science.gov (United States)

    Zhong, Xiang; Liu, Jiajun; Gao, Yong; Wu, Lun

    2017-01-01

    A large number of geographical toponyms exist in web pages and other documents, providing abundant geographical resources for GIS. It is very common for toponyms to co-occur in the same documents. To investigate these relations associated with geographic entities, a novel complex network model for co-occurrence toponyms is proposed. Then, 12 toponym co-occurrence networks are constructed from the toponym sets extracted from the People's Daily Paper documents of 2010. It is found that two toponyms have a high co-occurrence probability if they are at the same administrative level or if they possess a part-whole relationship. By applying complex network analysis methods to toponym co-occurrence networks, we find the following characteristics. (1) The navigation vertices of the co-occurrence networks can be found by degree centrality analysis. (2) The networks exhibit strong clustering, and it takes only a few steps to reach one vertex from another, implying that the networks are small-world graphs. (3) The degree distribution satisfies a power law with an exponent of 1.7, so the networks are scale-free. (4) The networks are disassortative and have similar assortative modes, with assortative exponents of approximately 0.18 and assortative indexes less than 0. (5) The frequency of toponym co-occurrence is weakly negatively correlated with geographic distance, but more strongly negatively correlated with administrative hierarchical distance. Considering the toponym frequencies and co-occurrence relationships, a novel method based on link analysis is presented to extract the core toponyms from web pages. This method is suitable and effective for geographical information retrieval.
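
    The abstract does not spell out its link-analysis formula; a standard choice for ranking core vertices in such a network is PageRank, sketched here by power iteration on a toy co-occurrence network (the network and node labels are invented):

    ```python
    import numpy as np

    def pagerank(adj, damping=0.85, iters=100):
        """Power-iteration PageRank on an undirected co-occurrence network."""
        adj = np.asarray(adj, dtype=float)
        n = adj.shape[0]
        out = adj.sum(axis=1)
        out[out == 0] = 1.0                  # guard isolated nodes
        transition = adj / out[:, None]      # row-stochastic walk matrix
        rank = np.full(n, 1.0 / n)
        for _ in range(iters):
            rank = (1 - damping) / n + damping * transition.T @ rank
        return rank

    # Toy network: node 0 (say, a capital city) co-occurs with every other toponym.
    A = np.zeros((5, 5))
    for j in range(1, 5):
        A[0, j] = A[j, 0] = 1
    r = pagerank(A)
    ```

    The hub node receives the highest rank, which is the behavior a core-toponym extractor would exploit; edge weights could be set to co-occurrence frequencies in the same way.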

  19. Navigating the complexities of qualitative comparative analysis: case numbers, necessity relations, and model ambiguities.

    Science.gov (United States)

    Thiem, Alrik

    2014-12-01

    In recent years, the method of Qualitative Comparative Analysis (QCA) has been enjoying increasing levels of popularity in evaluation and directly neighboring fields. Its holistic approach to causal data analysis resonates with researchers whose theories posit complex conjunctions of conditions and events. However, due to QCA's relative immaturity, some of its technicalities and objectives have not yet been well understood. In this article, I seek to raise awareness of six pitfalls of employing QCA with regard to the following three central aspects: case numbers, necessity relations, and model ambiguities. Most importantly, I argue that case numbers are irrelevant to the methodological choice of QCA or any of its variants, that necessity is not as simple a concept as it has been suggested by many methodologists, and that doubt must be cast on the determinacy of virtually all results presented in past QCA research. By means of empirical examples from published articles, I explain the background of these pitfalls and introduce appropriate procedures, partly with reference to current software, that help avoid them. QCA carries great potential for scholars in evaluation and directly neighboring areas interested in the analysis of complex dependencies in configurational data. If users beware of the pitfalls introduced in this article, and if they avoid mechanistic adherence to doubtful "standards of good practice" at this stage of development, then research with QCA will gain in quality, as a result of which a more solid foundation for cumulative knowledge generation and well-informed policy decisions will also be created. © The Author(s) 2014.

  20. Advances in complex analysis and operator theory festschrift in honor of Daniel Alpay’s 60th birthday

    CERN Document Server

    Sabadini, Irene; Struppa, Daniele; Vajiac, Mihaela

    2017-01-01

    This book gathers contributions written by Daniel Alpay’s friends and collaborators. Several of the papers were presented at the International Conference on Complex Analysis and Operator Theory held in honor of Professor Alpay’s 60th birthday at Chapman University in November 2016. The main topics covered are complex analysis, operator theory and other areas of mathematics close to Alpay’s primary research interests. The book is recommended for mathematicians from the graduate level on, working in various areas of mathematical analysis, operator theory, infinite dimensional analysis, linear systems, and stochastic processes.

  1. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One problem with this approach is that it treats the model as a "black box" and focuses on explaining model behavior by analyzing the input-output relationship. Since these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes.
Once the processes that exert the major influence in
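
    A one-at-a-time version of process-level sensitivity can be sketched as below: each process is scaled up and down by a small factor and the normalized rate of change of the output is recorded. The toy model and the normalization are illustrative assumptions, not the authors' method:

    ```python
    def process_sensitivity(model, base, delta=0.05):
        """One-at-a-time sensitivity at the process level: scale each
        process multiplier by (1 +/- delta) and report the normalized
        central-difference rate of change of the model output."""
        y0 = model(base)
        sens = {}
        for name in base:
            up = dict(base, **{name: base[name] * (1 + delta)})
            dn = dict(base, **{name: base[name] * (1 - delta)})
            sens[name] = (model(up) - model(dn)) / (2 * delta * y0)
        return sens

    # Toy ecosystem model: output dominated by the 'growth' process.
    def model(p):
        return p["growth"] ** 2 * p["mortality"] ** 0.1

    s = process_sensitivity(model, {"growth": 1.0, "mortality": 1.0})
    ```

    For this power-law toy model the normalized sensitivities recover the exponents (about 2 for growth, 0.1 for mortality), identifying which process exerts the major influence.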

  2. An economic parametric analysis of the synthetic fuel produced by a fusion-fission complex

    International Nuclear Information System (INIS)

    Tai, A.S.; Krakowski, R.A.

    1980-01-01

    A simple analytic model is used to examine the economic constraints of a fusion-fission complex in which a portion of the thermal energy is used for producing synthetic fuel (synfuel). Since the values of many quantities are not well known, a parametric analysis has been carried out to test the sensitivity of the synfuel production cost to crucial economic and technological quantities (investment costs of the hybrid and synfuel plants, energy multiplication of the fission blanket, recirculating power fraction of the fusion driver, etc.). In addition, a minimum synfuel selling price has been evaluated, above which the fusion-fission-synfuel complex brings a higher economic benefit than a fusion-fission hybrid devoted entirely to fissile-fuel and electricity generation. This paper describes the energy flow diagram of the fusion-fission synfuel concept and presents the revenue-to-cost formulation and the breakeven synfuel selling price. The synfuel production cost given by the model is evaluated within a range of values of the crucial parameters. Assuming an electricity cost of 2.7 cents/kWh and an annual investment cost per energy unit of 4.2 to 6 $/GJ for the fusion-fission complex and 1.5 to 3 $/GJ for the synfuel plant, the synfuel production cost lies between 6.5 and 8.5 $/GJ. These production costs can compete with those evaluated for other processes. The study points out a potential use of the fusion-fission hybrid reactor for other than fissile-fuel and electricity generation. (orig.) [de]
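
    The unit conversions behind such cost comparisons are easy to check; for instance, the assumed electricity cost of 2.7 cents/kWh corresponds to about 7.5 $/GJ. A small sketch (the levelized-cost function below is a generic accounting identity with made-up inputs, not the paper's revenue-to-cost formulation):

    ```python
    KWH_PER_GJ = 1e9 / 3.6e6   # 1 kWh = 3.6 MJ, so ~277.8 kWh per GJ

    def cents_per_kwh_to_usd_per_gj(cents):
        """Convert an electricity price in cents/kWh to $/GJ."""
        return cents / 100.0 * KWH_PER_GJ

    elec = cents_per_kwh_to_usd_per_gj(2.7)   # ~7.5 $/GJ

    def production_cost(annual_capital, annual_output_gj, feed_cost_per_gj):
        """Generic levelized cost: annualized capital spread over annual
        output, plus per-GJ feed (heat/electricity) cost."""
        return annual_capital / annual_output_gj + feed_cost_per_gj
    ```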

  3. Analysis of gene expression profile microarray data in complex regional pain syndrome.

    Science.gov (United States)

    Tan, Wulin; Song, Yiyan; Mo, Chengqiang; Jiang, Shuangjian; Wang, Zhongxing

    2017-09-01

    The aim of the present study was to predict key genes and proteins associated with complex regional pain syndrome (CRPS) using bioinformatics analysis. The gene expression profiling microarray data, GSE47603, which included peripheral blood samples from 4 patients with CRPS and 5 healthy controls, was obtained from the Gene Expression Omnibus (GEO) database. The differentially expressed genes (DEGs) in CRPS patients compared with healthy controls were identified using the GEO2R online tool. Functional enrichment analysis was then performed using The Database for Annotation Visualization and Integrated Discovery online tool. Protein‑protein interaction (PPI) network analysis was subsequently performed using Search Tool for the Retrieval of Interaction Genes database and analyzed with Cytoscape software. A total of 257 DEGs were identified, including 243 upregulated genes and 14 downregulated ones. Genes in the human leukocyte antigen (HLA) family were most significantly differentially expressed. Enrichment analysis demonstrated that signaling pathways, including immune response, cell motion, adhesion and angiogenesis were associated with CRPS. PPI network analysis revealed that key genes, including early region 1A binding protein p300 (EP300), CREB‑binding protein (CREBBP), signal transducer and activator of transcription (STAT)3, STAT5A and integrin α M were associated with CRPS. The results suggest that the immune response may therefore serve an important role in CRPS development. In addition, genes in the HLA family, such as HLA‑DQB1 and HLA‑DRB1, may present potential biomarkers for the diagnosis of CRPS. Furthermore, EP300, its paralog CREBBP, and the STAT family genes, STAT3 and STAT5 may be important in the development of CRPS.

  4. Correlation analysis of motor current and chatter vibration in grinding using complex continuous wavelet coherence

    International Nuclear Information System (INIS)

    Liu, Yao; Wang, Xiufeng; Lin, Jing; Zhao, Wei

    2016-01-01

    Motor current is an emerging and popular signal that can be used to detect machining chatter, with multiple advantages. To achieve accurate and reliable chatter detection using motor current, it is important to clarify the quantitative relationship between motor current and chatter vibration, which has not yet been studied in depth. In this study, complex continuous wavelet coherence, comprising the cross wavelet transform and wavelet coherence, is applied to the correlation analysis of motor current and chatter vibration in grinding. Experimental results show that complex continuous wavelet coherence performs very well in demonstrating and quantifying the intense correlation between these two signals in frequency, amplitude and phase. When chatter occurs, clear correlations in frequency and amplitude appear in the chatter frequency band, and the phase difference of the current signal relative to the vibration signal turns from random to stable, with the phase lead of the most correlated chatter frequency being the largest. As chatter develops further, the correlation grows in intensity and expands to higher-order chatter frequency bands. The results confirm that there is a consistent correlation between motor current and vibration signals in the grinding chatter process. However, to achieve accurate and reliable chatter detection using motor current, the frequency response bandwidth of the current loop of the feed drive system must be wide enough to respond to chatter effectively. (paper)
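
    Ordinary magnitude-squared coherence already captures the frequency-domain part of this correlation (wavelet coherence additionally resolves it in time and phase). A self-contained sketch on synthetic current and vibration signals sharing an assumed 50 Hz chatter component; all signal parameters are invented:

    ```python
    import numpy as np

    def magnitude_squared_coherence(x, y, nperseg, fs):
        """Welch-averaged magnitude-squared coherence |Pxy|^2 / (Pxx * Pyy)."""
        w = np.hanning(nperseg)
        Pxx = np.zeros(nperseg // 2 + 1)
        Pyy = np.zeros(nperseg // 2 + 1)
        Pxy = np.zeros(nperseg // 2 + 1, dtype=complex)
        for k in range(len(x) // nperseg):         # average over segments
            seg = slice(k * nperseg, (k + 1) * nperseg)
            X = np.fft.rfft(w * x[seg])
            Y = np.fft.rfft(w * y[seg])
            Pxx += np.abs(X) ** 2
            Pyy += np.abs(Y) ** 2
            Pxy += X * np.conj(Y)
        return np.fft.rfftfreq(nperseg, 1 / fs), np.abs(Pxy) ** 2 / (Pxx * Pyy)

    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    chatter = np.sin(2 * np.pi * 50 * t)        # assumed chatter frequency: 50 Hz
    current = chatter + rng.normal(scale=1.0, size=t.size)
    vibration = np.sin(2 * np.pi * 50 * t + 0.8) + rng.normal(scale=1.0, size=t.size)

    f, Cxy = magnitude_squared_coherence(current, vibration, nperseg=1000, fs=fs)
    peak_freq = f[np.argmax(Cxy)]
    ```

    Coherence approaches 1 only at the shared 50 Hz component and stays near the noise floor elsewhere; the phase of the averaged cross-spectrum `Pxy` would give the stable phase lead the abstract describes.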

  5. Pan-Cancer Mutational and Transcriptional Analysis of the Integrator Complex

    Directory of Open Access Journals (Sweden)

    Antonio Federico

    2017-04-01

    Full Text Available The Integrator complex has recently been identified as a key regulator of RNA Polymerase II-mediated transcription, with many functions including the processing of small nuclear RNAs, the pause-release and elongation of the polymerase during the transcription of protein-coding genes, and the biogenesis of enhancer-derived transcripts. Moreover, some of its components also play a role in genome maintenance. Thus, it is reasonable to hypothesize that their functional impairment or altered expression can contribute to malignancies. Indeed, several studies have described mutations or transcriptional alteration of some Integrator genes in different cancers. Here, to draw a comprehensive pan-cancer picture of the genomic and transcriptomic alterations of the members of the complex, we reanalyzed public data from The Cancer Genome Atlas. Somatic mutations affecting Integrator subunit genes and their transcriptional profiles have been investigated in about 11,000 patients and 31 tumor types. A general heterogeneity in the mutation frequencies was observed, mostly depending on tumor type. Although we could not establish them as cancer drivers, the INTS7 and INTS8 genes were highly mutated in specific cancers. A transcriptome analysis of paired (normal and tumor) samples revealed that the transcription of INTS7, INTS8, and INTS13 is significantly altered in several cancers. Experimental validation performed on primary tumors confirmed these findings.

  6. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-01-01

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from those originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows various dynamical regions of the jet flow to be distinguished, identified, and explored in detail, and associated with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
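
    The natural visibility algorithm used for the network construction links two samples when the straight line between them clears all intermediate samples. A compact O(n^2) sketch:

    ```python
    def visibility_edges(series):
        """Natural visibility graph: nodes a < b are linked if every
        intermediate sample lies strictly below the line joining them."""
        n = len(series)
        edges = set()
        for a in range(n):
            for b in range(a + 1, n):
                if all(series[k] < series[a] + (series[b] - series[a])
                       * (k - a) / (b - a) for k in range(a + 1, b)):
                    edges.add((a, b))
        return edges

    # Collinear points block long-range visibility:
    E = visibility_edges([1.0, 2.0, 3.0])   # only the two adjacent links
    ```

    Degree distribution, path lengths and clustering are then computed on the resulting graph, exactly as for the phase-space-based construction.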

  7. An unsymmetrical porphyrin and its metal complexes: synthesis, spectroscopy, thermal analysis and liquid crystal properties

    Directory of Open Access Journals (Sweden)

    CHANGFU ZHUANG

    2009-09-01

    Full Text Available The synthesis and characterization of a new unsymmetrical porphyrin liquid crystal, 5-(4-stearoyloxyphenyl)-10,15,20-triphenylporphyrin (SPTPPH2), and its transition metal complexes (SPTPPM, M(II) = Zn, Fe, Co, Ni, Cu or Mn) are reported. Their structure and properties were studied by elemental analysis and by UV-Vis, IR, mass and 1H-NMR spectroscopy. Their luminescent properties were studied by excitation and emission spectroscopy, and the quantum yields of the S1 → S0 fluorescence were measured at room temperature. According to thermal studies, the complexes have a high thermal stability (no decomposition until 200 °C). Differential scanning calorimetry (DSC) data and an optical textural photograph, obtained using a polarizing microscope (POM), indicate that the porphyrin ligand has liquid crystalline character and that it exhibits more than one mesophase and a low-lying phase transition temperature, with transition temperatures of 19.3 and 79.4 °C; the temperature range of the liquid crystal (LC) phase of the ligand was 70.1 °C.

  8. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    Directory of Open Access Journals (Sweden)

    Dong-Sup Lee

    2015-01-01

    Full Text Available Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding the statistical independence of signal mixtures, and it has been successfully applied to myriad fields such as medical science and image processing. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on conventional ICA is proposed to mitigate these problems. The proposed method extracts more stable source signals in a valid order by iteratively reordering the extracted mixing matrix until the reconstructed source signals converge, guided by the magnitudes of the correlation coefficients between the intermediate separated signals and signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to a real problem involving a complex structure, an experiment has been carried out on a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of a conventional ICA technique.
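The reordering idea described above, matching each separated component to a reference measurement via correlation magnitude, can be sketched independently of any particular ICA implementation. A minimal pure-Python illustration (the helper names are hypothetical, not the authors' code):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def reorder_by_reference(separated, references):
    """Resolve the ICA permutation ambiguity: greedily assign to each
    reference signal the separated signal with the largest |correlation|."""
    remaining = list(range(len(separated)))
    order = []
    for ref in references:
        best = max(remaining, key=lambda i: abs(pearson(separated[i], ref)))
        order.append(best)
        remaining.remove(best)
    return [separated[i] for i in order]
```

In the paper's scheme the references would be signals measured on or near the sources, and the reordering would be repeated at each iteration until the reconstructed sources converge.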

  9. Sedimentation Velocity Analysis of Large Oligomeric Chromatin Complexes Using Interference Detection.

    Science.gov (United States)

    Rogge, Ryan A; Hansen, Jeffrey C

    2015-01-01

    Sedimentation velocity experiments measure the transport of molecules in solution under centrifugal force. Here, we describe a method for monitoring the sedimentation of very large biological molecular assemblies using the interference optical systems of the analytical ultracentrifuge. The mass, partial specific volume, and shape of macromolecules in solution affect their sedimentation rates as reflected in the sedimentation coefficient. The sedimentation coefficient is obtained by measuring the solute concentration as a function of radial distance during centrifugation. Monitoring the concentration can be accomplished using interference optics, absorbance optics, or the fluorescence detection system, each with inherent advantages. The interference optical system captures data much faster than these other optical systems, allowing for sedimentation velocity analysis of extremely large macromolecular complexes that sediment rapidly at very low rotor speeds. Supramolecular oligomeric complexes produced by self-association of 12-mer chromatin fibers are used to illustrate the advantages of the interference optics. Using interference optics, we show that chromatin fibers self-associate at physiological divalent salt concentrations to form structures that sediment between 10,000 and 350,000 S. The method for characterizing chromatin oligomers described in this chapter will be generally useful for characterization of any biological structures that are too large to be studied by the absorbance optical system. © 2015 Elsevier Inc. All rights reserved.
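The quantity at the center of the abstract, the sedimentation coefficient, follows from the boundary position via s = (1/ω²)·d(ln r)/dt. A minimal sketch of its estimation from tracked radial boundary positions (illustrative only, not the chapter's protocol):

```python
from math import log, pi

def sedimentation_coefficient(times_s, radii_cm, rpm):
    """Least-squares slope of ln(r) versus t, divided by omega^2, gives
    s in seconds (multiply by 1e13 to express it in Svedbergs)."""
    omega = 2.0 * pi * rpm / 60.0          # rotor speed in rad/s
    n = len(times_s)
    x = times_s
    y = [log(r) for r in radii_cm]
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope / omega ** 2
```

For rapidly sedimenting chromatin oligomers, the low rotor speeds mentioned in the abstract keep the boundary within the cell long enough for this regression to be well determined.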

  10. Exploring the Anti-Burkholderia cepacia Complex Activity of Essential Oils: A Preliminary Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Maida

    2014-01-01

    Full Text Available In this work we have tested the ability of the essential oils extracted from six different medicinal plants (Eugenia caryophyllata, Origanum vulgare, Rosmarinus officinalis, Lavandula officinalis, Melaleuca alternifolia, and Thymus vulgaris) to inhibit the growth of 18 bacterial type strains belonging to the 18 known species of the Burkholderia cepacia complex (Bcc). These bacteria are opportunistic human pathogens that can cause severe infection in immunocompromised patients, especially those affected by cystic fibrosis (CF), and are often resistant to multiple antibiotics. The analysis of the aromatograms produced by the six oils revealed that, in spite of their different chemical composition, all of them were able to inhibit the growth of Bcc members. However, three of them (i.e., Eugenia caryophyllata, Origanum vulgare, and Thymus vulgaris) were particularly active against the Bcc strains, including those exhibiting a high degree of resistance to ciprofloxacin, one of the antibiotics most frequently used to treat Bcc infections. These three oils are also active toward both environmental and clinical strains (isolated from CF patients), suggesting that they might be used in the future to fight B. cepacia complex infections.

  11. Monitoring and Analysis of Environmental Gamma Dose Rate around Serpong Nuclear Complex

    Directory of Open Access Journals (Sweden)

    I.P. Susila

    2017-08-01

    Full Text Available An environmental radiation monitoring system that continuously measures the gamma dose rate around nuclear facilities is an important tool for presenting dose rate information to the public and authorities for radiological protection, during both normal operation and radiological accidents. We have developed such a system, consisting of six GM-based devices, for monitoring the environmental dose rate around the Serpong Nuclear Complex; it has operated since 2010. In this study, a description of the system and an analysis of the measured data are presented. Analysis of the data for the last five years shows that the average dose rate levels were between 84 and 99 nSv/h, which is still lower than the terrestrial gamma radiation levels at several other locations in Indonesia. Time series analysis of the monitoring data demonstrates a good agreement between increases in the environmental gamma dose rate and the presence of iodine and argon in the air as determined by in situ measurement. This result indicates that the system is also effective as an early warning system in the case of a radiological emergency.
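An early-warning rule of the kind such a monitoring network needs can be as simple as flagging readings that exceed a trailing baseline by several standard deviations. A sketch of that thresholding idea (illustrative only, not the deployed system's logic; window size and multiplier are assumed values):

```python
from statistics import mean, stdev

def alarm_points(series, window=24, k=3.0):
    """Return indices of readings exceeding the trailing-window mean
    by more than k sample standard deviations."""
    flags = []
    for i in range(window, len(series)):
        base = series[i - window:i]
        m, s = mean(base), stdev(base)
        if s > 0 and series[i] > m + k * s:
            flags.append(i)
    return flags
```

With hourly dose-rate readings in nSv/h, a window of 24 corresponds to a one-day baseline; normal fluctuation around 84-99 nSv/h passes, while a release-driven spike is flagged immediately.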

  12. From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations

    Science.gov (United States)

    Chun, K.; Kemeny, J.

    2017-12-01

    Light detection and ranging (LIDAR) is a prevalent remote-sensing technology in the geological fields due to its high precision and ease of use. One major application is to use the detailed geometrical information of underground structures as a basis for generating three-dimensional numerical models for FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach that integrates LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects with complex surface geometry into finite element models. This methodology has been applied to the Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the proposed LIDAR-based workflow is effective and provides a reference for similar engineering projects in practice.

  13. Using Enterprise Architecture for Analysis of a Complex Adaptive Organization's Risk Inducing Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Salguero, Laura Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Huff, Johnathon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Matta, Anthony R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Collins, Sue S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    Sandia National Laboratories is an organization with a wide range of research and development activities that include nuclear, explosives, and chemical hazards. In addition, Sandia has over 2000 labs and over 40 major test facilities, such as the Thermal Test Complex, the Lightning Test Facility, and the Rocket Sled Track. In order to support safe operations, Sandia has a diverse Environment, Safety, and Health (ES&H) organization that provides expertise to support engineers and scientists in performing work safely. With such a diverse organization to support, the ES&H program continuously seeks opportunities to improve the services provided for Sandia by using various methods as part of their risk management strategy. One of the methods being investigated is using enterprise architecture analysis to mitigate risk inducing characteristics such as normalization of deviance, organizational drift, and problems in information flow. This paper is a case study for how a Department of Defense Architecture Framework (DoDAF) model of the ES&H enterprise, including information technology applications, can be analyzed to understand the level of risk associated with the risk inducing characteristics discussed above. While the analysis is not complete, we provide proposed analysis methods that will be used for future research as the project progresses.

  14. Food supplement complexes for pregnant women: composition analysis and selection features at community pharmacies

    OpenAIRE

    Tamelytė, Raimonda

    2016-01-01

    The compositions of food supplement complexes for pregnant women and for adults were compared, and a questionnaire survey was carried out. The composition of the food supplement complexes for pregnant women includes all the major vitamins and minerals necessary during pregnancy. Based on the assessment of the active substances in their composition, the food supplement complexes for pregnant women are more suitable for use during pregnancy than the food supplement complexes for adults. In pharmacy, selecting food supplement complexes for preg...

  15. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus, allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even for systems with cyclic dependencies. Despite the virtues of the current tool, improvements and extensions still remain, which are the focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models; both depend on the possibility of accounting for propagation delay. Since the current version

  16. Analysis and computer simulation for transient flow in complex system of liquid piping

    International Nuclear Information System (INIS)

    Mitry, A.M.

    1985-01-01

    This paper is concerned with unsteady state analysis and the development of a digital computer program, FLUTRAN, that simulates transient flow behavior in a complex system of liquid piping. The program calculates pressure and flow transients in the liquid-filled piping system. The analytical model is based on the method-of-characteristics solution to the fluid hammer continuity and momentum equations. The equations are subject to a wide variety of boundary conditions to take into account the effect of hydraulic devices. Water column separation is treated as a boundary condition with known head. Experimental tests are presented that exhibit transients induced by pump failure and valve closure in the McGuire Nuclear Station Low Level Intake Cooling Water System. Numerical simulation is conducted to compare theory with test data. Analytical and test data are shown to be in good agreement, which provides validation of the model.
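For an interior node and frictionless flow, the method-of-characteristics core reduces to solving the C+ and C- compatibility equations simultaneously. A schematic single-node update (a sketch under simplifying assumptions, not FLUTRAN itself):

```python
def moc_interior_node(H_A, Q_A, H_B, Q_B, B):
    """One method-of-characteristics time step at an interior node.
    C+ (from upstream node A):   H_P = H_A - B * (Q_P - Q_A)
    C- (from downstream node B): H_P = H_B + B * (Q_P - Q_B)
    B = a / (g * A_pipe) is the pipeline characteristic impedance
    (wave speed a, pipe area A_pipe); friction is omitted for clarity."""
    Q_P = 0.5 * ((Q_A + Q_B) + (H_A - H_B) / B)
    H_P = 0.5 * ((H_A + H_B) + B * (Q_A - Q_B))
    return H_P, Q_P
```

Setting the downstream flow to zero, as at an instantaneously closed valve, makes the computed head rise, which is the water-hammer surge the program tracks through the network.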

  17. Feminisms in the north borderland of Mexico. An analysis from the intersectionality and the complex identities

    Directory of Open Access Journals (Sweden)

    Janet Gabriela García Alcaraz

    2018-01-01

    Full Text Available The starting point of this study is the idea that women are agents of change who act in different spaces of resistance and who voice demands for the improvement of gender conditions and against the precariousness of life. Through the analysis of the narratives of feminists who reside and are active in the state of Baja California, we discuss the perspective of intersectionality and its relation to the approach of complex identities. An analytical scheme for the narratives is designed, considering generational and ethnic differences as well as social position. It is found that the condition of gender oppression is a shared element, but not a unifying one; the configuration of a feminist intersubjectivity is also identified.

  18. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    International Nuclear Information System (INIS)

    Cuerva, A.

    1997-01-01

    There are several groups working on the improvement of the existing Standard and recommendations on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 (Ref. [9]), to current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones regard the inherent characteristics of complex terrain, specifically the issue of site calibration and the uncertainties due to it. (Author)

  19. A digital processing method for the analysis of complex nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Abani, M.C.; Bairi, B.R.

    1994-01-01

    This paper describes a digital processing method using frequency power spectra for the analysis of complex nuclear spectra. The power spectra were estimated by employing a modified discrete Fourier transform. The method was applied to observed spectral envelopes. The results for separating closely spaced doublets in nuclear spectra of low statistical precision compared favorably with those obtained using the popular peak-fitting program SAMPO. The paper also describes the limitations of peak-fitting methods, and the advantages of digital processing techniques for type II digital signals, including nuclear spectra. A compact computer program occupying less than 2.5 kByte of memory was written in BASIC for the processing of observed spectral envelopes. (orig.)

  20. Water Complexes of Cytochrome P450: Insights from Energy Decomposition Analysis

    Directory of Open Access Journals (Sweden)

    Hajime Hirao

    2013-06-01

    Full Text Available Water is a small molecule that nevertheless perturbs, sometimes significantly, the electronic properties of an enzyme’s active site. In this study, interactions of a water molecule with the ferric heme and the compound I (Cpd I) intermediate of cytochrome P450 are studied. Energy decomposition analysis (EDA) schemes are used to investigate the physical origins of these interactions. Localized molecular orbital EDA (LMOEDA), implemented in the quantum chemistry software GAMESS, and the EDA method implemented in the ADF quantum chemistry program are used. EDA reveals that the electrostatic and polarization effects act as the major driving force in both of these interactions. The hydrogen bonding in the Cpd I•••H2O complex is similar to that in the water dimer; however, the relative importance of the electrostatic effect is somewhat larger in the water dimer.

  1. Complex network analysis of phase dynamics underlying oil-water two-phase flows

    Science.gov (United States)

    Gao, Zhong-Ke; Zhang, Shan-Shan; Cai, Qing; Yang, Yu-Xuan; Jin, Ning-De

    2016-01-01

    Characterizing the complicated flow behaviors arising from high water cut and low velocity oil-water flows is an important problem of significant challenge. We design a high-speed cycle motivation conductance sensor and carry out experiments for measuring the local flow information from different oil-in-water flow patterns. We first use multivariate time-frequency analysis to probe the typical features of three flow patterns from the perspective of energy and frequency. Then we infer complex networks from multi-channel measurements in terms of the phase lag index, aiming to uncover the phase dynamics governing the transition and evolution of different oil-in-water flow patterns. In particular, we employ the spectral radius and the weighted clustering coefficient entropy to characterize the derived unweighted and weighted networks, and the results indicate that our approach yields quantitative insights into the phase dynamics underlying high water cut and low velocity oil-water flows. PMID:27306101
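The phase lag index used to build such networks has a compact definition: the absolute mean sign of the instantaneous phase difference between two channels. A sketch given two phase series (illustrative; in practice the phases would first be extracted from the raw signals, e.g. via the Hilbert transform):

```python
from math import sin

def phase_lag_index(phi1, phi2):
    """PLI = |mean over time of sign(sin(phi1 - phi2))|.
    It is 0 when phase differences are symmetric around zero and 1
    when one channel consistently leads (or lags) the other."""
    def sign(x):
        return (x > 0) - (x < 0)
    diffs = [sign(sin(a - b)) for a, b in zip(phi1, phi2)]
    return abs(sum(diffs) / len(diffs))
```

Because only the sign of the phase difference enters, the PLI is insensitive to volume-conduction-like artifacts that produce zero-lag coupling, which is why it is a popular edge weight for networks inferred from multi-channel measurements.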

  2. Analysis of the airport network of India as a complex weighted network

    Science.gov (United States)

    Bagler, Ganesh

    2008-05-01

    Transportation infrastructure of a country is one of the most important indicators of its economic growth. Here we study the Airport Network of India (ANI), which represents India’s domestic civil aviation infrastructure, as a complex network. We find that ANI, a network of domestic airports connected by air links, is a small-world network characterized by a truncated power-law degree distribution and has a signature of hierarchy. We investigate ANI as a weighted network to explore its various properties and compare them with their topological counterparts. The traffic in ANI, as in the World-wide Airport Network (WAN), is found to be accumulated on interconnected groups of airports and is concentrated between large airports. In contrast to WAN, ANI is found to exhibit disassortative mixing, which is offset by the traffic dynamics. The analysis indicates a possible mechanism of formation of a national transportation network, which is different from that on a global scale.
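Two of the small-world diagnostics used in studies like this one, the clustering coefficient and the average path length, are straightforward to compute from an adjacency structure. A stdlib-only sketch (illustrative, not the study's code):

```python
from collections import deque

def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph
    given as {node: set(neighbors)}."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among the neighbors of v
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def average_path_length(adj):
    """Mean shortest-path length over all connected node pairs,
    by breadth-first search from every node."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
        for v, d in dist.items():
            if v != src:
                total += d
                pairs += 1
    return total / pairs
```

A small-world network combines a clustering coefficient far above that of a random graph with an average path length that grows only logarithmically with network size.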

  3. Thermal decomposition and Moessbauer analysis of two iron hydroxy-carbonate complexes

    International Nuclear Information System (INIS)

    Greaves, T.L.; Cashio, J.D.; Turney, T.

    2002-01-01

    Full text: The two iron hydroxy carbonate complexes (NH4)2Fe2(OH)4(CO3)2·H2O and (NH4)4Fe2(OH)4(CO3)3·3H2O were prepared by the method of Dvořák and Feitknecht. Moessbauer spectra of the first sample at room temperature and 81 K showed principally a ferric doublet with a small quadrupole splitting, while spectra of the second sample showed a broad ferric doublet with a large mean quadrupole splitting of 1 mm/s. Parameters for both spectra were characteristic of distorted octahedral coordination to oxygens. Thermogravimetric analysis of both samples up to 750 K showed several fractions corresponding to the loss of the more volatile components.

  4. A sophisticated cad tool for the creation of complex models for electromagnetic interaction analysis

    Science.gov (United States)

    Dion, Marc; Kashyap, Satish; Louie, Aloisius

    1991-06-01

    This report describes the essential features of the MS-DOS version of DIDEC-DREO, an interactive program for creating wire grid, surface patch, and cell models of complex structures for electromagnetic interaction analysis. It uses the device-independent graphics library DIGRAF and the graphics kernel system HALO, and can be executed on systems with various graphics devices. Complicated structures can be created by direct alphanumeric keyboard entry, digitization of blueprints, conversion from existing geometric structure files, and merging of simple geometric shapes. A completed DIDEC geometric file may then be converted to the format required for input to a variety of time domain and frequency domain electromagnetic interaction codes. This report gives a detailed description of the program DIDEC-DREO, its installation, and its theoretical background. Each available interactive command is described. The associated program HEDRON, which generates simple geometric shapes, and other programs that extract the current amplitude data from electromagnetic interaction code outputs are also discussed.

  5. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.

  6. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    Energy Technology Data Exchange (ETDEWEB)

    Cuerva, A.

    1997-10-01

    There are several groups working on the improvement of the existing Standard and recommendations on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 (Ref. [9]), to current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones regard the inherent characteristics of complex terrain, specifically the issue of site calibration and the uncertainties due to it. (Author)

  7. P2P Lending Risk Contagion Analysis Based on a Complex Network Model

    Directory of Open Access Journals (Sweden)

    Qi Wei

    2016-01-01

    Full Text Available This paper analyzes two major channels of P2P lending risk contagion in China—direct risk contagion between platforms and indirect risk contagion with other financial organizations as the contagion medium. Based on this analysis, the current study constructs a complex network model of P2P lending risk contagion in China and performs dynamic simulations in order to analyze the general characteristics of direct risk contagion among China’s online P2P lending platforms. The simulations then assume that other financial organizations act as the contagion medium, with variations in the risk contagion characteristics examined under the condition of significant information asymmetry in Internet lending. It is indicated that, compared to direct risk contagion among platforms, both financial organizations acting as the contagion medium and information asymmetry magnify the effect of risk contagion. It is also found that the superposition of media effects and information asymmetry is more likely to magnify the risk contagion effect.
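The medium effect described above can be illustrated with a toy cascade on a network: adding a shared intermediary node shortens contagion paths between platforms. This is a deterministic sketch (the "medium" node and the failure rule are assumptions for illustration, not the paper's model):

```python
from collections import deque

def cascade(adj, seeds, with_medium=False):
    """Deterministic contagion: a node fails one step after any neighbor
    fails. With with_medium=True, a hypothetical 'medium' node connected
    to every platform is added first, mimicking indirect contagion
    through a shared financial intermediary. Returns {node: step failed}."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    if with_medium:
        adj["medium"] = set(adj)          # medium touches every platform
        for v in list(adj):
            if v != "medium":
                adj[v].add("medium")
    steps = {v: 0 for v in seeds}
    q = deque(seeds)
    while q:                              # breadth-first spread of failures
        v = q.popleft()
        for w in adj[v]:
            if w not in steps:
                steps[w] = steps[v] + 1
                q.append(w)
    return steps
```

On a chain of four platforms, failure starting at one end reaches the far end in three steps directly, but in two steps once the medium is present, a minimal picture of how an intermediary magnifies contagion.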

  8. Semiquantitative Culture Analysis during Therapy for Mycobacterium avium Complex Lung Disease.

    Science.gov (United States)

    Griffith, David E; Adjemian, Jennifer; Brown-Elliott, Barbara A; Philley, Julie V; Prevots, D Rebecca; Gaston, Christopher; Olivier, Kenneth N; Wallace, Richard J

    2015-09-15

    Microbiologically based criteria such as sputum culture conversion to negative have traditionally been used to define treatment success for mycobacterial diseases. There are, however, limited data regarding whether nontuberculous mycobacterial sputum culture conversion or semiquantitative culture analysis correlates with subjective or nonmicrobiologic objective indices of treatment response. To determine whether a semiquantitative mycobacterial culture scale correlated with clinical disease status and was predictive of long-term sputum mycobacterial culture conversion to negative in a cohort of patients with nodular/bronchiectatic Mycobacterium avium complex lung disease undergoing therapy. One hundred and eighty patients undergoing standard macrolide-based therapy for M. avium complex lung disease were monitored at standard frequent intervals with symptomatic, radiographic, and microbiologic data collected, including semiquantitative mycobacterial culture analysis. Analyses were used to evaluate clinical and microbiologic predictors of long-term sputum conversion to culture negative. After 12 months of therapy, 148 (82%) patients had sputum conversion to culture negative. Baseline semiquantitative sputum culture scores did not differ between patients with sputum conversion and those without. The change in sputum culture semiquantitative score from baseline to Month 3 was highly predictive of subsequent sputum long-term conversion status indicative of treatment success, as was improvement in cough, and especially early radiographic improvement. Early semiquantitative sputum agar plate culture results can be used to predict symptomatic and radiographic improvement as well as long-term sputum culture conversion to negative in this population. We suggest that semiquantitative sputum culture scores can be a useful tool for evaluating new nontuberculous mycobacterial lung disease therapies.

  9. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    Science.gov (United States)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series is further explored in the paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential and that the degree distributions of the networks associated with the growth rates of GDP series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they are related to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes of the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government

  10. Institutional Analysis of Knowledge Generation Resource Potential at the Enterprises of Regional Military-Industrial Complex

    Directory of Open Access Journals (Sweden)

    Evgeny Vasilyevich Popov

    2016-09-01

    Full Text Available The article is devoted to the processes of knowledge generation at the enterprises of the military-industrial complex, which are the leaders of regional innovative activity. The aim of the research is to develop a methodology, based on the potential of resource application, for increasing the efficiency of knowledge generation at the instrument-making enterprises of the military-industrial complex. A system analysis of the knowledge generation processes conducted at one of them leads to the conclusion that such enterprises lack institutes for knowledge generation processes. The authors offer a technique for the development of the knowledge generation system at military-industrial enterprises based on accounting for the assets and opportunities of the enterprise in the realization of intellectual activity. The developed technique is based on the determination of the horizontal resource potential of knowledge generation and makes it possible to determine the potential of resource application at each stage of the product life cycle. The comparison of the actual and theoretical values of the horizontal resource potential allows the distribution of the share of each resource within a stage to be corrected, and therefore the realization of tasks at a specific stage to be optimized. The offered tools were implemented in 2015 at one of the regional military-industrial enterprises. The methodological tools of the research include the methods of expert assessment, mathematical statistics and institutional analysis. On the basis of the offered technique and the empirical results received, the institutional spiral of knowledge generation during the fulfilment of a state order at a military-industrial enterprise is developed. Its implementation will promote a decrease in the level of uncertainty during the whole life cycle of an innovative activity product. The developed institutional spiral of knowledge generation at instrument-making military

  11. The role of risk analysis in control of complex plant safe operation

    International Nuclear Information System (INIS)

    Dumitrescu, Maria; Preda, Irina Aida; Lazar, Roxana Elena; Carcadea, Elena

    1999-01-01

    The problem of risk estimation, assessment and control needs to be discussed at every decision-making level of an activity. The performance of a system, action or technology should be assessed qualitatively by indicating the possible consequences for the environment, people or property. The paper presents risk-assessment methodologies successfully applied to isotopic separation plants. The quantitative methodologies presented use fault trees and event trees to determine accident-state frequencies, and physical models to analyse the atmospheric dispersion of dangerous substances. The qualitative methodologies use fuzzy models for multi-criteria decision making and models based on a risk matrix built by combining the severity and the probability of the maximum admissible consequence. These methodologies are applied in the following steps: becoming familiar with the activity under study, establishing an adequate risk-assessment method, building the risk-assessment model for the activity or objective under study, and developing applications of the proposed model. Applying this methodology to isotopic separation plants has led to: analysis of operating events and identification of the principal types of potentially dangerous events; analysis of human error in plant operations and assessment of operating experience; optimisation of technical specifications by probabilistic safety assessment; reliability analysis and development of a reliability and operating-events database; post-accident event analysis (releases, fires, explosions) and mathematical modelling of the atmospheric dispersion of dangerous substances. Since the risk concept is complex and has multiple implications, neither a rigid approach nor any universally valid method is appropriate. For these reasons, choosing the most appropriate risk-assessment method for an activity leads to a timely solution of problems with economic, social
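The qualitative risk-matrix approach described in this record, combining a severity class with a probability class, can be sketched as a simple lookup. The class labels, the additive scoring and the thresholds below are illustrative assumptions, not the authors' scheme:

```python
# Qualitative risk matrix: combine a severity class and a probability class
# into a risk ranking. Labels, scoring and thresholds are illustrative only.

SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]   # index 0..3
PROBABILITY = ["improbable", "remote", "occasional", "frequent"]    # index 0..3

def risk_rank(severity: str, probability: str) -> str:
    """Map a (severity, probability) pair to a qualitative risk level."""
    s = SEVERITY.index(severity)
    p = PROBABILITY.index(probability)
    score = s + p  # simple additive combination, 0..6
    if score <= 1:
        return "low"
    if score <= 3:
        return "medium"
    if score <= 5:
        return "high"
    return "intolerable"

# Example: a critical consequence with occasional probability
print(risk_rank("critical", "occasional"))  # score 4 -> "high"
```

In a real plant study, the cell-to-rank mapping would come from the regulator's or operator's own matrix rather than an additive score.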

  12. The role of risk analysis in control of complex plants' safety operation

    International Nuclear Information System (INIS)

    Dumitrescu, Maria; Preda, Irina Aida; Lazar, Roxana Elena; Carcadea, Elena

    1999-01-01

    The problem of risk estimation, assessment and control needs to be discussed at every decision level of an activity. In this way the performance of a system, action or technology is assessed qualitatively by indicating the possible consequences for the environment, people or property. The paper presents risk-assessment methodologies successfully applied to isotopic separation plants. The quantitative methodologies presented use fault trees and event trees to determine accident-state frequencies, and physical models to analyse the atmospheric dispersion of dangerous substances. The qualitative methodologies use fuzzy models for multi-criteria decision making and models based on a risk matrix built by combining the severity and the probability of the maximum admissible consequence. These methodologies are applied in the following steps: becoming familiar with the activity under study, establishing an adequate risk-assessment method, building the risk-assessment model for the activity or objective under study, and developing applications of the proposed model. Applying this methodology to isotopic separation plants has led to: analysis of operating events and identification of the principal types of potentially dangerous events; analysis of human error in plant operations and assessment of operating experience; optimisation of technical specifications by probabilistic safety assessment; reliability analysis and development of a reliability and operating-events database; post-accident event analysis (releases, fires, explosions) and mathematical modelling of the atmospheric dispersion of dangerous substances. Since the risk concept is complex and has multiple implications, neither a rigid approach nor any universally valid method is appropriate. For these reasons, choosing the most appropriate risk-assessment method for an activity leads to a timely solution of problems with economic, social

  13. Interacting price model and fluctuation behavior analysis from Lempel–Ziv complexity and multi-scale weighted-permutation entropy

    International Nuclear Information System (INIS)

    Li, Rui; Wang, Jun

    2016-01-01

    A financial price model is developed in this work based on the voter interacting system. The Lempel–Ziv complexity is introduced to analyze the complex behaviors of the stock market. Stock market stylized facts, including fat tails, absence of autocorrelation and volatility clustering, are first investigated for the proposed price model. Then the complexity of the fluctuation behaviors of real stock markets and of the proposed price model is explored by Lempel–Ziv complexity (LZC) analysis and multi-scale weighted-permutation entropy (MWPE) analysis. A series of LZC analyses of the returns and the absolute returns of daily closing prices and moving average prices is performed. Moreover, the complexity of the returns, the absolute returns and their corresponding intrinsic mode functions (IMFs), derived from empirical mode decomposition (EMD), is also investigated with MWPE. The numerical empirical study shows similar statistical and complex behaviors between the proposed price model and the real stock markets, which suggests that the proposed model is feasible to some extent. - Highlights: • A financial price dynamical model is developed based on the voter interacting system. • Lempel–Ziv complexity is applied for the first time to investigate the stock market as a dynamical system. • MWPE is employed to explore the complex fluctuation behaviors of the stock market. • Empirical results show the feasibility of the proposed financial model.
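The LZC analysis of return series described in this record can be sketched with the standard Kaspar–Schuster counting of the LZ76 phrase decomposition, applied to a binarized return sequence. The price series below is invented for illustration; the study used real market and model data:

```python
# Lempel-Ziv complexity (Kaspar-Schuster counting of the LZ76 phrase
# decomposition) applied to a binarized return series. The price series
# below is illustrative; the original study used real market data.

def lz_complexity(s: str) -> int:
    """Number of distinct phrases in the LZ76 parsing of symbol string s."""
    n = len(s)
    if n < 2:
        return n
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:          # no longer match found: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def binarize_returns(prices):
    """'1' for a positive daily return, '0' otherwise."""
    return "".join("1" if b > a else "0" for a, b in zip(prices, prices[1:]))

prices = [100, 101, 100, 102, 101, 103, 104, 102, 105, 104]
symbols = binarize_returns(prices)
print(symbols, lz_complexity(symbols))
```

A constant sequence parses into 2 phrases and a period-2 sequence into 3, while irregular sequences give higher counts, which is what makes the measure useful as a complexity index for returns.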

  14. Interacting price model and fluctuation behavior analysis from Lempel–Ziv complexity and multi-scale weighted-permutation entropy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Rui, E-mail: lirui1401@bjtu.edu.cn; Wang, Jun

    2016-01-08

    A financial price model is developed in this work based on the voter interacting system. The Lempel–Ziv complexity is introduced to analyze the complex behaviors of the stock market. Stock market stylized facts, including fat tails, absence of autocorrelation and volatility clustering, are first investigated for the proposed price model. Then the complexity of the fluctuation behaviors of real stock markets and of the proposed price model is explored by Lempel–Ziv complexity (LZC) analysis and multi-scale weighted-permutation entropy (MWPE) analysis. A series of LZC analyses of the returns and the absolute returns of daily closing prices and moving average prices is performed. Moreover, the complexity of the returns, the absolute returns and their corresponding intrinsic mode functions (IMFs), derived from empirical mode decomposition (EMD), is also investigated with MWPE. The numerical empirical study shows similar statistical and complex behaviors between the proposed price model and the real stock markets, which suggests that the proposed model is feasible to some extent. - Highlights: • A financial price dynamical model is developed based on the voter interacting system. • Lempel–Ziv complexity is applied for the first time to investigate the stock market as a dynamical system. • MWPE is employed to explore the complex fluctuation behaviors of the stock market. • Empirical results show the feasibility of the proposed financial model.

  15. How a submarine returns to periscope depth: analysing complex socio-technical systems using Cognitive Work Analysis.

    Science.gov (United States)

    Stanton, Neville A; Bessell, Kevin

    2014-01-01

    This paper presents the application of Cognitive Work Analysis to the description of the functions, situations, activities, decisions, strategies, and competencies of a Trafalgar class submarine when performing the function of returning to periscope depth. All five phases of Cognitive Work Analysis are presented, namely: Work Domain Analysis, Control Task Analysis, Strategies Analysis, Social Organisation and Cooperation Analysis, and Worker Competencies Analysis. Complex socio-technical systems are difficult to analyse but Cognitive Work Analysis offers an integrated way of analysing complex systems with the core of functional means-ends analysis underlying all of the other representations. The joined-up analysis offers a coherent framework for understanding how socio-technical systems work. Data were collected through observation and interviews at different sites across the UK. The resultant representations present a statement of how the work domain and current activities are configured in this complex socio-technical system. This is intended to provide a baseline, from which all future conceptions of the domain may be compared. The strength of the analysis is in the multiple representations from which the constraints acting on the work may be analysed. Future research needs to challenge the assumptions behind these constraints in order to develop new ways of working. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  16. Use of eye tracking equipment for human reliability analysis applied to complex system operations

    International Nuclear Information System (INIS)

    Pinheiro, Andre Ricardo Mendonça; Prado, Eugenio Anselmo Pessoa do; Martins, Marcelo Ramos

    2017-01-01

    This article discusses the preliminary results of an evaluation methodology for the analysis and quantification of manual (human) errors, obtained by monitoring cognitive parameters and skill levels during the operation of a complex control system, based on parameters provided by eye-monitoring equipment (an eye tracker). The research was conducted using a simulator (game) that reproduces concepts of nuclear reactor operation, with the sample split to evaluate aspects of learning, knowledge and standard operation within the context addressed. Bridge operators were monitored using the eye tracker, eliminating the presence of the analyst during the evaluation of the operation and allowing the results to be analysed by means of multivariate statistical techniques within the scope of system reliability. The experiments aim to observe state-change situations such as scheduled shutdowns and start-ups, postulated incidents and common operating characteristics. Preliminary results indicate that technical and cognitive aspects can contribute to improving the available human reliability techniques, making them more realistic both in the context of quantitative approaches for regulatory and training purposes and in reducing the incidence of human error. (author)

  17. The contribution of cluster and discriminant analysis to the classification of complex aquifer systems.

    Science.gov (United States)

    Panagopoulos, G P; Angelopoulou, D; Tzirtzilakis, E E; Giannoulopoulos, P

    2016-10-01

    This paper presents an innovative method for discriminating groundwater samples into common groups representing the hydrogeological units from which they were pumped. The method proved very efficient even in areas with complex hydrogeological regimes. It requires chemical analyses of water samples only for major ions, meaning that it is applicable to most cases worldwide. Another benefit of the method is that it gives further insight into the aquifer hydrogeochemistry, as it identifies the ions responsible for the discrimination of each group. The procedure begins with cluster analysis of the dataset in order to classify the samples into the corresponding hydrogeological units. The feasibility of the method is demonstrated by the fact that the samples of volcanic origin were separated into two different clusters, namely the lava units and the pyroclastic-ignimbritic aquifer. The second step is discriminant analysis of the data, which provides the functions that distinguish the groups from each other and the most significant variables defining the hydrochemical composition of each aquifer. The whole procedure was highly successful, as 94.7% of the samples were classified into the correct aquifer system. Finally, the resulting functions can be safely used to categorize samples of unknown or doubtful origin, thus improving the quality and size of existing hydrochemical databases.
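The two-step workflow in this record can be sketched on synthetic data: cluster water samples on major-ion concentrations, then classify new samples with a nearest-centroid rule standing in for the paper's discriminant functions. The ion signatures, cluster count and all numbers below are invented assumptions:

```python
import random

# Sketch: (1) cluster samples on major-ion concentrations with a simple
# deterministic 2-means, (2) classify fresh samples by nearest centroid
# (a stand-in for discriminant functions). Data are synthetic.

random.seed(42)

def make_sample(center, spread=0.5):
    return [c + random.gauss(0, spread) for c in center]

# Two hypothetical aquifers with distinct Ca/Mg/Na signatures (scaled units)
AQUIFER_A = [8.0, 2.0, 1.0]
AQUIFER_B = [2.0, 6.0, 5.0]
samples = [make_sample(AQUIFER_A) for _ in range(20)] + \
          [make_sample(AQUIFER_B) for _ in range(20)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, iters=25):
    # deterministic 2-means: seed the centroids with the first and last point
    centroids = [list(points[0]), list(points[-1])]
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            j = 0 if dist2(p, centroids[0]) <= dist2(p, centroids[1]) else 1
            groups[j].append(p)
        for j, g in enumerate(groups):
            if g:
                centroids[j] = [sum(col) / len(g) for col in zip(*g)]
    return centroids

centroids = kmeans(samples)

def classify(sample):
    """Nearest-centroid rule: index of the closest cluster centre."""
    return min(range(len(centroids)), key=lambda j: dist2(sample, centroids[j]))

# Classify a fresh sample of unknown origin
unknown = make_sample(AQUIFER_A)
print("assigned to cluster", classify(unknown))
```

The paper's linear discriminant functions additionally weight the ions by their discriminating power; the nearest-centroid rule here is only the simplest stand-in for that step.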

  18. Complexities of sibling analysis when exposures and outcomes change with time and birth order.

    Science.gov (United States)

    Sudan, Madhuri; Kheifets, Leeka I; Arah, Onyebuchi A; Divan, Hozefa A; Olsen, Jørn

    2014-01-01

    In this study, we demonstrate the complexities of performing a sibling analysis with a re-examination of associations between cell phone exposures and behavioral problems observed previously in the Danish National Birth Cohort. Children (52,680; including 5441 siblings) followed up to age 7 were included. We examined differences in exposures and behavioral problems between siblings and non-siblings and by birth order and birth year. We estimated associations between cell phone exposures and behavioral problems while accounting for the random family effect among siblings. The association of behavioral problems with both prenatal and postnatal exposure differed between siblings (odds ratio (OR): 1.07; 95% confidence interval (CI): 0.69-1.66) and non-siblings (OR: 1.54; 95% CI: 1.36-1.74) and within siblings by birth order; the association was strongest for first-born siblings (OR: 1.72; 95% CI: 0.86-3.42) and negative for later-born siblings (OR: 0.63; 95% CI: 0.31-1.25), which may be because of increases in cell phone use with later birth year. Sibling analysis can be a powerful tool for (partially) accounting for confounding by invariant unmeasured within-family factors, but it cannot account for uncontrolled confounding by varying family-level factors, such as those that vary with time and birth order.
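The odds ratios with 95% confidence intervals reported in this record are standard 2×2-table quantities; a minimal sketch of the calculation, using invented counts rather than the cohort's data:

```python
import math

# Odds ratio with a Wald 95% confidence interval from a 2x2 table of
# exposed/unexposed vs. cases/non-cases. The counts below are invented
# purely to illustrate the calculation, not taken from the cohort.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(120, 880, 80, 920)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The sibling analysis in the paper additionally fits a random family effect, which this closed-form sketch does not attempt to reproduce.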

  19. Extractive Atmospheric Pressure Photoionization (EAPPI) Mass Spectrometry: Rapid Analysis of Chemicals in Complex Matrices.

    Science.gov (United States)

    Liu, Chengyuan; Yang, Jiuzhong; Wang, Jian; Hu, Yonghua; Zhao, Wan; Zhou, Zhongyue; Qi, Fei; Pan, Yang

    2016-10-01

    Extractive atmospheric pressure photoionization (EAPPI) mass spectrometry was designed for rapid qualitative and quantitative analysis of chemicals in complex matrices. In this method, an ultrasonic nebulization system was applied to sample extraction, nebulization, and vaporization. Mixed with a gaseous dopant, vaporized analytes were ionized through ambient photon-induced ion-molecule reactions and mass-analyzed by a high-resolution time-of-flight mass spectrometer (TOF-MS). After careful optimization and testing with pure sample solutions, EAPPI was successfully applied to the fast screening of capsules, soil, natural products, and viscous compounds. Analysis was completed within a few seconds without the need for pre-separation. Moreover, the quantification capability of EAPPI in complex matrices was evaluated by analyzing six polycyclic aromatic hydrocarbons (PAHs) in soil. The correlation coefficients (R²) for the standard curves of all six PAHs were above 0.99, and the detection limits were in the range of 0.16-0.34 ng/mg. In addition, EAPPI could also be used to monitor organic chemical reactions in real time.
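The standard-curve statistics quoted in this record (R² of the calibration line and a detection limit) follow from an ordinary least-squares fit; a sketch with made-up concentration/response values, using the common 3.3·σ/slope convention for the detection limit (an assumption, since the record does not state which convention the authors used):

```python
# Least-squares calibration line, R-squared, and a detection-limit estimate
# (3.3 * sigma / slope, a common convention) for one analyte.
# Concentrations and peak areas below are made-up illustrative values.

def linfit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    ss_res = sum(r ** 2 for r in residuals)
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot
    sigma = (ss_res / (n - 2)) ** 0.5          # residual standard deviation
    lod = 3.3 * sigma / slope                  # detection-limit estimate
    return slope, intercept, r2, lod

conc = [0.5, 1.0, 2.0, 5.0, 10.0]              # ng/mg
area = [52, 98, 205, 497, 1003]                # detector response
slope, intercept, r2, lod = linfit(conc, area)
print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} ng/mg")
```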

  20. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    International Nuclear Information System (INIS)

    Ryan, C.G.; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-01-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
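The second-pass idea in this record, treating each pixel as an admixture of end-member terms, reduces per pixel to a small least-squares problem. A minimal two-end-member sketch with invented "signatures" standing in for the DA-derived terms (the real method works on full spectra and DA matrices):

```python
# Per-pixel unmixing sketch: express a pixel signature as an admixture
# f*E1 + (1-f)*E2 of two end-member signatures, choosing f by least squares.
# Vectors here are short made-up stand-ins for DA-derived terms.

def mix_fraction(pixel, e1, e2):
    """Least-squares fraction f of end-member e1 in pixel (clamped to [0,1])."""
    d = [a - b for a, b in zip(e1, e2)]          # direction E1 - E2
    r = [p - b for p, b in zip(pixel, e2)]       # pixel relative to E2
    f = sum(di * ri for di, ri in zip(d, r)) / sum(di * di for di in d)
    return min(1.0, max(0.0, f))

E1 = [10.0, 2.0, 0.5]    # end-member 1 elemental signature
E2 = [1.0, 8.0, 4.0]     # end-member 2 elemental signature
pixel = [0.3 * a + 0.7 * b for a, b in zip(E1, E2)]  # a 30/70 mixture
print(round(mix_fraction(pixel, E1, E2), 2))  # -> 0.3
```

With more than two end-members the same projection becomes a constrained multivariate least-squares solve per pixel, which is what gives the method its speed when the matrices are precomputed.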

  1. PGSB/MIPS Plant Genome Information Resources and Concepts for the Analysis of Complex Grass Genomes.

    Science.gov (United States)

    Spannagl, Manuel; Bader, Kai; Pfeifer, Matthias; Nussbaumer, Thomas; Mayer, Klaus F X

    2016-01-01

    PGSB (Plant Genome and Systems Biology; formerly MIPS-Munich Institute for Protein Sequences) has been involved in developing, implementing and maintaining plant genome databases for more than a decade. Genome databases and analysis resources have focused on individual genomes and aim to provide flexible and maintainable datasets for model plant genomes as a backbone against which experimental data, e.g., from high-throughput functional genomics, can be organized and analyzed. In addition, genomes from both model and crop plants form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny) between related species on macro- and micro-levels. The genomes of many economically important Triticeae plants such as wheat, barley, and rye present a great challenge for sequence assembly and bioinformatic analysis due to their enormous complexity and large genome size. Novel concepts and strategies have been developed to deal with these difficulties and have been applied to the genomes of wheat, barley, rye, and other cereals. This includes the GenomeZipper concept, reference-guided exome assembly, and "chromosome genomics" based on flow cytometry sorted chromosomes.

  2. Plastome Sequence Determination and Comparative Analysis for Members of the Lolium-Festuca Grass Species Complex

    Science.gov (United States)

    Hand, Melanie L.; Spangenberg, German C.; Forster, John W.; Cogan, Noel O. I.

    2013-01-01

    Chloroplast genome sequences are of broad significance in plant biology, due to frequent use in molecular phylogenetics, comparative genomics, population genetics, and genetic modification studies. The present study used a second-generation sequencing approach to determine and assemble the plastid genomes (plastomes) of four representatives from the agriculturally important Lolium-Festuca species complex of pasture grasses (Lolium multiflorum, Festuca pratensis, Festuca altissima, and Festuca ovina). Total cellular DNA was extracted from either roots or leaves, was sequenced, and the output was filtered for plastome-related reads. A comparison between sources revealed fewer plastome-related reads from root-derived template but an increase in incidental bacterium-derived sequences. Plastome assembly and annotation indicated high levels of sequence identity and a conserved organization and gene content between species. However, frequent deletions within the F. ovina plastome appeared to contribute to a smaller plastid genome size. Comparative analysis with complete plastome sequences from other members of the Poaceae confirmed conservation of most grass-specific features. Detailed analysis of the rbcL–psaI intergenic region, however, revealed a “hot-spot” of variation characterized by independent deletion events. The evolutionary implications of this observation are discussed. The complete plastome sequences are anticipated to provide the basis for potential organelle-specific genetic modification of pasture grasses. PMID:23550121

  3. Use of eye tracking equipment for human reliability analysis applied to complex system operations

    Energy Technology Data Exchange (ETDEWEB)

    Pinheiro, Andre Ricardo Mendonça; Prado, Eugenio Anselmo Pessoa do; Martins, Marcelo Ramos, E-mail: andrericardopinheiro@usp.br, E-mail: eugenio.prado@labrisco.usp.br, E-mail: mrmatins@usp.br [Universidade de Sao Paulo (LABRISCO/USP), Sao Paulo, SP (Brazil). Lab. de Análise, Avaliação e Gerenciamento de Risco

    2017-07-01

    This article discusses the preliminary results of an evaluation methodology for the analysis and quantification of manual (human) errors, obtained by monitoring cognitive parameters and skill levels during the operation of a complex control system, based on parameters provided by eye-monitoring equipment (an eye tracker). The research was conducted using a simulator (game) that reproduces concepts of nuclear reactor operation, with the sample split to evaluate aspects of learning, knowledge and standard operation within the context addressed. Bridge operators were monitored using the eye tracker, eliminating the presence of the analyst during the evaluation of the operation and allowing the results to be analysed by means of multivariate statistical techniques within the scope of system reliability. The experiments aim to observe state-change situations such as scheduled shutdowns and start-ups, postulated incidents and common operating characteristics. Preliminary results indicate that technical and cognitive aspects can contribute to improving the available human reliability techniques, making them more realistic both in the context of quantitative approaches for regulatory and training purposes and in reducing the incidence of human error. (author)

  4. Analysis of a fuel cell on-site integrated energy system for a residential complex

    Science.gov (United States)

    Simons, S. N.; Maag, W. L.

    1979-01-01

    The energy use and costs of an on-site integrated energy system (OS/IES), which provides electric power from an on-site power plant and recovers heat that would normally be rejected to the environment, are compared with those of a conventional system purchasing electricity from a utility and of a phosphoric acid fuel cell powered system. The analysis showed that for a 500-unit apartment complex a fuel cell OS/IES would be about 10% more energy conservative in terms of total coal consumption than a diesel OS/IES or a conventional system. The fuel cell OS/IES capital costs could be 30 to 55% greater than the diesel OS/IES capital costs for the same life-cycle costs. The life-cycle cost of a fuel cell OS/IES would be lower than that of a conventional system as long as the cost of electricity is greater than $0.05 to $0.065/kWh. An analysis of several parametric combinations of fuel cell power plant and state-of-the-art energy recovery systems, and annual fuel requirement calculations for four locations, were made. It was shown that OS/IES component choices are a major factor in fuel consumption, with the least efficient system using 25% more fuel than the most efficient. Central air conditioning and heat pumps result in minimum fuel consumption while individual air conditioning units increase it, and in general the fuel cell of highest electrical efficiency has the lowest fuel consumption.

  5. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  6. Digital Elevation Profile: A Complex Tool for the Spatial Analysis of Hiking

    Directory of Open Access Journals (Sweden)

    Laura TÎRLĂ

    2014-11-01

    One of the current roles of mountain geomorphology is to provide information for tourism purposes, such as the spatial analysis of hiking trails. Geomorphic tools are therefore indispensable for terrain analysis. The elevation profile is one of the most adequate tools for assessing the morphometric patterns of hiking trails. In this study we tested several applications in order to manage raw data, create profile graphs and obtain the morphometric parameters of five hiking trails in the Căpățânii Mountains (South Carpathians, Romania). Different data complexity was explored: distance, elevation, cumulative gain or loss, slope, etc. Furthermore, a comparative morphometric analysis was performed in order to emphasize the multiple possibilities provided by the elevation profile. Results show that GPS Visualizer, Geocontext and, to some extent, Google Earth are the most adequate applications, providing high-quality elevation profiles and detailed data with multiple additional functions according to users' needs. The applied tools and techniques are very useful for mountain route planning, elaborating mountain guides, enhancing knowledge about specific trails or routes, and assessing the landscape and tourism value of a mountain area.
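The morphometric parameters named in this record (distance, cumulative gain or loss) can be computed directly from a GPS track; a sketch over a short fabricated track (not one of the five studied trails), with horizontal distance from the haversine formula:

```python
import math

# Morphometric summary of a hiking trail from (lat, lon, elevation) points:
# horizontal distance (haversine), cumulative elevation gain and loss, and
# net elevation change. The track below is a short fabricated example.

def haversine_m(p, q):
    """Great-circle distance in metres between (lat, lon) pairs in degrees."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def profile_stats(track):
    dist = gain = loss = 0.0
    for a, b in zip(track, track[1:]):
        dist += haversine_m(a[:2], b[:2])
        dz = b[2] - a[2]
        gain += max(dz, 0.0)
        loss += max(-dz, 0.0)
    return {"distance_m": dist, "gain_m": gain, "loss_m": loss,
            "net_m": track[-1][2] - track[0][2]}

track = [(45.300, 23.900, 1450.0),
         (45.302, 23.903, 1510.0),
         (45.304, 23.905, 1490.0),
         (45.307, 23.908, 1605.0)]
stats = profile_stats(track)
print({k: round(v, 1) for k, v in stats.items()})
```

Tools such as GPS Visualizer perform essentially this reduction before plotting the profile graph.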

  7. Letter Report: Stable Hydrogen and Oxygen Isotope Analysis of B-Complex Perched Water Samples

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Brady D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moran, James J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nims, Megan K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Saunders, Danielle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-04-13

    Fine-grained sediments associated with the Cold Creek Unit at Hanford have caused the formation of a perched water aquifer in the deep vadose zone at the B Complex area, which includes waste sites in the 200-DV-1 Operable Unit and the single-shell tank farms in Waste Management Area B-BX-BY. High levels of contaminants, such as uranium, technetium-99, and nitrate, make this aquifer a continuing source of contamination for the groundwater located a few meters below the perched zone. Analysis of deuterium (²H) and oxygen-18 (¹⁸O) of nine perched water samples from three different wells was performed. Samples represent time points from hydraulic tests performed on the perched aquifer using the three wells. The isotope analyses showed that the perched water had δ²H and δ¹⁸O ratios consistent with the regional meteoric water line, indicating that local precipitation events at the Hanford site likely account for recharge of the perched water aquifer. Data from the isotope analysis can be used along with pumping and recovery data to help understand the perched water dynamics related to aquifer size and hydraulic control of the aquifer in the future.
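Consistency with a meteoric water line, as assessed in this record, is conventionally checked via the deuterium excess d = δ²H − 8·δ¹⁸O relative to the global meteoric water line δ²H = 8·δ¹⁸O + 10 (Craig, 1961). A sketch with invented sample values (not the B-Complex data), using an illustrative tolerance:

```python
# Check water samples against the global meteoric water line (GMWL),
# delta2H = 8 * delta18O + 10, via the deuterium excess
# d = delta2H - 8 * delta18O. Values near +10 per mil suggest meteoric
# origin. Sample values and tolerance below are invented.

def deuterium_excess(d2h, d18o):
    return d2h - 8.0 * d18o

samples = {"well-1": (-130.0, -17.2), "well-2": (-128.0, -17.0)}
for name, (d2h, d18o) in samples.items():
    d = deuterium_excess(d2h, d18o)
    meteoric = abs(d - 10.0) <= 10.0   # loose, illustrative tolerance
    print(name, round(d, 1), "meteoric-like" if meteoric else "evaporated?")
```

In practice the comparison is made against the local meteoric water line for the Hanford region, whose slope and intercept differ slightly from the global line.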

  8. Innovation Analysis Approach to Design Parameters of High Speed Train Carriage and Their Intrinsic Complexity Relationships

    Science.gov (United States)

    Xiao, Shou-Ne; Wang, Ming-Meng; Hu, Guang-Zhong; Yang, Guang-Wu

    2017-09-01

    Because it is difficult to accurately grasp the range of influence and the transmission paths from the top-level design requirements of a vehicle to the underlying design parameters, applying a directed, weighted complex network to the product parameter model is an important method for clarifying the relationships between product parameters and establishing top-down product design. The relationships between the product parameters at each node are calculated via a simple path-searching algorithm, and the main design parameters are extracted by analysis and comparison. A uniform definition of the index formula for out-in degree can be provided based on the analysis of out-in-degree width and depth and the control strength of the carriage-body parameters. Vehicle gauge, axle load, crosswind and other parameters with higher values of the out-degree index are the most important boundary conditions; the most significant performance indices are the parameters with higher values of the out-in-degree index, including torsional stiffness, maximum testing speed, service life of the vehicle, and so on; the main design parameters comprise carriage-body weight, train weight per extended metre, train height and other parameters with higher values of the in-degree index. The network not only provides theoretical guidance for exploring the relationships between design parameters, but also further enriches the application of the forward design method to high-speed trains.
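The weighted out-degree/in-degree indices and the path search described in this record can be sketched on a toy parameter network. The node names, edge weights and graph shape below are illustrative assumptions, not the paper's carriage-body model:

```python
# Weighted out-degree / in-degree on a small directed parameter network,
# plus DFS enumeration of simple influence paths. Nodes and weights are
# illustrative, not the paper's full parameter model.

edges = [  # (source, target, weight): source influences target
    ("vehicle_gauge", "body_width", 0.9),
    ("vehicle_gauge", "body_height", 0.8),
    ("axle_load", "body_weight", 0.7),
    ("max_speed", "torsional_stiffness", 0.6),
    ("body_weight", "weight_per_metre", 0.9),
]

def weighted_degrees(edges):
    out_deg, in_deg = {}, {}
    for src, dst, w in edges:
        out_deg[src] = out_deg.get(src, 0.0) + w
        in_deg[dst] = in_deg.get(dst, 0.0) + w
    return out_deg, in_deg

out_deg, in_deg = weighted_degrees(edges)
# Top-level boundary conditions: nonzero out-degree, zero in-degree
sources = [n for n in out_deg if n not in in_deg]
print(sorted(sources))  # -> ['axle_load', 'max_speed', 'vehicle_gauge']

def all_paths(edges, start, end, path=None):
    """Enumerate simple directed paths start -> end (DFS)."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    found = []
    for src, dst, _ in edges:
        if src == start and dst not in path:
            found += all_paths(edges, dst, end, path)
    return found

print(all_paths(edges, "axle_load", "weight_per_metre"))
# -> [['axle_load', 'body_weight', 'weight_per_metre']]
```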

  9. Proteomic analysis reveals the diversity and complexity of membrane proteins in chickpea (Cicer arietinum L.)

    Directory of Open Access Journals (Sweden)

    Jaiswal Dinesh Kumar

    2012-10-01

    Background: Compartmentalization is a unique feature of eukaryotes that helps in maintaining cellular homeostasis, not only in an intra- and inter-organellar context but also between the cells and the external environment. Plant cells are highly compartmentalized, with a complex metabolic network governing various cellular events. Membranes are the most important constituents in such compartmentalization, and membrane-associated proteins play diverse roles in many cellular processes besides being integral components of many signaling cascades. Results: To obtain valuable insight into the dynamic repertoire of membrane proteins, we have developed a proteome reference map of a grain legume, chickpea, using two-dimensional gel electrophoresis. MALDI-TOF/TOF and LC-ESI-MS/MS analysis led to the identification of 91 proteins involved in a variety of cellular functions, viz. bioenergy, stress response and signal transduction, metabolism, and protein synthesis and degradation, among others. Significantly, 70% of the identified proteins are putative integral membrane proteins possessing transmembrane domains. Conclusions: The proteomic analysis revealed many resident integral membrane proteins as well as membrane-associated proteins, including some not reported earlier. To our knowledge, this is the first report of a membrane proteome from the aerial tissues of a crop plant. The findings may provide a better understanding of the biochemical machinery of plant membranes at the molecular level, which might help in functional genomics studies of different developmental pathways and stress responses.

  10. Using high complexity analysis to probe the evolution of organic aerosol during pollution events in Beijing

    Science.gov (United States)

    Hamilton, J.; Dixon, W.; Dunmore, R.; Squires, F. A.; Swift, S.; Lee, J. D.; Rickard, A. R.; Sun, Y.; Xu, W.

    2017-12-01

    There is increasing evidence that exposure to air pollution results in significant impacts on human health. In Beijing, home to over 20 million inhabitants, particulate matter levels are very high by international standards, with official estimates of an annual mean PM2.5 concentration in 2014 of 86 μg m⁻³, nearly 9 times higher than the WHO guideline. Changes in particle composition during pollution events provide key information on sources and can be used to inform strategies for pollution mitigation and health benefits. The organic fraction of PM is an extremely complex mixture reflecting the diversity of sources to the atmosphere. In this study we attempt to harness the chemical complexity of OA by developing an extensive database of over 700 mass spectra, built using literature data and source-specific tracers (e.g. diesel emission characterisation experiments and SOA generated in chamber simulations). Using a high-throughput analysis method (15 min), involving UHPLC coupled to Orbitrap mass spectrometry, chromatograms are integrated, compared to the library and a list of identified compounds produced. Purpose-built software based on R is used to automatically produce time series, alongside common aerosol metrics and data visualisation techniques, dramatically reducing analysis times. Offline measurements of organic aerosol composition were made as part of the Sources and Emissions of Air Pollutants in Beijing project, a collaborative programme between leading UK and Chinese research groups. Rather than studying only a small number of 24 h PM samples, we collected 250 filter samples at a range of time resolutions, from 30 minutes to 12 hours, depending on the time of day and PM loadings. In total 643 species were identified based on their elemental formula and retention time, with species ranging from C2-C22 and between 1 and 13 oxygens. A large fraction of the OA species observed were organosulfates and/or nitrates. Here we will present
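The library-matching step described in this record, identifying species by elemental formula (exact mass) and retention time, reduces to a tolerance search. A minimal sketch (the study's software is R-based; Python is used here for consistency, and the mini-library, retention times and tolerances are illustrative stand-ins for the 700-spectrum database):

```python
# Matching detected chromatographic peaks against a spectral library by
# exact mass and retention time, within tolerances. The mini-library,
# retention times and tolerances are illustrative only.

LIBRARY = [  # (name, monoisotopic_mass_Da, retention_time_min)
    ("levoglucosan", 162.0528, 2.1),
    ("pinonic acid", 184.1099, 6.8),
    ("4-nitrophenol", 139.0269, 5.2),
]

def match_peak(mass, rt, mass_tol=0.005, rt_tol=0.2):
    """Return library names consistent with an observed (mass, RT) peak."""
    return [name for name, m, t in LIBRARY
            if abs(m - mass) <= mass_tol and abs(t - rt) <= rt_tol]

print(match_peak(162.053, 2.05))   # -> ['levoglucosan']
print(match_peak(200.0, 3.0))      # -> []
```

Requiring both a mass and a retention-time match is what lets the high-resolution data distinguish isobaric species that a mass filter alone would confuse.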

  11. Layout design and energetic analysis of a complex diesel parallel hybrid electric vehicle

    International Nuclear Information System (INIS)

    Finesso, Roberto; Spessa, Ezio; Venditti, Mattia

    2014-01-01

    Highlights: • Layout design, energetic and cost analysis of complex parallel hybrid vehicles. • Development of global and real-time optimizers for control strategy identification. • Rule-based control strategies to minimize fuel consumption and NOx. • Energy share across each working mode for battery and thermal engine. - Abstract: The present paper is focused on the design, optimization and analysis of a complex parallel hybrid electric vehicle, equipped with two electric machines on both the front and rear axles, and on the evaluation of its potential to reduce fuel consumption and NOx emissions over several driving missions. The vehicle has been compared with two conventional parallel hybrid vehicles, equipped with a single electric machine on the front axle or on the rear axle, as well as with a conventional vehicle. All the vehicles have been equipped with compression ignition engines. The optimal layout of each vehicle was identified on the basis of the minimization of the overall powertrain costs during the whole vehicle life. These costs include the initial investment due to the production of the components as well as the operating costs related to fuel consumption and to battery depletion. Identification of the optimal powertrain control strategy, in terms of the management of the power flows of the engine and electric machines, and of gear selection, is necessary in order to be able to fully exploit the potential of the hybrid architecture. To this end, two global optimizers, one of a deterministic nature and another of a stochastic type, and two real-time optimizers have been developed, applied and compared. A new mathematical technique has been developed and applied to the vehicle simulation model in order to decrease the computational time of the optimizers. First, the vehicle model equations were written in order to allow a coarse time grid to be used, then, the control variables (i.e., power flow and gear number) were discretized, and the
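As a toy illustration of what a deterministic global optimizer over discretized control variables looks like (not the paper's vehicle model), the sketch below runs dynamic programming over a discretized battery state-of-charge grid with a charge-sustaining constraint. The fuel map, SOC grid, battery power step, and drive cycle are all invented:

```python
import math

def dp_power_split(demand, soc_levels, soc0, fuel_rate, batt_step):
    """Minimize total fuel over a drive cycle, returning to the initial SOC.
    demand: power demand per time step; soc_levels: discrete SOC grid indices."""
    cost = {s: (0.0 if s == soc0 else math.inf) for s in soc_levels}
    for p_dem in demand:
        new_cost = dict.fromkeys(cost, math.inf)
        for s, c in cost.items():
            if math.isinf(c):
                continue
            for ds in (-1, 0, 1):          # battery: discharge, hold, charge
                s2 = s + ds
                if s2 not in new_cost:
                    continue
                p_eng = max(p_dem + ds * batt_step, 0.0)  # engine covers the rest
                new_cost[s2] = min(new_cost[s2], c + fuel_rate(p_eng))
        cost = new_cost
    return cost[soc0]                      # charge-sustaining constraint

fuel = lambda p: 0.1 + 0.02 * p if p > 0 else 0.0  # toy fuel map (engine-off idle)
print(round(dp_power_split([10, 5, 0, 8], range(5), 2, fuel, 4.0), 3))  # -> 0.76
```

Because the state and control variables are discretized, the backward/forward recursion visits each (time, SOC) pair once, which is exactly what makes a coarse time grid so effective at cutting computational cost.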

  12. CT-Guided Microwave Ablation of 45 Renal Tumors: Analysis of Procedure Complexity Utilizing a Percutaneous Renal Ablation Complexity Scoring System.

    Science.gov (United States)

    Mansilla, Alberto V; Bivins, Eugene E; Contreras, Francisco; Hernandez, Manuel A; Kohler, Nathan; Pepe, Julie W

    2017-02-01

    To develop a scoring system that stratifies complexity of percutaneous ablation of renal tumors. Analysis was performed of 36 consecutive patients (mean age, 64 y; range, 30-89 y) who underwent CT-guided microwave (MW) ablation of 45 renal tumors (mean tumor diameter, 2.4 cm; range, 1.2-4.0 cm). Technical success and effectiveness were determined based on intraprocedural and follow-up imaging studies. The RENAL score and the proposed percutaneous renal ablation complexity (P-RAC) score were calculated for each tumor. Technical success was 93.3% (n = 42). Biopsy of 38 of 45 renal tumors revealed 23 renal cell carcinomas. Median follow-up period was 9.7 months (range, 2.9-46.8 months). There were no tumor recurrences. One major complication, ureteropelvic junction stricture, occurred (2.6%). The P-RAC score was found to differ statistically from the RENAL score (t = 3.754, df = 44, P = .001). A positive correlation was found between the P-RAC score and number of antenna insertions (r = .378, n = 45, P = .011) and procedure duration (r = .328, n = 45, P = .028). No correlation was found between the RENAL score and number of MW antenna insertions (r = .110, n = 45, P = .472) or procedure duration (r = .263, n = 45, P = .081). Hydrodissection was significantly more common in the P-RAC high-complexity category than in the low-complexity category (χ2 = 12.073, df = 2, P = .002). The P-RAC score may be useful in stratifying percutaneous renal ablation complexity. Further studies with larger sample sizes are necessary to validate the P-RAC score and to determine if it can predict risk of complications. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.

  13. Strategies for the structural analysis of multi-protein complexes: lessons from the 3D-Repertoire project.

    Science.gov (United States)

    Collinet, B; Friberg, A; Brooks, M A; van den Elzen, T; Henriot, V; Dziembowski, A; Graille, M; Durand, D; Leulliot, N; Saint André, C; Lazar, N; Sattler, M; Séraphin, B; van Tilbeurgh, H

    2011-08-01

    Structural studies of multi-protein complexes, whether by X-ray diffraction, scattering, NMR spectroscopy or electron microscopy, require stringent quality control of the component samples. The inability to produce 'keystone' subunits in a soluble and correctly folded form is a serious impediment to the reconstitution of the complexes. Co-expression of the components offers a valuable alternative to the expression of single proteins as a route to obtain sufficient amounts of the sample of interest. Even in cases where milligram-scale quantities of purified complex of interest become available, there is still no guarantee that good quality crystals can be obtained. At this step, protein engineering of one or more components of the complex is frequently required to improve solubility, yield or the ability to crystallize the sample. Subsequent characterization of these constructs may be performed by solution techniques such as Small Angle X-ray Scattering and Nuclear Magnetic Resonance to identify 'well behaved' complexes. Herein, we recount our experience gained in protein production and complex assembly during the European 3D Repertoire project (3DR). The goal of this consortium was to obtain structural information on multi-protein complexes from yeast by combining crystallography, electron microscopy, NMR and in silico modeling methods. We present here a representative set of case studies of complexes that were produced and analyzed within the 3DR project. Our experience provides useful insight into strategies that are more generally applicable for structural analysis of protein complexes. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Complexity and Automation Displays of Air Traffic Control: Literature Review and Analysis

    National Research Council Canada - National Science Library

    Xing, Jing; Manning, Carol A

    2005-01-01

    This report reviewed a number of measures of complexity associated with visual displays and analyzed the potential to apply these methods to assess the complexity of air traffic control (ATC) displays...

  15. Separating complex compound patient motion tracking data using independent component analysis

    Science.gov (United States)

    Lindsay, C.; Johnson, K.; King, M. A.

    2014-03-01

    In SPECT imaging, motion from respiration and body motion can reduce image quality by introducing motion-related artifacts. A minimally-invasive way to track patient motion is to attach external markers to the patient's body and record their location throughout the imaging study. If a patient exhibits multiple movements simultaneously, such as respiration and body-movement, each marker's location data will contain a mixture of these motions. Decomposing this complex compound motion into separate simplified motions can have the benefit of applying a more robust motion correction to the specific type of motion. Most motion tracking and correction techniques target a single type of motion and either ignore compound motion or treat it as noise. A few methods that account for compound motion exist, but they fail to disambiguate superpositions within the compound motion (i.e. inspiration in addition to body movement in the positive anterior/posterior direction). We propose a new method for decomposing the complex compound patient motion using an unsupervised learning technique called Independent Component Analysis (ICA). Our method can automatically detect and separate different motions while preserving nuanced features of the motion without the drawbacks of previous methods. Our main contributions are the development of a method for addressing multiple compound motions, the novel use of ICA in detecting and separating mixed independent motions, and generating a motion transform with 12 DOFs to account for twisting and shearing. We show that our method works with clinical datasets and can be employed to improve motion correction in single photon emission computed tomography (SPECT) images.
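A minimal NumPy sketch of the core idea: two invented marker signals, each a mixture of a respiratory sine and a body-shift step, are separated by a bare-bones symmetric FastICA. The signals, mixing matrix, and implementation details are illustrative; the paper's 12-DOF transforms and clinical data handling are not reproduced here:

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Symmetric FastICA with a tanh nonlinearity. X: (n_signals, n_samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))            # whiten the mixed signals
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((X.shape[0], X.shape[0])))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)                      # fixed-point update
        W = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)             # symmetric decorrelation
        W = U @ Vt
    return W @ Z                                # estimated independent motions

# Two markers record mixtures of respiration (sine) and a body shift (step).
t = np.linspace(0.0, 10.0, 2000)
resp = np.sin(2 * np.pi * 0.3 * t)
shift = np.where(t > 5.0, 1.0, 0.0)
X = np.array([[1.0, 0.5], [0.4, 1.2]]) @ np.vstack([resp, shift])
motions = fast_ica(X)
print(motions.shape)  # (2, 2000)
```

Up to ICA's inherent permutation and sign ambiguity, each recovered row tracks one of the two underlying motions, which is what allows a motion-specific correction to be applied downstream.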

  16. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) in evaluating waste management scenarios with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA traces waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (e.g., transportation, small sites, new streams). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures. The RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline of system compliance with requirements and also an opportunity to reevaluate what requirements need to be satisfied in FY-98

  17. Probabilities in quantum cosmological models: A decoherent histories analysis using a complex potential

    International Nuclear Information System (INIS)

    Halliwell, J. J.

    2009-01-01

    In the quantization of simple cosmological models (minisuperspace models) described by the Wheeler-DeWitt equation, an important step is the construction, from the wave function, of a probability distribution answering various questions of physical interest, such as the probability of the system entering a given region of configuration space at any stage in its entire history. A standard but heuristic procedure is to use the flux of (components of) the wave function in a WKB approximation. This gives sensible semiclassical results but lacks an underlying operator formalism. In this paper, we address the issue of constructing probability distributions linked to the Wheeler-DeWitt equation using the decoherent histories approach to quantum theory. The key step is the construction of class operators characterizing questions of physical interest. Taking advantage of a recent decoherent histories analysis of the arrival time problem in nonrelativistic quantum mechanics, we show that the appropriate class operators in quantum cosmology are readily constructed using a complex potential. The class operator for not entering a region of configuration space is given by the S matrix for scattering off a complex potential localized in that region. We thus derive the class operators for entering one or more regions in configuration space. The class operators commute with the Hamiltonian, have a sensible classical limit, and are closely related to an intersection number operator. The definitions of class operators given here handle the key case in which the underlying classical system has multiple crossings of the boundaries of the regions of interest. We show that oscillatory WKB solutions to the Wheeler-DeWitt equation give approximate decoherence of histories, as do superpositions of WKB solutions, as long as the regions of configuration space are sufficiently large. 
The corresponding probabilities coincide, in a semiclassical approximation, with standard heuristic procedures
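Schematically, and with notation invented here rather than taken from the paper, the construction described above can be summarized as follows: the class operator for histories that never enter a region Δ of configuration space is the S-matrix for scattering off an absorbing (complex) potential supported on Δ,

```latex
C_{\bar{\Delta}} \;=\; S_{V}\,, \qquad
H_{V} \;=\; H \;-\; i\,V_{0}\,\chi_{\Delta}(x)\,, \qquad
C_{\Delta} \;=\; 1 \;-\; C_{\bar{\Delta}}\,,
```

where χ_Δ is the characteristic function of Δ, H is the Hamiltonian appearing in the Wheeler-DeWitt constraint, and C_Δ is the complementary class operator for histories that enter Δ at least once. Probabilities are then assigned to these alternatives in the usual decoherent-histories way whenever the corresponding histories decohere.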

  18. A new acoustic portal into the odontocete ear and vibrational analysis of the tympanoperiotic complex.

    Directory of Open Access Journals (Sweden)

    Ted W Cranford

    Global concern over the possible deleterious effects of noise on marine organisms was catalyzed when toothed whales stranded and died in the presence of high intensity sound. The lack of knowledge about mechanisms of hearing in toothed whales prompted our group to study the anatomy and build a finite element model to simulate sound reception in odontocetes. The primary auditory pathway in toothed whales is an evolutionary novelty, compensating for the impedance mismatch experienced by whale ancestors as they moved from hearing in air to hearing in water. The mechanism by which high-frequency vibrations pass from the low density fats of the lower jaw into the dense bones of the auditory apparatus is a key to understanding odontocete hearing. Here we identify a new acoustic portal into the ear complex, the tympanoperiotic complex (TPC), and a plausible mechanism by which sound is transduced into the bony components. We reveal the intact anatomic geometry using CT scanning, and test functional preconceptions using finite element modeling and vibrational analysis. We show that the mandibular fat bodies bifurcate posteriorly, attaching to the TPC in two distinct locations. The smaller branch is an inconspicuous, previously undescribed channel, a cone-shaped fat body that fits into a thin-walled bony funnel just anterior to the sigmoid process of the TPC. The TPC also contains regions of thin translucent bone that define zones of differential flexibility, enabling the TPC to bend in response to sound pressure, thus providing a mechanism for vibrations to pass through the ossicular chain. The techniques used to discover the new acoustic portal in toothed whales provide a means to decipher auditory filtering, beam formation, impedance matching, and transduction. These tools can also be used to address concerns about the potential deleterious effects of high-intensity sound in a broad spectrum of marine organisms, from whales to fish.

  19. Qualitative and quantitative analysis of complex temperature-programmed desorption data by multivariate curve resolution

    Science.gov (United States)

    Rodríguez-Reyes, Juan Carlos F.; Teplyakov, Andrew V.; Brown, Steven D.

    2010-10-01

    The substantial amount of information carried in temperature-programmed desorption (TPD) experiments is often difficult to mine due to the occurrence of competing reaction pathways that produce compounds with similar mass spectrometric features. Multivariate curve resolution (MCR) is introduced as a tool capable of overcoming this problem by mathematically detecting spectral variations and correlations between several m/z traces, which is then translated into the extraction of the cracking pattern and the desorption profile for each desorbate. Unlike the elegant (though complex) methods currently available to analyze TPD data, MCR analysis is applicable even when no information regarding the specific surface reaction/desorption process or the nature of the desorbing species is available. However, when available, any information can be used as constraints that guide the outcome, increasing the accuracy of the resolution. This approach is especially valuable when the compounds desorbing are different from what would be expected based on chemical intuition, when the cracking pattern of the model test compound is difficult or impossible to obtain (because it could be unstable or very rare), and when, in more traditional methods, knowledge of the major components desorbing from the surface could actually bias the quantification of minor components. The enhanced level of understanding of thermal processes achieved through MCR analysis is demonstrated by analyzing three phenomena: i) the cryogenic desorption of vinyltrimethylsilane from silicon, an introductory system where the known multilayer and monolayer components are resolved; ii) acrolein hydrogenation on a bimetallic Pt-Ni-Pt catalyst, where a rapid identification of hydrogenated products as well as other desorbing species is achieved, and iii) the thermal reaction of Ti[N(CH3)2]4 on Si(100), where the products of surface decomposition are identified and an estimation of the surface composition after the
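The bilinear model underlying MCR can be sketched with a toy alternating-least-squares resolution of a synthetic TPD data matrix D ≈ C Sᵀ, where C holds the desorption profiles and S the cracking patterns. The data, component count, and plain non-negativity constraints below are illustrative; real MCR-ALS adds constraints such as known spectra or unimodality:

```python
import numpy as np

def mcr_als(D, n_components, n_iter=100, seed=0):
    """Toy MCR-ALS: factor D (temperature x m/z) into non-negative C and S."""
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_components))
    for _ in range(n_iter):
        # Alternate least-squares solves, clipping to enforce non-negativity.
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)   # spectra
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None) # profiles
    return C, S

# Two synthetic desorbates: Gaussian desorption peaks, distinct cracking patterns.
T = np.linspace(300.0, 600.0, 120)
C_true = np.stack([np.exp(-((T - 400.0) / 20.0) ** 2),
                   np.exp(-((T - 500.0) / 25.0) ** 2)], axis=1)
S_true = np.array([[1.0, 0.3, 0.0, 0.1],
                   [0.0, 0.2, 1.0, 0.4]]).T     # four m/z traces
D = C_true @ S_true.T
C, S = mcr_als(D, 2)
print(C.shape, S.shape)  # (120, 2) (4, 2)
```

On this noise-free rank-2 matrix the factorization reconstructs D essentially exactly; the resolved columns of C and S correspond to the individual desorption profiles and cracking patterns up to scaling and permutation.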

  20. Rational pain management in complex regional pain syndrome 1 (CRPS 1)--a network meta-analysis.

    Science.gov (United States)

    Wertli, Maria M; Kessels, Alphons G H; Perez, Roberto S G M; Bachmann, Lucas M; Brunner, Florian

    2014-09-01

    Guidelines for complex regional pain syndrome (CRPS) 1 advocate several substance classes to reduce pain and support physical rehabilitation, but guidance about which agent should be prioritized when designing a therapeutic regimen is not provided. Using a network meta-analytic approach, we examined the efficacy of all agent classes investigated in randomized clinical trials of CRPS 1 and provide a rank order of various substances stratified by length of illness duration. In this study a network meta-analysis was conducted. The participants of this study were patients with CRPS 1. Searches of electronic databases, previous systematic reviews, conference abstracts, book chapters, and the reference lists of relevant articles were performed. Eligible studies were randomized controlled trials comparing at least one analgesic agent with placebo or with another analgesic and reporting efficacy in reducing pain. Summary efficacy stratified by symptom duration and length of follow-up was computed across all substance classes. Two authors independently extracted data. In total, 16 studies were included in the analysis. Bisphosphonates appear to be the treatment of choice in early stages of CRPS 1. The effects of calcitonin surpass those of bisphosphonates and other substances as a short-term medication in more chronic stages of the illness. While most medications showed some efficacy on short-term follow-up, only bisphosphonates, NMDA analogs, and vasodilators showed better long-term pain reduction than placebo. For some drug classes, only a few studies were available and many studies included a small group of patients. Insufficient data were available to analyze efficacy on disability. This network meta-analysis indicates that a rational pharmacological treatment strategy of pain management should consider bisphosphonates in early CRPS 1 and a short-term course of calcitonin in later stages. While most medications showed some efficacy on short-term follow-up, only bisphosphonates

  1. Complexity Plots

    KAUST Repository

    Thiyagalingam, Jeyarajan

    2013-06-01

    In this paper, we present a novel visualization technique for assisting the observation and analysis of algorithmic complexity. In comparison with conventional line graphs, this new technique is not sensitive to the units of measurement, allowing multivariate data series of different physical quantities (e.g., time, space and energy) to be juxtaposed together conveniently and consistently. It supports multivariate visualization as well as uncertainty visualization. It enables users to focus on algorithm categorization by complexity classes, while reducing visual impact caused by constants and algorithmic components that are insignificant to complexity analysis. It provides an effective means for observing the algorithmic complexity of programs with a mixture of algorithms and black-box software through visualization. Through two case studies, we demonstrate the effectiveness of complexity plots in complexity analysis in research, education and application. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and Blackwell Publishing Ltd.
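The unit-insensitive categorization idea can be illustrated (in a far simpler form than the paper's visualization technique) by normalizing a measured cost curve and comparing it against normalized reference growth curves, so constants and units cancel out. The timing data below are synthetic:

```python
import math

def normalize(ys):
    """Scale a cost curve to [0, 1] so units and constant factors drop out."""
    top = max(ys)
    return [y / top for y in ys]

def classify(ns, costs):
    """Return the reference complexity class whose normalized curve fits best."""
    refs = {
        "O(n)": [float(n) for n in ns],
        "O(n log n)": [n * math.log(n) for n in ns],
        "O(n^2)": [float(n * n) for n in ns],
    }
    obs = normalize(costs)
    def err(ref):
        return sum((a - b) ** 2 for a, b in zip(obs, normalize(ref)))
    return min(refs, key=lambda k: err(refs[k]))

ns = [2 ** k for k in range(4, 10)]
costs = [3e-6 * n * math.log(n) + 1e-4 for n in ns]  # n log n plus fixed overhead
print(classify(ns, costs))  # -> O(n log n)
```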

  2. Comparative evolutionary analysis of protein complexes in E. coli and yeast

    Directory of Open Access Journals (Sweden)

    Ranea Juan AG

    2010-02-01

    Background: Proteins do not act in isolation; they frequently act together in protein complexes to carry out concerted cellular functions. The evolution of complexes is poorly understood, especially in organisms other than yeast, where little experimental data has been available. Results: We generated accurate, high coverage datasets of protein complexes for E. coli and yeast in order to study differences in the evolution of complexes between these two species. We show that substantial differences exist in how complexes have evolved between these organisms. A previously proposed model of complex evolution identified complexes with cores of interacting homologues. We support findings of the relative importance of this mode of evolution in yeast, but find that it is much less common in E. coli. Additionally it is shown that those homologues which do cluster in complexes are involved in eukaryote-specific functions. Furthermore we identify correlated pairs of non-homologous domains which occur in multiple protein complexes. These were identified in both yeast and E. coli and we present evidence that these too may represent complex cores in yeast but not in E. coli. Conclusions: Our results suggest that there are differences in the way protein complexes have evolved in E. coli and yeast. Whereas some yeast complexes have evolved by recruiting paralogues, this is not apparent in E. coli. Furthermore, such complexes are involved in eukaryotic-specific functions. This implies that the increase in gene family sizes seen in eukaryotes in part reflects multiple family members being used within complexes. However, in general, in both E. coli and yeast, homologous domains are used in different complexes.

  3. Advanced analysis of complex seismic waveforms to characterize the subsurface Earth structure

    Science.gov (United States)

    Jia, Tianxia

    2011-12-01

    This thesis includes three major parts, (1) Body wave analysis of mantle structure under the Calabria slab, (2) Spatial Average Coherency (SPAC) analysis of microtremor to characterize the subsurface structure in urban areas, and (3) Surface wave dispersion inversion for shear wave velocity structure. Although these three projects apply different techniques and investigate different parts of the Earth, their aims are the same, which is to better understand and characterize the subsurface Earth structure by analyzing complex seismic waveforms that are recorded on the Earth surface. My first project is body wave analysis of mantle structure under the Calabria slab. Its aim is to better understand the subduction structure of the Calabria slab by analyzing seismograms generated by natural earthquakes. The rollback and subduction of the Calabrian Arc beneath the southern Tyrrhenian Sea is a case study of slab morphology and slab-mantle interactions at short spatial scale. I analyzed the seismograms traversing the Calabrian slab and upper mantle wedge under the southern Tyrrhenian Sea through body wave dispersion, scattering and attenuation, which are recorded during the PASSCAL CAT/SCAN experiment. Compressional body waves exhibit dispersion correlating with slab paths, with high-frequency components arriving later than low-frequency components. Body wave scattering and attenuation are also spatially correlated with slab paths. I used this correlation to estimate the positions of slab boundaries, and further suggested that the observed spatial variation in near-slab attenuation could be ascribed to mantle flow patterns around the slab. My second project is Spatial Average Coherency (SPAC) analysis of microtremors for subsurface structure characterization.
Shear-wave velocity (Vs) information in soil and rock has been recognized as a critical parameter for site-specific ground motion prediction study, which is highly necessary for urban areas located
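The core SPAC relation states that the azimuthally averaged coherency between two stations a distance r apart follows ρ(f) = J₀(2πfr/c(f)), so the phase velocity c(f) can be read off by matching observed coherency to a Bessel function. A grid-search sketch with invented frequency, separation, and velocity:

```python
import numpy as np

def j0(x):
    """Bessel J0 via its integral form, J0(x) = (1/pi) ∫_0^pi cos(x sin t) dt."""
    t = np.linspace(0.0, np.pi, 4001)
    return np.cos(np.outer(np.atleast_1d(x), np.sin(t))).mean(axis=-1)

def invert_velocity(f, r, rho_obs, c_grid):
    """Grid-search the phase velocity whose J0 prediction best matches rho_obs."""
    misfit = [abs(j0(2 * np.pi * f * r / c)[0] - rho_obs) for c in c_grid]
    return c_grid[int(np.argmin(misfit))]

f, r, c_true = 5.0, 50.0, 400.0              # Hz, m, m/s (all synthetic)
rho = j0(2 * np.pi * f * r / c_true)[0]      # "observed" coherency
c_est = invert_velocity(f, r, rho, np.linspace(100.0, 800.0, 701))
print(round(c_est, 1))  # -> 400.0
```

In practice this fit is done across many frequencies to build a dispersion curve, which is then inverted for the Vs profile mentioned above; J₀ oscillates, so real inversions must also handle the resulting non-uniqueness.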

  4. Quantitative on-line analysis of sulfur compounds in complex hydrocarbon matrices.

    Science.gov (United States)

    Djokic, Marko R; Ristic, Nenad D; Olahova, Natalia; Marin, Guy B; Van Geem, Kevin M

    2017-08-04

    An improved method for on-line measurement of sulfur containing compounds in complex matrices is presented. The on-line system consists of a specifically designed sampling system connected to a comprehensive two-dimensional gas chromatograph (GC×GC) equipped with two capillary columns (Rtx®-1 PONA × SGE BPX50), a flame ionization detector (FID) and a sulfur chemiluminescence detector (SCD). The result is an unprecedented sensitivity down to ppm level (1 ppm-w) for various sulfur containing compounds in very complex hydrocarbon matrices. In addition to the GC×GC-SCD, the low molecular weight sulfur containing compounds such as hydrogen sulfide (H2S) and carbonyl sulfide (COS) can be analyzed using a thermal conductivity detector of a so-called refinery gas analyzer (RGA). The methodology was extensively tested on a continuous flow pilot plant for steam cracking, in which quantification of sulfur containing compounds in the reactor effluent was carried out using 3-chlorothiophene as internal standard. The GC×GC-FID/-SCD settings were optimized for ppm analysis of sulfur compounds in olefin-rich (ethylene- and propylene-rich) hydrocarbon matrices produced by steam cracking of petroleum feedstocks. Besides being primarily used for analysis of the hydrocarbon matrix, the FID of the GC×GC-FID/-SCD set-up serves to double-check the amount of added sulfur internal standard, which is crucial for a proper quantification of sulfur compounds. When vacuum gas oil containing 780 ppm-w of elemental sulfur in the form of benzothiophenes and dibenzothiophenes is subjected to steam cracking, the sulfur balance was closed, with 75% of the sulfur contained in the feed converted to hydrogen sulfide, 13% to alkyl homologues of thiophene, while the remaining 12% is present in the form of alkyl homologues of benzothiophenes. The methodology can be applied for many other conversion processes which use sulfur containing feeds such as hydrocracking, catalytic cracking, kerogen
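The internal-standard quantification used above reduces to simple arithmetic: the mass of a sulfur compound follows from its detector peak area relative to the added 3-chlorothiophene standard. The areas, amounts, and response factor below are invented for illustration:

```python
def quantify(area_analyte, area_istd, mass_istd_mg, response_factor=1.0):
    """Mass of analyte (mg) from its peak-area ratio to the internal standard.
    An SCD responds nearly equimolarly per sulfur atom, so RF is often ~1."""
    return response_factor * mass_istd_mg * area_analyte / area_istd

# Example: analyte peak area 2400, internal-standard area 3000, 1.5 mg added.
print(quantify(2400, 3000, 1.5))  # -> 1.2
```

Because both peaks are measured in the same run, systematic losses in sampling or injection cancel in the ratio, which is the main reason internal standards are preferred for this kind of on-line quantification.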

  5. New heteroleptic Zn(II) complexes of thiosemicarbazone and diimine Co-Ligands: Structural analysis and their biological impacts

    Science.gov (United States)

    Mathan Kumar, Shanmugaiah; Kesavan, Mookkandi Palsamy; Vinoth Kumar, Gujuluva Gangatharan; Sankarganesh, Murugesan; Chakkaravarthi, Ganesan; Rajagopal, Gurusamy; Rajesh, Jegathalaprathaban

    2018-02-01

    New Zn(II) complexes [Zn(L)(bpy)] (1) and [Zn(L)(phen)] (2) appended with a thiosemicarbazone ligand HL (where HL = {2-(3-bromo-5-chloro-2-hydroxybenzylidene)-N-phenylhydrazinecarbothioamide}, bpy = 2,2′-bipyridine and phen = 1,10-phenanthroline) have been synthesized and well characterized using conventional spectroscopic techniques, viz. 1H NMR, FTIR and UV-Vis spectra. The crystal structures of complexes 1 and 2 have been determined by single crystal X-ray diffraction studies. Both complexes 1 (τ = 0.5) and 2 (τ = 0.37) possess a square-pyramidally distorted trigonal bipyramidal geometry. The ground state electronic structures of complexes 1 and 2 were investigated by DFT/B3LYP theoretical analysis using the 6-311G(d,p) and LANL2DZ basis set levels. The superior DNA binding ability of complex 2 has been evaluated using absorption and fluorescence spectral titration studies. Antimicrobial evaluation reveals that complex 2 shows better activity than HL and complex 1 against both bacterial and fungal species. Consequently, complex 2 possesses the highest antibacterial activity against Staphylococcus aureus (MIC = 3.0 ± 0.23 mM) and antifungal activity against Candida albicans (MIC = 6.0 ± 0.11 mM). Furthermore, the anticancer activities of the ligand HL and complexes 1 and 2 have been examined against the MCF-7 cell line (human breast cancer cell line) using the MTT assay. It is remarkable that complex 2 (12 ± 0.67 μM) shows higher anticancer activity than HL (25.0 ± 0.91 μM) and complex 1 (15 ± 0.88 μM), owing to the presence of the phen ligand moiety.
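The geometry index τ quoted for complexes 1 and 2 is Addison's τ₅ parameter for five-coordinate centers: τ = (β − α)/60, where β ≥ α are the two largest coordination angles, τ = 0 for ideal square pyramidal and τ = 1 for ideal trigonal bipyramidal. The angles in the snippet are illustrative, not the paper's crystallographic data:

```python
def tau5(angles_deg):
    """Addison tau-5 geometry index from the coordination angles (degrees)."""
    beta, alpha = sorted(angles_deg)[-2:][::-1]   # two largest angles, beta >= alpha
    return (beta - alpha) / 60.0

print(tau5([170.0, 140.0, 120.0, 95.0]))  # -> 0.5, intermediate like complex 1
```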

  6. Analysis of the Response of a 600 kW Stall Controlled Wind Turbine in Complex Terrain

    Energy Technology Data Exchange (ETDEWEB)

    Cuerva, A.; Bercebal, D.; De la Cruz, S.; Lopez-Diez, S.; Lopez-Roque, V.; Vazquez-Aguado, A.; Marti, I.; Marchante, M.; Navarro, J. [CIEMAT. Madrid (Spain)

    1998-12-31

    This work presents a detailed analysis of the operating characteristics of a 600 kW rated power wind turbine installed in complex terrain. The description of the experimental set up and analysis system is included. The relationships between parameters that describe the wind turbine response and the environmental conditions are established via high level statistical analysis, fatigue analysis and analysis in the frequency domain. Dimensionless factors are calculated to explain the intrinsic response of the structure under stochastic and deterministic wind conditions, independently of its size and wind intensity. Finally, conclusions are presented regarding the parameters that affect the loading state and power production of the machine. (Author) 12 refs.

  7. Analysis of the Response of a 600 kW Stall Controlled Wind Turbine in Complex Terrain

    International Nuclear Information System (INIS)

    Cuerva, A.; Bercebal, D.; De La Cruz, M.; Lopez-Diez, S.; Lopez-Roque, V.; Vazquez-Aguado, A.; Marti, I.; Marchante, M.; Navarro, J.

    1998-01-01

    This work presents a detailed analysis of the operating characteristics of a 600 kW rated power wind turbine installed in complex terrain. The description of the experimental set up and analysis system is included. The relationships between parameters that describe the wind turbine response and the environmental conditions are established via high level statistical analysis, fatigue analysis and analysis in the frequency domain. Dimensionless factors are calculated to explain the intrinsic response of the structure under stochastic and deterministic wind conditions, independently of its size and wind intensity. Finally, conclusions are presented regarding the parameters that affect the loading state and power production of the machine. (Author) 12 refs

  8. Anatomical Network Analysis Shows Decoupling of Modular Lability and Complexity in the Evolution of the Primate Skull

    Science.gov (United States)

    Esteve-Altava, Borja; Boughner, Julia C.; Diogo, Rui; Villmoare, Brian A.; Rasskin-Gutman, Diego

    2015-01-01

    Modularity and complexity go hand in hand in the evolution of the skull of primates. Because analyses of these two parameters often use different approaches, we do not yet know how modularity evolves within, or as a consequence of, an also-evolving complex organization. Here we use a novel network theory-based approach (Anatomical Network Analysis) to assess how the organization of skull bones constrains the co-evolution of modularity and complexity among primates. We used the pattern of bone contacts modeled as networks to identify connectivity modules and quantify morphological complexity. We analyzed whether modularity and complexity evolved coordinately in the skull of primates. Specifically, we tested Herbert Simon’s general theory of near-decomposability, which states that modularity promotes the evolution of complexity. We found that the skulls of extant primates divide into one conserved cranial module and up to three labile facial modules, whose composition varies among primates. Despite changes in modularity, statistical analyses reject a positive feedback between modularity and complexity. Our results suggest a decoupling of complexity and modularity that translates to varying levels of constraint on the morphological evolvability of the primate skull. This study has methodological and conceptual implications for grasping the constraints that underlie the developmental and functional integration of the skull of humans and other primates. PMID:25992690
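The network quantities involved can be illustrated on a toy bone-contact graph: bones are nodes, contacts are edges, and Newman's modularity Q scores a hypothesized division into modules. The mini "skull", its contacts, and the module assignment are invented, and the paper's complexity measures are richer than the single score computed here:

```python
def modularity(edges, partition):
    """Newman modularity Q of a node->module partition of an undirected graph."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    q = 0.0
    for u in deg:                      # sum over ordered node pairs
        for v in deg:
            a = sum(1 for e in edges if e in ((u, v), (v, u)))  # adjacency A_uv
            if partition[u] == partition[v]:
                q += a - deg[u] * deg[v] / (2 * m)
    return q / (2 * m)

# Invented bone-contact graph with a "cranial" and a "facial" module.
edges = [("frontal", "parietal"), ("parietal", "occipital"),
         ("maxilla", "nasal"), ("nasal", "frontal")]
part = {"frontal": "cranial", "parietal": "cranial", "occipital": "cranial",
        "maxilla": "facial", "nasal": "facial"}
print(round(modularity(edges, part), 3))  # -> 0.219
```

A positive Q indicates more within-module contacts than expected by chance, which is how connectivity modules such as the conserved cranial block are detected in the bone networks.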

  9. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    Science.gov (United States)

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. 
The preload is predominantly affected by the applied torque and the coefficient of friction between the screw threads and the implant bore at the lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment
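The Monte Carlo side of this workflow can be sketched without the finite element model by substituting the common short-form torque–preload relation T ≈ K·F·d with nut factor K ≈ 1.33·μ; all distributions, the screw diameter, and the target window below are illustrative assumptions, not the study's inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical input distributions (not the study's values):
torque = rng.normal(0.32, 0.02, n)      # applied torque [N*m]
mu = rng.uniform(0.2, 0.5, n)           # dry-thread coefficient of friction
d = 2.0e-3                              # nominal screw diameter [m] (assumed)

# Simplified torque-preload relation T = K*F*d with nut factor K ~ 1.33*mu
# (a common shortcut; the study instead solved for preload with an FE model).
K = 1.33 * mu
preload = torque / (K * d)              # preload F [N]

# Empirical CDF and probability of landing in an assumed target window
preload_sorted = np.sort(preload)
cdf = np.arange(1, n + 1) / n
target = (preload >= 500) & (preload <= 700)
print(f"mean preload {preload.mean():.0f} N, P(target) = {target.mean():.2%}")
```

In the study the preload comes from an FE solve rather than this closed-form relation, but the sampling and CDF construction are the same.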

  10. Time Series Analysis OF SAR Image Fractal Maps: The Somma-Vesuvio Volcanic Complex Case Study

    Science.gov (United States)

    Pepe, Antonio; De Luca, Claudio; Di Martino, Gerardo; Iodice, Antonio; Manzo, Mariarosaria; Pepe, Susi; Riccio, Daniele; Ruello, Giuseppe; Sansosti, Eugenio; Zinno, Ivana

    2016-04-01

    The fractal dimension is a significant geophysical parameter describing natural surfaces; it represents the distribution of roughness over different spatial scales and, in the case of volcanic structures, has been related to the specific nature of materials and to the effects of active geodynamic processes. In this work, we present an analysis of the temporal behavior of fractal dimension estimates generated from multi-pass SAR images relevant to the Somma-Vesuvio volcanic complex (South Italy). To this aim, we consider a Cosmo-SkyMed data-set of 42 stripmap images acquired from ascending orbits between October 2009 and December 2012. Starting from these images, we generate a three-dimensional stack composed of the corresponding fractal maps (ordered according to the acquisition dates), after a proper co-registration. The time series of the pixel-by-pixel estimated fractal dimension values show that, over invariant natural areas, the fractal dimension values do not reveal significant changes; on the contrary, over urban areas, the estimate correctly assumes values outside the fractality range of natural surfaces and shows strong fluctuations. As a final result of our analysis, we generate a fractal map that includes only the areas where the fractal dimension is considered reliable and stable (i.e., whose standard deviation computed over the time series is reasonably small). The so-obtained fractal dimension map is then used to identify areas that are homogeneous from a fractal viewpoint. Indeed, the analysis of this map reveals the presence of two distinctive landscape units corresponding to Mt. Vesuvio and Gran Cono. The comparison with the (simplified) geological map clearly shows the presence in these two areas of volcanic products of different ages. The presented fractal dimension map analysis demonstrates the ability to gauge the degree of evolution of the monitored volcanic edifice and can be profitably extended in the future to other volcanic systems with
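The stability-masking step — stacking the co-registered fractal maps and keeping only pixels whose temporal standard deviation is small — can be sketched as follows; the stack and the threshold value are synthetic stand-ins, not the Cosmo-SkyMed data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the co-registered stack of 42 fractal maps
# (time, rows, cols); real maps would come from SAR fractal estimation.
stack = rng.normal(2.2, 0.02, size=(42, 100, 100))
stack[:, :20, :20] += rng.normal(0, 0.5, size=(42, 20, 20))  # "urban" block fluctuates

mean_map = stack.mean(axis=0)
std_map = stack.std(axis=0)

# Keep only pixels whose fractal dimension is stable over the time series
threshold = 0.1                      # assumed stability threshold
stable = std_map < threshold
reliable_map = np.where(stable, mean_map, np.nan)
print(f"{stable.mean():.1%} of pixels retained as reliable")
```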

  11. Structural Analysis of Der p 1–Antibody Complexes and Comparison with Complexes of Proteins or Peptides with Monoclonal Antibodies

    Energy Technology Data Exchange (ETDEWEB)

    Osinski, Tomasz; Pomés, Anna; Majorek, Karolina A.; Glesner, Jill; Offermann, Lesa R.; Vailes, Lisa D.; Chapman, Martin D.; Minor, Wladek; Chruszcz, Maksymilian (INDOOR); (UV); (SC)

    2015-05-29

    Der p 1 is a major allergen from the house dust mite, Dermatophagoides pteronyssinus, that belongs to the papain-like cysteine protease family. To investigate the antigenic determinants of Der p 1, we determined two crystal structures of Der p 1 in complex with the Fab fragments of mAbs 5H8 or 10B9. Epitopes for these two Der p 1–specific Abs are located in different, nonoverlapping parts of the Der p 1 molecule. Nevertheless, surface area and identity of the amino acid residues involved in hydrogen bonds between allergen and Ab are similar. The epitope for mAb 10B9 only showed a partial overlap with the previously reported epitope for mAb 4C1, a cross-reactive mAb that binds Der p 1 and its homolog Der f 1 from Dermatophagoides farinae. Upon binding to Der p 1, the Fab fragment of mAb 10B9 was found to form a very rare α helix in its third CDR of the H chain. To provide an overview of the surface properties of the interfaces formed by the complexes of Der p 1–10B9 and Der p 1–5H8, along with the complexes of 4C1 with Der p 1 and Der f 1, a broad analysis of the surfaces and hydrogen bonds of all complexes of Fab–protein or Fab–peptide was performed. This work provides detailed insight into the cross-reactive and specific allergen–Ab interactions in group 1 mite allergens. The surface data of Fab–protein and Fab–peptide interfaces can be used in the design of conformational epitopes with reduced Ab binding for immunotherapy.

  12. Thermochemical analysis on rare earth complex of gadolinium with salicylic acid and 8-hydroxyquinoline

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Sheng-Xiong, E-mail: 54xsx@163.com [Hunan Provincial Key Laboratory of Rare-Precious Metals Compounds and Applications, Department of Chemistry and Life Science, Xiangnan University, Chenzhou 423000, Hunan Province (China); Li, Ai-Tao; Jiang, Jian-Hong; Huang, Shuang; Xu, Xiao-Yan; Li, Qiang-Guo [Hunan Provincial Key Laboratory of Rare-Precious Metals Compounds and Applications, Department of Chemistry and Life Science, Xiangnan University, Chenzhou 423000, Hunan Province (China)

    2012-11-20

    Highlights: A novel rare earth complex, Gd(C7H5O3)2·(C9H6NO), was synthesized and characterized. The dissolution enthalpies of the relevant substances were determined. The enthalpy change of the reaction was determined to be (211.54 ± 0.69) kJ mol⁻¹. The standard molar enthalpy of formation of the complex was −(1890.7 ± 3.1) kJ mol⁻¹. - Abstract: The rare earth complex, Gd(C7H5O3)2·(C9H6NO), was synthesized by the reaction of gadolinium nitrate hexahydrate with salicylic acid (C7H6O3) and 8-hydroxyquinoline (C9H7NO). It was characterized by elemental analysis, UV spectra, IR spectra, molar conductance and thermogravimetric analysis. In an optimized calorimetric solvent, the dissolution enthalpies were determined with an advanced solution-reaction isoperibol microcalorimeter, respectively: ΔsHmΘ [2 C7H6O3(s) + C9H7NO(s), 298.15 K] = (41.95 ± 0.44) kJ mol⁻¹, ΔsHmΘ [Gd(NO3)3·6H2O(s), 298.15 K] = −(29.11 ± 0.39) kJ mol⁻¹, ΔsHmΘ [Gd(C7H5O3)2·(C9H6NO)(s), 298.15 K] = −(46.99 ± 0.39) kJ mol⁻¹ and ΔsHmΘ [Solution D(aq), 298.15 K] = −(90.33 ± 0.37) kJ mol⁻¹. The enthalpy change of the synthesis reaction was estimated to be ΔrHmΘ = (211.54 ± 0.69) kJ mol⁻¹. From data in the literature, through Hess's law, the standard molar enthalpy of formation of Gd(C7H5O3)2·(C9H6NO)(s) was calculated to be ΔfHmΘ [Gd(C7H5O3)2·(C9H

  13. System-wide analysis reveals a complex network of tumor-fibroblast interactions involved in tumorigenicity.

    Directory of Open Access Journals (Sweden)

    Megha Rajaram

    Full Text Available Many fibroblast-secreted proteins promote tumorigenicity, and several factors secreted by cancer cells have in turn been proposed to induce these proteins. It is not clear whether there are single dominant pathways underlying these interactions or whether they involve multiple pathways acting in parallel. Here, we identified 42 fibroblast-secreted factors induced by breast cancer cells using comparative genomic analysis. To determine what fraction was active in promoting tumorigenicity, we chose five representative fibroblast-secreted factors for in vivo analysis. We found that the majority (three out of five) played equally major roles in promoting tumorigenicity, and intriguingly, each one had distinct effects on the tumor microenvironment. Specifically, fibroblast-secreted amphiregulin promoted breast cancer cell survival, whereas the chemokine CCL7 stimulated tumor cell proliferation while CCL2 promoted innate immune cell infiltration and angiogenesis. The other two factors tested had minor (CCL8) or minimally significant (STC1) effects on the ability of fibroblasts to promote tumor growth. The importance of parallel interactions between fibroblasts and cancer cells was tested by simultaneously targeting fibroblast-secreted amphiregulin and the CCL7 receptor on cancer cells, and this was significantly more efficacious than blocking either pathway alone. We further explored the concept of parallel interactions by testing the extent to which induction of critical fibroblast-secreted proteins could be achieved by single, previously identified, factors produced by breast cancer cells. We found that although single factors could induce a subset of genes, even combinations of factors failed to induce the full repertoire of functionally important fibroblast-secreted proteins. Together, these results delineate a complex network of tumor-fibroblast interactions that act in parallel to promote tumorigenicity and suggest that effective anti

  14. Sensitivity analysis of complex models: Coping with dynamic and static inputs

    International Nuclear Information System (INIS)

    Anstett-Collin, F.; Goffart, J.; Mara, T.; Denis-Vidal, L.

    2015-01-01

    In this paper, we address the issue of conducting a sensitivity analysis of complex models with both static and dynamic uncertain inputs. While several approaches have been proposed to compute the sensitivity indices of the static inputs (i.e., parameters), those of the dynamic inputs (i.e., stochastic fields) have rarely been addressed. For this purpose, we first treat each dynamic input as a Gaussian process. Then, the truncated Karhunen–Loève expansion of each dynamic input is performed. Such an expansion makes it possible to generate independent Gaussian processes from a finite number of independent random variables. Given that a dynamic input is represented by a finite number of random variables, its variance-based sensitivity index is defined as the sensitivity index of this group of variables. Besides, an efficient sampling-based strategy is described to estimate the first-order indices of all the input factors using only two input samples. The approach is applied to a building energy model in order to assess the impact of the uncertainties of the material properties (static inputs) and the weather data (dynamic inputs) on the energy performance of a real low-energy-consumption house. - Highlights: • Sensitivity analysis of models with uncertain static and dynamic inputs is performed. • Karhunen–Loève (KL) decomposition of the spatio-temporal inputs is performed. • The influence of the dynamic inputs is studied through the modes of the KL expansion. • The proposed approach is applied to a building energy model. • The impact of weather data and material properties on the performance of a real house is given
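A minimal discrete Karhunen–Loève expansion of a Gaussian process, of the kind used here to reduce each dynamic input to a finite set of independent random variables, might look like this; the squared-exponential covariance and the correlation length are illustrative choices, not the paper's weather-data model:

```python
import numpy as np

# Discretized Karhunen-Loeve expansion of a zero-mean Gaussian process
# with a squared-exponential covariance (illustrative choice).
t = np.linspace(0.0, 1.0, 200)
ell = 0.1                                         # correlation length (assumed)
C = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * ell**2))

# Eigendecomposition of the covariance matrix gives the discrete KL modes
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Truncate: keep enough modes to capture 99% of the variance
m = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.99)) + 1

# Each realization is driven by m independent standard normal variables,
# which become the "random variables" entering the sensitivity analysis.
rng = np.random.default_rng(2)
xi = rng.standard_normal(m)
realization = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)
print(f"{m} KL modes retained out of {len(t)}")
```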

  15. Deciphering the spatio-temporal complexity of climate change of the last deglaciation: a model analysis

    Directory of Open Access Journals (Sweden)

    D. M. Roche

    2011-06-01

    Full Text Available Understanding the sequence of events occurring during the last major glacial to interglacial transition (21 ka BP to 9 ka BP) is a challenging task that has the potential to unveil the mechanisms behind large-scale climate changes. Though many studies have focused on understanding the complex sequence of rapid climatic changes that accompanied or interrupted the deglaciation, few have analysed it in a more theoretical framework with simple forcings. In the following, we address when and where the first significant temperature anomalies appeared under the slowly varying forcings of the last deglaciation. We used coupled transient simulations of the last deglaciation, including ocean, atmosphere and vegetation components, to analyse the spatial timing of the deglaciation. To keep the analysis in a simple framework, we did not include freshwater forcings that potentially cause rapid climate shifts during that time period. We aimed to disentangle the direct and subsequent responses of the climate system to slow forcing and, moreover, the locations where those changes are most clearly expressed. From a data–modelling comparison perspective, this could help understand the physically plausible phasing between known forcings and recorded climatic changes. Our analysis of climate variability could also help to distinguish deglacial warming signals from internal climate variability. We are thus able to better pinpoint the onset of local deglaciation, as defined by the first significant local warming, and further show that there is a large regional variability associated with it, even with the set of slow forcings used here. In our model, the first significant hemispheric warming occurred simultaneously in the North and in the South and is a direct response to the obliquity forcing.

  16. Model complexity in carbon sequestration:A design of experiment and response surface uncertainty analysis

    Science.gov (United States)

    Zhang, Y.; Li, S.

    2014-12-01

    Geologic carbon sequestration (GCS) is proposed for the Nugget Sandstone in Moxa Arch, a regional saline aquifer with a large storage potential. For the proposed storage site, this study builds a suite of increasingly complex conceptual "geologic" model families, using subsets of the site characterization data: a homogeneous model family, a stationary petrophysical model family, a stationary facies model family with sub-facies petrophysical variability, and a non-stationary facies model family (with sub-facies variability) conditioned to soft data. These families, representing alternative conceptual site models built with increasing data, were simulated with the same CO2 injection test (50 years at 1/10 Mt per year), followed by 2950 years of monitoring. Using the Design of Experiment, an efficient sensitivity analysis (SA) is conducted for all families, systematically varying uncertain input parameters. Results are compared among the families to identify parameters that have a first-order impact on predicting the CO2 storage ratio (SR) at both the end of injection and the end of monitoring. At this site, geologic modeling factors do not significantly influence the short-term prediction of the storage ratio, although they become important over the monitoring time, but only for those families where such factors are accounted for. Based on the SA, a response surface analysis is conducted to generate prediction envelopes of the storage ratio, which are compared among the families at both times. Results suggest a large uncertainty in the predicted storage ratio given the uncertainties in model parameters and modeling choices: SR varies from 5–60% (end of injection) to 18–100% (end of monitoring), although its variation among the model families is relatively minor. Moreover, long-term leakage risk is considered small at the proposed site. In the lowest-SR scenarios, all families predict gravity-stable supercritical CO2 migrating toward the bottom of the aquifer. In the highest
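The Design-of-Experiment plus response-surface step can be illustrated with a toy two-factor, three-level design; the quadratic surface form is standard, but the factors, the levels, and the stand-in "simulator" below are invented for illustration, not the study's reservoir model:

```python
import numpy as np
from itertools import product

# Hypothetical two-factor, three-level full-factorial design over scaled
# uncertain inputs (e.g., permeability and porosity multipliers).
levels = [-1.0, 0.0, 1.0]
X = np.array(list(product(levels, levels)))       # 9 design points

# Stand-in "simulator": storage ratio as a smooth response of the inputs
def storage_ratio(x):
    return 0.4 + 0.15 * x[0] - 0.05 * x[1] + 0.05 * x[0] * x[1] - 0.02 * x[0] ** 2

y = np.array([storage_ratio(x) for x in X])

# Quadratic response surface: 1, x1, x2, x1*x2, x1^2, x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Prediction envelope over a dense grid of the scaled input space
g = np.linspace(-1, 1, 51)
G1, G2 = np.meshgrid(g, g)
P = np.column_stack([np.ones(G1.size), G1.ravel(), G2.ravel(),
                     (G1 * G2).ravel(), (G1 ** 2).ravel(), (G2 ** 2).ravel()])
pred = P @ coef
print(f"predicted storage ratio range: {pred.min():.2f} to {pred.max():.2f}")
```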

  17. Patient Safety Culture Survey in Pediatric Complex Care Settings: A Factor Analysis.

    Science.gov (United States)

    Hessels, Amanda J; Murray, Meghan; Cohen, Bevin; Larson, Elaine L

    2017-04-19

    Children with complex medical needs are increasing in number and demanding the services of pediatric long-term care facilities (pLTC), which require a focus on patient safety culture (PSC). However, no tool to measure PSC has been tested in this unique hybrid acute care-residential setting. The objective of this study was to evaluate the psychometric properties of the Nursing Home Survey on Patient Safety Culture tool, slightly modified for use in the pLTC setting. Factor analyses were performed on data collected from 239 staff at 3 pLTC in 2012. Items were screened by principal axis factoring, and the original structure was tested using confirmatory factor analysis. Exploratory factor analysis was conducted to identify the best model fit for the pLTC data, and factor reliability was assessed by Cronbach alpha. The extracted, rotated factor solution suggested that items in 4 (staffing, nonpunitive response to mistakes, communication openness, and organizational learning) of the original 12 dimensions may not be a good fit for this population. Nevertheless, in the pLTC setting, both the original and the modified factor solutions demonstrated reliabilities similar to the published consistencies of the survey when tested in adult nursing homes, and the items factored nearly identically to the theorized structure. This study demonstrates that the Nursing Home Survey on Patient Safety Culture, with minimal modification, may be an appropriate instrument to measure PSC in pLTC settings. Additional psychometric testing is recommended to further validate the use of this instrument in this setting, including examining the relationship to safety outcomes. Increased use will yield data for benchmarking purposes across these specialized settings to inform frontline workers and organizational leaders of areas of strength and opportunities for improvement.
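Cronbach alpha, used here to assess factor reliability, is straightforward to compute from an item-score matrix; the simulated Likert responses below are purely illustrative, not the study's survey data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses for a 4-item dimension: a shared
# latent trait per respondent plus item-level noise (invented data).
rng = np.random.default_rng(3)
trait = rng.normal(3.0, 0.8, size=(239, 1))
items = np.clip(np.round(trait + rng.normal(0, 0.6, size=(239, 4))), 1, 5)
alpha = cronbach_alpha(items)
print(f"Cronbach alpha = {alpha:.2f}")
```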

  18. Complex Retrieval of Embedded IVC Filters: Alternative Techniques and Histologic Tissue Analysis

    International Nuclear Information System (INIS)

    Kuo, William T.; Cupp, John S.; Louie, John D.; Kothary, Nishita; Hofmann, Lawrence V.; Sze, Daniel Y.; Hovsepian, David M.

    2012-01-01

    Purpose: We evaluated the safety and effectiveness of alternative endovascular methods to retrieve embedded optional and permanent filters in order to manage or reduce risk of long-term complications from implantation. Histologic tissue analysis was performed to elucidate the pathologic effects of chronic filter implantation. Methods: We studied the safety and effectiveness of alternative endovascular methods for removing embedded inferior vena cava (IVC) filters in 10 consecutive patients over 12 months. Indications for retrieval were symptomatic chronic IVC occlusion, caval and aortic perforation, and/or acute PE (pulmonary embolism) from filter-related thrombus. Retrieval was also performed to reduce risk of complications from long-term filter implantation and to eliminate the need for lifelong anticoagulation. All retrieved specimens were sent for histologic analysis. Results: Retrieval was successful in all 10 patients. Filter types and implantation times were as follows: one Venatech (1,495 days), one Simon-Nitinol (1,485 days), one Optease (300 days), one G2 (416 days), five Günther-Tulip (GTF; mean 606 days, range 154–1,010 days), and one Celect (124 days). There were no procedural complications or adverse events at a mean follow-up of 304 days after removal (range 196–529 days). Histology revealed scant native intima surrounded by a predominance of neointimal hyperplasia and dense fibrosis in all specimens. Histologic evidence of photothermal tissue ablation was confirmed in three laser-treated specimens. Conclusion: Complex retrieval methods can now be used in select patients to safely remove embedded optional and permanent IVC filters previously considered irretrievable. Neointimal hyperplasia and dense fibrosis are the major components that must be separated to achieve successful retrieval of chronic filter implants.

  19. Complex analysis of energy efficiency in operated high-rise residential building: Case study

    Directory of Open Access Journals (Sweden)

    Korniyenko Sergey

    2018-01-01

    Full Text Available Energy conservation and human thermal comfort enhancement in buildings is a topical issue of modern architecture and construction. An innovative solution to this problem makes it possible to enhance the ecological and maintenance safety of buildings, to reduce hydrocarbon fuel consumption, and to improve people's standard of living. The requirements for increased energy efficiency in buildings should be met at all stages of the building's life cycle, that is, at the stages of design, construction and maintenance. The purpose of this research is a complex analysis of energy efficiency in an operated high-rise residential building. Many measures for building energy efficiency were realized according to the project, chiefly an effective building envelope and engineering systems. Based on the results of measurements, the energy indicators of the building over an annual period were calculated. The main reason for the increased heat losses is the elevated infiltration of outside air into the building through the building envelope, owing to the increased air permeability of windows and balcony doors (construction defects). Thermorenovation of the building, based on reducing ventilation and infiltration heat losses through the building envelope, allows annual energy consumption to be reduced. An energy efficiency assessment based on the total annual energy consumption of the building, including energy indices for heating and ventilation, hot water supply and electricity supply, is more complete than one based on heating alone. Accounting for the various components of the building energy balance fully corresponds to the modern direction of research on energy conservation and thermal comfort enhancement in buildings.

  20. A novel four-dimensional analytical approach for analysis of complex samples.

    Science.gov (United States)

    Stephan, Susanne; Jakob, Cornelia; Hippler, Jörg; Schmitz, Oliver J

    2016-05-01

    A two-dimensional LC (2D-LC) method, based on the work of Erni and Frei in 1978, was developed and coupled to an ion mobility-high-resolution mass spectrometer (IM-MS), which enabled the separation of complex samples in four dimensions (2D-LC, ion mobility spectrometry (IMS), and mass spectrometry (MS)). This approach works as a continuous multi-heart-cutting LC system, using a long modulation time of 4 min, which allows the complete transfer of most of the first-dimension peaks to the second-dimension column without fractionation, in contrast to comprehensive two-dimensional liquid chromatography. Hence, each compound delivers only one peak in the second dimension, which simplifies the data handling even when ion mobility spectrometry as a third and mass spectrometry as a fourth dimension are introduced. The analysis of a plant extract from Ginkgo biloba shows the separation power of this four-dimensional separation method, with a calculated total peak capacity of more than 8700. Furthermore, the advantage of ion mobility for characterizing unknown compounds by their collision cross section (CCS) and accurate mass in a non-target approach is shown for different matrices such as plant extracts and coffee. Graphical abstract: Principle of the four-dimensional separation.
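The quoted total peak capacity arises multiplicatively from the individual separation dimensions, assuming full orthogonality; the per-dimension values below are invented to show the arithmetic, not the study's figures:

```python
# Total peak capacity of orthogonal, coupled separation dimensions is
# commonly approximated as the product of the per-dimension capacities.
# The numbers below are illustrative only.
n_lc1 = 20      # first-dimension LC peak capacity (assumed)
n_lc2 = 25      # second-dimension LC peak capacity (assumed)
n_ims = 18      # ion mobility drift dimension peak capacity (assumed)

total = n_lc1 * n_lc2 * n_ims
print(f"approximate total peak capacity: {total}")
```

Real, partially correlated dimensions yield a lower effective capacity, so such products are upper bounds.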

  1. Brownian motion of polyphosphate complexes in yeast vacuoles: characterization by fluorescence microscopy with image analysis.

    Science.gov (United States)

    Puchkov, Evgeny O

    2010-06-01

    In the vacuoles of Saccharomyces cerevisiae yeast cells, vividly moving insoluble polyphosphate complexes (IPCs) have been observed. The aims of this study were to characterize the movement of the IPCs and to evaluate the viscosity in the vacuoles using the obtained data. Studies were conducted on S. cerevisiae cells stained by DAPI and on fluorescein isothiocyanate-labelled latex microspheres, using fluorescence microscopy combined with computer image analysis (ImageJ software, NIH, USA). IPC movement was photorecorded and shown to be Brownian motion. On latex microspheres, a methodology was developed for measuring a fluorescing particle's two-dimensional (2D) displacements and its size. In four yeast cells, the 2D displacements and sizes of the IPCs were evaluated. Apparent viscosity values in the vacuoles of the cells, computed with the Einstein-Smoluchowski equation using the obtained data, were found to be 2.16 +/- 0.60, 2.52 +/- 0.63, 3.32 +/- 0.9 and 11.3 +/- 1.7 cP. The first three viscosity values correspond to 30-40% glycerol solutions. The viscosity value of 11.3 +/- 1.7 cP was presumed to be an overestimate, caused by the peculiarities of the vacuole structure and/or volume in this particular cell. This conclusion was supported by the particular character of the set of Brownian motion trajectories in this cell as compared to the other three cells.
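The viscosity estimate from 2D Brownian displacements combines the two-dimensional mean-squared-displacement relation ⟨d²⟩ = 4·D·Δt with the Stokes–Einstein (Einstein–Smoluchowski) relation D = kB·T/(6πηr); a sketch with invented measurements (particle radius, frame interval and displacements are not the study's values):

```python
import numpy as np

kB = 1.380649e-23          # Boltzmann constant [J/K]

def viscosity_from_2d_displacements(disp_m, dt_s, radius_m, temp_k=298.15):
    """Apparent viscosity via the Einstein-Smoluchowski relation.

    For 2D Brownian motion <d^2> = 4*D*dt, and Stokes-Einstein gives
    D = kB*T / (6*pi*eta*r), so eta = kB*T / (6*pi*r*D).
    """
    msd = np.mean(np.asarray(disp_m) ** 2)   # mean squared 2D displacement
    D = msd / (4.0 * dt_s)                   # diffusion coefficient [m^2/s]
    return kB * temp_k / (6.0 * np.pi * radius_m * D)

# Hypothetical measurements: 0.5 um particle, frame-to-frame displacements
disp = np.array([0.30, 0.25, 0.35, 0.28, 0.32]) * 1e-6   # [m]
eta = viscosity_from_2d_displacements(disp, dt_s=0.5, radius_m=0.25e-6)
print(f"apparent viscosity = {eta * 1e3:.2f} mPa*s")     # 1 mPa*s = 1 cP
```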

  2. Histological analysis of thelohaniasis in white-clawed crayfish Austropotamobius pallipes complex

    Directory of Open Access Journals (Sweden)

    Quaglio F.

    2011-09-01

    Full Text Available From 2004 to 2006, a parasitological survey aimed at the detection of the microsporidian parasite Thelohania contejeani Henneguy was carried out on 177 wild white-clawed crayfish (Austropotamobius pallipes complex) captured in six streams and rivers of the province of Belluno in north-eastern Italy. Microscopical examination of the skeletal muscles, and histological analysis applying different histochemical stains to full transverse and sagittal sections of the cephalothorax and abdomen, were carried out. Transmission electron microscopy (TEM) was also conducted on the parasites recovered during the survey. Out of the 177 crayfish examined, Thelohania contejeani (Microsporidia, Thelohaniidae) was present in only one crayfish, from the Vena d’oro creek. The parasite was detected in the skeletal muscles in several developmental stages, including mature spores, which represented the most common stage recovered. Sporophorous vesicles were also present. Histological examination revealed that the fibres of the skeletal, cardiac and intestinal muscles were filled with spores. Melanin infiltrations were focally present in the infected striated muscles. The gill phagocytic nephrocytes were engulfed by small masses of spores. Among the staining techniques applied, Crossman’s trichrome stain represented the most effective method of detecting T. contejeani.

  3. Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing

    Science.gov (United States)

    Meng, Bo; Cheng, Lihong

    2017-01-01

    The rise of global value chains (GVCs), characterized by so-called “outsourcing”, “fragmentation production”, and “trade in tasks”, has been considered one of the most important phenomena of 21st-century trade. GVCs can also play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013), in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs, as well as of the interdependency of countries in these GVCs, which is generally invisible from traditional trade statistics. PMID:28081201
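The kind of weighted directed network built from bilateral value-added flows can be sketched with plain Python; the flow matrix below is invented, and a real analysis would use the Koopman/Wang decomposition plus richer network measures (centrality, community detection):

```python
# Toy bilateral value-added flow matrix (units: billion USD, invented):
# flows[i][j] = value added originating in country i absorbed by country j.
flows = {
    "CHN": {"USA": 120.0, "DEU": 40.0, "JPN": 30.0},
    "DEU": {"USA": 60.0, "CHN": 25.0, "JPN": 10.0},
    "JPN": {"CHN": 45.0, "USA": 35.0, "DEU": 8.0},
    "USA": {"CHN": 20.0, "DEU": 15.0, "JPN": 12.0},
}

countries = sorted(flows)
out_strength = {c: sum(flows[c].values()) for c in countries}
in_strength = {c: sum(flows[src].get(c, 0.0) for src in countries)
               for c in countries}

# A simple hub indicator: each country's share of total network value
# added, counting it as both origin and destination.
total = sum(out_strength.values())
for c in countries:
    share = (out_strength[c] + in_strength[c]) / (2 * total)
    print(f"{c}: out={out_strength[c]:6.1f}  in={in_strength[c]:6.1f}  share={share:.2%}")
```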

  4. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.

    2017-07-04

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  5. Stereological analysis of neuron, glial and endothelial cell numbers in the human amygdaloid complex.

    Directory of Open Access Journals (Sweden)

    María García-Amado

    Full Text Available Cell number alterations in the amygdaloid complex (AC) might coincide with neurological and psychiatric pathologies with anxiety imbalances, as well as with changes in brain functionality during aging. This stereological study focused on estimating, in samples from 7 control individuals aged 20 to 75 years, the number and density of neurons, glia and endothelial cells in the entire AC and in its 5 nuclear groups (including the basolateral (BL), corticomedial and central groups), 5 nuclei and 13 nuclear subdivisions. The volume and total cell number in these territories were determined on Nissl-stained sections with the Cavalieri principle and the optical fractionator. The AC mean volume was 956 mm^3 and mean cell numbers (x10^6) were: 15.3 neurons, 60 glial cells and 16.8 endothelial cells. The numbers of endothelial cells and neurons were similar in each AC region and were one fourth the number of glial cells. Analysis of the influence of the individuals' age at death on volume, cell number and density in each of these 24 AC regions suggested that aging does not affect regional size or the amount of glial cells, but that neuron and endothelial cell numbers respectively tended to decrease and increase in territories such as the AC or BL. These accurate stereological measures of volume and total cell numbers and densities in the AC of control individuals could serve as appropriate reference values to evaluate subtle alterations in this structure in pathological conditions.
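The two stereological estimators named here are simple products; the sampling parameters below are invented, chosen only so the arithmetic reproduces the reported magnitudes (956 mm^3 volume; 15.3 x 10^6 neurons):

```python
# Cavalieri principle: V = t * (a/p) * sum(P), where t is the section
# spacing, a/p the grid area per point, and P the points hitting the region.
t_mm = 0.5                    # section spacing [mm] (assumed)
area_per_point_mm2 = 0.04     # grid area per point [mm^2] (assumed)
points_hit = 47_800           # sum(P) over all sections (assumed)
volume_mm3 = t_mm * area_per_point_mm2 * points_hit

# Optical fractionator: N = sum(Q) * (1/ssf) * (1/asf) * (1/tsf), with
# section, area and thickness sampling fractions (all values assumed).
q_counted = 612               # neurons counted in the disectors
ssf, asf, tsf = 1 / 100, 1 / 50, 1 / 5
n_neurons = q_counted / (ssf * asf * tsf)

print(f"V = {volume_mm3:.0f} mm^3, N = {n_neurons / 1e6:.1f} x 10^6 neurons")
```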

  6. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both the systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
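The systematic and random error metrics referred to above can be sketched as a mean bias plus a bias-removed (centered) RMSE; this is a generic decomposition, not necessarily the exact metrics used in the study.

```python
import math

def systematic_error(sim, obs):
    """Mean bias of simulated values against observations."""
    return sum(s - o for s, o in zip(sim, obs)) / len(sim)

def random_error(sim, obs):
    """Centered (bias-removed) root-mean-square error."""
    b = systematic_error(sim, obs)
    return math.sqrt(sum((s - o - b) ** 2 for s, o in zip(sim, obs)) / len(sim))

# Hypothetical satellite vs. reference event depths:
sat = [2.0, 4.0, 6.0]
ref = [3.0, 4.0, 5.0]
bias = systematic_error(sat, ref)
crmse = random_error(sat, ref)
```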

  7. Convergence analysis of the alternating RGLS algorithm for the identification of the reduced complexity Volterra model.

    Science.gov (United States)

    Laamiri, Imen; Khouaja, Anis; Messaoud, Hassani

    2015-03-01

    In this paper we provide a convergence analysis of the alternating RGLS (Recursive Generalized Least Squares) algorithm used for the identification of the reduced complexity Volterra model describing stochastic non-linear systems. The reduced Volterra model used is the 3rd order SVD-PARAFAC-Volterra model obtained using the Singular Value Decomposition (SVD) and the Parallel Factor (PARAFAC) tensor decomposition of the quadratic and the cubic kernels, respectively, of the classical Volterra model. The Alternating RGLS (ARGLS) algorithm consists of executing the classical RGLS algorithm in an alternating way. The ARGLS convergence was proved using the Ordinary Differential Equation (ODE) method. It is noted that the algorithm convergence cannot be ensured when the disturbance acting on the system to be identified has specific features. The ARGLS algorithm is tested in simulation on a numerical example satisfying the determined convergence conditions. To demonstrate the merits of the proposed algorithm, we compare it with the classical Alternating Recursive Least Squares (ARLS) algorithm presented in the literature. The comparison is carried out on a non-linear satellite channel and a benchmark CSTR (Continuous Stirred Tank Reactor) system. Moreover, the efficiency of the proposed identification approach is proved on an experimental Communicating Two Tank System (CTTS). Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
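As a sketch of the recursive estimation at the core of the algorithm, the single-parameter recursive least squares update below shows the gain, correction and covariance steps; the full ARGLS recursion operates on the SVD-PARAFAC-Volterra parameter vectors and is considerably more involved.

```python
def rls_scalar(xs, ys, lam=1.0, theta0=0.0, p0=1000.0):
    """One-parameter recursive least squares for y = theta * x
    (lam is the forgetting factor; p0 the initial covariance)."""
    theta, p = theta0, p0
    for x, y in zip(xs, ys):
        k = p * x / (lam + x * p * x)   # gain
        theta += k * (y - x * theta)    # prediction-error correction
        p = (p - k * x * p) / lam       # covariance update
    return theta

# Noise-free data generated with theta = 2; the estimate converges toward 2:
theta_hat = rls_scalar([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```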

  8. Accelerated fluctuation analysis by graphic cards and complex pattern formation in financial markets

    International Nuclear Information System (INIS)

    Preis, Tobias; Virnau, Peter; Paul, Wolfgang; Schneider, Johannes J

    2009-01-01

    The compute unified device architecture is an almost conventional programming approach for managing computations on a graphics processing unit (GPU) as a data-parallel computing device. With a maximum number of 240 cores in combination with a high memory bandwidth, a recent GPU offers resources for computational physics. We apply this technology to methods of fluctuation analysis, which includes determination of the scaling behavior of a stochastic process and the equilibrium autocorrelation function. Additionally, the recently introduced pattern formation conformity (Preis T et al 2008 Europhys. Lett. 82 68005), which quantifies pattern-based complex short-time correlations of a time series, is calculated on a GPU and analyzed in detail. Results are obtained up to 84 times faster than on a current central processing unit core. When we apply this method to high-frequency time series of the German BUND future, we find significant pattern-based correlations on short time scales. Furthermore, an anti-persistent behavior can be found on short time scales. Additionally, we compare the recent GPU generation, which provides a theoretical peak performance of up to roughly 10¹² floating point operations per second, with the previous one.
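The equilibrium autocorrelation function mentioned above is straightforward to state; a minimal (CPU-side) Python sketch of the estimator is:

```python
def autocorrelation(series, lag):
    """Sample autocorrelation of a time series at a given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

# A perfectly alternating series is fully anti-persistent at lag 1:
acf1 = autocorrelation([1.0, -1.0] * 50, lag=1)   # -1.0
```

The GPU version parallelizes the sums over i; the estimator itself is unchanged.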

  9. NMR spectroscopic and analytical ultracentrifuge analysis of membrane protein detergent complexes

    Directory of Open Access Journals (Sweden)

    Choe Senyon

    2007-11-01

    Full Text Available Background: Structural studies of integral membrane proteins (IMPs are hampered by inherent difficulties in their heterologous expression and in the purification of solubilized protein-detergent complexes (PDCs. The choice and concentrations of detergents used in an IMP preparation play a critical role in protein homogeneity and are thus important for successful crystallization. Results: Seeking an effective and standardized means applicable to genomic approaches for the characterization of PDCs, we chose 1D-NMR spectroscopic analysis to monitor the detergent content throughout their purification: protein extraction, detergent exchange, and sample concentration. We demonstrate that a single NMR measurement combined with a SDS-PAGE of a detergent extracted sample provides a useful gauge of the detergent's extraction potential for a given protein. Furthermore, careful monitoring of the detergent content during the process of IMP production allows for a high level of reproducibility. We also show that in many cases a simple sedimentation velocity measurement provides sufficient data to estimate both the oligomeric state and the detergent-to-protein ratio in PDCs, as well as to evaluate the homogeneity of the samples prior to crystallization screening. Conclusion: The techniques presented here facilitate the screening and selection of the extraction detergent, as well as help to maintain reproducibility in the detergent exchange and PDC concentration procedures. Such reproducibility is particularly important for the optimization of initial crystallization conditions, for which multiple purifications are routinely required.

  10. Complex young lives: a collective qualitative case study analysis of young fatherhood and breastfeeding.

    Science.gov (United States)

    Ayton, Jennifer; Hansen, Emily

    2016-01-01

    Of all births in Australia, 10% are to young fathers aged less than 24 years. How young fathers experience any breastfeeding and how this is shaped by their social context is poorly understood. Our aim is to increase understanding of the lived experience of young fathers (aged less than 24 years) and to explore the way they speak about breastfeeding in the context of their lives and parenting. This collective case study analysis uses qualitative data from interviews and focus groups with young fathers (aged less than 24 years) and community support staff. The research was undertaken in Tasmania, Australia, March to December 2013. Young fathers in our study had complex social and emotional circumstances that meant breastfeeding was not a high priority, despite them valuing the health benefits of breastfeeding for their babies. If supported by peers and their community, they appear to have a more positive parenting experience. Breastfeeding, although understood by the young fathers in our study as healthy and desirable, is not a priority in their lives. Skills in parenting and in supporting their partners to breastfeed may be more effectively gained through mentoring and father-to-father localized community-based support services.

  11. Micro-earthquake signal analysis and hypocenter determination around Lokon volcano complex

    Energy Technology Data Exchange (ETDEWEB)

    Firmansyah, Rizky, E-mail: rizkyfirmansyah@hotmail.com [Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Bandung, 40132 (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Global Geophysical Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Bandung, 40132 (Indonesia); Kristianto, E-mail: kris@vsi.esdm.go.id [Center for Volcanology and Geological Hazard Mitigation (CVGHM), Geological Agency, Bandung, 40122 (Indonesia)

    2015-04-24

    Mount Lokon is one of five active volcanoes located in the North Sulawesi region. Since June 26th, 2011, a standby alert has been set by the Center for Volcanology and Geological Hazard Mitigation (CVGHM) for this mountain. The Mount Lokon volcano erupted on July 4th, 2011 and continued to erupt until August 28th, 2011. Due to its high seismic activity, this study focuses on the analysis of micro-earthquake signals and the determination of micro-earthquake hypocenter locations around the Lokon-Empung volcano complex before the 2011 eruption phase (time period of January 2009 up to March 2010). Determination of the hypocenter location was conducted with the Geiger Adaptive Damping (GAD) method. We used an initial model from a previous study at Volcan de Colima, Mexico. The model was selected because of the characteristics shared between Mount Lokon and Colima, both being andesitic stratovolcanoes with small Plinian and Vulcanian explosion types. In this study, event picking was limited to volcano-tectonic events of A and B types, hybrid and long-period events with a clear signal onset, and local tectonic events with a maximum S-P time of not more than three seconds. As a result, we observed that the micro-earthquakes occurred in the area north-west of the Mount Lokon region.
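The three-second S-P limit used for event selection maps directly to an epicentral distance under assumed uniform velocities; a small sketch (vp and vs are illustrative values, not from the study):

```python
def sp_distance_km(sp_seconds, vp=6.0, vs=3.5):
    """Distance implied by an S-P arrival-time difference for uniform
    P and S velocities (km/s): d = dt * vp * vs / (vp - vs)."""
    return sp_seconds * vp * vs / (vp - vs)

# The 3 s selection limit corresponds, with these assumed velocities, to:
d = sp_distance_km(3.0)
```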

  12. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its more important aspects such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common-mode failures, human errors, the data bases used in the calculations, and the uncertainty evaluation of the results are discussed separately, each in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies in the nuclear area, such as WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author) [pt
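The fault tree evaluation described here ultimately reduces, under the usual independence assumption, to combining basic-event probabilities over minimal cut sets; the sketch below uses the rare-event approximation with hypothetical event names and probabilities.

```python
def top_event_probability(cut_sets, p):
    """Rare-event approximation: P(top) ~ sum over minimal cut sets of the
    product of their basic-event probabilities (events assumed independent)."""
    total = 0.0
    for cut_set in cut_sets:
        prod = 1.0
        for event in cut_set:
            prod *= p[event]
        total += prod
    return total

# Hypothetical example: two redundant failures, or one single-point failure.
p = {"pump_fails": 1e-3, "valve_stuck": 5e-4, "operator_error": 1e-2}
cut_sets = [{"pump_fails", "valve_stuck"}, {"operator_error"}]
prob = top_event_probability(cut_sets, p)
```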

  13. A holistic approach to determine tree structural complexity based on laser scanning data and fractal analysis.

    Science.gov (United States)

    Seidel, Dominik

    2018-01-01

    The three-dimensional forest structure affects many ecosystem functions and services provided by forests. As forests are made of trees, it seems reasonable to approach their structure by investigating individual tree structure. Based on three-dimensional point clouds from laser scanning, a newly developed holistic approach is presented that enables calculation of the box dimension as a measure of the structural complexity of individual trees using fractal analysis. It was found that the box dimension of trees differed significantly among the tested species, among trees belonging to the same species but exposed to different growing conditions (at gaps vs. the forest interior), and among trees exposed to different kinds of competition (intraspecific vs. interspecific). Furthermore, it was shown that the box dimension is positively related to the trees' growth rate. The box dimension was identified as an easy-to-calculate measure that integrates the effect of several external drivers of tree structure, such as competition strength and type, while simultaneously providing information on structure-related properties, like tree growth.
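The box dimension is computed by counting occupied boxes at several scales and fitting the slope of log(count) against log(1/size); below is a minimal sketch on a 3D point cloud (the scanning pipeline and scale selection of the study are not reproduced).

```python
import math

def box_dimension(points, scales):
    """Box-counting dimension: least-squares slope of log(box count)
    versus log(1 / box size) over the given scales."""
    samples = []
    for s in scales:
        boxes = {tuple(int(math.floor(c / s)) for c in p) for p in points}
        samples.append((math.log(1.0 / s), math.log(len(boxes))))
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    return (sum((x - mx) * (y - my) for x, y in samples)
            / sum((x - mx) ** 2 for x, _ in samples))

# Points along a straight line should yield a dimension near 1:
line = [(i / 1000.0, i / 1000.0, 0.0) for i in range(1000)]
dim = box_dimension(line, scales=[0.2, 0.1, 0.05, 0.025])
```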

  14. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.; Sicat, R.; Baum, D.; Wodo, O.; Hadwiger, Markus

    2017-01-01

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  15. Stereological analysis of neuron, glial and endothelial cell numbers in the human amygdaloid complex.

    Science.gov (United States)

    García-Amado, María; Prensa, Lucía

    2012-01-01

    Cell number alterations in the amygdaloid complex (AC) might coincide with neurological and psychiatric pathologies with anxiety imbalances as well as with changes in brain functionality during aging. This stereological study focused on estimating, in samples from 7 control individuals aged 20 to 75 years old, the number and density of neurons, glia and endothelial cells in the entire AC and in its 5 nuclear groups (including the basolateral (BL), corticomedial and central groups), 5 nuclei and 13 nuclear subdivisions. The volume and total cell number in these territories were determined on Nissl-stained sections with the Cavalieri principle and the optical fractionator. The AC mean volume was 956 mm³ and mean cell numbers (×10⁶) were: 15.3 neurons, 60 glial cells and 16.8 endothelial cells. The numbers of endothelial cells and neurons were similar in each AC region and were one fourth the number of glial cells. Analysis of the influence of the individuals' age at death on volume, cell number and density in each of these 24 AC regions suggested that aging does not affect regional size or the amount of glial cells, but that neuron and endothelial cell numbers respectively tended to decrease and increase in territories such as AC or BL. These accurate stereological measures of volume and total cell numbers and densities in the AC of control individuals could serve as appropriate reference values to evaluate subtle alterations in this structure in pathological conditions.

  16. Complex analysis of energy efficiency in operated high-rise residential building: Case study

    Science.gov (United States)

    Korniyenko, Sergey

    2018-03-01

    Energy conservation and the enhancement of human thermal comfort in buildings is a topical issue of modern architecture and construction. An innovative solution to this problem makes it possible to enhance the ecological and maintenance safety of buildings, to reduce hydrocarbon fuel consumption, and to improve people's standard of living. Requirements for increased energy efficiency should be met at all stages of a building's life cycle, that is, at the stages of design, construction and maintenance. The purpose of this research is a complex analysis of energy efficiency in an operated high-rise residential building. Many energy-efficiency measures were realized according to the project, mainly an effective building envelope and engineering systems. Based on the results of measurements, the energy indicators of the building over an annual period have been calculated. The main reason for the increase in heat losses is the elevated infiltration of external air into the building through the envelope, owing to the increased air permeability of windows and balcony doors (construction defects). Thermal renovation of the building, based on reducing ventilation and infiltration heat losses through the building envelope, allows annual energy consumption to be reduced. An energy efficiency assessment based on the total annual energy consumption of the building, including energy indices for heating and ventilation, hot water supply and electricity supply, is more complete than an assessment based on heating alone. Accounting for these various components in the building energy balance corresponds fully to the modern direction of research on energy conservation and thermal comfort enhancement in buildings.
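The ventilation/infiltration heat-loss component singled out above is often approximated with the standard engineering formula Q = 0.33 * n * V * dT; the sketch below uses made-up building figures to show how halving the air-change rate halves this loss term.

```python
def ventilation_heat_loss_w(ach, volume_m3, delta_t_k):
    """Ventilation/infiltration heat loss in watts: Q = 0.33 * n * V * dT,
    where 0.33 Wh/(m3*K) approximates the volumetric heat capacity of air."""
    return 0.33 * ach * volume_m3 * delta_t_k

# Hypothetical apartment: 250 m3, 20 K indoor-outdoor temperature difference.
q_leaky = ventilation_heat_loss_w(ach=1.0, volume_m3=250.0, delta_t_k=20.0)
q_tight = ventilation_heat_loss_w(ach=0.5, volume_m3=250.0, delta_t_k=20.0)
```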

  17. Micro-earthquake signal analysis and hypocenter determination around Lokon volcano complex

    International Nuclear Information System (INIS)

    Firmansyah, Rizky; Nugraha, Andri Dian; Kristianto

    2015-01-01

    Mount Lokon is one of five active volcanoes located in the North Sulawesi region. Since June 26th, 2011, a standby alert has been set by the Center for Volcanology and Geological Hazard Mitigation (CVGHM) for this mountain. The Mount Lokon volcano erupted on July 4th, 2011 and continued to erupt until August 28th, 2011. Due to its high seismic activity, this study focuses on the analysis of micro-earthquake signals and the determination of micro-earthquake hypocenter locations around the Lokon-Empung volcano complex before the 2011 eruption phase (time period of January 2009 up to March 2010). Determination of the hypocenter location was conducted with the Geiger Adaptive Damping (GAD) method. We used an initial model from a previous study at Volcan de Colima, Mexico. The model was selected because of the characteristics shared between Mount Lokon and Colima, both being andesitic stratovolcanoes with small Plinian and Vulcanian explosion types. In this study, event picking was limited to volcano-tectonic events of A and B types, hybrid and long-period events with a clear signal onset, and local tectonic events with a maximum S-P time of not more than three seconds. As a result, we observed that the micro-earthquakes occurred in the area north-west of the Mount Lokon region.

  18. Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing.

    Directory of Open Access Journals (Sweden)

    Hao Xiao

    Full Text Available The rise of global value chains (GVCs) characterized by the so-called "outsourcing", "fragmentation production", and "trade in tasks" has been considered one of the most important phenomena for the 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013) in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into some sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs that is generally invisible from the traditional trade statistics.

  19. Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing.

    Science.gov (United States)

    Xiao, Hao; Sun, Tianyang; Meng, Bo; Cheng, Lihong

    2017-01-01

    The rise of global value chains (GVCs) characterized by the so-called "outsourcing", "fragmentation production", and "trade in tasks" has been considered one of the most important phenomena for the 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013) in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into some sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs that is generally invisible from the traditional trade statistics.

  20. THE COMPLEX ANALYSIS METHOD OF SEMANTIC ASSOCIATIONS IN STUDYING THE STUDENTS’ CREATIVE ETHOS

    Directory of Open Access Journals (Sweden)

    P. A. Starikov

    2013-01-01

    Full Text Available The paper presents sociological research findings concerning students' ideas of creativity, based on questionnaires and testing of students of the natural science, humanities and technical profiles at Siberian Federal University over the period 2007-2011. The author suggests a new method of semantic association analysis in order to identify latent groups of notions related to the concept of creativity. The range of students' common opinions demonstrates a clear trend toward humanizing the idea of creativity, considering it as the perfect mode of human existence, which coincides with the ideas of K. Rogers, A. Maslow and other scholars. Today's students associate creativity primarily with pleasure, self-development, self-expression, inspiration, improvisation and spontaneity; the resulting semantic complex incorporates such characteristics of creative work as goodness, abundance of energy, integrity, health, freedom and independence, self-development and spirituality. The obtained data prove the importance of the experience of inspiration in creative pedagogy; the research outcomes, along with continuing monitoring of students' attitudes to creativity development, can optimize the learning process. The author emphasizes the necessity of introducing special courses, based on an integral approach (including social, philosophical, psychological, psycho-social and technical aspects), aimed at developing students' creative competence.

  1. Traceability and Risk Analysis Strategies for Addressing Counterfeit Electronics in Supply Chains for Complex Systems.

    Science.gov (United States)

    DiMase, Daniel; Collier, Zachary A; Carlson, Jinae; Gray, Robin B; Linkov, Igor

    2016-10-01

    Within the microelectronics industry, there is a growing concern regarding the introduction of counterfeit electronic parts into the supply chain. Even though this problem is widespread, there have been limited attempts to implement risk-based approaches for testing and supply chain management. Supply chain risk management tends to focus on the highly visible disruptions of the supply chain instead of the covert entrance of counterfeits; thus counterfeit risk is difficult to mitigate. This article provides an overview of the complexities of the electronics supply chain, and highlights some gaps in risk assessment practices. In particular, it calls for enhanced traceability capabilities to track and trace parts at risk through the various stages of the supply chain. A focus on risk-informed decision making is needed, through strategies including prioritization of high-risk parts, moving beyond certificates of conformance, incentivizing best supply chain management practices, adopting industry standards, and designing and managing for supply chain resilience. © 2016 Society for Risk Analysis.

  2. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part I: Template-Based Generic Programming

    Directory of Open Access Journals (Sweden)

    Roger P. Pawlowski

    2012-01-01

    Full Text Available An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
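The operator-overloading idea at the heart of this approach can be illustrated outside C++; the Python sketch below uses dual numbers so that the same arithmetic code computes both a value and its derivative, which is the essence of the embedded-analysis transformation (this is an analogy, not the Trilinos implementation).

```python
class Dual:
    """A value paired with a derivative; arithmetic propagates both."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # unchanged code works for floats or Duals

y = f(Dual(2.0, 1.0))   # y.val = 17.0, y.der = f'(2) = 14.0
```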

  3. Complexity Analysis of Carbon Market Using the Modified Multi-Scale Entropy

    Directory of Open Access Journals (Sweden)

    Jiuli Yin

    2018-06-01

    Full Text Available Carbon markets provide a market-based way to reduce climate pollution. Subject to general market regulations, the major existing emission trading markets exhibit complex characteristics. This paper analyzes the complexity of the carbon market using multi-scale entropy, taking China's pilot carbon markets as the example. A moving average is adopted to extract the scales, due to the short length of the data set. Results show a low level of complexity, suggesting that China's pilot carbon markets are quite immature and lack market efficiency. However, the complexity varies across time scales: China's carbon markets (except for the Chongqing pilot) are more complex in the short period than in the long term. Furthermore, the complexity level in most pilot markets increased as the markets developed, showing an improvement in market efficiency. All these results demonstrate that an effective carbon market is required for emission trading to function fully.
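A simplified sketch of the entropy computation: sample entropy is -ln(A/B), where B and A count template matches of lengths m and m+1 within a tolerance r, and scales are extracted by moving-average coarse-graining as described above (parameter values are illustrative).

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Simplified sample entropy: -ln(A/B) with B and A the numbers of
    template pairs of length m and m+1 within Chebyshev tolerance r."""
    def count_matches(length):
        templates = [series[i:i + length] for i in range(len(series) - length + 1)]
        matches = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    matches += 1
        return matches
    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(series, window):
    """Moving-average coarse-graining, one series per scale."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

# A strictly periodic (highly regular) series has low sample entropy:
se = sample_entropy([1.0, 2.0] * 10)
```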

  4. Proteomics-Based Analysis of Protein Complexes in Pluripotent Stem Cells and Cancer Biology.

    Science.gov (United States)

    Sudhir, Putty-Reddy; Chen, Chung-Hsuan

    2016-03-22

    A protein complex consists of two or more proteins that are linked together through protein-protein interactions. The proteins show stable/transient and direct/indirect interactions within the protein complex or between protein complexes. Protein complexes are involved in the regulation of most cellular processes and molecular functions. The delineation of protein complexes is important to expand our knowledge of proteins' functional roles in physiological and pathological conditions. The genetic yeast-2-hybrid method has been extensively used to characterize protein-protein interactions. Alternatively, a biochemical affinity purification coupled with mass spectrometry (AP-MS) approach has been widely used to characterize protein complexes. In the AP-MS method, a protein complex of a target protein of interest is purified using a specific antibody or an affinity tag (e.g., DYKDDDDK peptide (FLAG) and polyhistidine (His)) and is subsequently analyzed by means of MS. Tandem affinity purification, a two-step purification system, coupled with MS has been widely used mainly to reduce contaminants. We review here the general principle of AP-MS-based characterization of protein complexes and explore several protein complexes identified in pluripotent stem cell biology and cancer biology as examples.

  5. Testing and Analysis of Complex Structures to Improve CalcuRep

    National Research Council Canada - National Science Library

    Verhoeven, Stephan

    2000-01-01

    ...: The contractor shall investigate the performance of bonded composite repairs to complex metal aircraft structures in comparison with analytical predictions in order to verify new analytical models...

  6. Complex network analysis in inclined oil–water two-phase flow

    International Nuclear Information System (INIS)

    Zhong-Ke, Gao; Ning-De, Jin

    2009-01-01

    Complex networks have established themselves in recent years as particularly suitable and flexible for representing and modelling many complex natural and artificial systems. Oil–water two-phase flow is one of the most complex such systems. In this paper, we use complex networks to study inclined oil–water two-phase flow. Two different network construction methods are proposed to build two types of networks, i.e. the flow pattern complex network (FPCN) and the fluid dynamic complex network (FDCN). By detecting the community structure of the FPCN with a community-detection algorithm based on K-means clustering, useful and interesting results are found which can be used to identify three inclined oil–water flow patterns. To investigate the dynamic characteristics of the inclined oil–water two-phase flow, we construct 48 FDCNs under different flow conditions, and find that the power-law exponent and the network information entropy, which are sensitive to the flow pattern transition, can both characterize the nonlinear dynamics of the flow. From this new perspective, we not only introduce complex network theory into the study of oil–water two-phase flow but also indicate that complex networks may be a powerful tool for exploring nonlinear time series in practice. (general)
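The network information entropy used here as a dynamics indicator can be sketched from the degree sequence alone; the formula below (entropy of the normalized degree distribution) is one common definition, assumed for illustration.

```python
import math

def network_information_entropy(degrees):
    """H = -sum(p_i * ln p_i) with p_i = k_i / sum(k), over node degrees."""
    total = sum(degrees)
    return -sum((k / total) * math.log(k / total) for k in degrees if k > 0)

# A homogeneous network is "more disordered" than a hub-dominated one:
h_regular = network_information_entropy([3, 3, 3, 3])   # ln(4)
h_hub = network_information_entropy([6, 1, 1, 1])       # < ln(4)
```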

  7. Quantum chemical analysis of Со2+ aqua complexes electrochemical reduction

    Directory of Open Access Journals (Sweden)

    Viktor F. Vargalyuk

    2017-11-01

    Full Text Available Based on the analysis of quantum chemical calculation results (GAMESS, density functional theory, B3LYP method) for [Co(H2O)n]z(H2O)6-n clusters with z = 0, 1, 2 and n = 1-6, it has been demonstrated that the electrochemical reduction of [Co(H2O)6]2+ aqua complexes runs stage-wise. At the first stage, an electron injected into the [Co(H2O)6]2+ complex is located entirely in the orbital of the central atom, as z(Co) changes from +1.714 e to +0.777 e. The weakening of Co-OH2 bonds leads to decomposition of the resulting [Co(H2O)6]+ particles into two energetically related forms, [Co(H2O)4]+ and [Co(H2O)3]+. Further reduction of these intermediates runs differently. Electron injection into the [Co(H2O)3]+ intermediate completes the transition of Co2+ ions to Co0 (z(Co) = -0.264 e). This process is accompanied by rapid decomposition of the [Co(H2O)3]0 product into the cobalt monohydrate Co(H2O). On the contrary, electron injection into the [Co(H2O)4]+ intermediate leads to the emergence of a specific structure, [Co+(H2O)-(H2O)3]0, in which the electron is localized only 28% on the cobalt atom and 72% on the cobalt-coordinated water molecules, clearly concentrated on one of them. In this molecule, z(H2O) changes from +0.148 e to -0.347 e. There is an assumption that the non-equilibrium [Co+(H2O)-(H2O)3]0 form transits to the [Co(OH)(H2O)3]0 hydroxo-form, which further disproportionates, turning into Co(OH)2 hydroxide. In order to reduce the impact of this unfavorable reaction pathway on the overall rate of the reaction Co2+ + 2e- = Co0, we suggest raising the temperature to ensure complete dissociation of [Co(H2O)4]+ to [Co(H2O)3]+.

  8. A cross-sectional retrospective analysis of the regionalization of complex surgery.

    Science.gov (United States)

    Studnicki, James; Craver, Christopher; Blanchette, Christopher M; Fisher, John W; Shahbazi, Sara

    2014-08-16

    The Veterans Health Administration (VHA) system has assigned a surgical complexity level to each of its medical centers by specifying requirements to perform standard, intermediate or complex surgical procedures. No comparable study describing the patterns of relative surgical complexity among a population of United States (U.S.) civilian hospitals has been completed. Design: single-year, retrospective, cross-sectional. Data source: the study used Florida Inpatient Discharge Data from short-term acute hospitals for calendar year 2009. Two hundred hospitals with 2,542,920 discharges were organized into four quartiles (Q1, Q2, Q3, Q4) based on the number of complex procedures per hospital. The VHA surgical complexity matrix was applied to assign relative complexity to each procedure. The Clinical Classification Software (CCS) system assigned complex procedures to clinically meaningful groups. For outcome comparisons, propensity score matching methods adjusted for the surgical procedure, age, gender, race, comorbidities, mechanical ventilator use and type of admission. Outcomes: in-hospital mortality and length-of-stay (LOS). Only 5.2% of all inpatient discharges involve a complex procedure. The highest-volume complex procedure hospitals (Q4) have 49.8% of all discharges but 70.1% of all complex procedures. In the 133,436 discharges with a primary complex procedure, 374 separate specific procedures are identified, only about one third of which are performed in the lowest-volume complex procedure (Q1) hospitals. Complex operations of the digestive, respiratory, integumentary and musculoskeletal systems are the least concentrated and proportionately more likely to occur in the lower-volume hospitals. Operations of the cardiovascular system and certain technology-dependent miscellaneous diagnostic and therapeutic procedures are the most concentrated in high-volume hospitals. Organ transplants are only done in Q4 hospitals. There were no significant differences in in-hospital mortality rates and the…
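The quartile grouping described above amounts to ranking hospitals by complex-procedure volume and splitting the ranked list into four equal-sized bins. A minimal sketch of that grouping (hypothetical hospital names and counts, not the study's data):

```python
def volume_quartiles(volumes):
    """Assign each hospital to a quartile Q1 (lowest volume) .. Q4
    (highest volume) by its count of complex procedures."""
    ranked = sorted(volumes, key=volumes.get)   # ascending by volume
    n = len(ranked)
    # i * 4 // n maps rank position to bin 0..3; clamp guards n not divisible by 4
    return {h: "Q%d" % min(4, i * 4 // n + 1) for i, h in enumerate(ranked)}

hospitals = {"A": 12, "B": 450, "C": 90, "D": 3100}
print(volume_quartiles(hospitals))
# {'A': 'Q1', 'C': 'Q2', 'B': 'Q3', 'D': 'Q4'}
```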

  9. Emission Properties, Solubility, Thermodynamic Analysis and NMR Studies of Rare-Earth Complexes with Two Different Phosphine Oxides

    Directory of Open Access Journals (Sweden)

    Hiroki Iwanaga

    2010-07-01

    Full Text Available The paper proposes novel molecular designs for rare-earth complexes involving the introduction of two different phosphine oxide structures into one rare-earth ion. These designs are effective for improving solubility and emission intensity. Additionally, the complexes are indispensable for realizing high performance in LEDs and security media. The thermodynamic properties of the Eu(III) complexes are correlated with their solubility. Correlations between coordination structures and emission intensity are explained by NMR analysis. The luminous flux of red LED devices with Eu(III) complexes is very high (870 mlm at 20 mA). A new white LED has its largest spectral intensity in the red region, and objects look much more vivid to the human eye under this light.

  10. Analysis of chromosome rearrangements on the basis of synaptonemal complexes in the offspring of mice exposed to γ-rays

    International Nuclear Information System (INIS)

    Kalikinskaya, E.I.; Bogdanov, Yu.F.; Kolomiets, O.L.; Shevchenko, V.A.

    1986-01-01

    Electron-microscopic analysis of synaptonemal complexes (SC) spread on the hypophase surface was conducted to investigate chromosome rearrangements in sterile and semisterile F1 male mouse offspring of animals exposed to 5 Gy of γ-rays. In parallel, chromosome rearrangements were counted at diakinesis–metaphase I in the same animals using light microscopy. SC analysis at pachytene revealed chromosome rearrangements in 63% of spermatocytes, whereas analysis at diakinesis–metaphase I in the same animals revealed rearrangements in only 32% of cells. SC analysis thus reveals chromosome rearrangements that cannot be detected at diakinesis–metaphase I.

  11. Purchasing complex services on the Internet; An analysis of mortgage loan acquisitions

    NARCIS (Netherlands)

    B.L.K. Vroomen (Björn); A.C.D. Donkers (Bas); P.C. Verhoef (Peter); Ph.H.B.F. Franses (Philip Hans)

    2003-01-01

    In contrast to, for example, books and compact discs, the number of complex services offered on the Internet is still small. A good example of such a service is mortgage loans. The decision-making process differs for complex services in that they have an extra intermediate step of…

  12. Thermodynamic analysis of ferulate complexation with α-, β- and γ-cyclodextrins

    International Nuclear Information System (INIS)

    González-Mondragón, Edith; Torralba-González, Armando; García-Gutiérrez, Ponciano; Robles-González, Vania S.; Salazar-Govea, Alma Y.; Zubillaga, Rafael A.

    2016-01-01

    Highlights: • Ferulate exhibits the highest affinity for the β-cyclodextrin. • The β-CD cavity fits better with FER, according to the docking simulations. • The complexation of FER with β-CD is the only one favored by entropy. • More water molecules seem to be displaced after the complexation of FER with β-CD. - Abstract: Isothermal titration calorimetry (ITC) was used to characterize the thermodynamics of the complexation processes of α-, β- and γ-cyclodextrin (CD) with ferulate (FER) in aqueous solutions. The equilibrium constants of ferulate complexation with CDs (K_c, in dm³ mol⁻¹) at pH 9.0 and 25.0 °C were: 176.5 ± 5.0 (β-CD), 53.2 ± 3.4 (α-CD) and 19.4 ± 0.4 (γ-CD). Although FER–β-CD is the tightest complex of the three studied, its binding reaction is also the least exothermic and the only one that is entropically favored. Calculated binding enthalpies, based on the buried surface area upon complexation, are close to those determined by ITC except for the FER–β-CD complex, which is more than two times more exothermic. According to these results and those obtained by molecular docking simulations, it is proposed that ferulate binds to the hydrophobic cavity of β-CD, displacing more water molecules than in the other two CD complexes.
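The reported equilibrium constants translate directly into standard binding free energies via the textbook relation ΔG° = −RT ln K_c. A short sketch applying it to the values quoted in the abstract:

```python
import math

R = 8.314    # gas constant, J mol^-1 K^-1
T = 298.15   # 25.0 °C in kelvin

def dG_kJ(Kc):
    """Standard binding free energy, ΔG° = -RT ln Kc, in kJ/mol."""
    return -R * T * math.log(Kc) / 1000.0

# K_c values (dm³ mol⁻¹) reported at pH 9.0 and 25.0 °C:
for name, Kc in [("beta-CD", 176.5), ("alpha-CD", 53.2), ("gamma-CD", 19.4)]:
    print(f"{name}: {dG_kJ(Kc):.1f} kJ/mol")
```

The β-CD complex comes out roughly 3 kJ/mol more favorable than α-CD, consistent with it being the tightest of the three.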

  13. Confirmation of molecular formulas of metallic complexes through X-ray fluorescence quantitative analysis

    International Nuclear Information System (INIS)

    Filgueiras, C.A.L.; Marques, E.V.; Machado, R.M.

    1984-01-01

    X-ray fluorescence spectrophotometry was employed to determine the metal content in a series of five transition element complexes (Mn, Ti, Zn, V). The results confirmed the molecular formulas of these complexes, already proposed on the basis of elemental microanalysis, solution conductimetry and other analytical methods. (C.L.B.) [pt

  14. Identification of Uranyl Surface Complexes on Ferrihydrite: Advanced EXAFS Data Analysis and CD-MUSIC Modeling

    NARCIS (Netherlands)

    Rossberg, A.; Ulrich, K.U.; Weiss, S.; Tsushima, S.; Hiemstra, T.; Scheinost, A.C.

    2009-01-01

    Previous spectroscopic research suggested that uranium(VI) adsorption to iron oxides is dominated by ternary uranyl-carbonato surface complexes across an unexpectedly wide pH range. Formation of such complexes would have a significant impact on the sorption behavior and mobility of uranium in…

  15. Functional and structural analysis of photosystem II core complexes from spinach with high oxygen evolution capacity

    NARCIS (Netherlands)

    Haag, Elisabeth; Irrgang, Klaus-D.; Boekema, Egbert J.; Renger, Gernot

    1990-01-01

    Oxygen-evolving photosystem II core complexes were prepared from spinach by solubilizing photosystem II membrane fragments with dodecyl-β-D-maltoside. The core complexes consist of the intrinsic 47-kDa, 43-kDa, D1 and D2 polypeptides, the two subunits of cytochrome b559 and the extrinsic 33-kDa…

  16. Surviving Blind Decomposition: A Distributional Analysis of the Time-Course of Complex Word Recognition

    Science.gov (United States)

    Schmidtke, Daniel; Matsuki, Kazunaga; Kuperman, Victor

    2017-01-01

    The current study addresses a discrepancy in the psycholinguistic literature about the chronology of information processing during the visual recognition of morphologically complex words. "Form-then-meaning" accounts of complex word recognition claim that morphemes are processed as units of form prior to any influence of their meanings,…

  17. Uncertainty analysis of point by point sampling complex surfaces using touch probe CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele; Tosello, Guido; De Chiffre, Leonardo

    2007-01-01

    The paper describes a study concerning point-by-point scanning of complex surfaces using tactile CMMs. A four-factor, two-level full factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, combined in a single…
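A four-factor, two-level full factorial design simply enumerates all 2⁴ = 16 combinations of high/low factor levels. A minimal sketch (the factor names are hypothetical placeholders, not the paper's actual factors):

```python
from itertools import product

# Illustrative CMM-scanning factors; -1 = low level, +1 = high level.
factors = ["probe_speed", "point_density", "probe_diameter", "orientation"]

# Full factorial: the Cartesian product of the two levels over four factors.
runs = [dict(zip(factors, levels)) for levels in product((-1, +1), repeat=4)]

print(len(runs))   # 16 experimental runs
print(runs[0])     # all factors at the low level
```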

  18. Thermodynamic analysis of ferulate complexation with α-, β- and γ-cyclodextrins

    Energy Technology Data Exchange (ETDEWEB)

    González-Mondragón, Edith, E-mail: edith@mixteco.utm.mx [Universidad Tecnológica de la Mixteca, C.P. 69000 Huajuapan de León, Oax. (Mexico); Torralba-González, Armando [Universidad Tecnológica de la Mixteca, C.P. 69000 Huajuapan de León, Oax. (Mexico); García-Gutiérrez, Ponciano [Departamento de Química, Universidad Autónoma Metropolitana – Iztapalapa, Apartado Postal 55-534, Iztapalapa, C.P. 09340 México, D.F. (Mexico); Robles-González, Vania S.; Salazar-Govea, Alma Y. [Universidad Tecnológica de la Mixteca, C.P. 69000 Huajuapan de León, Oax. (Mexico); Zubillaga, Rafael A., E-mail: zlra@xanum.uam.mx [Departamento de Química, Universidad Autónoma Metropolitana – Iztapalapa, Apartado Postal 55-534, Iztapalapa, C.P. 09340 México, D.F. (Mexico)

    2016-06-20

    Highlights: • Ferulate exhibits the highest affinity for the β-cyclodextrin. • The β-CD cavity fits better with FER, according to the docking simulations. • The complexation of FER with β-CD is the only one favored by entropy. • More water molecules seem to be displaced after the complexation of FER with β-CD. - Abstract: Isothermal titration calorimetry (ITC) was used to characterize the thermodynamics of the complexation processes of α-, β- and γ-cyclodextrin (CD) with ferulate (FER) in aqueous solutions. The equilibrium constants of ferulate complexation with CDs (K_c, in dm³ mol⁻¹) at pH 9.0 and 25.0 °C were: 176.5 ± 5.0 (β-CD), 53.2 ± 3.4 (α-CD) and 19.4 ± 0.4 (γ-CD). Although FER–β-CD is the tightest complex of the three studied, its binding reaction is also the least exothermic and the only one that is entropically favored. Calculated binding enthalpies, based on the buried surface area upon complexation, are close to those determined by ITC except for the FER–β-CD complex, which is more than two times more exothermic. According to these results and those obtained by molecular docking simulations, it is proposed that ferulate binds to the hydrophobic cavity of β-CD, displacing more water molecules than in the other two CD complexes.

  19. Systematic analysis of barrier-forming FG hydrogels from Xenopus nuclear pore complexes

    NARCIS (Netherlands)

    Labokha, A.A.; Gradmann, S.H.E.; Frey, S.; Hülsmann, B.B.; Urlaub, H.; Baldus, M.; Görlich, D.

    2013-01-01

    Nuclear pore complexes (NPCs) control the traffic between cell nucleus and cytoplasm. While facilitating translocation of nuclear transport receptors (NTRs) and NTR·cargo complexes, they suppress passive passage of macromolecules ⩾30 kDa. Previously, we reconstituted the NPC barrier as hydrogels…

  20. Variable speed limit strategies analysis with mesoscopic traffic flow model based on complex networks

    Science.gov (United States)

    Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin

    As a new cross-discipline, complexity science has penetrated every field of economy and society. With the arrival of big data, research in complexity science has reached a new peak. In recent years, complex network theory has offered a new perspective on traffic control. The interaction of various kinds of information in a traffic system forms a huge complex system. A mesoscopic traffic flow model is extended with variable speed limits (VSL), and a simulation process based on complex network theory combined with the proposed model is designed. This paper studies the effect of VSL on dynamic traffic flow and then analyzes the optimal VSL control strategy in different network topologies. The conclusions of this research are useful for proposing reasonable transportation plans and developing effective traffic management and control measures.
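A variable speed limit strategy of the kind studied above typically lowers the posted limit on a link as its density rises toward the jam density. A toy rule in that spirit (generic illustration with assumed parameter values, not the paper's mesoscopic model):

```python
def vsl_update(density, v_max=120.0, v_min=40.0, rho_crit=28.0, rho_jam=180.0):
    """Toy variable-speed-limit rule: post the full limit below the
    critical density (veh/km), then reduce it linearly toward v_min
    as density approaches the jam density."""
    if density <= rho_crit:
        return v_max
    frac = (density - rho_crit) / (rho_jam - rho_crit)
    return max(v_min, v_max - frac * (v_max - v_min))

print(vsl_update(20.0))              # free flow: 120.0
print(round(vsl_update(104.0), 1))   # halfway to jam: 80.0
```

In a network setting, such a rule would be evaluated per link each control interval, which is where the topology-dependent effects analyzed in the paper come in.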