WorldWideScience

Sample records for acid-water complexes measured

  1. Technology of complex cleaning of mine acidic waters

    It is shown that the problem of comprehensive use of mine waters comprises two tasks: purifying these waters and using them as hydro-mineral raw material. A flotation-extraction technology for reprocessing acidic mine waters is developed. The possibility of extraction processing of the flotation froth products for the selective recovery of valuable components (Co, Ni, Sc, and numerous rare elements) is considered, and optimum metal-extraction regimes are defined. (author)

  2. Measuring Tax Complexity

    David Ulph

    2014-01-01

    This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity and operational complexity. It considers the consequences/costs of complexity, and then examines the rationale for measuring complexity. Finally, it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).

  3. Viral quasispecies complexity measures.

    Gregori, Josep; Perales, Celia; Rodriguez-Frias, Francisco; Esteban, Juan I; Quer, Josep; Domingo, Esteban

    2016-06-01

    Mutant spectrum dynamics (changes in the related mutants that compose viral populations) has a decisive impact on virus behavior. The several next-generation sequencing (NGS) platforms available to study viral quasispecies offer a magnifying glass for examining viral quasispecies complexity. Several parameters are available to quantify the complexity of mutant spectra, but they have limitations. Here we critically evaluate the information provided by several population diversity indices, and we propose the introduction of some new ones used in ecology. In particular we make a distinction between incidence, abundance and function measures of viral quasispecies composition. We suggest a multidimensional approach (complementary information contributed by adequately chosen indices), propose some guidelines, and illustrate the use of indices with a simple example. We apply the indices to three clinical samples of hepatitis C virus that display different population heterogeneity. Areas of virus biology in which population complexity plays a role are discussed. PMID:27060566

  4. Fundamental Complexity Measures of Life

    Grandpierre, Attila

    2012-01-01

    At present, there is a great deal of confusion regarding complexity and its measures (reviews on complexity measures are found in, e.g. Lloyd, 2001 and Shalizi, 2006 and more references therein). Moreover, there is also confusion regarding the nature of life. In this situation, it seems the task of determining the fundamental complexity measures of life is especially difficult. Yet this task is just part of a greater task: obtaining substantial insights into the nature of biological evolution. We think that without a firm quantitative basis characterizing the most fundamental aspects of life, it is impossible to overcome the confusion so as to clarify the nature of biological evolution. The approach we present here offers such quantitative measures of complexity characterizing biological organization and, as we will see, evolution.

  5. Measuring importance in complex networks

    Morrison, Greg; Dudte, Levi; Mahadevan, L.

    2013-03-01

    A variety of centrality measures can be defined on a network to determine the global `importance' of a node i. However, the inhomogeneity of complex networks implies that not all nodes j will consider i equally important. In this talk, we use a linearized form of the Generalized Erdos numbers [Morrison and Mahadevan EPL 93 40002 (2011)] to define a pairwise measure of the importance of a node i from the perspective of node j which incorporates the global network topology. This localized importance can be used to define a global measure of centrality that is consistent with other well-known centrality measures. We illustrate the use of the localized importance in both artificial and real-world networks with a complex global topology.

  6. Hierarchy measure for complex networks.

    Enys Mones

    Nature, technology and society are full of complexity arising from the intricate web of the interactions among the units of the related systems (e.g., proteins, computers, people). Consequently, one of the most successful recent approaches to capturing the fundamental features of the structure and dynamics of complex systems has been the investigation of the networks associated with the above units (nodes) together with their relations (edges). Most complex systems have an inherently hierarchical organization and, correspondingly, the networks behind them also exhibit hierarchical features. Indeed, several papers have been devoted to describing this essential aspect of networks, without, however, resulting in a widely accepted, converging concept concerning the quantitative characterization of the level of their hierarchy. Here we develop an approach and propose a quantity (measure) which is simple enough to be widely applicable, reveals a number of universal features of the organization of real-world networks and, as we demonstrate, is capable of capturing the essential features of the structure and the degree of hierarchy in a complex network. The measure we introduce is based on a generalization of the m-reach centrality, which we first extend to directed/partially directed graphs. Then, we define the global reaching centrality (GRC), which is the difference between the maximum and the average value of the generalized reach centralities over the network. We investigate the behavior of the GRC considering both a synthetic model with an adjustable level of hierarchy and real networks. Results for real networks show that our hierarchy measure is related to the controllability of the given system. We also propose a visualization procedure for large complex networks that can be used to obtain an overall qualitative picture about the nature of their hierarchical structure.
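
    The definition in this abstract is concrete enough to sketch in code. The following is a minimal illustration, not the authors' reference implementation: local reaching centrality is computed as the fraction of other nodes reachable from a node along directed paths, and the GRC is taken as the maximum minus the average of these values, following the abstract's wording (the published normalization may differ slightly). The example graphs, a directed star and a directed cycle, are hypothetical.

      # Sketch of the global reaching centrality (GRC) described above (assumes networkx).
      import networkx as nx

      def local_reaching_centrality(G, node):
          """Fraction of the other nodes reachable from `node` along directed paths."""
          return len(nx.descendants(G, node)) / (G.number_of_nodes() - 1)

      def global_reaching_centrality(G):
          """Maximum minus average of the local reaching centralities (per the abstract)."""
          values = [local_reaching_centrality(G, v) for v in G]
          return max(values) - sum(values) / len(values)

      # A hub pointing at every other node is strongly hierarchical; a directed cycle,
      # where every node reaches every other node, is not hierarchical at all.
      star = nx.DiGraph((0, i) for i in range(1, 11))
      cycle = nx.cycle_graph(10, create_using=nx.DiGraph)
      print(global_reaching_centrality(star))   # close to 1
      print(global_reaching_centrality(cycle))  # exactly 0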

  7. Acceptable Complexity Measures of Theorems

    Grenet, Bruno

    2009-01-01

    In 1931, Gödel presented in Königsberg his famous Incompleteness Theorem, stating that some true mathematical statements are unprovable. Yet, this result gives us no idea about those independent (that is, true and unprovable) statements, about their frequency, the reason they are unprovable, and so on. Calude and Jürgensen proved in 2005 Chaitin's "heuristic principle" for an appropriate measure: the theorems of a finitely-specified theory cannot be significantly more complex than the t...

  8. Computerized measures of visual complexity.

    Machado, Penousal; Romero, Juan; Nadal, Marcos; Santos, Antonino; Correia, João; Carballal, Adrián

    2015-09-01

    Visual complexity influences people's perception of, preference for, and behaviour toward many classes of objects, from artworks to web pages. The ability to predict people's impression of the complexity of different kinds of visual stimuli holds, therefore, great potential for many domains, basic and applied. Here we use edge detection operations and several image metrics based on image compression error and Zipf's law to estimate the visual complexity of images. The experiments involved 800 images, each previously rated by thirty participants on perceived complexity. In a first set of experiments we analysed the correlation of individual features with the average human response, obtaining correlations up to rs = .771. In a second set of experiments we employed Machine Learning techniques to predict the average visual complexity score attributed by humans to each stimulus. The best configurations obtained a correlation of rs = .832. The average prediction error of the Machine Learning system over the set of all stimuli was .096 in a normalized 0 to 1 interval, showing that it is possible to predict human responses with high accuracy. Overall, edge density and compression error were the strongest predictors of human complexity ratings. PMID:26164647
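
    As an illustration of two of the predictor families named in this abstract, compression error and edge density, the sketch below computes simple proxies for both. It is not the authors' pipeline; the file name, JPEG quality and edge threshold are arbitrary assumptions, and Pillow and NumPy are assumed to be available.

      # Illustrative proxies for two predictors named above: JPEG compression error
      # and edge density. Not the authors' pipeline; quality/threshold are arbitrary.
      import io
      import numpy as np
      from PIL import Image, ImageFilter

      def compression_error(img, quality=30):
          """RMS difference between a grayscale image and its JPEG-compressed copy."""
          gray = img.convert("L")
          buf = io.BytesIO()
          gray.save(buf, format="JPEG", quality=quality)
          buf.seek(0)
          a = np.asarray(gray, dtype=float)
          b = np.asarray(Image.open(buf), dtype=float)
          return float(np.sqrt(np.mean((a - b) ** 2)))

      def edge_density(img, threshold=32):
          """Fraction of pixels flagged by a simple edge filter."""
          edges = np.asarray(img.convert("L").filter(ImageFilter.FIND_EDGES))
          return float((edges > threshold).mean())

      img = Image.open("stimulus.png")  # hypothetical input image
      print(compression_error(img), edge_density(img))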

  9. Metric for Early Measurement of Software Complexity

    Ghazal Keshavarz,; Dr. Nasser Modiri,; Dr. Mirmohsen Pedram

    2011-01-01

    Software quality depends on several factors, such as on-time delivery, staying within budget, and fulfilling users' needs. Complexity is one of the most important factors that may affect the quality. Therefore, measuring and controlling the complexity results in improving the quality. So far, most research has tried to identify and measure complexity in the design and code phases. However, when we have the code or design for software, it is too late to control complexity. In this article, with e...

  10. Complexity measures, emergence, and multiparticle correlations

    Galla, Tobias

    2011-01-01

    We study correlation measures for complex systems. First, we investigate some recently proposed measures based on information geometry. We show that these measures can increase under local transformations as well as under discarding particles, thereby questioning their interpretation as a quantifier for complexity or correlations. We then propose a refined definition of these measures, investigate its properties and discuss its numerical evaluation. As an example, we study coupled logistic maps and study the behavior of the different measures for that case. Finally, we investigate other local effects during the coarse graining of the complex system.

  11. Complexity measures, emergence, and multiparticle correlations

    Galla, Tobias; Gühne, Otfried

    2012-04-01

    We study correlation measures for complex systems. First, we investigate some recently proposed measures based on information geometry. We show that these measures can increase under local transformations as well as under discarding particles, thereby questioning their interpretation as a quantifier for complexity or correlations. We then propose a refined definition of these measures, investigate its properties, and discuss its numerical evaluation. As an example, we study coupled logistic maps and study the behavior of the different measures for that case. Finally, we investigate other local effects during the coarse graining of the complex system.

  12. Metric for Early Measurement of Software Complexity

    Ghazal Keshavarz,

    2011-06-01

    Software quality depends on several factors, such as on-time delivery, staying within budget, and fulfilling users' needs. Complexity is one of the most important factors that may affect the quality. Therefore, measuring and controlling the complexity results in improving the quality. So far, most research has tried to identify and measure complexity in the design and code phases. However, when we have the code or design for software, it is too late to control complexity. In this article, with emphasis on the Requirement Engineering process, we analyze the causes of software complexity, particularly in the first phase of software development, and propose a requirement-based metric. This metric enables a software engineer to measure the complexity before actual design and implementation and to choose strategies that are appropriate to the degree of software complexity, thus saving on cost and human resource wastage and, more importantly, leading to lower maintenance costs.

  13. Measurement methods on the complexity of network

    LIN Lin; DING Gang; CHEN Guo-song

    2010-01-01

    Based on the size of a network and the number of paths in the network, we propose a topology complexity model to measure the topology complexity of the network. Based on analyses of the effects of the number of pieces of equipment, the types of equipment, and the processing time of the nodes on the complexity of an equipment-constrained network, a complexity model of the equipment-constrained network is constructed to measure its integrated complexity. The algorithms for the two models are also developed. An automatic generator of random single-label networks was developed to test the models. The results show that the models can correctly evaluate the topology complexity and the integrated complexity of the networks.

  14. Cardiac Aging Detection Using Complexity Measures

    Balasubramanian, Karthi

    2016-01-01

    As we age, our hearts undergo changes which result in reduction in complexity of physiological interactions between different control mechanisms. This results in a potential risk of cardiovascular diseases which are the number one cause of death globally. Since cardiac signals are nonstationary and nonlinear in nature, complexity measures are better suited to handle such data. In this study, non-invasive methods for detection of cardiac aging using complexity measures are explored. Lempel-Ziv (LZ) complexity, Approximate Entropy (ApEn) and Effort-to-Compress (ETC) measures are used to differentiate between healthy young and old subjects using heartbeat interval data. We show that both LZ and ETC complexity measures are able to differentiate between young and old subjects with only 10 data samples while ApEn requires at least 15 data samples.
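
    To make the kind of measure mentioned here concrete, the sketch below applies one simple Lempel-Ziv style complexity estimate to heartbeat (RR) interval data: the series is binarized around its middle-ranked value and the number of phrases in an LZ78-style incremental parse is counted. The authors may use a different LZ variant, preprocessing and data length; the RR values shown are invented for illustration.

      # Minimal LZ78-style phrase-counting sketch applied to binarized RR intervals.
      def binarize(series):
          """Map each sample to '1' if above the middle-ranked value of the series, else '0'."""
          middle = sorted(series)[len(series) // 2]
          return "".join("1" if x > middle else "0" for x in series)

      def lz_phrase_count(sequence):
          """Number of phrases in an LZ78-style incremental parse of the sequence."""
          phrases, current, count = set(), "", 0
          for ch in sequence:
              current += ch
              if current not in phrases:
                  phrases.add(current)
                  count += 1
                  current = ""
          return count + (1 if current else 0)

      rr_young = [0.81, 0.92, 0.78, 1.02, 0.85, 0.96, 0.74, 1.05, 0.88, 0.79]
      rr_old   = [0.90, 0.91, 0.90, 0.92, 0.91, 0.90, 0.92, 0.91, 0.90, 0.91]
      print(lz_phrase_count(binarize(rr_young)))  # more irregular series, higher count
      print(lz_phrase_count(binarize(rr_old)))    # more regular series, lower count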

  15. Artificial sequences and complexity measures

    Baronchelli, Andrea; Caglioti, Emanuele; Loreto, Vittorio

    2005-04-01

    In this paper we exploit concepts of information theory to address the fundamental problem of identifying and defining the most suitable tools for extracting, in an automatic and agnostic way, information from a generic string of characters. We introduce in particular a class of methods which use data compression techniques in a crucial way in order to define a measure of remoteness and distance between pairs of sequences of characters (e.g. texts) based on their relative information content. We also discuss in detail how specific features of data compression techniques could be used to introduce the notion of the dictionary of a given sequence and of an artificial text, and we show how these new tools can be used for information extraction purposes. We point out the versatility and generality of our method, which applies to any kind of corpora of character strings independently of the type of coding behind them. As a case study we consider linguistically motivated problems, and we present results for automatic language recognition, authorship attribution and self-consistent classification.
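
    A standard, closely related compression-based distance is the normalized compression distance (NCD); the remoteness measure defined in this paper differs in its details, so the sketch below is only meant to illustrate the general idea of comparing texts through their joint compressibility. zlib stands in for any lossless compressor, and the example strings are invented.

      # Normalized compression distance (NCD) as an illustration of the compression-based idea.
      import zlib

      def compressed_size(text):
          return len(zlib.compress(text.encode("utf-8"), 9))

      def ncd(x, y):
          """Normalized compression distance between two character strings."""
          cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
          return (cxy - min(cx, cy)) / max(cx, cy)

      english = "the quick brown fox jumps over the lazy dog " * 20
      italian = "la volpe veloce salta sopra il cane pigro " * 20
      english_variant = "a lazy dog is jumped over by the quick brown fox " * 20
      print(ncd(english, english_variant))  # lower: shared vocabulary compresses jointly
      print(ncd(english, italian))          # higher: little shared structure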

  16. Measuring Complexity in an Aquatic Ecosystem

    Fernandez, Nelson; Gershenson, Carlos

    2013-01-01

    We apply formal measures of emergence, self-organization, homeostasis, autopoiesis and complexity to an aquatic ecosystem; in particular to the physiochemical component of an Arctic lake. These measures are based on information theory. Variables with a homogeneous distribution have higher values of emergence, while variables with a more heterogeneous distribution have a higher self-organization. Variables with a high complexity reflect a balance between change (emergence) and regularity/order...

  17. A Simple Measure of Economic Complexity

    Inoua, Sabiou

    2016-01-01

    We show from a simple model that a country's technological development can be measured by the logarithm of the number of products it makes. We show that much of the income gap among countries is due to differences in technology, as measured by this simple metric. Finally, we show that the so-called Economic Complexity Index (ECI), a recently proposed measure of collective knowhow, is in fact an estimate of this simple metric (with correlation above 0.9).

  18. Measuring Customer Profitability in Complex Environments

    Holm, Morten; Kumar, V.; Rohde, Carsten

    2012-01-01

    Customer profitability measurement is an important element in customer relationship management and a lever for enhanced marketing accountability. Two distinct measurement approaches have emerged in the marketing literature: Customer Lifetime Value (CLV) and Customer Profitability Analysis (CPA). ... The degree of sophistication deployed when implementing customer profitability measurement models is determined by the type of complexity encountered in firms’ customer environments. This gives rise to a contingency framework for customer profitability measurement model selection and five research propositions. Additionally, the framework provides design and implementation guidance for managers seeking to implement customer profitability measurement models for resource allocation purposes.

  19. A complexity measure for diachronic Chinese phonology

    Raman, A; Patrick, J; Raman, Anand; Newman, John; Patrick, Jon

    1997-01-01

    This paper addresses the problem of deriving distance measures between parent and daughter languages with specific relevance to historical Chinese phonology. The diachronic relationship between the languages is modelled as a Probabilistic Finite State Automaton. The Minimum Message Length principle is then employed to find the complexity of this structure. The idea is that this measure is representative of the amount of dissimilarity between the two languages.

  20. Measurement of Diffusion in Flowing Complex Fluids

    Leonard, Edward F.; Aucoin, Christian P.; Nanne, Edgar E.

    2006-01-01

    A microfluidic device for the measurement of solute diffusion as well as particle diffusion and migration in flowing complex fluids is described. The device is particularly suited to obtaining diffusivities in such fluids, which require a desired flow state to be maintained during measurement. A method based on the Loschmidt diffusion theory and short times of exposure is presented to allow calculation of diffusivities from concentration differences in the flow streams leaving the cell.

  1. Study on fluorescence spectra of molecular association of acetic acid-water

    Caiqin Han; Ying Liu; Yang Yang; Xiaowu Ni; Jian Lu; Xiaosen Luo

    2009-01-01

    Fluorescence spectra of acetic acid-water solution excited by ultraviolet (UV) light are studied, and the relationship between the fluorescence spectra and the molecular association of acetic acid is discussed. The results indicate that when the excitation wavelength is longer than 246 nm, there are two fluorescence peaks located at 305 and 334 nm, respectively. By measuring the excitation spectra, the optimal excitation wavelengths of the two fluorescence peaks are obtained, which are 258 and 284 nm, respectively. Fluorescence spectra of acetic acid-water solution change with concentration, which is primarily attributed to changes in the molecular association of acetic acid in aqueous solution. Through theoretical analysis, three forms of molecular association have been identified in acetic acid-water solution: the hydrated monomers, the linear dimers, and the water-separated dimers. This research can provide a reference for studies of the molecular association of acetic acid-water, especially studies of hydrogen bonds.

  2. Balancing model complexity and measurements in hydrology

    Van De Giesen, N.; Schoups, G.; Weijs, S. V.

    2012-12-01

    The Data Processing Inequality implies that hydrological modeling can only reduce, and never increase, the amount of information available in the original data used to formulate and calibrate hydrological models: I(X;Z(Y)) ≤ I(X;Y). Still, hydrologists around the world seem quite content building models for "their" watersheds to move our discipline forward. Hydrological models tend to have a hybrid character with respect to underlying physics. Most models make use of some well established physical principles, such as mass and energy balances. One could argue that such principles are based on many observations, and therefore add data. These physical principles, however, are applied to hydrological models that often contain concepts that have no direct counterpart in the observable physical universe, such as "buckets" or "reservoirs" that fill up and empty out over time. These not-so-physical concepts are more like the Artificial Neural Networks and Support Vector Machines of the Artificial Intelligence (AI) community. Within AI, one quickly came to the realization that by increasing model complexity, one could basically fit any dataset but that complexity should be controlled in order to be able to predict unseen events. The more data are available to train or calibrate the model, the more complex it can be. Many complexity control approaches exist in AI, with Solomonoff inductive inference being one of the first formal approaches, the Akaike Information Criterion the most popular, and Statistical Learning Theory arguably being the most comprehensive practical approach. In hydrology, complexity control has hardly been used so far. There are a number of reasons for that lack of interest, the more valid ones of which will be presented during the presentation. For starters, there are no readily available complexity measures for our models. Second, some unrealistic simplifications of the underlying complex physics tend to have a smoothing effect on possible model

  3. Residual radioactivity measurements at Indus accelerator complex

    Indus-1 and Indus-2 are two Synchrotron Radiation Sources (SRS) operational at RRCAT, Indore. Indus-1 and Indus-2 are designed for maximum electron beam energies of 450 MeV and 2.5 GeV, respectively. During shutdowns of these accelerators for maintenance, residual radioactivity measurements were carried out. The residual radioactivity formed in various parts of high energy electron accelerators is due to the beam loss taking place at these locations. The present paper describes the recent residual radioactivity measurements carried out at the electron accelerators of the Indus Accelerator Complex and the radio-isotopes identified. The maximum dose rate due to induced activity is 30 μSv/h, near dipole-5 of the booster synchrotron after 12 h of cooling time. In the case of the Indus-1 and Indus-2 SRS, the dose rate due to induced radioactivity is found to be of the order of 2-3 μSv/h. The radio-isotopes identified at these beam loss locations are beta emitters that do not pose a serious external hazard to the working personnel. However, precautions are to be observed while doing maintenance on activated components. The paper describes the measurements in detail with the results. (author)

  4. Measure of robustness for complex networks

    Youssef, Mina Nabil

    Critical infrastructures are repeatedly attacked by external triggers causing tremendous amounts of damage. Any infrastructure can be studied using the powerful theory of complex networks. A complex network is composed of an extremely large number of different elements that exchange commodities providing significant services. The main functions of complex networks can be damaged by different types of attacks and failures that degrade the network performance. These attacks and failures are considered as disturbing dynamics, such as the spread of viruses in computer networks, the spread of epidemics in social networks, and the cascading failures in power grids. Depending on the network structure and the attack strength, every network suffers damage and performance degradation differently. Hence, quantifying the robustness of complex networks becomes an essential task. In this dissertation, new metrics are introduced to measure the robustness of technological and social networks with respect to the spread of epidemics, and the robustness of power grids with respect to cascading failures. First, we introduce a new metric called the Viral Conductance (VC_SIS) to assess the robustness of networks with respect to the spread of epidemics that are modeled through the susceptible/infected/susceptible (SIS) epidemic approach. In contrast to assessing the robustness of networks based on a classical metric, the epidemic threshold, the new metric integrates the fraction of infected nodes at steady state for all possible effective infection strengths. Through examples, VC_SIS provides more insights about the robustness of networks than the epidemic threshold. In addition, both the paradoxical robustness of Barabasi-Albert preferential attachment networks and the effect of the topology on the steady state infection are studied, to show the importance of quantifying the robustness of networks. Second, a new metric VC_SIR is introduced to assess the robustness of networks with respect

  5. A New Method for Measurement and Reduction of Software Complexity

    SHI Yindun; XU Shiyi

    2007-01-01

    This paper develops an improved structural software complexity metric named information flow complexity, which is closely related to the reliability of software. Together with the three software complexity metrics, the total software complexity is measured, and some rules to reduce the complexity are presented in the paper. To illustrate and explain the process of measurement and reduction of software complexity, several examples and experiments are given. It is proposed that software complexity metrics can be measured earlier in software development and can provide substantial information about software systems whose reliability can be modeled and used in the determination of initial parameter estimation.

  6. Laser beam complex amplitude measurement by phase diversity

    Védrenne, Nicolas; Mugnier, Laurent M.; Michau, Vincent; Velluet, Marie-Thérèse; Bierent, Rudolph

    2014-01-01

    The control of the optical quality of a laser beam requires a complex amplitude measurement able to deal with strong modulus variations and potentially highly perturbed wavefronts. The method proposed here consists of an extension of phase diversity to complex amplitude measurements that is effective for highly perturbed beams. Named CAMELOT, for Complex Amplitude MEasurement by a Likelihood Optimization Tool, it relies on the acquisition and processing of a few images of the beam section taken ...

  7. Complexity analysis in particulate matter measurements

    Luciano Telesca

    2011-09-01

    We investigated the complex temporal fluctuations of particulate matter data recorded in the London area by using the Fisher-Shannon (FS) information plane. In the FS plane the PM10 and PM2.5 data are aggregated in two different clusters, characterized by different degrees of order and organization. These results could be related to different sources of the particulate matter.

  8. An entropy based measure for comparing distributions of complexity

    Rajaram, R.; Castellani, B.

    2016-07-01

    This paper is part of a series addressing the empirical/statistical distribution of the diversity of complexity within and amongst complex systems. Here, we consider the problem of measuring the diversity of complexity in a system, given its ordered range of complexity types i and their probability of occurrence p_i, with the understanding that larger values of i mean a higher degree of complexity. To address this problem, we introduce a new complexity measure called case-based entropy Cc, a modification of the Shannon-Wiener entropy measure H. The utility of this measure is that, unlike current complexity measures, which focus on the macroscopic complexity of a single system, Cc can be used to empirically identify and measure the distribution of the diversity of complexity within and across multiple natural and human-made systems, as well as the diversity contribution of complexity of any part of a system, relative to the total range of ordered complexity types.
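
    The abstract defines case-based entropy Cc only as a modification of the Shannon-Wiener entropy H, without giving its formula. As background, the sketch below computes the plain Shannon-Wiener H over the probabilities p_i of an ordered set of complexity types, which is the quantity Cc modifies; the example distributions are hypothetical.

      # Plain Shannon-Wiener entropy H over the probabilities p_i of ordered complexity types.
      import math

      def shannon_wiener(p):
          """H = -sum_i p_i * ln(p_i) for a probability distribution p."""
          return -sum(pi * math.log(pi) for pi in p if pi > 0)

      # Hypothetical distributions over three complexity types, ordered from least to most complex.
      print(shannon_wiener([0.7, 0.2, 0.1]))   # low diversity of complexity
      print(shannon_wiener([1/3, 1/3, 1/3]))   # maximal diversity for three types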

  9. Unraveling chaotic attractors by complex networks and measurements of stock market complexity

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a similar meaning to the Kolmogorov complexity (calculated from the Lempel–Ziv complexity), and is an appropriate measure of a series' complexity. The proposed method is used to research the complexity of the world's major capital markets. None of these markets are completely random, and they have different degrees of complexity, both over the entire length of their time series and at a level of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price memory process

  10. A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters

    Johnsson, P.A.; Lord, D.G.

    1987-01-01

    ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductances are performed as checks on the analytical accuracy of chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses. (Author's abstract)
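
    As an illustration of the ionic-balance check mentioned in this abstract (this is not ARCHEM's code), the sketch below applies the standard charge-balance-error formula to a hypothetical dilute acidic water analysis, with concentrations in milliequivalents per litre; the sample values are invented.

      # Standard charge-balance-error screen of the kind ARCHEM performs (illustrative only).
      def charge_balance_error(cations, anions):
          """Percent error: 100 * (sum cations - sum anions) / (sum cations + sum anions)."""
          total_cat, total_an = sum(cations.values()), sum(anions.values())
          return 100.0 * (total_cat - total_an) / (total_cat + total_an)

      cations = {"H+": 0.040, "Ca2+": 0.030, "Mg2+": 0.015, "Na+": 0.020, "K+": 0.005}
      anions = {"SO4^2-": 0.055, "NO3-": 0.025, "Cl-": 0.020, "organic anion": 0.008}
      print(round(charge_balance_error(cations, anions), 1), "% charge-balance error")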

  11. Measuring control structure complexity through execution sequence grammars

    MacLennan, Bruce J.

    1981-01-01

    A method for measuring the complexity of control structures is presented. It is based on the size of a grammar describing the possible execution sequences of the control structure. This method is applied to a number of control structures, including Pascal's control structures, Dijkstra's operators, and a structure recently proposed by Parnas. The verification of complexity measures is briefly discussed. (Author)

  12. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  13. Communication line for an image scanning and measurement Complex

    A complex for the on-line processing of film information obtained by photographing events in bubble chambers is described. The complex comprises an image scanning and measurement apparatus (5 SAMET image scanning and measurement tables and 2 VT-340 alphanumeric displays), an electronic computer, and a data transmission line consisting of the communication line itself, two buffer shaping amplifiers, and interfaces. The flowsheet of the communication line of this complex is presented.

  14. SAT is a problem with exponential complexity measured by negentropy

    Pan, Feng(Department of Physics, Liaoning Normal University, Dalian 116029, China)

    2014-01-01

    In this paper, the reason why entropy reduction (negentropy) can be used to measure the complexity of any computation was first elaborated, both from the standpoint of mathematics and from that of informational physics. At the same time, the equivalence of computation and information was clearly stated. Then the complexities of three specific problems (logical comparison, sorting, and SAT) were analyzed and measured. The result showed that SAT is a problem with exponential complexity, which naturally leads to the conclusio...

  15. Digital System for Complex Bioimpedance Measurement

    Verner, Petr

    Brno: Vysoké učení technické v Brně, 2004 (Boušek, J.; Háze, J.), pp. 149-152. ISBN 80-214-2701-9. [EDS '04 /11./ Electronic Devices and Systems Conference, Brno (CZ), 09.09.2004-10.09.2004] R&D Projects: GA ČR GA102/00/1262. Keywords: bioimpedance measurement; digital receiver; cardiac output. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering

  16. Complex permittivity measurements of ferroelectrics employing composite dielectric resonator technique.

    Krupka, Jerzy; Zychowicz, Tomasz; Bovtun, Viktor; Veljko, Sergiy

    2006-10-01

    A composite cylindrical TE(0n1) mode dielectric resonator has been used for complex permittivity measurements of ferroelectrics at a frequency of about 8.8 GHz. Rigorous equations have been derived that allow a relationship to be found between the measured resonance frequency and Q-factor and the complex permittivity. It has been shown that the choice of an appropriate sample diameter, together with rigorous complex angular frequency analysis, allows precise measurements of various ferroelectrics. The proposed technique can be used for materials having both real and imaginary parts of the permittivity as large as a few thousand. Variable-temperature measurements were performed on a PbMg(1/3)Nb(2/3)O3 (PMN) ceramic sample, and the measured complex permittivity shows good agreement with the results of measurements obtained on the same sample at lower frequencies (0.1-1.8 GHz). PMID:17036796

  17. A Simple Complexity Measurement for Software Verification and Software Testing

    Cheng, Zheng; Monahan, Rosemary; Power, James F.

    2012-01-01

    In this paper, we used a simple metric (i.e. Lines of Code) to measure the complexity involved in software verification and software testing. The goal is then, to argue for software verification over software testing, and motivate a discussion of how to reduce the complexity involved in software verification. We propose to reduce this complexity by translating the software to a simple intermediate representation which can be verified using an efficient verifier, such as Boog...

  18. Complexity measure for the Prototype System Description Language (PSDL)

    Dupont, Joseph P.

    2002-01-01

    "We often misunderstand, ill define or improperly measure the complexity of software. Software complexity is represented by the degree of complication of a system determined by such factors as control flow, information flow, the degree of nesting, the types of data structures, and other system characteristics, such as unconventional architectures. However, a common notion of software complexity fulfills a non-functional requirement, that of understandability. How well do we understand the...

  19. SYNAPTONEMAL COMPLEX DAMAGE AS A MEASURE OF GENOTOXICITY AT MEIOSIS

    Synaptonemal complex aberrations can provide a sensitive measure of chemical-specific alterations to meiotic chromosomes. Mitomycin C, cyclophosphamide, amsacrine, ellipticine, colchicine, vinblastine sulfate, and cis-platin exposures in mice have been shown to cause various patt...

  20. High Dynamic Range Complex Impedance Measurement System for Petrophysical Usage

    Chen, R.; He, X.; Yao, H.; Tan, S.; Shi, H.; Shen, R.; Yan, C.; Zeng, P.; He, L.; Qiao, N.; Xi, F.; Zhang, H.; Xie, J.

    2015-12-01

    The spectral induced polarization (SIP) method, or complex resistivity method, is finding increasing application in metalliferous ore exploration, hydrocarbon exploration, groundwater exploration, monitoring of environmental pollution, and the evaluation of environmental remediation. Measurement of the complex resistivity or complex impedance of rock/ore samples and polluted water plays a fundamental role in improving both the effectiveness and the scope of SIP applications. However, current instruments cannot guarantee measurement accuracy when the resistance of the sample is less than 10 Ω or greater than 100 kΩ. Many samples, such as liquids, polluted sea water, igneous rock, limestone, and sandstone, cannot be measured with reliable complex resistivity results. This problem therefore casts a shadow over both basic and applied SIP research. We design a high-precision measurement system based on a study of the measurement principle, the sample holder, and the measurement instrument. Input buffers are designed on a single board, using the operational amplifier AD549 because of its ultra-high input impedance and ultra-low current noise; this buffer is well suited to acquiring the potential signal across a high-impedance sample. By analyzing the sources of measurement error and the errors generated by the measurement system, we propose a correction method to remove the error and achieve high-quality complex impedance measurements of rock and ore samples. This measurement system extends the measurable complex impedance range to 0.1 Ω to 10 GΩ, with amplitude error less than 0.1% and phase error less than 0.1 mrad over the frequency range 0.01 Hz to 1 kHz. We tested the system on resistors of 0.1 Ω to 10 GΩ over the frequency range 1 Hz to 1000 Hz, and the measurement error is less than 0.1 mrad. We also compared the results with an LCR bridge and SCIP, finding that the bridge's measuring range only reaches 100 MΩ, SCIP's measuring range

  1. Confidence bounds of recurrence-based complexity measures

    Schinkel, Stefan [Interdisciplinary Centre for Dynamics of Complex Systems, University of Potsdam (Germany)], E-mail: schinkel@agnld.uni-potsdam.de; Marwan, N. [Interdisciplinary Centre for Dynamics of Complex Systems, University of Potsdam (Germany); Potsdam Institute for Climate Impact Research (PIK) (Germany); Dimigen, O. [Department of Psychology, University of Potsdam (Germany); Kurths, J. [Potsdam Institute for Climate Impact Research (PIK) (Germany); Department of Physics, Humboldt University at Berlin (Germany)

    2009-06-15

    In the recent past, recurrence quantification analysis (RQA) has gained an increasing interest in various research areas. The complexity measures the RQA provides have been useful in describing and analysing a broad range of data. It is known to be rather robust to noise and nonstationarities. Yet, one key question in empirical research concerns the confidence bounds of measured data. In the present Letter we suggest a method for estimating the confidence bounds of recurrence-based complexity measures. We study the applicability of the suggested method with model and real-life data.

  2. One Single Static Measurement Predicts Wave Localization in Complex Structures

    Lefebvre, Gautier; Gondel, Alexane; Dubois, Marc; Atlan, Michael; Feppon, Florian; Labbé, Aimé; Gillot, Camille; Garelli, Alix; Ernoult, Maxence; Mayboroda, Svitlana; Filoche, Marcel; Sebbah, Patrick

    2016-08-01

    A recent theoretical breakthrough has brought a new tool, called the localization landscape, for predicting the localization regions of vibration modes in complex or disordered systems. Here, we report on the first experiment which measures the localization landscape and demonstrates its predictive power. Holographic measurement of the static deformation under uniform load of a thin plate with complex geometry provides direct access to the landscape function. When put in vibration, this system shows modes precisely confined within the subregions delineated by the landscape function. Also the maxima of this function match the measured eigenfrequencies, while the minima of the valley network gives the frequencies at which modes become extended. This approach fully characterizes the low frequency spectrum of a complex structure from a single static measurement. It paves the way for controlling and engineering eigenmodes in any vibratory system, especially where a structural or microscopic description is not accessible.

  3. Network Decomposition and Complexity Measures: An Information Geometrical Approach

    Masatoshi Funabashi

    2014-07-01

    We consider the graph representation of a stochastic model with n binary variables, and develop an information-theoretical framework to measure the degree of statistical association existing between subsystems as well as that represented by each edge of the graph representation. In addition, we consider novel measures of complexity with respect to the decomposability of the system, introduced via the geometric product of Kullback–Leibler (KL) divergences. The novel complexity measures satisfy the boundary condition of vanishing in the limits of completely random and completely ordered states, and also in the presence of an independent subsystem of any size. Such complexity measures, based on geometric means, are relevant to the heterogeneity of dependencies between subsystems and to the amount of information propagation shared across the entire system.

  4. A Collection of Complex Permittivity and Permeability Measurements

    Barry, W.; Byrd, J.; Johnson, J.; Smithwick, J.

    1993-02-01

    We present the results of measurements of the complex permittivity and permeability over a frequency range of 0.1-5.1 GHz for a range of microwave absorbing materials used in a variety of accelerator applications. We also describe the automated measurement technique which uses swept-frequency S-parameter measurements made on a strip transmission line device loaded with the material under test.

  5. Measuring logic complexity can guide pattern discovery in empirical systems

    Gherardi, Marco

    2016-01-01

    We explore a definition of complexity based on logic functions, which are widely used as compact descriptions of rules in diverse fields of contemporary science. Detailed numerical analysis shows that (i) logic complexity is effective in discriminating between classes of functions commonly employed in modelling contexts; (ii) it extends the notion of canalisation, used in the study of genetic regulation, to a more general and detailed measure; (iii) it is tightly linked to the resilience of a function's output to noise affecting its inputs. We demonstrate its utility by measuring it in empirical data on gene regulation, digital circuitry, and propositional calculus. Logic complexity is exceptionally low in these systems. The asymmetry between "on" and "off" states in the data correlates with the complexity in a non-null way; a model of random Boolean networks clarifies this trend and indicates a common hierarchical architecture in the three systems.

  6. Minimal classical communication and measurement complexity for quantum information splitting

    We present two quantum information splitting schemes using respectively tripartite GHZ and asymmetric W states as quantum channels. We show that if the secret state is chosen from a special ensemble and known to the sender (Alice), then she can split and distribute it to the receivers Bob and Charlie by performing only a single-qubit measurement and broadcasting a one-cbit message. It is clear that no other schemes could possibly achieve the same goal with simpler measurement and less classical communication. In comparison, existing schemes work for arbitrary quantum states which need not be known to Alice; however she is required to perform a two-qubit Bell measurement and communicate a two-cbit message. Hence there is a trade-off between flexibility and measurement complexity plus classical resource. In situations where our schemes are applicable, they will greatly reduce the measurement complexity and at the same time cut the communication overhead by one half

  7. A Measure of Learning Model Complexity by VC Dimension

    WANG Wen-jian; ZHANG Li-xia; XU Zong-ben

    2002-01-01

    When developing models there is always a trade-off between model complexity and model fit. In this paper, a measure of learning model complexity based on VC dimension is presented, and some relevant mathematical theory surrounding the derivation and use of this metric is summarized. The measure allows modelers to control the amount of error that is returned from a modeling system and to state upper bounds on the amount of error that the modeling system will return on all future, as yet unseen and uncollected data sets. It is possible for modelers to use the VC theory to determine which type of model more accurately represents a system.

  8. A new complexity measure for time series analysis and classification

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
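
    Following the description above, the sketch below is a minimal, illustrative implementation of ETC: it repeatedly replaces the most frequent adjacent pair of symbols with a new symbol (NSRPS) and counts the iterations needed to reach a constant sequence. Tie-breaking and other details may differ from the authors' implementation.

      # Illustrative ETC: count NSRPS iterations until the sequence becomes constant.
      from collections import Counter
      from itertools import count

      def etc(sequence):
          """Number of NSRPS iterations needed to reduce `sequence` to a constant sequence."""
          seq = list(sequence)
          new_symbol = count(start=10**6)  # fresh symbols that cannot clash with the input
          iterations = 0
          while len(set(seq)) > 1:
              target = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
              sym = next(new_symbol)
              out, i = [], 0
              while i < len(seq):
                  if i + 1 < len(seq) and (seq[i], seq[i + 1]) == target:
                      out.append(sym)   # substitute the pair, non-overlapping, left to right
                      i += 2
                  else:
                      out.append(seq[i])
                      i += 1
              seq = out
              iterations += 1
          return iterations

      print(etc("0101010101"))  # regular sequence: few iterations
      print(etc("0110100110"))  # more irregular sequence: more iterations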

  9. Defining statistical relative complexity measure: Application to diversity in atoms

    A statistical relative complexity measure is proposed, based on the Kullback-Leibler distance measure defining the relative information and on the Carbo quantum similarity index defining the relative disequilibrium. It is shown that, with the specific choice of prior density corresponding to the atom at the beginning of the subshell, this measure reveals the diversity of atoms as the subshells are filled across the periodic table. Numerical tests are reported using the non-relativistic Hartree-Fock as well as the relativistic Dirac-Fock density for all atoms in the periodic table. Highlights: A statistical relative complexity measure is introduced, with the Kullback-Leibler relative information as numerator and the Carbo quantum similarity, as relative disequilibrium, as denominator. The prior density is set as the atom at the beginning of the subshell in the periodic table. The diversity of atoms as the subshells are filled is revealed.

  10. On bias of kinetic temperature measurements in complex plasmas

    Kantor, M.; Moseev, D.; Salewski, Mirko

    2014-01-01

    The kinetic temperature in complex plasmas is often measured using particle tracking velocimetry. Here, we introduce a criterion which minimizes the probability of faulty tracking of particles with normally distributed random displacements in consecutive frames. Faulty particle tracking results in a measurement bias of the deduced velocity distribution function and hence the deduced kinetic temperature. For particles with a normal velocity distribution function, mistracking biases the obtained velocity distribution function towards small velocities at the expense of large velocities, i...

  11. The Generalization Complexity Measure for Continuous Input Data

    Iván Gómez

    2014-01-01

    The Generalization Complexity measure, originally defined in Boolean space, quantifies the complexity of data in relation to the prediction accuracy that can be expected when using a supervised classifier such as a neural network or an SVM. We first extend the original measure for use with continuous functions and then, using an approach based on the set of Walsh functions, consider the case of a finite number of data points (input/output pairs), which is usually the practical case. Using a set of trigonometric functions, a model that relates the size of the hidden layer of a neural network to the complexity is constructed. Finally, we demonstrate the application of the introduced complexity measure, via the generated model, to the problem of estimating an adequate neural network architecture for real-world data sets.

  12. Complex dielectric constant measurements by the microwave resonant cavities method

    A complex dielectric constant measurement method for solids, using cylindrical and parallelepipedic microwave resonant cavities, is presented. This method provides high accuracy when calculating the value of ε* for dielectric, semiconductor, ferroelectric and ferromagnetic materials. The paper contains a short theoretical treatment, a description of the experimental method, and some experimental results obtained in the frequency band (19500 MHz). (author)

  13. A SHARC based ROB Complex : design and measurement results

    Boterenbrood, H; Kieft, G; Scholte, R; Slopsema, R; Vermeulen, J C

    2000-01-01

    ROB hardware, based on and exploiting the properties of the SHARC DSP and of FPGAs, and the associated software are described. Results from performance measurements and an analysis of the results for a single ROBIn as well as for a ROB Complex with up to 4 ROBIns are presented.

  14. Assessment of Complex Performances: Limitations of Key Measurement Assumptions.

    Delandshere, Ginette; Petrosky, Anthony R.

    1998-01-01

    Examines measurement concepts and assumptions traditionally used in educational assessment, using the Early Adolescence/English Language Arts assessment developed for the National Board for Professional Teaching Standards as a context. The use of numerical ratings in complex performance assessment is questioned. (SLD)

  15. Effect of ions on sulfuric acid-water binary particle formation: 1. Theory for kinetic- and nucleation-type particle formation and atmospheric implications

    Merikanto, Joonas; Duplissy, Jonathan; Määttänen, Anni; Henschel, Henning; Donahue, Neil M.; Brus, David; Schobesberger, Siegfried; Kulmala, Markku; Vehkamäki, Hanna

    2016-02-01

    We derive a version of Classical Nucleation Theory normalized by quantum chemical results on sulfuric acid-water hydration to describe neutral and ion-induced particle formation in the binary sulfuric acid-water system. The theory is extended to treat the kinetic regime where the nucleation free energy barrier vanishes at high sulfuric acid concentrations or low temperatures. In the kinetic regime particle formation rates become proportional to the sulfuric acid concentration to the second power in the neutral system, or to the first power in the ion-induced system. We derive simple general expressions for the prefactors in kinetic-type and activation-type particle formation calculations applicable also to more complex systems stabilized by other species. The theory predicts that the binary water-sulfuric acid system can produce strong new particle formation in the free troposphere both through barrier crossing and through kinetic pathways. At cold stratospheric and upper free tropospheric temperatures neutral formation dominates the binary particle formation rates. At midtropospheric temperatures the ion-induced pathway becomes the dominant mechanism. However, even the ion-induced binary mechanism does not produce significant particle formation in warm boundary layer conditions, as it requires temperatures below 0°C to take place at atmospheric concentrations. The theory successfully reproduces the characteristics of measured charged and neutral binary particle formation in the CERN CLOUD3 and CLOUD5 experiments, as discussed in a companion paper.

  16. Measuring the Complexity of Self-organizing Traffic Lights

    Zubillaga, Dario; Aguilar, Luis Daniel; Zapotecatl, Jorge; Fernandez, Nelson; Aguilar, Jose; Rosenblueth, David A; Gershenson, Carlos

    2014-01-01

    We apply measures of complexity, emergence and self-organization to an abstract city traffic model for comparing a traditional traffic coordination method with a self-organizing method in two scenarios: cyclic boundaries and non-orientable boundaries. We show that the measures are useful to identify and characterize different dynamical phases. It becomes clear that different operation regimes are required for different traffic demands. Thus, not only is traffic a non-stationary problem, which requires controllers to adapt constantly; controllers must also drastically change the complexity of their behavior depending on the demand. Based on our measures, we can say that the self-organizing method achieves an adaptability level comparable to a living system.

  17. Measuring the Complexity of Self-Organizing Traffic Lights

    Darío Zubillaga

    2014-04-01

    We apply measures of complexity, emergence, and self-organization to an urban traffic model for comparing a traditional traffic-light coordination method with a self-organizing method in two scenarios: cyclic boundaries and non-orientable boundaries. We show that the measures are useful to identify and characterize different dynamical phases. It becomes clear that different operation regimes are required for different traffic demands. Thus, not only is traffic a non-stationary problem, requiring controllers to adapt constantly; controllers must also change drastically the complexity of their behavior depending on the demand. Based on our measures and extending Ashby’s law of requisite variety, we can say that the self-organizing method achieves an adaptability level comparable to that of a living system.

  18. Complexity-Entropy Causality Plane as a Complexity Measure for Two-dimensional Patterns

    Ribeiro, H V; Lenzi, E K; Santoro, P A; Mendes, R S; 10.1371/journal.pone.0040689

    2012-01-01

    Complexity measures are essential to understand complex systems and there are numerous definitions to analyze one-dimensional data. However, extensions of these approaches to two or higher-dimensional data, such as images, are much less common. Here, we reduce this gap by applying the ideas of the permutation entropy combined with a relative entropic index. We build up a numerical procedure that can be easily implemented to evaluate the complexity of two or higher-dimensional patterns. We work out this method in different scenarios where numerical experiments and empirical data were taken into account. Specifically, we have applied the method to i) fractal landscapes generated numerically where we compare our measures with the Hurst exponent; ii) liquid crystal textures where nematic-isotropic-nematic phase transitions were properly identified; iii) 12 characteristic textures of liquid crystals where the different values show that the method can distinguish different phases; iv) and Ising surfaces where our m...
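
    A hedged sketch of the entropy half of this approach is given below: a two-dimensional permutation entropy computed from the ordinal patterns of small sliding windows (the full method also requires the relative entropic index, i.e. the disequilibrium, which is omitted here). The window size and test patterns are illustrative choices, not the authors' settings; NumPy is assumed.

      # Two-dimensional permutation entropy from ordinal patterns of sliding windows.
      import math
      from collections import Counter
      import numpy as np

      def permutation_entropy_2d(data, dx=2, dy=2):
          """Normalized Shannon entropy of the ordinal patterns of dx-by-dy windows."""
          data = np.asarray(data, dtype=float)
          rows, cols = data.shape
          patterns = Counter()
          for i in range(rows - dy + 1):
              for j in range(cols - dx + 1):
                  window = data[i:i + dy, j:j + dx].ravel()
                  patterns[tuple(np.argsort(window))] += 1
          total = sum(patterns.values())
          h = -sum((c / total) * math.log(c / total) for c in patterns.values())
          return h / math.log(math.factorial(dx * dy))  # normalize to [0, 1]

      rng = np.random.default_rng(0)
      noise = rng.random((64, 64))                           # disordered pattern
      gradient = np.add.outer(np.arange(64), np.arange(64))  # fully ordered pattern
      print(permutation_entropy_2d(noise))     # close to 1
      print(permutation_entropy_2d(gradient))  # close to 0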

  19. A Method for Measuring the Structure Complexity of Web Application

    2006-01-01

    Precise and effective measurement results for Web applications not only facilitate good comprehension of them, but also benefit the macro-management of software activities such as testing, reverse engineering, and reuse. The paper presents research on measuring the structure complexity of Web applications. Through a deep analysis of the configuration and object interactions of Web systems, two conclusions are drawn: (1) a generic Web application consists of static web pages, dynamic pages, components, and database objects; (2) the main interactions have only three styles, namely static links, dynamic links, and call/return relations. Based on analysis and modeling of the content of a Web page (static or dynamic), complexity measurement methods for both the control logic of scripts and the nesting of HTML code are further discussed. In addition, two methods for measuring the complexity of inter-page navigation are addressed by modeling the inter-page navigation behavior of Web applications via a WNG graph.

  20. The step complexity measure its meaning and applications

    Related studies have revealed that procedural error plays a significant role in initiating accidents or incidents. This means that, to maximize safety, it is indispensable to be able to answer the question of why operators commit procedural errors. In this study, the SC (Step Complexity) measure is introduced and its applicability to studying procedural error is investigated, since it has been shown that changes in operators' performance are strongly correlated with changes in SC scores. This means that the SC measure could play an important role in research on procedural error, since it is strongly believed that complicated procedures affect both operators' performance and the likelihood of procedural error. To substantiate this expectation, the meaning of the SC measure is examined through brief explanations of its necessity, theoretical basis, and verification activities. As a result, it is quite plausible that the SC measure can be used to explain changes in operators' performance due to the task complexity implied by procedures. In addition, the SC measure may be useful for various purposes, particularly for scrutinizing the relationship between procedural error and complicated procedures

  1. Measuring system complexity to support development cost estimates

    Malone, P.; Wolfarth, L.

    Systems and System-of-Systems (SoS) are being used more frequently, either as design elements of stand-alone systems or as architectural frameworks. Consequently, a programmatic need has arisen to understand and measure system complexity in order to estimate development plans and life-cycle costs more accurately. In a prior paper, we introduced the System Readiness Level (SRL) concept as a composite function of both Technology Readiness Levels (TRLs) and Integration Readiness Levels (IRLs) and touched on system complexity. While the SRL approach provides a repeatable, process-driven method to assess the maturity of a system or SoS, it does not capture all aspects of system complexity. In this paper we assess the concept of cyclomatic complexity as a system complexity metric and consider its utility as an approach for estimating the life-cycle costs and cost growth of complex systems. We hypothesize that the greater the number of technologies and integration tasks, the more complex the system and the higher its cost to develop and maintain. We base our analysis on historical data from DoD programs that have experienced significant cost growth, including some that have been cancelled due to unsustainable cost (and schedule) growth. We begin by describing the original implementation of the cyclomatic method, which was developed to estimate the effort to maintain system software. We then describe how the method can be generalized and applied to systems. Next, we show how to estimate the cyclomatic number (CN) and examine the statistical relationship between a system's CN metric and its cost. We illustrate the method with an example. Last, we discuss opportunities for future research.
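
    As a rough illustration of the metric discussed above, the classic McCabe cyclomatic number of a graph is CN = E - N + 2P (edges, nodes, connected components). The sketch below applies that textbook formula to a hypothetical system graph whose nodes are technologies and whose edges are integration tasks, in the spirit of the hypothesis stated in the abstract; the node and edge counts are invented for illustration and are not taken from the paper.

        def cyclomatic_number(num_nodes: int, num_edges: int, num_components: int = 1) -> int:
            """Classic McCabe cyclomatic number: CN = E - N + 2P."""
            return num_edges - num_nodes + 2 * num_components

        # Hypothetical system: 6 technologies (nodes) joined by 8 integration tasks (edges).
        print(cyclomatic_number(num_nodes=6, num_edges=8))  # -> 4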

  2. Applications of fidelity measures to complex quantum systems.

    Wimberger, Sandro

    2016-06-13

    We revisit fidelity as a measure for the stability and the complexity of the quantum motion of single- and many-body systems. Within the context of cold atoms, we present an overview of applications of two fidelities, which we call static and dynamical fidelity, respectively. The static fidelity applies to quantum problems which can be diagonalized, since it is defined via the eigenfunctions. In particular, we show that the static fidelity is a highly effective practical detector of avoided crossings characterizing the complexity of the systems and their evolutions. The dynamical fidelity is defined via the time-dependent wave functions. Focusing on the quantum kicked rotor system, we highlight a few practical applications of fidelity measurements in order to better understand the large variety of dynamical regimes of this paradigm of a low-dimensional system with mixed regular-chaotic phase space. PMID:27140967

  3. Statistical analysis of complex systems with nonclassical invariant measures

    Fratalocchi, Andrea

    2011-02-28

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.

  4. Measuring complex problem solving: the MicroDYN approach

    Greiff, Samuel; Funke, Joachim

    2009-01-01

    In educational large-scale assessments such as PISA, an increasing interest in measuring cross-curricular competencies has emerged only recently. These competencies are now recognized as valuable aspects of school achievement. Complex problem solving (CPS) is an interesting construct for the diagnostics of domain-general competencies. Here, we present MicroDYN, a new approach for computer-based assessment of CPS. We introduce the new concept, describe appropriate software and present first results...

  5. Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

    Mihailovic, Dragutin T; Nikolic-Djoric, Emilija; Arsenic, Ilija

    2013-01-01

    We propose novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We consider the background of the Kolmogorov complexity and discuss the meaning of physical as well as other complexities. To get better insights into the complexity of complex systems and time series analysis we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: the model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of: river flow, long term precipitation, indoor 222Rn concentration and UV radiation dose) and economy (stock prices dynamics). Re...

  6. Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

    Mihailović Dragutin T.

    2015-01-01

    We propose novel metrics based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We consider the origins of the Kolmogorov complexity and discuss its physical meaning. To get better insights into the nature of complex systems and time series analysis we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: a model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of: river flow, long term precipitation, indoor 222Rn concentration and UV radiation dose) and the economy (stock price dynamics). The results obtained offer deeper insights into the complexity of system dynamics and time series analysis with the proposed complexity measures.
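
    The two records above describe the Kolmogorov complexity spectrum only verbally, so the following sketch shows one plausible reading of it: the series is normalized, binarized at each of its own values used as a threshold, and the Lempel-Ziv (Kaspar-Schuster) complexity of each resulting binary string forms the spectrum; the maximum and the mean of the spectrum then stand in for the "spectrum highest value" and an "overall" summary. The normalization and aggregation choices here are assumptions for illustration, not the authors' exact definitions.

        import numpy as np

        def lempel_ziv_complexity(s) -> float:
            """Normalized Lempel-Ziv complexity of a binary sequence (Kaspar-Schuster algorithm)."""
            s = list(s)
            n = len(s)
            c, l, i, k, k_max = 1, 1, 0, 1, 1
            while True:
                if s[i + k - 1] == s[l + k - 1]:
                    k += 1
                    if l + k > n:
                        c += 1
                        break
                else:
                    k_max = max(k, k_max)
                    i += 1
                    if i == l:
                        c += 1
                        l += k_max
                        if l + 1 > n:
                            break
                        i, k, k_max = 0, 1, 1
                    else:
                        k = 1
            return c * np.log2(n) / n

        def kolmogorov_complexity_spectrum(x):
            """Spectrum: LZ complexity of the normalized series thresholded at each of its values."""
            x = np.asarray(x, dtype=float)
            xn = (x - x.min()) / (x.max() - x.min())
            return np.array([lempel_ziv_complexity((xn >= t).astype(int)) for t in xn])

        rng = np.random.default_rng(0)
        spectrum = kolmogorov_complexity_spectrum(rng.random(500))
        print(spectrum.max(), spectrum.mean())  # "highest value" and an "overall" summary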

  7. On the extension of Importance Measures to complex components

    Importance Measures are indicators of the risk significance of the components of a system. They are widely used in various applications of Probabilistic Safety Analyses, off-line and on-line, in decision making for preventive and corrective purposes, as well as to rank components according to their contribution to the global risk. They are primarily defined for the case where the support model is a coherent fault tree and failures of components are described by basic events of this fault tree. In this article, we study their extension to complex components, i.e. components whose failures are modeled by a gate rather than just a basic event. Although quite natural, such an extension has not received much attention in the literature. We show that it raises a number of problems. The Birnbaum Importance Measure and the notion of Critical States concentrate these difficulties. We present alternative solutions for the extension of these notions. We discuss their respective advantages and drawbacks. This article gives a new point of view on the mathematical foundations of Importance Measures and helps to clarify their physical meaning. - Highlights: • We propose an extension of Importance Measures to complex components. • We define our extension in terms of minterms, i.e. states of the system. • We discuss the physical interpretation of Importance Measures in light of this interpretation

  8. OPEN PUBLIC SPACE ATTRIBUTES AND CATEGORIES – COMPLEXITY AND MEASURABILITY

    Ljiljana Čavić

    2014-12-01

    Within the field of architectural and urban research, this work addresses the complexity of contemporary public space, both in a conceptual and a concrete sense. It aims at systematizing spatial attributes and their categories and at discussing spatial complexity and measurability, in order to reach a more comprehensive understanding, description and analysis of public space. Our aim is to improve the everyday usage of open public space, and we acknowledge users as its crucial factor. There are numerous investigations of the complex urban and architectural reality of public space that recognise the importance of users. However, we did not find any that would holistically account for what users find essential in public space. Based on the incompleteness of existing approaches to open public space and the importance of users for its success, this paper proposes a user-orientated approach. Through an initial survey directed at users, we collected the most important aspects of public spaces as contemporary users see them. The gathered data are analysed and coded into spatial attributes, whose role in the complexity and measurability of open public space is then discussed. The work results in an inventory of attributes that users find salient in public spaces. It does not discuss their qualitative values or their contribution to generating spatial realities. It aims to define them clearly so that any further logical argumentation on open space concerning users may be solidly constructed. Finally, through a categorisation of attributes, it proposes the disciplinary levels necessary for the analysis of complex urban-architectural reality.

  9. Compositional segmentation and complexity measurement in stock indices

    Wang, Haifeng; Shang, Pengjian; Xia, Jianan

    2016-01-01

    In this paper, we introduce a complexity measure based on entropic segmentation, called sequence compositional complexity (SCC), into the analysis of financial time series. SCC was first used to deal directly with the complex heterogeneity of nonstationary DNA sequences, where it was found to be higher in sequences with strong long-range correlations than in those with weak long-range correlations. Here we apply this method to financial index data and find that the SCC values of some mature stock indices, such as the S&P 500 (abbreviated S&P in the following) and the HSI, tend to be lower than the SCC value of Chinese index data (such as the SSE). Moreover, when the indices are classified by SCC, the financial market of Hong Kong shows more similarity to mature foreign markets than to the Chinese ones. We therefore believe that a good correspondence exists between the SCC of an index sequence and the complexity of the market involved.

  10. Increment Entropy as a Measure of Complexity for Time Series

    Xiaofeng Liu

    2016-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce an increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign and the other corresponding to the magnitude. Increment entropy (IncrEn) is defined as the Shannon entropy of the words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, regardless of whether they are energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.
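
    Based only on the verbal description above, a minimal sketch of the increment-entropy idea might look as follows. The quantization rule (q magnitude levels scaled by the standard deviation of the increments) and the word length m are assumptions chosen for illustration, not necessarily the authors' exact parameters.

        from collections import Counter
        import numpy as np

        def increment_entropy(x, m: int = 2, q: int = 4) -> float:
            """Sketch of increment entropy: code each increment by (sign, quantized magnitude),
            form words of m consecutive codes, and return the Shannon entropy of the words."""
            inc = np.diff(np.asarray(x, dtype=float))
            scale = inc.std() if inc.std() > 0 else 1.0
            signs = np.sign(inc).astype(int)
            mags = np.minimum(q, np.floor(np.abs(inc) * q / scale)).astype(int)  # assumed rule
            letters = list(zip(signs, mags))
            words = [tuple(letters[i:i + m]) for i in range(len(letters) - m + 1)]
            counts = np.array(list(Counter(words).values()), dtype=float)
            p = counts / counts.sum()
            return float(-np.sum(p * np.log2(p)))

        rng = np.random.default_rng(1)
        print(increment_entropy(rng.standard_normal(1000)))          # irregular signal: higher IncrEn
        print(increment_entropy(np.sin(np.linspace(0, 20, 1000))))   # regular signal: lower IncrEn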

  11. Increment entropy as a measure of complexity for time series

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one letter corresponding to the direction and the other to the magnitude. The Shannon entropy of the words is termed the increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals have demonstrated its ability to detect abrupt changes, whether energetic (e.g. spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.

  12. Overcoming Problems in the Measurement of Biological Complexity

    Cebrian, Manuel; Ortega, Alfonso

    2010-01-01

    In a genetic algorithm, fluctuations of the entropy of a genome over time are interpreted as fluctuations of the information that the genome's organism is storing about its environment, this being reflected in more complex organisms. The computation of this entropy presents technical problems due to the small population sizes used in practice. In this work we propose and test an alternative way of measuring the entropy variation in a population by means of algorithmic information theory, where the entropy variation between two generational steps is the Kolmogorov complexity of the first step conditioned on the second one. As an example application of this technique, we report experimental differences in entropy evolution between systems in which sexual reproduction is present or absent.

  13. Atmospheric stability and complex terrain: comparing measurements and CFD

    Koblitz, Tilman; Bechmann, Andreas; Berg, Jacob;

    2014-01-01

    Buoyancy forces and heat transport are mostly ignored in state-of-the-art flow solvers. In order to decrease the uncertainty of wind resource assessment, the effect of thermal stratification on the atmospheric boundary layer should be included in such models. The present work focuses on non-neutral atmospheric flow over complex terrain, including physical processes like stability and the Coriolis force. We examine the influence of these effects on the whole atmospheric boundary layer using the DTU Wind Energy flow solver EllipSys3D. To validate the flow solver, measurements from Benakanahalli hill, a field experiment that took place in India in early 2010, are used. The experiment was specifically designed to address the combined effects of stability and Coriolis force over complex terrain, and provides a dataset to validate flow solvers. Including those effects into EllipSys3D significantly improves the...

  14. Determination of complex microcalorimeter parameters with impedance measurements

    The proper understanding and modeling of a microcalorimeter's response requires accurate knowledge of a handful of parameters, such as C, G and α. While a few of these parameters are directly determined from the IV characteristics, some others, notably the heat capacity (C) and α, appear in degenerate combinations in most measurable quantities. The consideration of a complex microcalorimeter leads to an added ambiguity in the determination of the parameters. In general, the dependence of the microcalorimeter's complex impedance on these various parameters varies with frequency. This dependence allows us to determine individual parameters by fitting the prediction of the microcalorimeter model to impedance data. In this paper we describe efforts at characterizing the Goddard X-ray microcalorimeters. With the parameters determined by this method, we compare the pulse shape and noise spectra predictions to data taken with the same devices.

  15. A new measure of heterogeneity for complex networks

    Jacob, Rinku; Misra, R; Ambika, G

    2016-01-01

    We propose a novel measure of heterogeneity for unweighted and undirected complex networks that can be derived from the degree distribution of the network instead of the degree sequences, as is done at present. We show that the proposed measure can be applied to all types of topology with ease and shows direct correlation with the diversity of node degrees in the network. The measure is mathematically well behaved and is normalised in the interval [0, 1]. The measure is applied to compute the heterogeneity of synthetic (both random and scale free) and real world networks. We specifically show that the heterogeneity of an evolving scale free network decreases as a power law with the size of the network N, implying a scale free character for the proposed measure. Finally, as a specific application, we show that the proposed measure can be used to compare the heterogeneity of recurrence networks constructed from the time series of several low dimensional chaotic attractors, thereby providing a single index to co...

  16. Entropies from Markov Models as Complexity Measures of Embedded Attractors

    Julián D. Arias-Londoño

    2015-06-01

    This paper addresses the problem of measuring complexity from embedded attractors as a way to characterize changes in the dynamical behavior of different types of systems with a quasi-periodic behavior by observing their outputs. With the aim of measuring the stability of the trajectories of the attractor along time, this paper proposes three new estimations of entropy that are derived from a Markov model of the embedded attractor. The proposed estimators are compared with traditional nonparametric entropy measures, such as approximate entropy, sample entropy and fuzzy entropy, which only take into account the spatial dimension of the trajectory. The method proposes the use of an unsupervised algorithm to find the principal curve, which is considered as the “profile trajectory”, that will serve to adjust the Markov model. The new entropy measures are evaluated using three synthetic experiments and three datasets of physiological signals. In terms of consistency and discrimination capabilities, the results show that the proposed measures perform better than the other entropy measures used for comparison purposes.

  17. Power Quality Measurement in a Modern Hotel Complex

    Velimir Strugar

    2010-06-01

    The paper presents an analysis of power quality characteristics of the 10 kV grid supplying a modern hotel complex on the Montenegrin Adriatic coast. The consumer is characterized by different types of loads, some of which have highly nonlinear characteristics: smart rooms, lift drives, modern hotel kitchen equipment, public electric lighting, audio, video and TV devices, etc. Such loads in the hotel complex may be a source of negative effects on power quality in the MV public distribution network (10 kV and 35 kV). In the first part of the paper, results of harmonic measurements at a 35/10 kV substation are presented. The measurements lasted one week under real operating conditions (in accordance with EN 50160). The results were the basis for developing a simulation model. The measurement results were analyzed and compared with the simulation ones. The application of a harmonic filter is simulated, and the filter's effect on harmonic levels is calculated and discussed using the simulation results.

  18. Automated imitating-measuring complex for designing and measuring characteristics of phased antenna arrays

    Usin, V.; Markov, V.; Pomazanov, S.; Usina, A.; Filonenko, A.

    2011-01-01

    This article considers the design principles, structure and technical characteristics of an automated imitating-measuring complex, and presents variants of its hardware and software implementation for selecting the APD, justifying tolerances, and estimating the influence of manufacturing errors, the discrete nature of control, and the mutual coupling of radiating elements on PAA parameters.

  19. Measuring complexity with multifractals in texts. Translation effects

    Highlights: ► Two texts in English and one in Esperanto are transformed into 6 time series. ► D(q) and f(alpha) of such (and shuffled) time series are obtained. ► A model for text construction is presented based on a parametrized Cantor set. ► The model parameters can also be used when examining machine translated texts. ► Suggested extensions to higher dimensions: in 2D image analysis and on hypertexts. - Abstract: Should quality be almost a synonym of complexity? To measure quality appears to be audacious, even very subjective. It is hereby proposed to use a multifractal approach in order to quantify quality, thus through complexity measures. A one-dimensional system is examined. It is known that (all) written texts can be represented as one-dimensional nonlinear maps. Thus, several written texts by the same author are considered, together with their translation into an unusual language, Esperanto, and, as a baseline, their corresponding shuffled versions. Different one-dimensional time series can be used: e.g. (i) one based on word lengths, (ii) the other based on word frequencies; both are used for studying, comparing and discussing the map structure. It is shown that a variety in style can be measured through the D(q) and f(α) curves characterizing multifractal objects. This allows one to observe, on the one hand, whether natural and artificial languages significantly influence the writing and the translation, and whether one author's texts differ technically from each other. In fact, the f(α) curves of the original texts are similar to each other, but the translated text shows marked differences. However, in each case the f(α) curves are far from being parabolic, in contrast to the shuffled texts. Moreover, the Esperanto text has more extreme values. Criteria are thereby suggested for estimating a text's quality, as if it were a time series only. A model is introduced in order to substantiate the findings: it consists in considering a text as a random Cantor set.

  20. Range-limited Centrality Measures in Complex Networks

    Ercsey-Ravasz, Maria; Chawla, Nitesh V; Toroczkai, Zoltan

    2011-01-01

    Here we present a range-limited approach to centrality measures in both non-weighted and weighted directed complex networks. We introduce an efficient method that generates, for every node and every edge, its betweenness centrality based on shortest paths of lengths not longer than $\ell = 1,...,L$ in the case of non-weighted networks, and for weighted networks the corresponding quantities based on minimum weight paths with path weights not larger than $w_{\ell}=\ell \Delta$, $\ell=1,2...,L=R/\Delta$. These measures provide a systematic description of the positional importance of a node (edge) with respect to its network neighborhoods 1 step out, 2 steps out, etc., up to and including the whole network. We show that range-limited centralities obey universal scaling laws for large non-weighted networks. As the computation of traditional centrality measures is costly, this scaling behavior can be exploited to efficiently estimate centralities of nodes and edges for all ranges, including the traditional ones. The scaling ...
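
    To make the range-limited idea concrete, here is a deliberately brute-force sketch (not the efficient algorithm of the paper) that accumulates node betweenness only over node pairs whose shortest-path distance does not exceed a range L; it assumes the networkx library and an unweighted, undirected graph, whereas the paper also treats weighted and directed cases.

        import itertools
        import networkx as nx

        def range_limited_betweenness(G: nx.Graph, L: int) -> dict:
            """Unnormalized node betweenness restricted to shortest paths of at most L edges."""
            bet = dict.fromkeys(G, 0.0)
            for s, t in itertools.combinations(G.nodes, 2):
                try:
                    paths = list(nx.all_shortest_paths(G, s, t))
                except nx.NetworkXNoPath:
                    continue
                if len(paths[0]) - 1 > L:          # shortest-path length in edges
                    continue
                for path in paths:                 # split credit equally among shortest paths
                    for v in path[1:-1]:
                        bet[v] += 1.0 / len(paths)
            return bet

        G = nx.barbell_graph(5, 2)
        print(range_limited_betweenness(G, L=2))   # short-range importance
        print(range_limited_betweenness(G, L=10))  # approaches ordinary betweenness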

  1. Digraph Complexity Measures and Applications in Formal Language Theory

    Gruber, Hermann

    2011-01-01

    We investigate structural complexity measures on digraphs, in particular the cycle rank. This concept is intimately related to a classical topic in formal language theory, namely the star height of regular languages. We explore this connection, and obtain several new algorithmic insights regarding both cycle rank and star height. Among other results, we show that computing the cycle rank is NP-complete, even for sparse digraphs of maximum outdegree 2. Notwithstanding, we provide both a polynomial-time approximation algorithm and an exponential-time exact algorithm for this problem. The former algorithm yields an O((log n)^(3/2))-approximation in polynomial time, whereas the latter yields the optimum solution and runs in time and space O*(1.9129^n) on digraphs of maximum outdegree at most two. Regarding the star height problem, we identify a subclass of the regular languages for which we can precisely determine the computational complexity of the star height problem. Namely, the star height problem for bidet...
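
    For intuition, the cycle rank mentioned above has a short recursive definition: it is 0 for acyclic digraphs; for a strongly connected digraph it is 1 plus the minimum cycle rank obtainable by deleting a single vertex; otherwise it is the maximum over the nontrivial strongly connected components. The sketch below implements that definition directly with networkx; it is exponential-time and only meant for very small digraphs, unlike the algorithms of the paper.

        import networkx as nx

        def cycle_rank(G: nx.DiGraph) -> int:
            """Cycle rank via the recursive definition (exponential time; small graphs only)."""
            if nx.is_directed_acyclic_graph(G):            # no cycles (self-loops count as cycles)
                return 0
            # Nontrivial strongly connected components: more than one node, or a self-loop.
            sccs = [c for c in nx.strongly_connected_components(G)
                    if len(c) > 1 or any(G.has_edge(v, v) for v in c)]
            if len(sccs) == 1 and len(sccs[0]) == G.number_of_nodes():
                # Strongly connected: delete the vertex whose removal helps the most.
                return 1 + min(cycle_rank(G.subgraph(set(G.nodes) - {v}).copy())
                               for v in G.nodes)
            return max(cycle_rank(G.subgraph(c).copy()) for c in sccs)

        G = nx.DiGraph([(0, 1), (1, 0), (1, 2), (2, 3), (3, 2)])
        print(cycle_rank(G))  # two disjoint 2-cycles -> cycle rank 1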

  2. Determination of Complex Microcalorimeter Parameters with Impedance Measurements

    Saab, T.; Bandler, S. R.; Chervenak, J.; Figueroa-Feliciano, E.; Finkbeiner, F.; Iyomoto, N.; Kelley, R.; Kilbourne, C. A.; Lindeman, M. A.; Porter, F. S.; Sadleir, J.

    2005-01-01

    The proper understanding and modeling of a microcalorimeter's response requires the accurate knowledge of a handful of parameters, such as C, G, alpha, ... While a few of these, such as the normal state resistance and the total thermal conductance to the heat bath (G), are directly determined from the DC IV characteristics, some others, notably the heat capacity (C) and alpha, appear in degenerate combinations in most measurable quantities. The case of a complex microcalorimeter, i.e. one in which the absorber's heat capacity is connected by a finite thermal impedance to the sensor, and subsequently by another thermal impedance to the heat bath, results in an added ambiguity in the determination of the individual C's and G's. In general, the dependence of the microcalorimeter's complex impedance on these parameters varies with frequency. This variation allows us to determine the individual parameters by fitting the prediction of the microcalorimeter model to the impedance data. We describe in this paper our efforts at characterizing the Goddard X-ray microcalorimeters. Using the parameters determined with this method, we then compare the pulse shape and noise spectra predicted by the microcalorimeter model to data taken with the same devices.

  3. Measuring robustness of community structure in complex networks

    Li, Hui-Jia; Chen, Luonan

    2015-01-01

    The theory of community structure is a powerful tool for real networks, which can simplify their topological and functional analysis considerably. However, since community detection methods have random factors and real social networks obtained from complex systems always contain error edges, evaluating the robustness of community structure is an urgent and important task. In this letter, we employ the critical threshold of the resolution parameter in the Hamiltonian function, $\gamma_C$, to measure the robustness of a network. According to spectral theory, a rigorous proof shows that the index we propose is inversely proportional to the robustness of community structure. Furthermore, by utilizing the co-evolution model, we provide a new efficient method for computing the value of $\gamma_C$. The research can be applied to broad clustering problems in network analysis and data mining due to its solid mathematical basis and experimental effects.

  4. Measuring the complex behavior of the SO2 oxidation reaction

    Muhammad Shahzad

    2015-09-01

    The two-step reversible chemical reaction involving five chemical species is investigated. The quasi equilibrium manifold (QEM) and the spectral quasi equilibrium manifold (SQEM) are used as initial approximations to simplify the mechanisms, which we want to utilize in order to investigate the behavior of the desired species. They give a meaningful picture, but for maximum clarity the method of invariant grid (MIG) is employed. These methods simplify the complex chemical kinetics and deduce a low dimensional manifold (LDM) from the high dimensional mechanism. The coverage of the species near the equilibrium point is investigated, and moving along the equilibrium of the ODEs is then discussed. The steady state behavior is observed and a Lyapunov function is utilized to study the stability of the ODEs. Graphical results are used to describe the physical aspects of the measurements.

  5. Measurement of complex supercontinuum light pulses using time domain ptychography

    Heidt, Alexander M; Brügmann, Michael; Rohwer, Erich G; Feurer, Thomas

    2016-01-01

    We demonstrate that time-domain ptychography, a recently introduced ultrafast pulse reconstruction modality, has properties ideally suited for the temporal characterization of complex light pulses with large time-bandwidth products as it achieves temporal resolution on the scale of a single optical cycle using long probe pulses, low sampling rates, and an extremely fast and robust algorithm. In comparison to existing techniques, ptychography minimizes the data to be recorded and processed, and drastically reduces the computational time of the reconstruction. Experimentally we measure the temporal waveform of an octave-spanning, 3.5 ps long supercontinuum pulse generated in photonic crystal fiber, resolving features as short as 5.7 fs with sub-fs resolution and 30 dB dynamic range using 100 fs probe pulses and similarly large delay steps.

  6. Permutation Complexity and Coupling Measures in Hidden Markov Models

    Taichi Haruna

    2013-09-01

    Recently, the duality between values (words) and orderings (permutations) has been proposed by the authors as a basis to discuss the relationship between information theoretic measures for finite-alphabet stationary stochastic processes and their permutation analogues. It has been used to give a simple proof of the equality between the entropy rate and the permutation entropy rate for any finite-alphabet stationary stochastic process and to show some results on the excess entropy and the transfer entropy for finite-alphabet stationary ergodic Markov processes. In this paper, we extend our previous results to hidden Markov models and show the equalities between various information theoretic complexity and coupling measures and their permutation analogues. In particular, we show the following two results within the realm of hidden Markov models with ergodic internal processes: the two permutation analogues of the transfer entropy, the symbolic transfer entropy and the transfer entropy on rank vectors, are both equivalent to the transfer entropy if they are considered as rates, and the directed information theory can be captured by the permutation entropy approach.

  7. A high accuracy broadband measurement system for time resolved complex bioimpedance measurements

    Bioimpedance measurements are useful tools in biomedical engineering and life science. Bioimpedance is the electrical impedance of living tissue and can be used in the analysis of various physiological parameters. Bioimpedance is commonly measured by injecting a small well known alternating current via surface electrodes into an object under test and measuring the resultant surface voltages. It is non-invasive, painless and has no known hazards. This work presents a field programmable gate array based high accuracy broadband bioimpedance measurement system for time resolved bioimpedance measurements. The system is able to measure magnitude and phase of complex impedances under test in a frequency range of about 10–500 kHz with excitation currents from 10 µA to 5 mA. The overall measurement uncertainties stay below 1% for the impedance magnitude and below 0.5° for the phase in most measurement ranges. Furthermore, the described system has a sample rate of up to 3840 impedance spectra per second. The performance of the bioimpedance measurement system is demonstrated with a resistor based system calibration and with measurements on biological samples. (paper)

  8. Introducing a Space Complexity Measure for P Systems

    Porreca, Antonio E.; Leporati, Alberto; Mauri, Giancarlo; Zandron, Claudio; Research Group on Natural Computing (Universidad de Sevilla) (Coordinador)

    2009-01-01

    We define space complexity classes in the framework of membrane computing, giving some initial results about their mutual relations and their connection with time complexity classes, and identifying some potentially interesting problems which require further research.

  9. Disassembling "evapotranspiration" in-situ with a complex measurement tool

    Chormanski, Jaroslaw; Kleniewska, Malgorzata; Berezowski, Tomasz; Sporak-Wasilewska, Sylwia; Okruszko, Tomasz; Szatylowicz, Jan; Batelaan, Okke

    2014-05-01

    In this work we present a complex tool for measuring water fluxes in wetland ecosystems. The tool was designed to quantify processes related to interception storage on plant leaves. The measurements are conducted by combining readings from various instruments, including an eddy covariance tower (EC), a field spectrometer, a SapFlow system, rain gauges above and under the canopy, soil moisture probes and others. The idea of this set-up is to provide continuous measurement of the overall water flux from the ecosystem (EC tower), intercepted water volume and timing (field spectrometers), through-fall (rain gauges above and under the canopy), transpiration (SapFlow), and evaporation and soil moisture (soil moisture probes). Disassembling the water flux into the above components gives more insight into the interception-related processes and differentiates them from the total evapotranspiration. The measurements are conducted in the Upper Biebrza Basin (NE Poland). The study area is part of the valley, is covered by peat soils (mainly peat moss, with the exception of areas near the river) and receives no inundation waters from the Biebrza. The plant community of Agrostietum-Carici caninae has a dominant share here, creating an up to 0.6 km wide belt along the river. The area is also covered by Caricion lasiocarpae as well as meadows and pastures of Molinio-Arrhenatheretea and Phragmitetum communis. Sedges form a hummock pattern characteristic for sedge communities in natural river valleys with wetland vegetation. The main result of the measurement set-up will be the analyzed characteristics and dynamics of interception storage for sedge ecosystems and a methodology for interception monitoring by use of the spectral reflectance technique. This will give new insight into processes of evapotranspiration in wetlands and its components: transpiration, evaporation from interception and evaporation from soil. Moreover, other important results of this project will be the estimation of energy and

  10. Methodology for Measuring the Complexity of Enterprise Information Systems

    Ilja Holub

    2016-07-01

    The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it at all. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, defined as the sum of the entities of the individual UML models of the given system, which are selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.

  11. Complex Squeezing and Force Measurement Beyond the Standard Quantum Limit

    Buchmann, L F; Kohler, J; Spethmann, N; Stamper-Kurn, D M

    2016-01-01

    A continuous quantum field, such as a propagating beam of light, may be characterized by a squeezing spectrum that is inhomogeneous in frequency. We point out that homodyne detectors, which are commonly employed to detect quantum squeezing, are blind to squeezing spectra in which the correlation between amplitude and phase fluctuations is complex. We find theoretically that such complex squeezing is a component of ponderomotive squeezing of light through cavity optomechanics. We propose a detection scheme, called synodyne detection, which reveals complex squeezing and allows its use to improve force detection beyond the standard quantum limit.

  12. Measuring the Level of Complexity of Scientific Inquiries: The LCSI Index

    Eilam, Efrat

    2015-01-01

    The study developed and applied an index for measuring the level of complexity of full authentic scientific inquiry. Complexity is a fundamental attribute of real life scientific research. The level of complexity is an overall reflection of complex cognitive and metacognitive processes which are required for navigating the authentic inquiry…

  13. A Measure for Complex Dynamics in Power Systems

    Ralph Wilson

    2011-06-01

    In an attempt to quantify the dynamical complexity of power systems, we introduce the use of a non-linear time series technique to detect complex dynamics in a signal. The technique is a significant reinterpretation of the Approximate Entropy (ApEn) introduced by Pincus as an approximation to the Eckmann-Ruelle entropy. It is examined in the context of power systems, and several examples are explored.

  14. Power Quality Measurement in a Modern Hotel Complex

    Velimir Strugar; Vladimir Katić

    2010-01-01

    The paper presents the analysis of power quality characteristics of the 10 kV grids supplying a modern hotel complex on the Montenegrin Adriatic coast. The consumer is characterized by different types of loads, some of which have highly nonlinear characteristics. For example, smart rooms, lift drives, modern equipment for the hotel kitchen, public electric lighting, audio, video and TV devices, etc. Such loads in the hotel complex may be a source of negative effects regarding power quality at MV pu...

  15. Approximate entropy as a measure of system complexity.

    Pincus, S M

    1991-01-01

    Techniques to determine changing system complexity from data are evaluated. Convergence of a frequently used correlation dimension algorithm to a finite value does not necessarily imply an underlying deterministic model or chaos. Analysis of a recently developed family of formulas and statistics, approximate entropy (ApEn), suggests that ApEn can classify complex systems, given at least 1000 data values in diverse settings that include both deterministic chaotic and stochastic processes. The ...
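
    Since the record above only names the statistic, here is a compact sketch of approximate entropy as it is commonly presented in the literature, ApEn(m, r) = Φ_m(r) − Φ_{m+1}(r) with a Chebyshev-distance template match; the default tolerance r = 0.2·std(x) is a widely used convention, not something stated in this abstract.

        import numpy as np

        def approximate_entropy(x, m: int = 2, r=None) -> float:
            """Approximate entropy ApEn(m, r): regularity statistic of a 1-D series."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            if r is None:
                r = 0.2 * x.std()            # common tolerance choice

            def phi(mm: int) -> float:
                templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
                # For each template, fraction of templates within Chebyshev distance r (self included).
                counts = np.array([np.sum(np.max(np.abs(templates - t), axis=1) <= r)
                                   for t in templates])
                return np.mean(np.log(counts / (n - mm + 1)))

            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(2)
        print(approximate_entropy(np.sin(np.linspace(0, 50, 1000))))   # regular -> low ApEn
        print(approximate_entropy(rng.standard_normal(1000)))          # irregular -> high ApEn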

  16. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR and Information Entropy

    Rong Jiang

    2015-04-01

    Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become key to project management. This paper first analyzes the current state of research on software complexity systematically and points out the problems in existing work. It then proposes a WSR framework of software complexity, which divides software complexity into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and analyzes and validates the soundness and rationality of this measurement method through a large number of cases.
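
    As a toy illustration of the entropy-based idea described above (the exact formulas are not given in this record, so the setup below is an assumption), the Shannon entropy of how communication load is distributed across team members can serve as one simple proxy for communication complexity: the flatter the distribution, the higher the entropy.

        import numpy as np

        def shannon_entropy(weights) -> float:
            """Shannon entropy (bits) of a discrete distribution given as nonnegative weights."""
            p = np.asarray(weights, dtype=float)
            p = p[p > 0] / p.sum()
            return float(-np.sum(p * np.log2(p)))

        # Hypothetical five-person team: share of project communication handled by each member.
        print(shannon_entropy([30, 25, 20, 15, 10]))  # fairly even load  -> ~2.23 bits
        print(shannon_entropy([80, 10, 5, 3, 2]))     # one hub dominates -> ~1.07 bits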

  17. Information and complexity measures for hydrologic model evaluation

    Hydrological models are commonly evaluated through the residual-based performance measures such as the root-mean square error or efficiency criteria. Such measures, however, do not evaluate the degree of similarity of patterns in simulated and measured time series. The objective of this study was to...

  18. Measuring and modeling of the wind profile in complex terrain

    Hošek, Jiří

    Wilhelmshaven : Deutsche Wind Energie Institut, 2004, ---. [DEWEK 2004. Wilhelmshaven (DE), 20.10.2005-21.10.2005] Institutional research plan: CEZ:AV0Z3042911 Keywords : wind profile * complex terrain * numerical model Subject RIV: JE - Non-nuclear Energetics, Energy Consumption ; Use

  19. Existence of biological uncertainty principle implies that we can never find 'THE' measure for biological complexity

    Banerji, Anirban

    2009-01-01

    There are innumerable 'biological complexity measures'. While some patterns emerge from these attempts to represent biological complexity, a single measure to encompass the seemingly countless features of biological systems still eludes the students of Biology. It is the pursuit of this paper to discuss the feasibility of finding one complete and objective measure for biological complexity. A theoretical construct (the 'Thread-Mesh model') is proposed here to describe biological reality. It ...

  20. Quantum mechanics with chaos correspondence principle, measurement and complexity

    Kirilyuk, A P

    1995-01-01

    The true dynamical randomness is obtained as a natural fundamental property of deterministic quantum systems. It provides quantum chaos passing to the classical dynamical chaos under the ordinary semiclassical transition, which extends the correspondence principle to chaotic systems. In return one should accept the modified form of quantum formalism (exemplified by the Schrodinger equation) which, however, does not contradict the ordinary form, and the main postulates, of quantum mechanics. It introduces the principle of the fundamental dynamic multivaluedness extending the quantum paradigm to complex dynamical behaviour. Moreover, a causal solution to the well-known problems of the foundations of quantum mechanics, those of quantum indeterminacy and wave reduction, is also found using the same method. The concept of the fundamental dynamic uncertainty thus established is universal in character and provides a unified scheme of the complete description of arbitrary complex system of any origin. This scheme inc...

  1. The Complex Trauma Questionnaire (ComplexTQ): development and preliminary psychometric properties of an instrument for measuring early relational trauma

    Maggiora Vergano, Carola; Lauriola, Marco; Speranza, Anna M.

    2015-01-01

    Research on the etiology of adult psychopathology and its relationship with childhood trauma has focused primarily on specific forms of maltreatment. This study developed an instrument for the assessment of childhood and adolescence trauma that would aid in identifying the role of co-occurring childhood stressors and chronic adverse conditions. The Complex Trauma Questionnaire (ComplexTQ), in both clinician and self-report versions, is a measure for the assessment of multi-type maltreatment: ...

  2. Liquid structure of acetic acid-water and trifluoroacetic acid-water mixtures studied by large-angle X-ray scattering and NMR.

    Takamuku, Toshiyuki; Kyoshoin, Yasuhiro; Noguchi, Hiroshi; Kusano, Shoji; Yamaguchi, Toshio

    2007-08-01

    The structures of acetic acid (AA), trifluoroacetic acid (TFA), and their aqueous mixtures over the entire range of acid mole fraction xA have been investigated by using large-angle X-ray scattering (LAXS) and NMR techniques. The results from the LAXS experiments have shown that acetic acid molecules mainly form a chain structure via hydrogen bonding in the pure liquid. In acetic acid-water mixtures hydrogen bonds of acetic acid-water and water-water gradually increase with decreasing xA, while the chain structure of acetic acid molecules is moderately ruptured. Hydrogen bonds among water molecules are markedly formed in acetic acid-water mixtures at low xA. TFA molecules form not a chain structure but cyclic dimers through hydrogen bonding in the pure liquid. In TFA-water mixtures O...O hydrogen bonds among water molecules gradually increase when xA decreases, and hydrogen bonds among water molecules are significantly formed in the mixtures at low xA; TFA molecules are considerably dissociated into hydrogen ions and trifluoroacetate in the mixtures. 1H, 13C, and 19F NMR chemical shifts of acetic acid and TFA molecules for acetic acid-water and TFA-water mixtures have indicated strong relationships between a structural change of the mixtures and the acid mole fraction. On the basis of both LAXS and NMR results, the structural changes of acetic acid-water and TFA-water mixtures with decreasing acid mole fraction and the effects of fluorination of the methyl group on the structure are discussed at the molecular level. PMID:17628099

  3. 3-D profile measurement for complex micro-structures

    HU Chun-guang; HU Xiao-dong; XU Lin-yan; GUO Tong; HU Xiao-tang

    2005-01-01

    3-D profile measurement of micro-structures is an important part of research on micro-machining and the characterization of micro-dimensions. In this paper, a new method involving a 2-D structure template, which guides phase unwrapping, is proposed based on phase-shifting microscopic interferometry. It is suited not only to static measurement but also to dynamic measurement, especially the motion of MEMS devices. The 3-D profile of the active comb of a micro-resonator is obtained using the method. The theoretical precision in the out-of-plane direction is better than 0.5 nm. The in-plane theoretical precision within micro-structures is better than 0.5 μm, but at the edges of micro-structures it is on the micrometer level, mainly caused by imprecise edge analysis. Finally, the method's disadvantages and further development are discussed.

  4. Block-based test data adequacy measurement criteria and test complexity metrics

    2002-01-01

    On the basis of software testing tools we developed for programming languages, we first present a new control flowgraph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria and empirically analyze the subsumption relation between these measurement criteria. We then define four block-based test complexity metrics: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.

  5. Block-based test data adequacy measurement criteria and test complexity metrics

    陈卫东; 杨建军; 叶澄清; 潘云鹤

    2002-01-01

    On the basis of software testing tools we developed for programming languages, we first present a new control flowgraph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria and empirically analyze the subsumption relation between these measurement criteria. We then define four block-based test complexity metrics: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.

  6. On the Measurement of Turbulence Over Complex Mountainous Terrain

    Stiperski, Ivana; Rotach, Mathias W.

    2016-04-01

    The theoretical treatment of turbulence is largely based on the assumption of horizontally homogeneous and flat underlying surfaces. Correspondingly, approaches developed over the years to measure turbulence statistics in order to test this theoretical understanding or to provide model input, are also largely based on the same assumption of horizontally homogeneous and flat terrain. Here we discuss aspects of turbulence measurements that require special attention in mountainous terrain. We especially emphasize the importance of data quality (flux corrections, data quality assessment, uncertainty estimates) and address the issues of coordinate systems and different post-processing options in mountainous terrain. The appropriate choice of post-processing methods is then tested based on local scaling arguments. We demonstrate that conclusions drawn from turbulence measurements obtained in mountainous terrain are rather sensitive to these post-processing choices and give suggestions as to those that are most appropriate.

  7. Complex measurement of risk factors at uranium mine workplaces

    The measurements reported were oriented to monitoring the concentrations of dust aerosol and nitrogen oxides during individual operations and the impact of diesel machinery. Other data recorded at each measurement point included the air flow volume, temperature, relative humidity, and the activity of radon and its daughters. Because a high fresh air flow is prescribed in uranium mines in view of radon and radon daughter contamination, the concentrations of dust aerosol and nitrogen oxides were found not even to reach the permissible values. (Ha)

  8. Measuring complexity, nonextensivity and chaos in the DNA sequence of the Major Histocompatibility Complex

    Pavlos, G. P.; Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Xenakis, M. N.; Clark, Peter; Duke, Jamie; Monos, D. S.

    2015-11-01

    We analyze 4 Mb sequences of the Major Histocompatibility Complex (MHC), which is a DNA segment on chromosome 6 with high gene density, controlling many immunological functions and associated with many diseases. The analysis is based on modern theoretical and mathematical tools of complexity theory, such as nonlinear time series analysis and Tsallis non-extensive statistics. The results revealed that the DNA complexity and self-organization can be related to fractional dynamical nonlinear processes with low-dimensional deterministic chaotic and non-extensive statistical character, which generate the DNA sequences under the extremization of Tsallis q-entropy principle. While it still remains an open question as to whether the DNA walk is a fractional Brownian motion (FBM), a static anomalous diffusion process or a non-Gaussian dynamical fractional anomalous diffusion process, the results of this study testify for the latter, providing also a possible explanation for the previously observed long-range power law correlations of nucleotides, as well as the long-range correlation properties of coding and non-coding sequences present in DNA sequences.
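
    The record above leans on Tsallis non-extensive statistics; purely as a reminder of the central quantity, the sketch below evaluates the Tsallis q-entropy S_q = (1 - sum_i p_i^q)/(q - 1) (with k = 1), which recovers the Shannon entropy as q -> 1. The nucleotide frequencies used are invented for illustration and are not taken from the MHC data.

        import numpy as np

        def tsallis_entropy(weights, q: float) -> float:
            """Tsallis q-entropy S_q = (1 - sum p_i^q) / (q - 1); Shannon entropy in the limit q -> 1."""
            p = np.asarray(weights, dtype=float)
            p = p[p > 0] / p.sum()
            if np.isclose(q, 1.0):
                return float(-np.sum(p * np.log(p)))
            return float((1.0 - np.sum(p ** q)) / (q - 1.0))

        # Hypothetical nucleotide composition of a DNA segment.
        freqs = [0.29, 0.21, 0.22, 0.28]
        for q in (0.5, 1.0, 1.5, 2.0):
            print(q, tsallis_entropy(freqs, q))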

  9. Comparison of task complexity measures for emergency operating procedures: Convergent validity and predictive validity

    Human performance while executing operating procedures is critically important for the safety of complex industrial systems. To predict and model human performance, several complexity measures have been developed. This study aims to compare the convergent validity and predictive validity of three existing complexity measures, step complexity (SC), task size, and task complexity (TC), using operator performance data collected from an emergency operating procedure (EOP) experiment. This comparative study shows that these measures have a high convergent validity with each other, most likely because all of them involve the size dimension of complexity. These measures and their sub-measures also have a high predictive validity for operation time and a moderate-to-high predictive validity for error rate, except the step logic complexity (SLC) measure, a component of the SC measure. SLC appears not to contribute to the predictive validity in the experimental EOPs. The use of visual, auditory, cognitive, and psychomotor (VACP) rating scales in the TC measure seems to be significantly beneficial for explaining the human error rate; however, these rating scales appear not to adequately reflect the complexity differences among the meta-operations in EOPs

  10. Measuring Viscosity with a Levitating Magnet: Application to Complex Fluids

    Even, C.; Bouquet, F.; Remond, J.; Deloche, B.

    2009-01-01

    As an experimental project proposed to fourth-year university students, a viscometer was developed, consisting of a small magnet levitating in a viscous fluid. The viscous force acting on the magnet is directly measured: viscosities in the range 10-10^6 mPa s are obtained. This experiment is used as an introduction to complex…

  11. Resolving and measuring diffusion in complex interfaces: Exploring new capabilities

    Alam, Todd M. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This exploratory LDRD targeted the use of new high resolution spectroscopic diffusion capabilities developed at Sandia to resolve transport processes at interfaces in heterogeneous polymer materials. In particular, the combination of high resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) spectroscopy with pulsed field gradient (PFG) diffusion experiments was used to directly explore interface diffusion within heterogeneous polymer composites, including measuring diffusion for individual chemical species in multi-component mixtures. Several different types of heterogeneous polymer systems were studied using these HRMAS NMR diffusion capabilities to probe the resolution limitations, determine the spatial length scales involved, and explore the general applicability to specific heterogeneous systems. The investigations pursued included a) the direct measurement of diffusion for poly(dimethyl siloxane) (PDMS) polymer on nano-porous materials, b) measurement of penetrant diffusion in additively manufactured (3D printed) PDMS composites, and c) measurement of diffusion in swollen polymer/penetrant mixtures within nano-confined aluminum oxide membranes. The NMR diffusion results obtained were encouraging and allowed for an improved understanding of diffusion and transport processes at the molecular level, while at the same time demonstrating that the spatial heterogeneity that can be resolved using HRMAS NMR PFG diffusion experiments must be larger than ~μm length scales, except for polymer transport within nanoporous carbons, where additional chemical resolution improves the resolvable heterogeneous length scale to hundreds of nm.

  12. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo;

    2014-01-01

    on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measures the pressure field for three imaging schemes: a fixed focus, single...

  13. An approach to measuring adolescents' perception of complexity for pictures of fruit and vegetable mixes

    Mielby, Line Holler; Bennedbæk-Jensen, Sidsel; Edelenbos, Merete;

    2013-01-01

    Complexity is an important parameter in the food industry because of its relationship with hedonic appreciation. However, difficulties are encountered when measuring complexity. The hypothesis of this study was that sensory descriptive analysis is an effective tool for deriving terms to measure ... An adolescent consumer group (n = 242) and an adult consumer group (n = 86) subsequently rated the pictures on simplicity and attractiveness. Pearson's correlation coefficients revealed strong correlations between the sensory panel and both consumer groups' usage of simplicity. This suggests that simplicity can be used to measure perceived complexity. In relation to attractiveness, different optimal levels of simplicity of pictures of fruit mixes were found for segments of the adolescent consumer group.

  14. Complex permittivity measurements of ferroelectric employing composite dielectric resonator technique

    Krupka, J.; Zychowicz, T.; Bovtun, Viktor; Veljko, Sergiy

    2006-01-01

    Vol. 53, No. 10 (2006), pp. 1883-1888. ISSN 0885-3010. R&D Projects: GA AV ČR(CZ) IAA1010213; GA ČR(CZ) GA202/04/0993; GA ČR(CZ) GA202/06/0403. Institutional research plan: CEZ:AV0Z10100520. Keywords: dielectric resonator * ferroelectrics * microwave measurements. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.729, year: 2006

  15. Urban sustainability : complex interactions and the measurement of risk

    Lidia Diappi

    1999-05-01

    Full Text Available This paper focuses on the concept of a sustainable city and its theoretical implications for the urban system. Urban sustainability is based on positive interactions among three different urban sub-systems: social, economic and physical, where social well-being coexists with economic development and environmental quality. This utopian scenario does not appear in reality: an affluent economy is often associated with poverty and criminality, and labour variety and urban efficiency coexist with pollution and congestion. The research subject is the analysis of local risk and opportunity conditions, based on the application of a special definition of risk elaborated and made operative through the production of a set of maps representing the multidimensional facets of spatial organisation in urban sustainability. The interactions among the economic, social and environmental systems are complex and unpredictable and present the opportunity for a new methodology of scientific investigation: the connectionist approach, processed by Self-Reflexive Neural Networks (SRNN). These networks are a useful instrument of investigation and analogic questioning of the data base. Once the SRNN has learned the structure of the weights from the DB, by querying the network with the maximization or minimization of specific groups of attributes it is possible to read the related properties and to rank the areas. The survey scale assumed by the research is purposefully aimed at the micro-scale and concerns the Municipality of Milan, which is spatially divided into 144 zones.

  16. Measuring patient satisfaction in complex continuing care/rehabilitation care.

    Malik, Navin; Alvaro, Celeste; Kuluski, Kerry; Wilkinson, Andrea J

    2016-04-18

    Purpose - The purpose of this paper is to develop a psychometrically validated survey to assess satisfaction in complex continuing care (CCC)/rehabilitation patients. Design/methodology/approach - A paper or computer-based survey was administered to 252 CCC/rehabilitation patients (i.e. a post-acute hospital care setting for people who require ongoing care before returning home) across two hospitals in Toronto, Ontario, Canada. Findings - Using factor analysis, five domains were identified with loadings above 0.4 for all but one item. Behavioral intention and information/communication showed the lowest patient satisfaction, while patient centredness showed the highest. Each domain correlated positively with overall satisfaction and significantly predicted it, with quality and safety showing the strongest predictive power and the healing environment the weakest. Gender made a significant contribution to predicting overall satisfaction, but age did not. Research limitations/implications - Results provide evidence of the survey's psychometric properties. Owing to a small sample, supplemental testing with a larger patient group is required to confirm the five-factor structure and to assess test-retest reliability. Originality/value - Improving the health system requires integrating patient perspectives. The patient experience, however, will vary depending on the population being served. This is the first psychometrically validated survey specific to a smaller specialty patient group receiving care at a CCC/rehabilitation facility in Canada. PMID:27120509

  17. Complexity and Information: Measuring Emergence, Self-organization, and Homeostasis at Multiple Scales

    Gershenson, Carlos

    2012-01-01

    Concepts used in the scientific study of complex systems have become so widespread that their use and abuse has led to ambiguity and confusion in their meaning. In this paper we use information theory to provide abstract and concise measures of complexity, emergence, self-organization, and homeostasis. The purpose is to clarify the meaning of these concepts with the aid of the proposed formal measures. In a simplified version of the measures (focussing on the information produced by a system), emergence becomes the opposite of self-organization, while complexity represents their balance. We use computational experiments on random Boolean networks and elementary cellular automata to illustrate our measures at multiple scales.
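
    For illustration, the sketch below computes one common reading of these information-theoretic measures: emergence as normalized Shannon entropy, self-organization as its complement, and complexity as their balance scaled to [0, 1]. The base-2 normalization and the factor of 4 are assumptions of this sketch, not necessarily the exact conventions of the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (base 2) of a discrete distribution, ignoring zero bins."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def emergence(p):
    """Emergence E = H / H_max, i.e. entropy normalized to [0, 1]."""
    n = len(p)
    return shannon_entropy(p) / np.log2(n) if n > 1 else 0.0

def complexity(p):
    """Complexity C = 4 * E * S, maximal when emergence (E) and self-organization (S = 1 - E) balance."""
    e = emergence(p)
    return 4.0 * e * (1.0 - e)

# Example: state distributions of a hypothetical two-state node
for p in ([0.5, 0.5], [0.9, 0.1], [1.0, 0.0]):
    print(p, "E =", round(emergence(p), 3), "C =", round(complexity(p), 3))
```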

  18. Measuring the complex field scattered by single submicron particles

    Potenza, Marco A. C., E-mail: marco.potenza@unimi.it; Sanvito, Tiziano [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); CIMAINA, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); EOS s.r.l., viale Ortles 22/4, I-20139 Milan (Italy); Pullia, Alberto [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy)

    2015-11-15

    We describe a method for simultaneous measurements of the real and imaginary parts of the field scattered by single nanoparticles illuminated by a laser beam, exploiting a self-reference interferometric scheme relying on the fundamentals of the Optical Theorem. Results obtained with calibrated spheres of different materials are compared to the expected values obtained through a simplified analytical model without any free parameters, and the method is applied to a highly polydisperse water suspension of Poly(D,L-lactide-co-glycolide) nanoparticles. Advantages with respect to existing methods and possible applications are discussed.

  19. Reconstruction of Complex Materials by Integral Geometric Measures

    2002-01-01

    The goal of much research in computational materials science is to quantify necessary morphological information and then to develop stochastic models which both accurately reflect the material morphology and allow one to estimate macroscopic physical properties. A novel method of characterizing the morphology of disordered systems is presented based on the evolution of a family of integral geometric measures during erosion and dilation operations. The method is used to determine the accuracy of model reconstructions of random systems. It is shown that the use of erosion/dilation operations on the original image leads to a more accurate discrimination of morphology than previous methods.

  20. Prediction of Software Requirements Stability Based on Complexity Point Measurement Using Multi-Criteria Fuzzy Approach

    D. Francis Xavier Christopher

    2012-12-01

    Full Text Available Many software projects fail due to unstable requirements and a lack of managing requirements changes efficiently. The Software Requirements Stability Index metric (RSI) helps to evaluate the overall stability of requirements and also keeps track of the project status. The higher the stability, the fewer changes tend to propagate. Existing systems use Function Point modeling for measuring Requirements Stability. However, the main drawback of the existing modeling is that the complexity of non-functional requirements has not been measured for Requirements Stability. The non-functional factors play a vital role in assessing Requirements Stability. Numerous measurement methods have been proposed for measuring software complexity. This paper proposes a multi-criteria fuzzy based approach for finding the complexity weight based on requirement complexity attributes such as Functional Requirement Complexity, Non-Functional Requirement Complexity, Input Output Complexity, and Interface and File Complexity. Based on the complexity weight, the paper computes the software complexity point and then predicts Software Requirements Stability based on software complexity point changes. The advantage of this model is that it is able to estimate the software complexity early, which in turn predicts Software Requirements Stability during the software development life cycle.

  1. A comparison of LMC and SDL complexity measures on binomial distributions

    Piqueira, José Roberto C.

    2016-02-01

    The concept of complexity has been widely discussed over the last forty years, with contributions coming from all areas of human knowledge, including Philosophy, Linguistics, History, Biology, Physics and Chemistry, and with mathematicians trying to give a rigorous view of it. In this sense, thermodynamics meets information theory and, using the entropy definition, López-Ruiz, Mancini and Calbet proposed a definition for complexity that is referred to as the LMC measure. Shiner, Davison and Landsberg, by slightly changing the LMC definition, proposed the SDL measure, and both LMC and SDL are satisfactory measures of complexity for many problems. Here, the SDL and LMC measures are applied to the case of a binomial probability distribution, to clarify how the length of the data set affects complexity and how the success probability of the repeated trials determines how complex the whole set is.
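
    A minimal sketch of how both measures can be evaluated for a binomial distribution is given below. The exact normalization conventions (unnormalized entropy in the LMC product, and the simplest α = β = 1 form of SDL) are assumptions made here for illustration.

```python
import numpy as np
from scipy.stats import binom

def lmc_sdl(n, q):
    """LMC and SDL complexity of a binomial(n, q) distribution over N = n + 1 outcomes."""
    k = np.arange(n + 1)
    p = binom.pmf(k, n, q)
    N = n + 1
    H = -np.sum(p[p > 0] * np.log(p[p > 0]))   # Shannon entropy (nats)
    h = H / np.log(N)                          # normalized disorder in [0, 1]
    D = np.sum((p - 1.0 / N) ** 2)             # disequilibrium (distance from equiprobability)
    return H * D, h * (1.0 - h)                # LMC = H*D, SDL with alpha = beta = 1

for n in (10, 100, 1000):
    for q in (0.1, 0.5):
        lmc, sdl = lmc_sdl(n, q)
        print(f"n = {n:4d}, q = {q}: LMC = {lmc:.4f}, SDL = {sdl:.4f}")
```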

  2. Design of New Complex Detector Used for Gross Beta Measuring

    The level of gross β for radioactive aerosol in the containment of nuclear plants can indicate how serious the radioactive pollution is in the shell, and it can provide evidence of whether there is leakage across the confinement boundaries of the primary coolant circuit equipment. In the process of measuring, the counting of gross β is influenced by γ. In order to avoid the influence of γ, a new method was introduced and a new detector was designed using a plastic scintillator as the major detecting component and BGO as the sub-component. Based on the distinctive difference in light attenuation times, the signals induced in them can be discriminated. The γ background in the plastic scintillator was subtracted according to the counting of γ in the BGO. The functions of absolute detection efficiency were obtained. Monte Carlo simulation shows that the influence of the γ background is decreased by about one order of magnitude. (authors)

  3. What the complex joint probabilities observed in weak measurements can tell us about quantum physics

    Hofmann, Holger F. [Graduate School of Advanced Sciences of Matter, Hiroshima University, Kagamiyama 1-3-1, Higashi Hiroshima 739-8530, Japan and JST, CREST, Sanbancho 5, Chiyoda-ku, Tokyo 102-0075 (Japan)

    2014-12-04

    Quantum mechanics does not permit joint measurements of non-commuting observables. However, it is possible to measure the weak value of a projection operator, followed by the precise measurement of a different property. The results can be interpreted as complex joint probabilities of the two non-commuting measurement outcomes. Significantly, it is possible to predict the outcome of completely different measurements by combining the joint probabilities of the initial state with complex conditional probabilities relating the new measurement to the possible combinations of measurement outcomes used in the characterization of the quantum state. We can therefore conclude that the complex conditional probabilities observed in weak measurements describe fundamental state-independent relations between non-commuting properties that represent the most fundamental form of universal laws in quantum physics.

  4. What the complex joint probabilities observed in weak measurements can tell us about quantum physics

    Quantum mechanics does not permit joint measurements of non-commuting observables. However, it is possible to measure the weak value of a projection operator, followed by the precise measurement of a different property. The results can be interpreted as complex joint probabilities of the two non-commuting measurement outcomes. Significantly, it is possible to predict the outcome of completely different measurements by combining the joint probabilities of the initial state with complex conditional probabilities relating the new measurement to the possible combinations of measurement outcomes used in the characterization of the quantum state. We can therefore conclude that the complex conditional probabilities observed in weak measurements describe fundamental state-independent relations between non-commuting properties that represent the most fundamental form of universal laws in quantum physics

  5. Matrix Energy as a Measure of Topological Complexity of a Graph

    Sinha, Kaushik

    2016-01-01

    The complexity of highly interconnected systems is rooted in the interwoven architecture defined by its connectivity structure. In this paper, we develop matrix energy of the underlying connectivity structure as a measure of topological complexity and highlight interpretations about certain global features of underlying system connectivity patterns. The proposed complexity metric is shown to satisfy the Weyuker criteria as a measure of its validity as a formal complexity metric. We also introduce the notion of P point in the graph density space. The P point acts as a boundary between multiple connectivity regimes for finite-size graphs.
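
    Matrix (graph) energy is conventionally the sum of the absolute values of the eigenvalues of the adjacency matrix; the short sketch below computes it for an undirected graph in that conventional sense. The example adjacency matrix is hypothetical.

```python
import numpy as np

def matrix_energy(adjacency):
    """Graph energy: sum of absolute eigenvalues of a symmetric adjacency matrix."""
    eigenvalues = np.linalg.eigvalsh(np.asarray(adjacency, dtype=float))
    return float(np.sum(np.abs(eigenvalues)))

# Hypothetical 5-node connectivity structure (undirected, unweighted)
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]])

n = A.shape[0]
print("graph energy:", round(matrix_energy(A), 3))
print("graph density:", A.sum() / (n * (n - 1)))   # position in the density space
```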

  6. Computed phase diagrams for the system: Sodium hydroxide-uric acid-hydrochloric acid-water

    Brown, W. E.; Gregory, T. M.; Füredi-Milhofer, H.

    1987-07-01

    Renal stone formation is made complex by the variety of solid phases that are formed, by the number of components in the aqueous phase, and by the multiplicity of ionic dissociation and association processes that are involved. In the present work we apply phase diagrams calculated by the use of equilibrium constants from the ternary system sodium hydroxide-uric acid-water to simplify and make more rigorous the understanding of the factors governing dissolution and precipitation of uric acid (anhydrous and dihydrate) and sodium urate monohydrate. The system is then examined in terms of four components. Finally, procedures are described for fluids containing more than four components. The isotherms, singular points, and fields of supersaturation and undersaturation are shown in various forms of phase diagrams. This system has two notable features: (1) in the coordinates -log[H2U] versus -log[NaOH], the solubility isotherms for anhydrous uric acid and uric acid dihydrate approximate straight lines with slopes equal to +1 over a wide range of concentrations. As a result, substantial quantities of sodium acid urate monohydrate can precipitate from solution or dissolve without changing the degree of saturation of uric acid significantly. (2) The solubility isotherm for NaHU·H2O has a deltoid shape with the low-pH branch having a slope of infinity. As a result of the vertical slope of this isotherm, substantial quantities of uric acid can dissolve or precipitate without changing the degree of saturation of sodium acid urate monohydrate significantly. The H2U-NaOH singular point has a pH of 6.87 at 310 K in the ternary system.

  7. Measurements of complex impedance in microwave high power systems with a new bluetooth integrated circuit.

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit, which is marketed for bluetooth applications, it is possible to simplify the method of measuring the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable level input signals and is less sensitive to both amplitude and frequency fluctuations of the industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented. PMID:15078067
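
    To make the post-processing step concrete: given a gain reading (in dB) and a phase reading (in degrees) of a reflected wave relative to the forward wave, the complex reflection coefficient and the load impedance follow from standard transmission-line relations, as in the hedged sketch below. The directional-coupler arrangement, reference impedance and numeric readings are assumptions of this sketch, not details of the cited setup.

```python
import cmath

def impedance_from_gain_phase(gain_db, phase_deg, z0=50.0):
    """Complex reflection coefficient and load impedance from gain/phase readings.

    gain_db   -- measured reflected/forward magnitude ratio in dB
    phase_deg -- measured phase difference in degrees
    z0        -- reference (line) impedance in ohms
    """
    gamma = 10 ** (gain_db / 20.0) * cmath.exp(1j * cmath.pi * phase_deg / 180.0)
    z_load = z0 * (1 + gamma) / (1 - gamma)
    return gamma, z_load

# Hypothetical readings from the gain/phase detector
gamma, z = impedance_from_gain_phase(gain_db=-6.0, phase_deg=35.0)
print(f"reflection coefficient: {gamma.real:.3f} {gamma.imag:+.3f}j")
print(f"load impedance: {z.real:.1f} {z.imag:+.1f}j ohms")
```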

  8. Measurement of Characteristic Self-Similarity and Self-Diversity for Complex Mechanical Systems

    ZHOU Meili; LAI Jiangfeng

    2006-01-01

    Based on similarity science and complex system theory, a new concept of characteristic self-diversity and corresponding relations between self-similarity and self-diversity for complex mechanical systems are presented in this paper. Methods of system self-similarity and self-diversity measure between main system and sub-system are studied. Numerical calculations show that the characteristic self-similarity and self-diversity measure method is validity. A new theory and method of self-similarity and self-diversity measure for complexity mechanical system is presented.

  9. The Complex Trauma Questionnaire (ComplexTQ): development and preliminary psychometric properties of an instrument for measuring early relational trauma.

    Maggiora Vergano, Carola; Lauriola, Marco; Speranza, Anna M

    2015-01-01

    Research on the etiology of adult psychopathology and its relationship with childhood trauma has focused primarily on specific forms of maltreatment. This study developed an instrument for the assessment of childhood and adolescence trauma that would aid in identifying the role of co-occurring childhood stressors and chronic adverse conditions. The Complex Trauma Questionnaire (ComplexTQ), in both clinician and self-report versions, is a measure for the assessment of multi-type maltreatment: physical, psychological, and sexual abuse; physical and emotional neglect; as well as other traumatic experiences, such as rejection, role reversal, witnessing domestic violence, separations, and losses. The four-point Likert scale allows one to specifically indicate with which caregiver the traumatic experience occurred. A total of 229 participants, 79 from a nonclinical sample and 150 from high-risk and clinical samples, were assessed with the ComplexTQ clinician version applied to Adult Attachment Interview (AAI) transcripts. Initial analyses indicate acceptable inter-rater reliability. A good fit to a 6-factor model regarding the experience with the mother and to a 5-factor model regarding the experience with the father was obtained; the internal consistency of the derived factors was good. Convergent validity was provided with the AAI scales. ComplexTQ factors discriminated normative from high-risk and clinical samples. The findings suggest a promising, reliable, and valid measure of reported early relational trauma; furthermore, it is easy to complete and is useful for both research and clinical practice. PMID:26388820

  10. MEASURING OF COMPLEX STRUCTURE TRANSFER FUNCTION AND CALCULATING OF INNER SOUND FIELD

    Chen Yuan; Huang Qibai; Shi Hanmin

    2005-01-01

    In order to measure the transfer function of a complex structure and calculate the inner sound field, the transfer function of integration is introduced. By establishing a virtual system, the transfer function of integration can be measured and the inner sound field can also be calculated. In the experiment, the transfer function of integration of an automobile body is measured, and the experimental method of establishing the virtual system is shown to be valid.

  11. Measurement of the total solar energy transmittance (g-value) for complex glazings

    Duer, Karsten

    1999-01-01

    Four different complex glazings have been investigated in the Danish experimental setup METSET.The purpose of the measurements is to increase the confidence in the calorimetric measurements and to perform measurements and corrections according to a method developed in the ALTSET project...

  12. Counterions release from electrostatic complexes of polyelectrolytes and proteins of opposite charge : a direct measurement

    Gummel, Jérémie; Cousin, Fabrice; Boué, François

    2009-01-01

    Though often considered as one of the main driving process of the complexation of species of opposite charges, the release of counterions has never been experimentally directly measured on polyelectrolyte/proteins complexes. We present here the first structural determination of such a release by Small Angle Neutron Scattering in complexes made of lysozyme, a positively charged protein and of PSS, a negatively charged polyelectrolyte. Both components have the same neutron density length, so th...

  13. Measurement of solubilities for rhodium complexes and phosphine ligands in supercritical carbon dioxide

    Shimoyama, Yusuke; Sonoda, Masanori; Miyazaki, Kaoru; Higashi, Hidenori; Iwai, Yoshio; ARAI, Yasuhiko

    2008-01-01

    The solubilities of phosphine ligands and rhodium (Rh) complexes in supercritical carbon dioxide were measured with Fourier transform infrared (FT-IR) spectroscopy at 320 and 333 K and several pressures. Triphenylphosphine (TPP) and tris(p-trifluoromethylphenyl)-phosphine (TTFMPP) were selected as ligands for the Rh complex. The solubilities of the fluorinated ligands and complexes were compared with those of the non-fluorinated compounds. The solubilities of ligand increased up to 10 times b...

  14. Measuring Search Efficiency in Complex Visual Search Tasks: Global and Local Clutter

    Beck, Melissa R.; Lohrenz, Maura C.; Trafton, J. Gregory

    2010-01-01

    Set size and crowding affect search efficiency by limiting attention for recognition and attention against competition; however, these factors can be difficult to quantify in complex search tasks. The current experiments use a quantitative measure of the amount and variability of visual information (i.e., clutter) in highly complex stimuli (i.e.,…

  15. Information Measures of Complexity, Emergence, Self-organization, Homeostasis, and Autopoiesis

    Fernandez, Nelson; Maldonado, Carlos; Gershenson, Carlos

    2013-01-01

    This chapter reviews measures of emergence, self-organization, complexity, homeostasis, and autopoiesis based on information theory. These measures are derived from proposed axioms and tested in two case studies: random Boolean networks and an Arctic lake ecosystem. Emergence is defined as the information a system or process produces. Self-organization is defined as the opposite of emergence, while complexity is defined as the balance between emergence and self-organization. Homeostasis refle...

  16. Silicon Isotope Fractionation During Acid Water-Igneous Rock Interaction

    van den Boorn, S. H.; van Bergen, M. J.; Vroon, P. Z.

    2007-12-01

    Silica enrichment by metasomatic/hydrothermal alteration is a widespread phenomenon in crustal environments where acid fluids interact with silicate rocks. High-sulfidation epithermal ore deposits and acid-leached residues at hot-spring settings are among the best known examples. Acid alteration acting on basalts has also been invoked to explain the relatively high silica contents of the surface of Mars. We have analyzed basaltic-andesitic lavas from the Kawah Ijen volcanic complex (East Java, Indonesia) that were altered by interaction with highly acid (pH~1) sulfate-chloride water of its crater lake and seepage stream. Quantitative removal of major elements during this interaction has led to a relative increase in SiO2 contents. Our silicon isotope data, obtained by HR-MC-ICPMS and reported relative to the NIST RM8546 (=NBS28) standard, show a systematic increase in δ30Si from -0.2‰ (±0.3, 2sd) for unaltered andesites and basalts to +1.5‰ (±0.3, 2sd) for the most altered/silicified rocks. These results demonstrate that silicification induced by pervasive acid alteration is accompanied by significant Si isotope fractionation, so that altered products become isotopically heavier than the precursor rocks. Despite the observed enrichment in SiO2, the rocks have experienced an overall net loss of silicon upon alteration, if Nb is considered as perfectly immobile. The observed δ30Si values of the alteration products appeared to correlate well with the inferred amounts of silicon loss. These findings would suggest that 28Si is preferentially leached during water-rock interaction, implying that dissolved silica in the ambient lake and stream water is isotopically light. However, layered opaline lake sediments, which are believed to represent precipitates from the silica-saturated water, show a conspicuous 30Si enrichment (+1.2 ± 0.2‰). Because inorganic precipitation is known to discriminate against the heavy isotope (e.g. Basile-Doelsch et al., 2006

  17. Solubilities of Isophthalic Acid in Acetic Acid + Water Solvent Mixtures

    CHENG Youwei; HUO Lei; LI Xi

    2013-01-01

    The solubilities of isophthalic acid (1) in binary acetic acid (2) + water (3) solvent mixtures were determined in a pressurized vessel. The temperature range was from 373.2 to 473.2 K and the range of the mole fraction of acetic acid in the solvent mixtures was from x2 = 0 to 1. A new method to measure the solubility was developed, which solved the problem of sampling at high temperature. The experimental results indicated that, within the temperature range studied, the solubilities of isophthalic acid in all mixtures showed an increasing trend with increasing temperature. The experimental solubilities were correlated by the Buchowski equation, and the calculated results showed good agreement with the experimental solubilities. Furthermore, the mixed solvent systems were found to exhibit a maximum solubility effect, which may be attributed to the intermolecular association between the solute and the solvent mixture. The maximum solubility effect was well modeled by the modified Wilson equation.
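
    For readers unfamiliar with the correlation mentioned above, the sketch below fits solubility data with the Buchowski-Ksiazczak λh equation, ln[1 + λ(1 - x)/x] = λh(1/T - 1/Tm), solved explicitly for the mole fraction x. The melting temperature and the data points are placeholders, not values from the cited study.

```python
import numpy as np
from scipy.optimize import curve_fit

T_MELT = 620.0  # assumed melting temperature of the solute (K); placeholder value

def buchowski_solubility(T, lam, h):
    """Mole-fraction solubility x(T) from ln(1 + lam*(1-x)/x) = lam*h*(1/T - 1/T_melt)."""
    e = np.exp(lam * h * (1.0 / T - 1.0 / T_MELT)) - 1.0
    return lam / (lam + e)

# Hypothetical (T, x) points standing in for measured solubilities
T_data = np.array([373.2, 393.2, 413.2, 433.2, 453.2, 473.2])
x_data = np.array([0.010, 0.018, 0.032, 0.055, 0.090, 0.145])

(lam_fit, h_fit), _ = curve_fit(buchowski_solubility, T_data, x_data, p0=(1.0, 3000.0))
print(f"lambda = {lam_fit:.3f}, h = {h_fit:.1f} K")
print("predicted x at 423.2 K:", round(buchowski_solubility(423.2, lam_fit, h_fit), 4))
```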

  18. Size Distribution Studies on Sulfuric Acid-Water Particles in a Photolytic Reactor

    Abdullahi, H. U.; Kunz, J. C.; Hanson, D. R.; Thao, S.; Vences, J.

    2015-12-01

    The size distribution of particles composed of sulfuric acid and water were measured in a Photolytic cylindrical Flow Reactor (PhoFR, inner diameter 5 cm, length ~ 100 cm). In the reactor, nitrous acid, water and sulfur dioxide gases along with ultraviolet light produced sulfuric acid. The particles formed from these vapors were detected with a scanning mobility particle spectrometer equipped with a diethylene glycol condensation particle counter (Jiang et al. 2011). For a set of standard conditions, particles attained a log-normal distribution with a peak diameter of 6 nm, and a total number of about 3×10^5 cm^-3. The distributions show that ~70 % of the particles are between 4 and 8 nm diameter (lnσ ~ 0.37). These standard conditions are: 296 K, 25% relative humidity, total flow = 3 sLpm, ~10 ppbv HONO, SO2 in excess. With variations of relative humidity, the total particle number varied strongly, with a power relationship of ~3.5, and the size distributions showed a slight increase in peak diameter with relative humidity, increasing about 1 nm from 8 to 33 % relative humidity. Variations of HONO at a constant light intensity (wavelength of ~ 360 nm) were performed and particle size and total number changed dramatically. Size distributions also changed drastically with variations of light intensity, accomplished by turning on/off some of the black light fluorescent bulbs that illuminated the flow reactor. Comparisons of these size distributions to recently published nucleation experiments (e.g. Zollner et al., Glasoe et al.) as well as to simulations of PhoFR reveal important details about the levels of sulfuric acid present in PhoFR as well as possible base contaminants.
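
    To make the reported distribution shape concrete, the sketch below evaluates a number-weighted log-normal size distribution with the quoted parameters (peak diameter 6 nm, lnσ ≈ 0.37, total number 3×10^5 cm^-3) and checks the fraction of particles between 4 and 8 nm. Treating the quoted peak diameter as the geometric median is an assumption of this sketch.

```python
import numpy as np

N_TOTAL = 3e5     # total number concentration, cm^-3
D_MED = 6.0       # peak diameter, nm (assumed equal to the geometric median)
LN_SIGMA = 0.37   # log-normal width parameter

def dN_dlnD(D):
    """Log-normal number distribution dN/dlnD at diameter D (nm)."""
    return (N_TOTAL / (np.sqrt(2.0 * np.pi) * LN_SIGMA)
            * np.exp(-0.5 * (np.log(D / D_MED) / LN_SIGMA) ** 2))

# Fraction of particles between 4 and 8 nm, by integration over ln(D)
lnD = np.linspace(np.log(1.0), np.log(40.0), 2000)
D = np.exp(lnD)
pdf = dN_dlnD(D)
mask = (D >= 4.0) & (D <= 8.0)
fraction = np.trapz(pdf[mask], lnD[mask]) / np.trapz(pdf, lnD)
print(f"fraction between 4 and 8 nm: {fraction:.2f}")   # roughly 0.65-0.7, consistent with the text
```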

  19. Binary Homogeneous Nucleation of Sulfuric Acid-Water: Particle Size Distribution and Effect of

    Neitola, K.; Brus, David; Sipilä, M.; Kulmala, M.

    Thessaloniki: Hellenic Association for Aerosol Research, 2008, T03A041P. [European Aerosol Conference 2008. Thessaloniki (GR), 24.08.2008-29.08.2008] Institutional research plan: CEZ:AV0Z40720504. Keywords: sulphuric acid-water * homogeneous nucleation. Subject RIV: CF - Physical; Theoretical Chemistry

  20. Variances as order parameter and complexity measure for random Boolean networks

    Luque, Bartolo [Departamento de Matemática Aplicada y Estadística, Escuela Superior de Ingenieros Aeronáuticos, Universidad Politécnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Ballesteros, Fernando J [Observatori Astronòmic, Universitat de València, Ed. Instituts d'Investigació, Pol. La Coma s/n, E-46980 Paterna, Valencia (Spain); Fernandez, Manuel [Departamento de Matemática Aplicada y Estadística, Escuela Superior de Ingenieros Aeronáuticos, Universidad Politécnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain)

    2005-02-04

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems.
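
    A minimal simulation illustrating the idea: build a random Boolean network with N nodes and connectivity K, iterate it past a transient, and take the temporal variance of each node's time series averaged over the network as the order parameter. The bookkeeping and parameter values below are illustrative assumptions, not the exact setup of the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rbn(n_nodes=100, k=2, t_transient=200, t_measure=500):
    """Random Boolean network: random wiring, random truth tables, synchronous update."""
    inputs = rng.integers(0, n_nodes, size=(n_nodes, k))    # K random inputs per node
    tables = rng.integers(0, 2, size=(n_nodes, 2 ** k))     # random Boolean functions
    state = rng.integers(0, 2, size=n_nodes)
    history = np.empty((t_measure, n_nodes), dtype=int)
    powers = 2 ** np.arange(k)
    for t in range(t_transient + t_measure):
        idx = (state[inputs] * powers).sum(axis=1)          # encode each node's input pattern
        state = tables[np.arange(n_nodes), idx]
        if t >= t_transient:
            history[t - t_transient] = state
    return history

def temporal_variance(history):
    """Variance of each node's time series, averaged over nodes."""
    return history.var(axis=0).mean()

for k in (1, 2, 3, 4):
    print(f"K = {k}: mean temporal variance = {temporal_variance(simulate_rbn(k=k)):.4f}")
```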

  1. In Situ Fluorescence Microscopic Measurements of Complexation Reactions at Liquid/Liquid Interface

    TSUKAHARA, Satoshi

    2005-01-01

    In situ microscopic measurement is a novel approach to clarify the intrinsic mechanism of complexation reactions occurring at liquid/liquid interfaces. The present review was mainly focused on recent three topics of methodology of in situ fluorescence microscopic observation and measurement of interfacial complexation reactions: (1) two kinds of self-assemblies of Pd2+ and 5,10,15,20-tetra(4-pyridyl)-21H, 23H-porphine complexes formed at the toluene/water interface, (2) microextraction of Eu3...

  2. Stochastic processes with values in Riemannian admissible complex: Isotropic process, Wiener measure and Brownian motion

    The purpose of this work was to construct a Brownian motion with values in simplicial complexes with piecewise differential structure. After a martingale theory attempt, we constructed a family of continuous Markov processes with values in an admissible complex; we named every process of this family an isotropic transport process. We showed that the family of isotropic processes contains a subsequence which converges weakly to a measure; we named it the Wiener measure. Then, thanks to the finite dimensional distributions of the Wiener measure, we constructed a new continuous Markov process with values in an admissible complex: the Brownian motion. We finished with a geometric analysis of this Brownian motion, to determine, under hypotheses on the complex, the recurrent or transient behavior of such a process. (author)

  3. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals or operating procedures are revealed as one of the most important factors in aviation and manufacturing industries. In case of NPPs, the importance of procedures is more salient than other industries because not only over 50% of human errors were due to procedures but also about 18% of accidents were caused by the failure of following procedures. Thus, the provision of emergency operating procedures (EOPs) that are designed so that the possibility of human errors can be reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity measure (SC) is developed based on three sub-measures such as the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance time) but also qualitative validations to clarify physical meanings of the SC measure are performed.

  4. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals or operating procedures are revealed as one of the most important factors in aviation and manufacturing industries. In case of NPPs, the importance of procedures is more salient than other industries because not only over 50% of human errors were due to procedures but also about 18% of accidents were caused by the failure of following procedures. Thus, the provision of emergency operating procedures (EOPs) that are designed so that the possibility of human errors can be reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity measure (SC) is developed based on three sub-measures such as the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance time) but also qualitative validations to clarify physical meanings of the SC measure are performed

  5. Modeling complexity in pathologist workload measurement: the Automatable Activity-Based Approach to Complexity Unit Scoring (AABACUS).

    Cheung, Carol C; Torlakovic, Emina E; Chow, Hung; Snover, Dale C; Asa, Sylvia L

    2015-03-01

    Pathologists provide diagnoses relevant to the disease state of the patient and identify specific tissue characteristics relevant to response to therapy and prognosis. As personalized medicine evolves, there is a trend for increased demand of tissue-derived parameters. Pathologists perform increasingly complex analyses on the same 'cases'. Traditional methods of workload assessment and reimbursement, based on number of cases sometimes with a modifier (eg, the relative value unit (RVU) system used in the United States), often grossly underestimate the amount of work needed for complex cases and may overvalue simple, small biopsy cases. We describe a new approach to pathologist workload measurement that aligns with this new practice paradigm. Our multisite institution with geographically diverse partner institutions has developed the Automatable Activity-Based Approach to Complexity Unit Scoring (AABACUS) model that captures pathologists' clinical activities from parameters documented in departmental laboratory information systems (LISs). The model's algorithm includes: 'capture', 'export', 'identify', 'count', 'score', 'attribute', 'filter', and 'assess filtered results'. Captured data include specimen acquisition, handling, analysis, and reporting activities. Activities were counted and complexity units (CUs) generated using a complexity factor for each activity. CUs were compared between institutions, practice groups, and practice types and evaluated over a 5-year period (2008-2012). The annual load of a clinical service pathologist, irrespective of subspecialty, was ∼40,000 CUs using relative benchmarking. The model detected changing practice patterns and was appropriate for monitoring clinical workload for anatomical pathology, neuropathology, and hematopathology in academic and community settings, and encompassing subspecialty and generalist practices. AABACUS is objective, can be integrated with an LIS and automated, is reproducible, backwards compatible

  6. The Microcantilever: A Versatile Tool for Measuring the Rheological Properties of Complex Fluids

    I. Dufour

    2012-01-01

    Full Text Available Silicon microcantilevers can be used to measure the rheological properties of complex fluids. In this paper, two different methods will be presented. In the first method, the microcantilever is used to measure the hydrodynamic force exerted by a confined fluid on a sphere that is attached to the microcantilever. In the second method, the measurement of the microcantilever's dynamic spectrum is used to extract the hydrodynamic force exerted by the surrounding fluid on the microcantilever. The originality of the proposed methods lies in the fact that not only may the viscosity of the fluid be measured, but also the fluid's viscoelasticity, that is, both viscous and elastic properties, which are key parameters in the case of complex fluids. In both methods, the use of analytical equations permits the fluid's complex shear modulus to be extracted and expressed as a function of shear stress and/or frequency.

  7. Complexity

    Gershenson, Carlos

    2011-01-01

    The term complexity derives etymologically from the Latin plexus, which means interwoven. Intuitively, this implies that something complex is composed by elements that are difficult to separate. This difficulty arises from the relevant interactions that take place between components. This lack of separability is at odds with the classical scientific method - which has been used since the times of Galileo, Newton, Descartes, and Laplace - and has also influenced philosophy and engineering. In recent decades, the scientific study of complexity and complex systems has proposed a paradigm shift in science and philosophy, proposing novel methods that take into account relevant interactions.

  8. Exploring The Globalization Of German Mncs With The Complex Spread And Diversity Measure

    Jan Hendrik Fisch; Michael-Jörg Oesterle

    2003-01-01

    In this paper, we present a new quantitative measurement concept that integrates multiple dimensions of internationalization in a complex number and tries to measure globalization instead of simple internationalization. We apply this measure to assess the globalization states and processes of the most internationalized German MNCs. Our results suggest that these MNCs are neither globalized nor do they show a straightforward path towards globalization in the last decade. This outcome contradic...

  9. Fast laser systems for measuring the geometry of complex-shaped objects

    Galiulin, Ravil M.; Galiulin, Rishat M.; Bakirov, J. M.; Vorontsov, A. V.; Ponomarenko, I. V.

    1999-01-01

    The technical characteristics, advantages and applications of an automated optoelectronic measuring system designed by 'Optel' company, State Aviation University of Ufa, are presented in this paper. The measuring apparatus can be applied for industrial development and research, for example, in rapid prototyping, and for obtaining geometrical parameters in medicine and criminalistics. It essentially is a non-contact and rapid scanning system, allowing measurements of complex shaped objects like metal and plastic workpieces or parts of human body.

  10. Design and Functional Validation of a Complex Impedance Measurement Device for Characterization of Ultrasonic Transducers

    This paper presents the design and practical implementation of a complex impedance measurement device capable of characterization of ultrasonic transducers. The device works in the frequency range used by industrial ultrasonic transducers which is below the measurement range of modern high end network analyzers. The device uses the Goertzel algorithm instead of the more common FFT algorithm to calculate the magnitude and phase component of the impedance under test. A theoretical overview is given followed by a practical approach and measurement results. (authors)
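
    Since the device relies on the Goertzel algorithm rather than an FFT, a short reference implementation may help: it extracts the complex amplitude of one target frequency from a sampled signal, and the ratio between response and excitation then yields a magnitude and phase (i.e. a complex impedance ratio). The sampling rate, target frequency and synthetic signals below are illustrative assumptions.

```python
import math, cmath

def goertzel(samples, sample_rate, target_freq):
    """Complex DFT coefficient of `samples` at `target_freq` via the Goertzel recursion."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)      # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev - s_prev2 * cmath.exp(-1j * w)  # combine last two states into the bin value

# Hypothetical usage: 40 kHz tone sampled at 1 MHz, response attenuated and phase-shifted
fs, f0, n = 1_000_000, 40_000, 1000
excitation = [math.cos(2 * math.pi * f0 * i / fs) for i in range(n)]
response = [0.5 * math.cos(2 * math.pi * f0 * i / fs - 0.6) for i in range(n)]
ratio = goertzel(response, fs, f0) / goertzel(excitation, fs, f0)
print(f"magnitude ratio: {abs(ratio):.3f}, phase difference: {cmath.phase(ratio):.3f} rad")
```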

  11. The effect of electrode contact resistance and capacitive coupling on Complex Resistivity measurements

    Ingeman-Nielsen, Thomas

    2006-01-01

    The effect of electrode contact resistance and capacitive coupling on complex resistivity (CR) measurements is studied in this paper. An equivalent circuit model for the receiver is developed to describe the effects. The model shows that CR measurements are severely affected even at relatively low ... the contact resistance artificially increased by resistors. The results emphasize the importance of keeping contact resistance low in CR measurements.

  12. Complexity measures for object-oriented conceptual models of an application domain

    Poels, Geert; Dedene, Guido

    1997-01-01

    According to Norman Fenton, little work has been done on measuring the complexity of the problems underlying software development. Nonetheless, it is believed that this attribute has a significant impact on software quality and development effort. A substantial portion of the underlying problems is captured in the conceptual model of the application domain. Based on previous work on conceptual modelling of application domains, the attribute 'complexity of a conceptual model' is formally defined...

  13. Implementing digital holograms to create and measure complex-plane optical fields

    Dudley, Angela; Majola, Nombuso; Chetty, Naven; Forbes, Andrew

    2016-02-01

    The coherent superposition of a Gaussian beam with an optical vortex can be mathematically described to occupy the complex plane. We provide a simple analogy between the mathematics, in the form of the complex plane, and the visual representation of these two superimposed optical fields. We provide detailed instructions as to how one can experimentally produce, measure, and control these fields with the use of digital holograms encoded on a spatial light modulator.
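
    The complex-plane picture can be reproduced numerically: the sketch below superimposes a Gaussian beam and an ℓ = 1 optical vortex with a complex relative weight, which is the kind of field one would then encode as a digital hologram on a spatial light modulator. The beam waist, grid and relative weight are assumptions of this sketch.

```python
import numpy as np

# Transverse grid (arbitrary units)
x = np.linspace(-3.0, 3.0, 512)
X, Y = np.meshgrid(x, x)
R = np.hypot(X, Y)
PHI = np.arctan2(Y, X)

w0 = 1.0                                   # assumed beam waist
gauss = np.exp(-R**2 / w0**2)              # Gaussian beam
vortex = gauss * np.exp(1j * PHI)          # l = 1 optical vortex with the same envelope

a = 1.0                                    # complex weights: coordinates in the "complex plane"
b = 0.7 * np.exp(1j * np.pi / 4)
field = a * gauss + b * vortex             # superposition to be encoded on the SLM

intensity = np.abs(field) ** 2
phase = np.angle(field)
print("peak intensity:", round(float(intensity.max()), 3))
print("phase range (rad):", round(float(phase.min()), 3), "to", round(float(phase.max()), 3))
```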

  14. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    Cavalcante, André; Mansouri, Ahmed; Kacha, Lemya; Barros, Allan Kardec; Takeuchi, Yoshinori; Matsumoto, Naoji; Ohnishi, Noboru

    2014-01-01

    Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios. PMID:24498292
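
    One plausible reading of "statistics of local contrast and spatial frequency" is sketched below: split the image into patches, measure RMS contrast and a mean spatial frequency per patch, and combine simple statistics of the two maps into one score. The patch size, the chosen statistics and the equal-weight combination are assumptions of this sketch, not the authors' exact pipeline.

```python
import numpy as np

def local_maps(image, patch=32):
    """Per-patch RMS contrast and mean spatial frequency of a grayscale image (2D array)."""
    h, w = image.shape
    contrasts, frequencies = [], []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            p = image[i:i + patch, j:j + patch].astype(float)
            contrasts.append(p.std() / (p.mean() + 1e-9))             # RMS contrast
            spec = np.abs(np.fft.fftshift(np.fft.fft2(p - p.mean()))) ** 2
            fy, fx = np.indices(spec.shape) - patch // 2
            radius = np.hypot(fx, fy)
            frequencies.append((radius * spec).sum() / (spec.sum() + 1e-9))
    return np.array(contrasts), np.array(frequencies)

def complexity_score(image):
    """Combine the mean and spread of both maps into a single objective score."""
    c, f = local_maps(image)
    return float(np.mean([c.mean(), c.std(), f.mean(), f.std()]))     # equal weighting, illustrative

rng = np.random.default_rng(1)
smooth = np.outer(np.hanning(256), np.hanning(256))   # low-complexity test image
noisy = rng.random((256, 256))                        # high-complexity test image
print("smooth:", round(complexity_score(smooth), 3), " noisy:", round(complexity_score(noisy), 3))
```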

  15. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    André Cavalcante

    Full Text Available Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios.

  16. Three-dimensional quantification of structures in trabecular bone using measures of complexity

    Marwan, Norbert; Kurths, Jürgen; Thomsen, Jesper Skovhus; Felsenberg, Dieter; Saparin, Peter

    2009-01-01

    The study of pathological changes of bone is an important task in diagnostic procedures of patients with metabolic bone diseases such as osteoporosis as well as in monitoring the health state of astronauts during long-term space flights. The recent availability of high-resolution three-dimensional (3D) imaging of bone challenges the development of data analysis techniques able to assess changes of the 3D microarchitecture of trabecular bone. We introduce an approach based on spatial geometrical properties and define structural measures of complexity for 3D image analysis. These measures evaluate different aspects of organization and complexity of 3D structures, such as complexity of its surface or shape variability. We apply these measures to 3D data acquired by high-resolution microcomputed tomography (µCT) from human proximal tibiae and lumbar vertebrae at different stages of...

  17. Classification of periodic, chaotic and random sequences using approximate entropy and Lempel–Ziv complexity measures

    Karthi Balasubramanian; Silpa S Nair; Nithin Nagaraj

    2015-03-01

    ‘Complexity’ has several definitions in diverse fields. These measures are indicators of some aspects of the nature of the signal. Such measures are used to analyse and classify signals and as a signal diagnostics tool to distinguish between periodic, quasiperiodic, chaotic and random signals. Lempel–Ziv (LZ) complexity and approximate entropy (ApEn) are such popular complexity measures that are widely used for characterizing biological signals also. In this paper, we compare the utility of ApEn, LZ complexities and Shannon’s entropy in characterizing data from a nonlinear chaotic map (logistic map). In this work, we show that LZ and ApEn complexity measures can characterize the data complexities correctly for data sequences as short as 20 in length while Shannon’s entropy fails for length less than 50. In the case of noisy sequences with 10% uniform noise, Shannon’s entropy works only for lengths greater than 200 while LZ and ApEn are successful with sequences of lengths greater than 30 and 20, respectively.
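
    A compact sketch of the Lempel-Ziv (LZ76) phrase-counting complexity used in such comparisons, applied to short binary sequences obtained from the logistic map by thresholding at 0.5; the threshold, parameter values and sequence lengths are illustrative assumptions.

```python
def lz_complexity(s):
    """Lempel-Ziv (LZ76) complexity: number of distinct phrases in the string s."""
    i, c, length = 0, 0, len(s)
    while i < length:
        phrase_len = 1
        # Grow the phrase until it has not been seen in the preceding history
        while i + phrase_len <= length and s[i:i + phrase_len] in s[:i + phrase_len - 1]:
            phrase_len += 1
        c += 1
        i += phrase_len
    return c

def logistic_binary(r, n, x0=0.37):
    """Binary symbolization of the logistic map x -> r*x*(1 - x), thresholded at 0.5."""
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append('1' if x > 0.5 else '0')
    return ''.join(bits)

for label, r in (("periodic", 3.5), ("chaotic", 3.9)):
    for n in (20, 100):
        print(f"{label:8s} r = {r}, n = {n:3d}: LZ complexity = {lz_complexity(logistic_binary(r, n))}")
```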

  18. Quantification of spatial structure of human proximal tibial bone biopsies using 3D measures of complexity

    Saparin, Peter I.; Thomsen, Jesper Skovhus; Prohaska, Steffen; Zaikin, Alexei; Kurths, Jürgen; Hege, H.-C.; Gowin, Wolfgang

    Changes in trabecular bone composition during development of osteoporosis are used as a model for bone loss in microgravity conditions during a space flight. Symbolic dynamics and measures of complexity are proposed and applied to assess quantitatively the structural composition of bone tissue from 3D data sets of human tibia bone biopsies acquired by a micro-CT scanner. In order to justify the newly proposed approach, the measures of complexity of the bone architecture were compared with the results of traditional 2D bone histomorphometry. The proposed technique is able to quantify the...

  19. An Activation Force-based Affinity Measure for Analyzing Complex Networks

    Jun Guo; Hanliang Guo; Zhanyi Wang

    2011-01-01

    Affinity measure is a key factor that determines the quality of the analysis of a complex network. Here, we introduce a type of statistics, activation forces, to weight the links of a complex network and thereby develop a desired affinity measure. We show that the approach is superior in facilitating the analysis through experiments on a large-scale word network and a protein-protein interaction (PPI) network consisting of ∼5,000 human proteins. The experiment on the word network verifies tha...

  20. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Nakano, Yuichiro; Akamatsu, Norihiko; Mori, Tsuyoshi; Sano, Kazunori; Satoh, Katsuya; Nagayasu, Takeshi; Miyoshi, Yoshiaki; Sugio, Tomomi; Sakai, Hideyuki; Sakae, Eiji; Ichimiya, Kazuko; Hamada, Masahisa; Nakayama, Takehisa; Fujita, Yuhzo; Yanagihara, Katsunori; Nishida, Noriyuki

    2016-01-01

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound. PMID:27223116

  1. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Yuichiro Nakano

    Full Text Available Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound.

  2. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces

    Nakano, Yuichiro; Akamatsu, Norihiko; Mori, Tsuyoshi; Sano, Kazunori; Satoh, Katsuya; Nagayasu, Takeshi; Miyoshi, Yoshiaki; Sugio, Tomomi; Sakai, Hideyuki; Sakae, Eiji; Ichimiya, Kazuko; Hamada, Masahisa; Nakayama, Takehisa; Fujita, Yuhzo; Yanagihara, Katsunori; Nishida, Noriyuki

    2016-01-01

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound. PMID:27223116

  3. PIV measurements and data accuracy analysis of flow in complex terrain

    Yao, Rentai; Hao, Hongwei; Qiao, Qingdang

    2000-10-01

    In this paper velocity fields and flow visualization in complex terrain in an environmental wind tunnel have been measured using PIV. In addition, it would be useful to appraise the PIV data by comparing the PIV results with those obtained from the well- established point measurement methods, such as constant temperature anemometry (CTA) and Dantec FlowMaster, in order to verify the accuracy of PIV measurements. The results indicate that PIV is a powerful tool for velocity measurements in the environmental wind tunnel.

  4. Comparing entropy with tests for randomness as a measure of complexity in time series

    Gan, Chee Chun

    2015-01-01

    Entropy measures have become increasingly popular as an evaluation metric for complexity in the analysis of time series data, especially in physiology and medicine. Entropy measures the rate of information gain, or degree of regularity in a time series e.g. heartbeat. Ideally, entropy should be able to quantify the complexity of any underlying structure in the series, as well as determine if the variation arises from a random process. Unfortunately current entropy measures mostly are unable to perform the latter differentiation. Thus, a high entropy score indicates a random or chaotic series, whereas a low score indicates a high degree of regularity. This leads to the observation that current entropy measures are equivalent to evaluating how random a series is, or conversely the degree of regularity in a time series. This raises the possibility that existing tests for randomness, such as the runs test or permutation test, may have similar utility in diagnosing certain conditions. This paper compares various t...

  5. Measurement and documentation of complex PTSD in treatment seeking traumatized refugees

    Palic, Sabina

    ... limited to measuring symptoms of PTSD, anxiety, and depression. This renders documentation, measurement, and treatment of possible complex traumatic adaptations in traumatized refugees very difficult. The thesis comprises two studies using different measures and different samples. The first study investigated complex traumatization as Disorders of Extreme Stress Not Otherwise Specified (DESNOS). The first article from this study demonstrated that DESNOS in a clinical sample of refugees primarily resembled the Schizotypal and Paranoid personality disorders (PD), when compared to Axis I and Axis II syndromes on self-report measures. A total of 34% of the refugee clinical convenience sample (n = 116) met the criteria for DESNOS, and 32% were estimated to have one of the two PD. Furthermore, Axis-II pathology and DESNOS were observed in traumatized refugees even when there was no presence of childhood...

  6. Measuring complex for studying cascade solar photovoltaic cells and concentrator modules on their basis

    Larionov, V. R.; Malevskii, D. A.; Pokrovskii, P. V.; Rumyantsev, V. D.

    2015-06-01

    The design and implementation of several measuring complexes intended for studying cascade solar photovoltaic converters are considered. The complexes consist of a solar simulator and an electronic unit with an active load. The high-aperture light source of the complex reproduces solar intensity over a wide spectral range (λ = 350-1700 nm) with an angle of divergence of ±0.26°, characteristic of solar radiation. The active load of the electronic unit allows taking both dark and illuminated I-V characteristics of test objects within about 1 ms during the quasi-stationary part of the irradiation pulse. The small size and low power consumption of the complexes hold out the hope that they will be widely used in designing, refining, and testing efficient cascade photovoltaic converters made of III-V materials and the concentrator solar modules integrating these converters.

  7. Counterions release from electrostatic complexes of polyelectrolytes and proteins of opposite charge : a direct measurement

    Gummel, Jérémie; Boué, François

    2009-01-01

    Though often considered one of the main driving processes in the complexation of species of opposite charge, the release of counterions has never been directly measured experimentally on polyelectrolyte/protein complexes. We present here the first structural determination of such a release, by Small Angle Neutron Scattering, in complexes made of lysozyme, a positively charged protein, and PSS, a negatively charged polyelectrolyte. Both components have the same neutron scattering length density, so their scattering can be switched off simultaneously in an appropriate "matching" solvent; this enables determination of the spatial distribution of the single counterions within the complexes. The counterions (including those subjected to Manning condensation) are expelled from the cores, where the species are at electrostatic stoichiometry.

  8. The precision of visual memory for a complex contour shape measured by a freehand drawing task.

    Osugi, Takayuki; Takeda, Yuji

    2013-03-01

    Contour information is an important source for object perception and memory. Three experiments examined the precision of visual short-term memory for complex contour shapes. All used a new procedure that assessed recall memory for holistic information in complex contour shapes: Participants studied, then reproduced (without cues), a contoured shape by freehand drawing. In Experiment 1, memory precision was measured by comparing Fourier descriptors for studied and reproduced contours. Results indicated survival of lower (holistic) frequency information (i.e., ≤ 5 cycles/perimeter) and loss of higher (detail) frequency information. Secondary tasks placed demands on either verbal memory (Experiment 2) or visual spatial memory (Experiment 3). Neither secondary task interfered with recall of complex contour shapes, suggesting that the memory system maintaining holistic shape information was independent of both the verbal memory system and the visual spatial memory subsystem of visual short-term memory. The nature of memory for complex contour shape is discussed. PMID:23296198
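
    The Fourier-descriptor comparison can be sketched as follows: a closed contour is treated as a complex-valued signal, its FFT gives the descriptors, and harmonics above 5 cycles/perimeter are discarded to isolate the holistic component. This is a minimal illustration with an invented toy contour, not the experiments' analysis pipeline.

      import numpy as np

      def fourier_descriptors(contour_xy):
          """Fourier descriptors of a closed contour given as an (N, 2) array of x, y points."""
          z = contour_xy[:, 0] + 1j * contour_xy[:, 1]     # contour as a complex-valued signal
          return np.fft.fft(z) / len(z)

      def low_pass_contour(contour_xy, max_cycles=5):
          """Reconstruct the contour keeping only harmonics up to max_cycles per perimeter."""
          z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
          Z = np.fft.fft(z)
          cycles = np.abs(np.fft.fftfreq(len(z), d=1.0 / len(z)))   # cycles per perimeter
          Z[cycles > max_cycles] = 0.0
          z_low = np.fft.ifft(Z)
          return np.column_stack([z_low.real, z_low.imag])

      # toy "studied" contour: an ellipse carrying a fine 12-cycle ripple (the detail component)
      theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
      r = 1.0 + 0.05 * np.cos(12 * theta)
      contour = np.column_stack([2.0 * r * np.cos(theta), r * np.sin(theta)])

      desc = np.abs(fourier_descriptors(contour))
      print("descriptor magnitude, harmonics 1-5 :", np.round(desc[1:6].sum(), 3))
      print("descriptor magnitude, harmonics >5  :", np.round(desc[6:128].sum(), 3))

      smooth = low_pass_contour(contour, max_cycles=5)
      rms_detail = np.sqrt(np.mean(np.sum((contour - smooth) ** 2, axis=1)))
      print(f"RMS detail removed above 5 cycles/perimeter: {rms_detail:.4f}")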

  9. Study of proton-transfer processes by the NMR method applied to various nuclei. VIII. The trifluoroacetic acid-water system

    It was shown earlier that the determination of the composition and type of the complexes is possible by the use of the NMR method applied to various nuclei. This method is based on the simultaneous solution of equations describing the concentration dependence of the NMR chemical shifts for the various nuclei in the system and material-balance equations. It has been applied to the investigation of complex-formation and proton-transfer processes in the nitric acid-water system. In the present work the authors studied aqueous solutions of an acid that is weaker than nitric acid, namely trifluoroacetic acid, both of the usual isotopic composition, and also a sample deuterated to the extent of 97.65%, in the concentration range of 0-100 mole %. The considerable changes in the chemical shifts of the 1H, 13C, and 19F nuclei, depending on the concentration, indicate the formation of complexes of various types and compositions

  10. Node-weighted interacting network measures improve the representation of real-world complex systems

    Wiedermann, Marc; Heitzig, Jobst; Kurths, Jürgen

    2013-01-01

    Network theory provides a rich toolbox consisting of methods, measures, and models for studying the structure and dynamics of complex systems found in nature, society, or technology. Recently, it has been pointed out that many real-world complex systems are more adequately mapped by networks of interacting or interdependent networks, e.g., a power grid showing interdependency with a communication network. Additionally, in many real-world situations it is reasonable to include node weights in complex network statistics to reflect the varying size or importance of the subsystems that are represented by nodes in the network of interest. For example, nodes can represent vastly different surface areas in climate networks, volumes in brain networks, or economic capacities in trade networks. In this letter, combining both ideas, we derive a novel class of statistical measures for analysing the structure of networks of interacting networks with heterogeneous node weights. Using a prototypical spatial network model, we show that th...
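
    A minimal sketch of the idea of node-weighted network statistics is given below: a node-weighted analogue of degree in which each node counts its own weight plus its neighbours' weights. The toy adjacency matrix and weights are invented for illustration, and the estimator is a simplified stand-in for the measures derived in the letter.

      import numpy as np

      def node_weighted_degree(adj, weights):
          """Node-weighted ('node-splitting-invariant' style) degree:
          each node counts its own weight plus the weights of its neighbours.
          A sketch of the general idea, not the letter's exact estimator."""
          adj = np.asarray(adj, dtype=float)
          w = np.asarray(weights, dtype=float)
          return w + adj @ w

      # toy network of 4 subsystems with very unequal sizes (node weights)
      adj = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
      weights = np.array([10.0, 1.0, 1.0, 1.0])   # e.g. surface area represented by each node

      print("unweighted degree:   ", adj.sum(axis=1))
      print("node-weighted degree:", node_weighted_degree(adj, weights))

    The node-weighted variant promotes nodes connected to large subsystems, which is the behaviour the heterogeneous-weight measures are meant to capture.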

  11. Effects of Lability of Metal Complex on Free Ion Measurement Using DMT

    Weng, L.P.; Riemsdijk, van W.H.; Temminghoff, E.J.M.

    2010-01-01

    Very low concentrations of free metal ion in natural samples can be measured using the Donnan membrane technique (DMT) based on ion transport kinetics. In this paper, the possible effects of slow dissociation of metal complexes on the interpretation of kinetic DMT are investigated both theoretically

  12. The Word Complexity Measure: Description and Application to Developmental Phonology and Disorders

    Stoel-Gammon, Carol

    2010-01-01

    Miccio's work included a number of articles on the assessment of phonology in children with phonological disorders, typically using measures of correct articulation such as the PCC, or analyses of errors within the framework of phonological processes. This paper introduces an approach to assessing phonology by examining the phonetic complexity of…

  13. A measure of statistical complexity based on predictive information with application to finite spin systems

    Abdallah, Samer A., E-mail: samer.abdallah@eecs.qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS (United Kingdom); Plumbley, Mark D., E-mail: mark.plumbley@eecs.qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS (United Kingdom)

    2012-01-09

    We propose the binding information as an information-theoretic measure of complexity between multiple random variables, such as those found in the Ising or Potts models of interacting spins, and compare it with several previously proposed measures of statistical complexity, including excess entropy, Bialek et al.'s predictive information, and the multi-information. We discuss and prove some of the properties of binding information, particularly in relation to multi-information and entropy, and show that, in the case of binary random variables, the processes which maximise binding information are the ‘parity’ processes. The computation of binding information is demonstrated on Ising models of finite spin systems, showing that various upper and lower bounds are respected and also that there is a strong relationship between the introduction of high-order interactions and an increase in binding information. Finally we discuss some of the implications this has for the use of the binding information as a measure of complexity. -- Highlights: ► We introduce ‘binding information’ as an entropic/statistical measure of complexity. ► Binding information (BI) is related to earlier notions of predictive information. ► We derive upper and lower bounds on BI in relation to entropy and multi-information. ► Parity processes are found to maximise BI in finite sets of binary random variables. ► Application to spin glasses shows the highest BI is obtained with high-order interactions.
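
    For small systems the binding information can be computed directly from the joint distribution. The sketch below uses the dual-total-correlation form B = Σ_i H(X_{-i}) − (n−1)·H(X) together with the multi-information, and checks the ‘parity’ example on four binary variables; it is an illustrative reconstruction from standard definitions, not the authors' code.

      import itertools
      import numpy as np

      def entropy(p):
          """Shannon entropy in bits of a probability vector."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      def marginal(joint, keep):
          """Marginal distribution over the axes listed in `keep`."""
          drop = tuple(ax for ax in range(joint.ndim) if ax not in keep)
          return joint.sum(axis=drop)

      def binding_information(joint):
          """Binding information (dual total correlation): B = sum_i H(X_{-i}) - (n-1)*H(X)."""
          n = joint.ndim
          H_all = entropy(joint.ravel())
          H_minus = [entropy(marginal(joint, tuple(j for j in range(n) if j != i)).ravel())
                     for i in range(n)]
          return sum(H_minus) - (n - 1) * H_all

      def multi_information(joint):
          """Multi-information (total correlation): I = sum_i H(X_i) - H(X)."""
          n = joint.ndim
          return sum(entropy(marginal(joint, (i,)).ravel()) for i in range(n)) - entropy(joint.ravel())

      # 'parity' process on 4 binary variables: 3 fair coins plus their XOR
      n = 4
      joint = np.zeros((2,) * n)
      for bits in itertools.product((0, 1), repeat=n - 1):
          joint[bits + (sum(bits) % 2,)] = 1.0 / 2 ** (n - 1)

      print(f"binding information = {binding_information(joint):.3f} bits")  # expect n - 1 = 3
      print(f"multi-information   = {multi_information(joint):.3f} bits")    # expect 1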

  14. 2D and 3D endoanal and translabial ultrasound measurement variation in normal postpartum measurements of the anal sphincter complex

    MERIWETHER, Kate V.; HALL, Rebecca J.; LEEMAN, Lawrence M.; MIGLIACCIO, Laura; QUALLS, Clifford; ROGERS, Rebecca G.

    2015-01-01

    Introduction Women may experience anal sphincter anatomy changes after vaginal or Cesarean delivery. Therefore, accurate and acceptable imaging options to evaluate the anal sphincter complex (ASC) are needed. ASC measurements may differ between translabial (TL-US) and endoanal ultrasound (EA-US) imaging and between 2D and 3D ultrasound. The objective of this analysis was to describe measurement variation between these modalities. Methods Primiparous women underwent 2D and 3D TL-US imaging of the ASC six months after a vaginal birth (VB) or Cesarean delivery (CD). A subset of women also underwent EA-US measurements. Measurements included the internal anal sphincter (IAS) thickness at proximal, mid, and distal levels and the external anal sphincter (EAS) at 3, 6, 9, and 12 o’clock positions as well as bilateral thickness of the pubovisceralis muscle (PVM). Results 433 women presented for US: 423 had TL-US and 64 had both TL-US and EA-US of the ASC. All IAS measurements were significantly thicker on TL-US than EA-US (all p0.20). On both TL-US and EA-US, there were multiple sites where significant asymmetry existed in left versus right measurements. Conclusion The ultrasound modality used to image the ASC introduces small but significant changes in measurements, and the direction of the bias depends on the muscle and location being imaged. PMID:25344221

  15. Measuring economic complexity of countries and products: which metric to use?

    Mariani, Manuel Sebastian; Vidmer, Alexandre; Medo, Matúš; Zhang, Yi-Cheng

    2015-11-01

    Evaluating the economies of countries and their relations with products in the global market is a central problem in economics, with far-reaching implications for our theoretical understanding of international trade as well as for practical applications, such as policy making and financial investment planning. The recent Economic Complexity approach aims to quantify the competitiveness of countries and the quality of the exported products based on the empirical observation that the most competitive countries have diversified exports, whereas developing countries export only a few low-quality products - typically those exported by many other countries. Two different metrics, Fitness-Complexity and the Method of Reflections, have been proposed to measure country and product scores in the Economic Complexity framework. We use international trade data and a recent ranking evaluation measure to quantitatively compare the ability of the two metrics to rank countries and products according to their importance in the network. The results show that the Fitness-Complexity metric outperforms the Method of Reflections in both the ranking of products and the ranking of countries. We also investigate a generalization of the Fitness-Complexity metric and show that it can produce improved rankings provided that the input data are reliable.
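
    A minimal sketch of the Fitness-Complexity iteration on a toy binary country-product matrix is shown below; the matrix, the normalisation-by-the-mean step and the fixed iteration count are simplifying assumptions, not the paper's data or exact implementation.

      import numpy as np

      def fitness_complexity(M, n_iter=200):
          """Fitness-Complexity iteration on a binary country x product matrix M.
          Returns (country fitness, product complexity), each normalised to unit mean."""
          M = np.asarray(M, dtype=float)
          F = np.ones(M.shape[0])
          Q = np.ones(M.shape[1])
          for _ in range(n_iter):
              F_new = M @ Q                      # diversified exporters score high
              Q_new = 1.0 / (M.T @ (1.0 / F))    # products exported by low-fitness countries score low
              F = F_new / F_new.mean()
              Q = Q_new / Q_new.mean()
          return F, Q

      # toy export matrix: rows = countries, columns = products
      M = np.array([[1, 1, 1, 1, 1],    # diversified country
                    [1, 1, 1, 0, 0],
                    [1, 1, 0, 0, 0],
                    [1, 0, 0, 0, 0]])   # exports only the most ubiquitous product

      F, Q = fitness_complexity(M)
      print("country fitness :", np.round(F, 3))
      print("product quality :", np.round(Q, 3))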

  16. Measuring Complexity, Development Time and Understandability of a Program: A Cognitive Approach

    Amit Kumar Jakhar

    2014-11-01

    One of the central problems in software engineering is inherent complexity. Since software is the result of human creative activity, cognitive informatics plays an important role in understanding its fundamental characteristics. This paper models one of the fundamental characteristics of software complexity by examining the cognitive weights of basic software control structures. Cognitive weights express the degree of difficulty, or the relative time and effort, required to comprehend a given piece of software, which satisfies the definition of complexity. Based on this approach, a new concept of New Weighted Method Complexity (NWMC) of software is developed. Twenty programs were distributed among 5 postgraduate students; the development time of each was recorded, with the mean taken as the actual time needed to develop the programs, and Understandability (UA), the time needed to understand the code, was also measured for all programs. This paper considers Jingqiu Shao et al.'s Cognitive Functional Size (CFS) of software for comparison. In order to validate the new complexity metric, the correlation between the proposed metric and CFS with respect to actual development time was calculated, and NWMC was compared with CFS using Mean Relative Error (MRE) and Standard Deviation (Std.). Finally, the authors found that the accuracy of estimating development time with the proposed measure is far better than with CFS.
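
    The cognitive-weight idea can be illustrated by walking a program's syntax tree and summing weights assigned to control structures. In the sketch below the weight values and the AST-based counting are assumptions for illustration only; they are not the weights or the procedure used in the paper.

      import ast

      # illustrative cognitive weights for basic control structures (these values are
      # assumptions, not necessarily those used by the paper)
      WEIGHTS = {ast.If: 2, ast.For: 3, ast.While: 3, ast.Try: 3, ast.FunctionDef: 2}

      def cognitive_weight(source):
          """Sum of cognitive weights over the control structures found in a piece of code;
          plain sequential statements contribute a baseline weight of 1 each."""
          total = 0
          for node in ast.walk(ast.parse(source)):
              if isinstance(node, ast.stmt):
                  total += 1                         # baseline for any statement
              for node_type, w in WEIGHTS.items():
                  if isinstance(node, node_type):
                      total += w - 1                 # extra cost of the control structure
          return total

      simple = "a = 1\nb = a + 2\nprint(b)\n"
      branching = "for i in range(10):\n    if i % 2:\n        print(i)\n"
      print("simple   :", cognitive_weight(simple))
      print("branching:", cognitive_weight(branching))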

  17. Microbial growth and biofilm formation in geologic media is detected with complex conductivity measurements

    Davis, Caroline A.; Atekwana, Estella; Atekwana, Eliot; Slater, Lee D.; Rossbach, Silvia; Mormile, Melanie R.

    2006-09-01

    Complex conductivity measurements (0.1-1000 Hz) were obtained from biostimulated sand-packed columns to investigate the effect of microbial growth and biofilm formation on the electrical properties of porous media. Microbial growth was verified by direct microbial counts, pH measurements, and environmental scanning electron microscope imaging. Peaks in imaginary (interfacial) conductivity in the biostimulated columns were coincident with peaks in the microbial cell concentrations extracted from sands. However, the real conductivity component showed no discernible relationship to microbial cell concentration. We suggest that the observed dynamic changes in the imaginary conductivity (σ″) arise from the growth and attachment of microbial cells and biofilms to sand surfaces. We conclude that complex conductivity techniques, specifically imaginary conductivity measurements are a proxy indicator for microbial growth and biofilm formation in porous media. Our results have implications for microbial enhanced oil recovery, CO2 sequestration, bioremediation, and astrobiology studies.

  18. Recurrence Plot Based Measures of Complexity and its Application to Heart Rate Variability Data

    Marwan, N; Meyerfeldt, U; Schirdewan, A; Kurths, J

    2002-01-01

    In complex systems the knowledge of transitions between regular, laminar or chaotic behavior is essential to understand the processes going on there. Linear approaches are often not sufficient to describe these processes and several nonlinear methods require rather long time observations. To overcome these difficulties, we propose measures of complexity based on vertical structures in recurrence plots and apply them to the logistic map as well as to heart rate variability data. For the logistic map these measures enable us to detect transitions between chaotic and periodic states, as well as to identify additional laminar states, i.e. chaos-chaos transitions. Traditional recurrence quantification analysis fails to detect these latter transitions. Applying our new measures to the heart rate variability data, we are able to detect and quantify laminar phases before a life-threatening cardiac arrhythmia and, thus, to enable a prediction of such an event. Our findings could be of importance for the therapy of mal...
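
    The vertical-line measures can be sketched directly: build a thresholded recurrence matrix from an embedded time series, collect vertical run lengths, and compute laminarity (LAM) and trapping time (TT). The embedding parameters, the threshold rule of thumb and the logistic-map demonstration below are illustrative assumptions, not the paper's settings.

      import numpy as np

      def recurrence_matrix(x, dim=3, tau=1, eps=None):
          """Binary recurrence matrix of a time series after time-delay embedding."""
          x = np.asarray(x, dtype=float)
          n = len(x) - (dim - 1) * tau
          emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
          d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
          if eps is None:
              eps = 0.2 * d.std()        # a common rule of thumb, not the paper's choice
          return (d <= eps).astype(int)

      def vertical_line_measures(R, lmin=2):
          """Laminarity (LAM) and trapping time (TT) from vertical structures."""
          lengths = []
          for col in R.T:
              run = 0
              for v in col:
                  if v:
                      run += 1
                  elif run:
                      lengths.append(run)
                      run = 0
              if run:
                  lengths.append(run)
          lengths = np.array(lengths)
          if lengths.size == 0:
              return 0.0, 0.0
          long_runs = lengths[lengths >= lmin]
          lam = long_runs.sum() / lengths.sum()
          tt = long_runs.mean() if long_runs.size else 0.0
          return lam, tt

      def logistic(a, n=1000, x0=0.4):
          x = np.empty(n)
          x[0] = x0
          for i in range(1, n):
              x[i] = a * x[i - 1] * (1 - x[i - 1])
          return x

      for a in (3.5, 3.9):    # period-4 regime vs chaotic regime
          lam, tt = vertical_line_measures(recurrence_matrix(logistic(a)[200:]))
          print(f"a={a}: LAM={lam:.3f}  TT={tt:.2f}")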

  19. Measuring the pollutant transport capacity of dissolved organic matter in complex matrixes

    Persson, L.; Alsberg, T.; Odham, G.;

    2003-01-01

    Dissolved organic matter (DOM) facilitated transport in contaminated groundwater was investigated through the measurement of the binding capacity of landfill leachate DOM (Vejen, Denmark) towards two model pollutants (pyrene and phenanthrene). Three different methods for measuring binding capacity were used and evaluated: head-space solid-phase micro-extraction (HS-SPME), enhanced solubility (ES) and fluorescence quenching (FQ). It was concluded that for samples with complex matrixes it was possible to measure the net effect of the DOM binding capacity and the salting out effect of the matrix … binding capacity…

  20. Complex hand dexterity: a review of biomechanical methods for measuring musical performance.

    Metcalf, Cheryl D; Irvine, Thomas A; Sims, Jennifer L; Wang, Yu L; Su, Alvin W Y; Norris, David O

    2014-01-01

    Complex hand dexterity is fundamental to our interactions with the physical, social, and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress of the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods allow us insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks present the ultimate achievement in complex hand dexterity. This paper will review methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities. PMID:24860531

  2. Investigating the TACOM measure as a general tool for quantifying the complexity of procedure guided tasks

    According to operating experience, it is evident that the role of human operators is critical for securing the safety of complex socio-technical systems. For this reason, various kinds of HRA (Human Reliability Analysis) techniques have been used for several decades in order to systematically manage the likelihood of human error. One of the prerequisites to accomplish this goal is the provision of sufficient data that are helpful for HRA practitioners. In this regard, Podofillini, Park, and Dang (2013) investigated the feasibility of the TACOM (Task Complexity) measure as a tool to represent the effect of task complexity on the performance of human operators in an objective manner. As a result, it was observed that TACOM scores systematically explain the variation of empirically measured difficulty rankings and human error likelihoods. Accordingly, it is possible to expect that the TACOM measure can support HRA practitioners because they can, to some extent, estimate the relative difficulties (or the likelihoods of human error) among tasks based on the associated TACOM scores. In order to confirm this expectation, however, it is indispensable to ensure the generality of the TACOM measure. From this necessity, task performance time data obtained from different task environments are compared. Consequently, it is believed that the TACOM measure can be regarded as a general tool for representing the complexity of procedure-guided tasks because human operators who are faced with similar TACOM scores showed comparable task performance times even under different task environments. - Highlights: • TACOM scores seemed to be correlated with difficulty rankings and empirical HEPs. • Two sets of task performance times are gathered from different task environments. • Task performance times are compared with the associated TACOM scores. • The TACOM measure seems to apply in common to different task environments.

  3. Raman spectroscopy of the system iron(III)-sulfuric acid-water: an approach to Tinto River's (Spain) hydrogeochemistry.

    Sobron, P; Rull, F; Sobron, F; Sanz, A; Medina, J; Nielsen, C J

    2007-12-15

    Acid mine drainage is formed when pyrite (FeS2) is exposed and reacts with air and water to form sulfuric acid and dissolved iron. Tinto River (Huelva, Spain) is an example of this phenomenon. In this study, Raman spectroscopy has been used to investigate the speciation of the system iron(III)-sulfuric acid-water as an approach to Tinto River's aqueous solutions. The molalities of sulfuric acid (0.09 mol/kg) and iron(III) (0.01-1.5 mol/kg) were chosen to mimic the concentration of the species in Tinto River waters. Raman spectra of the solutions reveal a strong iron(III)-sulfate inner-sphere interaction through the ν1 sulfate band at 981 cm⁻¹ and its shoulder at 1005 cm⁻¹. Iron(III)-sulfate interaction may also be facilitated by hydrogen bonds and monitored in the Raman spectra through the symmetric stretching band of bisulfate at 1052 cm⁻¹ and a shoulder at 1040 cm⁻¹. Other bands in the low-frequency region of the Raman spectra are attributed to the formation of hydrogen-bonded complexes as well. PMID:17869164

  4. Full-field velocity and temperature measurements using magnetic resonance imaging in turbulent complex internal flows

    Flow and heat transfer in complex internal passages are difficult to predict due to the presence of strong secondary flows and multiple regions of separation. Two methods based on magnetic resonance imaging called 4D magnetic resonance velocimetry (4D-MRV) and thermometry (4D-MRT) are described for measuring the full-field mean velocities and temperatures, respectively, in complex internal passage flows. 4D-MRV measurements are presented for flow through a model of a gas turbine blade internal cooling passage geometry with Reh = 10,000 and compared to PIV measurements in a highly complex 180 deg bend. Measured three-component velocities provide excellent qualitative and quantitative insight into flow structures throughout the entire flow domain. The velocities agree within ±10% in magnitude and ±10 deg in direction in a large portion of the bend which is characterized by turbulent fluctuations as high as 10-20% of the passage inlet bulk velocity. Integrated average flow rates are accurate to 4% throughout the flow domain. Preliminary 4D-MRV/MRT results are presented for heated fully developed turbulent pipe flow at ReD = 13,000

  5. BETWEEN PARSIMONY AND COMPLEXITY: COMPARING PERFORMANCE MEASURES FOR ROMANIAN BANKING INSTITUTIONS

    ANCA MUNTEANU

    2012-01-01

    The main objective of this study is to establish the relationship between traditional measures of performance (ROE, ROA and NIM) and EVA in order to gain some insight into the relevance of using more sophisticated performance measurement tools. Towards this end, the study uses two acknowledged statistical measures: Kendall's tau and the Spearman rank correlation index. Using data from 12 Romanian banking institutions that report under IFRS for the period 2006-2010, the results suggest that EVA is generally highly correlated with Residual Income in years with positive operating profits, whereas in years with negative results the correlation is low. ROA and ROE are the measures that best correlate with EVA over the entire period and thus, applying Occam's razor, could be used as substitutes for more complex shareholder earnings measures.
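
    The two rank statistics themselves are straightforward to compute; the sketch below applies Kendall's tau and Spearman's rho to invented illustrative figures (not data from the study) using SciPy.

      import numpy as np
      from scipy import stats

      # hypothetical yearly performance figures for a handful of banks (illustrative numbers only)
      eva = np.array([120.0, -35.0, 60.0, 10.0, -80.0, 45.0])    # Economic Value Added
      roa = np.array([1.8, -0.4, 1.1, 0.3, -1.2, 0.9])           # Return on Assets, %
      roe = np.array([14.0, -3.0, 9.5, 2.0, -11.0, 8.0])         # Return on Equity, %

      for name, series in (("ROA", roa), ("ROE", roe)):
          tau, p_tau = stats.kendalltau(eva, series)
          rho, p_rho = stats.spearmanr(eva, series)
          print(f"EVA vs {name}: Kendall tau={tau:.2f} (p={p_tau:.3f}), "
                f"Spearman rho={rho:.2f} (p={p_rho:.3f})")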

  6. A quantitative measure, mechanism and attractor for self-organization in networked complex systems

    Georgiev, Georgi Yordanov

    2012-01-01

    Here, the quantity of organization in complex networks is measured as the inverse of the average sum of physical actions of all elements per unit motion, multiplied by Planck's constant. The meaning of the quantity of organization is the inverse of the number of quanta of action per one unit motion of an element. This definition can be applied to the organization of any complex system. Systems self-organize to decrease the average action per element per unit motion. This lowest-action state is the attractor for the continuous self-organization and evolution of a dynamical complex system. Constraints increase this average action, and constraint minimization by the elements is a basic mechanism for action minimization. An increase in the number of elements in a network leads to faster constraint minimization through grouping, a decrease of the average action per element and motion, and therefore an accelerated rate of self-organization. Progressive development, as self-organization, is a process of minimization of action.

  7. Quantifying the complexity of human colonic pressure signals using an entropy measure.

    Xu, Fei; Yan, Guozheng; Zhao, Kai; Lu, Li; Wang, Zhiwu; Gao, Jinyang

    2016-02-01

    Studying the complexity of human colonic pressure signals is important in understanding this intricate, evolved, dynamic system. This article presents a method for quantifying the complexity of colonic pressure signals using an entropy measure. As a self-adaptive non-stationary signal analysis algorithm, empirical mode decomposition can decompose a complex pressure signal into a set of intrinsic mode functions (IMFs). Considering that IMF2, IMF3, and IMF4 represent crucial characteristics of colonic motility, a new signal was reconstructed with these three signals. Then, the time entropy (TE), power spectral entropy (PSE), and approximate entropy (AE) of the reconstructed signal were calculated. For subjects with constipation and healthy individuals, experimental results showed that the entropies of reconstructed signals between these two classes were distinguishable. Moreover, the TE, PSE, and AE can be extracted as features for further subject classification. PMID:26043437
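
    The entropy step can be sketched as follows for a signal already reconstructed from IMF2-IMF4 (the EMD decomposition itself is omitted; a package such as PyEMD could supply the IMFs). The synthetic signal and the particular entropy definitions below are assumptions for illustration, not the study's data or exact estimators.

      import numpy as np

      def power_spectral_entropy(x):
          """Shannon entropy of the normalised power spectrum (PSE), in bits."""
          psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
          p = psd / psd.sum()
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      def approximate_entropy(x, m=2, r_frac=0.2):
          """Approximate entropy ApEn(m, r) with r = r_frac * std(x) (naive O(N^2) sketch)."""
          x = np.asarray(x, dtype=float)
          r = r_frac * x.std()
          def phi(mm):
              n = len(x) - mm + 1
              templates = np.array([x[i:i + mm] for i in range(n)])
              c = np.zeros(n)
              for i in range(n):
                  d = np.max(np.abs(templates - templates[i]), axis=1)
                  c[i] = np.sum(d <= r) / n
              return np.mean(np.log(c))
          return phi(m) - phi(m + 1)

      # stand-in for a pressure signal reconstructed from IMF2-IMF4 (synthetic, arbitrary units)
      rng = np.random.default_rng(1)
      t = np.linspace(0, 600, 3000)
      reconstructed = (np.sin(2 * np.pi * t / 60) + 0.4 * np.sin(2 * np.pi * t / 17)
                       + 0.2 * rng.standard_normal(t.size))

      print(f"PSE  = {power_spectral_entropy(reconstructed):.3f} bits")
      print(f"ApEn = {approximate_entropy(reconstructed[:800]):.3f}")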

  8. Multi-attribute integrated measurement of node importance in complex networks

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measurement of node importance in complex networks is very important to research on network stability and robustness; it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so that the obtained results reflect only certain aspects of the network, with a loss of information. Meanwhile, because network topologies differ, node importance should be described in a way that incorporates the character of the topology. Most of the existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, the clustering coefficient, and topology potential, and proposes an integrated method for measuring node importance. This method can reflect both the internal and external attributes of nodes and eliminate the influence of network structure on node importance. Experiments on the karate club network and the dolphin network show that the integrated, topology-aware measure has a smaller range of results than a single indicator and is more universal. Attack experiments on the North American power grid and the Internet show that the method converges faster than other methods.
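
    A toy version of such an integrated measure is sketched below: several per-node attributes are min-max normalised and combined by a weighted sum. The equal weights, the chosen attributes (topology potential is omitted) and the example graph are assumptions for illustration, not the paper's exact method.

      import numpy as np
      import networkx as nx

      def integrated_importance(G, weights=(1/3, 1/3, 1/3)):
          """Toy multi-attribute importance: min-max normalised degree centrality,
          closeness centrality and clustering coefficient, combined by a weighted sum."""
          attrs = [nx.degree_centrality(G), nx.closeness_centrality(G), nx.clustering(G)]
          nodes = list(G.nodes())
          score = np.zeros(len(nodes))
          for w, attr in zip(weights, attrs):
              v = np.array([attr[n] for n in nodes], dtype=float)
              span = v.max() - v.min()
              score += w * ((v - v.min()) / span if span > 0 else np.zeros_like(v))
          return dict(zip(nodes, score))

      G = nx.karate_club_graph()
      ranking = sorted(integrated_importance(G).items(), key=lambda kv: -kv[1])
      print("top 5 nodes:", ranking[:5])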

  9. Long-lifetime Ru(II) complexes for the measurement of high molecular weight protein hydrodynamics.

    Szmacinski, H; Castellano, F N; Terpetschnig, E; Dattelbaum, J D; Lakowicz, J R; Meyer, G J

    1998-03-01

    We describe the synthesis and characterization of two asymmetrical ruthenium(II) complexes, [Ru(dpp)2(dcbpy)]2+ and [Ru(dpp)2(mcbpy)]2+, as well as the water-soluble sulfonated derivatives [Ru(dpp(SO3Na)2)2(dcbpy)]2+ and [Ru(dpp(SO3Na)2)2(mcbpy)]2+ (dpp is 4,7-diphenyl-1,10-phenanthroline, dcbpy is 4,4'-dicarboxylic acid-2,2'-bipyridine, mcbpy is 4-methyl,4'-carboxylic acid-2,2'-bipyridine, and dpp(SO3Na)2 is the disulfonated derivative of dpp) as probes for the measurement of the rotational motions of proteins. The spectral (absorption, emission, and anisotropy) and photophysical (time-resolved intensity and anisotropy decays) properties of these metal-ligand complexes were determined in solution, in both the presence and absence of human serum albumin (HSA). These complexes display lifetimes ranging from 345 ns to 3.8 microseconds in deoxygenated aqueous solutions under a variety of conditions. The carboxylic acid groups on these complexes were activated to form N-hydroxysuccinimide (NHS) esters, which were used to covalently label HSA; the conjugates were characterized spectroscopically in the same manner as above. Time-resolved anisotropy measurements were performed to demonstrate the utility of these complexes in measuring long rotational correlation times of bioconjugates between HSA and antibody to HSA. The potential usefulness of these probes in fluorescence polarization immunoassays was demonstrated by an association assay of the Ru(II)-labeled HSA with polyclonal antibody. PMID:9546056

  10. Crater size-frequency distribution measurements and age of the Compton-Belkovich Volcanic Complex

    Shirley, K. A.; Zanetti, M.; Jolliff, B.; van der Bogert, C. H.; Hiesinger, H.

    2016-07-01

    The Compton-Belkovich Volcanic Complex (CBVC) is a 25 × 35 km feature on the lunar farside marked by elevated topography, high albedo, high thorium concentration, and high silica content. Morphologies indicate that the complex is volcanic in origin and compositions indicate that it represents rare silicic volcanism on the Moon. Constraining the timing of silicic volcanism at the complex is necessary to better understand the development of evolved magmas and when they were active on the lunar surface. We employ image analysis and crater size-frequency distribution (CSFD) measurements on several locations within the complex and at surrounding impact craters, Hayn (87 km diameter), and Compton (160 km diameter), to determine relative and absolute model ages of regional events. Using CSFD measurements, we establish a chronology dating regional resurfacing events and the earliest possible onset of CBVC volcanism at ∼3.8 Ga, the formation of Compton Crater at 3.6 Ga, likely resurfacing by volcanism at the CBVC at ∼3.5 Ga, and the formation of Hayn Crater at ∼1 Ga. For the CBVC, we find the most consistent results are obtained using craters larger than 300 m in diameter; the small crater population is affected by their approach to an equilibrium condition and by the physical properties of regolith at the CBVC.

  11. Direct measurement and modulation of single-molecule coordinative bonding forces in a transition metal complex

    Hao, Xian; Zhu, Nan; Gschneidtner, Tina;

    2013-01-01

    Coordination chemistry has been a consistently active branch of chemistry since Werner's seminal theory of coordination compounds was inaugurated in 1893, with the central focus on transition metal complexes. However, control and measurement of metal-ligand interactions at the single-molecule level remain a daunting challenge. Here we demonstrate an interdisciplinary and systematic approach that enables measurement and modulation of the coordinative bonding forces in a transition metal complex. Terpyridine is derivatized with a thiol linker, facilitating covalent attachment of this ligand on both gold … significant impact on the metal-ligand interactions. The present approach represents a major advancement in unravelling the nature of metal-ligand interactions and could have broad implications in coordination chemistry.

  12. Systematic Study of Information Measures, Statistical Complexity and Atomic Structure Properties

    Chatzisavvas, K. Ch.; Tserkis, S. T.; Panos, C. P.; Moustakidis, Ch. C.

    2015-05-01

    We present a comparative study of several information and statistical complexity measures in order to examine a possible correlation with certain experimental properties of atomic structure. Comparisons are also carried out quantitatively using the Pearson correlation coefficient. In particular, it is shown that Fisher information in momentum space is very sensitive to shell effects. It is also seen that three measures expressed in momentum space, namely Fisher information, the Fisher-Shannon plane, and LMC complexity, are associated with atomic radius, ionization energy, electronegativity, and atomic dipole polarizability. Our results indicate that a momentum-space treatment of atomic periodicity is superior to a position-space one. Finally, we present a relation that emerges between Fisher information and the second moment of the probability distribution in momentum space, i.e. an energy functional of interest in (e,2e) experiments.
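
    As a small illustration of one of the quantities involved, the sketch below computes the LMC statistical complexity C = H·D (normalised Shannon entropy times disequilibrium) for a few discrete distributions; the distributions are invented and the discrete formulation is a simplification of the continuous position/momentum-space densities used in the paper.

      import numpy as np

      def lmc_complexity(p):
          """LMC statistical complexity C = H * D for a discrete distribution:
          H is the normalised Shannon entropy, D the disequilibrium (distance from uniformity)."""
          p = np.asarray(p, dtype=float)
          p = p / p.sum()
          nz = p[p > 0]
          H = -(nz * np.log(nz)).sum() / np.log(len(p))        # normalised to [0, 1]
          D = np.sum((p - 1.0 / len(p)) ** 2)
          return H * D

      for name, p in [("uniform", np.ones(8) / 8),
                      ("delta", np.array([1, 0, 0, 0, 0, 0, 0, 0], float)),
                      ("intermediate", np.array([0.4, 0.2, 0.15, 0.1, 0.08, 0.04, 0.02, 0.01]))]:
          print(f"{name:12s} C_LMC = {lmc_complexity(p):.4f}")

    Both the perfectly ordered (delta) and perfectly disordered (uniform) cases give zero complexity, while the intermediate distribution does not, which is the defining behaviour of this family of measures.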

  13. Wettability of reservoir rock and fluid systems from complex resistivity measurements

    Moss, A.K.; Jing, X.D.; Archer, J.S. [Department of Earth Science and Engineering, Imperial College of Science, Technology and Medicine, London (United Kingdom)

    2002-04-01

    Electrical resistivity measurements at a single low AC frequency have long been recognized as providing an indication of the wettability of reservoir rock and fluid systems. However, the resistivity response over a range of frequencies for samples of varying wettability is not so well characterized. Data are presented from reservoir core plugs of differing lithologies, permeabilities, and wettabilities. The complex resistivity response at differing saturations and wettabilities was measured. This research group has been investigating relationships between complex resistivity, permeability, and clay content, described in previous research papers. This study extends that work to include wettability. Electrical resistivity measurements in the low-frequency range (10 Hz-10 kHz) include an electrode polarization effect. At frequencies between 10 and 200 kHz, the electrode polarization effect is reduced and the bulk sample response is measured. An Argand diagram analysis is employed to find the critical frequency (f_c) separating the electrode polarization from the bulk sample response. Samples are tested in a multi-sample rig at hydrostatic reservoir overburden stresses. The test equipment allows the measurement of resistivity in the two- or four-electrode configuration over a frequency range from 10 Hz to 1 MHz during drainage and imbibition cycles. Multiple electrodes down the sample length allow saturation monitoring and thus the detection of any saturation inhomogeneity throughout the samples. Sample wettability is evaluated using the Amott-Harvey wettability index (AHWI) on adjacent samples and the change in the Archie saturation exponent before and after aging in crude oil. The effect of frequency dispersion was analysed in relation to pore-scale fluid distribution and, hence, wettability. The results suggest complex resistivity measurements have potential as a non-invasive technique for evaluating reservoir wettability.

  14. Shock tunnel free flight force measurements using a complex model configuration

    Hannemann, Klaus; Martinez Schramm, Jan; Laurence, Stuart; Karl, Sebastian

    2015-01-01

    The free flight force measurement technique is a very attractive tool to determine forces and moments in particular in short duration ground based test facilities. With test times in the order of a few milliseconds, conventional force balances cannot be applied here. The technique has been applied in a number of shock tunnels utilizing models up to approximately 300 mm in length and looking at external aerodynamics. In the present study the technique is applied using a complex 1.5 m l...

  15. An Assessment of Wind Plant Complex Flows Using Advanced Doppler Radar Measurements

    Gunter, W. S.; Schroeder, J.; Hirth, B.; Duncan, J.; Guynes, J.

    2015-12-01

    As installed wind energy capacity continues to steadily increase, the need for comprehensive measurements of wind plant complex flows to further reduce the cost of wind energy has been well advertised by the industry as a whole. Such measurements serve diverse perspectives including resource assessment, turbine inflow and power curve validation, wake and wind plant layout model verification, operations and maintenance, and the development of future advanced wind plant control schemes. While various measurement devices have been matured for wind energy applications (e.g. meteorological towers, LIDAR, SODAR), this presentation will focus on the use of advanced Doppler radar systems to observe the complex wind flows within and surrounding wind plants. Advanced Doppler radars can provide the combined advantage of a large analysis footprint (tens of square kilometers) with rapid data analysis updates (a few seconds to one minute) using both single- and dual-Doppler data collection methods. This presentation demonstrates the utility of measurements collected by the Texas Tech University Ka-band (TTUKa) radars to identify complex wind flows occurring within and nearby operational wind plants, and provide reliable forecasts of wind speeds and directions at given locations (i.e. turbine or instrumented tower sites) 45+ seconds in advance. Radar-derived wind maps reveal commonly observed features such as turbine wakes and turbine-to-turbine interaction, high momentum wind speed channels between turbine wakes, turbine array edge effects, transient boundary layer flow structures (such as wind streaks, frontal boundaries, etc.), and the impact of local terrain. Operational turbine or instrumented tower data are merged with the radar analysis to link the observed complex flow features to turbine and wind plant performance.

  16. An entropy-based measure of hydrologic complexity and its applications

    Castillo, Aldrich; Castelli, Fabio; Entekhabi, Dara

    2015-01-01

    Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope-scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy-based and discretization-invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (most organization) and a uniform distribution (widest...
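
    One very simple dimensionless entropy index in this spirit is sketched below: the normalised Shannon entropy of the soil-moisture histogram, which is 0 for a delta-like (fully organised) field and approaches 1 for a uniform distribution. This is not necessarily the paper's exact discretization-invariant estimator; the synthetic fields are invented for illustration.

      import numpy as np

      def hydrologic_complexity(theta, bins=50):
          """Dimensionless entropy index for a soil-moisture field theta in [0, 1]:
          0 for a Dirac-delta-like distribution, 1 for a uniform distribution over [0, 1]."""
          hist, _ = np.histogram(np.clip(theta, 0.0, 1.0), bins=bins, range=(0.0, 1.0))
          p = hist / hist.sum()
          nz = p[p > 0]
          return float(-(nz * np.log(nz)).sum() / np.log(bins))

      rng = np.random.default_rng(2)
      organised = 0.31 + 0.002 * rng.standard_normal(10000)     # nearly constant field -> delta-like pdf
      heterogeneous = rng.uniform(0.05, 0.45, 10000)            # wide, flat pdf

      print(f"H (organised field)     = {hydrologic_complexity(organised):.3f}")
      print(f"H (heterogeneous field) = {hydrologic_complexity(heterogeneous):.3f}")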

  17. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Szi-Wen Chen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For ...

  18. Instrumentation measurement and testing complex for detection and identification of radioactive materials using the emitted radiation

    Simultaneous measurement of neutron and gamma radiation is a very useful method for effective identification and control of nuclear materials. The gamma-ray/neutron complex described in the paper is based on two multi-layer 3He neutron detectors and two high-pressure xenon gamma-ray spectrometers assembled in one unit. All these detectors were calibrated on neutron and gamma-ray sources. The main characteristics of the instrumentation, its testing results, and the measured gamma-ray and neutron radiation parameters are presented in the paper. Reliable detection and identification of gamma-neutron sources and fissile materials was demonstrated.

  19. RNACompress: Grammar-based compression and informational complexity measurement of RNA secondary structure

    Chen Chun

    2008-03-01

    Background: With the rapid emergence of RNA databases and newly identified non-coding RNAs, an efficient compression algorithm for RNA sequence and structural information is needed for the storage and analysis of such data. Although several algorithms for compressing DNA sequences have been proposed, none of them are suitable for the compression of RNA sequences together with their secondary structures. This kind of compression not only facilitates the maintenance of RNA data, but also supplies a novel way to measure the informational complexity of RNA structural data, raising the possibility of studying the relationship between the functional activities of RNA structures and their complexities, as well as various structural properties of RNA based on compression. Results: RNACompress employs an efficient grammar-based model to compress RNA sequences and their secondary structures. The main goals of this algorithm are two-fold: (1) present a robust and effective way for RNA structural data compression; (2) design a suitable model to represent RNA secondary structure as well as derive the informational complexity of the structural data based on compression. Our extensive tests have shown that RNACompress achieves a universally better compression ratio compared with other sequence-specific or common text-specific compression algorithms, such as GenCompress, WinRAR and gzip. Moreover, a test of the activities of distinct GTP-binding RNAs (aptamers) compared with their structural complexity shows that our defined informational complexity can be used to describe how complexity varies with activity. These results lead to an objective means of comparing the functional properties of heteropolymers from the information perspective. Conclusion: A universal algorithm for the compression of RNA secondary structure as well as the evaluation of its informational complexity is discussed in this paper. We have developed RNACompress as a useful tool…
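
    The compression-based notion of informational complexity can be illustrated with a generic byte compressor as a crude stand-in for the grammar-based model: more regular sequence/structure pairs compress to a smaller fraction of their original size. The toy sequences below are invented and zlib is only a proxy for RNACompress.

      import zlib

      def compression_complexity(sequence, structure):
          """Crude compression-based proxy for the informational complexity of an RNA
          sequence plus its dot-bracket secondary structure (compressed size / raw size)."""
          payload = (sequence + "\n" + structure).encode("ascii")
          return len(zlib.compress(payload, level=9)) / len(payload)

      # hypothetical toy examples (not real aptamers): a highly regular pair and a less regular one
      regular = ("GC" * 30, "(" * 28 + "...." + ")" * 28)
      irregular = ("GAUCCGUAAGCUAGGCUAACGGAUUCGCAUGGCAAUCGUAGGCAUCCGAUGGCAUAGCUA",
                   "..(((...)))..((((....))))...(((..(((...)))..)))...((....))..")

      print(f"regular   : {compression_complexity(*regular):.2f}")
      print(f"irregular : {compression_complexity(*irregular):.2f}")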

  20. Determining complex permittivity from propagation constant measurements with planar transmission lines

    A new two-standard calibration procedure is outlined for determining the complex permittivity of materials from the propagation constant measured with planar transmission lines. Once calibrated, a closed-form expression for the material permittivity is obtained. The effects of radiation and conductor losses are accounted for in the calibration. The multiline technique, combined with a recently proposed planar transmission-line configuration, is used to determine the line propagation constant. An uncertainty analysis is presented for the proposed calibration procedure that includes the uncertainties associated with the multiline technique. This allows line dimensions and calibration standards to be selected that minimize the total measurement uncertainty. The use of air and distilled water as calibration standards gives relatively small measurement uncertainty. Permittivity measurement results for five liquids, covering a wide permittivity range, agree very closely with expected values from 0.5–5 GHz. (paper)
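
    Once the propagation constant of the material-filled line is known, the permittivity follows in closed form in the idealised case. The sketch below assumes an ideal TEM line completely filled by the material, with conductor and radiation losses already calibrated out, which is a strong simplification of the procedure in the paper; the water permittivity value is an approximate textbook number used only to close the loop.

      import numpy as np

      c0 = 299792458.0  # speed of light in vacuum, m/s

      def permittivity_from_gamma(gamma, f):
          """Complex relative permittivity from a measured propagation constant
          gamma = alpha + j*beta, for an ideal TEM line fully filled with the material."""
          omega = 2 * np.pi * f
          return -(gamma * c0 / omega) ** 2

      # synthetic check: distilled water near 1 GHz, eps_r ~ 78 - 5j (approximate textbook value)
      f = 1.0e9
      eps_true = 78.0 - 5.0j
      gamma = 1j * (2 * np.pi * f / c0) * np.sqrt(eps_true)    # forward model
      print("alpha = %.2f Np/m, beta = %.2f rad/m" % (gamma.real, gamma.imag))
      print("recovered eps_r =", np.round(permittivity_from_gamma(gamma, f), 2))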

  1. MEASURING OBJECT-ORIENTED SYSTEMS BASED ON THE EXPERIMENTAL ANALYSIS OF THE COMPLEXITY METRICS

    J.S.V.R.S.SASTRY,

    2011-05-01

    Metrics are used to help a software engineer in quantitative analysis to assess the quality of a design before a system is built. The focus of object-oriented metrics is on the class, which is the fundamental building block of the object-oriented architecture. These metrics address internal object structure and external object structure. Internal object structure reflects the complexity of each individual entity, such as methods and classes. External complexity measures the interaction among entities, such as coupling and inheritance. This paper mainly focuses on a set of object-oriented metrics that can be used to measure the quality of an object-oriented design. Two types of complexity metrics in the object-oriented paradigm are considered, namely MOOD metrics and Lorenz & Kidd metrics. MOOD metrics consist of the method inheritance factor (MIF), coupling factor (CF), attribute inheritance factor (AIF), method hiding factor (MHF), attribute hiding factor (AHF), and polymorphism factor (PF). Lorenz & Kidd metrics consist of the number of operations overridden (NOO), number of operations added (NOA), and specialization index (SI). MOOD metrics and Lorenz & Kidd metrics measurements are used mainly by designers and testers. Designers use these metrics to assess the software early in the process, making changes that will reduce complexity and improve the continuing capability of the design. Testers use them to test the software, finding the complexity and performance of the system and the quality of the software. This paper reviews MOOD metrics and Lorenz & Kidd metrics and validates them theoretically and empirically. In this paper, work has been done to explore the quality of design of software components using the object-oriented paradigm. A number of object-oriented metrics have been proposed in the literature for measuring design attributes such as inheritance, coupling, and polymorphism. In this paper, these metrics have been used to analyze various features of software components. Complexity of methods…
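
    As an illustration of how such design metrics are computed, the sketch below evaluates the MOOD Method Inheritance Factor on a tiny invented class hierarchy using Python introspection; the hierarchy and the exact counting rules are assumptions for illustration, not the validation study's data.

      # hypothetical class hierarchy, invented only to exercise the metric
      class Shape:
          def area(self): ...
          def perimeter(self): ...

      class Circle(Shape):
          def area(self): ...        # overrides an inherited method
          def radius(self): ...      # newly added method

      class Square(Shape):
          def side(self): ...        # newly added method

      def method_inheritance_factor(classes):
          """MOOD Method Inheritance Factor:
          MIF = sum_i Mi(Ci) / sum_i Ma(Ci), where Mi counts methods inherited (and not
          overridden) by class Ci and Ma counts all methods available in Ci."""
          inherited_total = available_total = 0
          for cls in classes:
              declared = {m for m, v in vars(cls).items()
                          if callable(v) and not m.startswith("_")}
              available = {m for m in dir(cls)
                           if callable(getattr(cls, m)) and not m.startswith("_")}
              inherited_total += len(available - declared)
              available_total += len(available)
          return inherited_total / available_total if available_total else 0.0

      print(f"MIF = {method_inheritance_factor([Shape, Circle, Square]):.3f}")   # 3/8 for this hierarchy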

  2. Acid water problem in Bukit Asam Coal Mine, South Sumatra, Indonesia

    Gautama, R.S. [Institut Teknologi Bandung, Bandung (Indonesia). Dept. of Mining Engineering

    1994-09-01

    With an average annual rainfall of more than 2800 mm, runoff water is considered to be the main water problem faced by the Bukit Asam Coal Mine. There is only a minor problem due to groundwater, as the relevant aquifer consists of sandstone with low hydraulic conductivity, i.e. less than 10^-7 m/s. Water quality monitoring done periodically as part of an environmental monitoring program has detected water with low pH. The problem is significant as it relates to a large amount of acid water in an abandoned pit in East Klawas which could be discharged to the nearby Enim River. The acid water problem in the Bukit Asam mine is discussed. Since this problem has never been encountered before, the analysis to understand the acid-generating process is at present based on very limited data. Further research is necessary and is being conducted; at the same time, an appropriate method to handle the problem has to be developed to support an environmentally sound mining operation. 6 refs., 4 figs., 2 tabs.

  3. A Statistical Framework to Infer Delay and Direction of Information Flow from Measurements of Complex Systems.

    Schumacher, Johannes; Wunderle, Thomas; Fries, Pascal; Jäkel, Frank; Pipa, Gordon

    2015-08-01

    In neuroscience, data are typically generated from neural network activity. The resulting time series represent measurements from spatially distributed subsystems with complex interactions, weakly coupled to a high-dimensional global system. We present a statistical framework to estimate the direction of information flow and its delay in measurements from systems of this type. Informed by differential topology, Gaussian process regression is employed to reconstruct measurements of putative driving systems from measurements of the driven systems. These reconstructions serve to estimate the delay of the interaction by means of an analytical criterion developed for this purpose. The model accounts for a range of possible sources of uncertainty, including temporally evolving intrinsic noise, while assuming complex nonlinear dependencies. Furthermore, we show that if information flow is delayed, this approach also allows for inference in strong coupling scenarios of systems exhibiting synchronization phenomena. The validity of the method is demonstrated with a variety of delay-coupled chaotic oscillators. In addition, we show that these results seamlessly transfer to local field potentials in cat visual cortex. PMID:26079751

  4. A Measure for Brain Complexity: Relating Functional Segregation and Integration in the Nervous System

    Tononi, Giulio; Sporns, Olaf; Edelman, Gerald M.

    1994-05-01

    In brains of higher vertebrates, the functional segregation of local areas that differ in their anatomy and physiology contrasts sharply with their global integration during perception and behavior. In this paper, we introduce a measure, called neural complexity (C_N), that captures the interplay between these two fundamental aspects of brain organization. We express functional segregation within a neural system in terms of the relative statistical independence of small subsets of the system and functional integration in terms of significant deviations from independence of large subsets. C_N is then obtained from estimates of the average deviation from statistical independence for subsets of increasing size. C_N is shown to be high when functional segregation coexists with integration and to be low when the components of a system are either completely independent (segregated) or completely dependent (integrated). We apply this complexity measure in computer simulations of cortical areas to examine how some basic principles of neuroanatomical organization constrain brain dynamics. We show that the connectivity patterns of the cerebral cortex, such as a high density of connections, strong local connectivity organizing cells into neuronal groups, patchiness in the connectivity among neuronal groups, and prevalent reciprocal connections, are associated with high values of C_N. The approach outlined here may prove useful in analyzing complexity in other biological domains such as gene regulation and embryogenesis.
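
    For Gaussian variables, C_N can be computed from covariance determinants. The sketch below uses one published formulation, C_N = Σ_k [(k/n)·I(X) − ⟨I(X_k)⟩], with I the integration (multi-information) of a subset and the average taken over all subsets of size k; the covariance matrices are invented for illustration and exhaustive enumeration is only feasible for small n.

      import itertools
      import numpy as np

      def gaussian_integration(cov, idx):
          """Integration (multi-information) of the Gaussian variables in `idx`, in nats:
          I = sum_i H(x_i) - H(X) = 0.5 * ln( prod(var_i) / det(Sigma) )."""
          sub = cov[np.ix_(idx, idx)]
          return 0.5 * np.log(np.prod(np.diag(sub)) / np.linalg.det(sub))

      def neural_complexity(cov):
          """C_N = sum_k [ (k/n) * I(X) - <I(X_k)> ], averaged over all subsets of size k."""
          n = cov.shape[0]
          I_total = gaussian_integration(cov, list(range(n)))
          c = 0.0
          for k in range(1, n + 1):
              subsets = itertools.combinations(range(n), k)
              mean_I = np.mean([gaussian_integration(cov, list(s)) for s in subsets])
              c += (k / n) * I_total - mean_I
          return c

      n = 8
      independent = np.eye(n)                     # fully segregated: C_N = 0

      # two modules of 4 units: strong within-module, weak between-module correlation
      modular = np.full((n, n), 0.05)
      modular[:4, :4] = modular[4:, 4:] = 0.6
      np.fill_diagonal(modular, 1.0)

      print(f"C_N independent = {neural_complexity(independent):.3f} nats")
      print(f"C_N modular     = {neural_complexity(modular):.3f} nats")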

  5. Growing complex network of citations of scientific papers -- measurements and modeling

    Golosovsky, M

    2016-01-01

    To quantify the mechanism of a complex network growth we focus on the network of citations of scientific papers and use a combination of the theoretical and experimental tools to uncover microscopic details of this network growth. Namely, we develop a stochastic model of citation dynamics based on copying/redirection/triadic closure mechanism. In a complementary and coherent way, the model accounts both for statistics of references of scientific papers and for their citation dynamics. Originating in empirical measurements, the model is cast in such a way that it can be verified quantitatively in every aspect. Such verification is performed by measuring citation dynamics of Physics papers. The measurements revealed nonlinear citation dynamics, the nonlinearity being intricately related to network topology. The nonlinearity has far-reaching consequences including non-stationary citation distributions, diverging citation trajectory of similar papers, runaways or "immortal papers" with infinite citation lifetime ...

  6. Simulation of complex glazing products; from optical data measurements to model based predictive controls

    Kohler, Christian

    2012-08-01

    Complex glazing systems such as venetian blinds, fritted glass and woven shades require more detailed optical and thermal input data for their components than specular non light-redirecting glazing systems. Various methods for measuring these data sets are described in this paper. These data sets are used in multiple simulation tools to model the thermal and optical properties of complex glazing systems. The output from these tools can be used to generate simplified rating values or as an input to other simulation tools such as whole building annual energy programs, or lighting analysis tools. I also describe some of the challenges of creating a rating system for these products and which factors affect this rating. A potential future direction of simulation and building operations is model based predictive controls, where detailed computer models are run in real-time, receiving data for an actual building and providing control input to building elements such as shades.

  7. Measuring working memory in aphasia: Comparing performance on complex span and N-back tasks

    Maria Ivanova

    2014-04-01

    No significant correlations were observed between performance on the complex span task and the N-back tasks. Furthermore, performance on the modified listening span was related to performance on the comprehension subtest of the QASA, while no relationship was found for the 2-back and 0-back tasks. Our results mirror studies in healthy controls that demonstrated no relationship between performance on the two tasks (Jaeggi et al., 2010; Kane et al., 2007). Thus, although N-back tasks seem similar to traditional complex span measures and may also index abilities related to cognitive processing, the evidence to date does not warrant their direct association with the construct of WM. Implications for future investigation of cognitive deficits in aphasia will be discussed.

  8. Estimation of Defect proneness Using Design complexity Measurements in Object- Oriented Software

    Selvarani, R; Prasad, V Kamakshi

    2010-01-01

    Software engineering continuously faces the challenges of the growing complexity of software packages and increased levels of data on defects and drawbacks from the software production process. This makes a clarion call for inventions and methods which can enable more reusable, reliable, easily maintainable and high-quality software systems with deeper control over the software generation process. Quality and productivity are indeed the two most important parameters for controlling any industrial process. Implementation of a successful control system requires some means of measurement. Software metrics play an important role in the management aspects of the software development process, such as better planning, assessment of improvements, resource allocation and reduction of unpredictability. Processes involving early detection of potential problems, productivity evaluation and evaluation of external quality factors such as reusability, maintainability, defect proneness and complexity are of utmost importance. Here we d...

  9. Detecting Microbial Growth and Metabolism in Geologic Media with Complex Conductivity Measurements

    Davis, C. A.; Atekwana, E. A.; Slater, L. D.; Bottrell, P. M.; Chasten, L. E.; Heidenreich, J. D.

    2006-05-01

    Complex conductivity measurements between 0.1-1000 Hz were obtained from biostimulated sand-packed (coarse and mixed fine and medium grain) columns to investigate microbial growth, biofilm formation, and microbial metabolism on the electrical properties of porous media. Microbial growth and metabolism was verified by direct microbial counts, pH changes, and environmental scanning electron microscope imaging. Peaks in imaginary (interfacial) conductivity in the coarse grain columns occurred concurrently with peaks in the microbial cell concentrations. The magnitude of the imaginary conductivity response in the mixed fine and medium grain columns, however, was low compared to the coarse grain sand columns, consistent with lower microbial cell concentrations. It is possible that the pore size in the mixed fine and medium grain sand restricted bacteria cell division, inhibiting microbial growth, and thus the smaller magnitude imaginary conductivity response. The biostimulated columns for both grain sizes displayed similar trends and showed an increase in the real (electrolytic) conductivity and decrease in pH over time. Dynamic changes in the imaginary conductivity arises from the growth and attachment of microbial cells and biofilms to surfaces, whereas, changes in the real conductivity arises from the release of byproducts (ionic species) of microbial metabolism. We conclude that complex conductivity techniques are feasible sensors for detecting microbial growth (imaginary conductivity measurements) and metabolism (real conductivity measurements) with implications for bioremediation and astrobiology studies.

  10. In vivo and in situ measurement and modelling of intra-body effective complex permittivity.

    Nadimi, Esmaeil S; Blanes-Vidal, Victoria; Harslund, Jakob L F; Ramezani, Mohammad H; Kjeldsen, Jens; Johansen, Per Michael; Thiel, David; Tarokh, Vahid

    2015-12-01

    Radio frequency tracking of medical micro-robots in minimally invasive medicine is usually investigated upon the assumption that the human body is a homogeneous propagation medium. In this Letter, the authors conducted various trial programs to measure and model the effective complex permittivity ε in terms of refraction ε', absorption ε″ and their variations in gastrointestinal (GI) tract organs (i.e. oesophagus, stomach, small intestine and large intestine) and the porcine abdominal wall under in vivo and in situ conditions. They further investigated the effects of irregular and unsynchronised contractions and simulated peristaltic movements of the GI tract organs inside the abdominal cavity and in the presence of the abdominal wall on the measurements and variations of ε' and ε''. They advanced the previous models of effective complex permittivity of a multilayer inhomogeneous medium, by estimating an analytical model that accounts for reflections between the layers and calculates the attenuation that the wave encounters as it traverses the GI tract and the abdominal wall. They observed that deviation from the specified nominal layer thicknesses due to non-geometric boundaries of GI tract morphometric variables has an impact on the performance of the authors' model. Therefore, they derived statistical-based models for ε' and ε'' using their experimental measurements. PMID:26713157

  11. Complex permittivity measurement at millimetre-wave frequencies during the fermentation process of Japanese sake

    Kouzai, Masaki [Tokyo Institute of Technology, Meguro, Tokyo 152-8552 (Japan); Nishikata, Atsuhiro [Tokyo Institute of Technology, Meguro, Tokyo 152-8552 (Japan); Fukunaga, Kaori [NICT, Koganei, Tokyo 184-8795 (Japan); Miyaoka, Shunsuke [Industrial Research Centre of Ehime, Matsuyama, Ehime 791-1101 (Japan)

    2007-01-07

    Various chemical reactions occur simultaneously in barrels during the fermentation processes of alcoholic beverages. Chemical analyses are employed to monitor the change in chemical components, such as glucose and ethyl alcohol. The tests are carried out with extracted specimens, are costly and require time. We have developed a permittivity measurement system for liquid specimens in the frequency range from 2.6 to 50 GHz, and applied the system to fermentation monitoring. Experimental results proved that the observed change in complex permittivity suggests a decrease in the amount of glucose and an increase in alcohol content, which are the key chemical components during the fermentation process.

  12. Relating Hyperspectral Airborne Data to Ground Measurements in a Complex and Discontinuous Canopy

    Calleja Javier F.

    2015-12-01

    The work described in this paper is aimed at validating hyperspectral airborne reflectance data collected during the Regional Experiments For Land-atmosphere EXchanges (REFLEX) campaign. Ground reflectance data measured in a vineyard were compared with airborne reflectance data. A sampling strategy and subsequent ground data processing had to be devised so as to capture a representative spectral sample of this complex crop. A linear model between airborne and ground data was tried and statistically tested. Results reveal a sound correspondence between ground and airborne reflectance data (R² > 0.97), validating the atmospheric correction of the latter.

  13. Complex permittivity measurement at millimetre-wave frequencies during the fermentation process of Japanese sake

    Various chemical reactions occur simultaneously in barrels during the fermentation processes of alcoholic beverages. Chemical analyses are employed to monitor the change in chemical components, such as glucose and ethyl alcohol. The tests are carried out with extracted specimens, are costly and require time. We have developed a permittivity measurement system for liquid specimens in the frequency range from 2.6 to 50 GHz, and applied the system to fermentation monitoring. Experimental results proved that the observed change in complex permittivity suggests a decrease in the amount of glucose and an increase in alcohol content, which are the key chemical components during the fermentation process.

  14. A Thorax Simulator for Complex Dynamic Bioimpedance Measurements With Textile Electrodes.

    Ulbrich, Mark; Muhlsteff, Jens; Teichmann, Daniel; Leonhardt, Steffen; Walter, Marian

    2015-06-01

    Bioimpedance measurements on the human thorax are suitable for assessment of body composition or hemodynamic parameters, such as stroke volume; they are non-invasive, easy in application and inexpensive. When targeting personal healthcare scenarios, the technology can be integrated into textiles to increase ease, comfort and coverage of measurements. Bioimpedance is generally measured using two electrodes injecting low alternating currents (0.5-10 mA) and two additional electrodes to measure the corresponding voltage drop. The impedance is measured either spectroscopically (bioimpedance spectroscopy, BIS) between 5 kHz and 1 MHz or continuously at a fixed frequency around 100 kHz (impedance cardiography, ICG). A thorax simulator is being developed for testing and calibration of bioimpedance devices and other new developments. For the first time, it is possible to mimic the complete time-variant properties of the thorax during an impedance measurement. This includes the dynamic real part and dynamic imaginary part of the impedance with a peak-to-peak value of 0.2 Ω and an adjustable base impedance (24.6 Ω ≤ Z0 ≤ 51.6 Ω). Another novelty is adjustable complex electrode-skin contact impedances for up to 8 electrodes to evaluate bioimpedance devices in combination with textile electrodes. In addition, an electrocardiographic signal is provided for cardiographic measurements which is used in ICG devices. This provides the possibility to generate physiologic impedance changes, and in combination with an ECG, all parameters of interest such as stroke volume (SV), pre-ejection period (PEP) or extracellular resistance (Re) can be simulated. The speed of all dynamic signals can be altered. The simulator was successfully tested with commercially available BIS and ICG devices and the preset signals are measured with high correlation (r = 0.996). PMID:25148671

  15. Selective extraction of metals from products of mine acidic water treatment

    A study was made on possibility of processing of foam products prepared during flotation purification of mine acidic waters for the purpose of selective extraction of non-ferrous (Co, Ni) and rare earth elements (REE) and their separation from the basic macrocomponent of waters-iron. Optimal conditions of selective metal extraction from foam flotation products are the following: T=333 K, pH=3.0-3.5, ratio of solid and liquid phase - 1:4-1:7, duration of sulfuric acid leaching - 30 min. Rare earth extraction under such conditions equals 87.6-93.0%. The degree of valuable component concentration equals ∼ 10. Rare earths are separated from iron by extraction methods

  16. Fine-grained permutation entropy as a measure of natural complexity for time series

    Liu Xiao-Feng; Wang Yue

    2009-01-01

    In a recent paper [2002 Phys. Rev. Lett. 88 174102], Bandt and Pompe propose permutation entropy (PE) as a natural complexity measure for arbitrary time series which may be stationary or nonstationary, deterministic or stochastic. Their method is based on a comparison of neighbouring values. This paper further develops PE, and proposes the concept of fine-grained PE (FGPE) defined by the order pattern and magnitude of the difference between neighbouring values. This measure excludes the case where vectors with a distinct appearance are mistakenly mapped onto the same permutation type, and consequently FGPE becomes more sensitive to the dynamical change of time series than does PE, according to our simulation and experimental results.
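
    As a concrete illustration of the Bandt-Pompe idea this abstract builds on, the sketch below computes standard permutation entropy for a one-dimensional series by counting ordinal patterns of neighbouring values. It is a minimal Python sketch, not the authors' code, and it does not implement the fine-grained (FGPE) refinement, which additionally takes the magnitude of the differences into account; the function name and the defaults m = 3, tau = 1 are choices of this sketch.

      import numpy as np
      from collections import Counter
      from math import factorial

      def permutation_entropy(x, m=3, tau=1, normalize=True):
          # Bandt-Pompe permutation entropy of a 1-D series.
          # m: embedding (pattern) dimension, tau: time delay.
          x = np.asarray(x, dtype=float)
          n_patterns = len(x) - (m - 1) * tau
          if n_patterns <= 0:
              raise ValueError("series too short for the chosen m and tau")
          # Map every window of m values to its ordinal (rank) pattern.
          patterns = Counter(
              tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n_patterns)
          )
          probs = np.array(list(patterns.values()), dtype=float) / n_patterns
          pe = -np.sum(probs * np.log2(probs))
          return pe / np.log2(factorial(m)) if normalize else pe

      # Example: a noisy sine should score lower than pure white noise.
      rng = np.random.default_rng(0)
      t = np.linspace(0, 20 * np.pi, 2000)
      print(permutation_entropy(np.sin(t) + 0.1 * rng.standard_normal(t.size)))
      print(permutation_entropy(rng.standard_normal(t.size)))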

  17. Fine-grained permutation entropy as a measure of natural complexity for time series

    In a recent paper [2002 Phys. Rev. Lett. 88 174102], Bandt and Pompe propose permutation entropy (PE) as a natural complexity measure for arbitrary time series which may be stationary or nonstationary, deterministic or stochastic. Their method is based on a comparison of neighbouring values. This paper further develops PE, and proposes the concept of fine-grained PE (FGPE) defined by the order pattern and magnitude of the difference between neighbouring values. This measure excludes the case where vectors with a distinct appearance are mistakenly mapped onto the same permutation type, and consequently FGPE becomes more sensitive to the dynamical change of time series than does PE, according to our simulation and experimental results. (general)

  18. A new closeness centrality measure via effective distance in complex networks

    Du, Yuxian; Gao, Cai; Chen, Xin; Hu, Yong; Sadiq, Rehan; Deng, Yong

    2015-03-01

    Closeness centrality (CC), as a well-known global measure, is widely applied in many complex networks. However, the classical CC presents problems for flow networks, since these networks are directed and weighted. To address these issues, we propose an effective distance based closeness centrality (EDCC), which replaces the conventional geographic distance and the binary distance obtained by Dijkstra's shortest path algorithm with effective distance. The proposed EDCC considers not only the global structure of the network but also the local information of nodes, and it can be applied to directed or undirected, weighted or unweighted networks. The Susceptible-Infected model is utilized to evaluate the performance by using the spreading rate and the number of infected nodes. Numerical examples simulated on four real networks are given to show the effectiveness of the proposed EDCC.
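
    The following is a minimal Python sketch of the idea behind an effective-distance-based closeness centrality: edge weights are converted into out-flow probabilities, effective lengths d_ij = 1 - log(p_ij) are assigned, and closeness is computed from Dijkstra shortest paths over those lengths. The exact EDCC formulation and normalization in the paper may differ; the function name, the networkx usage and the closeness normalization (number of reachable nodes divided by the sum of their distances) are assumptions of this sketch.

      import numpy as np
      import networkx as nx

      def effective_distance_closeness(G, weight="weight"):
          # Closeness centrality over effective distances d_ij = 1 - log(p_ij),
          # where p_ij is the fraction of node i's out-flow carried by edge (i, j).
          D = nx.DiGraph()
          for u in G:
              out = sum(G[u][v].get(weight, 1.0) for v in G.successors(u))
              for v in G.successors(u):
                  p = G[u][v].get(weight, 1.0) / out
                  D.add_edge(u, v, eff=1.0 - np.log(p))  # p <= 1, so eff >= 1
          cc = {}
          for u in D:
              lengths = nx.single_source_dijkstra_path_length(D, u, weight="eff")
              reach = [d for v, d in lengths.items() if v != u]
              cc[u] = len(reach) / sum(reach) if reach else 0.0
          return cc

      # Example on a small weighted, directed network.
      G = nx.DiGraph()
      G.add_weighted_edges_from([("a", "b", 3), ("a", "c", 1),
                                 ("b", "c", 2), ("c", "a", 1)])
      print(effective_distance_closeness(G))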

  19. Rb-Sr measurements on metamorphic rocks from the Barro Alto Complex, Goias, Brazil

    The Barro Alto Complex comprises a highly deformed and metamorphosed association of plutonic, volcanic, and sedimentary rocks exposed in a 150 x 25 Km boomerang-like strip in Central Goias, Brazil. It is the southernmost tip of an extensive yet discontinuous belt of granulite and amphibolite facies metamorphic rocks which include the Niquelandia and Cana Brava complexes to the north. Two rock associations are distinguished within the granulite belt. The first one comprises a sequence of fine-grained mafic granulite, hypersthene-quartz-feldspar granulite, garnet quartzite, sillimanite-garnet-cordierite gneiss, calc-silicate rock, and magnetite-rich iron formation. The second association comprises medium-to coarse-grained mafic rocks. The medium-grade rocks of the western/northern portion (Barro Alto Complex) comprise both layered mafic rocks and a volcanic-sedimentary sequence, deformed and metamorphosed under amphibolite facies conditions. The fine-grained amphibolite form the basal part of the Juscelandia meta volcanic-sedimentary sequence. A geochronologic investigation by the Rb-Sr method has been carried out mainly on felsic rocks from the granulite belt and gneisses of the Juscelandia sequence. The analytical results for the Juscelandia sequence are presented. Isotope results for rocks from different outcrops along the gneiss layer near Juscelandia are also presented. In conclusion, Rb-Sr isotope measurements suggest that the Barro Alto rocks have undergone at least one important metamorphic event during Middle Proterozoic times, around 1300 Ma ago. During that event volcanic and sedimentary rocks of the Juscelandia sequence, as well as the underlying gabbro-anorthosite layered complex, underwent deformation and recrystallization under amphibolite facies conditions. (author)

  20. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Chen Szi-Wen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.

  1. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Chen, Szi-Wen

    2006-12-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.

  2. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Szi-Wen Chen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.
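
    To make the windowing step concrete, the sketch below shifts a 5-second window in 1-second steps over an ECG trace and computes a complexity measure for each segment. The abstract does not spell out the CM used, so this sketch substitutes a Lempel-Ziv complexity of the mean-binarized segment, a common choice in VF/VT detection work; the binarization rule, the function names and the synthetic example signal are assumptions rather than the authors' implementation.

      import numpy as np

      def lempel_ziv_complexity(bits):
          # Number of phrases in a simple Lempel-Ziv (1976) parsing of a
          # binary sequence; used here as the per-window complexity measure.
          s = "".join("1" if b else "0" for b in bits)
          i, c, n = 0, 0, len(s)
          while i < n:
              k = 1
              # grow the phrase until it is no longer a substring of what came before
              while i + k <= n and s[i:i + k] in s[:i + k - 1]:
                  k += 1
              c += 1
              i += k
          return c

      def windowed_cm(ecg, fs, win_s=5.0, step_s=1.0):
          # CM value of each 5-s window shifted by 1 s, mirroring the paper's setup;
          # binarization by the window mean is an assumption of this sketch.
          win, step = int(win_s * fs), int(step_s * fs)
          values = []
          for start in range(0, len(ecg) - win + 1, step):
              seg = ecg[start:start + win]
              values.append(lempel_ziv_complexity(seg > np.mean(seg)))
          return values

      # Example with a synthetic 10-second "recording" sampled at 250 Hz.
      rng = np.random.default_rng(1)
      fs = 250
      t = np.arange(0, 10, 1 / fs)
      ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
      print(windowed_cm(ecg, fs))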

  3. [Sample pretreatment for the measurement of phthalate esters in complex matrices].

    Liang, Jing; Zhuang, Wan'e; Lin, Fang; Yao, Wensong

    2014-11-01

    Sample pretreatment methods for the measurement of phthalate esters (PAEs) by gas chromatography-mass spectrometry (GC-MS) in various complex matrices, including sediment, soil, suspended particle matter, urban surface dust, Sinonovacula constricta, cosmetic, leather, plastic and coastal/estuarine seawater, were proposed. The pretreatment appropriate for GC-MS detection was focused on the investigation and optimization of operating parameters for the extraction and purification, such as the extraction solvent, the eluent and the adsorbent for solid phase extraction. The results of the study of pretreatment for various complex matrices showed that methylene chloride was the best solvent for ultrasonic extraction when solid-liquid extraction was used; silica gel was an economical and practical adsorbent for solid-phase extraction purification; C18 was the most common adsorbent for preconcentration of PAEs in coastal/estuarine seawater samples; and a mixed solution of n-hexane and ethyl acetate in a suitable proportion was the appropriate SPE eluent. Under the optimized conditions, the spiked recoveries were above 58% and the relative standard deviations (RSDs) were less than 10.5% (n = 6). The detection limits (DL, 3σ) were in the range of 0.3 μg/kg (dibutyl phthalate) to 5.2 μg/kg (diisononyl phthalate) for sediment, and 6 ng/L (dipropyl phthalate) to 67 ng/L (diisodecyl phthalate) for coastal/estuarine seawater. The pretreatment method for various complex matrices is suitable for the measurement of the 16 PAEs by GC-MS. PMID:25764660

  4. A study on the development of a task complexity measure for emergency operating procedures of nuclear power plants

    Park, Jinkyun [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)]. E-mail: kshpjk@kaeri.re.kr; Jung, Wondea [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)

    2007-08-15

    In this study, a measure called task complexity (TACOM) that can quantify the complexity of tasks stipulated in emergency operating procedures of nuclear power plants is developed. The TACOM measure consists of five sub-measures that can cover remarkable complexity factors: (1) amount of information to be managed by operators, (2) logical entanglement due to the logical sequence of the required actions, (3) amount of actions to be accomplished by operators, (4) amount of system knowledge in recognizing the problem space, and (5) amount of cognitive resources in establishing an appropriate decision criterion. The appropriateness of the TACOM measure is investigated by comparing task performance time data with the associated TACOM scores. As a result, it is observed that there is a significant correlation between TACOM scores and task performance time data. Therefore, it is reasonable to expect that the TACOM measure can be used as a meaningful tool to quantify the complexity of tasks.

  5. Measurement of unsteady convection in a complex fenestration using laser interferometry

    Poulad, M.E.; Naylor, D. [Ryerson Univ., Toronto, ON (Canada). Dept. of Mechanical and Industrial Engineering; Oosthuizen, P.H. [Queen's Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering

    2009-06-15

    Complex fenestration involving windows with between-panes louvered blinds is gaining interest as a means to control solar gains in buildings. However, the heat transfer performance of this type of shading system is not well understood, especially at high Rayleigh numbers. A Mach-Zehnder interferometer was used in this study to measure the unsteady convective heat transfer in a tall enclosure with between-panes blind that was heated to simulate absorbed solar radiation. Digital cinematography was combined with laser interferometry to make time-averaged measurements of unsteady and turbulent free convective heat transfer. This paper described the procedures used to measure the time-average local heat flux. Under strongly turbulent conditions, the average Nusselt number for the enclosure was found to compare well with empirical correlations. A total sampling time of about ten seconds was needed in this experiment to obtain a stationary time-average heat flux. The time-average heat flux was found to be relatively insensitive to the camera frame rate. The local heat flux was found to be unsteady and periodic. Heating of the blind made the flow more unstable, producing a higher amplitude heat flux variation than for the unheated blind condition. This paper reported on only a small set of preliminary measurements. This study is being extended to other blind angles and glazing spacings. The next phase will focus on flow visualization studies to characterize the nature of the flow. 8 refs., 2 tabs., 7 figs.

  6. Measuring spatial patterns in floodplains: A step towards understanding the complexity of floodplain ecosystems: Chapter 6

    Murray Scown; Martin Thoms; DeJager, Nathan R.

    2016-01-01

    Floodplains can be viewed as complex adaptive systems (Levin, 1998) because they are comprised of many different biophysical components, such as morphological features, soil groups and vegetation communities, as well as being sites of key biogeochemical processing (Stanford et al., 2005). Interactions and feedbacks among the biophysical components often result in additional phenomena occurring over a range of scales, often in the absence of any controlling factors (sensu Hallet, 1990). This emergence of new biophysical features and rates of processing can lead to alternative stable states which feed back into floodplain adaptive cycles (cf. Hughes, 1997; Stanford et al., 2005). Interactions between different biophysical components, feedbacks, self-emergence and scale are all key properties of complex adaptive systems (Levin, 1998; Phillips, 2003; Murray et al., 2014) and therefore will influence the manner in which we study and view spatial patterns. Measuring the spatial patterns of floodplain biophysical components is a prerequisite to examining and understanding these ecosystems as complex adaptive systems. Elucidating relationships between pattern and process, which are intrinsically linked within floodplains (Ward et al., 2002), is dependent upon an understanding of spatial pattern. This knowledge can help river scientists determine the major drivers, controllers and responses of floodplain structure and function, as well as the consequences of altering those drivers and controllers (Hughes and Cass, 1997; Whited et al., 2007). Interactions and feedbacks between physical, chemical and biological components of floodplain ecosystems create and maintain a structurally diverse and dynamic template (Stanford et al., 2005). This template influences subsequent interactions between components that consequently affect system trajectories within floodplains (sensu Bak et al., 1988). Constructing and evaluating models used to predict floodplain ecosystem responses to

  7. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    J.-C. Raut

    2008-02-01

    A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF), enabling the retrieval of vertical profiles for the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The averaged value over the entire planetary boundary layer (PBL) for the ACRI is close to 1.51(±0.02)-i0.017(±0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of ACRI in the atmospheric column enabled the retrieval of vertical profiles of extinction coefficient in accordance with lidar profile measurements.

  8. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    A synergy between lidar, sun photometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalite de l'air en Ile-de-France (ESQUIF), enabling the retrieval of vertical profiles for the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over Paris area. The averaged value over the entire planetary boundary layer (PBL) for the ACRI is close to 1.51(± 0.02)-i0.017(± 0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be similar to 0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of ACRI in the atmospheric column enabled to retrieve vertical profiles of extinction coefficient in accordance with lidar profiles measurements. (authors)

  9. Cortical complexity as a measure of age-related brain atrophy.

    Madan, Christopher R; Kensinger, Elizabeth A

    2016-07-01

    The structure of the human brain changes in a variety of ways as we age. While a sizeable literature has examined age-related differences in cortical thickness, and to a lesser degree, gyrification, here we examined differences in cortical complexity, as indexed by fractal dimensionality in a sample of over 400 individuals across the adult lifespan. While prior studies have shown differences in fractal dimensionality between patient populations and age-matched, healthy controls, it is unclear how well this measure would relate to age-related cortical atrophy. Initially computing a single measure for the entire cortical ribbon, i.e., unparcellated gray matter, we found fractal dimensionality to be more sensitive to age-related differences than either cortical thickness or gyrification index. We additionally observed regional differences in age-related atrophy between the three measures, suggesting that they may index distinct differences in cortical structure. We also provide a freely available MATLAB toolbox for calculating fractal dimensionality. PMID:27103141
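
    The fractal dimensionality referred to here is typically estimated by box counting; the sketch below is a minimal Python version of that idea for a 3-D binary mask such as a binarized cortical ribbon. It is not the MATLAB toolbox mentioned in the abstract, and the box sizes, the padding strategy and the log-log fit are choices of this sketch.

      import numpy as np

      def box_counting_dimension(volume, sizes=(2, 4, 8, 16)):
          # Box-counting estimate of fractal dimensionality for a 3-D binary volume.
          counts = []
          for k in sizes:
              # pad so that every axis is a multiple of the box size k
              pad = [(0, (-volume.shape[d]) % k) for d in range(3)]
              v = np.pad(volume, pad)
              # group voxels into k*k*k boxes and count boxes containing any voxel
              boxes = v.reshape(v.shape[0] // k, k,
                                v.shape[1] // k, k,
                                v.shape[2] // k, k)
              counts.append(boxes.any(axis=(1, 3, 5)).sum())
          # slope of log N(k) versus log(1/k) estimates the fractal dimensionality
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return slope

      # Example: a solid cube should give a value close to 3.
      vol = np.zeros((64, 64, 64), dtype=bool)
      vol[8:56, 8:56, 8:56] = True
      print(box_counting_dimension(vol))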

  10. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    J.-C. Raut

    2007-07-01

    A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF), enabling the retrieval of vertical profiles for the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The averaged value over the entire planetary boundary layer (PBL) for the ACRI is close to 1.51(±0.02)-i0.017(±0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of ACRI in the atmospheric column enabled the retrieval of vertical profiles of extinction coefficient in accordance with lidar profile measurements.

  11. Measuring The Influence of TAsk COMplexity on Human Error Probability: An Empirical Evaluation

    A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing the human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent (objectively and quantitatively) task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors are available from the International HRA Empirical Study. The empirical evaluation has shown promising results. The TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished if related to different difficulty categories (e. g., 'easy' vs. 'somewhat difficult'), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i. e., influencing the error probability), and 2) the quantitative relationships among PSFs and error probability are

  12. Rotational study of the CH4-CO complex: Millimeter-wave measurements and ab initio calculations.

    Surin, L A; Tarabukin, I V; Panfilov, V A; Schlemmer, S; Kalugina, Y N; Faure, A; Rist, C; van der Avoird, A

    2015-10-21

    The rotational spectrum of the van der Waals complex CH4-CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 110-145 GHz. Newly observed and assigned transitions belong to the K = 2-1 subband correlating with the rotationless jCH4 = 0 ground state and the K = 2-1 and K = 0-1 subbands correlating with the jCH4 = 2 excited state of free methane. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the CH4-CO complex. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of CH4-CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations [CCSD(T)-F12a] and an augmented correlation-consistent triple zeta (aVTZ) basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the CH4 face closest to the CO subunit and binding energy De = 177.82 cm(-1). The bound rovibrational levels of the CH4-CO complex were calculated for total angular momentum J = 0-6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 91.32, 94.46, and 104.21 cm(-1) for A (jCH4 = 0), F (jCH4 = 1), and E (jCH4 = 2) nuclear spin modifications of CH4-CO, respectively. PMID:26493903

  13. Measuring The Influence of TAsk COMplexity on Human Error Probability: An Empirical Evaluation

    Podofillini, Luca; Dang, Vinh N. [Paul Scherrer Institute, Villigen (Switzerland)

    2013-04-15

    A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing the human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent (objectively and quantitatively) task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors are available from the International HRA Empirical Study. The empirical evaluation has shown promising results. The TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished if related to different difficulty categories (e. g., 'easy' vs. 'somewhat difficult'), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i. e., influencing the error probability), and 2) the quantitative relationships among PSFs and error

  14. PAFit: A Statistical Method for Measuring Preferential Attachment in Temporal Complex Networks.

    Thong Pham

    Preferential attachment is a stochastic process that has been proposed to explain certain topological features characteristic of complex networks from diverse domains. The systematic investigation of preferential attachment is an important area of research in network science, not only for the theoretical matter of verifying whether this hypothesized process is operative in real-world networks, but also for the practical insights that follow from knowledge of its functional form. Here we describe a maximum likelihood based estimation method for the measurement of preferential attachment in temporal complex networks. We call the method PAFit, and implement it in an R package of the same name. PAFit constitutes an advance over previous methods primarily because we based it on a nonparametric statistical framework that enables attachment kernel estimation free of any assumptions about its functional form. We show this results in PAFit outperforming the popular methods of Jeong and Newman in Monte Carlo simulations. What is more, we found that the application of PAFit to a publicly available Flickr social network dataset yielded clear evidence for a deviation of the attachment kernel from the popularly assumed log-linear form. Independent of our main work, we provide a correction to a consequential error in Newman's original method which had evidently gone unnoticed since its publication over a decade ago.
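
    As a point of reference for what PAFit improves upon, the sketch below implements the simpler Jeong-style empirical estimate of the attachment kernel A_k: the rate at which nodes of degree k acquire new edges within a time window, normalized by how many degree-k nodes were available. This is not the PAFit maximum-likelihood estimator; the function name, the input format and the toy data are assumptions of this sketch.

      from collections import Counter

      def attachment_kernel_jeong(snapshot_degrees, new_edge_targets):
          # snapshot_degrees: dict node -> degree at the start of the window
          # new_edge_targets: nodes that received a new edge during the window
          available = Counter(snapshot_degrees.values())
          gained = Counter(snapshot_degrees[v] for v in new_edge_targets
                           if v in snapshot_degrees)
          kernel = {k: gained[k] / available[k] for k in sorted(available)}
          total = sum(kernel.values()) or 1.0
          return {k: a / total for k, a in kernel.items()}  # normalized A_k

      # Toy example: higher-degree nodes receive proportionally more new edges.
      degrees = {f"n{i}": d for i, d in enumerate([1, 1, 1, 1, 2, 2, 3, 5])}
      targets = ["n7", "n7", "n6", "n4", "n0"]
      print(attachment_kernel_jeong(degrees, targets))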

  15. Complex Networks Measures for Differentiation between Normal and Shuffled Croatian Texts

    Margan, Domagoj; Martinčić-Ipšić, Sanda

    2014-01-01

    This paper studies the properties of the Croatian texts via complex networks. We present network properties of normal and shuffled Croatian texts for different shuffling principles: on the sentence level and on the text level. In both experiments we preserved the vocabulary size, word and sentence frequency distributions. Additionally, in the first shuffling approach we preserved the sentence structure of the text and the number of words per sentence. Obtained results showed that degree rank distributions exhibit no substantial deviation in shuffled networks, and strength rank distributions are preserved due to the same word frequencies. Therefore, standard approach to study the structure of linguistic co-occurrence networks showed no clear difference among the topologies of normal and shuffled texts. Finally, we showed that the in- and out- selectivity values from shuffled texts are constantly below selectivity values calculated from normal texts. Our results corroborate that the node selectivity measure can...
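
    The in- and out-selectivity referred to above is commonly defined as node strength divided by node degree (the average link weight), computed separately over incoming and outgoing links; the sketch below assumes that definition, so the function name and any normalization details are assumptions rather than the paper's exact formulation.

      import networkx as nx

      def in_out_selectivity(G, weight="weight"):
          # (in-selectivity, out-selectivity) per node: strength / degree.
          sel = {}
          for v in G:
              kin, kout = G.in_degree(v), G.out_degree(v)
              sin = G.in_degree(v, weight=weight)
              sout = G.out_degree(v, weight=weight)
              sel[v] = (sin / kin if kin else 0.0, sout / kout if kout else 0.0)
          return sel

      # Example on a tiny directed word co-occurrence network.
      G = nx.DiGraph()
      G.add_weighted_edges_from([("the", "cat", 4), ("cat", "sat", 2),
                                 ("the", "mat", 1), ("sat", "the", 3)])
      print(in_out_selectivity(G))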

  16. Wide-band complex magnetic susceptibility measurements of magnetic fluids as a function of temperature

    Fannin, P. C.; Kinsella, L.; Charles, S. W.

    1999-07-01

    Measurements of the complex magnetic susceptibility over the frequency and temperature ranges of 2 MHz-6 GHz and 20 to -100°C, respectively, are reported for the first time for a magnetic fluid. The fluid used was a colloidal suspension of magnetite particles of median diameter 9 nm in a hydrocarbon oil (Isopar M). Resonance was observed and found to increase from approximately 1.5 GHz to 3.3 GHz in the temperature range 20 to -50°C. The increase in resonant frequency is attributed to a decrease in thermal fluctuations with decrease in temperature. At frequencies below approximately 19 MHz, a significant drop in χ'(ω) with decrease in temperature over the temperature range 20 to -100°C is observed and is attributed to changes in the Néel and Brownian relaxation processes. Below -60°C, the temperature at which the suspension becomes solid, Brownian relaxation ceases to exist.

  17. Measuring complexity in a business cycle model of the Kaldor type

    The purpose of this paper is to study the dynamical behavior of a family of two-dimensional nonlinear maps associated with an economic model. Our objective is to measure the complexity of the system using techniques of symbolic dynamics in order to compute the topological entropy. The analysis of the variation of this important topological invariant with the parameters of the system allows us to distinguish different chaotic scenarios. Finally, we use another topological invariant to distinguish isentropic dynamics and we exhibit numerical results about maps with the same topological entropy. This work provides an illustration of how our understanding of higher dimensional economic models can be enhanced by the theory of dynamical systems.

  18. A New Efficient Analytical Method for Picolinate Ion Measurements in Complex Aqueous Solutions

    Parazols, M.; Dodi, A. [CEA Cadarache, Lab Anal Radiochim and Chim, DEN, F-13108 St Paul Les Durance (France)

    2010-07-01

    This study focuses on the development of a new simple but sensitive, fast and quantitative liquid chromatography method for picolinate ion measurement in high ionic strength aqueous solutions. It involves cation separation over a chromatographic CS16 column using methane sulfonic acid as a mobile phase and detection by UV absorbance (254 nm). The CS16 column is a high-capacity stationary phase exhibiting both cation exchange and RP properties. It allows interaction with picolinate ions which are in their zwitterionic form at the pH of the mobile phase (1.3-1.7). Analysis is performed in 30 min with a detection limit of about 0.05 μM and a quantification limit of about 0.15 μM. Moreover, this analytical technique has been tested efficiently on complex aqueous samples from an effluent treatment facility. (authors)

  19. Measuring mixing patterns in complex networks by Spearman rank correlation coefficient

    Zhang, Wen-Yao; Wei, Zong-Wen; Wang, Bing-Hong; Han, Xiao-Pu

    2016-06-01

    In this paper, we utilize Spearman rank correlation coefficient to measure mixing patterns in complex networks. Compared with the widely used Pearson coefficient, Spearman coefficient is rank-based, nonparametric, and size-independent. Thus it is more effective to assess linking patterns of diverse networks, especially for large-size networks. We demonstrate this point by testing a variety of empirical and artificial networks. Moreover, we show that normalized Spearman ranks of stubs are subject to an interesting linear rule where the correlation coefficient is just the Spearman coefficient. This compelling linear relationship allows us to directly produce networks with any prescribed Spearman coefficient. Our method apparently has an edge over the well known uncorrelated configuration model.
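
    A minimal way to reproduce the idea is to collect the degrees at the two ends of every edge and correlate them: the Pearson correlation gives the classical assortativity coefficient, while the Spearman rank correlation gives the rank-based mixing coefficient advocated here. The sketch below does exactly that; using full degrees rather than remaining degrees, and the particular scipy/networkx calls, are simplifications of the paper's stub-rank formulation.

      import networkx as nx
      from scipy.stats import pearsonr, spearmanr

      def degree_mixing_coefficients(G):
          # Correlate the degrees found at the two ends of every edge,
          # counting each undirected edge in both directions.
          x, y = [], []
          for u, v in G.edges():
              x.extend([G.degree(u), G.degree(v)])
              y.extend([G.degree(v), G.degree(u)])
          rho, _ = spearmanr(x, y)   # rank-based mixing coefficient
          r, _ = pearsonr(x, y)      # classical (Pearson) assortativity
          return rho, r

      G = nx.barabasi_albert_graph(1000, 3, seed=42)
      print(degree_mixing_coefficients(G))
      print(nx.degree_assortativity_coefficient(G))  # reference value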

  20. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    There are several groups working on the improvement of the existing Standard and recommendation on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 (Ref. [9]), to the current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones relate to the inherent characteristics of complex terrain and, specifically, to the issue of site calibration and the uncertainties due to it. (Author)

  1. Evaluation of indirect impedance for measuring microbial growth in complex food matrices.

    Johnson, N; Chang, Z; Bravo Almeida, C; Michel, M; Iversen, C; Callanan, M

    2014-09-01

    The suitability of indirect impedance to accurately measure microbial growth in real food matrices was investigated. A variety of semi-solid and liquid food products were inoculated with Bacillus cereus, Listeria monocytogenes, Staphylococcus aureus, Lactobacillus plantarum, Pseudomonas aeruginosa, Escherichia coli, Salmonella enteritidis, Candida tropicalis or Zygosaccharomyces rouxii and CO2 production was monitored using a conductimetric (Don Whitely R.A.B.I.T.) system. The majority (80%) of food and microbe combinations produced a detectable growth signal. The linearity of conductance responses in selected food products was investigated and a good correlation (R² ≥ 0.84) was observed between inoculum levels and times to detection. Specific growth rate estimations from the data were sufficiently accurate for predictive modeling in some cases. This initial evaluation of the suitability of indirect impedance to generate microbial growth data in complex food matrices indicates significant potential for the technology as an alternative to plating methods. PMID:24929710
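
    The reported linearity between inoculum level and time to detection is, in practice, a least-squares calibration of detection time against log10 inoculum; the sketch below shows such a fit and how it would be inverted to estimate an unknown inoculum. The numbers are illustrative only and are not data from the study.

      import numpy as np

      # Hypothetical calibration data: log10 inoculum (CFU/ml) versus
      # conductimetric time to detection (h). Illustrative values only.
      log_inoculum = np.array([2, 3, 4, 5, 6, 7], dtype=float)
      ttd_hours = np.array([14.1, 11.8, 9.6, 7.9, 5.8, 3.9])

      # Least-squares line: TTD = a * log10(N0) + b
      a, b = np.polyfit(log_inoculum, ttd_hours, 1)
      pred = a * log_inoculum + b
      r2 = 1 - np.sum((ttd_hours - pred) ** 2) / np.sum((ttd_hours - ttd_hours.mean()) ** 2)
      print(f"slope {a:.2f} h per log10 CFU/ml, intercept {b:.1f} h, R^2 {r2:.3f}")

      # An unknown sample's inoculum is then estimated from its detection time.
      print(f"estimated log10 N0 at TTD = 8.5 h: {(8.5 - b) / a:.2f}")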

  2. Estimating the operator's performance time of emergency procedural tasks based on a task complexity measure

    It is important to understand the amount of time required to execute an emergency procedural task in a high-stress situation for managing human performance under emergencies in a nuclear power plant. However, the time to execute an emergency procedural task is highly dependent upon expert judgment due to the lack of actual data. This paper proposes an analytical method to estimate the operator's performance time (OPT) of a procedural task, which is based on a measure of the task complexity (TACOM). The proposed method for estimating an OPT is an equation that uses the TACOM as a variable, and the OPT of a procedural task can be calculated if its relevant TACOM score is available. The validity of the proposed equation is demonstrated by comparing the estimated OPTs with the observed OPTs for emergency procedural tasks in a steam generator tube rupture scenario.

  3. Entropy-based complexity measures for gait data of patients with Parkinson's disease

    Afsar, Ozgur; Tirnakli, Ugur; Kurths, Juergen

    2016-02-01

    Shannon, Kullback-Leibler, and Klimontovich's renormalized entropies are applied as three different complexity measures on gait data of patients with Parkinson's disease (PD) and healthy control group. We show that the renormalized entropy of variability of total reaction force of gait is a very efficient tool to compare patients with respect to disease severity. Moreover, it is a good risk predictor such that the sensitivity, i.e., the percentage of patients with PD who are correctly identified as having PD, increases from 25% to 67% while the Hoehn-Yahr stage increases from 2.5 to 3.0 (this stage goes from 0 to 5 as the disease severity increases). The renormalized entropy method for stride time variability of gait is found to correctly identify patients with a sensitivity of 80%, while the Shannon entropy and the Kullback-Leibler relative entropy can do this with a sensitivity of only 26.7% and 13.3%, respectively.
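
    To make the comparison of entropy measures concrete, the sketch below estimates a probability distribution from a stride-time series by histogramming and then computes the Shannon entropy and the Kullback-Leibler divergence against a reference (healthy) distribution. The synthetic series, the bin settings and the function names are assumptions of this sketch; Klimontovich's renormalized entropy involves an additional rescaling of the reference distribution and is not implemented here.

      import numpy as np

      def shannon_entropy(p):
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      def kullback_leibler(p, q):
          mask = (p > 0) & (q > 0)
          return np.sum(p[mask] * np.log(p[mask] / q[mask]))

      def histogram_pdf(x, bins, limits):
          h, _ = np.histogram(x, bins=bins, range=limits)
          return h / h.sum()

      # Illustrative stride-time series (s): a patient series with larger
      # variability versus a healthy reference. Synthetic values only.
      rng = np.random.default_rng(3)
      healthy = 1.05 + 0.02 * rng.standard_normal(500)
      patient = 1.10 + 0.06 * rng.standard_normal(500)

      p = histogram_pdf(patient, bins=30, limits=(0.8, 1.4))
      q = histogram_pdf(healthy, bins=30, limits=(0.8, 1.4))
      print("Shannon entropy (patient, healthy):", shannon_entropy(p), shannon_entropy(q))
      print("KL divergence D(patient || healthy):", kullback_leibler(p, q))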

  4. Integrating Sound Scattering Measurements in the Design of Complex Architectural Surfaces

    Peters, Brady

    2010-01-01

    Digital tools present the opportunity for incorporating performance analysis into the architectural design process. Acoustic performance is an important criterion for architectural design. Much is known about sound absorption but little about sound scattering, even though scattering is recognized to be one of the most important factors in the computational prediction of acoustic performance. This paper proposes a workflow for the design of complex architectural surfaces and the prediction of their sound scattering properties. This workflow includes the development of computational design tools, geometry generation, fabrication of test surfaces, measurement of acoustic performance, and the incorporation of this data into the generative tool. The Hexagon Wall is included and discussed as an illustrative design study.

  5. Response to Disturbance and Abundance of Final State: a Measure for Complexity?

    SHEN Dan; WANG Wen-Xiu; JIANG Yu-Mei; HE Yue; HE Da-Ren

    2007-01-01

    We propose a new definition of complexity. The definition shows that when a system evolves to a final state via a transient state, its complexity depends on the abundance of both the final state and transient state. The abundance of the transient state may be described by the diversity of the response to disturbance. We hope that this definition can describe a clear boundary between simple systems and complex systems by showing that all the simple systems have zero complexity, and all the complex systems have positive complexity. Some examples of the complexity calculations are presented, which supports our hope.

  6. Progressive evolution and a measure for its noise-dependent complexity

    Fussy, Siegfried; Grössing, Gerhard; Schwabl, Herbert

    1999-03-01

    -Queen-effect." Additionally, for the memory-based model a parameter was found indicating a limited range of noise allowing for the most complex behavior of the model, whereas the entropy of the system provides only a monotonic measure with respect to the varying noise level.

  7. Characterization of a complex near-surface structure using well logging and passive seismic measurements

    Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara

    2016-04-01

    We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area presents some near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectra ratio method (H/V). Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact. This technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness with a maximum of around 40 m and shear-wave velocity profiles from the array technique. A comparison between seismic velocities coming from well logging and array measurements defines the resolution limits of the passive seismic techniques and aids their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that could be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.
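
    The H/V method used for the sediment-thickness map amounts to taking the ratio of the averaged horizontal-component amplitude spectrum to the vertical-component spectrum of ambient noise and reading off the peak frequency. The sketch below is a single-window, unsmoothed version of that computation; production processing (many windows, tapering, spectral smoothing, averaging) is omitted, and the synthetic example is illustrative only.

      import numpy as np

      def hv_ratio(north, east, vertical, fs):
          # Horizontal-to-vertical spectral ratio of one three-component window.
          n = len(vertical)
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          window = np.hanning(n)
          spec = lambda x: np.abs(np.fft.rfft(x * window, n))
          horizontal = np.sqrt(0.5 * (spec(north) ** 2 + spec(east) ** 2))
          return freqs, horizontal / spec(vertical)

      # Synthetic example: horizontal components carry extra energy near 2 Hz,
      # which should show up as an H/V peak. Illustrative values only.
      fs = 100.0
      t = np.arange(0, 120, 1 / fs)
      rng = np.random.default_rng(7)
      v = rng.standard_normal(t.size)
      n_comp = rng.standard_normal(t.size) + 3 * np.sin(2 * np.pi * 2.0 * t)
      e_comp = rng.standard_normal(t.size) + 3 * np.cos(2 * np.pi * 2.0 * t)
      freqs, hv = hv_ratio(n_comp, e_comp, v, fs)
      mask = freqs > 0.5
      print("H/V peak near", freqs[mask][np.argmax(hv[mask])], "Hz")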

  8. Investigation of the Ionic conductivity and dielectric measurements of poly (N-vinyl pyrrolidone)-sulfamic acid polymer complexes

    Polymer electrolyte complexes of poly (N-vinyl pyrrolidone) (PVP)-sulfamic acid (NH2SO3H) were prepared by the familiar solution casting method with different molar concentrations of PVP and sulfamic acid. The interaction between PVP and NH2SO3H was confirmed by Fourier transform infrared spectroscopy analysis. Laser microscopy analysis was used to study the surface morphology of the polymer complexes. The glass transition temperature (Tg) and the melting temperature (Tm) of the polymer complexes were computed from differential scanning calorimetric studies. AC impedance spectroscopic measurements revealed that the polymer complex with 97 mol% PVP-3 mol% NH2SO3H shows the highest ionic conductivity, with two different activation energies above and below the glass transition temperature (Tg). Dielectric studies confirmed that the dc conduction mechanism dominates in the polymer complexes. The value of the power law exponent (n) confirmed the translational motion of ions from one site to another vacant site in these complexes.

  9. Determination of the Landau-Lifshitz damping parameter by means of complex susceptibility measurements

    A new experimental method for the determination of the Landau-Lifshitz damping parameter, α, based on measurements of the frequency and field dependence of the complex magnetic susceptibility, χ(ω,H)=χ'(ω,H)-iχ''(ω,H), is proposed. The method centres on evaluating the ratio of fmax/fres, where fres is the resonance frequency and fmax is the maximum absorption frequency at resonance, of the sample susceptibility spectra, measured in strong polarizing fields. We have investigated three magnetic fluid samples, namely sample 1, sample 2 and sample 3. Sample 1 consisted of particles of Mn0.6Fe0.4Fe2O4 dispersed in kerosene, sample 2 consisted of magnetite particles dispersed in Isopar M and sample 3 was composed of particles of Mn0.66Zn0.34Fe2O4 dispersed in Isopar M. The results obtained for the mean damping parameter of particles within the magnetic fluid samples are as follows: <α(Mn0.6Fe0.4Fe2O4)> = 0.057 with the corresponding standard deviation SD = 0.0104; <α(Fe3O4)> = 0.1105 with the corresponding standard deviation SD = 0.034; and <α(Mn0.66Zn0.34Fe2O4)> = 0.096 with the corresponding standard deviation SD = 0.037.

  10. Determination of the Landau Lifshitz damping parameter by means of complex susceptibility measurements

    Fannin, P. C.; Marin, C. N.

    2006-04-01

    A new experimental method for the determination of the Landau-Lifshitz damping parameter, α, based on measurements of the frequency and field dependence of the complex magnetic susceptibility, χ(ω,H)=χ'(ω,H)-iχ″(ω,H), is proposed. The method centres on evaluating the ratio of fmax/fres, where fres is the resonance frequency and fmax is the maximum absorption frequency at resonance, of the sample susceptibility spectra, measured in strong polarizing fields. We have investigated three magnetic fluid samples, namely sample 1, sample 2 and sample 3. Sample 1 consisted of particles of Mn0.6Fe0.4Fe2O4 dispersed in kerosene, sample 2 consisted of magnetite particles dispersed in Isopar M and sample 3 was composed of particles of Mn0.66Zn0.34Fe2O4 dispersed in Isopar M. The results obtained for the mean damping parameter of particles within the magnetic fluid samples are as follows: <α(Mn0.6Fe0.4Fe2O4)> = 0.057 with the corresponding standard deviation SD = 0.0104; <α(Fe3O4)> = 0.1105 with the corresponding standard deviation SD = 0.034; and <α(Mn0.66Zn0.34Fe2O4)> = 0.096 with the corresponding standard deviation SD = 0.037.
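
    The key experimental quantities in this method are the absorption peak frequency fmax and the resonance frequency fres at which the real part of the susceptibility changes sign; the sketch below extracts both from a (here synthetic) spectrum and forms their ratio. The mapping from fmax/fres to the damping parameter α follows the authors' model and is not reproduced; the function name, the interpolation scheme and the example spectrum are assumptions of this sketch.

      import numpy as np

      def fmax_fres_ratio(freq, chi_real, chi_imag):
          # fmax: frequency of maximum absorption chi''(f)
          # fres: frequency where chi'(f) changes sign (resonance)
          fmax = freq[np.argmax(chi_imag)]
          crossings = np.where(np.diff(np.sign(chi_real)) != 0)[0]
          if crossings.size == 0:
              raise ValueError("chi' does not cross zero in the measured band")
          i = crossings[0]
          # linear interpolation of the zero crossing between samples i and i+1
          f0, f1, y0, y1 = freq[i], freq[i + 1], chi_real[i], chi_real[i + 1]
          fres = f0 - y0 * (f1 - f0) / (y1 - y0)
          return fmax, fres, fmax / fres

      # Synthetic resonance-type spectrum, for illustration only.
      f = np.linspace(0.1e9, 6e9, 600)
      f0_true, width = 2.0e9, 0.6e9
      denom = (f0_true**2 - f**2) ** 2 + (width * f) ** 2
      chi_real = (f0_true**2 - f**2) / denom
      chi_imag = (width * f) / denom
      print(fmax_fres_ratio(f, chi_real, chi_imag))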