WorldWideScience

Sample records for acid-water complexes measured

  1. Technology of complex cleaning of mine acidic waters

    It is shown that the problem of the complex use of mine waters comprises two tasks: their purification and their use as hydromineral raw material. A flotation-extraction technology for reprocessing acidic mine waters has been developed. The possibility of extractive processing of the flotation froth products for the selective recovery of valuable components (Co, Ni, Sc, and a number of rare elements) is considered, and optimal metal-extraction regimes are defined. (author)

  2. Measuring Tax Complexity

    David Ulph

    2014-01-01

    This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity and operational complexity. It considers the consequences/costs of complexity, and then examines the rationale for measuring complexity. Finally it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).

  3. Viral quasispecies complexity measures.

    Gregori, Josep; Perales, Celia; Rodriguez-Frias, Francisco; Esteban, Juan I; Quer, Josep; Domingo, Esteban

    2016-06-01

    Mutant spectrum dynamics (changes in the related mutants that compose viral populations) has a decisive impact on virus behavior. The several platforms of next-generation sequencing (NGS) offer a magnifying glass with which to study viral quasispecies complexity. Several parameters are available to quantify the complexity of mutant spectra, but they have limitations. Here we critically evaluate the information provided by several population diversity indices, and we propose the introduction of some new ones used in ecology. In particular we make a distinction between incidence, abundance and function measures of viral quasispecies composition. We suggest a multidimensional approach (complementary information contributed by adequately chosen indices), propose some guidelines, and illustrate the use of indices with a simple example. We apply the indices to three clinical samples of hepatitis C virus that display different population heterogeneity. Areas of virus biology in which population complexity plays a role are discussed. PMID:27060566
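
    The incidence/abundance/function distinction above maps onto standard ecological quantities. Below is a minimal sketch, with hypothetical haplotype frequencies, of three classical indices (Shannon entropy, Gini-Simpson, and the Hill numbers that unify them); it illustrates the kind of indices discussed, not the paper's specific new proposals:

```python
import math

def shannon_entropy(freqs):
    """Shannon entropy H = -sum(p * ln p) over haplotype frequencies."""
    return -sum(p * math.log(p) for p in freqs if p > 0)

def gini_simpson(freqs):
    """Gini-Simpson index: probability that two random reads differ."""
    return 1.0 - sum(p * p for p in freqs)

def hill_number(freqs, q):
    """Hill number of order q: q=0 counts haplotypes (incidence),
    q=1 gives exp(H) (abundance-weighted), q=2 the inverse Simpson index."""
    if q == 1:
        return math.exp(shannon_entropy(freqs))
    return sum(p ** q for p in freqs if p > 0) ** (1.0 / (1.0 - q))

# Hypothetical mutant spectrum: a dominant haplotype plus minority variants.
freqs = [0.70, 0.15, 0.10, 0.04, 0.01]
print(gini_simpson(freqs))
for q in (0, 1, 2):
    print(q, hill_number(freqs, q))
```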

  4. Fundamental Complexity Measures of Life

    Grandpierre, Attila

    2012-01-01

    At present, there is a great deal of confusion regarding complexity and its measures (reviews on complexity measures are found in, e.g. Lloyd, 2001 and Shalizi, 2006 and more references therein). Moreover, there is also confusion regarding the nature of life. In this situation, it seems the task of determining the fundamental complexity measures of life is especially difficult. Yet this task is just part of a greater task: obtaining substantial insights into the nature of biological evolution. We think that without a firm quantitative basis characterizing the most fundamental aspects of life, it is impossible to overcome the confusion so as to clarify the nature of biological evolution. The approach we present here offers such quantitative measures of complexity characterizing biological organization and, as we will see, evolution.

  5. Measuring importance in complex networks

    Morrison, Greg; Dudte, Levi; Mahadevan, L.

    2013-03-01

    A variety of centrality measures can be defined on a network to determine the global `importance' of a node i. However, the inhomogeneity of complex networks implies that not all nodes j will consider i equally important. In this talk, we use a linearized form of the Generalized Erdos numbers [Morrison and Mahadevan EPL 93 40002 (2011)] to define a pairwise measure of the importance of a node i from the perspective of node j which incorporates the global network topology. This localized importance can be used to define a global measure of centrality that is consistent with other well-known centrality measures. We illustrate the use of the localized importance in both artificial and real-world networks with a complex global topology.

  6. Hierarchy measure for complex networks.

    Enys Mones

    Nature, technology and society are full of complexity arising from the intricate web of interactions among the units of the related systems (e.g., proteins, computers, people). Consequently, one of the most successful recent approaches to capturing the fundamental features of the structure and dynamics of complex systems has been the investigation of the networks associated with the above units (nodes), together with their relations (edges). Most complex systems have an inherently hierarchical organization and, correspondingly, the networks behind them also exhibit hierarchical features. Indeed, several papers have been devoted to describing this essential aspect of networks, however, without resulting in a widely accepted, converging concept concerning the quantitative characterization of the level of their hierarchy. Here we develop an approach and propose a quantity (measure) which is simple enough to be widely applicable, reveals a number of universal features of the organization of real-world networks and, as we demonstrate, is capable of capturing the essential features of the structure and the degree of hierarchy in a complex network. The measure we introduce is based on a generalization of the m-reach centrality, which we first extend to directed/partially directed graphs. Then, we define the global reaching centrality (GRC), which is the difference between the maximum and the average value of the generalized reach centralities over the network. We investigate the behavior of the GRC considering both a synthetic model with an adjustable level of hierarchy and real networks. Results for real networks show that our hierarchy measure is related to the controllability of the given system. We also propose a visualization procedure for large complex networks that can be used to obtain an overall qualitative picture about the nature of their hierarchical structure.
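
    The GRC described above is straightforward to compute. A minimal sketch, assuming the local reaching centrality of a node is the fraction of other nodes it can reach by directed paths, and using the sum-of-gaps normalization of Mones et al. (recent versions of networkx also ship a built-in global_reaching_centrality):

```python
import networkx as nx

def local_reaching_centrality(G, node):
    """Fraction of the other nodes reachable from `node` via directed paths."""
    return len(nx.descendants(G, node)) / (G.number_of_nodes() - 1)

def global_reaching_centrality(G):
    """GRC: gaps between the maximum and each local reaching centrality,
    summed and normalized by N - 1."""
    c = [local_reaching_centrality(G, v) for v in G]
    c_max = max(c)
    return sum(c_max - ci for ci in c) / (len(c) - 1)

# A directed tree (strict hierarchy) scores near 1; a directed cycle scores 0.
tree = nx.bfs_tree(nx.balanced_tree(2, 3), 0)
cycle = nx.cycle_graph(15, create_using=nx.DiGraph)
print(global_reaching_centrality(tree))
print(global_reaching_centrality(cycle))
```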

  7. Acceptable Complexity Measures of Theorems

    Grenet, Bruno

    2009-01-01

    In 1931, Gödel presented in Königsberg his famous Incompleteness Theorem, stating that some true mathematical statements are unprovable. Yet, this result gives us no idea about those independent (that is, true and unprovable) statements, about their frequency, the reason they are unprovable, and so on. Calude and Jürgensen proved in 2005 Chaitin's "heuristic principle" for an appropriate measure: the theorems of a finitely-specified theory cannot be significantly more complex than the theory itself.

  8. Computerized measures of visual complexity.

    Machado, Penousal; Romero, Juan; Nadal, Marcos; Santos, Antonino; Correia, João; Carballal, Adrián

    2015-09-01

    Visual complexity influences people's perception of, preference for, and behaviour toward many classes of objects, from artworks to web pages. The ability to predict people's impression of the complexity of different kinds of visual stimuli therefore holds great potential for many domains, basic and applied. Here we use edge detection operations and several image metrics based on image compression error and Zipf's law to estimate the visual complexity of images. The experiments involved 800 images, each previously rated by thirty participants on perceived complexity. In a first set of experiments we analysed the correlation of individual features with the average human response, obtaining correlations up to rs = .771. In a second set of experiments we employed Machine Learning techniques to predict the average visual complexity score attributed by humans to each stimulus. The best configurations obtained a correlation of rs = .832. The average prediction error of the Machine Learning system over the set of all stimuli was .096 on a normalized 0 to 1 interval, showing that it is possible to predict human responses with high accuracy. Overall, edge density and compression error were the strongest predictors of human complexity ratings. PMID:26164647
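
    A minimal sketch of the compression-based idea, assuming synthetic images and zlib as the lossless compressor (the paper's actual feature set, built on edge detection, compression error and Zipf's law, is richer): less predictable images compress worse and have denser edges.

```python
import zlib
import numpy as np

def compression_complexity(img):
    """Ratio of losslessly compressed size to raw size: images with more
    unpredictable structure compress worse and score closer to 1."""
    raw = img.astype(np.uint8).tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

def edge_density(img, thresh=30):
    """Fraction of pixels whose gradient magnitude exceeds a threshold
    (a crude stand-in for a proper edge detector)."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy) > thresh))

rng = np.random.default_rng(0)
flat = np.full((128, 128), 128)           # uniform grey: minimal complexity
noise = rng.integers(0, 256, (128, 128))  # white noise: maximal complexity
for name, im in [("flat", flat), ("noise", noise)]:
    print(name, compression_complexity(im), edge_density(im))
```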

  9. Metric for Early Measurement of Software Complexity

    Ghazal Keshavarz; Nasser Modiri; Mirmohsen Pedram

    2011-01-01

    Software quality depends on several factors, such as on-time delivery, staying within budget, and fulfilling users' needs. Complexity is one of the most important factors that may affect the quality. Therefore, measuring and controlling the complexity results in improving the quality. So far, most research has tried to identify and measure complexity in the design and code phases. However, when we have the code or design for software, it is too late to control complexity. In this article, with emphasis on the Requirement Engineering process, we analyze the causes of software complexity, particularly in the first phase of software development, and propose a requirement-based metric.

  10. Complexity measures, emergence, and multiparticle correlations

    Galla, Tobias

    2011-01-01

    We study correlation measures for complex systems. First, we investigate some recently proposed measures based on information geometry. We show that these measures can increase under local transformations as well as under discarding particles, thereby questioning their interpretation as a quantifier for complexity or correlations. We then propose a refined definition of these measures, investigate its properties and discuss its numerical evaluation. As an example, we study coupled logistic maps and study the behavior of the different measures for that case. Finally, we investigate other local effects during the coarse graining of the complex system.

  11. Complexity measures, emergence, and multiparticle correlations

    Galla, Tobias; Gühne, Otfried

    2012-04-01

    We study correlation measures for complex systems. First, we investigate some recently proposed measures based on information geometry. We show that these measures can increase under local transformations as well as under discarding particles, thereby questioning their interpretation as a quantifier for complexity or correlations. We then propose a refined definition of these measures, investigate its properties, and discuss its numerical evaluation. As an example, we study coupled logistic maps and study the behavior of the different measures for that case. Finally, we investigate other local effects during the coarse graining of the complex system.

  12. Metric for Early Measurement of Software Complexity

    Ghazal Keshavarz

    2011-06-01

    Software quality depends on several factors, such as on-time delivery, staying within budget, and fulfilling users' needs. Complexity is one of the most important factors that may affect the quality. Therefore, measuring and controlling the complexity results in improving the quality. So far, most research has tried to identify and measure complexity in the design and code phases. However, when we have the code or design for software, it is too late to control complexity. In this article, with emphasis on the Requirement Engineering process, we analyze the causes of software complexity, particularly in the first phase of software development, and propose a requirement-based metric. This metric enables a software engineer to measure the complexity before actual design and implementation and to choose strategies that are appropriate to the degree of software complexity, thus saving on cost and human-resource wastage and, more importantly, leading to lower maintenance costs.

  13. Measurement methods on the complexity of network

    LIN Lin; DING Gang; CHEN Guo-song

    2010-01-01

    Based on the size of a network and the number of paths in it, we propose a model of the topology complexity of a network. Based on analyses of the effects of the number of pieces of equipment, the types of equipment, and the processing time of each node on the complexity of an equipment-constrained network, a complexity model of equipment-constrained networks was constructed to measure their integrated complexity. Algorithms for the two models were also developed. An automatic generator of random single-label networks was developed to test the models. The results show that the models correctly evaluate the topology complexity and the integrated complexity of the networks.

  14. Cardiac Aging Detection Using Complexity Measures

    Balasubramanian, Karthi

    2016-01-01

    As we age, our hearts undergo changes which result in reduction in complexity of physiological interactions between different control mechanisms. This results in a potential risk of cardiovascular diseases which are the number one cause of death globally. Since cardiac signals are nonstationary and nonlinear in nature, complexity measures are better suited to handle such data. In this study, non-invasive methods for detection of cardiac aging using complexity measures are explored. Lempel-Ziv (LZ) complexity, Approximate Entropy (ApEn) and Effort-to-Compress (ETC) measures are used to differentiate between healthy young and old subjects using heartbeat interval data. We show that both LZ and ETC complexity measures are able to differentiate between young and old subjects with only 10 data samples while ApEn requires at least 15 data samples.
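
    Of the three measures named, Lempel-Ziv complexity is the easiest to sketch. A minimal version, assuming a made-up heartbeat-interval series binarized about its median (a common symbolization choice, not necessarily the authors'):

```python
def lz_complexity(s):
    """LZ76-style phrase count: start a new phrase whenever the current
    substring has not occurred in the preceding part of the sequence."""
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c

# Made-up RR intervals (seconds), binarized about the median.
rr = [0.81, 0.79, 0.84, 0.80, 0.78, 0.85, 0.83, 0.77, 0.82, 0.80]
med = sorted(rr)[len(rr) // 2]
symbols = ''.join('1' if x > med else '0' for x in rr)
print(symbols, lz_complexity(symbols))
```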

  15. Artificial sequences and complexity measures

    Baronchelli, Andrea; Caglioti, Emanuele; Loreto, Vittorio

    2005-04-01

    In this paper we exploit concepts of information theory to address the fundamental problem of identifying and defining the most suitable tools for extracting, in an automatic and agnostic way, information from a generic string of characters. We introduce in particular a class of methods which use data compression techniques in a crucial way in order to define a measure of remoteness and distance between pairs of sequences of characters (e.g. texts) based on their relative information content. We also discuss in detail how specific features of data compression techniques could be used to introduce the notion of the dictionary of a given sequence and of artificial text, and we show how these new tools can be used for information extraction purposes. We point out the versatility and generality of our method, which applies to any kind of corpora of character strings independently of the type of coding behind them. As a case study we consider linguistically motivated problems, and we present results for automatic language recognition, authorship attribution and self-consistent classification.

  16. Measuring Complexity in an Aquatic Ecosystem

    Fernandez, Nelson; Gershenson, Carlos

    2013-01-01

    We apply formal measures of emergence, self-organization, homeostasis, autopoiesis and complexity to an aquatic ecosystem; in particular to the physiochemical component of an Arctic lake. These measures are based on information theory. Variables with a homogeneous distribution have higher values of emergence, while variables with a more heterogeneous distribution have a higher self-organization. Variables with a high complexity reflect a balance between change (emergence) and regularity/order (self-organization).

  17. A Simple Measure of Economic Complexity

    Inoua, Sabiou

    2016-01-01

    We show from a simple model that a country's technological development can be measured by the logarithm of the number of products it makes. We show that much of the income gap among countries is due to differences in technology, as measured by this simple metric. Finally, we show that the so-called Economic Complexity Index (ECI), a recently proposed measure of collective knowhow, is in fact an estimate of this simple metric (with correlation above 0.9).
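
    The metric itself is elementary. A toy sketch with a hypothetical country-product export table, where the proposed measure is simply the logarithm of each country's product count:

```python
import math

# Hypothetical binary country-product data: the products each country exports.
exports = {
    "A": {"wheat", "textiles"},
    "B": {"wheat", "textiles", "steel", "machinery", "electronics"},
    "C": {"wheat"},
}
for country, products in exports.items():
    diversity = len(products)                    # number of products made
    print(country, diversity, round(math.log(diversity), 3))
```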

  18. Measuring Customer Profitability in Complex Environments

    Holm, Morten; Kumar, V.; Rohde, Carsten

    2012-01-01

    Customer profitability measurement is an important element in customer relationship management and a lever for enhanced marketing accountability. Two distinct measurement approaches have emerged in the marketing literature: Customer Lifetime Value (CLV) and Customer Profitability Analysis (CPA). The degree of sophistication deployed when implementing customer profitability measurement models is determined by the type of complexity encountered in firms' customer environments. This gives rise to a contingency framework for customer profitability measurement model selection and five research propositions. Additionally, the framework provides design and implementation guidance for managers seeking to implement customer profitability measurement models for resource allocation purposes.

  19. Measurement of Diffusion in Flowing Complex Fluids

    Leonard, Edward F.; Aucoin, Christian P.; Nanne, Edgar E.

    2006-01-01

    A microfluidic device for the measurement of solute diffusion as well as particle diffusion and migration in flowing complex fluids is described. The device is particularly suited to obtaining diffusivities in such fluids, which require a desired flow state to be maintained during measurement. A method based on the Loschmidt diffusion theory and short times of exposure is presented to allow calculation of diffusivities from concentration differences in the flow streams leaving the cell.

  20. A complexity measure for diachronic Chinese phonology

    Raman, Anand; Newman, John; Patrick, Jon

    1997-01-01

    This paper addresses the problem of deriving distance measures between parent and daughter languages with specific relevance to historical Chinese phonology. The diachronic relationship between the languages is modelled as a Probabilistic Finite State Automaton. The Minimum Message Length principle is then employed to find the complexity of this structure. The idea is that this measure is representative of the amount of dissimilarity between the two languages.

  1. Study on fluorescence spectra of molecular association of acetic acid-water

    Caiqin Han; Ying Liu; Yang Yang; Xiaowu Ni; Jian Lu; Xiaosen Luo

    2009-01-01

    Fluorescence spectra of acetic acid-water solutions excited by ultraviolet (UV) light are studied, and the relationship between the fluorescence spectra and the molecular association of acetic acid is discussed. The results indicate that when the excitation wavelength is longer than 246 nm, there are two fluorescence peaks, located at 305 and 334 nm, respectively. By measuring the excitation spectra, the optimal excitation wavelengths of the two fluorescence peaks are obtained: 258 and 284 nm, respectively. The fluorescence spectra of acetic acid-water solutions change with concentration, which is primarily attributed to changes in the molecular association of acetic acid in aqueous solution. Through theoretical analysis, three forms of molecular association are identified in acetic acid-water solution: hydrated monomers, linear dimers, and water-separated dimers. This research can serve as a reference for studies of the molecular association of acetic acid and water, especially studies of hydrogen bonds.

  2. Balancing model complexity and measurements in hydrology

    Van De Giesen, N.; Schoups, G.; Weijs, S. V.

    2012-12-01

    The Data Processing Inequality implies that hydrological modeling can only reduce, and never increase, the amount of information available in the original data used to formulate and calibrate hydrological models: I(X;Z(Y)) ≤ I(X;Y). Still, hydrologists around the world seem quite content building models for "their" watersheds to move our discipline forward. Hydrological models tend to have a hybrid character with respect to underlying physics. Most models make use of some well established physical principles, such as mass and energy balances. One could argue that such principles are based on many observations, and therefore add data. These physical principles, however, are applied to hydrological models that often contain concepts that have no direct counterpart in the observable physical universe, such as "buckets" or "reservoirs" that fill up and empty out over time. These not-so-physical concepts are more like the Artificial Neural Networks and Support Vector Machines of the Artificial Intelligence (AI) community. Within AI, one quickly came to the realization that by increasing model complexity, one could basically fit any dataset but that complexity should be controlled in order to be able to predict unseen events. The more data are available to train or calibrate the model, the more complex it can be. Many complexity control approaches exist in AI, with Solomonoff inductive inference being one of the first formal approaches, the Akaike Information Criterion the most popular, and Statistical Learning Theory arguably being the most comprehensive practical approach. In hydrology, complexity control has hardly been used so far. There are a number of reasons for that lack of interest, the more valid ones of which will be presented during the presentation. For starters, there are no readily available complexity measures for our models. Second, some unrealistic simplifications of the underlying complex physics tend to have a smoothing effect on possible model
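
    For reference, the Akaike Information Criterion mentioned above penalizes model complexity explicitly. A minimal sketch for least-squares calibration under an assumed Gaussian error model (synthetic data, hypothetical polynomial models), using AIC = n ln(RSS/n) + 2k up to an additive constant:

```python
import numpy as np

def aic_least_squares(residuals, k):
    """AIC for a least-squares model with k calibrated parameters, assuming
    i.i.d. Gaussian errors: n * ln(RSS / n) + 2k (constants dropped)."""
    n = len(residuals)
    rss = float(np.sum(np.square(residuals)))
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 50)
y = 2.0 * x + rng.normal(0, 0.1, x.size)   # synthetic "observations"

for degree in (1, 3, 9):                   # competing model complexities
    coef = np.polyfit(x, y, degree)
    res = y - np.polyval(coef, x)
    print(degree, round(aic_least_squares(res, degree + 1), 1))
```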

  3. Residual radioactivity measurements at Indus accelerator complex

    Indus-1 and Indus-2 are two Synchrotron Radiation Sources (SRS) operational at RRCAT, Indore. Indus-1 and Indus-2 are designed for maximum electron beam energies of 450 MeV and 2.5 GeV, respectively. During shutdowns of these accelerators for maintenance, residual radioactivity measurements were carried out. Residual radioactivity forms in various parts of high-energy electron accelerators due to the beam loss taking place at these locations. The present paper describes the recent residual radioactivity measurements carried out at the electron accelerators of the Indus Accelerator Complex and the radioisotopes identified. The maximum dose rate due to induced activity was 30 μSv/h, near dipole-5 of the booster synchrotron after 12 h of cooling time. In the case of the Indus-1 and Indus-2 SRS, the dose rate due to induced radioactivity is found to be of the order of 2-3 μSv/h. The radioisotopes identified at these beam-loss locations are beta emitters that do not pose a serious external hazard to working personnel. However, precautions are to be observed while doing maintenance on activated components. The paper describes the measurements in detail, with results. (author)

  4. Measure of robustness for complex networks

    Youssef, Mina Nabil

    Critical infrastructures are repeatedly attacked by external triggers causing tremendous amounts of damage. Any infrastructure can be studied using the powerful theory of complex networks. A complex network is composed of an extremely large number of different elements that exchange commodities, providing significant services. The main functions of complex networks can be damaged by different types of attacks and failures that degrade network performance. These attacks and failures are considered as disturbing dynamics, such as the spread of viruses in computer networks, the spread of epidemics in social networks, and cascading failures in power grids. Depending on the network structure and the attack strength, every network suffers damage and performance degradation differently. Hence, quantifying the robustness of complex networks becomes an essential task. In this dissertation, new metrics are introduced to measure the robustness of technological and social networks with respect to the spread of epidemics, and the robustness of power grids with respect to cascading failures. First, we introduce a new metric called the Viral Conductance (VC_SIS) to assess the robustness of networks with respect to the spread of epidemics that are modeled through the susceptible/infected/susceptible (SIS) epidemic approach. In contrast to assessing the robustness of networks based on a classical metric, the epidemic threshold, the new metric integrates the fraction of infected nodes at steady state for all possible effective infection strengths. Through examples, VC_SIS provides more insights about the robustness of networks than the epidemic threshold. In addition, both the paradoxical robustness of Barabasi-Albert preferential attachment networks and the effect of the topology on the steady-state infection are studied, to show the importance of quantifying the robustness of networks. Second, a new metric, VC_SIR, is introduced to assess the robustness of networks with respect to the spread of epidemics modeled through the susceptible/infected/recovered (SIR) approach.

  5. A New Method for Measurement and Reduction of Software Complexity

    SHI Yindun; XU Shiyi

    2007-01-01

    This paper develops an improved structural software complexity metrics named information flow complexity which is closely related to the reliability of software. Together with the three software complexity metrics, the total software complexity is measured and some rules to reduce the complexity are presented in the paper. To illustrate and explain the process of measurement and reduction of software complexity, several examples and experiments are given. It is proposed that software complexity metrics can be measured earlier in software development and can provide substantial information of software systems whose reliability can be modeled and used in the determination of initial parameter estimation.

  6. Laser beam complex amplitude measurement by phase diversity

    Védrenne, Nicolas; Mugnier, Laurent M.; Michau, Vincent; Velluet, Marie-Thérèse; Bierent, Rudolph

    2014-01-01

    The control of the optical quality of a laser beam requires a complex amplitude measurement able to deal with strong modulus variations and potentially highly perturbed wavefronts. The method proposed here consists in an extension of phase diversity to complex amplitude measurements that is effective for highly perturbed beams. Named CAMELOT, for Complex Amplitude MEasurement by a Likelihood Optimization Tool, it relies on the acquisition and processing of a few images of the beam section taken ...

  7. Complexity analysis in particulate matter measurements

    Luciano Telesca

    2011-09-01

    We investigated the complex temporal fluctuations of particulate matter data recorded in the London area by using the Fisher-Shannon (FS) information plane. In the FS plane, the PM10 and PM2.5 data aggregate into two different clusters, characterized by different degrees of order and organization. These results could be related to different sources of the particulate matter.

  8. An entropy based measure for comparing distributions of complexity

    Rajaram, R.; Castellani, B.

    2016-07-01

    This paper is part of a series addressing the empirical/statistical distribution of the diversity of complexity within and amongst complex systems. Here, we consider the problem of measuring the diversity of complexity in a system, given its ordered range of complexity types i and their probabilities of occurrence p_i, with the understanding that larger values of i mean a higher degree of complexity. To address this problem, we introduce a new complexity measure called case-based entropy C_c, a modification of the Shannon-Wiener entropy measure H. The utility of this measure is that, unlike current complexity measures, which focus on the macroscopic complexity of a single system, C_c can be used to empirically identify and measure the distribution of the diversity of complexity within and across multiple natural and human-made systems, as well as the diversity contribution of complexity of any part of a system, relative to the total range of ordered complexity types.
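
    The abstract does not spell out the formula for C_c. As an illustration of its ingredients only, the sketch below computes the Shannon-Wiener entropy H, the "true diversity" D = e^H, and one plausible cumulative share of diversity contributed by complexity types 1..c; the paper's exact construction differs in detail, so everything here is an assumption for illustration:

```python
import math

# Hypothetical ordered distribution over complexity types i = 1..5.
p = [0.40, 0.30, 0.15, 0.10, 0.05]

H = -sum(pi * math.log(pi) for pi in p)   # Shannon-Wiener entropy
D = math.exp(H)                           # "true diversity": effective number of types
print(H, D)

# One plausible cumulative share: diversity of the renormalized head
# (types 1..c), weighted by its total probability, as a fraction of D.
for c in range(1, len(p) + 1):
    head = p[:c]
    s = sum(head)
    Hc = -sum((q / s) * math.log(q / s) for q in head)
    print(c, round(s * math.exp(Hc) / D, 3))
```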

  9. Unraveling chaotic attractors by complex networks and measurements of stock market complexity

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a meaning similar to that of Kolmogorov complexity (calculated from the Lempel-Ziv complexity) and is an appropriate measure of a series' complexity. The proposed method is used to research the complexity of the world's major capital markets. None of these markets are completely random, and they have different degrees of complexity, both over the entire length of their time series and at finer levels of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price-memory process.

  10. A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters

    Johnsson, P.A.; Lord, D.G.

    1987-01-01

    ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic carbon concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductance are performed as checks on the analytical accuracy of the chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses. (Author's abstract)
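
    One of the checks described, the ionic balance, is standard water chemistry and easy to sketch (illustrative concentrations, not ARCHEM's code): convert each ion to milliequivalents per liter and compare the cation and anion sums.

```python
# Charge-balance check for a dilute acidic water sample.
# Concentrations in mg/L (hypothetical); (molar mass g/mol, charge).
ions = {
    "Ca":  (2.1, 40.08, +2), "Mg":  (0.6, 24.31, +2),
    "Na":  (1.3, 22.99, +1), "K":   (0.4, 39.10, +1),
    "H":   (0.02, 1.008, +1),                     # from pH near 4.7
    "SO4": (6.5, 96.06, -2), "NO3": (2.4, 62.00, -1),
    "Cl":  (1.9, 35.45, -1),
}
# meq/L = (mg/L) / (g/mol) * |charge|
meq = {name: conc / mm * abs(z) for name, (conc, mm, z) in ions.items()}
cations = sum(meq[n] for n, (_, _, z) in ions.items() if z > 0)
anions = sum(meq[n] for n, (_, _, z) in ions.items() if z < 0)
balance_error = 100 * (cations - anions) / (cations + anions)
print(round(cations, 4), round(anions, 4), round(balance_error, 1), "%")
```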

  11. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  12. Measuring control structure complexity through execution sequence grammars

    MacLennan, Bruce J.

    1981-01-01

    A method for measuring the complexity of control structures is presented. It is based on the size of a grammar describing the possible execution sequences of the control structure. This method is applied to a number of control structures, including Pascal's control structures, Dijkstra's operators, and a structure recently proposed by Parnas. The verification of complexity measures is briefly discussed. (Author)

  13. SAT is a problem with exponential complexity measured by negentropy

    Pan, Feng (Department of Physics, Liaoning Normal University, Dalian 116029, China)

    2014-01-01

    In this paper, the reason why entropy reduction (negentropy) can be used to measure the complexity of any computation is first elaborated, from the standpoints of both mathematics and informational physics. At the same time, the equivalence of computation and information is clearly stated. Then the complexities of three specific problems, logical comparison, sorting, and SAT, are analyzed and measured. The result shows that SAT is a problem with exponential complexity, which naturally leads to the conclusion...

  14. Communication line for an image scanning and measurement Complex

    A complex for the on-line processing of film information obtained by photographing events in bubble chambers is described. The complex involves an image scanning and measurement apparatus (5 SAMET image scanning and measurement tables and 2 VT-340 alphanumeric displays), the electronic computer, and a data transmission line consisting of the communication line itself, two buffer shaping amplifiers, and interfaces. The flowsheet of the communication line of the above complex is presented.

  15. Digital System for Complex Bioimpedance Measurement

    Verner, Petr

    Brno : Vysoké učení technické v Brně, 2004 - (Boušek, J.; Háze, J.), s. 149-152 ISBN 80-214-2701-9. [EDS '04 /11./ Electronic Devices and Systems Conference. Brno (CZ), 09.09.2004-10.09.2004] R&D Projects: GA ČR GA102/00/1262 Keywords : bioimpedance measurement * digital receiver * cardiac output Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering

  16. Complex permittivity measurements of ferroelectrics employing composite dielectric resonator technique.

    Krupka, Jerzy; Zychowicz, Tomasz; Bovtun, Viktor; Veljko, Sergiy

    2006-10-01

    A composite cylindrical TE(0n1)-mode dielectric resonator has been used for complex permittivity measurements of ferroelectrics at frequencies around 8.8 GHz. Rigorous equations have been derived that allowed us to find a relationship between the measured resonance frequency and Q-factor and the complex permittivity. It has been shown that the choice of an appropriate sample diameter, together with rigorous complex angular frequency analysis, allows precise measurements of various ferroelectrics. The proposed technique can be used for materials having both real and imaginary parts of permittivity as large as a few thousand. Variable-temperature measurements were performed on a PbMg(1/3)Nb(2/3)O3 (PMN) ceramic sample, and the measured complex permittivity shows good agreement with the results of measurements obtained on the same sample at lower frequencies (0.1-1.8 GHz). PMID:17036796

  17. A Simple Complexity Measurement for Software Verification and Software Testing

    Cheng, Zheng; Monahan, Rosemary; Power, James F.

    2012-01-01

    In this paper, we use a simple metric (i.e., Lines of Code) to measure the complexity involved in software verification and software testing. The goal, then, is to argue for software verification over software testing, and to motivate a discussion of how to reduce the complexity involved in software verification. We propose to reduce this complexity by translating the software to a simple intermediate representation which can be verified using an efficient verifier, such as Boogie.

  18. Complexity measure for the Prototype System Description Language (PSDL)

    Dupont, Joseph P.

    2002-01-01

    "We often misunderstand, ill define or improperly measure the complexity of software. Software complexity is represented by the degree of complication of a system determined by such factors as control flow, information flow, the degree of nesting, the types of data structures, and other system characteristics, such as unconventional architectures. However, a common notion of software complexity fulfills a non-functional requirement, that of understandability. How well do we understand the...

  19. Synaptonemal complex damage as a measure of genotoxicity at meiosis

    Synaptonemal complex aberrations can provide a sensitive measure of chemical-specific alterations to meiotic chromosomes. Mitomycin C, cyclophosphamide, amsacrine, ellipticine, colchicine, vinblastine sulfate, and cis-platin exposures in mice have been shown to cause various patterns...

  20. High Dynamic Range Complex Impedance Measurement System for Petrophysical Usage

    Chen, R.; He, X.; Yao, H.; Tan, S.; Shi, H.; Shen, R.; Yan, C.; Zeng, P.; He, L.; Qiao, N.; Xi, F.; Zhang, H.; Xie, J.

    2015-12-01

    The spectral induced polarization (SIP), or complex resistivity, method is seeing increasing application in metalliferous ore exploration, hydrocarbon exploration, underground water exploration, monitoring of environmental pollution, and the evaluation of environmental remediation. The measurement of the complex resistivity or complex impedance of rock/ore samples and polluted water plays a fundamental role in improving the effectiveness and the scope of SIP applications. However, current instruments cannot guarantee measurement accuracy when the resistance of the sample is less than 10 Ω or greater than 100 kΩ. Many samples, such as liquids, polluted sea water, igneous rock, limestone, and sandstone, cannot be measured with reliable complex resistivity results. This problem therefore casts a shadow over both basic and applied SIP research. We design a high-precision measurement system through study of the measurement principle, the sample holder, and the measurement instrument. We design input buffers on a single board. We adopt the operational amplifier AD549 in this system because of its ultra-high input impedance and ultra-low current noise. This buffer is good at acquiring the potential signal across high-impedance samples. By analyzing the sources of measurement error and the errors generated by the measurement system, we propose a correction method to remove the error in order to achieve high-quality complex impedance measurements of rock and ore samples. This measurement system improves the measurement range of the complex impedance to 0.1 Ω - 10 GΩ, with amplitude error less than 0.1% and phase error less than 0.1 mrad, over the frequency range 0.01 Hz - 1 kHz. We tested our system on resistors of 0.1 Ω - 10 GΩ in the frequency range 1 Hz - 1000 Hz, and the measurement error is less than 0.1 mrad. We also compared the results with an LCR bridge and SCIP; we find that the bridge's measuring range only reaches 100 MΩ, while SCIP's measuring range

  1. Confidence bounds of recurrence-based complexity measures

    Schinkel, Stefan [Interdisciplinary Centre for Dynamics of Complex Systems, University of Potsdam (Germany)], E-mail: schinkel@agnld.uni-potsdam.de; Marwan, N. [Interdisciplinary Centre for Dynamics of Complex Systems, University of Potsdam (Germany); Potsdam Institute for Climate Impact Research (PIK) (Germany); Dimigen, O. [Department of Psychology, University of Potsdam (Germany); Kurths, J. [Potsdam Institute for Climate Impact Research (PIK) (Germany); Department of Physics, Humboldt University at Berlin (Germany)

    2009-06-15

    In the recent past, recurrence quantification analysis (RQA) has gained increasing interest in various research areas. The complexity measures RQA provides have been useful in describing and analysing a broad range of data, and are known to be rather robust to noise and nonstationarities. Yet, one key question in empirical research concerns the confidence bounds of measured data. In the present Letter we suggest a method for estimating the confidence bounds of recurrence-based complexity measures. We study the applicability of the suggested method with model and real-life data.

  2. One Single Static Measurement Predicts Wave Localization in Complex Structures

    Lefebvre, Gautier; Gondel, Alexane; Dubois, Marc; Atlan, Michael; Feppon, Florian; Labbé, Aimé; Gillot, Camille; Garelli, Alix; Ernoult, Maxence; Mayboroda, Svitlana; Filoche, Marcel; Sebbah, Patrick

    2016-08-01

    A recent theoretical breakthrough has brought a new tool, called the localization landscape, for predicting the localization regions of vibration modes in complex or disordered systems. Here, we report on the first experiment which measures the localization landscape and demonstrates its predictive power. Holographic measurement of the static deformation under uniform load of a thin plate with complex geometry provides direct access to the landscape function. When put in vibration, this system shows modes precisely confined within the subregions delineated by the landscape function. Also the maxima of this function match the measured eigenfrequencies, while the minima of the valley network gives the frequencies at which modes become extended. This approach fully characterizes the low frequency spectrum of a complex structure from a single static measurement. It paves the way for controlling and engineering eigenmodes in any vibratory system, especially where a structural or microscopic description is not accessible.
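
    A one-dimensional toy version of the landscape idea, assuming the operator H = -d²/dx² + V(x) rather than the plate's actual bending operator: the landscape u solves Hu = 1 (the static response to a uniform load), and the valley network of u delineates the subregions where modes localize.

```python
import numpy as np

# 1D toy localization landscape: solve H u = 1 with H = -d2/dx2 + V(x),
# Dirichlet boundary conditions, for a disordered potential V.
n, L = 400, 40.0
h = L / (n + 1)
rng = np.random.default_rng(3)
V = 4.0 * rng.random(n)                      # disordered potential (arbitrary units)

# Finite-difference H: tridiagonal(-1, 2, -1)/h^2 plus diag(V).
H = (np.diag(np.full(n, 2.0)) + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1)) / h**2 + np.diag(V)
u = np.linalg.solve(H, np.ones(n))           # landscape: response to uniform load

# Local maxima of u mark candidate localization subregions; the minima
# between them form the valley network separating localized modes.
peaks = [i for i in range(1, n - 1) if u[i] > u[i - 1] and u[i] > u[i + 1]]
print(len(peaks), "candidate localization subregions")
```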

  3. Network Decomposition and Complexity Measures: An Information Geometrical Approach

    Masatoshi Funabashi

    2014-07-01

    We consider the graph representation of a stochastic model with n binary variables, and develop an information-theoretical framework to measure the degree of statistical association existing between subsystems, as well as that represented by each edge of the graph representation. In addition, we consider novel measures of complexity with respect to system decomposability, introduced via the geometric product of the Kullback-Leibler (KL) divergence. The novel complexity measures satisfy the boundary condition of vanishing in the limits of completely random and completely ordered states, and also in the presence of an independent subsystem of any size. Such complexity measures, based on geometric means, are relevant to the heterogeneity of dependencies between subsystems and the amount of information propagation shared entirely in the system.

  4. A Collection of Complex Permittivity and Permeability Measurements

    Barry, W.; Byrd, J.; Johnson, J.; Smithwick, J.

    1993-02-01

    We present the results of measurements of the complex permittivity and permeability over a frequency range of 0.1-5.1 GHz for a range of microwave absorbing materials used in a variety of accelerator applications. We also describe the automated measurement technique which uses swept-frequency S-parameter measurements made on a strip transmission line device loaded with the material under test.

  5. Measuring logic complexity can guide pattern discovery in empirical systems

    Gherardi, Marco

    2016-01-01

    We explore a definition of complexity based on logic functions, which are widely used as compact descriptions of rules in diverse fields of contemporary science. Detailed numerical analysis shows that (i) logic complexity is effective in discriminating between classes of functions commonly employed in modelling contexts; (ii) it extends the notion of canalisation, used in the study of genetic regulation, to a more general and detailed measure; (iii) it is tightly linked to the resilience of a function's output to noise affecting its inputs. We demonstrate its utility by measuring it in empirical data on gene regulation, digital circuitry, and propositional calculus. Logic complexity is exceptionally low in these systems. The asymmetry between "on" and "off" states in the data correlates with the complexity in a non-null way; a model of random Boolean networks clarifies this trend and indicates a common hierarchical architecture in the three systems.
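
    The paper's logic-complexity definition is not reproduced here; as a sketch of the noise-resilience notion it is linked to, the average sensitivity of a Boolean function (a standard quantity, used here as a stand-in) counts how many single-bit input flips change the output, averaged over all inputs:

```python
from itertools import product

def average_sensitivity(f, n):
    """Average number of single-bit input flips that change f's output,
    over all 2^n inputs: low sensitivity = high resilience to input noise."""
    total = 0
    for bits in product((0, 1), repeat=n):
        y = f(bits)
        for i in range(n):
            flipped = bits[:i] + (1 - bits[i],) + bits[i + 1:]
            total += (f(flipped) != y)
    return total / 2 ** n

AND = lambda b: b[0] & b[1] & b[2]   # canalising-like: output pinned by any 0
XOR = lambda b: b[0] ^ b[1] ^ b[2]   # every input flip changes the output
print(average_sensitivity(AND, 3))   # 0.75
print(average_sensitivity(XOR, 3))   # 3.0
```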

  6. Minimal classical communication and measurement complexity for quantum information splitting

    We present two quantum information splitting schemes using respectively tripartite GHZ and asymmetric W states as quantum channels. We show that if the secret state is chosen from a special ensemble and known to the sender (Alice), then she can split and distribute it to the receivers Bob and Charlie by performing only a single-qubit measurement and broadcasting a one-cbit message. It is clear that no other schemes could possibly achieve the same goal with simpler measurement and less classical communication. In comparison, existing schemes work for arbitrary quantum states which need not be known to Alice; however she is required to perform a two-qubit Bell measurement and communicate a two-cbit message. Hence there is a trade-off between flexibility and measurement complexity plus classical resource. In situations where our schemes are applicable, they will greatly reduce the measurement complexity and at the same time cut the communication overhead by one half

  7. A Measure of Learning Model Complexity by VC Dimension

    WANG Wen-jian; ZHANG Li-xia; XU Zong-ben

    2002-01-01

    When developing models there is always a trade-off between model complexity and model fit. In this paper, a measure of learning model complexity based on VC dimension is presented, and some relevant mathematical theory surrounding the derivation and use of this metric is summarized. The measure allows modelers to control the amount of error that is returned from a modeling system and to state upper bounds on the amount of error that the modeling system will return on all future, as yet unseen and uncollected data sets. It is possible for modelers to use the VC theory to determine which type of model more accurately represents a system.

  8. A new complexity measure for time series analysis and classification

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, authorship attribution, etc. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
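
    The ETC procedure is concrete enough to sketch directly: repeatedly substitute the most frequent pair of adjacent symbols with a new symbol, counting iterations until the sequence becomes constant (tie-breaking between equally frequent pairs is arbitrary here; the authors' exact convention is an assumption):

```python
from collections import Counter
from itertools import count

def etc(seq):
    """Effort To Compress: NSRPS iterations needed to reduce the
    sequence to a constant (or single-symbol) sequence."""
    seq = list(seq)
    fresh = count(10**6)          # supply of new, unused symbols
    steps = 0
    while len(set(seq)) > 1:
        pair = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
        new_sym = next(fresh)
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(new_sym)          # non-overlapping substitution
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps

print(etc("aaaa"))         # 0: already constant
print(etc("ababab"))       # 1: a single substitution flattens it
print(etc("abcabddcba"))   # larger: less regular, more iterations
```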

  9. Defining statistical relative complexity measure: Application to diversity in atoms

    A statistical relative complexity measure is proposed, based on the Kullback-Leibler distance measure, defining the relative information, and the Carbo quantum similarity index, defining the relative disequilibrium. It is shown that with the specific choice of prior density corresponding to the atom at the beginning of the subshell, this measure reveals the diversity of atoms as the subshells are filled across the periodic table. Numerical tests are reported using the non-relativistic Hartree-Fock as well as the relativistic Dirac-Fock density for all atoms in the periodic table.

  10. On bias of kinetic temperature measurements in complex plasmas

    Kantor, M.; Moseev, D.; Salewski, Mirko

    2014-01-01

    The kinetic temperature in complex plasmas is often measured using particle tracking velocimetry. Here, we introduce a criterion which minimizes the probability of faulty tracking of particles with normally distributed random displacements in consecutive frames. Faulty particle tracking results in a measurement bias of the deduced velocity distribution function and hence the deduced kinetic temperature. For particles with a normal velocity distribution function, mistracking biases the obtained velocity distribution function towards small velocities at the expense of large velocities...

  11. The Generalization Complexity Measure for Continuous Input Data

    Iván Gómez

    2014-01-01

    The Generalization Complexity measure, defined in Boolean space, quantifies the complexity of data in relationship to the prediction accuracy that can be expected when using a supervised classifier like a neural network, SVM, and so forth. We first extend the original measure for use with continuous functions and then, using an approach based on the set of Walsh functions, consider the case of having a finite number of data points (input/output pairs), which is usually the practical case. Using a set of trigonometric functions, a model is constructed that relates the size of the hidden layer of a neural network to the complexity. Finally, we demonstrate the application of the introduced complexity measure, using the generated model, to the problem of estimating an adequate neural network architecture for real-world data sets.

  12. Complex dielectric constant measurements by the microwave resonant cavities method

    A complex dielectric constant measurement method for solids, using cylindrical and parallelepipedic microwave resonant cavities, is presented. This method provides high accuracy when calculating the value of ε* for dielectric, semiconductor, ferroelectric and ferromagnetic materials. The paper contains a short theoretical approach, a description of the experimental method, as well as some experimental results obtained in the frequency band (19500 MHz). (author)

  13. A SHARC based ROB Complex : design and measurement results

    Boterenbrood, H; Kieft, G; Scholte, R; Slopsema, R; Vermeulen, J C

    2000-01-01

    ROB hardware, based on and exploiting the properties of the SHARC DSP and of FPGAs, and the associated software are described. Results from performance measurements and an analysis of the results for a single ROBIn as well as for a ROB Complex with up to 4 ROBIns are presented.

  14. Assessment of Complex Performances: Limitations of Key Measurement Assumptions.

    Delandshere, Ginette; Petrosky, Anthony R.

    1998-01-01

    Examines measurement concepts and assumptions traditionally used in educational assessment, using the Early Adolescence/English Language Arts assessment developed for the National Board for Professional Teaching Standards as a context. The use of numerical ratings in complex performance assessment is questioned. (SLD)

  15. Effect of ions on sulfuric acid-water binary particle formation: 1. Theory for kinetic- and nucleation-type particle formation and atmospheric implications

    Merikanto, Joonas; Duplissy, Jonathan; Määttänen, Anni; Henschel, Henning; Donahue, Neil M.; Brus, David; Schobesberger, Siegfried; Kulmala, Markku; Vehkamäki, Hanna

    2016-02-01

    We derive a version of Classical Nucleation Theory normalized by quantum chemical results on sulfuric acid-water hydration to describe neutral and ion-induced particle formation in the binary sulfuric acid-water system. The theory is extended to treat the kinetic regime, where the nucleation free energy barrier vanishes at high sulfuric acid concentrations or low temperatures. In the kinetic regime, particle formation rates become proportional to the sulfuric acid concentration to the second power in the neutral system, or to the first power in the ion-induced system. We derive simple general expressions for the prefactors in kinetic-type and activation-type particle formation calculations, applicable also to more complex systems stabilized by other species. The theory predicts that the binary water-sulfuric acid system can produce strong new particle formation in the free troposphere, both through barrier crossing and through kinetic pathways. At cold stratospheric and upper free tropospheric temperatures, neutral formation dominates the binary particle formation rates. At midtropospheric temperatures, the ion-induced pathway becomes the dominant mechanism. However, even the ion-induced binary mechanism does not produce significant particle formation in warm boundary layer conditions, as it requires temperatures below 0°C to take place at atmospheric concentrations. The theory successfully reproduces the characteristics of measured charged and neutral binary particle formation in the CERN CLOUD3 and CLOUD5 experiments, as discussed in a companion paper.
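
    Written out, the kinetic-regime limits stated above take the schematic form below (proportionalities only; the paper derives the actual prefactors):

```latex
J_{\mathrm{neutral}} \propto [\mathrm{H_2SO_4}]^{2},
\qquad
J_{\mathrm{ion\text{-}induced}} \propto [\mathrm{H_2SO_4}]^{1}
```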

  16. Measuring the Complexity of Self-organizing Traffic Lights

    Zubillaga, Dario; Aguilar, Luis Daniel; Zapotecatl, Jorge; Fernandez, Nelson; Aguilar, Jose; Rosenblueth, David A; Gershenson, Carlos

    2014-01-01

    We apply measures of complexity, emergence and self-organization to an abstract city traffic model, comparing a traditional traffic coordination method with a self-organizing method in two scenarios: cyclic boundaries and non-orientable boundaries. We show that the measures are useful to identify and characterize different dynamical phases. It becomes clear that different operation regimes are required for different traffic demands. Thus, not only is traffic a non-stationary problem which requires controllers to adapt constantly; controllers must also change drastically the complexity of their behavior depending on the demand. Based on our measures, we can say that the self-organizing method achieves an adaptability level comparable to that of a living system.

  17. Measuring the Complexity of Self-Organizing Traffic Lights

    Darío Zubillaga

    2014-04-01

    We apply measures of complexity, emergence, and self-organization to an urban traffic model for comparing a traditional traffic-light coordination method with a self-organizing method in two scenarios: cyclic boundaries and non-orientable boundaries. We show that the measures are useful to identify and characterize different dynamical phases. It becomes clear that different operation regimes are required for different traffic demands. Thus, not only is traffic a non-stationary problem, requiring controllers to adapt constantly; controllers must also change drastically the complexity of their behavior depending on the demand. Based on our measures and extending Ashby’s law of requisite variety, we can say that the self-organizing method achieves an adaptability level comparable to that of a living system.

  18. Complexity-Entropy Causality Plane as a Complexity Measure for Two-dimensional Patterns

    Ribeiro, H. V.; Lenzi, E. K.; Santoro, P. A.; Mendes, R. S. (doi:10.1371/journal.pone.0040689)

    2012-01-01

    Complexity measures are essential to understand complex systems, and there are numerous definitions to analyze one-dimensional data. However, extensions of these approaches to two- or higher-dimensional data, such as images, are much less common. Here, we reduce this gap by applying the ideas of the permutation entropy combined with a relative entropic index. We build up a numerical procedure that can be easily implemented to evaluate the complexity of two- or higher-dimensional patterns. We work out this method in different scenarios where numerical experiments and empirical data were taken into account. Specifically, we have applied the method to: i) fractal landscapes generated numerically, where we compare our measures with the Hurst exponent; ii) liquid crystal textures, where nematic-isotropic-nematic phase transitions were properly identified; iii) 12 characteristic textures of liquid crystals, where the different values show that the method can distinguish different phases; iv) and Ising surfaces, where our m...
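
    A minimal sketch of a two-dimensional permutation entropy, assuming 2x2 ordinal patterns and normalization by ln(4!); the paper additionally combines this with a relative entropic index to form the complexity-entropy plane, which is not reproduced here:

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy_2d(img, dx=2, dy=2):
    """Normalized permutation entropy of an image: distribution of ordinal
    (rank) patterns over dx-by-dy sliding windows, H / ln((dx*dy)!)."""
    patterns = Counter()
    rows, cols = img.shape
    for i in range(rows - dy + 1):
        for j in range(cols - dx + 1):
            window = img[i:i + dy, j:j + dx].ravel()
            patterns[tuple(np.argsort(window, kind="stable"))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(dx * dy))

rng = np.random.default_rng(2)
stripes = np.tile(np.arange(32), (32, 1))   # perfectly ordered gradient
noise = rng.random((32, 32))                # spatially uncorrelated noise
print(permutation_entropy_2d(stripes))      # near 0
print(permutation_entropy_2d(noise))        # near 1
```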

  19. A Method for Measuring the Structure Complexity of Web Application

    2006-01-01

    Precise and effective measurement of Web applications not only facilitates good comprehension of them, but also benefits the macro-management of software activities, such as testing, reverse engineering, reuse, etc. This paper presents research on measuring the structure complexity of Web applications. Through a deep analysis of the configuration and object interactions of Web systems, two conclusions are drawn: (1) a generic Web application consists of static web pages, dynamic pages, components, and database objects; (2) the main interactions have only three styles, namely static links, dynamic links, and call/return relations. Based on analysis and modeling of the content of a Web page (static or dynamic), methods for measuring the complexity of both the control logic of scripts and the nesting of HTML code are further discussed. In addition, two methods for measuring the complexity of inter-page navigation are also addressed, by modeling the inter-page navigation behaviors of a Web application via a WNG graph.

  20. The Step Complexity measure: its meaning and applications

    According to related studies, procedural error plays a significant role in initiating accidents or incidents. This means that, to maximize safety, it is indispensable to be able to answer the question of why operators commit procedural errors. In this study, the SC (Step Complexity) measure is introduced and its applicability to studying procedural error is investigated, since it was shown that changes in operators' performance are strongly correlated with changes in SC scores. This means that the SC measure could play an important role in research related to procedural error, since it is strongly believed that complicated procedures affect both operators' performance and the possibility of procedural error. To substantiate this expectation, the meaning of the SC measure is examined through brief explanations of its necessity, theoretical basis and verification activities. As a result, it is quite plausible that the SC measure can be used to explain changes in operators' performance due to the task complexity implied by procedures. In addition, it seems that the SC measure may be useful for various purposes, particularly for scrutinizing the relationship between procedural error and complicated procedures.

  1. Measuring system complexity to support development cost estimates

    Malone, P.; Wolfarth, L.

    Systems and System-of-Systems (SoS) are being used more frequently, either as design elements of stand-alone systems or as architectural frameworks. Consequently, a programmatic need has arisen to understand and measure system complexity in order to estimate development plans and life-cycle costs more accurately. In a prior paper, we introduced the System Readiness Level (SRL) concept as a composite function of both Technology Readiness Levels (TRLs) and Integration Readiness Levels (IRLs) and touched on system complexity. While the SRL approach provides a repeatable, process-driven method to assess the maturity of a system or SoS, it does not capture all aspects of system complexity. In this paper we assess the concept of cyclomatic complexity as a system complexity metric and consider its utility as an approach for estimating the life-cycle costs and cost growth of complex systems. We hypothesize that the greater the number of technologies and integration tasks, the more complex the system and the higher its cost to develop and maintain. We base our analysis on historical data from DoD programs that have experienced significant cost growth, including some that have been cancelled due to unsustainable cost (and schedule) growth. We begin by describing the original implementation of the cyclomatic method, which was developed to estimate the effort to maintain system software. We then describe how the method can be generalized and applied to systems. Next, we show how to estimate the cyclomatic number (CN) and examine the statistical significance of the relationship between a system's CN metric and its cost. We illustrate the method with an example. Last, we discuss opportunities for future research.
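
    Since the paper builds on McCabe's cyclomatic number, the metric itself is worth stating in code. A minimal Python sketch, assuming the system is abstracted as a graph whose nodes are component technologies and whose edges are integration interfaces; this mapping and the example figures are illustrative, not the authors' data.

    def cyclomatic_number(num_nodes: int, num_edges: int, num_components: int = 1) -> int:
        # McCabe's cyclomatic number CN = E - N + 2P for a graph with
        # N nodes, E edges and P connected components.
        return num_edges - num_nodes + 2 * num_components

    # Hypothetical system-of-systems: 9 technologies, 14 integration interfaces.
    print(cyclomatic_number(num_nodes=9, num_edges=14))   # CN = 7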

  2. Applications of fidelity measures to complex quantum systems.

    Wimberger, Sandro

    2016-06-13

    We revisit fidelity as a measure for the stability and the complexity of the quantum motion of single- and many-body systems. Within the context of cold atoms, we present an overview of applications of two fidelities, which we call static and dynamical fidelity, respectively. The static fidelity applies to quantum problems which can be diagonalized, since it is defined via the eigenfunctions. In particular, we show that the static fidelity is a highly effective practical detector of avoided crossings characterizing the complexity of the systems and their evolutions. The dynamical fidelity is defined via the time-dependent wave functions. Focusing on the quantum kicked rotor system, we highlight a few practical applications of fidelity measurements in order to better understand the large variety of dynamical regimes of this paradigm of a low-dimensional system with mixed regular-chaotic phase space. PMID:27140967
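
    To illustrate how the static fidelity detects avoided crossings, here is a minimal NumPy sketch for a generic two-level Hamiltonian H(lambda) = [[lambda, gap], [gap, -lambda]]; this toy model and its parameter values are assumptions for illustration, not the cold-atom systems studied in the paper.

    import numpy as np

    def ground_state_fidelity(lam, d_lam, gap=0.05):
        # Static fidelity |<psi0(lam)|psi0(lam + d_lam)>|^2 between ground
        # states of H at two nearby parameter values.
        def ground_state(l):
            H = np.array([[l, gap], [gap, -l]])
            _, vecs = np.linalg.eigh(H)   # eigenvalues sorted ascending
            return vecs[:, 0]
        return np.dot(ground_state(lam), ground_state(lam + d_lam)) ** 2

    for lam in (-0.5, -0.05, 0.0, 0.05, 0.5):
        print(f"lambda = {lam:+.2f}:  F = {ground_state_fidelity(lam, 1e-2):.6f}")
    # F dips near lambda = 0, flagging the avoided crossing.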

  3. Statistical analysis of complex systems with nonclassical invariant measures

    Fratalocchi, Andrea

    2011-02-28

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.

  4. Measuring complex problem solving: the MicroDYN approach

    Greiff, Samuel; Funke, Joachim

    2009-01-01

    In educational large-scale assessments such as PISA, an increasing interest in measuring cross-curricular competencies has emerged only recently. These competencies are now recognized as valuable aspects of school achievement. Complex problem solving (CPS) is an interesting construct for the diagnostics of domain-general competencies. Here, we present MicroDYN, a new approach for computer-based assessment of CPS. We introduce the new concept, describe suitable software and present first results...

  5. Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

    Mihailovic, Dragutin T; Nikolic-Djoric, Emilija; Arsenic, Ilija

    2013-01-01

    We have proposed novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We consider the background of the Kolmogorov complexity and discuss the meaning of physical as well as other complexities. To gain better insight into the complexity of complex systems and time series analysis we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: the model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of: river flow, long term precipitation, indoor 222Rn concentration and UV radiation dose) and the economy (stock price dynamics). Re...

  6. Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

    Mihailović Dragutin T.

    2015-01-01

    Full Text Available We propose novel metrics based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We consider the origins of the Kolmogorov complexity and discuss its physical meaning. To get better insights into the nature of complex systems and time series analysis we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: a model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of: river flow, long term precipitation, indoor 222Rn concentration and UV radiation dose) and the economy (stock price dynamics). The results obtained offer deeper insights into the complexity of system dynamics and time series analysis with the proposed complexity measures.
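
    A common way to make these measures computable is to approximate the Kolmogorov complexity of a binarized series by its Lempel-Ziv (1976) complexity. The sketch below builds the complexity spectrum by sweeping the binarization threshold over the amplitude range; taking the spectrum's maximum as the "highest value" and its mean as the "overall" complexity is an assumption about the aggregation, not necessarily the authors' exact definitions.

    import numpy as np

    def lz_complexity(bits):
        # Number of distinct components in the Lempel-Ziv (1976) parsing,
        # computed with the Kaspar-Schuster algorithm; bits is a 0/1 sequence.
        n = len(bits)
        c, l, i, k, k_max = 1, 1, 0, 1, 1
        while True:
            if bits[i + k - 1] == bits[l + k - 1]:
                k += 1
                if l + k > n:
                    c += 1
                    break
            else:
                k_max = max(k, k_max)
                i += 1
                if i == l:            # a new component ends here
                    c += 1
                    l += k_max
                    if l + 1 > n:
                        break
                    i, k, k_max = 0, 1, 1
                else:
                    k = 1
        return c

    def kc_spectrum(x, n_thresholds=50):
        x = np.asarray(x, dtype=float)
        n = len(x)
        norm = np.log2(n) / n    # normalization: random sequences approach 1
        spec = np.array([lz_complexity((x >= t).astype(np.uint8)) * norm
                         for t in np.linspace(x.min(), x.max(), n_thresholds)])
        return spec, spec.max(), spec.mean()   # spectrum, highest value, "overall"

    rng = np.random.default_rng(1)
    spectrum, highest, overall = kc_spectrum(rng.random(1024))
    print(highest, overall)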

  7. On the extension of Importance Measures to complex components

    Importance Measures are indicators of the risk significance of the components of a system. They are widely used in various applications of Probabilistic Safety Analyses, off-line and on-line, in decision making for preventive and corrective purposes, as well as to rank components according to their contribution to the global risk. They are primarily defined for the case where the support model is a coherent fault tree and failures of components are described by basic events of this fault tree. In this article, we study their extension to complex components, i.e. components whose failures are modeled by a gate rather than just a basic event. Although quite natural, such an extension has not received much attention in the literature. We show that it raises a number of problems. The Birnbaum Importance Measure and the notion of Critical States concentrate these difficulties. We present alternative solutions for the extension of these notions. We discuss their respective advantages and drawbacks. This article gives a new point of view on the mathematical foundations of Importance Measures and helps to clarify their physical meaning. - Highlights: • We propose an extension of Importance Measures to complex components. • We define our extension in terms of minterms, i.e. states of the system. • We discuss the physical interpretation of Importance Measures in light of this interpretation
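
    The Birnbaum Importance Measure at the heart of these difficulties has a compact minterm-based definition, I_B(i) = E[phi(1_i, X)] - E[phi(0_i, X)], that can be evaluated by brute-force enumeration of system states. A minimal sketch with an invented three-component structure function; real PSA tools work on the fault tree itself rather than enumerating minterms.

    from itertools import product

    def birnbaum_importance(structure, n, p):
        # I_B(i) = E[phi | component i works] - E[phi | component i failed],
        # computed by summing the structure function over all minterms.
        # Because 'structure' is an arbitrary Boolean function, a "complex
        # component" modeled by a gate fits in the same way.
        def expected(i, value):
            total = 0.0
            for state in product((0, 1), repeat=n):
                if state[i] != value:
                    continue
                prob = 1.0
                for j, s in enumerate(state):
                    if j != i:
                        prob *= p[j] if s else 1.0 - p[j]
                total += prob * structure(state)
            return total
        return [expected(i, 1) - expected(i, 0) for i in range(n)]

    # System works if component 0 works AND (component 1 OR component 2) works.
    phi = lambda x: x[0] and (x[1] or x[2])
    print(birnbaum_importance(phi, n=3, p=[0.95, 0.90, 0.80]))
    # [0.98, 0.19, 0.095]: the series element dominates, as expected.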

  8. OPEN PUBLIC SPACE ATTRIBUTES AND CATEGORIES – COMPLEXITY AND MEASURABILITY

    Ljiljana Čavić

    2014-12-01

    Full Text Available Within the field of architectural and urban research, this work addresses the complexity of contemporary public space, both in a conceptual and a concrete sense. It aims at systematizing spatial attributes and their categories and at discussing spatial complexity and measurability, in order to reach a more comprehensive understanding, description and analysis of public space. Our aim is to improve the everyday usage of open public space, and we acknowledge users as its crucial factor. There are numerous investigations of the complex urban and architectural reality of public space that recognise the importance of users. However, we did not find any that would holistically account for what users find essential in public space. Based on the incompleteness of existing approaches to open public space and the importance of users for its success, this paper proposes a user-orientated approach. Through an initial survey directed at users, we collected the most important aspects of public spaces as contemporary humans see them. The gathered data are analysed and coded into spatial attributes, whose role in the complexity of open public space and whose measurability are then discussed. The work results in an inventory of attributes that users find salient in public spaces. It does not discuss their qualitative values or their contribution to generating spatial realities; it aims to define them clearly, so that any further logical argumentation on open space concerning users may be solidly constructed. Finally, through a categorisation of the attributes, it proposes the disciplinary levels necessary for the analysis of the complex urban-architectural reality.

  9. Compositional segmentation and complexity measurement in stock indices

    Wang, Haifeng; Shang, Pengjian; Xia, Jianan

    2016-01-01

    In this paper, we introduce a complexity measure based on entropic segmentation, called sequence compositional complexity (SCC), into the analysis of financial time series. SCC was first used to deal directly with the complex heterogeneity of nonstationary DNA sequences, where it was found to be higher in sequences with strong long-range correlation than in those with weak long-range correlation. Applying this method to financial index data, we find that the SCC values of some mature stock indices, such as the S&P 500 (abbreviated S&P in the following) and the HSI, tend to be lower than those of Chinese index data (such as the SSE). Moreover, when the indices are classified by SCC, the financial market of Hong Kong shows more similarities with mature foreign markets than with Chinese ones. We thus find a good correspondence between the SCC of an index sequence and the complexity of the market involved.
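
    The entropic segmentation behind SCC (in the style of Bernaola-Galvan and co-workers) repeatedly cuts a symbol sequence at the point that maximizes the Jensen-Shannon divergence between the compositions of the two resulting halves. A sketch of that core step; the recursion, the significance test and the SCC formula built on the final segmentation are omitted, and the example sequence is invented.

    import numpy as np

    def shannon(counts):
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    def best_js_cut(symbols, alphabet):
        # Return the cut position maximizing the Jensen-Shannon divergence
        # between the symbol compositions of the left and right parts.
        index = {a: k for k, a in enumerate(alphabet)}
        x = np.array([index[s] for s in symbols])
        n = len(x)
        total = np.bincount(x, minlength=len(alphabet)).astype(float)
        left = np.zeros(len(alphabet))
        best_t, best_d = None, -1.0
        for t in range(1, n):                 # cut between positions t-1 and t
            left[x[t - 1]] += 1
            right = total - left
            d = shannon(total) - (t / n) * shannon(left) - ((n - t) / n) * shannon(right)
            if d > best_d:
                best_t, best_d = t, d
        return best_t, best_d

    # Two compositional regimes glued together, e.g. up-moves then down-moves.
    print(best_js_cut("u" * 40 + "d" * 60, alphabet="ud"))   # (40, ~0.97)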

  10. Atmospheric stability and complex terrain: comparing measurements and CFD

    Koblitz, Tilman; Bechmann, Andreas; Berg, Jacob;

    2014-01-01

    …buoyancy forces and heat transport are mostly ignored in state-of-the-art flow solvers. In order to decrease the uncertainty of wind resource assessment, the effect of thermal stratification on the atmospheric boundary layer should be included in such models. The present work focuses on non-neutral atmospheric flow over complex terrain, including physical processes like stability and the Coriolis force. We examine the influence of these effects on the whole atmospheric boundary layer using the DTU Wind Energy flow solver EllipSys3D. To validate the flow solver, measurements from Benakanahalli hill, a field experiment that took place in India in early 2010, are used. The experiment was specifically designed to address the combined effects of stability and Coriolis force over complex terrain, and provides a dataset to validate flow solvers. Including those effects into EllipSys3D significantly improves the…

  11. Increment Entropy as a Measure of Complexity for Time Series

    Xiaofeng Liu

    2016-01-01

    Full Text Available Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce an increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign and the other corresponding to the magnitude. Increment entropy (IncrEn) is defined as the Shannon entropy of the words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, regardless of whether they are energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.

  12. Increment entropy as a measure of complexity for time series

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series in which each increment is mapped into a word of two letters, one letter corresponding to direction and the other corresponding to magnitude. The Shannon entropy of the words is termed the increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals demonstrate its ability to detect abrupt changes, whether energetic (e.g. spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.
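
    Both versions of the abstract describe the same construction; a minimal Python reading is sketched below. The quantization rule (increment magnitudes scaled by their standard deviation and clipped to a fixed resolution) and the word length m are assumptions of this sketch, not necessarily the authors' exact choices.

    import numpy as np
    from collections import Counter

    def increment_entropy(x, m=2, resolution=4):
        # Encode each increment as a two-letter word (sign, quantized magnitude),
        # then take the Shannon entropy of length-m sequences of such words.
        inc = np.diff(np.asarray(x, dtype=float))
        sign = np.sign(inc).astype(int)
        scale = inc.std() or 1.0
        mag = np.minimum(np.abs(inc) * resolution / scale, resolution).astype(int)
        letters = list(zip(sign, mag))
        words = Counter(tuple(letters[i:i + m]) for i in range(len(letters) - m + 1))
        p = np.array(list(words.values()), dtype=float)
        p /= p.sum()
        return -(p * np.log2(p)).sum()

    rng = np.random.default_rng(2)
    print(increment_entropy(rng.normal(size=1000)))                      # high: irregular
    print(increment_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))   # lower: regular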

  13. Determination of complex microcalorimeter parameters with impedance measurements

    The proper understanding and modeling of a microcalorimeter's response requires accurate knowledge of a handful of parameters, such as C, G and α. While a few of these parameters are directly determined from the IV characteristics, some others, notably the heat capacity (C) and α, appear in degenerate combinations in most measurable quantities. The consideration of a complex microcalorimeter leads to an added ambiguity in the determination of the parameters. In general, the dependence of the microcalorimeter's complex impedance on these various parameters varies with frequency. This dependence allows us to determine individual parameters by fitting the prediction of the microcalorimeter model to impedance data. In this paper we describe efforts at characterizing the Goddard X-ray microcalorimeters. With the parameters determined by this method, we compare the pulse shape and noise spectra predictions to data taken with the same devices

  14. Overcoming Problems in the Measurement of Biological Complexity

    Cebrian, Manuel; Ortega, Alfonso

    2010-01-01

    In a genetic algorithm, fluctuations of the entropy of a genome over time are interpreted as fluctuations of the information that the genome's organism is storing about its environment, this being reflected in more complex organisms. The computation of this entropy presents technical problems due to the small population sizes used in practice. In this work we propose and test an alternative way of measuring the entropy variation in a population by means of algorithmic information theory, where the entropy variation between two generational steps is the Kolmogorov complexity of the first step conditioned on the second one. As an example application of this technique, we report experimental differences in entropy evolution between systems in which sexual reproduction is present or absent.
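
    The conditional complexity between generational steps can be approximated with any off-the-shelf compressor via K(x|y) ≈ C(y + x) − C(y). A crude zlib-based sketch with invented toy "genomes"; the genetic-algorithm machinery of the paper is not reproduced here.

    import zlib

    def conditional_complexity(x: bytes, y: bytes) -> int:
        # Compression-based approximation of K(x|y): the extra compressed
        # bytes needed for x once y is available.
        return len(zlib.compress(y + x, 9)) - len(zlib.compress(y, 9))

    genome_t = b"ACGT" * 200                        # hypothetical genome at step t
    genome_next = genome_t[:-8] + b"ACGTTGCA"       # slightly mutated next step
    print(conditional_complexity(genome_next, genome_t))       # small: little new information
    print(conditional_complexity(b"GATTACA" * 100, genome_t))  # larger: unrelated content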

  15. A new measure of heterogeneity for complex networks

    Jacob, Rinku; Misra, R; Ambika, G

    2016-01-01

    We propose a novel measure of heterogeneity for unweighted and undirected complex networks that can be derived from the degree distribution of the network instead of the degree sequences, as is done at present. We show that the proposed measure can be applied to all types of topology with ease and shows direct correlation with the diversity of node degrees in the network. The measure is mathematically well behaved and is normalised in the interval [0, 1]. The measure is applied to compute the heterogeneity of synthetic (both random and scale free) and real world networks. We specifically show that the heterogeneity of an evolving scale free network decreases as a power law with the size of the network N, implying a scale free character for the proposed measure. Finally, as a specific application, we show that the proposed measure can be used to compare the heterogeneity of recurrence networks constructed from the time series of several low dimensional chaotic attractors, thereby providing a single index to co...

  16. Entropies from Markov Models as Complexity Measures of Embedded Attractors

    Julián D. Arias-Londoño

    2015-06-01

    Full Text Available This paper addresses the problem of measuring complexity from embedded attractors as a way to characterize changes in the dynamical behavior of different types of systems with quasi-periodic behavior by observing their outputs. With the aim of measuring the stability of the trajectories of the attractor over time, this paper proposes three new estimations of entropy that are derived from a Markov model of the embedded attractor. The proposed estimators are compared with traditional nonparametric entropy measures, such as approximate entropy, sample entropy and fuzzy entropy, which only take into account the spatial dimension of the trajectory. The method proposes the use of an unsupervised algorithm to find the principal curve, which is considered as the “profile trajectory” that serves to fit the Markov model. The new entropy measures are evaluated using three synthetic experiments and three datasets of physiological signals. In terms of consistency and discrimination capabilities, the results show that the proposed measures perform better than the other entropy measures used for comparison purposes.

  17. Power Quality Measurement in a Modern Hotel Complex

    Velimir Strugar

    2010-06-01

    Full Text Available The paper presents the analysis of power quality characteristics at the 10 kV grid supplying a modern hotel complex on the Montenegrin Adriatic coast. The consumer is characterized by different types of loads, some of which have highly nonlinear characteristics: for example, smart rooms, lift drives, modern equipment for the hotel kitchen, public electric lighting, audio, video and TV devices, etc. Such loads in the hotel complex may be a source of negative effects regarding power quality in the MV public distribution network (10 kV and 35 kV). In the first part of the paper, results of harmonic measurements at a 35/10 kV substation are presented. The measurements lasted one week under real operating conditions (in accordance with EN 50160). The results were the basis for developing a simulation model. The measurement results were analyzed and compared with simulated ones. The application of a harmonic filter is simulated, and its effect on harmonic levels is calculated and discussed using the simulation results.

  18. Automated imitating-measuring complex for designing and measuring characteristics of phased antenna arrays

    Usin, V.; Markov, V.; Pomazanov, S.; Usina, A.; Filonenko, A.

    2011-01-01

    This article considers the design principles, structure and technical characteristics of an automated imitating-measuring complex, and presents variants of its hardware and software implementation for selecting the APD, justifying tolerances, and estimating the influence of manufacturing errors, the discrete nature of control, and the mutual coupling of radiating elements on PAA parameters.

  19. Measuring complexity with multifractals in texts. Translation effects

    Highlights: ► Two texts in English and one in Esperanto are transformed into 6 time series. ► D(q) and f(α) of such (and shuffled) time series are obtained. ► A model for text construction is presented based on a parametrized Cantor set. ► The model parameters can also be used when examining machine-translated texts. ► Suggested extensions to higher dimensions: 2D image analysis and hypertexts. - Abstract: Should quality be almost a synonym of complexity? To measure quality appears audacious, even very subjective. It is hereby proposed to use a multifractal approach in order to quantify quality, thus through complexity measures. A one-dimensional system is examined. It is known that (all) written texts can be treated as one-dimensional nonlinear maps. Thus, several written texts by the same author are considered, together with their translation into an unusual language, Esperanto, and, as a baseline, their corresponding shuffled versions. Different one-dimensional time series can be used: e.g. (i) one based on word lengths, (ii) the other based on word frequencies; both are used for studying, comparing and discussing the map structure. It is shown that variety in style can be measured through the D(q) and f(α) curves characterizing multifractal objects. This allows one to observe, on the one hand, whether natural and artificial languages significantly influence the writing and the translation, and, on the other, whether one author's texts differ technically from each other. In fact, the f(α) curves of the original texts are similar to each other, but the translated text shows marked differences. However, in each case the f(α) curves are far from parabolic, in contrast to the shuffled texts. Moreover, the Esperanto text has more extreme values. Criteria are thereby suggested for estimating text quality, treating the text as a time series only. A model is introduced in order to substantiate the findings: it consists in considering a text as a random Cantor set.
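
    A generic box-counting estimator of the generalized dimensions D(q) for a measure built from word lengths is sketched below; it illustrates the kind of multifractal analysis described here, but it is not the authors' exact pipeline, and the example series is invented.

    import numpy as np

    def dq_spectrum(weights, qs):
        # Normalize the weights to a probability measure, coarse-grain it over
        # dyadic boxes, fit log Z(q, eps) vs log eps, and return D(q) = tau(q)/(q-1).
        w = np.asarray(weights, dtype=float)
        n = 2 ** int(np.log2(len(w)))            # truncate to a power of two
        p = w[:n] / w[:n].sum()
        log_eps, log_z = [], []
        size = 1
        while size <= n // 4:                    # box sizes 1, 2, 4, ...
            boxes = p.reshape(n // size, size).sum(axis=1)
            log_eps.append(np.log2(size / n))
            log_z.append([np.log2((boxes[boxes > 0] ** q).sum()) for q in qs])
            size *= 2
        taus = np.polyfit(np.array(log_eps), np.array(log_z), 1)[0]   # slopes tau(q)
        return np.array([t / (q - 1) for q, t in zip(qs, taus)])      # q != 1 assumed

    rng = np.random.default_rng(3)
    word_lengths = rng.integers(1, 12, size=4096)        # hypothetical text
    print(dq_spectrum(word_lengths, qs=[-2, 0, 2, 4]))   # all close to 1: monofractal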

  20. Range-limited Centrality Measures in Complex Networks

    Ercsey-Ravasz, Maria; Chawla, Nitesh V; Toroczkai, Zoltan

    2011-01-01

    Here we present a range-limited approach to centrality measures in both non-weighted and weighted directed complex networks. We introduce an efficient method that generates for every node and every edge its betweenness centrality based on shortest paths of lengths not longer than $\ell = 1,...,L$ in the case of non-weighted networks, and for weighted networks the corresponding quantities based on minimum weight paths with path weights not larger than $w_{\ell}=\ell \Delta$, $\ell=1,2...,L=R/\Delta$. These measures provide a systematic description of the positioning importance of a node (edge) with respect to its network neighborhoods 1 step out, 2 steps out, etc., up to and including the whole network. We show that range-limited centralities obey universal scaling laws for large non-weighted networks. As the computation of traditional centrality measures is costly, this scaling behavior can be exploited to efficiently estimate centralities of nodes and edges for all ranges, including the traditional ones. The scaling ...
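
    Range-limited betweenness can be computed with a depth-limited variant of Brandes' algorithm. The sketch below handles one fixed range L on an unweighted digraph (the paper produces the quantities for every ℓ = 1,...,L and for weighted paths as well); the toy graph is invented.

    from collections import deque

    def range_limited_betweenness(adj, max_range):
        # Betweenness counting only shortest paths of length <= max_range;
        # adj maps each node to an iterable of its out-neighbors.
        bc = {v: 0.0 for v in adj}
        for s in adj:
            sigma = {v: 0 for v in adj}; sigma[s] = 1    # shortest-path counts
            dist = {v: -1 for v in adj}; dist[s] = 0
            preds = {v: [] for v in adj}
            order, queue = [], deque([s])
            while queue:                                 # BFS, cut off at max_range
                v = queue.popleft()
                order.append(v)
                if dist[v] == max_range:
                    continue
                for w in adj[v]:
                    if dist[w] < 0:
                        dist[w] = dist[v] + 1
                        queue.append(w)
                    if dist[w] == dist[v] + 1:
                        sigma[w] += sigma[v]
                        preds[w].append(v)
            delta = {v: 0.0 for v in adj}                # Brandes' back-propagation
            for w in reversed(order):
                for v in preds[w]:
                    delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
                if w != s:
                    bc[w] += delta[w]
        return bc

    graph = {"a": ["b"], "b": ["c", "d"], "c": [], "d": []}
    print(range_limited_betweenness(graph, max_range=2))   # "b" mediates a->c and a->d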

  1. Digraph Complexity Measures and Applications in Formal Language Theory

    Gruber, Hermann

    2011-01-01

    We investigate structural complexity measures on digraphs, in particular the cycle rank. This concept is intimately related to a classical topic in formal language theory, namely the star height of regular languages. We explore this connection, and obtain several new algorithmic insights regarding both cycle rank and star height. Among other results, we show that computing the cycle rank is NP-complete, even for sparse digraphs of maximum outdegree 2. Notwithstanding, we provide both a polynomial-time approximation algorithm and an exponential-time exact algorithm for this problem. The former algorithm yields an O((log n)^(3/2))-approximation in polynomial time, whereas the latter yields the optimum solution, and runs in time and space O*(1.9129^n) on digraphs of maximum outdegree at most two. Regarding the star height problem, we identify a subclass of the regular languages for which we can precisely determine the computational complexity of the star height problem. Namely, the star height problem for bidet...
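
    Eggan's recursive definition of cycle rank translates directly into code: an acyclic digraph has rank 0; a strongly connected digraph with at least one edge has rank 1 + min over vertices v of the rank of G - v; in general, the rank is the maximum over non-trivial strongly connected components. A brute-force NetworkX sketch, exponential in the worst case (consistent with the NP-completeness result above) but fine for small digraphs.

    import networkx as nx

    def cycle_rank(G):
        nontrivial = [c for c in nx.strongly_connected_components(G)
                      if len(c) > 1 or any(G.has_edge(v, v) for v in c)]
        if not nontrivial:           # acyclic: only trivial components
            return 0
        best = 0
        for comp in nontrivial:
            H = G.subgraph(comp).copy()
            best = max(best, 1 + min(cycle_rank(without(H, v)) for v in list(H)))
        return best

    def without(G, v):
        H = G.copy()
        H.remove_node(v)
        return H

    # A 3-cycle with a pendant node: cycle rank 1.
    print(cycle_rank(nx.DiGraph([(1, 2), (2, 3), (3, 1), (3, 4)])))   # 1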

  2. Determination of Complex Microcalorimeter Parameters with Impedance Measurements

    Saab, T.; Bandler, S. R.; Chervenak, J.; Figueroa-Feliciano, E.; Finkbeiner, F.; Iyomoto, N.; Kelley, R.; Kilbourne, C. A.; Lindeman, M. A.; Porter, F. S.; Sadleir, J.

    2005-01-01

    The proper understanding and modeling of a microcalorimeter's response requires the accurate knowledge of a handful of parameters, such as C, G, alpha, etc. While a few of these, such as the normal state resistance and the total thermal conductance to the heat bath (G), are directly determined from the DC IV characteristics, some others, notably the heat capacity (C) and alpha, appear in degenerate combinations in most measurable quantities. The case of a complex microcalorimeter, i.e. one in which the absorber's heat capacity is connected by a finite thermal impedance to the sensor, and subsequently by another thermal impedance to the heat bath, results in an added ambiguity in the determination of the individual C's and G's. In general, the dependence of the microcalorimeter's complex impedance on these parameters varies with frequency. This variation allows us to determine the individual parameters by fitting the prediction of the microcalorimeter model to the impedance data. We describe in this paper our efforts at characterizing the Goddard X-ray microcalorimeters. Using the parameters determined with this method we then compare the pulse shape and noise spectra predicted by the microcalorimeter model to data taken with the same devices.

  3. Measuring the complex behavior of the SO2 oxidation reaction

    Muhammad Shahzad

    2015-09-01

    Full Text Available The two-step reversible chemical reaction involving five chemical species is investigated. The quasi-equilibrium manifold (QEM) and spectral quasi-equilibrium manifold (SQEM) are used as initial approximations to simplify the mechanisms, in order to investigate the behavior of the desired species. They give a meaningful picture, but for maximum clarity the method of invariant grid (MIG) is employed. These methods simplify the complex chemical kinetics and deduce a low-dimensional manifold (LDM) from the high-dimensional mechanism. The coverage of the species near the equilibrium point is investigated, and movement along the equilibrium of the ODEs is then discussed. The steady-state behavior is observed, and a Lyapunov function is utilized to study the stability of the ODEs. Graphical results are used to describe the physical aspects of the measurements.

  4. Measurement of complex supercontinuum light pulses using time domain ptychography

    Heidt, Alexander M; Brügmann, Michael; Rohwer, Erich G; Feurer, Thomas

    2016-01-01

    We demonstrate that time-domain ptychography, a recently introduced ultrafast pulse reconstruction modality, has properties ideally suited for the temporal characterization of complex light pulses with large time-bandwidth products, as it achieves temporal resolution on the scale of a single optical cycle using long probe pulses, low sampling rates, and an extremely fast and robust algorithm. In comparison to existing techniques, ptychography minimizes the data to be recorded and processed, and drastically reduces the computational time of the reconstruction. Experimentally we measure the temporal waveform of an octave-spanning, 3.5 ps long supercontinuum pulse generated in photonic crystal fiber, resolving features as short as 5.7 fs with sub-fs resolution and 30 dB dynamic range using 100 fs probe pulses and similarly large delay steps.

  5. Measuring robustness of community structure in complex networks

    Li, Hui-Jia; Chen, Luonan

    2015-01-01

    The theory of community structure is a powerful tool for real networks, which can simplify their topological and functional analysis considerably. However, since community detection methods have random factors and real social networks obtained from complex systems always contain error edges, evaluating the robustness of community structure is an urgent and important task. In this letter, we employ the critical threshold of the resolution parameter in the Hamiltonian function, $\gamma_C$, to measure the robustness of a network. According to spectral theory, a rigorous proof shows that the proposed index is inversely proportional to the robustness of community structure. Furthermore, by utilizing the co-evolution model, we provide a new efficient method for computing the value of $\gamma_C$. The research can be applied to broad clustering problems in network analysis and data mining due to its solid mathematical basis and experimental effects.

  6. Permutation Complexity and Coupling Measures in Hidden Markov Models

    Taichi Haruna

    2013-09-01

    Full Text Available Recently, the duality between values (words) and orderings (permutations) has been proposed by the authors as a basis to discuss the relationship between information theoretic measures for finite-alphabet stationary stochastic processes and their permutation analogues. It has been used to give a simple proof of the equality between the entropy rate and the permutation entropy rate for any finite-alphabet stationary stochastic process and to show some results on the excess entropy and the transfer entropy for finite-alphabet stationary ergodic Markov processes. In this paper, we extend our previous results to hidden Markov models and show the equalities between various information theoretic complexity and coupling measures and their permutation analogues. In particular, we show the following two results within the realm of hidden Markov models with ergodic internal processes: the two permutation analogues of the transfer entropy, the symbolic transfer entropy and the transfer entropy on rank vectors, are both equivalent to the transfer entropy if they are considered as rates, and the directed information theory can be captured by the permutation entropy approach.

  7. A high accuracy broadband measurement system for time resolved complex bioimpedance measurements

    Bioimpedance measurements are useful tools in biomedical engineering and life science. Bioimpedance is the electrical impedance of living tissue and can be used in the analysis of various physiological parameters. Bioimpedance is commonly measured by injecting a small, well-known alternating current via surface electrodes into an object under test and measuring the resultant surface voltages. It is non-invasive, painless and has no known hazards. This work presents a field programmable gate array based high accuracy broadband bioimpedance measurement system for time resolved bioimpedance measurements. The system is able to measure the magnitude and phase of complex impedances under test in a frequency range of about 10–500 kHz with excitation currents from 10 µA to 5 mA. The overall measurement uncertainties stay below 1% for the impedance magnitude and below 0.5° for the phase in most measurement ranges. Furthermore, the described system has a sample rate of up to 3840 impedance spectra per second. The performance of the bioimpedance measurement system is demonstrated with a resistor-based system calibration and with measurements on biological samples. (paper)

  8. Introducing a Space Complexity Measure for P Systems

    Porreca, Antonio E.; Leporati, Alberto; Mauri, Giancarlo; Zandron, Claudio; Research Group on Natural Computing (Universidad de Sevilla) (Coordinador)

    2009-01-01

    We define space complexity classes in the framework of membrane computing, giving some initial results about their mutual relations and their connection with time complexity classes, and identifying some potentially interesting problems which require further research.

  9. Disassembling "evapotranspiration" in-situ with a complex measurement tool

    Chormanski, Jaroslaw; Kleniewska, Malgorzata; Berezowski, Tomasz; Sporak-Wasilewska, Sylwia; Okruszko, Tomasz; Szatylowicz, Jan; Batelaan, Okke

    2014-05-01

    In this work we present a complex tool for measuring water fluxes in wetland ecosystems. The tool was designed to quantify processes related to interception storage on plant leaves. The measurements are conducted by combining readings from various instruments, including an eddy covariance tower (EC), a field spectrometer, a SapFlow system, rain gauges above and under the canopy, soil moisture probes and others. The idea of this set-up is to provide continuous measurement of the overall water flux from the ecosystem (EC tower), intercepted water volume and timing (field spectrometers), through-fall (rain gauges above and under the canopy), transpiration (SapFlow), and evaporation and soil moisture (soil moisture probes). Disassembling the water flux into the above components gives more insight into interception-related processes and differentiates them from the total evapotranspiration. The measurements are conducted in the Upper Biebrza Basin (NE Poland). The study area is part of the valley, is covered by peat soils (mainly peat moss, with the exception of areas near the river) and receives no inundation waters from the Biebrza. The plant community of Agrostietum-Carici caninae has a dominant share here, creating an up to 0.6 km wide belt along the river. The area is also covered by Caricion lasiocarpae as well as meadows and pastures of Molinio-Arrhenatheretea and Phragmitetum communis. Sedges form a hummock pattern characteristic of sedge communities in natural river valleys with wetland vegetation. The main results of the measurement set-up will be the analyzed characteristics and dynamics of interception storage for sedge ecosystems and a methodology for interception monitoring by use of spectral reflectance techniques. This will give new insight into the processes of evapotranspiration in wetlands and its components: transpiration, evaporation from interception and evaporation from soil. Moreover, other important results of this project will be the estimation of energy and

  10. Methodology for Measuring the Complexity of Enterprise Information Systems

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, defined as the sum of entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology for managing information system complexity and to verify it in practice on a real-life SAP implementation project.

  11. Complex Squeezing and Force Measurement Beyond the Standard Quantum Limit

    Buchmann, L F; Kohler, J; Spethmann, N; Stamper-Kurn, D M

    2016-01-01

    A continuous quantum field, such as a propagating beam of light, may be characterized by a squeezing spectrum that is inhomogeneous in frequency. We point out that homodyne detectors, which are commonly employed to detect quantum squeezing, are blind to squeezing spectra in which the correlation between amplitude and phase fluctuations is complex. We find theoretically that such complex squeezing is a component of ponderomotive squeezing of light through cavity optomechanics. We propose a detection scheme, called synodyne detection, which reveals complex squeezing and allows its use to improve force detection beyond the standard quantum limit.

  12. Measuring the Level of Complexity of Scientific Inquiries: The LCSI Index

    Eilam, Efrat

    2015-01-01

    The study developed and applied an index for measuring the level of complexity of full authentic scientific inquiry. Complexity is a fundamental attribute of real life scientific research. The level of complexity is an overall reflection of complex cognitive and metacognitive processes which are required for navigating the authentic inquiry…

  13. Approximate entropy as a measure of system complexity.

    Pincus, S M

    1991-01-01

    Techniques to determine changing system complexity from data are evaluated. Convergence of a frequently used correlation dimension algorithm to a finite value does not necessarily imply an underlying deterministic model or chaos. Analysis of a recently developed family of formulas and statistics, approximate entropy (ApEn), suggests that ApEn can classify complex systems, given at least 1000 data values in diverse settings that include both deterministic chaotic and stochastic processes. The ...
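
    Pincus' statistic is compact enough to state in code: ApEn(m, r, N) = φ(m) − φ(m+1), where φ(m) is the average logarithm of the fraction of length-m templates lying within Chebyshev tolerance r of each template (self-matches included, so the logarithm is always defined). A minimal NumPy sketch; the common choice r = 0.2 times the standard deviation is a convention from the literature, not part of the definition.

    import numpy as np

    def approximate_entropy(x, m=2, r=None):
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * x.std()       # a widely used default tolerance
        def phi(m):
            # All length-m templates, compared under the Chebyshev distance.
            emb = np.lib.stride_tricks.sliding_window_view(x, m)
            frac = [(np.abs(emb - row).max(axis=1) <= r).mean() for row in emb]
            return np.mean(np.log(frac))
        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(4)
    print(approximate_entropy(rng.normal(size=1000)))                      # high: irregular
    print(approximate_entropy(np.sin(np.linspace(0, 30 * np.pi, 1000))))   # low: regular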

  14. A Measure for Complex Dynamics in Power Systems

    Ralph Wilson

    2011-06-01

    Full Text Available In an attempt to quantify the dynamical complexity of power systems, we introduce the use of a non-linear time series technique to detect complex dynamics in a signal. The technique is a significant reinterpretation of the Approximate Entropy (ApEn) introduced by Pincus as an approximation to the Eckmann-Ruelle entropy. It is examined in the context of power systems, and several examples are explored.

  15. Power Quality Measurement in a Modern Hotel Complex

    Velimir Strugar; Vladimir Katić

    2010-01-01

    The paper presents the analysis of power quality characteristics at the 10 kV grid supplying a modern hotel complex on the Montenegrin Adriatic coast. The consumer is characterized by different types of loads, some of which have highly nonlinear characteristics. For example, smart rooms, lift drives, modern equipment for the hotel kitchen, public electric lighting, audio, video and TV devices, etc. Such loads in the hotel complex may be a source of negative effects regarding power quality at MV pu...

  16. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR and Information Entropy

    Rong Jiang

    2015-04-01

    Full Text Available Complexity is an important factor throughout the software life cycle. It becomes increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become key to project management. This paper first analyzes the current state of research on software complexity systematically and points out problems in existing work. It then proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, yet current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and it analyzes and validates the scientific soundness and rationality of this measurement method through a large number of cases.
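
    The information-entropy idea for the Renli level can be illustrated with a minimal sketch: the Shannon entropy of the share of project communication handled by each team member. The example counts are invented, and the paper's exact weighting scheme is not reproduced here.

    import math
    from collections import Counter

    def communication_entropy(message_counts):
        # H = -sum p_i log2 p_i over each member's share of the messages.
        total = sum(message_counts.values())
        return -sum((c / total) * math.log2(c / total)
                    for c in message_counts.values() if c > 0)

    flat_team = Counter(alice=25, bob=25, carol=25, dave=25)   # evenly spread
    hub_team = Counter(alice=85, bob=5, carol=5, dave=5)       # one hub dominates
    print(communication_entropy(flat_team))   # 2.0 bits
    print(communication_entropy(hub_team))    # ~0.85 bits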

  17. Information and complexity measures for hydrologic model evaluation

    Hydrological models are commonly evaluated through the residual-based performance measures such as the root-mean square error or efficiency criteria. Such measures, however, do not evaluate the degree of similarity of patterns in simulated and measured time series. The objective of this study was to...

  18. Measuring and modeling of the wind profile in complex terrain

    Hošek, Jiří

    Wilhelmshaven : Deutsche Wind Energie Institut, 2004, ---. [DEWEK 2004. Wilhelmshaven (DE), 20.10.2005-21.10.2005] Institutional research plan: CEZ:AV0Z3042911 Keywords : wind profile * complex terrain * numerical model Subject RIV: JE - Non-nuclear Energetics, Energy Consumption ; Use

  19. Existence of biological uncertainty principle implies that we can never find 'THE' measure for biological complexity

    Banerji, Anirban

    2009-01-01

    There are innumerable 'biological complexity measures'. While some patterns emerge from these attempts to represent biological complexity, a single measure encompassing the seemingly countless features of biological systems still eludes the students of Biology. It is the pursuit of this paper to discuss the feasibility of finding one complete and objective measure for biological complexity. A theoretical construct (the 'Thread-Mesh model') is proposed here to describe biological reality. It ...

  20. Quantum mechanics with chaos correspondence principle, measurement and complexity

    Kirilyuk, A P

    1995-01-01

    The true dynamical randomness is obtained as a natural fundamental property of deterministic quantum systems. It provides quantum chaos passing to the classical dynamical chaos under the ordinary semiclassical transition, which extends the correspondence principle to chaotic systems. In return one should accept the modified form of quantum formalism (exemplified by the Schrodinger equation) which, however, does not contradict the ordinary form, and the main postulates, of quantum mechanics. It introduces the principle of the fundamental dynamic multivaluedness extending the quantum paradigm to complex dynamical behaviour. Moreover, a causal solution to the well-known problems of the foundations of quantum mechanics, those of quantum indeterminacy and wave reduction, is also found using the same method. The concept of the fundamental dynamic uncertainty thus established is universal in character and provides a unified scheme of the complete description of arbitrary complex system of any origin. This scheme inc...

  1. The Complex Trauma Questionnaire (ComplexTQ): development and preliminary psychometric properties of an instrument for measuring early relational trauma

    Maggiora Vergano, Carola; Lauriola, Marco; Speranza, Anna M.

    2015-01-01

    Research on the etiology of adult psychopathology and its relationship with childhood trauma has focused primarily on specific forms of maltreatment. This study developed an instrument for the assessment of childhood and adolescence trauma that would aid in identifying the role of co-occurring childhood stressors and chronic adverse conditions. The Complex Trauma Questionnaire (ComplexTQ), in both clinician and self-report versions, is a measure for the assessment of multi-type maltreatment: ...

  2. Liquid structure of acetic acid-water and trifluoroacetic acid-water mixtures studied by large-angle X-ray scattering and NMR.

    Takamuku, Toshiyuki; Kyoshoin, Yasuhiro; Noguchi, Hiroshi; Kusano, Shoji; Yamaguchi, Toshio

    2007-08-01

    The structures of acetic acid (AA), trifluoroacetic acid (TFA), and their aqueous mixtures over the entire range of acid mole fraction xA have been investigated by using large-angle X-ray scattering (LAXS) and NMR techniques. The results from the LAXS experiments have shown that acetic acid molecules mainly form a chain structure via hydrogen bonding in the pure liquid. In acetic acid-water mixtures hydrogen bonds of acetic acid-water and water-water gradually increase with decreasing xA, while the chain structure of acetic acid molecules is moderately ruptured. Hydrogen bonds among water molecules form markedly in acetic acid-water mixtures at low xA. TFA molecules, in contrast, form not a chain structure but cyclic dimers through hydrogen bonding in the pure liquid. In TFA-water mixtures O···O hydrogen bonds among water molecules gradually increase as xA decreases, and hydrogen bonds among water molecules form significantly in the mixtures at low xA, where TFA molecules are considerably dissociated into hydrogen ions and trifluoroacetate. 1H, 13C, and 19F NMR chemical shifts of acetic acid and TFA molecules for acetic acid-water and TFA-water mixtures indicate strong relationships between structural changes of the mixtures and the acid mole fraction. On the basis of both LAXS and NMR results, the structural changes of acetic acid-water and TFA-water mixtures with decreasing acid mole fraction and the effects of fluorination of the methyl group on the structure are discussed at the molecular level. PMID:17628099

  3. 3-D profile measurement for complex micro-structures

    HU Chun-guang; HU Xiao-dong; XU Lin-yan; GUO Tong; HU Xiao-tang

    2005-01-01

    3-D profile measurement of micro-structures is important for research on micro-machining and the characterization of micro-scale dimensions. In this paper, a new method is proposed based on phase-shifting microscopic interferometry, in which a 2-D structure template guides phase unwrapping. It is suited not only to static measurement but also to dynamic measurement, especially of the motion of MEMS devices. The 3-D profile of the active comb of a micro-resonator is obtained using the method. The theoretical precision in the out-of-plane direction is better than 0.5 nm. The in-plane theoretical precision within micro-structures is better than 0.5 μm, but at the edges of micro-structures it is on the level of micrometers, mainly caused by imprecise edge analysis. Finally, its disadvantages and future development are discussed.

  4. Block-based test data adequacy measurement criteria and test complexity metrics

    2002-01-01

    On the basis of software testing tools we developed for programming languages, we first present a new control flowgraph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria, and empirically analyze the subsumption relation between these criteria. Then, we define four test complexity metrics based on blocks: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.

  5. Block-based test data adequacy measurement criteria and test complexity metrics

    陈卫东; 杨建军; 叶澄清; 潘云鹤

    2002-01-01

    On the basis of software testing tools we developed for programming languages, we first present a new control flowgraph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria, and empirically analyze the subsumption relation between these criteria. Then, we define four test complexity metrics based on blocks: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.

  6. On the Measurement of Turbulence Over Complex Mountainous Terrain

    Stiperski, Ivana; Rotach, Mathias W.

    2016-04-01

    The theoretical treatment of turbulence is largely based on the assumption of horizontally homogeneous and flat underlying surfaces. Correspondingly, approaches developed over the years to measure turbulence statistics in order to test this theoretical understanding or to provide model input, are also largely based on the same assumption of horizontally homogeneous and flat terrain. Here we discuss aspects of turbulence measurements that require special attention in mountainous terrain. We especially emphasize the importance of data quality (flux corrections, data quality assessment, uncertainty estimates) and address the issues of coordinate systems and different post-processing options in mountainous terrain. The appropriate choice of post-processing methods is then tested based on local scaling arguments. We demonstrate that conclusions drawn from turbulence measurements obtained in mountainous terrain are rather sensitive to these post-processing choices and give suggestions as to those that are most appropriate.

  7. Complex measurement of risk factors at uranium mine workplaces

    The measurement reported was oriented to monitoring the concentrations of dust aerosol and nitrogen oxides during individual operations and the impact of diesel machinery. Other data recorded at the measurement point included the air flow volume, temperature, relative humidity, and the activity of radon and its daughters. Because a high fresh-air flow is prescribed in uranium mines in view of contamination by radon and radon daughters, the concentrations of dust aerosol and nitrogen oxides were found not even to reach permissible values. (Ha)

  8. Measuring complexity, nonextensivity and chaos in the DNA sequence of the Major Histocompatibility Complex

    Pavlos, G. P.; Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Xenakis, M. N.; Clark, Peter; Duke, Jamie; Monos, D. S.

    2015-11-01

    We analyze 4 Mb sequences of the Major Histocompatibility Complex (MHC), which is a DNA segment on chromosome 6 with high gene density, controlling many immunological functions and associated with many diseases. The analysis is based on modern theoretical and mathematical tools of complexity theory, such as nonlinear time series analysis and Tsallis non-extensive statistics. The results revealed that the DNA complexity and self-organization can be related to fractional dynamical nonlinear processes with low-dimensional deterministic chaotic and non-extensive statistical character, which generate the DNA sequences under the extremization of Tsallis q-entropy principle. While it still remains an open question as to whether the DNA walk is a fractional Brownian motion (FBM), a static anomalous diffusion process or a non-Gaussian dynamical fractional anomalous diffusion process, the results of this study testify for the latter, providing also a possible explanation for the previously observed long-range power law correlations of nucleotides, as well as the long-range correlation properties of coding and non-coding sequences present in DNA sequences.

  9. Measuring Viscosity with a Levitating Magnet: Application to Complex Fluids

    Even, C.; Bouquet, F.; Remond, J.; Deloche, B.

    2009-01-01

    As an experimental project proposed to students in the fourth year of university, a viscometer was developed, consisting of a small magnet levitating in a viscous fluid. The viscous force acting on the magnet is directly measured: viscosities in the range 10–10⁶ mPa s are obtained. This experiment is used as an introduction to complex…

  10. Resolving and measuring diffusion in complex interfaces: Exploring new capabilities

    Alam, Todd M. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)]

    2015-09-01

    This exploratory LDRD targeted the use of new high-resolution spectroscopic diffusion capabilities developed at Sandia to resolve transport processes at interfaces in heterogeneous polymer materials. In particular, the combination of high-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) spectroscopy with pulsed field gradient (PFG) diffusion experiments was used to directly explore interface diffusion within heterogeneous polymer composites, including measuring diffusion for individual chemical species in multi-component mixtures. Several different types of heterogeneous polymer systems were studied using these HRMAS NMR diffusion capabilities to probe the resolution limitations, determine the spatial length scales involved, and explore the general applicability to specific heterogeneous systems. The investigations pursued included a) the direct measurement of diffusion for poly(dimethyl siloxane) (PDMS) polymer on nano-porous materials, b) measurement of penetrant diffusion in additively manufactured (3D printed) PDMS composites, and c) measurement of diffusion in swollen polymer/penetrant mixtures within nano-confined aluminum oxide membranes. The NMR diffusion results obtained were encouraging and allowed for an improved understanding of diffusion and transport processes at the molecular level, while at the same time demonstrating that the spatial heterogeneity that can be resolved using HRMAS NMR PFG diffusion experiments must be larger than ~μm length scales, except for polymer transport within nanoporous carbons, where additional chemical resolution improves the resolvable heterogeneous length scale to hundreds of nm.

  11. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo;

    2014-01-01

    …on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measure the pressure field for three imaging schemes: a fixed focus, single...

  12. Comparison of task complexity measures for emergency operating procedures: Convergent validity and predictive validity

    Human performance while executing operating procedures is critically important for the safety of complex industrial systems. To predict and model human performance, several complexity measures have been developed. This study aims to compare the convergent validity and predictive validity of three existing complexity measures, step complexity (SC), task size, and task complexity (TC), using operator performance data collected from an emergency operating procedure (EOP) experiment. This comparative study shows that these measures have a high convergent validity with each other, most likely because all of them involve the size dimension of complexity. These measures and their sub-measures also have a high predictive validity for operation time and a moderate-to-high predictive validity for error rate, except the step logic complexity (SLC) measure, a component of the SC measure. SLC appears not to contribute to the predictive validity in the experimental EOPs. The use of visual, auditory, cognitive, and psychomotor (VACP) rating scales in the TC measure seems to be significantly beneficial for explaining the human error rate; however, these rating scales appear not to adequately reflect the complexity differences among the meta-operations in EOPs

  13. An approach to measuring adolescents' perception of complexity for pictures of fruit and vegetable mixes

    Mielby, Line Holler; Bennedbæk-Jensen, Sidsel; Edelenbos, Merete;

    2013-01-01

    Complexity is an important parameter in the food industry because of its relationship with hedonic appreciation. However, difficulties are encountered when measuring complexity. The hypothesis of this study was that sensory descriptive analysis is an effective tool for deriving terms to measure perceived complexity. An adolescent consumer group (n = 242) and an adult consumer group (n = 86) subsequently rated pictures of fruit and vegetable mixes on simplicity and attractiveness. Pearson's correlation coefficients revealed strong correlations between the sensory panel and both consumer groups' usage of simplicity, suggesting that rated simplicity can be used to measure perceived complexity. In relation to attractiveness, different optimal levels of simplicity of pictures of fruit mixes were found for segments of the adolescent consumer group.

  14. Complex permittivity measurements of ferroelectric employing composite dielectric resonator technique

    Krupka, J.; Zychowicz, T.; Bovtun, Viktor; Veljko, Sergiy

    2006-01-01

    Roč. 53, č. 10 (2006), s. 1883-1888. ISSN 0885-3010 R&D Projects: GA AV ČR(CZ) IAA1010213; GA ČR(CZ) GA202/04/0993; GA ČR(CZ) GA202/06/0403 Institutional research plan: CEZ:AV0Z10100520 Keywords : dielectric resonator * ferroelectrics * microwave measurements Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.729, year: 2006

  15. Urban sustainability : complex interactions and the measurement of risk

    Lidia Diappi

    1999-05-01

    This paper focuses on the concept of a sustainable city and its theoretical implications for the urban system. Urban sustainability is based on positive interactions among three different urban sub-systems: social, economic and physical, where social well-being coexists with economic development and environmental quality. This utopian scenario does not appear in reality: affluent economies are often associated with poverty and criminality, and labour variety and urban efficiency coexist with pollution and congestion. The research subject is the analysis of local risk and opportunity conditions, based on the application of a special definition of risk elaborated and made operative through the production of a set of maps representing the multidimensional facets of spatial organisation in urban sustainability. The interactions among the economic, social and environmental systems are complex and unpredictable and present the opportunity for a new methodology of scientific investigation: the connectionist approach, processed by Self-Reflexive Neural Networks (SRNNs). These networks are a useful instrument for investigation and analogic questioning of the database. Once the SRNN has learned the structure of the weights from the database, by querying the network with the maximization or minimization of specific groups of attributes, it is possible to read the related properties and to rank the areas. The survey scale assumed by the research is purposefully aimed at the micro-scale and concerns the Municipality of Milan, which is spatially divided into 144 zones.

  16. Measuring patient satisfaction in complex continuing care/rehabilitation care.

    Malik, Navin; Alvaro, Celeste; Kuluski, Kerry; Wilkinson, Andrea J

    2016-04-18

    Purpose - The purpose of this paper is to develop a psychometrically validated survey to assess satisfaction in complex continuing care (CCC)/rehabilitation patients. Design/methodology/approach - A paper or computer-based survey was administered to 252 CCC/rehabilitation patients (i.e. post-acute hospital care setting for people who require ongoing care before returning home) across two hospitals in Toronto, Ontario, Canada. Findings - Using factor analysis, five domains were identified with loadings above 0.4 for all but one item. Behavioral intention and information/communication showed the lowest patient satisfaction, while patient centredness the highest. Each domain correlated positively and significantly predicted overall satisfaction, with quality and safety showing the strongest predictive power and the healing environment the weakest. Gender made a significant contribution to predicting overall satisfaction, but age did not. Research limitations/implications - Results provide evidence of the survey's psychometric properties. Owing to a small sample, supplemental testing with a larger patient group is required to confirm the five-factor structure and to assess test-retest reliability. Originality/value - Improving the health system requires integrating patient perspectives. The patient experience, however, will vary depending on the population being served. This is the first psychometrically validated survey specific to a smaller specialty patient group receiving care at a CCC/rehabilitation facility in Canada. PMID:27120509

  17. Complexity and Information: Measuring Emergence, Self-organization, and Homeostasis at Multiple Scales

    Gershenson, Carlos

    2012-01-01

    Concepts used in the scientific study of complex systems have become so widespread that their use and abuse has led to ambiguity and confusion in their meaning. In this paper we use information theory to provide abstract and concise measures of complexity, emergence, self-organization, and homeostasis. The purpose is to clarify the meaning of these concepts with the aid of the proposed formal measures. In a simplified version of the measures (focussing on the information produced by a system), emergence becomes the opposite of self-organization, while complexity represents their balance. We use computational experiments on random Boolean networks and elementary cellular automata to illustrate our measures at multiple scales.
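
    A minimal sketch of these measures, assuming the convention used in this line of work: emergence E as the normalized Shannon entropy of the information a system produces, self-organization S = 1 - E as its opposite, and complexity C = 4ES as their balance:

```python
# Hedged sketch of information-theoretic emergence/self-organization/
# complexity: E = normalized Shannon entropy, S = 1 - E, C = 4*E*S
# (scaled so C lies in [0, 1] and peaks at E = S = 0.5).
import math
from collections import Counter

def emergence(seq, alphabet_size=2):
    """Normalized Shannon entropy of the symbol distribution in seq."""
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(alphabet_size)

def complexity(seq, alphabet_size=2):
    e = emergence(seq, alphabet_size)
    return 4.0 * e * (1.0 - e)          # peaks when E = S = 0.5

for s in ["0000000000000000", "0101010101010101", "0110100110010110",
          "0010011010111100"]:
    print(s, f"E={emergence(s):.2f}", f"S={1 - emergence(s):.2f}",
          f"C={complexity(s):.2f}")
```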

  18. Reconstruction of Complex Materials by Integral Geometric Measures

    2002-01-01

    The goal of much research in computational materials science is to quantify necessary morphological information and then to develop stochastic models which both accurately reflect the material morphology and allow one to estimate macroscopic physical properties. A novel method of characterizing the morphology of disordered systems is presented, based on the evolution of a family of integral geometric measures during erosion and dilation operations. The method is used to determine the accuracy of model reconstructions of random systems. It is shown that the use of erosion/dilation operations on the original image leads to a more accurate discrimination of morphology than previous methods.
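
    A hedged illustration of the erosion-based morphological fingerprint, using simple 2D stand-ins (area, boundary-edge count, Euler number) for the integral geometric measures; the exact measures and discretization in the original work may differ:

```python
# Hedged sketch: track simple 2D integral-geometric measures of a binary
# image as it is successively eroded. A reconstruction whose fingerprint
# curves match the original's is considered morphologically faithful.
import numpy as np
from scipy import ndimage

def minkowski_2d(img):
    """Area, perimeter estimate, and Euler number of a binary image."""
    area = int(img.sum())
    # Perimeter: count exposed pixel edges in x and y directions.
    perim = int(np.abs(np.diff(img.astype(int), axis=0)).sum()
                + np.abs(np.diff(img.astype(int), axis=1)).sum())
    n_components = ndimage.label(img)[1]
    # Hole count assumes a single connected outer background region.
    n_holes = ndimage.label(~img)[1] - 1
    return area, perim, n_components - n_holes

rng = np.random.default_rng(1)
img = ndimage.binary_closing(rng.random((128, 128)) < 0.45, iterations=2)

for step in range(4):                      # erosion "fingerprint"
    print(step, minkowski_2d(img))
    img = ndimage.binary_erosion(img)
```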

  19. Measuring the complex field scattered by single submicron particles

    Potenza, Marco A. C., E-mail: marco.potenza@unimi.it; Sanvito, Tiziano [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); CIMAINA, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); EOS s.r.l., viale Ortles 22/4, I-20139 Milan (Italy); Pullia, Alberto [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy)

    2015-11-15

    We describe a method for simultaneous measurements of the real and imaginary parts of the field scattered by single nanoparticles illuminated by a laser beam, exploiting a self-reference interferometric scheme relying on the fundamentals of the Optical Theorem. Results obtained with calibrated spheres of different materials are compared to the expected values obtained through a simplified analytical model without any free parameters, and the method is applied to a highly polydisperse water suspension of Poly(D,L-lactide-co-glycolide) nanoparticles. Advantages with respect to existing methods and possible applications are discussed.

  20. Prediction of Software Requirements Stability Based on Complexity Point Measurement Using Multi-Criteria Fuzzy Approach

    D. Francis Xavier Christopher

    2012-12-01

    Many software projects fail due to unstable requirements and a lack of managing requirements changes efficiently. The Software Requirements Stability Index (RSI) metric helps to evaluate the overall stability of requirements and also to keep track of the project status. The higher the stability, the fewer changes tend to propagate. Existing systems use Function Point modeling for measuring Requirements Stability. However, the main drawback of the existing modeling is that the complexity of non-functional requirements has not been measured for Requirements Stability. The non-functional factors play a vital role in assessing Requirements Stability. Numerous measurement methods have been proposed for measuring software complexity. This paper proposes a multi-criteria fuzzy based approach for finding the complexity weight based on requirement complexity attributes such as Functional Requirement Complexity, Non-Functional Requirement Complexity, Input Output Complexity, and Interface and File Complexity. Based on the complexity weight, this paper computes the software complexity point, and then predicts Software Requirements Stability based on changes in the software complexity point. The advantage of this model is that it is able to estimate software complexity early, which in turn predicts Software Requirements Stability during the software development life cycle.

  1. Design of New Complex Detector Used for Gross Beta Measuring

    The level of gross β for radioactive aerosol in the containment of nuclear plants can indicate how serious the radioactive pollution is in the shell, and it can provide evidence of whether there is leakage at the confined boundaries of the primary coolant circuit equipment. In the process of measuring, the counting of gross β is influenced by γ. In order to avoid this influence, a new method was introduced and a new detector was designed, using a plastic scintillator as the major detecting component and BGO as the sub-component. Based on the distinctive difference in light attenuation times, the signals induced in them can be discriminated. The γ background in the plastic scintillator was subtracted according to the counting of γ in BGO. The functions of absolute detection efficiency were obtained. Monte Carlo simulation shows that the influence of the γ background is decreased by about one order of magnitude. (authors)

  2. A comparison of LMC and SDL complexity measures on binomial distributions

    Piqueira, José Roberto C.

    2016-02-01

    The concept of complexity has been widely discussed in the last forty years, with many contributions coming from all areas of human knowledge, including philosophy, linguistics, history, biology, physics, chemistry and many others, and with mathematicians trying to give it a rigorous formulation. In this sense, thermodynamics meets information theory and, using the entropy definition, López-Ruiz, Mancini and Calbet proposed a definition of complexity that is referred to as the LMC measure. Shiner, Davison and Landsberg, by slightly changing the LMC definition, proposed the SDL measure, and both LMC and SDL are satisfactory measures of complexity for many problems. Here, the SDL and LMC measures are applied to the case of a binomial probability distribution, in order to clarify how the length of the data set implies complexity and how the success probability of the repeated trials determines how complex the whole set is.
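
    For reference, a small sketch of both measures on B(n, p), assuming the usual formulations C_LMC = H·D (normalized entropy times disequilibrium) and C_SDL = Δ^α(1-Δ)^β with Δ = H/H_max:

```python
# Hedged sketch: LMC and SDL complexities of binomial distributions.
import numpy as np
from scipy.stats import binom

def lmc_sdl(n, p, alpha=1.0, beta=1.0):
    """LMC and SDL complexities of the binomial distribution B(n, p)."""
    pk = binom.pmf(np.arange(n + 1), n, p)
    n_states = n + 1
    d = np.sum((pk - 1.0 / n_states) ** 2)        # disequilibrium D
    nz = pk[pk > 0]
    h_norm = -np.sum(nz * np.log(nz)) / np.log(n_states)   # Delta = H/Hmax
    c_lmc = h_norm * d
    c_sdl = h_norm ** alpha * (1.0 - h_norm) ** beta
    return c_lmc, c_sdl

for p in (0.1, 0.3, 0.5):
    print(p, [round(x, 4) for x in lmc_sdl(20, p)])
```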

  3. What the complex joint probabilities observed in weak measurements can tell us about quantum physics

    Hofmann, Holger F. [Graduate School of Advanced Sciences of Matter, Hiroshima University, Kagamiyama 1-3-1, Higashi Hiroshima 739-8530, Japan and JST, CREST, Sanbancho 5, Chiyoda-ku, Tokyo 102-0075 (Japan)

    2014-12-04

    Quantum mechanics does not permit joint measurements of non-commuting observables. However, it is possible to measure the weak value of a projection operator, followed by the precise measurement of a different property. The results can be interpreted as complex joint probabilities of the two non-commuting measurement outcomes. Significantly, it is possible to predict the outcome of completely different measurements by combining the joint probabilities of the initial state with complex conditional probabilities relating the new measurement to the possible combinations of measurement outcomes used in the characterization of the quantum state. We can therefore conclude that the complex conditional probabilities observed in weak measurements describe fundamental state-independent relations between non-commuting properties that represent the most fundamental form of universal laws in quantum physics.

  4. What the complex joint probabilities observed in weak measurements can tell us about quantum physics

    Quantum mechanics does not permit joint measurements of non-commuting observables. However, it is possible to measure the weak value of a projection operator, followed by the precise measurement of a different property. The results can be interpreted as complex joint probabilities of the two non-commuting measurement outcomes. Significantly, it is possible to predict the outcome of completely different measurements by combining the joint probabilities of the initial state with complex conditional probabilities relating the new measurement to the possible combinations of measurement outcomes used in the characterization of the quantum state. We can therefore conclude that the complex conditional probabilities observed in weak measurements describe fundamental state-independent relations between non-commuting properties that represent the most fundamental form of universal laws in quantum physics.
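
    A minimal numerical sketch of such complex joint probabilities for a qubit, written in the Kirkwood-Dirac form p(a,b) = ⟨ψ|b⟩⟨b|a⟩⟨a|ψ⟩ for two non-commuting bases; an illustration of the concept, not the paper's own derivation:

```python
# Hedged sketch: Kirkwood-Dirac-style complex joint probabilities for a
# qubit measured weakly in the Z basis and projectively in the X basis.
# The values are complex in general but sum to exactly 1.
import numpy as np

psi = np.array([np.cos(0.3), np.sin(0.3) * np.exp(1j * 0.5)])  # qubit state
z_basis = [np.array([1, 0]), np.array([0, 1])]                 # weak step
x_basis = [np.array([1, 1]) / np.sqrt(2),
           np.array([1, -1]) / np.sqrt(2)]                     # final step

total = 0
for a in z_basis:
    for b in x_basis:
        p_ab = (psi.conj() @ b) * (b.conj() @ a) * (a.conj() @ psi)
        total += p_ab
        print(f"p(a,b) = {p_ab:.4f}")
print(f"sum = {total:.4f}  (should be 1)")
```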

  5. Matrix Energy as a Measure of Topological Complexity of a Graph

    Sinha, Kaushik

    2016-01-01

    The complexity of highly interconnected systems is rooted in the interwoven architecture defined by its connectivity structure. In this paper, we develop matrix energy of the underlying connectivity structure as a measure of topological complexity and highlight interpretations about certain global features of underlying system connectivity patterns. The proposed complexity metric is shown to satisfy the Weyuker criteria as a measure of its validity as a formal complexity metric. We also introduce the notion of P point in the graph density space. The P point acts as a boundary between multiple connectivity regimes for finite-size graphs.
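
    A short sketch of the metric, assuming the standard definition of graph energy as the sum of the absolute eigenvalues of the adjacency matrix; denser wiring yields higher energy:

```python
# Hedged sketch: graph energy E(G) = sum of |eigenvalues| of the
# adjacency matrix, compared across random graphs of rising density.
import numpy as np

def graph_energy(adj):
    """Sum of absolute eigenvalues of a symmetric adjacency matrix."""
    return float(np.sum(np.abs(np.linalg.eigvalsh(adj))))

def random_graph(n, p, seed=0):
    rng = np.random.default_rng(seed)
    upper = np.triu(rng.random((n, n)) < p, k=1)
    return (upper | upper.T).astype(float)

for density in (0.05, 0.2, 0.5):
    a = random_graph(40, density)
    print(f"p={density:.2f}  E(G)={graph_energy(a):.1f}")
```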

  6. Computed phase diagrams for the system: Sodium hydroxide-uric acid-hydrochloric acid-water

    Brown, W. E.; Gregory, T. M.; Füredi-Milhofer, H.

    1987-07-01

    Renal stone formation is made complex by the variety of solid phases that are formed, by the number of components in the aqueous phase, and by the multiplicity of ionic dissociation and association processes that are involved. In the present work we apply phase diagrams calculated by the use of equilibrium constants from the ternary system sodium hydroxide-uric acid-water to simplify and make more rigorous the understanding of the factors governing dissolution and precipitation of uric acid (anhydrous and dihydrate) and sodium urate monohydrate. The system is then examined in terms of four components. Finally, procedures are described for fluids containing more than four components. The isotherms, singular points, and fields of supersaturation and undersaturation are shown in various forms of phase diagrams. This system has two notable features: (1) in the coordinates -log[H2U] versus -log[NaOH], the solubility isotherms for anhydrous uric acid and uric acid dihydrate approximate straight lines with slopes equal to +1 over a wide range of concentrations. As a result, substantial quantities of sodium acid urate monohydrate can precipitate from solution or dissolve without changing the degree of saturation of uric acid significantly. (2) The solubility isotherm for NaHU·H2O has a deltoid shape with the low-pH branch having a slope of infinity. As a result of the vertical slope of this isotherm, substantial quantities of uric acid can dissolve or precipitate without changing the degree of saturation of sodium acid urate monohydrate significantly. The H2U-NaOH singular point has a pH of 6.87 at 310 K in the ternary system.

  7. Measurements of complex impedance in microwave high power systems with a new Bluetooth integrated circuit.

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit, which is marketed for Bluetooth applications, it is possible to simplify the method of measuring the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable level input signals and is less sensitive to both amplitude and frequency fluctuations of industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented. PMID:15078067
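
    A hedged sketch of the post-processing implied above, mapping the AD8302's two DC outputs back to a complex gain; the scale factors are the datasheet nominals (about 30 mV/dB and 10 mV/degree), and a calibrated setup would substitute measured constants:

```python
# Hedged sketch: convert AD8302 magnitude/phase output voltages to a
# complex gain estimate, assuming datasheet nominal scalings.
import cmath
import math

def ad8302_to_ratio(v_mag, v_phs):
    """Convert AD8302 output voltages to a complex gain estimate."""
    gain_db = (v_mag - 0.900) / 0.030      # ~30 mV/dB, 0.9 V at 0 dB
    phase_deg = (1.800 - v_phs) / 0.010    # ~10 mV/deg, 1.8 V at 0 deg
    # Note: the AD8302 reports |phase| only; the sign is ambiguous.
    magnitude = 10 ** (gain_db / 20)
    return cmath.rect(magnitude, math.radians(phase_deg))

ratio = ad8302_to_ratio(0.96, 0.75)        # hypothetical readings
print(f"|G| = {abs(ratio):.3f}, phase = {math.degrees(cmath.phase(ratio)):.1f} deg")
```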

  8. Measurement of Characteristic Self-Similarity and Self-Diversity for Complex Mechanical Systems

    ZHOU Meili; LAI Jiangfeng

    2006-01-01

    Based on similarity science and complex system theory, a new concept of characteristic self-diversity and the corresponding relations between self-similarity and self-diversity for complex mechanical systems are presented in this paper. Methods of measuring system self-similarity and self-diversity between main system and sub-system are studied. Numerical calculations show that the characteristic self-similarity and self-diversity measurement method is valid. A new theory and method of self-similarity and self-diversity measurement for complex mechanical systems is presented.

  9. The Complex Trauma Questionnaire (ComplexTQ): development and preliminary psychometric properties of an instrument for measuring early relational trauma.

    Maggiora Vergano, Carola; Lauriola, Marco; Speranza, Anna M

    2015-01-01

    Research on the etiology of adult psychopathology and its relationship with childhood trauma has focused primarily on specific forms of maltreatment. This study developed an instrument for the assessment of childhood and adolescence trauma that would aid in identifying the role of co-occurring childhood stressors and chronic adverse conditions. The Complex Trauma Questionnaire (ComplexTQ), in both clinician and self-report versions, is a measure for the assessment of multi-type maltreatment: physical, psychological, and sexual abuse; physical and emotional neglect; as well as other traumatic experiences, such as rejection, role reversal, witnessing domestic violence, separations, and losses. The four-point Likert scale allows raters to indicate specifically with which caregiver the traumatic experience occurred. A total of 229 participants, a sample of 79 nonclinical participants and a sample of 150 high-risk and clinical participants, were assessed with the ComplexTQ clinician version applied to Adult Attachment Interview (AAI) transcripts. Initial analyses indicate acceptable inter-rater reliability. A good fit to a 6-factor model regarding the experience with the mother and to a 5-factor model regarding the experience with the father was obtained; the internal consistency of the derived factors was good. Convergent validity was provided with the AAI scales. ComplexTQ factors discriminated normative from high-risk and clinical samples. The findings suggest a promising, reliable, and valid measurement of reported early relational trauma; furthermore, the instrument is easy to complete and is useful for both research and clinical practice. PMID:26388820

  10. MEASURING OF COMPLEX STRUCTURE TRANSFER FUNCTION AND CALCULATING OF INNER SOUND FIELD

    Chen Yuan; Huang Qibai; Shi Hanmin

    2005-01-01

    In order to measure the transfer function of a complex structure and calculate the inner sound field, the transfer function of integration is introduced. By establishing a virtual system, the transfer function of integration can be measured and the inner sound field can be calculated. In the experiment, the transfer function of integration of an automobile body is measured, and the experimental method of establishing the virtual system proves valid.

  11. Measurement of the total solar energy transmittance (g-value) for complex glazings

    Duer, Karsten

    1999-01-01

    Four different complex glazings have been investigated in the Danish experimental setup METSET. The purpose of the measurements is to increase the confidence in the calorimetric measurements and to perform measurements and corrections according to a method developed in the ALTSET project...

  12. Counterions release from electrostatic complexes of polyelectrolytes and proteins of opposite charge : a direct measurement

    Gummel, Jérémie; Cousin, Fabrice; Boué, François

    2009-01-01

    Though often considered as one of the main driving process of the complexation of species of opposite charges, the release of counterions has never been experimentally directly measured on polyelectrolyte/proteins complexes. We present here the first structural determination of such a release by Small Angle Neutron Scattering in complexes made of lysozyme, a positively charged protein and of PSS, a negatively charged polyelectrolyte. Both components have the same neutron density length, so th...

  13. Measurement of solubilities for rhodium complexes and phosphine ligands in supercritical carbon dioxide

    Shimoyama, Yusuke; Sonoda, Masanori; Miyazaki, Kaoru; Higashi, Hidenori; Iwai, Yoshio; ARAI, Yasuhiko

    2008-01-01

    The solubilities of phosphine ligands and rhodium (Rh) complexes in supercritical carbon dioxide were measured with Fourier transform infrared (FT-IR) spectroscopy at 320 and 333 K and several pressures. Triphenylphosphine (TPP) and tris(p-trifluoromethylphenyl)-phosphine (TTFMPP) were selected as ligands for the Rh complex. The solubilities of the fluorinated ligands and complexes were compared with those of the non-fluorinated compounds. The solubilities of ligand increased up to 10 times b...

  14. Measuring Search Efficiency in Complex Visual Search Tasks: Global and Local Clutter

    Beck, Melissa R.; Lohrenz, Maura C.; Trafton, J. Gregory

    2010-01-01

    Set size and crowding affect search efficiency by limiting attention for recognition and attention against competition; however, these factors can be difficult to quantify in complex search tasks. The current experiments use a quantitative measure of the amount and variability of visual information (i.e., clutter) in highly complex stimuli (i.e.,…

  15. Information Measures of Complexity, Emergence, Self-organization, Homeostasis, and Autopoiesis

    Fernandez, Nelson; Maldonado, Carlos; Gershenson, Carlos

    2013-01-01

    This chapter reviews measures of emergence, self-organization, complexity, homeostasis, and autopoiesis based on information theory. These measures are derived from proposed axioms and tested in two case studies: random Boolean networks and an Arctic lake ecosystem. Emergence is defined as the information a system or process produces. Self-organization is defined as the opposite of emergence, while complexity is defined as the balance between emergence and self-organization. Homeostasis refle...

  16. Silicon Isotope Fractionation During Acid Water-Igneous Rock Interaction

    van den Boorn, S. H.; van Bergen, M. J.; Vroon, P. Z.

    2007-12-01

    Silica enrichment by metasomatic/hydrothermal alteration is a widespread phenomenon in crustal environments where acid fluids interact with silicate rocks. High-sulfidation epithermal ore deposits and acid-leached residues at hot-spring settings are among the best known examples. Acid alteration acting on basalts has also been invoked to explain the relatively high silica contents of the surface of Mars. We have analyzed basaltic-andesitic lavas from the Kawah Ijen volcanic complex (East Java, Indonesia) that were altered by interaction with the highly acid (pH ~1) sulfate-chloride water of its crater lake and seepage stream. Quantitative removal of major elements during this interaction has led to a relative increase in SiO2 contents. Our silicon isotope data, obtained by HR-MC-ICPMS and reported relative to the NIST RM8546 (=NBS28) standard, show a systematic increase in δ30Si from -0.2‰ (±0.3, 2sd) for unaltered andesites and basalts to +1.5‰ (±0.3, 2sd) for the most altered/silicified rocks. These results demonstrate that silicification induced by pervasive acid alteration is accompanied by significant Si isotope fractionation, so that altered products become isotopically heavier than the precursor rocks. Despite the observed enrichment in SiO2, the rocks have experienced an overall net loss of silicon upon alteration, if Nb is considered perfectly immobile. The observed δ30Si values of the alteration products correlate well with the inferred amounts of silicon loss. These findings suggest that 28Si is preferentially leached during water-rock interaction, implying that dissolved silica in the ambient lake and stream water is isotopically light. However, layered opaline lake sediments, which are believed to represent precipitates from the silica-saturated water, show a conspicuous 30Si-enrichment (+1.2 ± 0.2‰). Because inorganic precipitation is known to discriminate against the heavy isotope (e.g. Basile-Doelsch et al., 2006

  17. Solubilities of Isophthalic Acid in Acetic Acid + Water Solvent Mixtures

    CHENG Youwei; HUO Lei; LI Xi

    2013-01-01

    The solubilities of isophthalic acid (1) in binary acetic acid (2) + water (3) solvent mixtures were determined in a pressurized vessel. The temperature range was from 373.2 to 473.2 K, and the mole fraction of acetic acid in the solvent mixtures ranged from x2 = 0 to 1. A new method to measure the solubility was developed, which solved the problem of sampling at high temperature. The experimental results indicated that, within the temperature range studied, the solubilities of isophthalic acid in all mixtures showed an increasing trend with increasing temperature. The experimental solubilities were correlated by the Buchowski equation, and the calculated results showed good agreement with the experimental solubilities. Furthermore, the mixed solvent systems were found to exhibit a maximum solubility effect, which may be attributed to intermolecular association between the solute and the solvent mixture. The maximum solubility effect was well modeled by the modified Wilson equation.
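
    A minimal sketch of a Buchowski (λh) correlation fit of the kind mentioned above; the melting temperature and data points are illustrative placeholders, not the measured isophthalic acid values:

```python
# Hedged sketch of the Buchowski (lambda-h) correlation:
#   ln(1 + lambda*(1 - x)/x) = lambda*h*(1/T - 1/Tm),
# solved for x and fitted to illustrative solubility data.
import numpy as np
from scipy.optimize import curve_fit

TM = 614.0   # assumed solute melting point, K (placeholder)

def buchowski_x(T, lam, h):
    """Mole-fraction solubility x(T) from the lambda-h equation."""
    rhs = np.exp(lam * h * (1.0 / T - 1.0 / TM))
    return lam / (rhs - 1.0 + lam)

temps = np.linspace(373.2, 473.2, 6)
x_true = buchowski_x(temps, 0.35, 4000.0)          # synthetic "data"
x_obs = x_true * (1 + 0.02 * np.random.default_rng(2).standard_normal(6))

(lam, h), _ = curve_fit(buchowski_x, temps, x_obs, p0=[0.5, 3000.0])
print(f"lambda = {lam:.3f}, h = {h:.0f} K")
print("fit x:", np.round(buchowski_x(temps, lam, h), 4))
```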

  18. Size Distribution Studies on Sulfuric Acid-Water Particles in a Photolytic Reactor

    Abdullahi, H. U.; Kunz, J. C.; Hanson, D. R.; Thao, S.; Vences, J.

    2015-12-01

    The size distribution of particles composed of sulfuric acid and water was measured in a cylindrical photolytic flow reactor (PhoFR; inner diameter 5 cm, length ~100 cm). In the reactor, nitrous acid, water and sulfur dioxide gases, along with ultraviolet light, produced sulfuric acid. The particles formed from these vapors were detected with a scanning mobility particle spectrometer equipped with a diethylene glycol condensation particle counter (Jiang et al. 2011). For a set of standard conditions, particles attained a log-normal distribution with a peak diameter of 6 nm and a total number of about 3×10^5 cm^-3. The distributions show that ~70% of the particles are between 4 and 8 nm diameter (lnσ ~ 0.37). These standard conditions are: 296 K, 25% relative humidity, total flow = 3 sLpm, ~10 ppbv HONO, SO2 in excess. With variations of relative humidity, the total particle number varied strongly, following a power law with exponent ~3.5, and the size distributions showed a slight increase in peak diameter with relative humidity, increasing about 1 nm from 8 to 33% relative humidity. Variations of HONO at constant light intensity (wavelength of ~360 nm) were performed, and particle size and total number changed dramatically. Size distributions also changed drastically with variations of light intensity, accomplished by turning on/off some of the blacklight fluorescent bulbs that illuminated the flow reactor. Comparisons of these size distributions to recently published nucleation experiments (e.g. Zollner et al., Glasoe et al.) as well as to simulations of PhoFR reveal important details about the levels of sulfuric acid present in PhoFR as well as possible base contaminants.
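
    As a plausibility check of the quoted shape parameters (treating the 6 nm peak as the geometric median, with lnσ ≈ 0.37), a log-normal distribution indeed places roughly two-thirds of the particles between 4 and 8 nm:

```python
# Hedged sketch: fraction of a log-normal number distribution between
# two diameters, using the quoted geometric median and ln(sigma_g).
import math
from statistics import NormalDist

def lognormal_fraction(d_lo, d_hi, d_g, ln_sigma):
    z = NormalDist(mu=math.log(d_g), sigma=ln_sigma)
    return z.cdf(math.log(d_hi)) - z.cdf(math.log(d_lo))

frac = lognormal_fraction(4.0, 8.0, 6.0, 0.37)
print(f"fraction in 4-8 nm: {frac:.2f}")   # ~0.65, consistent with ~70%
```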

  19. Binary Homogeneous Nucleation of Sulfuric Acid-Water: Particle Size Distribution and Effect of

    Neitola, K.; Brus, David; Sipilä, M.; Kulmala, M.

    Thessaloniki : Hellenic Association for Aerosol Research, 2008, T03A041P. [European Aerosol Conference 2008. Thessaloniki (GR), 24.08.2008-29.08.2008] Institutional research plan: CEZ:AV0Z40720504 Keywords : sulphuric acid-water * homogeneous nucleation Subject RIV: CF - Physical ; Theoretical Chemistry

  20. In Situ Fluorescence Microscopic Measurements of Complexation Reactions at Liquid/Liquid Interface

    TSUKAHARA, Satoshi

    2005-01-01

    In situ microscopic measurement is a novel approach to clarifying the intrinsic mechanisms of complexation reactions occurring at liquid/liquid interfaces. The present review focuses mainly on three recent topics in the methodology of in situ fluorescence microscopic observation and measurement of interfacial complexation reactions: (1) two kinds of self-assemblies of Pd2+ and 5,10,15,20-tetra(4-pyridyl)-21H,23H-porphine complexes formed at the toluene/water interface, (2) microextraction of Eu3...

  1. Variances as order parameter and complexity measure for random Boolean networks

    Luque, Bartolo [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Ballesteros, Fernando J [Observatori Astronomic, Universitat de Valencia, Ed. Instituts d' Investigacio, Pol. La Coma s/n, E-46980 Paterna, Valencia (Spain); Fernandez, Manuel [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain)

    2005-02-04

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems.
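
    A hedged sketch of the proposed order parameter on a standard Kauffman-style random Boolean network; the sizes and update conventions here are illustrative:

```python
# Hedged sketch: mean temporal variance of node states in a random
# Boolean network (N nodes, K inputs each, random lookup tables).
# Near K = 2 the network crosses the order/chaos transition and the
# temporal variance rises from ~0.
import numpy as np

def rbn_temporal_variance(n=200, k=2, steps=400, transient=200, seed=0):
    rng = np.random.default_rng(seed)
    inputs = rng.integers(0, n, size=(n, k))            # random wiring
    tables = rng.integers(0, 2, size=(n, 2 ** k))       # random rules
    state = rng.integers(0, 2, size=n)
    weights = 2 ** np.arange(k)
    history = []
    for t in range(steps):
        idx = (state[inputs] * weights).sum(axis=1)     # table lookup
        state = tables[np.arange(n), idx]
        if t >= transient:
            history.append(state.copy())
    return np.mean(np.var(np.array(history), axis=0))   # mean over nodes

for k in (1, 2, 3, 4):
    print(f"K={k}: temporal variance = {rbn_temporal_variance(k=k):.3f}")
```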

  2. Stochastic processes with values in Riemannian admissible complex: Isotropic process, Wiener measure and Brownian motion

    The purpose of this work was to construct a Brownian motion with values in simplicial complexes with piecewise differential structure. After an attempt via martingale theory, we constructed a family of continuous Markov processes with values in an admissible complex; we named every process of this family an isotropic transport process. We showed that the family of isotropic processes contains a subsequence which converges weakly to a measure; we named it the Wiener measure. Then, thanks to the finite-dimensional distributions of the Wiener measure, we constructed a new continuous Markov process with values in an admissible complex: the Brownian motion. We finished with a geometric analysis of this Brownian motion, to determine, under hypotheses on the complex, the recurrent or transient behavior of such a process. (author)

  3. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals or operating procedures are revealed as one of the most important factors in aviation and manufacturing industries. In case of NPPs, the importance of procedures is more salient than other industries because not only over 50% of human errors were due to procedures but also about 18% of accidents were caused by the failure of following procedures. Thus, the provision of emergency operating procedures (EOPs) that are designed so that the possibility of human errors can be reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity measure (SC) is developed based on three sub-measures such as the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance time) but also qualitative validations to clarify physical meanings of the SC measure are performed.

  4. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals or operating procedures are revealed as one of the most important factors in aviation and manufacturing industries. In case of NPPs, the importance of procedures is more salient than other industries because not only over 50% of human errors were due to procedures but also about 18% of accidents were caused by the failure of following procedures. Thus, the provision of emergency operating procedures (EOPs) that are designed so that the possibility of human errors can be reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity measure (SC) is developed based on three sub-measures such as the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance time) but also qualitative validations to clarify physical meanings of the SC measure are performed
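
    A loose sketch of the entropy-based flavor of such sub-measures; the degree-class entropy and the weighted combination below are illustrative placeholders, not the published SIC/SLC/SSC definitions:

```python
# Hedged sketch: represent a procedure step as graphs of actions and
# combine entropy-based sub-measures into one step-complexity score.
import math
from collections import Counter

def graph_entropy(degrees):
    """First-order entropy of a node-degree distribution."""
    n = len(degrees)
    classes = Counter(degrees)                 # nodes grouped by degree
    return -sum((c / n) * math.log2(c / n) for c in classes.values())

def step_complexity(sic_deg, slc_deg, ssc_deg, w=(1.0, 1.0, 1.0)):
    """Weighted Euclidean combination of three entropy sub-measures."""
    subs = [graph_entropy(d) for d in (sic_deg, slc_deg, ssc_deg)]
    return math.sqrt(sum((wi * s) ** 2 for wi, s in zip(w, subs)))

# Degree sequences of toy information/logic/size graphs for one step.
print(f"SC = {step_complexity([1, 2, 2, 3], [1, 1, 2], [2, 2, 2, 2, 4]):.2f}")
```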

  5. Modeling complexity in pathologist workload measurement: the Automatable Activity-Based Approach to Complexity Unit Scoring (AABACUS).

    Cheung, Carol C; Torlakovic, Emina E; Chow, Hung; Snover, Dale C; Asa, Sylvia L

    2015-03-01

    Pathologists provide diagnoses relevant to the disease state of the patient and identify specific tissue characteristics relevant to response to therapy and prognosis. As personalized medicine evolves, there is a trend for increased demand of tissue-derived parameters. Pathologists perform increasingly complex analyses on the same 'cases'. Traditional methods of workload assessment and reimbursement, based on number of cases sometimes with a modifier (eg, the relative value unit (RVU) system used in the United States), often grossly underestimate the amount of work needed for complex cases and may overvalue simple, small biopsy cases. We describe a new approach to pathologist workload measurement that aligns with this new practice paradigm. Our multisite institution with geographically diverse partner institutions has developed the Automatable Activity-Based Approach to Complexity Unit Scoring (AABACUS) model that captures pathologists' clinical activities from parameters documented in departmental laboratory information systems (LISs). The model's algorithm includes: 'capture', 'export', 'identify', 'count', 'score', 'attribute', 'filter', and 'assess filtered results'. Captured data include specimen acquisition, handling, analysis, and reporting activities. Activities were counted and complexity units (CUs) generated using a complexity factor for each activity. CUs were compared between institutions, practice groups, and practice types and evaluated over a 5-year period (2008-2012). The annual load of a clinical service pathologist, irrespective of subspecialty, was ∼40,000 CUs using relative benchmarking. The model detected changing practice patterns and was appropriate for monitoring clinical workload for anatomical pathology, neuropathology, and hematopathology in academic and community settings, and encompassing subspecialty and generalist practices. AABACUS is objective, can be integrated with an LIS and automated, is reproducible, backwards compatible

  6. The Microcantilever: A Versatile Tool for Measuring the Rheological Properties of Complex Fluids

    I. Dufour

    2012-01-01

    Silicon microcantilevers can be used to measure the rheological properties of complex fluids. In this paper, two different methods will be presented. In the first method, the microcantilever is used to measure the hydrodynamic force exerted by a confined fluid on a sphere that is attached to the microcantilever. In the second method, the measurement of the microcantilever's dynamic spectrum is used to extract the hydrodynamic force exerted by the surrounding fluid on the microcantilever. The originality of the proposed methods lies in the fact that not only may the viscosity of the fluid be measured, but also the fluid's viscoelasticity, that is, both viscous and elastic properties, which are key parameters in the case of complex fluids. In both methods, the use of analytical equations permits the fluid's complex shear modulus to be extracted and expressed as a function of shear stress and/or frequency.

  7. Complexity

    Gershenson, Carlos

    2011-01-01

    The term complexity derives etymologically from the Latin plexus, which means interwoven. Intuitively, this implies that something complex is composed of elements that are difficult to separate. This difficulty arises from the relevant interactions that take place between components. This lack of separability is at odds with the classical scientific method - which has been used since the times of Galileo, Newton, Descartes, and Laplace - and has also influenced philosophy and engineering. In recent decades, the scientific study of complexity and complex systems has proposed a paradigm shift in science and philosophy, offering novel methods that take into account relevant interactions.

  8. Fast laser systems for measuring the geometry of complex-shaped objects

    Galiulin, Ravil M.; Galiulin, Rishat M.; Bakirov, J. M.; Vorontsov, A. V.; Ponomarenko, I. V.

    1999-01-01

    The technical characteristics, advantages and applications of an automated optoelectronic measuring system designed by the 'Optel' company, State Aviation University of Ufa, are presented in this paper. The measuring apparatus can be applied in industrial development and research, for example in rapid prototyping, and for obtaining geometrical parameters in medicine and criminalistics. It is essentially a non-contact, rapid scanning system, allowing measurements of complex-shaped objects such as metal and plastic workpieces or parts of the human body.

  9. Design and Functional Validation of a Complex Impedance Measurement Device for Characterization of Ultrasonic Transducers

    This paper presents the design and practical implementation of a complex impedance measurement device capable of characterizing ultrasonic transducers. The device works in the frequency range used by industrial ultrasonic transducers, which is below the measurement range of modern high-end network analyzers. The device uses the Goertzel algorithm instead of the more common FFT algorithm to calculate the magnitude and phase components of the impedance under test. A theoretical overview is given, followed by a practical approach and measurement results. (authors)
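
    For reference, a compact implementation of the Goertzel algorithm named above, evaluating magnitude and phase at a single frequency bin:

```python
# Hedged sketch: Goertzel algorithm, i.e. a single-bin DFT evaluation,
# which is cheaper than a full FFT when only one frequency is needed.
import math

def goertzel(samples, sample_rate, target_freq):
    """Return (magnitude, phase) of the DFT bin nearest target_freq."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)   # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    real = s_prev - s_prev2 * math.cos(w)
    imag = s_prev2 * math.sin(w)
    return math.hypot(real, imag), math.atan2(imag, real)

fs, f0, n = 48_000, 1_000, 4800
sig = [math.sin(2 * math.pi * f0 * i / fs + 0.7) for i in range(n)]
mag, ph = goertzel(sig, fs, f0)
print(f"magnitude = {mag:.1f}, phase = {ph:.2f} rad")
```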

  10. The effect of electrode contact resistance and capacitive coupling on Complex Resistivity measurements

    Ingeman-Nielsen, Thomas

    2006-01-01

    The effect of electrode contact resistance and capacitive coupling on complex resistivity (CR) measurements is studied in this paper. An equivalent circuit model for the receiver is developed to describe the effects. The model shows that CR measurements are severely affected even at relatively lo...... the contact resistance artificially increased by resistors. The results emphasize the importance of keeping contact resistance low in CR measurements....

  11. Exploring The Globalization Of German Mncs With The Complex Spread And Diversity Measure

    Jan Hendrik Fisch; Michael-Jörg Oesterle

    2003-01-01

    In this paper, we present a new quantitative measurement concept that integrates multiple dimensions of internationalization in a complex number and tries to measure globalization instead of simple internationalization. We apply this measure to assess the globalization states and processes of the most internationalized German MNCs. Our results suggest that these MNCs are neither globalized nor do they show a straightforward path towards globalization in the last decade. This outcome contradic...

  12. Implementing digital holograms to create and measure complex-plane optical fields

    Dudley, Angela; Majola, Nombuso; Chetty, Naven; Forbes, Andrew

    2016-02-01

    The coherent superposition of a Gaussian beam with an optical vortex can be mathematically described to occupy the complex plane. We provide a simple analogy between the mathematics, in the form of the complex plane, and the visual representation of these two superimposed optical fields. We provide detailed instructions as to how one can experimentally produce, measure, and control these fields with the use of digital holograms encoded on a spatial light modulator.
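
    A minimal sketch of the superposition and the phase-only pattern one might encode on a spatial light modulator, assuming an LG-like vortex mode; the parameters are illustrative:

```python
# Hedged sketch: Gaussian beam plus optical vortex (azimuthal phase
# exp(i*l*phi)), and the phase pattern of the superposition.
import numpy as np

n, w0, ell = 512, 0.3, 1                      # grid, waist, topological charge
x = np.linspace(-1, 1, n)
xx, yy = np.meshgrid(x, x)
r, phi = np.hypot(xx, yy), np.arctan2(yy, xx)

gauss = np.exp(-r**2 / w0**2)                 # fundamental Gaussian
vortex = (r / w0) * np.exp(-r**2 / w0**2) * np.exp(1j * ell * phi)

alpha = 0.6                                   # superposition weight
field = np.sqrt(1 - alpha**2) * gauss + alpha * vortex
hologram_phase = np.angle(field)              # phase-only SLM pattern
print(hologram_phase.shape, hologram_phase.min(), hologram_phase.max())
```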

  13. Complexity measures for object-oriented conceptual models of an application domain

    Poels, Geert; Dedene, Guido

    1997-01-01

    According to Norman Fenton, little work has been done on measuring the complexity of the problems underlying software development. Nonetheless, it is believed that this attribute has a significant impact on software quality and development effort. A substantial portion of the underlying problems are captured in the conceptual model of the application domain. Based on previous work on conceptual modelling of application domains, the attribute 'complexity of a conceptual model' is formally define...

  14. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    Cavalcante, André; Mansouri, Ahmed; Kacha, Lemya; Barros, Allan Kardec; Takeuchi, Yoshinori; Matsumoto, Naoji; Ohnishi, Noboru

    2014-01-01

    Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios. PMID:24498292

  15. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    André Cavalcante

    Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios.
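
    A hedged sketch of the two statistics named in the abstract (local RMS contrast and a spatial-frequency summary from the power spectrum); the paper's combination into a single measure is more elaborate than this stand-in:

```python
# Hedged sketch: patchwise RMS contrast statistics and a radial
# spatial-frequency centroid of the power spectrum of a grayscale image.
import numpy as np

def local_contrast_stats(img, patch=16):
    h, w = (s - s % patch for s in img.shape)
    tiles = img[:h, :w].reshape(h // patch, patch, w // patch, patch)
    rms = tiles.std(axis=(1, 3))               # RMS contrast per patch
    return rms.mean(), rms.std()

def spatial_frequency_centroid(img):
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    cy, cx = (s // 2 for s in img.shape)
    yy, xx = np.indices(img.shape)
    radius = np.hypot(yy - cy, xx - cx)
    return float((radius * spec).sum() / spec.sum())

rng = np.random.default_rng(3)
image = rng.random((256, 256))                 # stand-in for a streetscape
c_mean, c_var = local_contrast_stats(image)
print(c_mean, c_var, spatial_frequency_centroid(image))
```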

  16. Three-dimensional quantification of structures in trabecular bone using measures of complexity

    Marwan, Norbert; Kurths, Jürgen; Thomsen, Jesper Skovhus; Felsenberg, Dieter; Saparin, Peter

    2009-01-01

    The study of pathological changes of bone is an important task in diagnostic procedures of patients with metabolic bone diseases such as osteoporosis as well as in monitoring the health state of astronauts during long-term space flights. The recent availability of high-resolution three-dimensional (3D) imaging of bone challenges the development of data analysis techniques able to assess changes of the 3D microarchitecture of trabecular bone. We introduce an approach based on spatial geometrical properties and define structural measures of complexity for 3D image analysis. These measures evaluate different aspects of organization and complexity of 3D structures, such as complexity of its surface or shape variability. We apply these measures to 3D data acquired by high-resolution microcomputed tomography (µCT) from human proximal tibiae and lumbar vertebrae at different stages of...

  17. Classification of periodic, chaotic and random sequences using approximate entropy and Lempel–Ziv complexity measures

    Karthi Balasubramanian; Silpa S Nair; Nithin Nagaraj

    2015-03-01

    ‘Complexity’ has several definitions in diverse fields, and complexity measures are indicators of some aspect of the nature of a signal. Such measures are used to analyse and classify signals, and serve as a diagnostic tool to distinguish between periodic, quasiperiodic, chaotic and random signals. Lempel–Ziv (LZ) complexity and approximate entropy (ApEn) are two such popular complexity measures that are also widely used for characterizing biological signals. In this paper, we compare the utility of ApEn, LZ complexity and Shannon’s entropy in characterizing data from a nonlinear chaotic map (the logistic map). We show that the LZ and ApEn complexity measures can characterize data complexities correctly for data sequences as short as 20 in length, while Shannon’s entropy fails for lengths less than 50. In the case of noisy sequences with 10% uniform noise, Shannon’s entropy works only for lengths greater than 200, while LZ and ApEn are successful with sequences of lengths greater than 30 and 20, respectively.
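
    For reference, a compact phrase-counting implementation of the Lempel–Ziv (LZ76) complexity discussed above:

```python
# Hedged sketch: LZ76 complexity as the number of new phrases found
# while scanning a symbol sequence left to right. Periodic data yields
# low counts, irregular data higher counts, even for short sequences.
def lz76_complexity(s):
    """Number of distinct phrases in the LZ76 parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        length = 1
        # Grow the phrase while it still occurs in the text before it
        # (overlap with the phrase itself, minus its last char, allowed).
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        c += 1
        i += length
    return c

print(lz76_complexity("0101010101010101010101"))   # periodic: small
print(lz76_complexity("0110101100100101110100"))   # irregular: larger
```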

  18. Quantification of spatial structure of human proximal tibial bone biopsies using 3D measures of complexity

    Saparin, Peter I.; Thomsen, Jesper Skovhus; Prohaska, Steffen; Zaikin, Alexei; Kurths, Jürgen; Hege, H.-C.; Gowin, Wolfgang

    Changes in trabecular bone composition during development of osteoporosis are used as a model for bone loss in microgravity conditions during a space flight. Symbolic dynamics and measures of complexity are proposed and applied to assess quantitatively the structural composition of bone tissue from 3D data sets of human tibia bone biopsies acquired by a micro-CT scanner. In order to justify the newly proposed approach, the measures of complexity of the bone architecture were compared with the results of traditional 2D bone histomorphometry. The proposed technique is able to quantify the...

  19. An Activation Force-based Affinity Measure for Analyzing Complex Networks

    Jun Guo; Hanliang Guo; Zhanyi Wang

    2011-01-01

    Affinity measure is a key factor that determines the quality of the analysis of a complex network. Here, we introduce a type of statistics, activation forces, to weight the links of a complex network and thereby develop a desired affinity measure. We show that the approach is superior in facilitating the analysis through experiments on a large-scale word network and a protein-protein interaction (PPI) network consisting of ∼5,000 human proteins. The experiment on the word network verifies tha...

  20. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Nakano, Yuichiro; Akamatsu, Norihiko; Mori, Tsuyoshi; Sano, Kazunori; Satoh, Katsuya; Nagayasu, Takeshi; Miyoshi, Yoshiaki; Sugio, Tomomi; Sakai, Hideyuki; Sakae, Eiji; Ichimiya, Kazuko; Hamada, Masahisa; Nakayama, Takehisa; Fujita, Yuhzo; Yanagihara, Katsunori; Nishida, Noriyuki

    2016-01-01

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound. PMID:27223116

  1. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Yuichiro Nakano

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound.

  2. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces

    Nakano, Yuichiro; Akamatsu, Norihiko; Mori, Tsuyoshi; Sano, Kazunori; Satoh, Katsuya; Nagayasu, Takeshi; Miyoshi, Yoshiaki; Sugio, Tomomi; Sakai, Hideyuki; Sakae, Eiji; Ichimiya, Kazuko; Hamada, Masahisa; Nakayama, Takehisa; Fujita, Yuhzo; Yanagihara, Katsunori; Nishida, Noriyuki

    2016-01-01

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound. PMID:27223116

  3. PIV measurements and data accuracy analysis of flow in complex terrain

    Yao, Rentai; Hao, Hongwei; Qiao, Qingdang

    2000-10-01

    In this paper velocity fields and flow visualization in complex terrain in an environmental wind tunnel have been measured using PIV. In addition, it would be useful to appraise the PIV data by comparing the PIV results with those obtained from the well- established point measurement methods, such as constant temperature anemometry (CTA) and Dantec FlowMaster, in order to verify the accuracy of PIV measurements. The results indicate that PIV is a powerful tool for velocity measurements in the environmental wind tunnel.

  4. Measurement and documentation of complex PTSD in treatment seeking traumatized refugees

    Palic, Sabina

    ...limited to measuring symptoms of PTSD, anxiety, and depression. This renders documentation, measurement, and treatment of possible complex traumatic adaptations in traumatized refugees very difficult. The thesis comprises two studies using different measures and different samples. The first study investigated complex traumatization as Disorders of Extreme Stress Not Otherwise Specified (DESNOS). The first article from this study demonstrated that DESNOS in a clinical sample of refugees primarily resembled the Schizotypal and Paranoid personality disorders (PD), when compared to Axis I and Axis II syndromes on self-report measures. A total of 34% of the refugee clinical convenience sample (n = 116) met the criteria for DESNOS, and 32% were estimated to have one of the two PD. Furthermore, Axis-II pathology and DESNOS were observed in traumatized refugees even when there was no presence of childhood...

  5. Comparing entropy with tests for randomness as a measure of complexity in time series

    Gan, Chee Chun

    2015-01-01

    Entropy measures have become increasingly popular as an evaluation metric for complexity in the analysis of time series data, especially in physiology and medicine. Entropy measures the rate of information gain, or degree of regularity, in a time series, e.g. a heartbeat. Ideally, entropy should be able to quantify the complexity of any underlying structure in the series, as well as determine whether the variation arises from a random process. Unfortunately, most current entropy measures are unable to perform the latter differentiation. Thus, a high entropy score indicates a random or chaotic series, whereas a low score indicates a high degree of regularity. This leads to the observation that current entropy measures are equivalent to evaluating how random a series is, or conversely the degree of regularity in a time series. This raises the possibility that existing tests for randomness, such as the runs test or permutation test, may have similar utility in diagnosing certain conditions. This paper compares various t...

  6. Counterions release from electrostatic complexes of polyelectrolytes and proteins of opposite charge : a direct measurement

    Gummel, Jérémie; Boué, François

    2009-01-01

    Though often considered one of the main driving processes in the complexation of species of opposite charge, the release of counterions has never been directly measured experimentally on polyelectrolyte/protein complexes. We present here the first structural determination of such a release by Small Angle Neutron Scattering in complexes made of lysozyme, a positively charged protein, and of PSS, a negatively charged polyelectrolyte. Both components have the same neutron scattering length density, so their scattering can be switched off simultaneously in an appropriate "matching" solvent; this enables determination of the spatial distribution of the single counterions within the complexes. The counterions (including those subjected to Manning condensation) are expelled from the cores where the species are at electrostatic stoichiometry.

  7. Measuring complex for studying cascade solar photovoltaic cells and concentrator modules on their basis

    Larionov, V. R.; Malevskii, D. A.; Pokrovskii, P. V.; Rumyantsev, V. D.

    2015-06-01

    The design and implementation of several measuring complexes intended for studying cascade solar photovoltaic converters are considered. The complexes consist of a solar simulator and an electronic unit with an active load. The high-aperture light source of the complex reproduces solar intensity over a wide spectral range (λ = 350-1700 nm) with an angle of divergence of ±0.26°, characteristic of solar radiation. The active load of the electronic unit allows taking both dark and illuminated I-V characteristics of test objects within about 1 ms during the quasi-stationary part of the irradiation pulse. The small size and low power consumption of the complexes hold out the hope that they will be widely used in designing, refining, and testing efficient cascade photovoltaic converters made of III-V materials and concentrator modules integrating these converters.

  8. The precision of visual memory for a complex contour shape measured by a freehand drawing task.

    Osugi, Takayuki; Takeda, Yuji

    2013-03-01

    Contour information is an important source for object perception and memory. Three experiments examined the precision of visual short-term memory for complex contour shapes. All used a new procedure that assessed recall memory for holistic information in complex contour shapes: Participants studied, then reproduced (without cues), a contoured shape by freehand drawing. In Experiment 1 memory precision was measured by comparing Fourier descriptors for studied and reproduced contours. Results indicated survival of lower (holistic) frequency information (i.e., ≤5 cycles/perimeter) and loss of higher (detail) frequency information. Secondary tasks placed demands on either verbal memory (Experiment 2) or visual spatial memory (Experiment 3). Neither secondary task interfered with recall of complex contour shapes, suggesting that the memory system maintaining holistic shape information was independent of both the verbal memory system and the visual spatial memory subsystem of visual short-term memory. The nature of memory for complex contour shape is discussed. PMID:23296198

  9. Study of proton-transfer processes by the NMR method applied to various nuclei. VIII. The trifluoroacetic acid-water system

    It was shown earlier that the composition and type of complexes can be determined by applying the NMR method to various nuclei. The method is based on the simultaneous solution of equations describing the concentration dependence of the NMR chemical shifts for the various nuclei in the system together with material-balance equations. It has been applied to the investigation of complex-formation and proton-transfer processes in the nitric acid-water system. In the present work the authors studied aqueous solutions of an acid weaker than nitric acid, namely trifluoroacetic acid, both of the usual isotopic composition and deuterated to the extent of 97.65%, over the concentration range 0-100 mole %. The considerable concentration-dependent changes in the chemical shifts of the 1H, 13C, and 19F nuclei indicate the formation of complexes of various types and compositions

  10. Node-weighted interacting network measures improve the representation of real-world complex systems

    Wiedermann, Marc; Heitzig, Jobst; Kurths, Jürgen

    2013-01-01

    Network theory provides a rich toolbox consisting of methods, measures, and models for studying the structure and dynamics of complex systems found in nature, society, or technology. Recently, it has been pointed out that many real-world complex systems are more adequately mapped by networks of interacting or interdependent networks, e.g., a power grid showing interdependency with a communication network. Additionally, in many real-world situations it is reasonable to include node weights into complex network statistics to reflect the varying size or importance of subsystems that are represented by nodes in the network of interest. E.g., nodes can represent vastly different surface area in climate networks, volume in brain networks or economic capacity in trade networks. In this letter, combining both ideas, we derive a novel class of statistical measures for analysing the structure of networks of interacting networks with heterogeneous node weights. Using a prototypical spatial network model, we show that th...
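
    A minimal sketch of the node-weighting idea (toy graph and weights mine, not the letter's model): a node's importance with respect to a neighbouring subnetwork is obtained by summing the weights of its neighbours there, rather than counting them.

```python
# Node-weighted cross-degree between two interacting subnetworks.
import networkx as nx

G = nx.Graph()
# Two interacting subnetworks A and B with illustrative node weights
# (e.g. surface area, brain-region volume, or economic capacity).
G.add_nodes_from([("a1", {"group": "A", "w": 2.0}),
                  ("a2", {"group": "A", "w": 1.0}),
                  ("b1", {"group": "B", "w": 3.0}),
                  ("b2", {"group": "B", "w": 0.5})])
G.add_edges_from([("a1", "a2"), ("a1", "b1"), ("a2", "b2"), ("b1", "b2")])

def weighted_cross_degree(G, v, target_group):
    """Sum of node weights of v's neighbours that belong to target_group."""
    return sum(G.nodes[u]["w"] for u in G[v]
               if G.nodes[u]["group"] == target_group)

for v in G:
    print(v, weighted_cross_degree(G, v, "B"))
```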

  11. Effects of Lability of Metal Complex on Free Ion Measurement Using DMT

    Weng, L.P.; Riemsdijk, van W.H.; Temminghoff, E.J.M.

    2010-01-01

    Very low concentrations of free metal ion in natural samples can be measured using the Donnan membrane technique (DMT) based on ion transport kinetics. In this paper, the possible effects of slow dissociation of metal complexes on the interpretation of kinetic DMT are investigated both theoretically

  12. A measure of statistical complexity based on predictive information with application to finite spin systems

    Abdallah, Samer A., E-mail: samer.abdallah@eecs.qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS (United Kingdom); Plumbley, Mark D., E-mail: mark.plumbley@eecs.qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, London E1 4NS (United Kingdom)

    2012-01-09

    We propose the binding information as an information theoretic measure of complexity between multiple random variables, such as those found in the Ising or Potts models of interacting spins, and compare it with several previously proposed measures of statistical complexity, including excess entropy, Bialek et al.'s predictive information, and the multi-information. We discuss and prove some of the properties of binding information, particularly in relation to multi-information and entropy, and show that, in the case of binary random variables, the processes which maximise binding information are the 'parity' processes. The computation of binding information is demonstrated on Ising models of finite spin systems, showing that various upper and lower bounds are respected and also that there is a strong relationship between the introduction of high-order interactions and an increase of binding information.
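
    For small systems the binding information can be computed directly as the dual total correlation, B = Σ_i H(X_{-i}) − (n−1)H(X). The sketch below (my own, using that identity) reproduces the parity result for four binary variables.

```python
# Binding information (dual total correlation) of a small binary system.
import itertools
import numpy as np

def entropy(p):
    p = np.asarray([q for q in p if q > 0], dtype=float)
    return -np.sum(p * np.log2(p))

def binding_information(joint):
    """joint: dict mapping binary tuples of length n to probabilities."""
    n = len(next(iter(joint)))
    h_full = entropy(list(joint.values()))
    b = -(n - 1) * h_full
    for i in range(n):
        marg = {}  # marginal over all variables except i
        for state, p in joint.items():
            key = state[:i] + state[i + 1:]
            marg[key] = marg.get(key, 0.0) + p
        b += entropy(list(marg.values()))
    return b

n = 4
# Parity process: uniform over all even-parity strings.
even = [s for s in itertools.product((0, 1), repeat=n) if sum(s) % 2 == 0]
parity = {s: 1.0 / len(even) for s in even}
# Independent fair coins, for comparison.
iid = {s: 2.0 ** -n for s in itertools.product((0, 1), repeat=n)}
print(binding_information(parity))  # n - 1 = 3 bits (maximal)
print(binding_information(iid))     # 0 bits
```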

  13. The Word Complexity Measure: Description and Application to Developmental Phonology and Disorders

    Stoel-Gammon, Carol

    2010-01-01

    Miccio's work included a number of articles on the assessment of phonology in children with phonological disorders, typically using measures of correct articulation, such as the PCC, or analyses of errors within the framework of phonological processes. This paper introduces an approach to assessing phonology by examining the phonetic complexity of…

  14. 2D and 3D endoanal and translabial ultrasound measurement variation in normal postpartum measurements of the anal sphincter complex

    MERIWETHER, Kate V.; HALL, Rebecca J.; LEEMAN, Lawrence M.; MIGLIACCIO, Laura; QUALLS, Clifford; ROGERS, Rebecca G.

    2015-01-01

    Introduction Women may experience anal sphincter anatomy changes after vaginal or Cesarean delivery. Therefore, accurate and acceptable imaging options to evaluate the anal sphincter complex (ASC) are needed. ASC measurements may differ between translabial (TL-US) and endoanal ultrasound (EA-US) imaging and between 2D and 3D ultrasound. The objective of this analysis was to describe measurement variation between these modalities. Methods Primiparous women underwent 2D and 3D TL-US imaging of the ASC six months after a vaginal birth (VB) or Cesarean delivery (CD). A subset of women also underwent EA-US measurements. Measurements included the internal anal sphincter (IAS) thickness at proximal, mid, and distal levels and the external anal sphincter (EAS) at 3, 6, 9, and 12 o’clock positions as well as bilateral thickness of the pubovisceralis muscle (PVM). Results 433 women presented for US: 423 had TL-US and 64 had both TL-US and EA-US of the ASC. All IAS measurements were significantly thicker on TL-US than EA-US (all p0.20). On both TL-US and EA-US, there were multiple sites where significant asymmetry existed in left versus right measurements. Conclusion The ultrasound modality used to image the ASC introduces small but significant changes in measurements, and the direction of the bias depends on the muscle and location being imaged. PMID:25344221

  15. Measuring economic complexity of countries and products: which metric to use?

    Mariani, Manuel Sebastian; Vidmer, Alexandre; Medo, Matúš; Zhang, Yi-Cheng

    2015-11-01

    Evaluating the economies of countries and their relations with products in the global market is a central problem in economics, with far-reaching implications to our theoretical understanding of the international trade as well as to practical applications, such as policy making and financial investment planning. The recent Economic Complexity approach aims to quantify the competitiveness of countries and the quality of the exported products based on the empirical observation that the most competitive countries have diversified exports, whereas developing countries only export few low quality products - typically those exported by many other countries. Two different metrics, Fitness-Complexity and the Method of Reflections, have been proposed to measure country and product score in the Economic Complexity framework. We use international trade data and a recent ranking evaluation measure to quantitatively compare the ability of the two metrics to rank countries and products according to their importance in the network. The results show that the Fitness-Complexity metric outperforms the Method of Reflections in both the ranking of products and the ranking of countries. We also investigate a generalization of the Fitness-Complexity metric and show that it can produce improved rankings provided that the input data are reliable.
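
    The Fitness-Complexity map itself is compact enough to sketch directly (toy export matrix mine): country fitness sums the complexities of exported products, while product complexity is penalised whenever low-fitness countries also export the product.

```python
# Fitness-Complexity iteration on a binary country-product export matrix M
# (M[c, p] = 1 if country c exports product p competitively).
import numpy as np

def fitness_complexity(M, n_iter=100):
    n_c, n_p = M.shape
    F = np.ones(n_c)   # country fitness
    Q = np.ones(n_p)   # product complexity
    for _ in range(n_iter):
        F_new = M @ Q                    # diversified countries score high
        Q_new = 1.0 / (M.T @ (1.0 / F))  # products of weak countries score low
        F = F_new / F_new.mean()         # normalise at every step
        Q = Q_new / Q_new.mean()
    return F, Q

# Toy example: country 0 exports everything, country 2 only product 2.
M = np.array([[1, 1, 1],
              [1, 1, 0],
              [0, 0, 1]], dtype=float)
F, Q = fitness_complexity(M)
print(F)  # country 0 ranks highest
print(Q)
```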

  16. Measuring Complexity, Development Time and Understandability of a Program: A Cognitive Approach

    Amit Kumar Jakhar

    2014-11-01

    One of the central problems in software engineering is inherent complexity. Because software is the result of human creative activity, cognitive informatics plays an important role in understanding its fundamental characteristics. This paper models one fundamental characteristic of software complexity by examining the cognitive weights of basic software control structures. Cognitive weights express the degree of difficulty, or the relative time and effort, required to comprehend a given piece of software, and thus satisfy the definition of complexity. Based on this approach, a new concept of New Weighted Method Complexity (NWMC) of software is developed. Twenty programs were distributed among five postgraduate students; the mean development time was taken as the actual time needed to develop each program, and understandability (UA), the time needed to understand the code, was also measured for all programs. The study uses Jingqiu Shao et al.'s Cognitive Functional Size (CFS) of software for comparison. In order to validate the new complexity metric, the correlation between the proposed metric and CFS with respect to actual development time was calculated, and NWMC was compared with CFS using Mean Relative Error (MRE) and Standard Deviation (Std.). Finally, the authors found that the proposed measure estimates development time considerably more accurately than CFS.

  17. Recurrence Plot Based Measures of Complexity and its Application to Heart Rate Variability Data

    Marwan, N; Meyerfeldt, U; Schirdewan, A; Kurths, J

    2002-01-01

    In complex systems the knowledge of transitions between regular, laminar or chaotic behavior is essential to understand the processes going on there. Linear approaches are often not sufficient to describe these processes and several nonlinear methods require rather long time observations. To overcome these difficulties, we propose measures of complexity based on vertical structures in recurrence plots and apply them to the logistic map as well as to heart rate variability data. For the logistic map these measures enable us to detect transitions between chaotic and periodic states, as well as to identify additional laminar states, i.e. chaos-chaos transitions. Traditional recurrence quantification analysis fails to detect these latter transitions. Applying our new measures to the heart rate variability data, we are able to detect and quantify laminar phases before a life-threatening cardiac arrhythmia and, thus, to enable a prediction of such an event. Our findings could be of importance for the therapy of mal...
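
    A minimal sketch (series, threshold, and minimum line length mine) of the vertical-structure idea: laminarity, the fraction of recurrent points that belong to vertical lines of at least a minimum length in the recurrence matrix.

```python
# Laminarity from vertical structures in a recurrence plot.
import numpy as np

def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])  # distance matrix for a scalar series
    return (d < eps).astype(int)

def laminarity(R, lmin=2):
    total, in_lines = R.sum(), 0
    for j in range(R.shape[1]):          # scan each column for vertical lines
        run = 0
        for v in np.append(R[:, j], 0):  # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    return in_lines / total if total else 0.0

rng = np.random.default_rng(1)
x = np.sin(np.arange(500) * 0.2) + 0.1 * rng.normal(size=500)
R = recurrence_matrix(x, eps=0.1)
print(laminarity(R))
```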

  18. Microbial growth and biofilm formation in geologic media is detected with complex conductivity measurements

    Davis, Caroline A.; Atekwana, Estella; Atekwana, Eliot; Slater, Lee D.; Rossbach, Silvia; Mormile, Melanie R.

    2006-09-01

    Complex conductivity measurements (0.1-1000 Hz) were obtained from biostimulated sand-packed columns to investigate the effect of microbial growth and biofilm formation on the electrical properties of porous media. Microbial growth was verified by direct microbial counts, pH measurements, and environmental scanning electron microscope imaging. Peaks in imaginary (interfacial) conductivity in the biostimulated columns were coincident with peaks in the microbial cell concentrations extracted from sands. However, the real conductivity component showed no discernible relationship to microbial cell concentration. We suggest that the observed dynamic changes in the imaginary conductivity (σ″) arise from the growth and attachment of microbial cells and biofilms to sand surfaces. We conclude that complex conductivity techniques, specifically imaginary conductivity measurements are a proxy indicator for microbial growth and biofilm formation in porous media. Our results have implications for microbial enhanced oil recovery, CO2 sequestration, bioremediation, and astrobiology studies.

  19. Measuring the pollutant transport capacity of dissolved organic matter in complex matrixes

    Persson, L.; Alsberg, T.; Odham, G.;

    2003-01-01

    Dissolved organic matter (DOM) facilitated transport in contaminated groundwater was investigated through the measurement of the binding capacity of landfill leachate DOM (Vejen, Denmark) towards two model pollutants (pyrene and phenanthrene). Three different methods for measuring binding capacity were used and evaluated: head-space solid-phase micro-extraction (HS-SPME), enhanced solubility (ES) and fluorescence quenching (FQ). It was concluded that for samples with complex matrixes it was possible to measure the net effect of the DOM binding capacity and the salting out effect of the matrix... binding capacity...

  20. Complex hand dexterity: a review of biomechanical methods for measuring musical performance.

    Metcalf, Cheryl D; Irvine, Thomas A; Sims, Jennifer L; Wang, Yu L; Su, Alvin W Y; Norris, David O

    2014-01-01

    Complex hand dexterity is fundamental to our interactions with the physical, social, and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress of the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods allow us insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks present the ultimate achievement in complex hand dexterity. This paper will review methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities. PMID:24860531

  2. Investigating the TACOM measure as a general tool for quantifying the complexity of procedure guided tasks

    According to operating experience, it is evident that the role of human operators is critical for securing the safety of complex socio-technical systems. For this reason, various kinds of HRA (Human Reliability Analysis) techniques have been used for several decades in order to systematically manage the likelihood of human error. One of the prerequisites to accomplishing this goal is the provision of sufficient data that are helpful for HRA practitioners. In this regard, Podofillini, Park, and Dang (2013) investigated the feasibility of the TACOM (Task Complexity) measure as a tool to represent the effect of task complexity on the performance of human operators in an objective manner. As a result, it was observed that TACOM scores systematically explain the variation of difficulty rankings and the likelihood of human error being empirically measured. Accordingly, it is possible to expect that the TACOM measure can support HRA practitioners because they can estimate the relative difficulties (or the likelihoods of human error) among tasks based on the associated TACOM scores to some extent. In order to confirm this expectation, however, it is indispensable to ensure the generality of the TACOM measure. From this necessity, task performance time data obtained from different task environments are compared. Consequently, it is believed that the TACOM measure can be regarded as a general tool for representing the complexity of procedure guided tasks because human operators who are faced with similar TACOM scores showed comparable task performance times even under different task environments.

  3. Raman spectroscopy of the system iron(III)-sulfuric acid-water: an approach to Tinto River's (Spain) hydrogeochemistry.

    Sobron, P; Rull, F; Sobron, F; Sanz, A; Medina, J; Nielsen, C J

    2007-12-15

    Acid mine drainage is formed when pyrite (FeS2) is exposed and reacts with air and water to form sulfuric acid and dissolved iron. Tinto River (Huelva, Spain) is an example of this phenomenon. In this study, Raman spectroscopy has been used to investigate the speciation of the system iron(III)-sulfuric acid-water as an approach to Tinto River's aqueous solutions. The molalities of sulfuric acid (0.09 mol/kg) and iron(III) (0.01-1.5 mol/kg) were chosen to mimic the concentration of the species in Tinto River waters. Raman spectra of the solutions reveal a strong iron(III)-sulfate inner-sphere interaction through the ν1 sulfate band at 981 cm⁻¹ and its shoulder at 1005 cm⁻¹. Iron(III)-sulfate interaction may also be facilitated by hydrogen bonds and monitored in the Raman spectra through the symmetric stretching band of bisulfate at 1052 cm⁻¹ and a shoulder at 1040 cm⁻¹. Other bands in the low-frequency region of the Raman spectra are attributed to the formation of hydrogen-bonded complexes as well. PMID:17869164

  4. Full-field velocity and temperature measurements using magnetic resonance imaging in turbulent complex internal flows

    Flow and heat transfer in complex internal passages are difficult to predict due to the presence of strong secondary flows and multiple regions of separation. Two methods based on magnetic resonance imaging, called 4D magnetic resonance velocimetry (4D-MRV) and thermometry (4D-MRT), are described for measuring the full-field mean velocities and temperatures, respectively, in complex internal passage flows. 4D-MRV measurements are presented for flow through a model of a gas turbine blade internal cooling passage geometry with Re_h = 10,000 and compared to PIV measurements in a highly complex 180 deg bend. Measured three-component velocities provide excellent qualitative and quantitative insight into flow structures throughout the entire flow domain. The velocities agree within ±10% in magnitude and ±10 deg in direction in a large portion of the bend, which is characterized by turbulent fluctuations as high as 10-20% of the passage inlet bulk velocity. Integrated average flow rates are accurate to 4% throughout the flow domain. Preliminary 4D-MRV/MRT results are presented for heated fully developed turbulent pipe flow at Re_D = 13,000

  5. BETWEEN PARSIMONY AND COMPLEXITY: COMPARING PERFORMANCE MEASURES FOR ROMANIAN BANKING INSTITUTIONS

    ANCA MUNTEANU

    2012-01-01

    The main objective of this study is to establish the relationship between traditional measures of performance (ROE, ROA and NIM) and EVA in order to gain some insight about the relevance of using more sophisticated performance measurement tools. Towards this end the study uses two acknowledged statistical measures: Kendall's tau and Spearman's rank correlation coefficient. Using data from 12 Romanian banking institutions that report under IFRS for the period 2006-2010, the results suggest that EVA is generally highly correlated with Residual Income in the years with positive operational profits, whereas for the years with negative outcomes the correlation is low. ROA and ROE are the measures that best correlate with EVA for the entire period and thus, applying Occam's razor, could be used as substitutes for more complex shareholder earnings measures.
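
    Both rank statistics are single calls in scipy; the sketch below uses stand-in data, not the study's bank panel.

```python
# Rank correlations between a shareholder-earnings measure and an
# accounting measure (illustrative data only).
import numpy as np
from scipy.stats import kendalltau, spearmanr

rng = np.random.default_rng(0)
eva = rng.normal(size=12)                     # stand-ins for 12 banks' EVA
roa = 0.8 * eva + 0.2 * rng.normal(size=12)   # a correlated accounting measure

tau, p_tau = kendalltau(eva, roa)
rho, p_rho = spearmanr(eva, roa)
print(f"Kendall tau  = {tau:.2f} (p = {p_tau:.3f})")
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
```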

  6. A quantitative measure, mechanism and attractor for self-organization in networked complex systems

    Georgiev, Georgi Yordanov

    2012-01-01

    The quantity of organization in complex networks is measured here as the inverse of the average sum of physical actions of all elements per unit motion, multiplied by Planck's constant; in other words, it is the inverse of the number of quanta of action per unit motion of an element. This definition can be applied to the organization of any complex system. Systems self-organize to decrease the average action per element per unit motion. This lowest-action state is the attractor for the continuous self-organization and evolution of a dynamical complex system. Constraints increase this average action, and constraint minimization by the elements is a basic mechanism of action minimization. An increase in the number of elements in a network leads to faster constraint minimization through grouping, a decrease of the average action per element and motion, and therefore an accelerated rate of self-organization. Progressive development, as self-organization, is a process of minimization of action.

  7. Quantifying the complexity of human colonic pressure signals using an entropy measure.

    Xu, Fei; Yan, Guozheng; Zhao, Kai; Lu, Li; Wang, Zhiwu; Gao, Jinyang

    2016-02-01

    Studying the complexity of human colonic pressure signals is important in understanding this intricate, evolved, dynamic system. This article presents a method for quantifying the complexity of colonic pressure signals using an entropy measure. As a self-adaptive non-stationary signal analysis algorithm, empirical mode decomposition can decompose a complex pressure signal into a set of intrinsic mode functions (IMFs). Considering that IMF2, IMF3, and IMF4 represent crucial characteristics of colonic motility, a new signal was reconstructed from these three components. Then, the time entropy (TE), power spectral entropy (PSE), and approximate entropy (AE) of the reconstructed signal were calculated. For subjects with constipation and healthy individuals, experimental results showed that the entropies of the reconstructed signals were distinguishable between the two classes. Moreover, the TE, PSE, and AE can be extracted as features for further subject classification. PMID:26043437
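
    A minimal sketch (parameter choices mine) of two of the entropies named above, approximate entropy and power spectral entropy, applied here to a raw 1-D signal rather than to the EMD-reconstructed one.

```python
# Approximate entropy (AE) and power spectral entropy (PSE) of a 1-D signal.
import numpy as np

def approximate_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r    # common tolerance choice
    def phi(m):
        emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        # Chebyshev distance between all pairs of embedded vectors
        d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        c = (d <= r).mean(axis=1)            # fraction of near neighbours
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

def power_spectral_entropy(x):
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(2)
print(approximate_entropy(rng.normal(size=400)))        # high: irregular
print(approximate_entropy(np.sin(np.arange(400) / 4)))  # low: regular
print(power_spectral_entropy(rng.normal(size=400)))
```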

  8. Multi-attribute integrated measurement of node importance in complex networks

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    The measurement of node importance in complex networks is very important for research on network stability and robustness; it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the obtained results reflect only certain aspects of the network, with a loss of information. Meanwhile, because network topologies differ, node importance should be described by combining the characteristics of the network topology. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, clustering coefficient, and topology potential, and proposes an integrated method to measure node importance. This method reflects nodes' internal and external attributes and eliminates the influence of network structure on node importance. Experiments on the karate club and dolphin social networks show that the integrated topology-based measure has a smaller range of measurement results than a single indicator and is more universal. Experiments also show that attacking the North American power grid and the Internet with this method yields faster convergence than other methods.
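
    A minimal sketch (equal weights and sign convention mine; the paper derives its own combination, including topology potential) of merging several single indicators into one importance score with networkx.

```python
# Multi-attribute node importance: z-score and combine single indicators.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()
deg = nx.degree_centrality(G)
clo = nx.closeness_centrality(G)
clu = nx.clustering(G)

def zscore(d):
    v = np.array(list(d.values()), dtype=float)
    return {k: (d[k] - v.mean()) / v.std() for k in d}

zd, zc, zl = zscore(deg), zscore(clo), zscore(clu)
# Equal-weight combination; high local clustering is treated here as
# redundancy (negative sign) -- a modelling choice, not the paper's.
score = {v: (zd[v] + zc[v] - zl[v]) / 3 for v in G}
top = sorted(score, key=score.get, reverse=True)[:5]
print(top)
```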

  9. Long-lifetime Ru(II) complexes for the measurement of high molecular weight protein hydrodynamics.

    Szmacinski, H; Castellano, F N; Terpetschnig, E; Dattelbaum, J D; Lakowicz, J R; Meyer, G J

    1998-03-01

    We describe the synthesis and characterization of two asymmetrical ruthenium(II) complexes, [Ru(dpp)2(dcbpy)]2+ and [Ru(dpp)2(mcbpy)]2+, as well as the water-soluble sulfonated derivatives [Ru(dpp(SO3Na)2)2(dcbpy)]2+ and [Ru(dpp(SO3Na)2)2(mcbpy)]2+ (dpp is 4,7-diphenyl-1,10-phenanthroline, dcbpy is 4,4'-dicarboxylic acid-2,2'-bipyridine, mcbpy is 4-methyl,4'-carboxylic acid-2,2'-bipyridine, and dpp(SO3Na)2 is the disulfonated derivative of dpp) as probes for the measurement of the rotational motions of proteins. The spectral (absorption, emission, and anisotropy) and photophysical (time-resolved intensity and anisotropy decays) properties of these metal-ligand complexes were determined in solution, in both the presence and absence of human serum albumin (HSA). These complexes display lifetimes ranging from 345 ns to 3.8 microseconds in deoxygenated aqueous solutions under a variety of conditions. The carboxylic acid groups on these complexes were activated to form N-hydroxysuccinimide (NHS) esters, which were used to covalently label HSA, and were characterized spectroscopically in the same manner as above. Time-resolved anisotropy measurements were performed to demonstrate the utility of these complexes in measuring long rotational correlation times of bioconjugates between HSA and antibody to HSA. The potential usefulness of these probes in fluorescence polarization immunoassays was demonstrated by an association assay of the Ru(II)-labeled HSA with polyclonal antibody. PMID:9546056

  10. Crater size-frequency distribution measurements and age of the Compton-Belkovich Volcanic Complex

    Shirley, K. A.; Zanetti, M.; Jolliff, B.; van der Bogert, C. H.; Hiesinger, H.

    2016-07-01

    The Compton-Belkovich Volcanic Complex (CBVC) is a 25 × 35 km feature on the lunar farside marked by elevated topography, high albedo, high thorium concentration, and high silica content. Morphologies indicate that the complex is volcanic in origin and compositions indicate that it represents rare silicic volcanism on the Moon. Constraining the timing of silicic volcanism at the complex is necessary to better understand the development of evolved magmas and when they were active on the lunar surface. We employ image analysis and crater size-frequency distribution (CSFD) measurements on several locations within the complex and at surrounding impact craters, Hayn (87 km diameter), and Compton (160 km diameter), to determine relative and absolute model ages of regional events. Using CSFD measurements, we establish a chronology dating regional resurfacing events and the earliest possible onset of CBVC volcanism at ∼3.8 Ga, the formation of Compton Crater at 3.6 Ga, likely resurfacing by volcanism at the CBVC at ∼3.5 Ga, and the formation of Hayn Crater at ∼1 Ga. For the CBVC, we find the most consistent results are obtained using craters larger than 300 m in diameter; the small crater population is affected by their approach to an equilibrium condition and by the physical properties of regolith at the CBVC.

  11. Direct measurement and modulation of single-molecule coordinative bonding forces in a transition metal complex

    Hao, Xian; Zhu, Nan; Gschneidtner, Tina;

    2013-01-01

    Coordination chemistry has been a consistently active branch of chemistry since Werner's seminal theory of coordination compounds, inaugurated in 1893, with a central focus on transition metal complexes. However, control and measurement of metal-ligand interactions at the single-molecule level remain a daunting challenge. Here we demonstrate an interdisciplinary and systematic approach that enables measurement and modulation of the coordinative bonding forces in a transition metal complex. Terpyridine is derivatized with a thiol linker, facilitating covalent attachment of this ligand on both gold... significant impact on the metal-ligand interactions. The present approach represents a major advancement in unravelling the nature of metal-ligand interactions and could have broad implications in coordination chemistry.

  12. Systematic Study of Information Measures, Statistical Complexity and Atomic Structure Properties

    Chatzisavvas, K. Ch.; Tserkis, S. T.; Panos, C. P.; Moustakidis, Ch. C.

    2015-05-01

    We present a comparative study of several information and statistical complexity measures in order to examine a possible correlation with certain experimental properties of atomic structure. Comparisons are also carried out quantitatively using the Pearson correlation coefficient. In particular, it is shown that Fisher information in momentum space is very sensitive to shell effects. It is also seen that three measures expressed in momentum space, namely Fisher information, the Fisher-Shannon plane, and LMC complexity, are associated with atomic radius, ionization energy, electronegativity, and atomic dipole polarizability. Our results indicate that a momentum space treatment of atomic periodicity is superior to a position space one. Finally, we present a relation that emerges between Fisher information and the second moment of the probability distribution in momentum space, i.e., an energy functional of interest in (e,2e) experiments.
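
    For concreteness, the LMC measure mentioned above is the product of a normalised Shannon entropy and the disequilibrium, the distance from the uniform distribution (sketch mine):

```python
# LMC statistical complexity C = H * D for a discrete distribution.
import numpy as np

def lmc_complexity(p):
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    n = len(p)
    h = -np.sum(p[p > 0] * np.log(p[p > 0])) / np.log(n)  # normalised entropy
    d = np.sum((p - 1.0 / n) ** 2)                        # disequilibrium
    return h * d

print(lmc_complexity([0.25, 0.25, 0.25, 0.25]))  # uniform: D = 0 -> C = 0
print(lmc_complexity([1.0, 0.0, 0.0, 0.0]))      # fully ordered: H = 0 -> C = 0
print(lmc_complexity([0.6, 0.2, 0.1, 0.1]))      # intermediate: C > 0
```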

  13. Wettability of reservoir rock and fluid systems from complex resistivity measurements

    Moss, A.K.; Jing, X.D.; Archer, J.S. [Department of Earth Science and Engineering, Imperial College of Science, Technology and Medicine, London (United Kingdom)

    2002-04-01

    Electrical resistivity measurements at a single low AC frequency have long been recognized as providing an indication of the wettability of reservoir rock and fluid systems. However, the resistivity response over a range of frequencies for samples of varying wettability is not so well characterized. Data are presented from reservoir core plugs of differing lithologies, permeabilities, and wettabilities. The complex resistivity response at differing saturations and wettability was measured. This research group has been investigating relationships between complex resistivity, permeability, and clay content, described in previous research papers. This study extends this work to include wettability. Electrical resistivity measurements in the low-frequency range (10 Hz-10 kHz) include an electrode polarization effect. At frequencies between 10 and 200 kHz, the electrode polarization effect is reduced and the bulk sample response is measured. An Argand diagram analysis is employed to find the critical frequency (f_c) separating the electrode polarization from the bulk sample response. Samples are tested in a multi-sample rig at hydrostatic reservoir overburden stresses. The test equipment allows the measurement of resistivity in the two or four electrode configurations over a frequency range from 10 Hz to 1 MHz during drainage and imbibition cycles. Multi-electrodes down the sample length allow saturation monitoring and thus the detection of any saturation inhomogeneity throughout the samples. Sample wettability is evaluated using the Amott-Harvey wettability index (AHWI) on adjacent samples and change in Archie Saturation exponent before and after aging in crude oil. The effect of frequency dispersion was analysed in relation to pore-scale fluid distribution and, hence, wettability. The results suggest complex resistivity measurements have the potential as a non-invasive technique to evaluate reservoir wettability.

  14. An Assessment of Wind Plant Complex Flows Using Advanced Doppler Radar Measurements

    Gunter, W. S.; Schroeder, J.; Hirth, B.; Duncan, J.; Guynes, J.

    2015-12-01

    As installed wind energy capacity continues to steadily increase, the need for comprehensive measurements of wind plant complex flows to further reduce the cost of wind energy has been well advertised by the industry as a whole. Such measurements serve diverse perspectives including resource assessment, turbine inflow and power curve validation, wake and wind plant layout model verification, operations and maintenance, and the development of future advanced wind plant control schemes. While various measurement devices have been matured for wind energy applications (e.g. meteorological towers, LIDAR, SODAR), this presentation will focus on the use of advanced Doppler radar systems to observe the complex wind flows within and surrounding wind plants. Advanced Doppler radars can provide the combined advantage of a large analysis footprint (tens of square kilometers) with rapid data analysis updates (a few seconds to one minute) using both single- and dual-Doppler data collection methods. This presentation demonstrates the utility of measurements collected by the Texas Tech University Ka-band (TTUKa) radars to identify complex wind flows occurring within and nearby operational wind plants, and provide reliable forecasts of wind speeds and directions at given locations (i.e. turbine or instrumented tower sites) 45+ seconds in advance. Radar-derived wind maps reveal commonly observed features such as turbine wakes and turbine-to-turbine interaction, high momentum wind speed channels between turbine wakes, turbine array edge effects, transient boundary layer flow structures (such as wind streaks, frontal boundaries, etc.), and the impact of local terrain. Operational turbine or instrumented tower data are merged with the radar analysis to link the observed complex flow features to turbine and wind plant performance.

  15. An entropy-based measure of hydrologic complexity and its applications

    Castillo, Aldrich; Castelli, Fabio; Entekhabi, Dara

    2015-01-01

    Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope‐scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy‐based and discretization‐invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (most organization) and a uniform distribution (widest...

  16. Shock tunnel free flight force measurements using a complex model configuration

    Hannemann, Klaus; Martinez Schramm, Jan; Laurence, Stuart; Karl, Sebastian

    2015-01-01

    The free flight force measurement technique is a very attractive tool to determine forces and moments in particular in short duration ground based test facilities. With test times in the order of a few milliseconds, conventional force balances cannot be applied here. The technique has been applied in a number of shock tunnels utilizing models up to approximately 300 mm in length and looking at external aerodynamics. In the present study the technique is applied using a complex 1.5 m l...

  17. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Szi-Wen Chen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For ...
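
    A minimal sketch of the sequential-testing ingredient (a plain Wald SPRT on a hypothetical Gaussian complexity score; the paper's detector, features, and thresholds differ):

```python
# Wald sequential probability ratio test between two Gaussian hypotheses.
import math
import numpy as np

def sprt(stream, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    a = math.log(beta / (1 - alpha))    # lower threshold: accept H0
    b = math.log((1 - beta) / alpha)    # upper threshold: accept H1
    llr = 0.0
    for n, x in enumerate(stream, start=1):
        # log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma)
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "undecided", n

rng = np.random.default_rng(3)
vf_scores = rng.normal(0.8, 0.1, size=200)   # hypothetical VF-like scores
print(sprt(vf_scores, mu0=0.5, mu1=0.8, sigma=0.1))  # decides H1 quickly
```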

  18. Instrumentation measurement and testing complex for detection and identification of radioactive materials using the emitted radiation

    Simultaneous measurement of neutron and gamma radiation is a very useful method for effective identification and control of nuclear materials. The gamma-ray-neutron complex described in the paper is based on two multi-layer ³He neutron detectors and two high-pressure xenon gamma-ray spectrometers assembled in one unit. All of these detectors were calibrated with neutron and gamma-ray sources. The main characteristics of the instrumentation, its testing results, and the measured gamma-ray and neutron radiation parameters are presented in the paper. Reliable detection and identification of gamma-neutron sources and fissile materials was demonstrated

  19. RNACompress: Grammar-based compression and informational complexity measurement of RNA secondary structure

    Chen Chun

    2008-03-01

    Background: With the rapid emergence of RNA databases and newly identified non-coding RNAs, an efficient compression algorithm for RNA sequence and structural information is needed for the storage and analysis of such data. Although several algorithms for compressing DNA sequences have been proposed, none of them are suitable for the compression of RNA sequences together with their secondary structures. This kind of compression not only facilitates the maintenance of RNA data, but also supplies a novel way to measure the informational complexity of RNA structural data, raising the possibility of studying the relationship between the functional activities of RNA structures and their complexities, as well as various structural properties of RNA based on compression. Results: RNACompress employs an efficient grammar-based model to compress RNA sequences and their secondary structures. The main goals of this algorithm are two-fold: (1) present a robust and effective way for RNA structural data compression; (2) design a suitable model to represent RNA secondary structure as well as derive the informational complexity of the structural data based on compression. Our extensive tests have shown that RNACompress achieves a universally better compression ratio compared with other sequence-specific or common text-specific compression algorithms, such as GenCompress, WinRAR and gzip. Moreover, a test of the activities of distinct GTP-binding RNAs (aptamers) compared with their structural complexity shows that our defined informational complexity can be used to describe how complexity varies with activity. These results lead to an objective means of comparing the functional properties of heteropolymers from the information perspective. Conclusion: A universal algorithm for the compression of RNA secondary structure as well as the evaluation of its informational complexity is discussed in this paper. We have developed RNACompress, as a useful tool
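
    Although RNACompress itself is grammar-based, the underlying idea that informational complexity can be read off compressed size is easy to illustrate with a general-purpose compressor (illustration mine):

```python
# Compression-based complexity of a sequence paired with its dot-bracket
# secondary structure: compressed bytes per input byte.
import random
import zlib

def compression_complexity(text):
    """Higher values = less compressible = more complex."""
    raw = text.encode("ascii")
    return len(zlib.compress(raw, 9)) / len(raw)

regular = ("GGGAAACUCCC" + "((((...))))") * 40   # repetitive sequence+structure
chars = list(regular)
random.seed(0)
random.shuffle(chars)                            # same symbols, structure destroyed
print(compression_complexity(regular))           # small: highly compressible
print(compression_complexity("".join(chars)))    # larger: less compressible
```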

  20. Determining complex permittivity from propagation constant measurements with planar transmission lines

    A new two-standard calibration procedure is outlined for determining the complex permittivity of materials from the propagation constant measured with planar transmission lines. Once calibrated, a closed-form expression for the material permittivity is obtained. The effects of radiation and conductor losses are accounted for in the calibration. The multiline technique, combined with a recently proposed planar transmission-line configuration, is used to determine the line propagation constant. An uncertainty analysis is presented for the proposed calibration procedure that includes the uncertainties associated with the multiline technique. This allows line dimensions and calibration standards to be selected that minimize the total measurement uncertainty. The use of air and distilled water as calibration standards gives relatively small measurement uncertainty. Permittivity measurement results for five liquids, covering a wide permittivity range, agree very closely with expected values from 0.5–5 GHz. (paper)
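
    For an idealised lossless TEM line (a simplification; the paper's calibrated closed form also accounts for radiation and conductor losses), the permittivity follows from the measured propagation constant as ε_eff = −(cγ/ω)²:

```python
# Complex permittivity from a propagation constant, ideal TEM assumption.
import numpy as np

c = 299792458.0  # speed of light, m/s

def eps_from_gamma(gamma, freq_hz):
    omega = 2 * np.pi * freq_hz
    return -(c * gamma / omega) ** 2

# Distilled water near 1 GHz: eps ~ 78 - 5j (illustrative value).
f = 1e9
eps_true = 78 - 5j
gamma = 1j * 2 * np.pi * f * np.sqrt(eps_true) / c  # forward model
print(eps_from_gamma(gamma, f))                     # recovers 78 - 5j
```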

  1. MEASURING OBJECT-ORIENTED SYSTEMS BASED ON THE EXPERIMENTAL ANALYSIS OF THE COMPLEXITY METRICS

    J.S.V.R.S.SASTRY,

    2011-05-01

    Metrics are used to help a software engineer assess, through quantitative analysis, the quality of a design before a system is built. The focus of object-oriented metrics is on the class, which is the fundamental building block of the object-oriented architecture. These metrics are focused on internal object structure and external object structure. Internal object structure reflects the complexity of each individual entity, such as methods and classes. External complexity measures the interaction among entities, such as coupling and inheritance. This paper mainly focuses on a set of object-oriented metrics that can be used to measure the quality of an object-oriented design. Two types of complexity metrics in the object-oriented paradigm are considered, namely MOOD metrics and Lorenz & Kidd metrics. MOOD metrics consist of Method Inheritance Factor (MIF), Coupling Factor (CF), Attribute Inheritance Factor (AIF), Method Hiding Factor (MHF), Attribute Hiding Factor (AHF), and Polymorphism Factor (PF). Lorenz & Kidd metrics consist of Number of Operations Overridden (NOO), Number of Operations Added (NOA), and Specialization Index (SI). MOOD and Lorenz & Kidd measurements are used mainly by designers and testers. Designers use these metrics to assess the software early in the process, making changes that will reduce complexity and improve the continuing capability of the design. Testers use them to test the software for finding the complexity, performance, and quality of the system. This paper reviews MOOD metrics and Lorenz & Kidd metrics, which are validated theoretically and empirically. In this paper, work has been done to explore the quality of design of software components using the object-oriented paradigm. A number of object-oriented metrics have been proposed in the literature for measuring design attributes such as inheritance, coupling, and polymorphism. These metrics have been used to analyze various features of software components. Complexity of methods

  2. Acid water problem in Bukit Asam Coal Mine, South Sumatra, Indonesia

    Gautama, R.S. [Institut Teknologi Bandung, Bandung (Indonesia). Dept. of Mining Engineering

    1994-09-01

    With an average annual rainfall of more than 2800 mm, runoff water is considered the main water problem faced by the Bukit Asam Coal Mine. Groundwater presents only a minor problem, as the relevant aquifer consists of sandstone with low hydraulic conductivity, i.e. less than 10⁻⁷ m/s. Water quality monitoring, done periodically as part of an environmental monitoring program, has detected water with low pH. The problem is significant as it relates to a large amount of acid water in an abandoned pit in East Klawas, which could be discharged to the nearby Enim River. The acid water problem in the Bukit Asam mine is discussed. Since this problem has never been encountered before, the present analyses of the acid-generating process are based on very limited data. Further research is necessary and is being conducted; at the same time, an appropriate method to handle the problem has to be developed to support an environmentally sound mining operation. 6 refs., 4 figs., 2 tabs.

  3. A Statistical Framework to Infer Delay and Direction of Information Flow from Measurements of Complex Systems.

    Schumacher, Johannes; Wunderle, Thomas; Fries, Pascal; Jäkel, Frank; Pipa, Gordon

    2015-08-01

    In neuroscience, data are typically generated from neural network activity. The resulting time series represent measurements from spatially distributed subsystems with complex interactions, weakly coupled to a high-dimensional global system. We present a statistical framework to estimate the direction of information flow and its delay in measurements from systems of this type. Informed by differential topology, Gaussian process regression is employed to reconstruct measurements of putative driving systems from measurements of the driven systems. These reconstructions serve to estimate the delay of the interaction by means of an analytical criterion developed for this purpose. The model accounts for a range of possible sources of uncertainty, including temporally evolving intrinsic noise, while assuming complex nonlinear dependencies. Furthermore, we show that if information flow is delayed, this approach also allows for inference in strong coupling scenarios of systems exhibiting synchronization phenomena. The validity of the method is demonstrated with a variety of delay-coupled chaotic oscillators. In addition, we show that these results seamlessly transfer to local field potentials in cat visual cortex. PMID:26079751
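
    A much-simplified sketch of the core idea (toy AR(1) driver, scalar features, and in-sample scoring mine; the paper uses delay embeddings and an analytical delay criterion): reconstruct the putative driver from the driven signal at candidate delays and take the best-reconstructing delay as the estimate.

```python
# Delay estimation by GP reconstruction of the driver from the driven signal.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
n, d_true = 300, 7
x = np.zeros(n)
for t in range(1, n):                   # slowly varying AR(1) driver
    x[t] = 0.9 * x[t - 1] + rng.normal(scale=0.3)
y = np.zeros(n)                         # driven signal, delayed and nonlinear
y[d_true:] = np.tanh(2 * x[:-d_true]) + 0.05 * rng.normal(size=n - d_true)

scores = {}
for d in range(15):                     # candidate interaction delays
    t = np.arange(20, n)                # common support for all candidates
    feats = y[t].reshape(-1, 1)
    target = x[t - d]                   # driver value the delay d would imply
    gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-2),
                                  normalize_y=True)
    scores[d] = gp.fit(feats, target).score(feats, target)  # R^2
print(max(scores, key=scores.get))      # ~7, the true delay
```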

  4. A Measure for Brain Complexity: Relating Functional Segregation and Integration in the Nervous System

    Tononi, Giulio; Sporns, Olaf; Edelman, Gerald M.

    1994-05-01

    In brains of higher vertebrates, the functional segregation of local areas that differ in their anatomy and physiology contrasts sharply with their global integration during perception and behavior. In this paper, we introduce a measure, called neural complexity (C_N), that captures the interplay between these two fundamental aspects of brain organization. We express functional segregation within a neural system in terms of the relative statistical independence of small subsets of the system and functional integration in terms of significant deviations from independence of large subsets. C_N is then obtained from estimates of the average deviation from statistical independence for subsets of increasing size. C_N is shown to be high when functional segregation coexists with integration and to be low when the components of a system are either completely independent (segregated) or completely dependent (integrated). We apply this complexity measure in computer simulations of cortical areas to examine how some basic principles of neuroanatomical organization constrain brain dynamics. We show that the connectivity patterns of the cerebral cortex, such as a high density of connections, strong local connectivity organizing cells into neuronal groups, patchiness in the connectivity among neuronal groups, and prevalent reciprocal connections, are associated with high values of C_N. The approach outlined here may prove useful in analyzing complexity in other biological domains such as gene regulation and embryogenesis.
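
    For small Gaussian (linear) systems, C_N can be evaluated exactly from a covariance matrix, using the paper's definition C_N = Σ_k [(k/n) I(X) − ⟨I(X_k)⟩], where the integration I of a subset is its multi-information (sketch mine):

```python
# Neural complexity C_N of a Gaussian system from its covariance matrix.
import itertools
import numpy as np

def integration(cov, idx):
    """Multi-information (in nats) of the Gaussian variables in idx."""
    sub = cov[np.ix_(idx, idx)]
    return 0.5 * (np.sum(np.log(np.diag(sub))) - np.linalg.slogdet(sub)[1])

def neural_complexity(cov):
    n = cov.shape[0]
    total = integration(cov, list(range(n)))
    c = 0.0
    for k in range(1, n + 1):           # average over all subsets of size k
        subsets = itertools.combinations(range(n), k)
        avg = np.mean([integration(cov, list(s)) for s in subsets])
        c += (k / n) * total - avg
    return c

# Toy covariance: two tightly coupled pairs, weakly coupled to each other,
# i.e. segregation (strong local structure) plus integration (global coupling).
cov = np.array([[1.0, 0.8, 0.2, 0.2],
                [0.8, 1.0, 0.2, 0.2],
                [0.2, 0.2, 1.0, 0.8],
                [0.2, 0.2, 0.8, 1.0]])
print(neural_complexity(cov))           # > 0: segregation plus integration
print(neural_complexity(np.eye(4)))     # 0: fully independent components
```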

  5. Growing complex network of citations of scientific papers -- measurements and modeling

    Golosovsky, M

    2016-01-01

    To quantify the mechanism of a complex network growth we focus on the network of citations of scientific papers and use a combination of the theoretical and experimental tools to uncover microscopic details of this network growth. Namely, we develop a stochastic model of citation dynamics based on copying/redirection/triadic closure mechanism. In a complementary and coherent way, the model accounts both for statistics of references of scientific papers and for their citation dynamics. Originating in empirical measurements, the model is cast in such a way that it can be verified quantitatively in every aspect. Such verification is performed by measuring citation dynamics of Physics papers. The measurements revealed nonlinear citation dynamics, the nonlinearity being intricately related to network topology. The nonlinearity has far-reaching consequences including non-stationary citation distributions, diverging citation trajectory of similar papers, runaways or "immortal papers" with infinite citation lifetime ...

  6. Simulation of complex glazing products; from optical data measurements to model based predictive controls

    Kohler, Christian

    2012-08-01

    Complex glazing systems such as venetian blinds, fritted glass and woven shades require more detailed optical and thermal input data for their components than specular non light-redirecting glazing systems. Various methods for measuring these data sets are described in this paper. These data sets are used in multiple simulation tools to model the thermal and optical properties of complex glazing systems. The output from these tools can be used to generate simplified rating values or as an input to other simulation tools such as whole building annual energy programs, or lighting analysis tools. I also describe some of the challenges of creating a rating system for these products and which factors affect this rating. A potential future direction of simulation and building operations is model based predictive controls, where detailed computer models are run in real-time, receiving data for an actual building and providing control input to building elements such as shades.

  7. Measuring working memory in aphasia: Comparing performance on complex span and N-back tasks

    Maria Ivanova

    2014-04-01

    No significant correlations were observed between performance on the complex span task and the N-back tasks. Furthermore, performance on the modified listening span was related to performance on the comprehension subtest of the QASA, while no relationship was found for the 2-back and 0-back tasks. Our results mirror studies in healthy controls that demonstrated no relationship between performance on the two tasks (Jaeggi et al., 2010; Kane et al., 2007). Thus, although N-back tasks seem similar to traditional complex span measures and may also index abilities related to cognitive processing, the evidence to date does not warrant their direct association with the construct of WM. Implications for future investigation of cognitive deficits in aphasia will be discussed.

  8. Estimation of Defect proneness Using Design complexity Measurements in Object- Oriented Software

    Selvarani, R; Prasad, V Kamakshi

    2010-01-01

    Software engineering is continuously facing the challenges of the growing complexity of software packages and increased levels of data on defects and drawbacks from the software production process. This makes a clarion call for inventions and methods which can enable more reusable, reliable, easily maintainable, and high-quality software systems with deeper control of the software generation process. Quality and productivity are indeed the two most important parameters for controlling any industrial process. Implementation of a successful control system requires some means of measurement. Software metrics play an important role in the management aspects of the software development process, such as better planning, assessment of improvements, resource allocation, and reduction of unpredictability. Processes involving early detection of potential problems, productivity evaluation, and evaluation of external quality factors such as reusability, maintainability, defect proneness, and complexity are of utmost importance. Here we d...

  9. In vivo and in situ measurement and modelling of intra-body effective complex permittivity.

    Nadimi, Esmaeil S; Blanes-Vidal, Victoria; Harslund, Jakob L F; Ramezani, Mohammad H; Kjeldsen, Jens; Johansen, Per Michael; Thiel, David; Tarokh, Vahid

    2015-12-01

    Radio frequency tracking of medical micro-robots in minimally invasive medicine is usually investigated under the assumption that the human body is a homogeneous propagation medium. In this Letter, the authors conducted various trial programs to measure and model the effective complex permittivity ε in terms of refraction ε', absorption ε'' and their variations in gastrointestinal (GI) tract organs (i.e. oesophagus, stomach, small intestine and large intestine) and the porcine abdominal wall under in vivo and in situ conditions. They further investigated the effects of irregular and unsynchronised contractions and simulated peristaltic movements of the GI tract organs inside the abdominal cavity and in the presence of the abdominal wall on the measurements and variations of ε' and ε''. They advanced previous models of the effective complex permittivity of a multilayer inhomogeneous medium by estimating an analytical model that accounts for reflections between the layers and calculates the attenuation that the wave encounters as it traverses the GI tract and the abdominal wall. They observed that deviation from the specified nominal layer thicknesses due to non-geometric boundaries of GI tract morphometric variables has an impact on the performance of their model. Therefore, they derived statistical models for ε' and ε'' based on their experimental measurements. PMID:26713157

  10. Detecting Microbial Growth and Metabolism in Geologic Media with Complex Conductivity Measurements

    Davis, C. A.; Atekwana, E. A.; Slater, L. D.; Bottrell, P. M.; Chasten, L. E.; Heidenreich, J. D.

    2006-05-01

    Complex conductivity measurements between 0.1-1000 Hz were obtained from biostimulated sand-packed (coarse and mixed fine and medium grain) columns to investigate the effects of microbial growth, biofilm formation, and microbial metabolism on the electrical properties of porous media. Microbial growth and metabolism were verified by direct microbial counts, pH changes, and environmental scanning electron microscope imaging. Peaks in imaginary (interfacial) conductivity in the coarse grain columns occurred concurrently with peaks in the microbial cell concentrations. The magnitude of the imaginary conductivity response in the mixed fine and medium grain columns, however, was low compared to the coarse grain sand columns, consistent with lower microbial cell concentrations. It is possible that the pore size in the mixed fine and medium grain sand restricted bacterial cell division, inhibiting microbial growth and thus producing the smaller-magnitude imaginary conductivity response. The biostimulated columns for both grain sizes displayed similar trends and showed an increase in the real (electrolytic) conductivity and a decrease in pH over time. Dynamic changes in the imaginary conductivity arise from the growth and attachment of microbial cells and biofilms to surfaces, whereas changes in the real conductivity arise from the release of byproducts (ionic species) of microbial metabolism. We conclude that complex conductivity techniques are feasible sensors for detecting microbial growth (imaginary conductivity measurements) and metabolism (real conductivity measurements), with implications for bioremediation and astrobiology studies.
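
    The decomposition into real (electrolytic) and imaginary (interfacial) parts that carries the interpretation above is a one-line computation once magnitude and phase are measured. A minimal sketch in Python; the function name and the milliradian phase convention are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def complex_conductivity(magnitude, phase_mrad):
    """Split a measured complex conductivity into real (electrolytic) and
    imaginary (interfacial) parts. Assumes magnitude in S/m and phase in
    milliradians (a common spectral induced polarization convention)."""
    phi = np.asarray(phase_mrad, dtype=float) * 1e-3   # mrad -> rad
    sigma = np.asarray(magnitude, dtype=float) * np.exp(1j * phi)
    return sigma.real, sigma.imag

# Example: the imaginary part grows with the interfacial (polarization)
# response while the real part tracks the electrolytic conduction.
real, imag = complex_conductivity([0.020, 0.021], [5.0, 8.0])
print(real, imag)
```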

  11. Complex permittivity measurement at millimetre-wave frequencies during the fermentation process of Japanese sake

    Kouzai, Masaki [Tokyo Institute of Technology, Meguro, Tokyo 152-8552 (Japan); Nishikata, Atsuhiro [Tokyo Institute of Technology, Meguro, Tokyo 152-8552 (Japan); Fukunaga, Kaori [NICT, Koganei, Tokyo 184-8795 (Japan); Miyaoka, Shunsuke [Industrial Research Centre of Ehime, Matsuyama, Ehime 791-1101 (Japan)

    2007-01-07

    Various chemical reactions occur simultaneously in barrels during the fermentation processes of alcoholic beverages. Chemical analyses are employed to monitor the change in chemical components, such as glucose and ethyl alcohol. The tests are carried out with extracted specimens, are costly and require time. We have developed a permittivity measurement system for liquid specimens in the frequency range from 2.6 to 50 GHz, and applied the system to fermentation monitoring. Experimental results proved that the observed change in complex permittivity suggests a decrease in the amount of glucose and an increase in alcohol content, which are the key chemical components during the fermentation process.

  12. Complex permittivity measurement at millimetre-wave frequencies during the fermentation process of Japanese sake

    Various chemical reactions occur simultaneously in barrels during the fermentation processes of alcoholic beverages. Chemical analyses are employed to monitor the change in chemical components, such as glucose and ethyl alcohol. The tests are carried out with extracted specimens, are costly and require time. We have developed a permittivity measurement system for liquid specimens in the frequency range from 2.6 to 50 GHz, and applied the system to fermentation monitoring. Experimental results proved that the observed change in complex permittivity suggests a decrease in the amount of glucose and an increase in alcohol content, which are the key chemical components during the fermentation process.

  13. Relating Hyperspectral Airborne Data to Ground Measurements in a Complex and Discontinuous Canopy

    Calleja Javier F.

    2015-12-01

    The work described in this paper is aimed at validating hyperspectral airborne reflectance data collected during the Regional Experiments For Land-atmosphere EXchanges (REFLEX) campaign. Ground reflectance data measured in a vineyard were compared with airborne reflectance data. A sampling strategy and subsequent ground data processing had to be devised so as to capture a representative spectral sample of this complex crop. A linear model between airborne and ground data was tried and statistically tested. Results reveal a sound correspondence between ground and airborne reflectance data (R² > 0.97), validating the atmospheric correction of the latter.

  14. A Thorax Simulator for Complex Dynamic Bioimpedance Measurements With Textile Electrodes.

    Ulbrich, Mark; Muhlsteff, Jens; Teichmann, Daniel; Leonhardt, Steffen; Walter, Marian

    2015-06-01

    Bioimpedance measurements on the human thorax are suitable for assessment of body composition or hemodynamic parameters, such as stroke volume; they are non-invasive, easy to apply and inexpensive. When targeting personal healthcare scenarios, the technology can be integrated into textiles to increase ease, comfort and coverage of measurements. Bioimpedance is generally measured using two electrodes injecting low alternating currents (0.5-10 mA) and two additional electrodes to measure the corresponding voltage drop. The impedance is measured either spectroscopically (bioimpedance spectroscopy, BIS) between 5 kHz and 1 MHz or continuously at a fixed frequency around 100 kHz (impedance cardiography, ICG). A thorax simulator is being developed for testing and calibration of bioimpedance devices and other new developments. For the first time, it is possible to mimic the complete time-variant properties of the thorax during an impedance measurement. This includes the dynamic real part and dynamic imaginary part of the impedance with a peak-to-peak value of 0.2 Ω and an adjustable base impedance (24.6 Ω ≤ Z0 ≤ 51.6 Ω). Another novelty is adjustable complex electrode-skin contact impedances for up to 8 electrodes to evaluate bioimpedance devices in combination with textile electrodes. In addition, an electrocardiographic signal, as used in ICG devices, is provided for cardiographic measurements. This provides the possibility to generate physiologic impedance changes, and in combination with an ECG, all parameters of interest such as stroke volume (SV), pre-ejection period (PEP) or extracellular resistance (Re) can be simulated. The speed of all dynamic signals can be altered. The simulator was successfully tested with commercially available BIS and ICG devices and the preset signals are measured with high correlation (r = 0.996). PMID:25148671
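
    For context, tissue bioimpedance spectra over the BIS range (5 kHz-1 MHz) are conventionally parameterized by the Cole model; the paper does not state that the simulator implements this particular model, so it is quoted here only as the standard reference form:

```latex
% Cole model of tissue impedance (standard BIS parameterization):
\[
  Z(\omega) \;=\; R_\infty \;+\; \frac{R_0 - R_\infty}{1 + (j\omega\tau)^{\alpha}}
\]
% R_0: low-frequency (extracellular) resistance, R_inf: high-frequency
% resistance, tau: characteristic time constant, 0 < alpha <= 1.
```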

  15. Selective extraction of metals from products of mine acidic water treatment

    A study was made of the possibility of processing foam products prepared during flotation purification of mine acidic waters for the purpose of selective extraction of non-ferrous (Co, Ni) and rare earth elements (REE) and their separation from the basic macrocomponent of these waters, iron. Optimal conditions of selective metal extraction from foam flotation products are the following: T=333 K, pH=3.0-3.5, ratio of solid to liquid phase of 1:4-1:7, duration of sulfuric acid leaching of 30 min. Rare earth extraction under such conditions equals 87.6-93.0%. The degree of valuable component concentration equals ∼10. Rare earths are separated from iron by extraction methods.

  16. Fine-grained permutation entropy as a measure of natural complexity for time series

    Liu Xiao-Feng; Wang Yue

    2009-01-01

    In a recent paper [2002 Phys. Rev. Lett. 88 174102], Bandt and Pompe propose permutation entropy (PE) as a natural complexity measure for arbitrary time series which may be stationary or nonstationary, deterministic or stochastic. Their method is based on a comparison of neighbouring values. This paper further develops PE, and proposes the concept of fine-grained PE (FGPE) defined by the order pattern and magnitude of the difference between neighbouring values. This measure excludes the case where vectors with a distinct appearance are mistakenly mapped onto the same permutation type, and consequently FGPE becomes more sensitive to the dynamical change of time series than does PE, according to our simulation and experimental results.
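
    The order-pattern counting underlying both PE and FGPE is compact to implement. Below is a minimal sketch of plain Bandt-Pompe PE in Python (FGPE's additional magnitude-of-difference term is omitted; names and defaults are illustrative):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series.
    Counts ordinal patterns of embedded windows; FGPE would further
    split patterns by the magnitude of neighbouring differences."""
    x = np.asarray(x, dtype=float)
    counts = {}
    for i in range(len(x) - (order - 1) * delay):
        pattern = tuple(np.argsort(x[i:i + order * delay:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)) / np.log(factorial(order)))

rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(1000)))         # near 1: random
print(permutation_entropy(np.sin(np.linspace(0, 60, 1000))))  # well below 1
```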

  17. Fine-grained permutation entropy as a measure of natural complexity for time series

    In a recent paper [2002 Phys. Rev. Lett. 88 174102], Bandt and Pompe propose permutation entropy (PE) as a natural complexity measure for arbitrary time series which may be stationary or nonstationary, deterministic or stochastic. Their method is based on a comparison of neighbouring values. This paper further develops PE, and proposes the concept of fine-grained PE (FGPE) defined by the order pattern and magnitude of the difference between neighbouring values. This measure excludes the case where vectors with a distinct appearance are mistakenly mapped onto the same permutation type, and consequently FGPE becomes more sensitive to the dynamical change of time series than does PE, according to our simulation and experimental results. (general)

  18. A new closeness centrality measure via effective distance in complex networks

    Du, Yuxian; Gao, Cai; Chen, Xin; Hu, Yong; Sadiq, Rehan; Deng, Yong

    2015-03-01

    Closeness centrality (CC) measure, as a well-known global measure, is widely applied in many complex networks. However, the classical CC presents many problems for flow networks since these networks are directed and weighted. To address these issues, we propose an effective distance based closeness centrality (EDCC), which uses effective distance to replace conventional geographic distance and binary distance obtained by Dijkstra's shortest path algorithm. The proposed EDCC considers not only the global structure of the network but also the local information of nodes. And it can be well applied in directed or undirected, weighted or unweighted networks. Susceptible-Infected model is utilized to evaluate the performance by using the spreading rate and the number of infected nodes. Numerical examples simulated on four real networks are given to show the effectiveness of the proposed EDCC.

  19. Rb-Sr measurements on metamorphic rocks from the Barro Alto Complex, Goias, Brazil

    The Barro Alto Complex comprises a highly deformed and metamorphosed association of plutonic, volcanic, and sedimentary rocks exposed in a 150 x 25 Km boomerang-like strip in Central Goias, Brazil. It is the southernmost tip of an extensive yet discontinuous belt of granulite and amphibolite facies metamorphic rocks which include the Niquelandia and Cana Brava complexes to the north. Two rock associations are distinguished within the granulite belt. The first one comprises a sequence of fine-grained mafic granulite, hypersthene-quartz-feldspar granulite, garnet quartzite, sillimanite-garnet-cordierite gneiss, calc-silicate rock, and magnetite-rich iron formation. The second association comprises medium-to coarse-grained mafic rocks. The medium-grade rocks of the western/northern portion (Barro Alto Complex) comprise both layered mafic rocks and a volcanic-sedimentary sequence, deformed and metamorphosed under amphibolite facies conditions. The fine-grained amphibolite form the basal part of the Juscelandia meta volcanic-sedimentary sequence. A geochronologic investigation by the Rb-Sr method has been carried out mainly on felsic rocks from the granulite belt and gneisses of the Juscelandia sequence. The analytical results for the Juscelandia sequence are presented. Isotope results for rocks from different outcrops along the gneiss layer near Juscelandia are also presented. In conclusion, Rb-Sr isotope measurements suggest that the Barro Alto rocks have undergone at least one important metamorphic event during Middle Proterozoic times, around 1300 Ma ago. During that event volcanic and sedimentary rocks of the Juscelandia sequence, as well as the underlying gabbro-anorthosite layered complex, underwent deformation and recrystallization under amphibolite facies conditions. (author)

  20. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Chen Szi-Wen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.

  1. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Chen, Szi-Wen

    2006-12-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.

  2. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Szi-Wen Chen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.
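
    The complexity measure (CM) in this family of VF/VT detectors is typically a Lempel-Ziv-style phrase count of a binarized ECG window. The sketch below shows that flavor of CM (a simplified incremental parse with the usual n/log2(n) normalization); it is illustrative and not necessarily the paper's exact measure:

```python
import numpy as np

def lz_phrase_count(bits):
    """Count distinct phrases in a left-to-right incremental parse of a
    binary sequence (a simplified Lempel-Ziv complexity)."""
    s = "".join(str(int(b)) for b in bits)
    phrases, i, n = set(), 0, len(s)
    while i < n:
        j = i + 1
        while j <= n and s[i:j] in phrases:
            j += 1
        phrases.add(s[i:j])
        i = j
    return len(phrases)

def ecg_complexity(window):
    """Binarize a windowed ECG segment about its mean, then return the
    normalized complexity; thresholding details vary between papers."""
    x = np.asarray(window, dtype=float)
    b = x > x.mean()
    n = len(b)
    return lz_phrase_count(b) / (n / np.log2(n))

rng = np.random.default_rng(1)
print(ecg_complexity(np.sin(np.linspace(0, 50, 1250))))  # low: regular rhythm
print(ecg_complexity(rng.standard_normal(1250)))         # high: disorganized
```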

  3. [Sample pretreatment for the measurement of phthalate esters in complex matrices].

    Liang, Jing; Zhuang, Wan'e; Lin, Fang; Yao, Wensong

    2014-11-01

    Sample pretreatment methods for the measurement of phthalate esters (PAEs) by gas chromatography-mass spectrometry (GC-MS) in various complex matrices, including sediment, soil, suspended particle matter, urban surface dust, Sinonovacula constricta, cosmetic, leather, plastic and coastal/estuarine seawater, were proposed. The pretreatment appropriate for GC-MS detection focused on the investigation and optimization of operating parameters for the extraction and purification, such as the extraction solvent, the eluent and the adsorbent for solid phase extraction. The results for the various complex matrices showed that methylene chloride was the best solvent for ultrasonic extraction when solid-liquid extraction was used; silica gel was the economical and practical adsorbent for solid-phase extraction cleanup; C18 was the most common adsorbent for preconcentration of PAEs in coastal/estuarine seawater samples; and a mixed solution of n-hexane and ethyl acetate in a suitable proportion was the appropriate SPE eluent. Under the optimized conditions, the spiked recoveries were above 58% and the relative standard deviations (RSDs) were less than 10.5% (n = 6). The detection limits (DL, 3σ) were in the range of 0.3 μg/kg (dibutyl phthalate) to 5.2 μg/kg (diisononyl phthalate) for sediment, and 6 ng/L (dipropyl phthalate) to 67 ng/L (diisodecyl phthalate) for coastal/estuarine seawater. The pretreatment method for various complex matrices is well suited to the measurement of the 16 PAEs by GC-MS. PMID:25764660

  4. A study on the development of a task complexity measure for emergency operating procedures of nuclear power plants

    Park, Jinkyun [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)]. E-mail: kshpjk@kaeri.re.kr; Jung, Wondea [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)

    2007-08-15

    In this study, a measure called task complexity (TACOM) that can quantify the complexity of tasks stipulated in emergency operating procedures of nuclear power plants is developed. The TACOM measure consists of five sub-measures that can cover remarkable complexity factors: (1) amount of information to be managed by operators, (2) logical entanglement due to the logical sequence of the required actions, (3) amount of actions to be accomplished by operators, (4) amount of system knowledge in recognizing the problem space, and (5) amount of cognitive resources in establishing an appropriate decision criterion. The appropriateness of the TACOM measure is investigated by comparing task performance time data with the associated TACOM scores. As a result, it is observed that there is a significant correlation between TACOM scores and task performance time data. Therefore, it is reasonable to expect that the TACOM measure can be used as a meaningful tool to quantify the complexity of tasks.

  5. Measurement of unsteady convection in a complex fenestration using laser interferometry

    Poulad, M.E.; Naylor, D. [Ryerson Univ., Toronto, ON (Canada). Dept. of Mechanical and Industrial Engineering; Oosthuizen, P.H. [Queen' s Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering

    2009-06-15

    Complex fenestration involving windows with between-panes louvered blinds is gaining interest as a means to control solar gains in buildings. However, the heat transfer performance of this type of shading system is not well understood, especially at high Rayleigh numbers. A Mach-Zehnder interferometer was used in this study to measure the unsteady convective heat transfer in a tall enclosure with a between-panes blind that was heated to simulate absorbed solar radiation. Digital cinematography was combined with laser interferometry to make time-averaged measurements of unsteady and turbulent free convective heat transfer. This paper described the procedures used to measure the time-averaged local heat flux. Under strongly turbulent conditions, the average Nusselt number for the enclosure was found to compare well with empirical correlations. A total sampling time of about ten seconds was needed in this experiment to obtain a stationary time-averaged heat flux. The time-averaged heat flux was found to be relatively insensitive to the camera frame rate. The local heat flux was found to be unsteady and periodic. Heating of the blind made the flow more unstable, producing a higher-amplitude heat flux variation than for the unheated blind condition. This paper reported on only a small set of preliminary measurements. This study is being extended to other blind angles and glazing spacings. The next phase will focus on flow visualization studies to characterize the nature of the flow. 8 refs., 2 tabs., 7 figs.

  6. Measuring spatial patterns in floodplains: A step towards understanding the complexity of floodplain ecosystems: Chapter 6

    Murray Scown; Martin Thoms; DeJager, Nathan R.

    2016-01-01

    Floodplains can be viewed as complex adaptive systems (Levin, 1998) because they are comprised of many different biophysical components, such as morphological features, soil groups and vegetation communities, as well as being sites of key biogeochemical processing (Stanford et al., 2005). Interactions and feedbacks among the biophysical components often result in additional phenomena occurring over a range of scales, often in the absence of any controlling factors (sensu Hallet, 1990). This emergence of new biophysical features and rates of processing can lead to alternative stable states which feed back into floodplain adaptive cycles (cf. Hughes, 1997; Stanford et al., 2005). Interactions between different biophysical components, feedbacks, self-emergence and scale are all key properties of complex adaptive systems (Levin, 1998; Phillips, 2003; Murray et al., 2014) and therefore will influence the manner in which we study and view spatial patterns. Measuring the spatial patterns of floodplain biophysical components is a prerequisite to examining and understanding these ecosystems as complex adaptive systems. Elucidating relationships between pattern and process, which are intrinsically linked within floodplains (Ward et al., 2002), is dependent upon an understanding of spatial pattern. This knowledge can help river scientists determine the major drivers, controllers and responses of floodplain structure and function, as well as the consequences of altering those drivers and controllers (Hughes and Cass, 1997; Whited et al., 2007). Interactions and feedbacks between physical, chemical and biological components of floodplain ecosystems create and maintain a structurally diverse and dynamic template (Stanford et al., 2005). This template influences subsequent interactions between components that consequently affect system trajectories within floodplains (sensu Bak et al., 1988). Constructing and evaluating models used to predict floodplain ecosystem responses to

  7. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    J.-C. Raut

    2008-02-01

    A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF), enabling the retrieval of vertical profiles of the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The value averaged over the entire planetary boundary layer (PBL) for the ACRI is close to 1.51(±0.02)–i0.017(±0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of the ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of the ACRI in the atmospheric column enabled the retrieval of vertical profiles of the extinction coefficient in accordance with lidar profile measurements.

  8. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    A synergy between lidar, sun photometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalite de l'air en Ile-de-France (ESQUIF), enabling the retrieval of vertical profiles of the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The value averaged over the entire planetary boundary layer (PBL) for the ACRI is close to 1.51(± 0.02)-i0.017(± 0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of the ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of the ACRI in the atmospheric column enabled the retrieval of vertical profiles of the extinction coefficient in accordance with lidar profile measurements. (authors)

  9. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    J.-C. Raut

    2007-07-01

    A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF), enabling the retrieval of vertical profiles of the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The value averaged over the entire planetary boundary layer (PBL) for the ACRI is close to 1.51(±0.02)–i0.017(±0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of the ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of the ACRI in the atmospheric column enabled the retrieval of vertical profiles of the extinction coefficient in accordance with lidar profile measurements.

  10. Cortical complexity as a measure of age-related brain atrophy.

    Madan, Christopher R; Kensinger, Elizabeth A

    2016-07-01

    The structure of the human brain changes in a variety of ways as we age. While a sizeable literature has examined age-related differences in cortical thickness, and to a lesser degree, gyrification, here we examined differences in cortical complexity, as indexed by fractal dimensionality in a sample of over 400 individuals across the adult lifespan. While prior studies have shown differences in fractal dimensionality between patient populations and age-matched, healthy controls, it is unclear how well this measure would relate to age-related cortical atrophy. Initially computing a single measure for the entire cortical ribbon, i.e., unparcellated gray matter, we found fractal dimensionality to be more sensitive to age-related differences than either cortical thickness or gyrification index. We additionally observed regional differences in age-related atrophy between the three measures, suggesting that they may index distinct differences in cortical structure. We also provide a freely available MATLAB toolbox for calculating fractal dimensionality. PMID:27103141
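
    Fractal dimensionality of a segmented structure is commonly estimated by box counting: cover the volume with boxes of side s, count the occupied boxes N(s), and take the slope of log N(s) versus log(1/s). The generic sketch below illustrates the computation on a binary 3-D array; it is not the authors' MATLAB toolbox:

```python
import numpy as np

def box_count_dimension(volume, sizes=(1, 2, 4, 8, 16)):
    """Box-counting estimate of the fractal dimension of a binary 3-D
    volume: slope of log N(s) against log(1/s)."""
    v = np.asarray(volume, dtype=bool)
    counts = []
    for s in sizes:
        trim = [d - (d % s) for d in v.shape]     # make axes divisible by s
        w = v[:trim[0], :trim[1], :trim[2]]
        blocks = w.reshape(trim[0]//s, s, trim[1]//s, s, trim[2]//s, s)
        counts.append(blocks.any(axis=(1, 3, 5)).sum())  # occupied boxes
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                          np.log(np.asarray(counts, dtype=float)), 1)
    return slope

rng = np.random.default_rng(2)
vol = rng.random((64, 64, 64)) > 0.7   # synthetic occupancy, not cortex
print(box_count_dimension(vol))        # near 3 for dense random filling
```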

  11. Rotational study of the CH4-CO complex: Millimeter-wave measurements and ab initio calculations.

    Surin, L A; Tarabukin, I V; Panfilov, V A; Schlemmer, S; Kalugina, Y N; Faure, A; Rist, C; van der Avoird, A

    2015-10-21

    The rotational spectrum of the van der Waals complex CH4-CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 110-145 GHz. Newly observed and assigned transitions belong to the K = 2-1 subband correlating with the rotationless jCH4 = 0 ground state and the K = 2-1 and K = 0-1 subbands correlating with the jCH4 = 2 excited state of free methane. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the CH4-CO complex. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of CH4-CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations [CCSD(T)-F12a] and an augmented correlation-consistent triple zeta (aVTZ) basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the CH4 face closest to the CO subunit and binding energy De = 177.82 cm⁻¹. The bound rovibrational levels of the CH4-CO complex were calculated for total angular momentum J = 0-6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 91.32, 94.46, and 104.21 cm⁻¹ for A (jCH4 = 0), F (jCH4 = 1), and E (jCH4 = 2) nuclear spin modifications of CH4-CO, respectively. PMID:26493903

  12. Measuring The Influence of TAsk COMplexity on Human Error Probability: An Empirical Evaluation

    Podofillini, Luca; Dang, Vinh N. [Paul Scherrer Institute, Villigen (Switzerland)

    2013-04-15

    A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing the human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent (objectively and quantitatively) task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors are available from the International HRA Empirical Study. The empirical evaluation has shown promising results. The TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished if related to different difficulty categories (e. g., 'easy' vs. 'somewhat difficult'), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i. e., influencing the error probability), and 2) the quantitative relationships among PSFs and error

  13. Measuring The Influence of TAsk COMplexity on Human Error Probability: An Empirical Evaluation

    A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing the human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent (objectively and quantitatively) task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors are available from the International HRA Empirical Study. The empirical evaluation has shown promising results. The TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished if related to different difficulty categories (e. g., 'easy' vs. 'somewhat difficult'), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i. e., influencing the error probability), and 2) the quantitative relationships among PSFs and error probability are

  14. PAFit: A Statistical Method for Measuring Preferential Attachment in Temporal Complex Networks.

    Thong Pham

    Preferential attachment is a stochastic process that has been proposed to explain certain topological features characteristic of complex networks from diverse domains. The systematic investigation of preferential attachment is an important area of research in network science, not only for the theoretical matter of verifying whether this hypothesized process is operative in real-world networks, but also for the practical insights that follow from knowledge of its functional form. Here we describe a maximum likelihood based estimation method for the measurement of preferential attachment in temporal complex networks. We call the method PAFit, and implement it in an R package of the same name. PAFit constitutes an advance over previous methods primarily because we based it on a nonparametric statistical framework that enables attachment kernel estimation free of any assumptions about its functional form. We show this results in PAFit outperforming the popular methods of Jeong and Newman in Monte Carlo simulations. What is more, we found that the application of PAFit to a publicly available Flickr social network dataset yielded clear evidence for a deviation of the attachment kernel from the popularly assumed log-linear form. Independent of our main work, we provide a correction to a consequential error in Newman's original method which had evidently gone unnoticed since its publication over a decade ago.

  15. A New Efficient Analytical Method for Picolinate Ion Measurements in Complex Aqueous Solutions

    Parazols, M.; Dodi, A. [CEA Cadarache, Lab Anal Radiochim and Chim, DEN, F-13108 St Paul Les Durance (France)

    2010-07-01

    This study focuses on the development of a new simple but sensitive, fast and quantitative liquid chromatography method for picolinate ion measurement in high ionic strength aqueous solutions. It involves cation separation over a chromatographic CS16 column using methane sulfonic acid as a mobile phase and detection by UV absorbance (254 nm). The CS16 column is a high-capacity stationary phase exhibiting both cation exchange and RP properties. It allows interaction with picolinate ions, which are in their zwitterionic form at the pH of the mobile phase (1.3-1.7). Analysis is performed in 30 min with a detection limit of about 0.05 μM and a quantification limit of about 0.15 μM. Moreover, this analytical technique has been tested efficiently on complex aqueous samples from an effluent treatment facility. (authors)

  16. Measuring mixing patterns in complex networks by Spearman rank correlation coefficient

    Zhang, Wen-Yao; Wei, Zong-Wen; Wang, Bing-Hong; Han, Xiao-Pu

    2016-06-01

    In this paper, we utilize the Spearman rank correlation coefficient to measure mixing patterns in complex networks. Compared with the widely used Pearson coefficient, the Spearman coefficient is rank-based, nonparametric, and size-independent. Thus it is more effective for assessing the linking patterns of diverse networks, especially large ones. We demonstrate this point by testing a variety of empirical and artificial networks. Moreover, we show that normalized Spearman ranks of stubs are subject to an interesting linear rule in which the correlation coefficient is just the Spearman coefficient. This compelling linear relationship allows us to directly produce networks with any prescribed Spearman coefficient. Our method apparently has an edge over the well-known uncorrelated configuration model.
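
    The core computation is a rank correlation of the degrees found at either end of every edge. The paper works with normalized ranks of stubs; the edge-list version below is the simplest sketch of the same idea (names are illustrative):

```python
import networkx as nx
from scipy.stats import spearmanr

def spearman_degree_mixing(G):
    """Spearman rank correlation of degrees across edge endpoints.
    Each undirected edge contributes both orientations, so the measure
    is symmetric (a rank-based analogue of degree assortativity)."""
    deg = dict(G.degree())
    xs, ys = [], []
    for u, v in G.edges():
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    rho, _ = spearmanr(xs, ys)
    return rho

G = nx.barabasi_albert_graph(1000, 3, seed=42)
print(spearman_degree_mixing(G))               # rank-based mixing coefficient
print(nx.degree_assortativity_coefficient(G))  # Pearson analogue, for contrast
```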

  17. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    There are some groups working on the improvement of the existing standard and recommendations on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 (Ref. [9]), to the current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical one regards the inherent characteristics of complex terrain, specifically the issue of site calibration and the uncertainties due to it. (Author)

  18. Evaluation of indirect impedance for measuring microbial growth in complex food matrices.

    Johnson, N; Chang, Z; Bravo Almeida, C; Michel, M; Iversen, C; Callanan, M

    2014-09-01

    The suitability of indirect impedance to accurately measure microbial growth in real food matrices was investigated. A variety of semi-solid and liquid food products were inoculated with Bacillus cereus, Listeria monocytogenes, Staphylococcus aureus, Lactobacillus plantarum, Pseudomonas aeruginosa, Escherichia coli, Salmonella Enteritidis, Candida tropicalis or Zygosaccharomyces rouxii, and CO2 production was monitored using a conductimetric (Don Whitley R.A.B.I.T.) system. The majority (80%) of food and microbe combinations produced a detectable growth signal. The linearity of conductance responses in selected food products was investigated and a good correlation (R² ≥ 0.84) was observed between inoculum levels and times to detection. Specific growth rate estimations from the data were sufficiently accurate for predictive modeling in some cases. This initial evaluation of the suitability of indirect impedance to generate microbial growth data in complex food matrices indicates significant potential for the technology as an alternative to plating methods. PMID:24929710

  19. Measuring complexity in a business cycle model of the Kaldor type

    The purpose of this paper is to study the dynamical behavior of a family of two-dimensional nonlinear maps associated to an economic model. Our objective is to measure the complexity of the system using techniques of symbolic dynamics in order to compute the topological entropy. The analysis of the variation of this important topological invariant with the parameters of the system allows us to distinguish different chaotic scenarios. Finally, we use another topological invariant to distinguish isentropic dynamics, and we exhibit numerical results about maps with the same topological entropy. This work provides an illustration of how our understanding of higher dimensional economic models can be enhanced by the theory of dynamical systems.

  20. Entropy-based complexity measures for gait data of patients with Parkinson's disease

    Afsar, Ozgur; Tirnakli, Ugur; Kurths, Juergen

    2016-02-01

    Shannon, Kullback-Leibler, and Klimontovich's renormalized entropies are applied as three different complexity measures on gait data of patients with Parkinson's disease (PD) and healthy control group. We show that the renormalized entropy of variability of total reaction force of gait is a very efficient tool to compare patients with respect to disease severity. Moreover, it is a good risk predictor such that the sensitivity, i.e., the percentage of patients with PD who are correctly identified as having PD, increases from 25% to 67% while the Hoehn-Yahr stage increases from 2.5 to 3.0 (this stage goes from 0 to 5 as the disease severity increases). The renormalized entropy method for stride time variability of gait is found to correctly identify patients with a sensitivity of 80%, while the Shannon entropy and the Kullback-Leibler relative entropy can do this with a sensitivity of only 26.7% and 13.3%, respectively.
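
    Of the three entropies compared, the Shannon entropy of a gait variable's empirical distribution is the simplest to reproduce. A minimal sketch (the bin count and the synthetic stride times are illustrative, and Klimontovich's renormalization step is omitted):

```python
import numpy as np

def shannon_entropy(series, bins=30):
    """Shannon entropy (nats) of the empirical distribution of a gait
    variable such as stride time. The paper's renormalized entropy
    additionally rescales distributions before comparison; omitted here."""
    hist, _ = np.histogram(np.asarray(series, dtype=float), bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(3)
healthy = rng.normal(1.10, 0.02, 500)   # narrow stride-time spread (seconds)
pd_like = rng.normal(1.10, 0.06, 500)   # broader variability
print(shannon_entropy(healthy), shannon_entropy(pd_like))
```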

  1. Integrating Sound Scattering Measurements in the Design of Complex Architectural Surfaces

    Peters, Brady

    2010-01-01

    Digital tools present the opportunity for incorporating performance analysis into the architectural design process. Acoustic performance is an important criterion for architectural design. Much is known about sound absorption but little about sound scattering, even though scattering is recognized to be one of the most important factors in the computational prediction of acoustic performance. This paper proposes a workflow for the design of complex architectural surfaces and the prediction of their sound scattering properties. This workflow includes the development of computational design tools, geometry generation, fabrication of test surfaces, measurement of acoustic performance, and the incorporation of this data into the generative tool. The Hexagon Wall is included and discussed as an illustrative design study.

  2. Estimating the operator's performance time of emergency procedural tasks based on a task complexity measure

    It is important to understand the amount of time required to execute an emergency procedural task in a high-stress situation for managing human performance under emergencies in a nuclear power plant. However, the time to execute an emergency procedural task is highly dependent upon expert judgment due to the lack of actual data. This paper proposes an analytical method to estimate the operator's performance time (OPT) of a procedural task, which is based on a measure of the task complexity (TACOM). The proposed method for estimating an OPT is an equation that uses the TACOM as a variable, and the OPT of a procedural task can be calculated if its relevant TACOM score is available. The validity of the proposed equation is demonstrated by comparing the estimated OPTs with the observed OPTs for emergency procedural tasks in a steam generator tube rupture scenario.
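
    The abstract describes the estimator only as an equation with the TACOM score as its single variable. Purely as an illustration, a log-linear regression of the kind commonly fitted to performance-time data would read as below; the coefficients a and b, and indeed the actual functional form used in the paper, are not reproduced here:

```latex
% Illustrative (assumed) log-linear form of an OPT estimator:
\[
  \ln(\mathrm{OPT}) \;=\; a \cdot \mathrm{TACOM} \;+\; b
\]
% a, b: regression coefficients fitted to observed performance times.
```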

  3. Complex Networks Measures for Differentiation between Normal and Shuffled Croatian Texts

    Margan, Domagoj; Martinčić-Ipšić, Sanda

    2014-01-01

    This paper studies the properties of Croatian texts via complex networks. We present network properties of normal and shuffled Croatian texts for different shuffling principles: on the sentence level and on the text level. In both experiments we preserved the vocabulary size and the word and sentence frequency distributions. Additionally, in the first shuffling approach we preserved the sentence structure of the text and the number of words per sentence. The obtained results showed that degree rank distributions exhibit no substantial deviation in shuffled networks, and strength rank distributions are preserved due to the same word frequencies. Therefore, the standard approach to studying the structure of linguistic co-occurrence networks showed no clear difference between the topologies of normal and shuffled texts. Finally, we showed that the in- and out-selectivity values from shuffled texts are consistently below the selectivity values calculated from normal texts. Our results corroborate that the node selectivity measure can...

  4. Wide-band complex magnetic susceptibility measurements of magnetic fluids as a function of temperature

    Fannin, P. C.; Kinsella, L.; Charles, S. W.

    1999-07-01

    Measurements of the complex magnetic susceptibility over the frequency and temperature ranges of 2 MHz-6 GHz and 20 to -100°C, respectively, are reported for the first time for a magnetic fluid. The fluid used was a colloidal suspension of magnetite particles of median diameter 9 nm in a hydrocarbon oil (Isopar M). Resonance was observed, and the resonant frequency was found to increase from approximately 1.5 GHz to 3.3 GHz in the temperature range 20 to -50°C. The increase in resonant frequency is attributed to a decrease in thermal fluctuations with decrease in temperature. At frequencies below approximately 19 MHz, a significant drop in χ'(ω) with decrease in temperature over the temperature range 20 to -100°C is observed and is attributed to changes in the Néel and Brownian relaxation processes. Below -60°C, the temperature at which the suspension becomes solid, Brownian relaxation ceases to exist.

  5. Response to Disturbance and Abundance of Final State: a Measure for Complexity?

    SHEN Dan; WANG Wen-Xiu; JIANG Yu-Mei; HE Yue; HE Da-Ren

    2007-01-01

    We propose a new definition of complexity. The definition shows that when a system evolves to a final state via a transient state, its complexity depends on the abundance of both the final state and transient state. The abundance of the transient state may be described by the diversity of the response to disturbance. We hope that this definition can describe a clear boundary between simple systems and complex systems by showing that all the simple systems have zero complexity, and all the complex systems have positive complexity. Some examples of the complexity calculations are presented, which supports our hope.

  6. Progressive evolution and a measure for its noise-dependent complexity

    Fussy, Siegfried; Grössing, Gerhard; Schwabl, Herbert

    1999-03-01

    ...-Queen-effect." Additionally, for the memory-based model a parameter was found indicating a limited range of noise allowing for the most complex behavior of the model, whereas the entropy of the system provides only a monotonous measure with respect to the varying noise level.

  7. Characterization of a complex near-surface structure using well logging and passive seismic measurements

    Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara

    2016-04-01

    We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area presents some near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectral ratio method (H/V). Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact. This technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness, with a maximum of around 40 m, and shear-wave velocity profiles from the array technique. A comparison between the seismic velocities coming from well logging and array measurements defines the resolution limits of the passive seismic techniques and aids their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that could be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.
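
    The H/V method used above reduces, per ambient-noise window, to a ratio of horizontal and vertical Fourier amplitude spectra. A minimal single-window sketch in Python (production processing averages many windows and smooths the spectra, e.g. with Konno-Ohmachi windows; names and parameters are illustrative):

```python
import numpy as np

def hv_ratio(north, east, vertical, fs, nfft=4096):
    """Single-window horizontal-to-vertical spectral ratio: quadratic
    mean of the two horizontal amplitude spectra over the vertical one."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    N = np.abs(np.fft.rfft(north, nfft))
    E = np.abs(np.fft.rfft(east, nfft))
    V = np.abs(np.fft.rfft(vertical, nfft))
    hv = np.sqrt((N**2 + E**2) / 2.0) / np.maximum(V, 1e-12)
    return freqs, hv

# The main H/V peak frequency f0 maps to sediment thickness h through the
# quarter-wavelength relation f0 ~ Vs / (4 h), given shear-wave velocity Vs.
fs = 100.0
rng = np.random.default_rng(5)
n_, e_, v_ = (rng.standard_normal(6000) for _ in range(3))
f, hv = hv_ratio(n_, e_, v_, fs)
print(f[1 + np.argmax(hv[1:])])  # peak frequency of this (noise-only) window
```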

  8. Determination of the Landau-Lifshitz damping parameter by means of complex susceptibility measurements

    A new experimental method for the determination of the Landau-Lifshitz damping parameter, α, based on measurements of the frequency and field dependence of the complex magnetic susceptibility, χ(ω,H)=χ'(ω,H)-iχ''(ω,H), is proposed. The method centres on evaluating the ratio fmax/fres, where fres is the resonance frequency and fmax is the maximum absorption frequency at resonance, of the sample susceptibility spectra, measured in strong polarizing fields. We have investigated three magnetic fluid samples, namely sample 1, sample 2 and sample 3. Sample 1 consisted of particles of Mn0.6Fe0.4Fe2O4 dispersed in kerosene, sample 2 consisted of magnetite particles dispersed in Isopar M and sample 3 was composed of particles of Mn0.66Zn0.34Fe2O4 dispersed in Isopar M. The results obtained for the mean damping parameter of particles within the magnetic fluid samples are as follows: <α(Mn0.6Fe0.4Fe2O4)> = 0.057 with the corresponding standard deviation SD = 0.0104; <α(Fe3O4)> = 0.1105 with the corresponding standard deviation SD = 0.034; and <α(Mn0.66Zn0.34Fe2O4)> = 0.096 with the corresponding standard deviation SD = 0.037

  9. Determination of the Landau-Lifshitz damping parameter by means of complex susceptibility measurements

    Fannin, P. C.; Marin, C. N.

    2006-04-01

    A new experimental method for the determination of the Landau-Lifshitz damping parameter, α, based on measurements of the frequency and field dependence of the complex magnetic susceptibility, χ(ω,H)=χ'(ω,H)-iχ″(ω,H), is proposed. The method centres on evaluating the ratio fmax/fres, where fres is the resonance frequency and fmax is the maximum absorption frequency at resonance, of the sample susceptibility spectra, measured in strong polarizing fields. We have investigated three magnetic fluid samples, namely sample 1, sample 2 and sample 3. Sample 1 consisted of particles of Mn0.6Fe0.4Fe2O4 dispersed in kerosene, sample 2 consisted of magnetite particles dispersed in Isopar M and sample 3 was composed of particles of Mn0.66Zn0.34Fe2O4 dispersed in Isopar M. The results obtained for the mean damping parameter of particles within the magnetic fluid samples are as follows: <α(Mn0.6Fe0.4Fe2O4)> = 0.057 with the corresponding standard deviation SD = 0.0104; <α(Fe3O4)> = 0.1105 with the corresponding standard deviation SD = 0.034; and <α(Mn0.66Zn0.34Fe2O4)> = 0.096 with the corresponding standard deviation SD = 0.037.

  10. Effect of metal complexation on the conductance of single-molecular wires measured at room temperature.

    Ponce, Julia; Arroyo, Carlos R; Tatay, Sergio; Frisenda, Riccardo; Gaviña, Pablo; Aravena, Daniel; Ruiz, Eliseo; van der Zant, Herre S J; Coronado, Eugenio

    2014-06-11

    The present work aims to give insight into the effect that metal coordination has on the room-temperature conductance of molecular wires. For that purpose, we have designed a family of rigid, highly conductive ligands functionalized with different terminations (acetylthiols, pyridines, and ethynyl groups), in which the conformational changes induced by metal coordination are negligible. The single-molecule conductance features of this series of molecular wires and their corresponding Cu(I) complexes have been measured in break-junction setups at room temperature. Experimental and theoretical data show that, no matter the anchoring group, in all cases metal coordination leads to a shift toward lower energies of the ligand energy levels and a reduction of the HOMO-LUMO gap. However, electron-transport measurements carried out at room temperature revealed a variable metal coordination effect depending on the anchoring group: upon metal coordination, the molecular conductance of thiol and ethynyl derivatives decreased, whereas that of pyridine derivatives increased. These differences stem from the molecular levels involved in conduction. According to quantum-mechanical calculations based on density functional theory methods, the ligand frontier orbital lying closer to the Fermi energy of the leads differs depending on the anchoring group. Thereby, the effect of metal coordination on molecular conductance observed for each anchoring group can be explained in terms of the different energy alignments of the molecular orbitals relative to the gold Fermi level. PMID:24831452

  11. Radiometric characterization of six soils in the microwave X-range through complex permittivity measurements

    Estimating and monitoring up-to-date soil moisture conditions over extensive areas through passive (or active) microwave remote sensing techniques requires knowledge of the complex relative permittivity (εr*) as a function of soil moisture. X-band measurements of εr* for different moisture conditions were made in the laboratory for soil samples of six important soils (PV2, LV3, LRd, LE1, SAP and Sc). Using a theoretical model and computational programmes developed for this purpose, these measurements allowed estimates of the emissive characteristics of the soils that would be expected with the X-Band Microwave Radiometer built at INPE. The results, the first for soils from tropical regions, showed that the physical characteristics and properties of the soils alone are not sufficient to explain the behaviour of εr* as a function of soil moisture, indicating that the chemical and/or mineralogical properties of the soils also make an important contribution. The results also showed that εr* as a function of soil moisture depends on soil class. (author)

  12. Investigation of the Ionic conductivity and dielectric measurements of poly (N-vinyl pyrrolidone)-sulfamic acid polymer complexes

    Polymer electrolyte complexes of poly (N-vinyl pyrrolidone) (PVP)-sulfamic acid (NH2SO3H) were prepared by the familiar solution casting method with different molar concentrations of PVP and sulfamic acid. The interaction between PVP and NH2SO3H was confirmed by Fourier transform infrared spectroscopy analysis. Laser microscopy analysis was used to study the surface morphology of the polymer complexes. The glass transition temperature (Tg) and the melting temperature (Tm) of the polymer complexes were computed from differential scanning calorimetric studies. AC impedance spectroscopic measurements revealed that the polymer complex with 97 mol% PVP-3 mol% NH2SO3H shows the highest ionic conductivity, with two different activation energies above and below the glass transition temperature (Tg). Dielectric studies confirmed that the dc conduction mechanism dominates in the polymer complexes. The value of the power law exponent (n) confirmed the translational motion of ions from one site to another vacant site in these complexes.

  13. Inactivation of bacteria on surfaces by sprayed slightly acidic hypochlorous acid water: in vitro experiments.

    Hakim, Hakimullah; Alam, Md Shahin; Sangsriratanakul, Natthanan; Nakajima, Katsuhiro; Kitazawa, Minori; Ota, Mari; Toyofuku, Chiharu; Yamada, Masashi; Thammakarn, Chanathip; Shoham, Dany; Takehara, Kazuaki

    2016-08-01

    The capacity of slightly acidic hypochlorous acid water (SAHW), in both liquid and spray form, to inactivate bacteria was evaluated as a potential candidate for biosecurity enhancement in poultry production. SAHW (containing 50 or 100 ppm chlorine, pH 6) was able to inactivate Escherichia coli and Salmonella Infantis in liquid to below detectable levels (≤2.6 log10 CFU/ml) within 5 sec of exposure. In addition, SAHW antibacterial capacity was evaluated by spraying it using a nebulizer into a box containing these bacteria, which were present on the surfaces of glass plates and rayon sheets. SAHW was able to inactivate both bacterial species on the glass plates (dry condition) and rayon sheets within 5 min spraying and 5 min contact times, with the exception of 50 ppm SAHW on the rayon sheets. Furthermore, a corrosivity test determined that SAHW does not corrode metallic objects, even at the longest exposure times (83 days). Our findings demonstrate that SAHW is a good candidate for biosecurity enhancement in the poultry industry. Spraying it on the surfaces of objects, eggshells, egg incubators and transport cages could reduce the chances of contamination and disease transmission. These results augment previous findings demonstrating the competence of SAHW as an anti-viral disinfectant. PMID:27052464

  14. Evaluation of fly ashes for the removal of Cu, Ni and Cd from acidic waters

    The presence of sulphides in many mine wastes and the formation of acid mine drainage (AMD) has been widely recognized as one of today's great environmental problems. Waters from many abandoned mines, with thousands of cubic meters of residue scattered in dumps and ponds, are affected by this type of pollution, characterized by acidity and high contents of sulphates and heavy metals such as Fe, Mn, Al, Cu, Ni and Cd. This study was designed to evaluate the use of fly ash from power stations as a neutralizer for acidic waters originating from this type of abandoned facility. Given the heterogeneity of the contaminants present, we studied the removal of Ni, Cu and Cd. Different parameters were examined: metal concentration and pH of the solution to be treated, reaction time and pulp density. Fly ash can be used as a neutralization/fixation agent: it adds alkalinity and increases the pH on contacting AMD, resulting in the precipitation of metal hydroxides.

  15. Complex Correlation Measure: a novel descriptor for Poincaré plot

    Gubbi Jayavardhana

    2009-08-01

    Background: The Poincaré plot is one of the important techniques used for visually representing heart rate variability. It is valuable due to its ability to display nonlinear aspects of the data sequence. However, the problem lies in capturing temporal information of the plot quantitatively. The standard descriptors used in quantifying the Poincaré plot (SD1, SD2) measure the gross variability of the time series data. Determining advanced methods for capturing temporal properties poses a significant challenge. In this paper, we propose a novel descriptor, the Complex Correlation Measure (CCM), to quantify the temporal aspect of the Poincaré plot. In contrast to SD1 and SD2, the CCM incorporates point-to-point variation of the signal. Methods: First, we derived expressions for CCM. Then the sensitivity of the descriptors was shown by measuring all descriptors before and after surrogation of the signal. For each case study, lag-1 Poincaré plots were constructed for three groups of subjects (arrhythmia, congestive heart failure (CHF) and normal sinus rhythm (NSR)), and the new measure CCM was computed along with SD1 and SD2. ANOVA analysis was used to define the level of significance of the mean and variance of SD1, SD2 and CCM for the different groups of subjects. Results: CCM is defined based on the autocorrelation at different lags of the time series, hence giving an in-depth measurement of the correlation structure of the Poincaré plot. A surrogate analysis was performed, and the sensitivity of the proposed descriptor was found to be higher than that of the standard descriptors. Two case studies were conducted for recognizing arrhythmia and congestive heart failure (CHF) subjects from those with NSR, using the Physionet database, and demonstrated the usefulness of the proposed descriptors in biomedical applications. CCM was found to be a more significant (p = 6.28E-18) parameter than SD1 and SD2 in discriminating
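
    For readers who want to experiment with these descriptors, the minimal sketch below computes SD1/SD2 from an RR-interval series and a triangle-area variant of CCM normalised by the fitting-ellipse area π·SD1·SD2, following the description above; the exact normalisation and any preprocessing used by the authors may differ, so treat this as illustrative rather than the paper's code.

```python
import numpy as np

def poincare_sd(rr):
    """SD1/SD2 of the lag-1 Poincare plot of an RR-interval series."""
    rr = np.asarray(rr, dtype=float)
    x, y = rr[:-1], rr[1:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)  # minor axis: beat-to-beat spread
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)  # major axis: long-term spread
    return sd1, sd2

def ccm(rr):
    """Mean area of triangles formed by three consecutive Poincare-plot
    points, normalised by the fitting-ellipse area (pi * SD1 * SD2)."""
    rr = np.asarray(rr, dtype=float)
    pts = np.column_stack((rr[:-1], rr[1:]))
    a, b, c = pts[:-2], pts[1:-1], pts[2:]
    # triangle area from the cross product of two edge vectors
    areas = 0.5 * np.abs((b[:, 0] - a[:, 0]) * (c[:, 1] - a[:, 1])
                         - (c[:, 0] - a[:, 0]) * (b[:, 1] - a[:, 1]))
    sd1, sd2 = poincare_sd(rr)
    return areas.mean() / (np.pi * sd1 * sd2)
```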

  16. Eddy-correlation measurements of benthic fluxes under complex flow conditions: Effects of coordinate transformations and averaging time scales

    Lorke, Andreas; McGinnis, Daniel F.; Maeck, Andreas

    2013-01-01

    ...hours of continuous eddy-correlation measurements of sediment oxygen fluxes in an impounded river, we demonstrate that rotation of measured current velocities into streamline coordinates can be a crucial and necessary step in data processing under complex flow conditions in non-flat environments...

  17. Complexity Measures, Task Type, and Analytic Evaluations of Speaking Proficiency in a School-Based Assessment Context

    Gan, Zhengdong

    2012-01-01

    This study, which is part of a large-scale study of using objective measures to validate assessment rating scales and assessment tasks in a high-profile school-based assessment initiative in Hong Kong, examined how grammatical complexity measures relate to task type and analytic evaluations of students' speaking proficiency in a classroom-based…

  18. Inspection of Complex Internal Surface Shape with Fiber-optic Sensor II: for Specular Tilted Surface Measurement

    2001-01-01

    Complex surface shape measurement has been a focus topic in the CAD/CAM field. A popular method for measuring dimensional information is using a 3D coordinate measuring machine (CMM) with a touch trigger probe. The measurement setup with a CMM, however, is a time-consuming task, and the accuracy of the measurement deteriorates as the speed of measurement increases. Non-contact measurement is favored since high-speed measurement can be achieved and problems with vibration and friction can be eliminated. Although much research has been conducted on non-contact measurement using image capturing and processing schemes, accuracy is poor and measurement is limited. Some optical technologies developed provide good accuracy, but their dynamic range and versatility are very limited. A novel fiber-optic sensor for the inspection of complex internal contours is presented in this paper, which is able to measure a surface shape in a non-contact manner with high accuracy and high speed, and is compact and flexible enough to be incorporated into a CMM. Modulation functions for tilted surface shape measurement, based on the Gaussian distribution of the emitting beam from a single-mode fiber (SMF), were derived for specular reflection. The feasibility of the proposed measurement principle was verified by simulations.

  19. Impact of automation: Measurement of performance, workload and behaviour in a complex control environment.

    Balfe, Nora; Sharples, Sarah; Wilson, John R

    2015-03-01

    This paper describes an experiment that was undertaken to compare three levels of automation in rail signalling: a high level in which an automated agent set routes for trains using timetable information, a medium level in which trains were routed along pre-defined paths, and a low level where the operator (signaller) was responsible for the movement of all trains. These levels are described in terms of a Rail Automation Model based on previous automation theory (Parasuraman et al., 2000). Performance, subjective workload, and signaller activity were measured for each level of automation running under both normal operating conditions and abnormal, or disrupted, conditions. The results indicate that perceived workload, during both normal and disrupted phases of the experiment, decreased as the level of automation increased, and performance was most consistent (i.e. showed the least variation between participants) with the highest level of automation. The results give a strong case in favour of automation, particularly in terms of demonstrating the potential for automation to reduce workload, but also suggest that much benefit can be achieved from a mid-level of automation, potentially at lower cost and complexity. PMID:25479974

  20. [Evaluation of a complex trace element composition and bioutilization using isotope technics and total body measurement].

    Balogh, L; Kerekes, A; Bodó, K; Körösi, L; Jánoki, G A

    1998-05-24

    Modified mineral and trace element solutions were prepared containing Zn-65, Co-57, Mn-54, Fe-59, Mo-99 and Ni-63 isotopes, physico-chemically identical to the original solution. Bioutilization examinations were carried out on animals receiving their normal feed after per os application of a complex trace element composition (CTEC): whole-body retention studies, bioassays, and scintigraphic and excretion examinations in altogether 180 Wistar rats, 6 Beagle and 2 mongrel dogs, using a whole-body counter, gamma and beta counters, a gamma camera and metabolic cages. Extremely high whole-body retention was measured for iron (8-30%), high utilization for zinc (4-5%), cobalt (4-6%), molybdenum (3-4%) and manganese (2-4%), and a lower value for nickel. Bioassay and scintigraphic evaluations showed marked liver, kidney and muscle uptake and moderate blood uptake. Excretion was mainly (more than 90%) via the faeces for zinc, manganese, iron and nickel, although cobalt was excreted 8% and molybdenum 52% via the urinary tract. Our results show that the isotope technique combined with whole-body counting and excretion studies is a viable method for trace element bioutilization studies. PMID:9632924

  1. An entropy-based measure of hydrologic complexity and its applications

    Castillo, Aldrich; Castelli, Fabio; Entekhabi, Dara

    2015-07-01

    Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope-scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy-based and discretization-invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (most organization) and a uniform distribution (widest distribution). Applying the distributed hydrologic model MOBIDIC to seven test basins with areas ranging 10^0-10^3 km2 and representing semiarid and temperate climates, H is shown to capture distributional characteristics of soil moisture fields. It can also track the temporal evolution of the distributional features. Furthermore, this paper explores how basin attributes affect the characteristic H, and how H can be used to explain interbasin variability in hydrologic response. Relationships are found only by grouping basins with the same climate or size. For the semiarid basins, H scales with catchment area, topographic wetness, infiltration ratio, and base flow index; while H is inversely related to relief ratio.
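
    A minimal sketch of an entropy-based complexity index in the spirit of H described above: it returns values near 0 for a near-Dirac-delta soil-moisture field and near 1 for a near-uniform one. The fixed-bin normalised Shannon entropy used here is an assumption for illustration; the authors' exact, discretization-invariant formulation may differ.

```python
import numpy as np

def hydrologic_complexity(theta, bins=50):
    """Normalised Shannon entropy of a soil-moisture field theta in [0, 1]:
    ~0 near a Dirac delta (fully organised), ~1 near a uniform distribution."""
    counts, _ = np.histogram(np.ravel(theta), bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return float(-(p * np.log(p)).sum() / np.log(bins))
```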

  2. An entropy‐based measure of hydrologic complexity and its applications

    Castelli, Fabio; Entekhabi, Dara

    2015-01-01

    Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope-scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy-based and discretization-invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (most organization) and a uniform distribution (widest distribution). Applying the distributed hydrologic model MOBIDIC to seven test basins with areas ranging 10^0-10^3 km2 and representing semiarid and temperate climates, H is shown to capture distributional characteristics of soil moisture fields. It can also track the temporal evolution of the distributional features. Furthermore, this paper explores how basin attributes affect the characteristic H, and how H can be used to explain interbasin variability in hydrologic response. Relationships are found only by grouping basins with the same climate or size. For the semiarid basins, H scales with catchment area, topographic wetness, infiltration ratio, and base flow index; while H is inversely related to relief ratio. PMID:26937055

  3. Method for Determining the Activation Energy Distribution Function of Complex Reactions by Sieving and Thermogravimetric Measurements.

    Bufalo, Gennaro; Ambrosone, Luigi

    2016-01-14

    A method for studying the kinetics of thermal degradation of complex compounds is suggested. Although the method is applicable to any matrix whose grain size can be measured, herein we focus our investigation on thermogravimetric analysis, under a nitrogen atmosphere, of ground soft wheat and ground maize. The thermogravimetric curves reveal two distinct jumps of mass loss, corresponding to a volatilization region in the temperature range 298-433 K and a decomposition region from 450 to 1073 K. Thermal degradation is schematized as a reaction in the solid state whose kinetics is analyzed separately in each of the two regions. By means of sieve analysis, different size fractions of the material are separated and studied. A quasi-Newton fitting algorithm is used to obtain the grain size distribution as a best fit to the experimental data. The individual fractions are analyzed thermogravimetrically to derive the functional relationship between the activation energy of the degradation reactions and the particle size. This functional relationship turns out to be crucial for evaluating the moments of the otherwise unknown activation energy distribution in terms of the distribution calculated by sieve analysis. From the knowledge of the moments one can reconstruct the reaction conversion. The method is applied first to the volatilization region, then to the decomposition region. Comparison with the experimental data reveals that the method reproduces the experimental conversion with an accuracy of 5-10% in the volatilization region and of 3-5% in the decomposition region. PMID:26671287
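
    The grain-size fit mentioned above can be prototyped with a quasi-Newton optimizer; the sketch below assumes a log-normal size distribution and uses illustrative sieve data, neither of which is taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# illustrative sieve data: mean aperture (mm) and mass fraction retained
sizes = np.array([0.075, 0.150, 0.300, 0.600, 1.180])
fractions = np.array([0.08, 0.22, 0.35, 0.25, 0.10])

def lognormal_pdf(d, mu, sigma):
    sigma = abs(sigma)  # guard against sign flips during the line search
    return np.exp(-(np.log(d) - mu) ** 2 / (2 * sigma ** 2)) \
           / (d * sigma * np.sqrt(2 * np.pi))

def misfit(params):
    mu, sigma = params
    model = lognormal_pdf(sizes, mu, sigma)
    model = model / model.sum()          # compare shapes, not densities
    return ((model - fractions) ** 2).sum()

# BFGS is a quasi-Newton method, the algorithm family named in the abstract
fit = minimize(misfit, x0=[np.log(0.3), 0.5], method="BFGS")
mu_hat, sigma_hat = fit.x
```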

  4. Nuclear Data Measurement Using the Accurate Neutron-Nucleus Reaction Measurement Instrument (ANNRI) in the Japan Proton Accelerator Research Complex (J-PARC)

    A nuclear data measurement project using a spallation neutron source is ongoing at the Japan Proton Accelerator Research Complex (J-PARC). The Accurate Neutron-Nucleus Reaction Measurement Instrument (ANNRI) was built as a beam line for the measurement of neutron capture cross sections at J-PARC. The project aims to measure the neutron capture cross sections of minor actinides (MA) and long-lived fission products (LLFP) for the design of innovative nuclear reactors and the study of nuclear transmutation of nuclear waste. An overview of the ongoing ANNRI project is given. (author)

  5. Precision waveguide system for measurement of complex permittivity of liquids at frequencies from 60 to 90 GHz.

    Hunger, J; Cerjak, I; Schoenmaker, H; Bonn, M; Bakker, H J

    2011-10-01

    We describe a variable path length waveguide setup developed to accurately measure the complex dielectric permittivity of liquids. This is achieved by measuring the complex scattering parameter of the liquid in a waveguide section with a vector network analyzer in combination with an E-band frequency converter. The automated measurement procedure allows fast acquisition at closely spaced intervals over the entire measurement bandwidth: 60-90 GHz. The presented technique is an absolute method and as such is not prone to calibration errors. The technique is suited to investigate low-loss as well as high-loss liquids in contrast to similar setups described previously. We present measurements for a high-loss liquid (water), an intermediate-loss sample (ethanol), and for nearly loss-less n-octane. Due to the available phase information, the present data have an improved accuracy in comparison with literature data. PMID:22047313
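
    For orientation, the dispersion such a setup resolves for water in this band is well described by a single-Debye relaxation model; the sketch below uses typical room-temperature literature parameters, not values measured with this instrument.

```python
import numpy as np

# approximate single-Debye parameters for water near 25 C (literature values)
EPS_S, EPS_INF, TAU = 78.4, 5.2, 8.3e-12

def debye(f_hz):
    """Complex relative permittivity; with this form the imaginary part is
    negative, so the loss factor is eps'' = -Im(eps)."""
    omega = 2 * np.pi * np.asarray(f_hz)
    return EPS_INF + (EPS_S - EPS_INF) / (1 + 1j * omega * TAU)

for f in np.linspace(60e9, 90e9, 4):
    eps = debye(f)
    print(f"{f/1e9:4.0f} GHz  eps' = {eps.real:5.2f}  eps'' = {-eps.imag:5.2f}")
```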

  6. Precision waveguide system for measurement of complex permittivity of liquids at frequencies from 60 to 90 GHz

    Hunger, J.; Cerjak, I.; Schoenmaker, H.; Bonn, M.; Bakker, H. J.

    2011-10-01

    We describe a variable path length waveguide setup developed to accurately measure the complex dielectric permittivity of liquids. This is achieved by measuring the complex scattering parameter of the liquid in a waveguide section with a vector network analyzer in combination with an E-band frequency converter. The automated measurement procedure allows fast acquisition at closely spaced intervals over the entire measurement bandwidth: 60-90 GHz. The presented technique is an absolute method and as such is not prone to calibration errors. The technique is suited to investigate low-loss as well as high-loss liquids in contrast to similar setups described previously. We present measurements for a high-loss liquid (water), an intermediate-loss sample (ethanol), and for nearly loss-less n-octane. Due to the available phase information, the present data have an improved accuracy in comparison with literature data.

  7. Characterization of Nuclear Materials Using Complex of Non-Destructive and Mass-Spectroscopy Methods of Measurements

    The Information and Analytical Centre for nuclear materials investigations was established in the Russian Federation on February 2, 2009 by the ROSATOM State Atomic Energy Corporation (order #80). Its purpose is to prevent unauthorized access to nuclear materials and to exclude their illicit traffic. The Information and Analytical Centre includes an analytical laboratory to determine the composition and properties of nuclear materials of unknown origin for their identification. According to its Regulation, the Centre deals with:
    · identification of nuclear materials of unknown origin to provide information about their composition and properties;
    · arbitration analyses of nuclear materials;
    · comprehensive research of nuclear and radioactive materials for developing materials characterization techniques;
    · interlaboratory measurements;
    · measurements for control and accounting;
    · confirmatory measurements.
    A complex of non-destructive and mass-spectroscopy techniques was developed for the measurements. The complex consists of:
    · gamma-ray techniques based on the MGAU, MGA and FRAM codes for uranium and plutonium isotopic composition;
    · a gravimetric technique with gamma-spectroscopy in addition for uranium content;
    · a calorimetric technique for plutonium mass;
    · a neutron multiplicity technique for plutonium mass;
    · a measurement technique based on mass-spectroscopy for uranium isotopic composition;
    · a measurement technique based on mass-spectroscopy for metallic impurities.
    The complex satisfies the state regulation requirements for ensuring the uniformity of measurements, including the Russian Federation Federal Law on Ensuring the Uniformity of Measurements #102-FZ, Interstate Standard GOST R ISO/IEC 17025-2006, National Standards of the Russian Federation GOST R 8.563-2009 and GOST R 8.703-2010, and Federal Regulations NRB-99/2009 and OSPORB 99/2010. The complex is provided with reference materials, equipment and certified techniques. The complex is included in accredited

  8. Is the habitation of acidic-water sanctuaries by galaxiid fish facilitated by natural organic matter modification of sodium metabolism?

    Glover, Chris N; Donovan, Katherine A; Hill, Jonathan V

    2012-01-01

    Acidic waters of New Zealand's West Coast are hypothesized to be a refuge for native galaxiid fish, allowing them to escape predation from acid-sensitive invasive salmonid species. To determine the mechanisms by which galaxiids tolerate low pH, we investigated sodium metabolism in inanga Galaxias maculatus in response to water pH, short-term acclimation to acidic waters, the presence and source of natural organic matter (NOM), and fish life history. Contrary to expectation, inanga were physiologically sensitive to acid exposure, displaying inhibited sodium influx and exacerbated sodium efflux. Short-term (144 h) acclimation to acid did not modify this effect, and NOM did not exert a protective effect on sodium metabolism at low pH. Inanga sourced from naturally acidic West Coast waters did, however, display a sodium influx capacity (J(max)) that was significantly elevated when compared with that of fish collected from neutral waters. All inanga, independent of source, exhibited exceptionally high sodium uptake affinities (18-40 μM) relative to previously studied freshwater teleosts. Although inanga displayed relatively poor physiological tolerance to acidic waters, their high sodium influx affinity coupled with their occupation of near-coastal waters with elevated sodium levels may permit habitation of low-pH freshwaters. PMID:22902374

  9. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2013-07-01

    Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets including (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external (e.g., climate change) and internal influences (e.g., infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  10. Magnetization transfer ratio measurements of the brain in children with tuberous sclerosis complex

    Magnetization transfer contrast and magnetization transfer ratio (MTR) in brain are mainly related to the presence of myelin. Neuropathological studies of brain lesions in tuberous sclerosis complex (TSC) have demonstrated disordered myelin sheaths. To evaluate the MTR of the brain in children with TSC and to compare with that in controls. Four patients (aged 0.41-8.4 years, mean 2.5 years) with TSC and four age- and sex-matched controls were evaluated with classic MR sequences and with a three-dimensional gradient-echo sequence without and with magnetization transfer pre-pulse. The MTR was calculated as: (SI0-SIm)/SI0 x 100%, where SIm refers to signal intensity from an image acquired with a magnetization transfer pre-pulse and SI0 the signal intensity from the image acquired without a magnetization transfer pre-pulse. The MTR values of cortical tubers (44.1±4.1), of subependymal nodules (51.6±4.8) and of white matter lesions (52.4±1.8) were significantly lower than those of cortex (58.7±3.53), of basal ganglia (caudate nucleus 58.2±2.8, putamen 59.6±2.5, thalamus 61.3±2.4) and of white matter (64.2±2.5) in controls (P<0.001). The MTR of normal-appearing white matter (61.2±3.0) in patients was lower than that of white matter in controls (P<0.01). The MTR of cortex and basal ganglia in patients was not significantly different from that in controls. MTR measurements not only provide semiquantitative information for TSC lesions but also reveal more extensive disease. (orig.)

  11. Magnetization transfer ratio measurements of the brain in children with tuberous sclerosis complex

    Zikou, Anastasia; Ioannidou, Maria-Christina; Astrakas, Loukas; Argyropoulou, Maria I. [University of Ioannina, Department of Radiology, Medical School, Ioannina (Greece); Tzoufi, Meropi [University of Ioannina, Child Health Department, Medical School, Ioannina (Greece)

    2005-11-01

    Magnetization transfer contrast and magnetization transfer ratio (MTR) in brain are mainly related to the presence of myelin. Neuropathological studies of brain lesions in tuberous sclerosis complex (TSC) have demonstrated disordered myelin sheaths. To evaluate the MTR of the brain in children with TSC and to compare with that in controls. Four patients (aged 0.41-8.4 years, mean 2.5 years) with TSC and four age- and sex-matched controls were evaluated with classic MR sequences and with a three-dimensional gradient-echo sequence without and with magnetization transfer pre-pulse. The MTR was calculated as: (SI0-SIm)/SI0 x 100%, where SIm refers to signal intensity from an image acquired with a magnetization transfer pre-pulse and SI0 the signal intensity from the image acquired without a magnetization transfer pre-pulse. The MTR values of cortical tubers (44.1±4.1), of subependymal nodules (51.6±4.8) and of white matter lesions (52.4±1.8) were significantly lower than those of cortex (58.7±3.53), of basal ganglia (caudate nucleus 58.2±2.8, putamen 59.6±2.5, thalamus 61.3±2.4) and of white matter (64.2±2.5) in controls (P<0.001). The MTR of normal-appearing white matter (61.2±3.0) in patients was lower than that of white matter in controls (P<0.01). The MTR of cortex and basal ganglia in patients was not significantly different from that in controls. MTR measurements not only provide semiquantitative information for TSC lesions but also reveal more extensive disease. (orig.)

  12. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2012-11-01

    Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets including (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external (e.g. climate change) and internal influences (e.g. infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for 3 out of the 4 sheets, but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  13. Predictive Potential of Heart Rate Complexity Measurement: An Indication for Laparotomy Following Solid Organ Injury

    Foroutan

    2015-11-01

    Background: Nonlinear analysis of heart rate variability (HRV) has recently been used as a predictor of prognosis in trauma patients. Objectives: We applied nonlinear analysis of HRV in patients with blunt trauma and intraperitoneal bleeding to assess our ability to predict the outcome of conservative management. Patients and Methods: An analysis of electrocardiography (ECG) from 120 patients with blunt trauma was conducted at the onset of admission to the emergency department. ECGs of 65 patients were excluded due to inadequate noise-free length. Of the remaining 55 patients, 47 survived (S group) and eight died in the hospital (non-S group). Nineteen patients were found to have intra-abdominal bleeding, eight of whom ultimately underwent laparotomy to control bleeding (Op group) and 11 underwent successful non-operative management (non-Op group). Demographic data including vital signs, Glasgow coma scale (GCS), arterial blood gas and injury severity scores (ISS) were recorded. Heart rate complexity (HRC) methods, including entropy, were used to analyze the ECG. Results: There were no differences in age, gender, heart rate (HR) or blood pressure between the S and non-S groups. However, approximate entropy, used as a method of HRC measurement, and GCS were significantly higher in the S group compared to the non-S group. The base deficit and ISS were significantly higher in the non-S group. Regarding age, sex, ISS, base deficit, vital signs and GCS, no difference was found between the Op and non-Op groups. Approximate entropy was significantly lower in the Op group compared to the non-Op group. Conclusions: The loss of HRC at the onset of admission may predict mortality in patients with blunt trauma. Lower entropy in recently admitted patients with intra-abdominal bleeding may indicate laparotomy when the vital signs are stable.
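
    A compact reference implementation of approximate entropy, the HRC measure named above, is sketched below; the template length m = 2 and tolerance r = 0.2 times the standard deviation are common conventions and may differ from the study's settings.

```python
import numpy as np

def apen(x, m=2, r=None):
    """Approximate entropy (Pincus): ApEn = Phi(m) - Phi(m+1)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()                 # common tolerance choice
    def phi(m):
        # embed the series into overlapping m-length templates
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between all pairs of templates
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (dist <= r).mean(axis=1)      # self-matches included, as defined
        return np.log(c).mean()
    return phi(m) - phi(m + 1)
```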

  14. Complex bounds and microstructural recovery from measurements of sea ice permittivity

    Sea ice is a porous composite of pure ice with brine, air, and salt inclusions. The polar sea ice packs play a key role in the Earth's ocean-climate system, and they host robust algal and bacterial communities that support the Arctic and Antarctic ecosystems. Monitoring the sea ice packs on global or regional scales is an increasingly important problem, typically involving the interaction of an electromagnetic wave with sea ice. In the quasistatic regime where the wavelength is much longer than the composite microstructural scale, the electromagnetic behavior is characterized by the effective complex permittivity tensor ε*. In assessing the impact of climate change on the polar sea ice covers, current satellites and algorithms can predict ice extent, but the thickness distribution remains an elusive, yet most important feature. In recent years, electromagnetic induction devices using low frequency waves have been deployed on ships, helicopters and planes to obtain thickness data. Here we compare two sets of theoretical bounds to extensive outdoor tank and in situ field data on ε* at 50 MHz taken in the Arctic and Antarctic. The sea ice is assumed to be a two phase composite of ice and brine with known constituent permittivities. The first set of bounds assumes only knowledge of the brine volume fraction or porosity, and the second set further assumes statistical isotropy of the microstructure. We obtain excellent agreement between theory and experiment, and are able to observe the apparent violation of the isotropic bounds as the vertically oriented microstructure becomes increasingly connected for higher porosities. Moreover, these bounds are inverted to obtain estimates of the porosity from the measurements of ε*. We find that the temporal variations of the reconstructed porosity, which is directly related to temperature, closely follow the actual behavior.
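
    The simplest bounds of the kind referred to above, those assuming knowledge of the brine volume fraction only, are the elementary arithmetic and harmonic (Wiener) means; the sketch below evaluates them with illustrative constituent permittivities, not the paper's values.

```python
import numpy as np

EPS_ICE = 3.15 + 0.002j     # pure ice (illustrative; eps = eps' + i*eps'')
EPS_BRINE = 82.0 + 1000.0j  # brine near 50 MHz (illustrative, loss-dominated)

def wiener_bounds(p):
    """Arithmetic (parallel-slab) and harmonic (series-slab) means for
    brine volume fraction p; any attainable eps* lies between them."""
    a_mean = p * EPS_BRINE + (1 - p) * EPS_ICE
    h_mean = 1.0 / (p / EPS_BRINE + (1 - p) / EPS_ICE)
    return a_mean, h_mean

for p in (0.02, 0.05, 0.10):
    a_mean, h_mean = wiener_bounds(p)
    print(f"p = {p:.2f}:  arithmetic {a_mean:.1f}   harmonic {h_mean:.1f}")
```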

  15. Development and evaluation of aperture-based complexity metrics using film and EPID measurements of static MLC openings

    Götstedt, Julia [Department of Radiation Physics, University of Gothenburg, Göteborg 413 45 (Sweden); Karlsson Hauer, Anna; Bäck, Anna, E-mail: anna.back@vgregion.se [Department of Therapeutic Radiation Physics, Sahlgrenska University Hospital, Göteborg 413 45 (Sweden)

    2015-07-15

    Purpose: Complexity metrics have been suggested as a complement to measurement-based quality assurance for intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT). However, these metrics have not yet been sufficiently validated. This study develops and evaluates new aperture-based complexity metrics in the context of static multileaf collimator (MLC) openings and compares them to previously published metrics. Methods: This study develops the converted aperture metric and the edge area metric. The converted aperture metric is based on small and irregular parts within the MLC opening that are quantified as measured distances between MLC leaves. The edge area metric is based on the relative size of the region around the edges defined by the MLC. Another metric suggested in this study is the circumference/area ratio. Earlier defined aperture-based complexity metrics (the modulation complexity score, the edge metric, the ratio monitor units (MU)/Gy, the aperture area, and the aperture irregularity) are compared to the newly proposed metrics. A set of small and irregular static MLC openings is created which simulates individual IMRT/VMAT control points of various complexities. These are measured with both an amorphous silicon electronic portal imaging device and EBT3 film. The differences between calculated and measured dose distributions are evaluated using a pixel-by-pixel comparison with two global dose difference criteria of 3% and 5%. The extent of the dose differences, expressed in terms of pass rate, is used as a measure of the complexity of the MLC openings and used for the evaluation of the metrics compared in this study. The different complexity scores are calculated for each created static MLC opening. The correlation between the calculated complexity scores and the extent of the dose differences (pass rate) is analyzed in scatter plots and using Pearson's r-values. Results: The complexity scores calculated by the edge
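
    One of the metrics compared above, the circumference/area ratio, is easy to illustrate. The toy sketch below scores a narrow, elongated opening as more complex than a compact square one; real apertures are leaf-by-leaf polygons, so this rectangle version is only a schematic.

```python
def circumference_area_ratio(width_cm, height_cm):
    """Circumference/area score of a rectangular aperture (cm^-1)."""
    return 2 * (width_cm + height_cm) / (width_cm * height_cm)

print(circumference_area_ratio(0.5, 10.0))  # narrow slit: 4.2 (more complex)
print(circumference_area_ratio(5.0, 5.0))   # square field: 0.8 (less complex)
```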

  16. Validation of ASTER Surface Temperature Data with In Situ Measurements to Evaluate Heat Islands in Complex Urban Areas

    Bonggeun Song; Kyunghun Park

    2014-01-01

    This study compared Advanced Spaceborne Thermal Emission Reflection Radiometer (ASTER) surface temperature data with in situ measurements to validate the use of ASTER data for studying heat islands in urban settings with complex spatial characteristics. Eight sites in Changwon, Korea, were selected for analyses. Surface temperature data were extracted from the thermal infrared (TIR) band of ASTER on four dates during the summer and fall of 2012, and corresponding in situ measurements of tempe...

  17. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    Couach, O.; I. Balin; Jiménez, R.; Ristori, P.; Perego, S.; Kirchner, F.; Simeonov, V.; Calpini, B.; Van den Bergh, H.

    2003-01-01

    This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days, during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations and model calculations are in good agree...

  18. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    Couach, O.; Balin, I.; R. Jiménez; Ristori, P.; Perego, S.; Kirchner, F.; Simeonov, V.; B. Calpini; Bergh, H.

    2003-01-01

    This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days, during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations ...

  19. Measurement of the speed of sound by observation of the Mach cones in a complex plasma under microgravity conditions

    Zhukhovitskii, D I; Molotkov, V I; Lipaev, A M; Naumkin, V N; Thomas, H M; Ivlev, A V; Schwabe, M; Morfill, G E

    2014-01-01

    We report the first observation of the Mach cones excited by a larger microparticle (projectile) moving through a cloud of smaller microparticles (dust) in a complex plasma with neon as a buffer gas under microgravity conditions. A collective motion of the dust particles occurs as propagation of the contact discontinuity. The corresponding speed of sound was measured by a special method of the Mach cone visualization. The measurement results are fully incompatible with the theory of ion acoustic waves. We explore the analogy between a strongly coupled Coulomb system and a solid. A scaling law for the complex plasma makes it possible to derive a theoretical estimate for the speed of sound, which is in a reasonable agreement with the experiments in strongly coupled complex plasmas.

  20. Measurement of the speed of sound by observation of the Mach cones in a complex plasma under microgravity conditions

    We report the first observation of the Mach cones excited by a larger microparticle (projectile) moving through a cloud of smaller microparticles (dust) in a complex plasma with neon as a buffer gas under microgravity conditions. A collective motion of the dust particles occurs as propagation of the contact discontinuity. The corresponding speed of sound was measured by a special method of Mach cone visualization. The measurement results are incompatible with the theory of ion acoustic waves. The estimate for the pressure in a strongly coupled Coulomb system and a scaling law for the complex plasma make it possible to derive an estimate for the speed of sound, which is in reasonable agreement with the experiments in complex plasmas.

  1. Comment on 'Interpretation of the Lempel-Ziv Complexity Measure in the context of Biomedical Signal Analysis'

    Balasubramanian, Karthi

    2013-01-01

    In this Communication, we express our reservations about some aspects of the interpretation of the Lempel-Ziv complexity measure (LZ) by Mateo et al. in "Interpretation of the Lempel-Ziv complexity measure in the context of biomedical signal analysis," IEEE Trans. Biomed. Eng., vol. 53, no. 11, pp. 2282-2288, Nov. 2006. In particular, we comment on the dependence of the LZ complexity measure on the number of harmonics, frequency content and amplitude modulation. We disagree with the following statements made by Mateo et al.: 1. "LZ is not sensitive to the number of harmonics in periodic signals." 2. "LZ increases as the frequency of a sinusoid increases." 3. "Amplitude modulation of a signal does not result in an increase in LZ." We show the dependence of the LZ complexity measure on harmonics and amplitude modulation by using a modified version of the synthetic signal that was used in the original paper. Also, the second statement is a generic statement which is not entirely true. This is true only in the low freque...
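
    For concreteness, the sketch below is a standard LZ76 phrase-counting implementation together with a median binarisation, with which experiments on harmonics and amplitude modulation like those discussed can be reproduced; the binarisation rule is a common choice, not necessarily the one used in the Communication.

```python
import numpy as np

def lz76(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of string s."""
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        # extend the phrase while it is still a substring of what has been
        # seen before (overlap into the current phrase is allowed)
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

# binarise a test signal about its median before parsing
x = np.sin(2 * np.pi * 0.05 * np.arange(200)) + 0.1 * np.random.randn(200)
b = "".join("1" if v > np.median(x) else "0" for v in x)
print(lz76(b))
```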

  2. Carleson Measures for the Drury-Arveson Hardy space and other Besov-Sobolev spaces on Complex Balls

    Arcozzi, N.; Rochberg, R.; Sawyer, E

    2007-01-01

    We characterize the Carleson measures for the Drury-Arveson Hardy space and other Hilbert spaces of analytic functions of several complex variables. This provides sharp estimates for Drury's generalization of Von Neumann's inequality. The characterization is in terms of a geometric condition, the "split tree condition", which reflects the nonisotropic geometry underlying the Drury-Arveson Hardy space.

  3. Modeling and measuring Business/IT Alignment by using a complex-network approach

    Sousa, José Luís da Rocha

    2014-01-01

    Business/IT Alignment is a long-established information systems research field with a large number of researchers, and it represents a central line of thinking on the entanglement between business and information systems. It aims to achieve a paradigm in which there is a high degree of visibility and availability of information about the sociomateriality of information systems. Complex networks constitute an approach to the study of the emergent properties of complex sys...

  4. Methodological possibilities for using the electron and ion energy balance in thermospheric complex measurements

    A combination of ground-based measurements for the determination of basic thermospheric characteristics is proposed. An expression for the energy transport between components of space plasma is also derived and discussed within the framework of the presented methodology, which can be divided into the following major sections: 1) application of ionosonde and absorption measurements, and TEC measurements using Faraday rotation or the differential Doppler effect; 2) ground-based airglow measurements; 3) airglow and plasma satellite measurements. 9 refs

  5. Finalizing a measurement framework for the burden of treatment in complex patients with chronic conditions

    Eton DT

    2015-03-01

    % were coping with multiple chronic conditions. A preliminary conceptual framework using data from the first 32 interviews was evaluated and modified using narrative data from 18 additional interviews with a racially and socioeconomically diverse sample of patients. The final framework features three overarching themes with associated subthemes: 1) work patients must do to care for their health (eg, taking medications, keeping medical appointments, monitoring health); 2) challenges/stressors that exacerbate perceived burden (eg, financial, interpersonal, provider obstacles); and 3) impacts of burden (eg, role limitations, mental exhaustion). All themes and subthemes were subsequently confirmed in focus groups. Conclusion: The final conceptual framework can be used as a foundation for building a patient self-report measure to systematically study treatment burden for research and analytical purposes, as well as to promote meaningful clinic-based dialogue between patients and providers about the challenges inherent in maintaining complex self-management of health. Keywords: treatment burden, conceptual framework, adherence, questionnaire, self-management, multi-morbidity

  6. Complexity measures of the central respiratory networks during wakefulness and sleep

    Dragomir, Andrei; Akay, Yasemin; Curran, Aidan K.; Akay, Metin

    2008-06-01

    Since sleep is known to influence respiratory activity we studied whether the sleep state would affect the complexity value of the respiratory network output. Specifically, we tested the hypothesis that the complexity values of the diaphragm EMG (EMGdia) activity would be lower during REM compared to NREM. Furthermore, since REM is primarily generated by a homogeneous population of neurons in the medulla, the possibility that REM-related respiratory output would be less complex than that of the awake state was also considered. Additionally, in order to examine the influence of neuron vulnerabilities within the rostral ventral medulla (RVM) on the complexity of the respiratory network output, we inhibited respiratory neurons in the RVM by microdialysis of GABAA receptor agonist muscimol. Diaphragm EMG, nuchal EMG, EEG, EOG as well as other physiological signals (tracheal pressure, blood pressure and respiratory volume) were recorded from five unanesthetized chronically instrumented intact piglets (3-10 days old). Complexity of the diaphragm EMG (EMGdia) signal during wakefulness, NREM and REM was evaluated using the approximate entropy method (ApEn). ApEn values of the EMGdia during NREM and REM sleep were found significantly (p < 0.05 and p < 0.001, respectively) lower than those of awake EMGdia after muscimol inhibition. In the absence of muscimol, only the differences between REM and wakefulness ApEn values were found to be significantly different.

  7. Computation of complexity measures of morphologically significant zones decomposed from binary fractal sets via multiscale convexity analysis

    Multiscale convexity analysis of certain fractal binary objects, such as the 8-segment Koch quadric, the Koch triadic, and random Koch quadric and triadic islands, is performed via (i) morphologic openings with respect to a recursively changing template size, and (ii) construction of convex hulls through half-plane closings. Based on the scale vs convexity measure relationship, transition levels between the morphologic regimes are determined as crossover scales. These crossover scales are taken as the basis to segment binary fractal objects into various morphologically prominent zones. Each segmented zone is characterized through normalized morphologic complexity measures. Despite the fact that there is no notably significant relationship between the zone-wise complexity measures and the fractal dimensions computed by the conventional box counting method, fractal objects, whether generated deterministically or by introducing randomness, possess morphologically significant sub-zones with varied degrees of spatial complexity. Classification of realistic fractal sets and/or fields according to sub-zones possessing varied degrees of spatial complexity provides insight for exploring links with the physical processes involved in the formation of fractal-like phenomena.
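
    The scale-vs-convexity idea can be prototyped with standard image morphology: the sketch below opens a binary object with disks of growing radius and reports the surviving area fraction, where sharp drops in the profile suggest crossover scales. The disk template and area-fraction measure are illustrative choices, not the paper's exact operators, and `koch_island` in the usage comment is a hypothetical input image.

```python
import numpy as np
from scipy import ndimage

def disk(r):
    """Boolean disk-shaped structuring element of radius r."""
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    return (x * x + y * y) <= r * r

def opening_profile(img, radii):
    """Surviving area fraction of a binary object after morphologic
    opening with disks of growing radius."""
    img = img.astype(bool)
    area = img.sum()
    return [float(ndimage.binary_opening(img, structure=disk(r)).sum()) / area
            for r in radii]

# usage with a hypothetical binary image `koch_island`:
# profile = opening_profile(koch_island, radii=range(1, 16))
```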

  8. Developing palaeolimnological records of organic content (DOC and POC) using the UK Acid Water Monitoring Network sites

    Russell, Fiona; Chiverrell, Richard; Boyle, John

    2016-04-01

    Monitoring programmes have shown increases in concentrations of dissolved organic matter (DOM) in the surface waters of northern and central Europe (Monteith et al. 2007), and negative impacts of the browning of river waters have been reported for fish populations (Jonsson et al. 2012; Ranaker et al. 2012) and for ecosystem services such as water treatment (Tuvendal and Elmqvist 2011). Still, the exact causes of the recent browning remain uncertain, the main contenders being climate change (Evans et al. 2005) and reduced ionic strength in surface water resulting from declines in anthropogenic sulphur and sea salt deposition (Monteith et al. 2007). There is a need to better understand the pattern, drivers and trajectory of these increases in DOC and POC in both recent and longer-term (Holocene) contexts to improve the understanding of carbon cycling within lakes and their catchments. In Britain there are some ideal sites for testing whether these trends are preserved and for developing methods for reconstructing organic fluxes from lake sedimentary archives. There is a suite of lakes distributed across the country, the UK Acid Waters Monitoring Network (UKAWMN) sites, which have been monitored monthly for dissolved organic carbon and other aqueous species since 1988. These 12 lakes have well-studied recent and in some cases whole-Holocene sediment records. Here four of those lakes (Grannoch, Chon, Scoat Tarn and Cwm Mynach) are revisited, with sampling focused on the sediment-water interface and very recent sediments (approx. 150 years). At Scoat Tarn (approx. 1000 years) and Llyn Mynach (11.5k years) longer records have been obtained to assess equivalent patterns through the Holocene. Analyses of the gravity cores have focused on measuring and characterising the organic content for comparison with recorded surface water DOC measurements (UKAWMN). Data from pyrolysis measurements (TGA/DSC) in an N2 atmosphere show that the mass loss between 330-415°C correlates well with

  9. Constraint of the limited information content of discharge measurements on the benefits of rating curve models with increased complexities

    Van Eerdenbrugh, Katrien; Verhoest, Niko; De Mulder, Tom

    2015-04-01

    Discharge assessment through rating curves is a widespread technique in the field of hydrologic monitoring. In practical applications, this technique often consists of the use of one or multiple power laws, based on rather stringent assumptions concerning the nature of the prevailing flow conditions. In reality, those assumptions are regularly violated, inducing considerable uncertainties in the calculated discharges. It is thus important to estimate the effect of those simplifications when performing an overall uncertainty analysis of rating curve discharges. In this study, different rating curve formulations are compared based on both results of a hydrodynamic model and measured water levels and discharges. The results of a hydrodynamic model are used to justify the applicability of several rating curve models with increased complexity as an alternative for a single power law equation. With the hydrodynamic model, situations are simulated that correspond to steady state conditions and to minimal effect of downstream boundaries. Comparison of simulations results with those of measurement-driven simulations leads to an increased understanding of the rating curve dynamics. It allows for evaluation of rating curve formulations accounting for the influences of hysteresis and backwater effects which are neglected in power law rating curves. Subsequently, the performance of those rating curve models and the identifiability of their parameters are assessed based on available stage-discharge measurements and their accompanying uncertainties as described in literature. This assessment is performed based on the Generalised Likelihood Uncertainty Estimation (GLUE). Rejection criteria to distinguish behavioural from non-behavioural models are defined by uncertainties on both water levels and discharge measurements that envelop measured data points. The results of the hydrodynamic model clearly indicate benefits of adding complexity to the rating curve model, mainly by
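
    The baseline model discussed above, a single power law Q = a(h - h0)^b, can be fitted to stage-discharge gaugings in a few lines; the gauging data and starting values below are illustrative only, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, a, h0, b):
    """Single power-law rating curve Q = a*(h - h0)**b."""
    return a * np.clip(h - h0, 1e-9, None) ** b  # clip keeps the base positive

# illustrative steady-state gaugings: stage (m) and discharge (m3/s)
h_obs = np.array([0.4, 0.6, 0.9, 1.3, 1.8, 2.4])
q_obs = np.array([0.8, 2.1, 5.6, 12.8, 25.0, 44.0])

(a, h0, b), _ = curve_fit(rating_curve, h_obs, q_obs, p0=[10.0, 0.2, 1.7])
print(f"Q = {a:.2f} * (h - {h0:.2f})^{b:.2f}")
```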

  10. Amines are likely to enhance neutral and ion-induced sulfuric acid-water nucleation in the atmosphere more effectively than ammonia

    T. Kurtén

    2008-07-01

    We have studied the structure and formation thermodynamics of dimer clusters containing H2SO4 or HSO4- together with ammonia and seven different amines possibly present in the atmosphere, using the high-level ab initio methods RI-MP2 and RI-CC2. As expected from e.g. proton affinity data, the binding of all studied amine-H2SO4 complexes is significantly stronger than that of NH3•H2SO4, while most amine-HSO4- complexes are only somewhat more strongly bound than NH3•HSO4-. Further calculations on larger cluster structures containing dimethylamine or ammonia together with two H2SO4 molecules or one H2SO4 molecule and one HSO4- ion demonstrate that amines, unlike ammonia, significantly assist the growth of not only neutral but also ionic clusters along the H2SO4 co-ordinate. A sensitivity analysis indicates that the difference in complexation free energies for amine- and ammonia-containing clusters is large enough to overcome the mass-balance effect caused by the fact that the concentration of amines in the atmosphere is probably 2 or 3 orders of magnitude lower than that of ammonia. This implies that amines might be more important than ammonia in enhancing neutral and especially ion-induced sulfuric acid-water nucleation in the atmosphere.

  11. Amines are likely to enhance neutral and ion-induced sulfuric acid-water nucleation in the atmosphere more effectively than ammonia

    T. Kurtén

    2008-04-01

    We have studied the structure and formation thermodynamics of dimer clusters containing H2SO4 or HSO4- together with ammonia and seven different amines possibly present in the atmosphere, using the high-level ab initio methods RI-MP2 and RI-CC2. As expected from e.g. proton affinity data, the binding of all studied amine-H2SO4 complexes is significantly stronger than that of NH3•H2SO4, while most amine-HSO4- complexes are only somewhat more strongly bound than NH3•HSO4-. Further calculations on larger cluster structures containing dimethylamine or ammonia together with two H2SO4 molecules or one H2SO4 molecule and one HSO4- ion demonstrate that amines, unlike ammonia, significantly assist the growth of not only neutral but also ionic clusters along the H2SO4 co-ordinate. A sensitivity analysis indicates that the difference in complexation free energies for amine- and ammonia-containing clusters is large enough to overcome the mass-balance effect caused by the fact that the concentration of amines in the atmosphere is probably 2 or 3 orders of magnitude lower than that of ammonia. This implies that amines might be more important than ammonia in enhancing neutral and especially ion-induced sulfuric acid-water nucleation in the atmosphere.

  12. Force and complexity of tongue task training influences behavioral measures of motor learning

    Kothari, Mohit; Svensson, Peter; Huo, Xueliang;

    2012-01-01

    Relearning of motor skills is important in neurorehabilitation. We investigated the improvement of training success during simple tongue protrusion (two force levels) and a more complex tongue-training paradigm using the Tongue Drive System (TDS). We also compared subject-based reports of fun, pa...

  13. Measuring Conceptual Complexity: A Content-Analytic Model Using the Federal Income Tax Laws.

    Karlinsky, Stewart S.; Andrews, J. Douglas

    1986-01-01

    Concludes that more than 15 percent of the federal income tax law's complexity is attributable to the capital gains sections. Confirms the idea that the capital gain and loss provisions substantially complicate the law in both absolute and relative terms. (FL)

  14. On tight separation for Blum measures applied to Turing machine buffer complexity

    Šíma, Jiří; Žák, Stanislav

    -, submitted February 4 2015 (2016) R&D Projects: GA ČR GBP202/12/G061; GA ČR GAP202/10/1333 Institutional support: RVO:67985807 Keywords : Turing machine * hierarchy * buffer complexity * diagonalization Subject RIV: IN - Informatics, Computer Science

  15. Change ΔS of the entropy in natural time under time reversal: Complexity measures upon change of scale

    Sarlis, N. V.; Christopoulos, S.-R. G.; Bemplidaki, M. M.

    2015-01-01

    The entropy S in natural time as well as the entropy in natural time under time reversal S- have already found useful applications in the physics of complex systems, e.g., in the analysis of electrocardiograms (ECGs). Here, we focus on the complexity measures Λl which result upon considering how the statistics of the time series ΔS (≡ S - S-) changes upon varying the scale l. These scale-specific measures are ratios of the standard deviations σ(ΔS_l) and hence independent of the mean value and the standard deviation of the data. They focus on the different dynamics that appear on different scales. For this reason, they can be considered complementary to other standard measures of heart rate variability in ECG, like SDNN, as well as other complexity measures already defined in natural time. An application to the analysis of ECG, when solely using NN intervals, is presented: we show how Λl can be used to separate ECGs of healthy individuals from those suffering from congestive heart failure and sudden cardiac death.
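
    For reference, the entropy in natural time follows the standard definition S = <χ ln χ> - <χ> ln<χ>, with χ_k = k/N and weights p_k = Q_k / ΣQ, and S- is obtained from the time-reversed series; the sketch below uses the NN intervals themselves as the "energies" Q_k, which is one common convention and an assumption here.

```python
import numpy as np

def natural_time_entropy(q):
    """S = <chi ln chi> - <chi> ln<chi>, chi_k = k/N, p_k = Q_k / sum(Q)."""
    q = np.asarray(q, dtype=float)
    n = len(q)
    chi = np.arange(1, n + 1) / n    # natural time
    p = q / q.sum()                  # normalised "energies"
    mean_chi = (p * chi).sum()
    return (p * chi * np.log(chi)).sum() - mean_chi * np.log(mean_chi)

def delta_s(q):
    """Change of the entropy under time reversal, Delta S = S - S-."""
    return natural_time_entropy(q) - natural_time_entropy(q[::-1])
```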

  16. Instrumentation Suite for Acoustic Propagation Measurements in Complex Shallow Water Environments

    Federal Laboratory Consortium — FUNCTION: Obtain at-sea measurements to test theoretical and modeling predictions of acoustic propagation in dynamic, inhomogeneous, and nonisotropic shallow water...

  17. Measuring social complexity and the emergence of cooperation from entropic principles

    López-Corona, O; Huerta, A; Mustri-Trejo, D; Perez, K; Ruiz, A; Valdés, O; Zamudio, F

    2015-01-01

    Assessing quantitatively the state and dynamics of a social system is a very difficult problem. It is of great importance for both practical and theoretical reasons, such as establishing the efficiency of social action programs, detecting possible community needs or allocating resources. In this paper we propose a new general theoretical framework for the study of social complexity, based on the relation of complexity and entropy in combination with evolutionary dynamics to assess the dynamics of the system. Imposing the second law of thermodynamics, we study the conditions under which cooperation emerges and demonstrate that it depends on the relative importance of local and global fitness. As cooperation is a central concept in sustainability, this thermodynamic-informational approach allows new insights and means to assess it using the concept of Helmholtz free energy. Finally we introduce a new set of equations that consider the more general case where the social system changes both in time and space, and relate ...

  18. High-precision optical measuring instruments and their application as part of mobile diagnostic complexes

    Igor Miroshnichenko

    2014-01-01

    The article presents results of applying laser technologies and optical interferometry methods to information registration in the quality control and diagnostics of construction materials and load-bearing elements by acoustic non-destructive testing. It also describes new technical solutions that allow the results obtained to be applied to practical diagnostics of products in operation, as part of mobile diagnostic complexes.

  19. Estimation of Defect Proneness Using Design Complexity Measurements in Object-Oriented Software

    Selvarani, R.; Nair, T R GopalaKrishnan; Prasad, V. Kamakshi

    2010-01-01

    Software engineering is continuously facing the challenges of the growing complexity of software packages and the increased level of data on defects and drawbacks from the software production process. This makes a clarion call for inventions and methods which can enable more reusable, reliable, easily maintainable, and high-quality software systems with deeper control over the software generation process. Quality and productivity are indeed the two most important parameters for controlling any industrial proc...

  1. Multiscale Cross-Approximate Entropy Analysis as a Measure of Complexity among the Aged and Diabetic

    Hsien-Tsai Wu; Cyuan-Cin Liu; Men-Tzung Lo; Po-Chun Hsu; An-Bang Liu; Kai-Yu Chang; Chieh-Ju Tang

    2013-01-01

    Complex fluctuations within physiological signals can be used to evaluate the health of the human body. This study recruited four groups of subjects: young healthy subjects (Group 1, n = 32), healthy upper middle-aged subjects (Group 2, n = 36), subjects with well-controlled type 2 diabetes (Group 3, n = 31), and subjects with poorly controlled type 2 diabetes (Group 4, n = 24). Data acquisition for each participant lasted 30 minutes. We obtained data related to consecutive time series with R...
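    Although this abstract is cut off, the method named in the title has two standard ingredients: coarse-graining across scales and cross-approximate entropy between two standardized series. A hedged Python sketch of those ingredients; the embedding dimension m, tolerance r, and toy data are common defaults, not the study's settings:

```python
import numpy as np

def coarse_grain(x, scale):
    # multiscale step: average consecutive non-overlapping blocks
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def cross_apen(u, v, m=2, r=0.2):
    """Cross-approximate entropy between two standardized series."""
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()
    def phi(m):
        n = min(len(u), len(v)) - m + 1
        U = np.array([u[i:i + m] for i in range(n)])
        V = np.array([v[i:i + m] for i in range(n)])
        # Chebyshev distance between every template pair
        d = np.abs(U[:, None, :] - V[None, :, :]).max(axis=2)
        c = np.maximum((d <= r).mean(axis=1), 1e-12)  # guard log(0)
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

def multiscale_cross_apen(u, v, scales=(1, 2, 4)):
    return {s: cross_apen(coarse_grain(u, s), coarse_grain(v, s))
            for s in scales}

rng = np.random.default_rng(1)
print(multiscale_cross_apen(rng.standard_normal(800), rng.standard_normal(800)))
```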

  2. Complex Permittivity and Permeability Measurements and Numerical Simulation of carbonyl iron rubber in X-Band frequency

    Adriano Luiz de Paula

    2014-06-01

    Recognizing the importance of an adequate characterization of radar absorbing materials (RAM), and consequently their development, the present study aims to contribute to the establishment and validation of experimental determination and numerical simulation of the complex permittivity and permeability of electromagnetic materials, using for this a carbonyl iron rubber with seventy percent mass concentration of carbonyl iron. The present work branches out into two related topics. The first one is concerned with the implementation of a computational model to predict the behavior of electromagnetic materials in a confined environment by using three-dimensional electromagnetic simulation. The second topic re-examines the Nicolson-Ross-Weir mathematical model to retrieve the constitutive parameters (complex permittivity and permeability) of a homogeneous sample (carbonyl iron) from scattering coefficient measurements. The measured and calculated results show good agreement, which supports the application of the methodologies used for the characterization of carbonyl iron rubber in the X-band frequency range.
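    For reference, the core of the Nicolson-Ross-Weir retrieval can be written compactly. The sketch below is a minimal Python version for a TEM transmission line using the principal logarithm branch only (so it assumes the sample is thinner than one wavelength in the material); function names and the root-selection detail are illustrative, not taken from the paper.

```python
import numpy as np

C0 = 299792458.0  # speed of light, m/s

def nrw(s11, s21, f, d):
    """Nicolson-Ross-Weir retrieval of complex permittivity and
    permeability from S-parameters (TEM line, sample thickness d in m,
    frequency f in Hz). Principal log branch only, so the sample must
    be thinner than one wavelength inside the material."""
    s11, s21, f = map(np.asarray, (s11, s21, f))
    X = (s11**2 - s21**2 + 1.0) / (2.0 * s11)
    gamma = X + np.sqrt(X**2 - 1.0 + 0j)
    # pick the physical reflection-coefficient root with |gamma| <= 1
    swap = np.abs(gamma) > 1.0
    gamma = np.where(swap, X - np.sqrt(X**2 - 1.0 + 0j), gamma)
    T = (s11 + s21 - gamma) / (1.0 - (s11 + s21) * gamma)
    inv_lambda = np.log(1.0 / T) / (2j * np.pi * d)  # 1/Lambda in the sample
    lam0 = C0 / f                                    # free-space wavelength
    mu_r = lam0 * inv_lambda * (1.0 + gamma) / (1.0 - gamma)
    eps_r = lam0**2 * inv_lambda**2 / mu_r
    return eps_r, mu_r
```

    As a sanity check, an empty line (s11 close to 0, s21 close to a pure delay) returns eps_r and mu_r close to 1.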

  3. Update of a footprint-based approach for the characterisation of complex measurement sites

    Goeckede, M.; Markkanen, T.; Hasager, C.B.;

    2006-01-01

    ... uncertainty concerning the sources or sinks influencing a measurement compromises the data interpretation. The consideration of the spatial context of a measurement, defined by a footprint analysis, can therefore provide an important tool for data quality assessment. This study presents an update of an...

  4. An improved technique for the measurement of the complex susceptibility of magnetic colloids in the microwave region

    FANNIN, PAUL; COUPER, COLIN STUART

    2010-01-01

    Measurements by means of the short-circuit (S/C) and open-circuit (O/C) transmission line techniques are well-established methods for investigating the magnetic and dielectric properties of magnetic colloids, respectively. In particular, the S/C technique has been used in the investigation of the resonant properties of ferrofluids; resonance being indicated by the transition of the real component of the magnetic complex susceptibility, χ(ω) = χ′(ω) − iχ″(ω), from a positive to a nega...

  5. Complexity-based measures inform tai chi’s impact on standing postural control in older adults with peripheral neuropathy

    Manor, Bradley David; Lipsitz, Lewis Arnold; Wayne, Peter Michael; Peng, Chung-Kang; Li, Li

    2013-01-01

    Background: Tai Chi training enhances physical function and may reduce falls in older adults with and without balance disorders, yet its effect on postural control as quantified by the magnitude or speed of center-of-pressure (COP) excursions beneath the feet is less clear. We hypothesized that COP metrics derived from complex systems theory may better capture the multi-component stimulus that Tai Chi has on the postural control system, as compared with traditional COP measures. Methods: We p...

  6. Measurement and correlation of phase equilibrium data of the mixtures consisting of butyric acid, water, cyclohexanone at different temperatures

    Highlights: ► Liquid-phase equilibria of the (water + butyric acid + cyclohexanone) system were investigated. ► Experimental LLE data were correlated with the NRTL and UNIQUAC models. ► Distribution coefficients and separation factors were evaluated. - Abstract: In this work, experimental solubility and tie-line data for the (water + butyric acid + cyclohexanone) system were obtained at T = (298.2, 308.2, and 318.2) K and atmospheric pressure. The ternary system investigated exhibits type-1 LLE behavior. The experimental tie-line data were compared with those correlated by the UNIQUAC and NRTL models. The consistency of the experimental tie-line data was verified using the Othmer–Tobias and Hand correlation equations. Distribution coefficients and separation factors were evaluated over the immiscibility regions. A comparison of the extracting capability of the solvent at different temperatures was made with respect to separation factors. The Katritzky and Kamlet–Abboud–Taft multiparameter scales were applied to correlate distribution coefficients and separation factors in this ternary system. The LSER model values were interpreted in terms of intermolecular interactions.

  7. Validation of ASTER Surface Temperature Data with In Situ Measurements to Evaluate Heat Islands in Complex Urban Areas

    Bonggeun Song

    2014-01-01

    This study compared Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) surface temperature data with in situ measurements to validate the use of ASTER data for studying heat islands in urban settings with complex spatial characteristics. Eight sites in Changwon, Korea, were selected for analyses. Surface temperature data were extracted from the thermal infrared (TIR) band of ASTER on four dates during the summer and fall of 2012, and corresponding in situ measurements of temperature were also collected. Comparisons showed that ASTER-derived temperatures were generally 4.27°C lower than temperatures collected by in situ measurements during the daytime, except on cloudy days. However, ASTER temperatures were higher by 2.23-2.69°C on two dates during the nighttime. Temperature differences between a city park and a paved area were insignificant. Differences between ASTER-derived temperatures and onsite measurements are caused by a variety of factors, including the application of emissivity values that do not consider the complex spatial characteristics of urban areas. Therefore, to improve the accuracy of surface temperatures extracted from infrared satellite imagery, we propose a revised model whereby temperature data are obtained from ASTER and emissivity values for various land covers are extracted based on in situ measurements.

  8. Application of Image Measurement and Continuum Mechanics to the Direct Measurement of Two-Dimensional Finite Strain in a Complex Fibro-Porous Material

    Britton, Paul; Loughran, Jeff

    This paper outlines a computational procedure that has been implemented for the direct measurement of finite material strains from digital images taken of a material surface during plane-strain process experiments. The selection of both hardware and software components of the image processing system is presented, and the numerical procedures developed for measuring the 2D material deformations are described. The algorithms are presented with respect to two-roll milling of sugar cane bagasse, a complex fibro-porous material that undergoes large strains during processing to extract the sucrose-rich liquid. Elaborations are made in regard to numerical developments for other forms of experimentation, algorithm calibrations and measurement improvements. Finite 2D strain results are shown for both confined uniaxial compression and two-roll milling experiments.

  9. Measuring and using admixture to study the genetics of complex diseases

    Halder Indrani

    2003-11-01

    Admixture is an important evolutionary force that can and should be used in efforts to apply genomic data and technology to the study of complex disease genetics. Admixture linkage disequilibrium (ALD) is created by the process of admixture and, in recently admixed populations, extends for substantial distances (of the order of 10 to 20 cM). The amount of ALD generated depends on the level of admixture, the ancestry information content of markers, and the admixture dynamics of the population, and thus influences admixture mapping (AM). The authors discuss different models of admixture and how these can have an impact on the success of AM studies. Selection of markers is important, since markers informative for parental population ancestry are required and these are uncommon. Rarely does the process of admixture result in a population that is uniform for individual admixture levels; instead there is substantial population stratification. This stratification can be understood as variation in individual admixtures and can be both a source of statistical power for ancestry-phenotype correlation studies as well as a confounder causing false positives in gene association studies. Methods to detect and control for stratification in case/control and AM studies are reviewed, along with recent studies showing individual ancestry-phenotype correlations. Using skin pigmentation as a model phenotype, implications of AM in complex disease gene mapping studies are discussed. Finally, the article discusses some limitations of this approach that should be considered when designing an effective AM study.

  10. Noise exposure assessment with task-based measurement in complex noise environment

    LI Nan; YANG Qiu-ling; ZENG Lin; ZHU Liang-liang; TAO Li-yuan; ZHANG Hua; ZHAO Yi-ming

    2011-01-01

    Background: Task-based measurement (TBM) is a method for assessing the eight-hour A-weighted equivalent noise exposure level (LAeq,8h) besides dosimetry. TBM can be more easily used in factories by non-professional workers and staff. However, it is still not clear whether TBM is equivalent or similar to dosimetry for LAeq,8h measurement in general. This study considered the dosimeter measurement as the real personal noise exposure level (PNEL) and assessed the accuracy of TBM by comparing the consistency of TBM and dosimeter LAeq,8h measurements. Methods: The study was conducted in one automobile firm among 387 workers exposed to unstable noise. Dosimeters and TBM were used to compare the two strategies and assess the degree of agreement and causes of disagreement. Each worker's PNEL was estimated via TBM for noise; the real PNEL was also recorded. The TBM for noise was computed from task/position noise levels measured with a sound level meter and workers' exposure information collected via working diary forms (WDF) filled in by the participants themselves. Full-shift noise exposure measurements via personal noise dosimeters were taken as the real PNEL. A general linear model (GLM) was built to analyze the accuracy of TBM for noise and the sources of difference between TBM for noise and the real PNEL. Results: The LAeq,8h with TBM were slightly higher than the real PNELs, except for the electricians. Differences between the two values were statistically significant in stamping workers (P <0.001), assembly workers (P=0.015) and welding workers (P=0.001). The correlation coefficient of LAeq,8h with TBM and real PNELs was 0.841. Differences between the two results were mainly affected by real PNEL (F=11.27, P=0.001); work groups (F=3.11, P <0.001) divided by jobs and workshops were also independent factors. The PNEL of workers with a fixed task/position ((86.53±8.82) dB(A)) was higher than that of those without ((75.76±9.92) dB(A)) (t=8.84, P <0.01). Whether workers had a fixed task/position was another factor on the
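    For reference, the task-based LAeq,8h is conventionally an energy average of task levels weighted by the time spent on each task. A minimal sketch of that standard formula; the example levels and durations are invented, not data from the study:

```python
import math

def laeq_8h(levels_db, durations_h, t0=8.0):
    """Eight-hour A-weighted equivalent level from per-task levels
    (dB(A)) and the hours spent on each task (energy average)."""
    energy = sum(t * 10 ** (l / 10.0) for l, t in zip(levels_db, durations_h))
    return 10.0 * math.log10(energy / t0)

# e.g. 4 h at 88 dB(A), 2 h at 80 dB(A), 2 h of office work at 60 dB(A)
print(round(laeq_8h([88, 80, 60], [4, 2, 2]), 1))  # ~85.3 dB(A)
```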

  11. Characterization of known protein complexes using k-connectivity and other topological measures [version 2; referees: 1 approved, 2 approved with reservations]

    Suzanne R Gallagher

    2015-10-01

    Many protein complexes are densely packed, so proteins within complexes often interact with several other proteins in the complex. Steric constraints prevent most proteins from simultaneously binding more than a handful of other proteins, regardless of the number of proteins in the complex. Because of this, as complex size increases, several measures of the complex decrease within protein-protein interaction networks. However, k-connectivity, the number of vertices or edges that need to be removed in order to disconnect a graph, may be consistently high for protein complexes. The property of k-connectivity has been little used previously in the investigation of protein-protein interactions. To understand the discriminative power of k-connectivity and other topological measures for identifying unknown protein complexes, we characterized these properties in known Saccharomyces cerevisiae protein complexes in networks generated both from highly accurate X-ray crystallography experiments, which give an accurate model of each complex, and as the complexes appear in high-throughput yeast two-hybrid studies in which new complexes may be discovered. We also computed these properties for appropriate random subgraphs. We found that clustering coefficient, mutual clustering coefficient, and k-connectivity are better indicators of known protein complexes than edge density, degree, or betweenness. This suggests new directions for future protein complex-finding algorithms.
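    All of the graph measures compared here are available in standard graph libraries, which makes the comparison easy to reproduce on toy data. A small illustrative sketch using networkx; the toy "complex" and the function name are assumptions:

```python
import networkx as nx

def complex_topology_stats(g):
    """A few of the topological measures discussed in the paper,
    computed for one candidate-complex subgraph."""
    return {
        "edge_density": nx.density(g),
        "avg_clustering": nx.average_clustering(g),
        "k_connectivity": nx.node_connectivity(g),  # min vertices to disconnect
    }

# toy 'complex': a 5-clique missing one edge (still 3-connected)
g = nx.complete_graph(5)
g.remove_edge(0, 1)
print(complex_topology_stats(g))
```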

  12. Solution NMR Experiment for Measurement of (15)N-(1)H Residual Dipolar Couplings in Large Proteins and Supramolecular Complexes.

    Eletsky, Alexander; Pulavarti, Surya V S R K; Beaumont, Victor; Gollnick, Paul; Szyperski, Thomas

    2015-09-01

    NMR residual dipolar couplings (RDCs) are exquisite probes of protein structure and dynamics. A new solution NMR experiment named 2D SE2 J-TROSY is presented to measure N-H RDCs for proteins and supramolecular complexes in excess of 200 kDa. This enables validation and refinement of their X-ray crystal and solution NMR structures and the characterization of structural and dynamic changes occurring upon complex formation. Accurate N-H RDCs were measured at 750 MHz (1)H resonance frequency for the 11-mer, 93 kDa (2)H,(15)N-labeled Trp RNA-binding attenuator protein tumbling with a correlation time τc of 120 ns. This is about twice as long as that of the most slowly tumbling system for which N-H RDCs could be measured so far, and corresponds to molecular weights of ∼200 kDa at 25 °C. Furthermore, due to the robustness of SE2 J-TROSY with respect to residual (1)H density from exchangeable protons, increased sensitivity at (1)H resonance frequencies around 1 GHz promises to enable N-H RDC measurement for even larger systems. PMID:26293598

  13. The Eye-Key Span as a Measure for Translation Complexity

    Carl, Michael; Schaeffer, Moritz

    Dragsted (Dragsted & Hansen, 2008; Dragsted, 2010) developed the eye-key span (EKS) in reference to the ear-voice span, which is used to describe the distance between input and output during simultaneous interpreting, typically measured in words or seconds (e.g. Defrancq, 2015). The EKS during... spans than easy words. The difficulty of the words is described in terms of the number of alternative translations different translators produced for the same source text words. Easy words were translated the same way by all translators and difficult words were translated differently by nearly all... more monolingual. Traditional eye movement measures cannot adequately describe the processes which are unique to the task of translation. The EKS and the degree to which ST reading and TT typing co-occur are measures that address this shortcoming...

  14. ELF field in the proximity of complex power line configuration measurement procedures

    The issue of how to measure magnetic induction fields generated by various power line configurations, when there are several power lines that run across the same exposure area, has become a matter of interest and study within the Regional Environment Protection Agency of Friuli Venezia Giulia. In classifying the various power line typologies the definition of double circuit line was given: in this instance the magnetic field is determined by knowing the electrical and geometric parameters of the line. In the case of independent lines instead, the field is undetermined. It is therefore pointed out how, in the latter case, extracting projected information from a set of measurements of the magnetic field alone is impossible. Making measurements throughout the territory of service has in several cases offered the opportunity to define standard operational procedures. (authors)

  15. Measurement of the high-frequency complex permittivity and conductivity of magnetic fluids

    Fannin, P. C.; Charles, S. W.; Vincent, D.; Giannitsis, A. T.

    2002-11-01

    Measurements of the permittivity, ε(ω) = ε′(ω) − iε″(ω), and conductivity, σ(ω) = ωε0ε″(ω), of two ferrofluid samples of ferrite particles in a hydrocarbon carrier, Isopar M, over the frequency range 0.1-6 GHz are presented. It is shown that the sample with the higher concentration of particles has the higher permittivity, and by means of profile fitting it is demonstrated, for the first time, that σ(ω) has a Debye-type profile. The frequency limitation of the measurement technique used is highlighted, and a possible alternative technique capable of measuring in the 20-30 GHz region is presented.

  16. Investigation of scandium nitrate-nitric acid-water-tributylphosphate system. 1. Scandium extraction isotherms

    The distribution of scandium between 100% TBP and 0.2-1.2 mol/l scandium in 2.7-12.3 mol/l nitric acid is investigated. It is shown that Sc extracts in undiluted TBP equilibrated with 6-12 mol/l HNO3 contain approximately one to five molecules per scandium atom in the extracted Sc complex. 14 refs.; 2 figs

  17. Measurement of net electric charge and dipole moment of dust aggregates in a complex plasma

    Yousefi, Razieh; Carmona-Reyes, Jorge; Matthews, Lorin S; Hyde, Truell W

    2014-01-01

    Understanding the agglomeration of dust particles in complex plasmas requires a knowledge of the basic properties such as the net electrostatic charge and dipole moment of the dust. In this study, dust aggregates are formed from gold coated mono-disperse spherical melamine-formaldehyde monomers in a radio-frequency (rf) argon discharge plasma. The behavior of observed dust aggregates is analyzed both by studying the particle trajectories and by employing computer models examining 3D structures of aggregates and their interactions and rotations as induced by torques arising from their dipole moments. These allow the basic characteristics of the dust aggregates, such as the electrostatic charge and dipole moment, to be determined. It is shown that the experimental results support the predicted values from computer models for aggregates in these environments.

  18. Measurement of microbial alpha-amylases with p-nitrophenyl glycosides as the substrate complex.

    Trepeta, R W; Edberg, S C

    1984-01-01

    The detection of alpha-amylase is commonly used in clinical microbiology laboratories to aid in differentiating Streptococcus bovis from other streptococci. It is also useful in identifying Eikenella corrodens and the gravis subspecies of Corynebacterium diphtheriae and in separating species of the genera Bacteroides, Clostridium, Actinomyces, and Bacillus. Currently, the most frequently used procedure utilizes starch as the substrate and iodine as the indicator. Starch is incorporated into an agar medium, the isolate is inoculated on the surface, and the medium is incubated for 24 to 48 h. A 15-min test containing p-nitrophenyl polyglycosides as the substrate complex was developed to yield results comparable with those of the agar-based starch test. The reagent was made in liquid form, 0.20 ml per tube, and could be incubated either in ambient air or at 35 degrees C. When dried, the p-nitrophenyl polyglycoside reagent could be stored at 0 degrees C for 4 weeks. PMID:6418764

  19. Method for reconstruction of complex surface shapes from a reflection-based non-null interferometric measurement

    Micali, Jason D.; Greivenkamp, John E.

    2016-03-01

    Complex surface forms are becoming increasingly prevalent in optical designs, requiring advances in manufacturing and surface metrology to maintain the state of the art. Non-null interferometry extends the range of standard interferometers to test complex shapes without the need for complicated and expensive compensating elements. However, non-null measurements will accumulate significant retrace errors, or interferometer-induced errors, which can be difficult to isolate from surface figure errors. Methods discussed in the literature to correct for retrace errors in a reflection-based interferometer are computationally intensive and limited in spatial resolution. A method is presented for reconstructing complex surface shapes in a reflection-based non-null interferometer configuration, which is computationally efficient, easy to implement, and can produce high spatial resolution surface reconstructions. The method is verified against simulated surfaces that contain more than 200 μm of surface departure from a null configuration. Examples are provided to demonstrate the effects of measurement noise and interferometer model uncertainties, as well as an experimental validation of the method.

  20. Areal Measurements of Ozone, Water, and Heat Fluxes Over Land With Different Surface Complexity, Using Aircraft

    Contemporary models addressing issues of air quality and/or atmospheric deposition continue to exploit air-surface exchange formulations originating from single-tower studies. In reality, these expressions describe situations that are rare in the real world - nearly flat and spatially homogeneous. There have been several theoretical suggestions about how to extend from single-point understanding to areal descriptions, but so far the capability to address the problem experimentally has been limited. In recent years, however, developments in sensing technology have permitted adaptation of eddy-correlation methods to low-flying aircraft in a far more cost-effective manner than previously. A series of field experiments has been conducted, ranging from flat farmland to rolling countryside, employing a recently modified research aircraft operated by the US National Oceanic and Atmospheric Administration (NOAA). The results demonstrate the complexity of the spatial heterogeneity question, especially for pollutants (ozone in particular). In general, the uncertainty associated with the adoption of any single-point formulation when describing areal averages is likely to be in the range 10% to 40%. In the case of sensible and latent heat fluxes, the overall behavior is controlled by the amount of energy available. For pollutant deposition, there is no constraint equivalent to the net radiation limitation on convective heat exchange. Consequently, dry deposition rates and air-surface exchange of trace gases in general are especially vulnerable to errors in spatial extrapolation. The results indicate that the susceptibility of dry deposition formulations to terrain complexity depends on the deposition velocity itself. For readily transferred pollutants (such as HNO3), a factor of two error could be involved

  1. Characterisation of Complex Electrode Processes using Simultaneous Impedance Spectroscopy and Electrochemical Nanogravimetric Measurements

    Berkes, B. B.; Huang, M.; Henry, J. B.; Kokoschka, Malte; Bandarenka, A. S.

    2014-01-01

    Vol. 79, No. 3 (2014), pp. 348-358. ISSN 2192-6506. Institutional support: RVO:61388963. Keywords: electrochemistry; impedance spectroscopy; monolayers; nanogravimetric measurements; self-assembly. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 2.997, year: 2014

  2. Microwave generation and complex microwave responsivity measurements on small Dayem bridges

    Pedersen, Niels Falsig; Sørensen, O; Mygind, Jesper;

    1977-01-01

    Measurements of the active properties of a Dayem micro-bridge at X-band frequencies are described. The bridge was mounted in a microwave cavity designed to match the bridge properly, and the microwave output from the cavity was detected using a sensitive X-band spectrometer. Microwave power...

  3. Measures of Causality in Complex Datasets with Application to Financial Data

    Anna Zaremba

    2014-04-01

    This article investigates the causality structure of financial time series. We concentrate on three main approaches to measuring causality: linear Granger causality, kernel generalisations of Granger causality (based on ridge regression and the Hilbert–Schmidt norm of the cross-covariance operator), and transfer entropy, examining each method and comparing their theoretical properties, with special attention given to the ability to capture nonlinear causality. We also present the theoretical benefits of applying non-symmetrical measures rather than symmetrical measures of dependence. We apply the measures to a range of simulated and real data. The simulated data sets were generated with linear and several types of nonlinear dependence, using bivariate as well as multivariate settings. An application to real-world financial data highlights the practical difficulties, as well as the potential of the methods. We use two real data sets: (1) U.S. inflation and one-month Libor; (2) S&P data and exchange rates for the following currencies: AUDJPY, CADJPY, NZDJPY, AUDCHF, CADCHF, NZDCHF. Overall, we reach the conclusion that no single method can be recognised as the best in all circumstances, and each of the methods has its domain of best applicability. We also highlight areas for improvement and future research.
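    Among the three approaches, linear Granger causality is the most standard and is available off the shelf. The sketch below builds a toy pair of series in which x linearly drives y and runs the statsmodels test; the lag order, coefficients, and seed are arbitrary illustrations, not choices from the paper.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 500
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):           # y is linearly driven by lagged x
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

# statsmodels convention: H0 is 'second column does not Granger-cause first'
data = np.column_stack([y, x])
res = grangercausalitytests(data, maxlag=2, verbose=False)
print(res[1][0]["ssr_ftest"])   # (F-stat, p-value, df_denom, df_num) at lag 1
```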

  4. Multi-sensor data fusion for measurement of complex freeform surfaces

    Ren, M. J.; Liu, M. Y.; Cheung, C. F.; Yin, Y. H.

    2016-01-01

    Along with the rapid development of science and technology in fields such as space optics, multi-scale enriched freeform surfaces are widely used to enhance the performance of optical systems in both functionality and size reduction. Multi-sensor technology is considered one of the most promising methods to measure and characterize these surfaces at multiple scales. This paper presents a multi-sensor data fusion based measurement method to purposely extract the geometric information of the components at different scales, which is used to establish a holistic geometry of the surface via data fusion. To address the key problems of multi-sensor data fusion, an intrinsic feature pattern based surface registration method is developed to transform the measured datasets to a common coordinate frame. A Gaussian zero-order regression filter is then used to separate each measured dataset into different scales, and the datasets are fused based on an edge intensity data fusion algorithm within the same wavelength. The fused data at different scales are then merged to form a new surface with holistic multiscale information. An experimental study is presented to verify the effectiveness of the proposed method.
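    As a toy illustration of the scale-separation idea (not the paper's actual filter implementation), a Gaussian smoothing pass splits a measured profile into a large-scale form and a small-scale residual texture. The cut-off and the synthetic profile below are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def split_scales(profile, sigma):
    """Separate a measured height profile into a smooth (large-scale)
    form and the residual small-scale texture."""
    form = gaussian_filter1d(profile, sigma=sigma, mode="nearest")
    return form, profile - form

x = np.linspace(0.0, 1.0, 2000)
surface = np.sin(2 * np.pi * x) + 0.01 * np.sin(200 * np.pi * x)  # form + texture
form, texture = split_scales(surface, sigma=50)
```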

  5. Building out a Measurement Model to Incorporate Complexities of Testing in the Language Domain

    Wilson, Mark; Moore, Stephen

    2011-01-01

    This paper provides a summary of a novel and integrated way to think about the item response models (most often used in measurement applications in social science areas such as psychology, education, and especially testing of various kinds) from the viewpoint of the statistical theory of generalized linear and nonlinear mixed models. In addition,…

  6. Online measurement of mental representations of complex spatial decision problems : Comparison of CNET and hard laddering

    O. Horeni (Oliver); T.A. Arentze (Theo); B.G.C. Dellaert (Benedict); H.J.P. Timmermans (Harry)

    2013-01-01

    This paper introduces the online Causal Network Elicitation Technique (CNET) as a technique for measuring components of mental representations of choice tasks and compares it with the more common technique of online 'hard' laddering (HL). While CNET works in basically two phases, one in

  7. Measuring component of integrated information technology design and manufacturing of products with complex geometric configuration

    О.О. Лещенко

    2007-02-01

    The use of measuring information technologies in modern production is analyzed. A part is inspected along three sections, and the deviations of the part in each section from its mathematical model are determined, using a concrete part as an example.

  8. Real time pressure-volume loops in mice using complex admittance: measurement and implications.

    Kottam, Anil T G; Porterfield, John; Raghavan, Karthik; Fernandez, Daniel; Feldman, Marc D; Valvano, Jonathan W; Pearce, John A

    2006-01-01

    Real-time left ventricular (LV) pressure-volume (P-V) loops have provided a framework for understanding cardiac mechanics in experimental animals and humans. Conductance measurements have been used for the past 25 years to generate an instantaneous LV volume signal. The standard conductance method yields a combination of blood and ventricular muscle conductance; however, only the blood signal is used to estimate LV volume. State-of-the-art techniques such as hypertonic saline injection and IVC occlusion determine only a single steady-state value of the parallel conductance of the cardiac muscle. This is inaccurate, since the cardiac muscle component should vary instantaneously throughout the cardiac cycle as the LV contracts and fills, because the distance from the catheter to the muscle changes. The capacitive nature of cardiac muscle can be used to identify its contribution to the combined conductance signal. This method, in contrast to existing techniques, yields an instantaneous estimate of the parallel admittance of cardiac muscle that can be used to correct the measurement in real time. The corrected signal consists of blood conductance alone. We present the results of real-time in vivo measurements of pressure-admittance and pressure-phase loops inside the murine left ventricle. We then use the magnitude and phase angle of the measured admittance to determine pressure-volume loops inside the LV on a beat-by-beat basis. These results may be used to achieve a substantial improvement in the state of the art in this measurement method by eliminating the need for hypertonic saline injection. PMID:17946238
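    The correction step described here has a compact algebraic core: if blood is treated as purely conductive and myocardium as having a known conductivity-to-permittivity ratio sigma/eps, the muscle term can be estimated from the imaginary (capacitive) part of the measured admittance and subtracted sample by sample. A hedged sketch under those assumptions, with all numbers illustrative rather than the authors' calibration:

```python
import numpy as np

def blood_conductance(Y, omega, sigma_over_eps):
    """Remove the myocardial component from a measured complex
    admittance Y = G_blood + G_muscle + j*omega*C_muscle, assuming
    blood is purely conductive and muscle has a known sigma/eps ratio."""
    c_muscle = Y.imag / omega             # muscle capacitance from the phase
    g_muscle = sigma_over_eps * c_muscle  # muscle conductance tied to it
    return Y.real - g_muscle              # instantaneous blood conductance

# illustrative numbers only: 20 kHz drive, assumed sigma/eps for muscle
omega = 2 * np.pi * 20e3
Y_measured = 1.5e-3 + 1j * 2.0e-6         # siemens
print(blood_conductance(Y_measured, omega, sigma_over_eps=8e5))
```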

  9. Effect of ions on sulfuric acid-water binary particle formation: 2. Experimental data and comparison with QC-normalized classical nucleation theory

    Duplissy, J.; Merikanto, J.; Franchin, A.; Tsagkogeorgas, G.; Kangasluoma, J.; Wimmer, D.; Vuollekoski, H.; Schobesberger, S.; Lehtipalo, K.; Flagan, R. C.; Brus, D.; Donahue, N. M.; Vehkamäki, H.; Almeida, J.; Amorim, A.; Barmet, P.; Bianchi, F.; Breitenlechner, M.; Dunne, E. M.; Guida, R.; Henschel, H.; Junninen, H.; Kirkby, J.; Kürten, A.; Kupc, A.; Määttänen, A.; Makhmutov, V.; Mathot, S.; Nieminen, T.; Onnela, A.; Praplan, A. P.; Riccobono, F.; Rondo, L.; Steiner, G.; Tome, A.; Walther, H.; Baltensperger, U.; Carslaw, K. S.; Dommen, J.; Hansel, A.; Petäjä, T.; Sipilä, M.; Stratmann, F.; Vrtala, A.; Wagner, P. E.; Worsnop, D. R.; Curtius, J.; Kulmala, M.

    2016-02-01

    We report comprehensive, demonstrably contaminant-free measurements of binary particle formation rates by sulfuric acid and water for neutral and ion-induced pathways, conducted in the European Organization for Nuclear Research Cosmics Leaving Outdoor Droplets chamber. The recently developed Atmospheric Pressure interface-time of flight-mass spectrometer was used to detect contaminants in charged clusters and to identify runs free of any contaminants. Four parameters were varied to cover ambient conditions: sulfuric acid concentration (10^5 to 10^9 molecules cm^-3), relative humidity (11% to 58%), temperature (207 K to 299 K), and total ion concentration (0 to 6800 ions cm^-3). Formation rates were directly measured with novel instruments at sizes close to the critical cluster size (mobility size of 1.3 nm to 3.2 nm). We compare our results with predictions from Classical Nucleation Theory normalized by Quantum Chemical calculation (QC-normalized CNT), which is described in a companion paper. The formation rates predicted by the QC-normalized CNT were extended from critical cluster sizes to measured sizes using the UHMA2 sectional particle microphysics model. Our results show, for the first time, good agreement between predicted and measured particle formation rates for the binary (neutral and ion-induced) sulfuric acid-water system. Formation rates increase with RH, sulfuric acid, and ion concentrations and decrease with temperature at fixed RH and sulfuric acid concentration. Under atmospheric conditions, neutral particle formation dominates at low temperatures, while ion-induced particle formation dominates at higher temperatures. The good agreement between the theory and our comprehensive data set gives confidence in using the QC-normalized CNT as a powerful tool to study neutral and ion-induced binary particle formation in atmospheric modeling.

  10. Measuring The Impact Of Innovations On Efficiency In Complex Hospital Settings

    Bonća Petra Došenović

    2015-12-01

    In this paper the authors propose an approach for measuring the impact of innovations on hospital efficiency. The suggested methodology can be applied to any type of innovation, including technology-based innovations as well as consumer-focused and business-model innovations. The authors apply the proposed approach to measure the impact of transcanalicular diode laser-assisted dacryocystorhinostomy (DCR), an innovation introduced in the surgical procedure for treating a tear duct blockage, on the efficiency of general hospitals in Slovenia. They demonstrate that the impact of an innovation on hospital efficiency depends not only on the features of the studied innovation but also on the characteristics of the hospitals adopting the innovation and their external environment, represented by a set of comparable hospitals.

  11. The challenging measurement of protein in complex biomass-derived samples

    Haven, M.O.; Jørgensen, H.

    2014-01-01

    Measurement of the protein content in samples from production of lignocellulosic bioethanol is an important tool when studying the adsorption of cellulases. Several methods have been used for this, and after reviewing the literature, we concluded that one of the most promising assays for simple and fast protein measurement on this type of sample was the ninhydrin assay. This method has also been used widely for this purpose, but with two different methods for protein hydrolysis prior to the assay - alkaline or acidic hydrolysis. In samples containing glucose or ethanol, there was significant interference from these compounds when using acid hydrolysis, which was not the case when using alkaline hydrolysis. We evaluated the interference from glucose, cellulose, xylose, xylan, lignin and ethanol on protein determination of BSA, Accellerase® 1500 and Cellic® CTec2. The experiments demonstrated...

  12. Serum, urinary, and salivary nitric oxide in rheumatoid arthritis: complexities of interpreting nitric oxide measures

    Weinberg, J. Brice; Lang, Thomas; Wilkinson, William E.; Pisetsky, David S.; St Clair, E. William

    2006-01-01

    Nitric oxide (NO) may play important roles in rheumatoid arthritis (RA). RA is an inflammatory disease involving joints and other systems including salivary glands. To assess NO production in RA patients, we compared levels of serum, urine, and salivary nitrite and nitrate (NOx) in patients with RA and normal subjects, and we examined the relationships of these measures to disease activity. Serum, urine, and NOx levels as well as renal creatinine, NOx clearance and fractional excretion rates ...

  13. Presentation of a quality organisation pattern for the design, manufacturing and implementation of complex measuring chains

    The EFMT (Experience Feedback, Measures-Tests) branch of the EDF Research and Development Division designs and installs instrumentation systems for power generation sites. These systems include either testing (thermal and mechanical operation surveys) or process control instruments. The context in which instrumentation is developed and used has varied greatly during the past few years from both a technical and an organisational viewpoint. An instrumentation system consists of a set of measuring chains associated with communication supports and acquisition software; the technical fields involved are highly diversified: measurement, field buses, computing, data processing... Customers now include quality requirements in their specifications and often make reference to standards of the EN 29000 series. The EFMT branch has defined a quality approach applicable to the instrumentation field which aims at ensuring technical success (namely, attaining the expected characteristics) and meeting customers' quality requirements. This approach, based upon project management techniques, defines the design, implementation, and operating process phases. It emphasizes a global approach to instrumentation while promoting communication between the partners in a project. This paper presents the whole approach and underlines its critical phases: users' requirements, testing, and acceptance procedures. (authors). 5 refs., 2 figs., 1 tab

  14. Interaction of Cucurbit(5)uril with U(VI) in formic acid water medium

    Cucurbit(n)urils (CBn) are a new class of macrocyclic cage compounds capable of binding organic and inorganic species, owing to their unique pumpkin-like structure comprising both a hydrophobic cavity and hydrophilic portals. Complexation of U(VI) with cucurbit(5)uril (CB5) in 50 wt% formic acid medium has been studied by UV-Vis spectroscopy. In order to understand the species formed, the interaction of formic acid with CB5 was studied by monitoring the fluorescence of CB5. Formic acid was found to form a 1:1 species with an interaction constant (K) of 17.4 M-1. (author)

  15. Determining Wind Turbine Gearbox Model Complexity Using Measurement Validation and Cost Comparison: Preprint

    LaCava, W.; Xing, Y.; Guo, Y.; Moan, T.

    2012-04-01

    The Gearbox Reliability Collaborative (GRC) has conducted extensive field and dynamometer test campaigns on two heavily instrumented wind turbine gearboxes. In this paper, data from the planetary stage is used to evaluate the accuracy and computation time of numerical models of the gearbox. First, planet-bearing load and motion data is analyzed to characterize planetary stage behavior in different environments and to derive requirements for gearbox models and life calculations. Second, a set of models is constructed that represent different levels of fidelity. Simulations of the test conditions are compared to the test data, and the computational costs of the models are compared. The test data suggest that the planet-bearing life calculations should be made separately for each bearing on a row due to unequal load distribution. They also show that tilting of the gear axes is related to planet load share. The modeling study concluded that fully flexible models were needed to predict planet-bearing loading in some cases, although less complex models were able to achieve good correlation in the field-loading case. Significant differences in planet load share were found in simulation and were dependent on the scope of the model and the bearing stiffness model used.

  16. Using Complexity Metrics With R-R Intervals and BPM Heart Rate Measures

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian;

    2013-01-01

    ... on variability of the data, different choices regarding the kind of measures can have a substantial impact on the results. In this article we compare linear and non-linear statistics on two prominent types of heart beat data, beat-to-beat intervals (R-R interval) and beats-per-minute (BPM). As a proof... dynamics, but their power to do so critically depends on the type of data that is employed: while R-R intervals are very susceptible to nonlinear analyses, the success of nonlinear methods for BPM data critically depends on their construction. Generally, 'oversampled' BPM time-series can be recommended...
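    The two data types are related by a simple reciprocal, but a BPM series is usually resampled onto an even time grid, which is exactly where its "construction" matters. A minimal sketch of one common construction; the sampling rate and linear interpolation are illustrative choices, not the paper's procedure:

```python
import numpy as np

def rr_to_bpm(rr_s, fs=4.0):
    """Convert beat-to-beat intervals (seconds) to an evenly sampled
    instantaneous heart-rate series in beats per minute."""
    beat_times = np.cumsum(rr_s)        # time of each beat
    bpm = 60.0 / rr_s                   # instantaneous rate at each beat
    t = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    return t, np.interp(t, beat_times, bpm)

rr = np.array([0.80, 0.82, 0.79, 0.85, 0.90, 0.88])
t, bpm = rr_to_bpm(rr)
```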

  17. The direct and indirect measurement of boundary stress and drag on individual and complex arrays of elements

    Tinoco, Rafael O.; Cowen, Edwin A.

    2013-04-01

    Motivated by the study of drag on plant canopies, a novel non-intrusive drag measurement device was developed—its design, calibration, and validation are presented. The device is based on isolating a region of a test facility, a section of the bed of an open channel flume in the present case, from the facility itself. The drag plate, sufficiently large to allow for spatial averaging over multiple elements, is constrained to move on essentially frictionless rails in the direction of flow, and the force applied to the plate by the interaction of objects on the plate with the flow is monitored. In contrast to force balances used in wind tunnels, our design allows for easy mounting of multiple elements on different configurations, it holds large vertical loads with negligible effect to the horizontal forces measured, does not require intrusive frames to hold the elements within the flow, all of its components are externally located at the bottom of the flume, providing immediate access for adjustments, and the mounted load cell is easily interchangeable to increase the measurement dynamic range without system modifications. The measurement of two canonical, well-studied cases is used to validate the drag plate approach: drag induced by a turbulent boundary layer and the drag on a rigid cylinder. A third series of experiments, flow through arrays of rigid cylinders, is presented to show the applicability of the drag plate on more complex flows. The experimental results confirm the drag plate approach to be suitable for the accurate direct measurement of drag on simple and complex arrays of objects, which makes it ideal for studies of vegetated flows, natural rough boundary layers, coastal structures, and urban canopies, just to name a few possibilities.

  18. Complex Measurement System for Enhancement of Capability for Marine Engines Diagnostics

    Adam Charchalis

    2013-09-01

    Modern machine exploitation, due to the high level of structural complexity of the machines, requires a proper level of supervision. Such supervision is generally based on detection of pre-failure states and evaluation of the condition of machines' single elements or components. In the frame of the development of the research capacity of the Mechanical Faculty of the Maritime Academy in Gdynia, an Exploitation Decision Aid System for marine engine exploitation has been developed. The system was based on an existing test bed with the marine diesel engine Sulzer AL 25/30 as its core element. Modernization of the measurement equipment significantly extended the research capacity, which resulted in improved quality, extended scope, and acceleration of research and development works in the domain of safety of exploitation and diagnostics of marine power plants. Investments in modern measurement apparatus also enable an extension of the range of research and expertise related to engine failures and pollutant emissions, in relation to a broad spectrum of implemented fuels. The goal has been achieved by modernization of the engine's monitoring system and the stands in the Technical Diagnostics Laboratory.

  19. Complexities of particulate matter measurement in parenteral formulations of small-molecule amphiphilic drugs.

    Hickey, Magali B; Waggener, Sara; Gole, Dilip; Jimidar, Ilias; Vermeersch, Hans; Ratanabanangkoon, Poe; Tinke, Arjen P; Almarsson, Örn

    2011-03-01

    Reconstituted parenteral solutions of three surface-active anti-infective small-molecule drugs and solutions of sodium dodecyl sulfate (SDS, a model surfactant) were studied to quantify the impact of sample preparation and handling on particle counts. Turbidimetry and light obscuration profiles were recorded as a function of agitation and shearing, with and without the introduction of foam into the solutions. SDS solutions at concentrations above the critical micelle concentration (CMC) show significantly greater sensitivity to shear and foam presence than SDS solutions below the CMC: counts of >10 μm particles increased 8-fold over control (an unsheared sample) in the micellar solution vs. a 4-fold particle count increase over control at a sub-micellar concentration. An even more significant increase in the ratio of particle counts in sheared/unsheared solutions is seen for >25 μm unit counts, due to the increased interference of foam with the measurement. Two commercial products, injection formulations of teicoplanin and cefotaxime sodium, as well as an investigational compound 1, showed an increase in scattering as a function of foam production. The impact of foaming was significant, resulting in an increase of turbidity and light obscuration measurements in all solutions. The results illustrate some of the challenges that are inherent to optically clear, homogeneous pharmaceutical injections containing compounds which have a tendency toward self-association and surfactant-like behavior. PMID:21234824

  20. Circular dichroism measured on single chlorosomal light-harvesting complexes of green photosynthetic bacteria

    Furumaki, Shu

    2012-12-06

    We report results on circular dichroism (CD) measured on single immobilized chlorosomes of a triple mutant of the green sulfur bacterium Chlorobaculum tepidum. The CD signal is measured by monitoring chlorosomal bacteriochlorophyll c fluorescence excited by alternate left and right circularly polarized laser light with a fixed wavelength of 733 nm. The excitation wavelength is close to a maximum of the negative CD signal of a bulk solution of the same chlorosomes. The average CD dissymmetry parameter obtained from an ensemble of individual chlorosomes was gs = -0.025, with an intrinsic standard deviation (due to variations between individual chlorosomes) of 0.006. The dissymmetry value is about 2.5 times larger than that obtained at the same wavelength in the bulk solution. The difference can be satisfactorily explained by taking into account the orientation factor in the single-chlorosome experiments. The observed distribution of the dissymmetry parameter reflects the well-ordered nature of the mutant chlorosomes. © 2012 American Chemical Society.
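    For orientation, the dissymmetry parameter in fluorescence-detected CD is conventionally the normalized difference of emission intensity under left- and right-circularly polarized excitation. A one-function sketch; the intensity values are invented to reproduce g = -0.025:

```python
def dissymmetry(i_left, i_right):
    """g = 2 (I_L - I_R) / (I_L + I_R); a negative g means excitation
    with right-circular light is slightly more efficient."""
    return 2.0 * (i_left - i_right) / (i_left + i_right)

print(dissymmetry(0.9875, 1.0125))  # -> -0.025
```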

  1. PREPARATION OF XYLOSE AND KRAFT PULP FROM POPLAR BASED ON FORMIC/ACETIC ACID /WATER SYSTEM HYDROLYSIS

    Junping Zhuang

    2009-08-01

    A formic/acetic acid/water system was used in the ratios of 30:60:10, 20:60:20, and 30:50:20 separately for efficient hydrolysis and bioconversion of poplar chips, under a solid/liquid ratio of 1:12 (g/ml), at 105 °C for 30, 45, 60, 75, and 90 min, respectively. The highest yield of 69.89% was obtained at a formic/acetic acid/water ratio of 30:50:20 (v/v/v), with solid/liquid in the ratio of 1:12 (g/ml), at 105 °C for 90 min. A lower kappa number and similar yield were achieved when hydrolytic residual woodchips were used for kraft pulping with over 2% Na2O and a temperature 5 °C lower compared to untreated chips. Pulps from prehydrolysis-treated chips were easy to beat. However, the tensile index, tear index, and burst index of the handsheets obtained from the lowest-kappa-number pulp from prehydrolysis-treated poplar chips were lower than those of the pulp from untreated chips. Considerable xylose could be obtained from the prehydrolysis stage followed by kraft pulping under the same conditions for prehydrolysis-treated and untreated chips. Moreover, by building on the mature kraft pulping and xylitol processes, large amounts of xylose from the hemicellulose were obtained in prehydrolysis, allowing production of high-value products via biorefinery pathways. An economical balance of chemical dosage, energy consumption, pulp properties, and xylose value for prehydrolysis with organic acid should be reached with further investigation.

  2. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    O. Couach

    2003-01-01

    This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations and model calculations are in good agreement with atmospheric measurements obtained with an instrumented aircraft (METAIR). Ozone fluxes were calculated using lidar measurements of ozone vertical concentration profiles and the horizontal wind speeds measured with a radar Doppler wind profiler (DEGREANE). The ozone flux patterns indicate that the diurnal cycle of ozone production is controlled by local thermal winds. The convective PBL maximum height was some 2700 m above the land surface, while the nighttime residual ozone layer was generally found between 1200 and 2200 m. Finally, we evaluate the magnitude of the ozone processes at different altitudes in order to estimate the photochemical ozone production due to the primary pollutant emissions of Grenoble city and the regional network of automobile traffic.

  3. Quantification of Soil Pore Network Complexity with X-ray Computed Tomography and Gas Transport Measurements

    Katuwal, Sheela; Arthur, Emmanuel; Tuller, Markus;

    2015-01-01

    ... different soils subjected to 22 mo of field regeneration were quantified with X-ray computed tomography (CT) and compared with functional pore characteristics estimated from measurements of air permeability and gas diffusivity. Furthermore, predictive models for air permeability and gas diffusivity were developed based on CT-derived structural parameters and compared with previously proposed predictive models. Strong correlations between functional and pore geometry parameters were observed. The consideration of CT-derived air-filled porosity, pore network tortuosity and connectivity, and minimum equivalent pore diameter in predictive gas diffusivity and air permeability models significantly improved their performance. The obtained results suggest that the application of X-ray CT-derived pore-structural parameters has great potential for predicting gas diffusivity and air permeability.

  4. Analysis of full scale measurements for the investigation of the turbulence structure acting on a rotor disk over complex terrain

    Glinou, G.L.; Morfiadakis, E.E.; Koulouvari, M.J. [Centre for Renewable Energy Sources, Wind Energy Dept., Pikermi (Greece)

    1996-12-31

    In the framework of the MOUNTURB project, contract no. JOU2-CT93-0378, co-funded by the European Union, full-scale measurements have been carried out at CRES's test site, situated in complex terrain, for the investigation of the wind and turbulence structure acting on the rotor disk of a wind turbine. The analysis of the deterministic characteristics shows evidence of terrain-induced effects on both the longitudinal and the vertical velocity component. The analysis of the stochastic characteristics of the wind field suggests non-isotropic turbulence decreasing with height above ground level. The measured coherence exhibited the typical exponential decay with increasing turbulence frequency, and the decay rate increases with wind speed. The horizontal coherence is slightly higher than the vertical coherence. (Author)

  5. A complexity measure based method for studying the dependence of 222Rn concentration time series on indoor air temperature and humidity

    Mihailovic, Dragutin T; Krmar, Miodrag; Arsenić, Ilija

    2013-01-01

    We have suggested a complexity measure based method for studying the dependence of measured 222Rn concentration time series on indoor air temperature and humidity. This method is based on the Kolmogorov complexity (KL). We have introduced (i) the sequence of KL values, (ii) the highest KL value in the sequence (KLM), and (iii) the KL of the product of time series. The observed loss of KLM complexity of the 222Rn concentration time series can be attributed to the indoor air humidity, which keeps the radon daughters in the air.
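    Kolmogorov complexity is uncomputable in general, so applied work approximates it; a common practical stand-in is the Lempel-Ziv (LZ76) complexity of the series binarized at its median. The sketch below shows that approximation (an assumption for illustration; the paper's exact estimator is not specified in this abstract):

```python
import numpy as np

def lempel_ziv_complexity(binary_seq):
    """Number of distinct phrases in the LZ76 parsing, a standard
    practical stand-in for the Kolmogorov complexity of a sequence."""
    s = "".join(map(str, binary_seq))
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the current phrase while it already occurred earlier
        while i + l <= n and s[i:i + l] in s[: i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def kl_measure(x):
    """Normalized LZ complexity of a series binarized at its median
    (tends to ~1 for random sequences)."""
    b = (np.asarray(x) > np.median(x)).astype(int)
    n = len(b)
    return lempel_ziv_complexity(b) * np.log2(n) / n

rng = np.random.default_rng(0)
print(kl_measure(rng.standard_normal(4096)))
```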

  6. A new parallel plate shear cell for in situ real-space measurements of complex fluids under shear flow

    Wu, Yu Ling; Brand, Joost H. J.; van Gemert, Josephus L. A.; Verkerk, Jaap; Wisman, Hans; van Blaaderen, Alfons; Imhof, Arnout

    2007-10-01

    We developed and tested a parallel plate shear cell that can be mounted on top of an inverted microscope to perform confocal real-space measurements on complex fluids under shear. To follow structural changes in time, a plane of zero velocity is created by letting the plates move in opposite directions. The location of this plane is varied by changing the relative velocities of the plates. The gap width is variable between 20 and 200 μm with parallelism better than 1 μm. Such a small gap width enables us to examine the total sample thickness using high numerical aperture objective lenses. The achieved shear rates cover the range 0.02-10^3 s^-1. This shear cell can apply an oscillatory shear with adjustable amplitude and frequency. The maximum travel of each plate equals 1 cm, so that strains up to 500 can be applied. For most complex fluids, an oscillatory shear with such a large amplitude can be regarded as continuous shear. We measured the flow profile of a suspension of silica colloids in this shear cell. It was linear except for a small deviation caused by sedimentation. To demonstrate the excellent performance and capabilities of this new setup we examined shear-induced crystallization and melting of concentrated suspensions of 1 μm diameter silica colloids.
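    The zero-velocity plane follows directly from the linear Couette profile between counter-moving plates. A small sketch of that kinematics; the plate speeds and gap are chosen for illustration:

```python
def shear_cell(v_top, v_bottom, gap):
    """Plates move in opposite directions: top at +v_top, bottom at
    -v_bottom, separated by `gap`. Returns the shear rate and the height
    of the zero-velocity (observation) plane above the bottom plate."""
    shear_rate = (v_top + v_bottom) / gap
    z0 = v_bottom / (v_top + v_bottom) * gap
    return shear_rate, z0

# e.g. 100 um gap, 3 mm/s and 1 mm/s plate speeds -> 40 1/s, plane at 25 um
print(shear_cell(3e-3, 1e-3, 100e-6))
```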

  7. Low Charge and Reduced Mobility of Membrane Protein Complexes Has Implications for Calibration of Collision Cross Section Measurements.

    Allison, Timothy M; Landreh, Michael; Benesch, Justin L P; Robinson, Carol V

    2016-06-01

    Ion mobility mass spectrometry of integral membrane proteins provides valuable insights into their architecture and stability. Here we show that, due to their lower charge, the average mobility of native-like membrane protein ions is approximately 30% lower than that of soluble proteins of similar mass. This has implications for drift time measurements, made on traveling wave ion mobility mass spectrometers, which have to be calibrated to extract collision cross sections (Ω). Common calibration strategies employ unfolded or native-like soluble protein standards with masses and mobilities comparable to those of the protein of interest. We compare Ω values for membrane proteins, derived from standard calibration protocols using soluble proteins, to values measured using an RF-confined drift tube. Our results demonstrate that, while common calibration methods underestimate Ω for native-like or unfolded membrane protein complexes, higher-mass soluble calibration standards consistently yield more accurate Ω values. These findings enable us to obtain structural information directly for highly charge-reduced complexes by traveling wave ion mobility mass spectrometry. PMID:27153188
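
    Traveling wave Ω calibration is conventionally a power-law fit between the corrected drift times of the calibrants and their charge- and reduced-mass-corrected literature cross sections. A hedged sketch with hypothetical calibrant numbers (the correction steps themselves are omitted):

        import numpy as np

        # Hypothetical calibrant values: corrected drift times (ms) and
        # charge/reduced-mass-corrected literature cross sections (A^2).
        td_corr = np.array([3.2, 4.1, 5.6, 7.3])
        ccs_corr = np.array([1850.0, 2300.0, 3100.0, 4050.0])

        # fit the usual power law ln(ccs') = ln(A) + B*ln(td')
        B, lnA = np.polyfit(np.log(td_corr), np.log(ccs_corr), 1)

        def calibrated_ccs(td_unknown_corr):
            # apply the fitted power law to an unknown's corrected drift time
            return np.exp(lnA) * td_unknown_corr ** B

        print(calibrated_ccs(6.0))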

  8. Reprint of The improvement of the energy resolution in epi-thermal neutron region of Bonner sphere using boric acid water solution moderator.

    Ueda, H; Tanaka, H; Sakurai, Y

    2015-12-01

    The Bonner sphere is useful for evaluating the neutron spectrum in detail. We are improving the energy resolution in the epi-thermal neutron region of the Bonner sphere, using boric acid water solution as the moderator. Its response function peak is narrower than that for a polyethylene moderator, and an improvement of the resolution is expected. The resolutions for the polyethylene moderator and the boric acid water solution moderator were compared by simulation calculation. The influence of uncertainty in the Bonner sphere configuration on spectrum estimation was also simulated. PMID:26508275

  9. The improvement of the energy resolution in epi-thermal neutron region of Bonner sphere using boric acid water solution moderator.

    Ueda, H; Tanaka, H; Sakurai, Y

    2015-10-01

    The Bonner sphere is useful for evaluating the neutron spectrum in detail. We are improving the energy resolution in the epi-thermal neutron region of the Bonner sphere, using boric acid water solution as the moderator. Its response function peak is narrower than that for a polyethylene moderator, and an improvement of the resolution is expected. The resolutions for the polyethylene moderator and the boric acid water solution moderator were compared by simulation calculation. The influence of uncertainty in the Bonner sphere configuration on spectrum estimation was also simulated. PMID:26133664

  10. Unstable work histories and fertility in France: An adaptation of sequence complexity measures to employment trajectories

    Daniel Ciganda

    2015-04-01

    Background: The emergence of new evidence suggesting a sign shift in the long-standing negative correlation between prosperity and fertility levels has sparked a renewed interest in understanding the relationship between economic conditions and fertility decisions. In this context, the notion of uncertainty has gained relevance in analyses of low fertility. So far, most studies have approached this notion using snapshot indicators such as type of contract or employment situation. However, these types of measures seem to be falling short in capturing what is intrinsically a dynamic process. Objective: Our first objective is to analyze to what extent employment trajectories have become less stable over time, and the second, to determine whether or not employment instability has an impact on the timing and quantum of fertility in France. Additionally, we present a new indicator of employment instability that takes into account both the frequency and duration of unemployment, with the objective of comparing its performance against other, more commonly used indicators of economic uncertainty. Methods: Our study combines exploratory (Sequence Analysis) with confirmatory (Event History, Logistic Regression) methods to understand the relationship between early life-course uncertainty and the timing and intensity of fertility. We use employment histories from the three available waves of the Etude des relations familiales et intergenerationnelles (ERFI), a panel survey carried out by INED and INSEE which constitutes the base of the Generations and Gender Survey (GGS) in France. Results: Although France is characterized by strong family policies and high and stable fertility levels, we find that employment instability not only has a strong and persistent negative effect on the final number of children for both men and women, but also contributes to fertility postponement in the case of men. Regarding the timing of the transition to motherhood, we show how
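
    The indicator described above combines the frequency and the duration of unemployment spells; the toy composite below, built from a monthly state sequence, illustrates the idea only and is not the paper's formula:

        def instability_index(states, unemployed='U'):
            # Toy composite of the frequency and duration of unemployment
            # in a monthly trajectory (one letter per month); an
            # illustration only, not the paper's indicator.
            n = len(states)
            spells = sum(1 for i, s in enumerate(states)
                         if s == unemployed
                         and (i == 0 or states[i - 1] != unemployed))
            duration_share = states.count(unemployed) / n
            return (spells / n) * duration_share ** 0.5

        print(instability_index('EEEEUUEEEEUEEE'))  # fragmented trajectory
        print(instability_index('EEEEEEEEEEUUUU'))  # one late spell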

  11. Assessing dry density and gravimetric water content of soils in geotechnics with complex conductivity measurements: preliminary investigations

    Kaouane, C.; Beck, Y.; Fauchard, C.; Chouteau, M.

    2012-12-01

    Quality control of geotechnical works requires gravimetric water content (w) and dry density (γd) measurements. Results are then compared to Proctor tests and referred to a soil classification. Depending on the class of soil, different objectives must be achieved. These measurements are usually carried out with neutron and gamma probes, whose combined use gives direct access to (w, γd). These probes have serious disadvantages: nuclear hazard, heavy on-site handling, transportation and storage restrictions, and small sampling volumes. Recent decades have seen a strong development of electrical and electromagnetic methods for mapping water content in soils. Still, their use in geotechnics is limited due to interfacial effects that are neglected in common models but strong in compacted soils. We first showed that (w, γd) is equivalent to (φ, Sr), assuming a particle density γs = 2.7 g.cm-3. This assumption holds for common soils used in civil engineering, and the relationship allows us to work with parameters meaningful to geophysicists. Revil & Florsch recently adapted the Vinegar & Waxman model for Spectral Induced Polarization (SIP) measurements at low frequencies. Samples were compacted at Proctor energy. We assessed (w, γd) by weighing and drying samples. We obtained γd = 1.6-1.9 g.cm-3 and w = 7-14%, which leads to φ = 0.3-0.4 and Sr = 0.3-0.8. Tap water (ρw = 30 Ω.m) was used for the experiment. We first evaluated the saturation factor n = 1.35 by fitting a power law ρ/ρw = a*Sr^n + b. The value a = 0.223 agreed with φ^(-n) = F, F being the formation factor. This leads to a mean tortuosity α = 1.47. The value b = 0.5 might be related to surface conductivity. An empirical Rhoades-Corwin model also fitted the data well. The Revil & Florsch model allows us to predict a phase peak in complex conductivity measurements. We predicted a frequency peak at 2.4 Hz. This peak is well located in the frequency range of SIP (from 1 mHz to ~10 Hz). At the frequency peak, this model allows the direct evaluation of saturation

  12. The importance of chemical buffering for pelagic and benthic colonization in acidic waters

    In poorly buffered areas acidification may occur for two reasons: through atmospheric deposition of acidifying substances and - in mining districts - through pyrite weathering. These different sources of acidity lead to clearly distinct geochemistry in lakes and rivers. In general, the geochemistry is the major determinant of the planktonic composition of the acidified water bodies, whereas the nutrient status mainly determines the level of biomass. A number of acidic mining lakes in Eastern Germany have to be neutralized to meet the water quality goals of the European Union Directives and to overcome the ecological degradation. This neutralization process is limnologically a short-term maturation of lakes, which permits biological succession to overcome two different geochemical buffer systems. First, the iron buffer system characterizes an initial state, when colonization starts: there is low organismic diversity and productivity, and clear net heterotrophy in most cases. Organic carbon that serves as fuel for the food web derives mainly from allochthonous sources. In the second, less acidic state aluminum is the buffer. This state is found exceptionally among the hard water mining lakes, often as a result of deposition of acidifying substances onto soft water systems. Colonization in aluminum-buffered lakes is more complex and controlled by the sensitivity of the organisms towards both protons and inorganic reactive aluminum species. In soft-water systems, calcium may act as an antidote against acid and aluminum; however, this function is lost in hard water post-mining lakes of similar proton concentrations. Nutrient limitations may occur, but these do not usually control qualitative and quantitative plankton composition. In these lakes, total pelagic biomass is controlled by the bioavailability of nutrients, particularly phosphorus.

  13. The importance of chemical buffering for pelagic and benthic colonization in acidic waters

    Nixdorf, B., E-mail: b.nixdorf@t-online.de; Lessmann, D. [Brandenburg University of Technology at Cottbus, Chair of Water Conservation, Faculty of Environmental Sciences (Germany); Steinberg, C. E. W. [Leibniz-Institute of Freshwater Ecology and Inland Fisheries (Germany)

    2003-01-15

    In poorly buffered areas acidification may occur for two reasons: through atmospheric deposition of acidifying substances and - in mining districts - through pyrite weathering. These different sources of acidity lead to clearly distinct geochemistry in lakes and rivers. In general, the geochemistry is the major determinant of the planktonic composition of the acidified water bodies, whereas the nutrient status mainly determines the level of biomass. A number of acidic mining lakes in Eastern Germany have to be neutralized to meet the water quality goals of the European Union Directives and to overcome the ecological degradation. This neutralization process is limnologically a short-term maturation of lakes, which permits biological succession to overcome two different geochemical buffer systems. First, the iron buffer system characterizes an initial state, when colonization starts: there is low organismic diversity and productivity, and clear net heterotrophy in most cases. Organic carbon that serves as fuel for the food web derives mainly from allochthonous sources. In the second, less acidic state aluminum is the buffer. This state is found exceptionally among the hard water mining lakes, often as a result of deposition of acidifying substances onto soft water systems. Colonization in aluminum-buffered lakes is more complex and controlled by the sensitivity of the organisms towards both protons and inorganic reactive aluminum species. In soft-water systems, calcium may act as an antidote against acid and aluminum; however, this function is lost in hard water post-mining lakes of similar proton concentrations. Nutrient limitations may occur, but these do not usually control qualitative and quantitative plankton composition. In these lakes, total pelagic biomass is controlled by the bioavailability of nutrients, particularly phosphorus.

  14. Modeling and measuring the nocturnal drainage flow in a high-elevation, subalpine forest with complex terrain

    Yi, C.; Monson, Russell K.; Zhai, Z.; Anderson, D.E.; Lamb, B.; Allwine, G.; Turnipseed, A.A.; Burns, Sean P.

    2005-01-01

    The nocturnal drainage flow of air causes significant uncertainty in ecosystem CO2, H2O, and energy budgets determined with the eddy covariance measurement approach. In this study, we examined the magnitude, nature, and dynamics of the nocturnal drainage flow in a subalpine forest ecosystem with complex terrain. We used an experimental approach involving four towers, each with vertical profiling of wind speed, to measure the magnitude of drainage flows and the dynamics of their occurrence. We developed an analytical drainage flow model, constrained with measurements of canopy structure and SF6 diffusion, to help us interpret the tower profile results. Model predictions were in good agreement with observed profiles of wind speed, leaf area density, and wind drag coefficient. Using theory, we showed that this one-dimensional model reduces to the widely used exponential wind profile model under conditions where vertical leaf area density and drag coefficient are uniformly distributed. We used the model for stability analysis, which predicted the presence of a very stable layer near the height of maximum leaf area density. This stable layer acts as a flow impediment, minimizing vertical dispersion between the subcanopy air space and the atmosphere above the canopy. The prediction is consistent with the results of SF6 diffusion observations that showed minimal vertical dispersion of nighttime, subcanopy drainage flows. The stable within-canopy air layer coincided with the height of maximum wake-to-shear production ratio. We concluded that nighttime drainage flows are restricted to a relatively shallow layer of air beneath the canopy, with little vertical mixing across a relatively long horizontal fetch. Insight into the horizontal and vertical structure of the drainage flow is crucial for understanding the magnitude and dynamics of the mean advective CO2 flux that becomes significant during stable nighttime conditions and is typically missed during measurement of the
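
    The exponential profile the model reduces to can be written u(z) = u_h*exp(alpha*(z/h - 1)); a small sketch, with an assumed attenuation coefficient rather than a value fitted in the study:

        import numpy as np

        def canopy_wind(u_h, z, h, alpha=2.5):
            # exponential within-canopy profile u(z) = u_h*exp(alpha*(z/h - 1));
            # alpha is an assumed attenuation coefficient, which in general
            # depends on canopy density
            return u_h * np.exp(alpha * (np.asarray(z, dtype=float) / h - 1.0))

        # 12 m canopy with 3 m/s at the canopy top
        print(canopy_wind(3.0, [2.0, 6.0, 12.0], 12.0))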

  15. Condensed Phase Membrane Introduction Mass Spectrometry with Direct Electron Ionization: On-line Measurement of PAHs in Complex Aqueous Samples

    Termopoli, Veronica; Famiglini, Giorgio; Palma, Pierangela; Cappiello, Achille; Vandergrift, Gregory W.; Krogh, Erik T.; Gill, Chris G.

    2016-02-01

    Polycyclic aromatic hydrocarbons (PAHs) are USEPA-regulated priority pollutants. Their low aqueous solubility requires very sensitive analytical methods for their detection, typically involving preconcentration steps. Presented is the first demonstrated 'proof of concept' use of condensed phase membrane introduction mass spectrometry (CP-MIMS) coupled with direct liquid electron ionization (DEI) for the direct, on-line measurement of PAHs in aqueous samples. DEI is very well suited for the ionization of PAHs and other nonpolar compounds, and is not significantly influenced by the co-elution of matrix components. Linear calibration data for low ppb levels of aqueous naphthalene, anthracene, and pyrene are demonstrated, with measured detection limits of 4 ppb. Analytical response times (t10%-90% signal rise) ranged from 2.8 min for naphthalene to 4.7 min for pyrene. Both intra- and interday reproducibility have been assessed in a range of matrices (waters, sea waters, and a hydrocarbon extraction production waste water sample). For these spiked, complex samples, direct PAH measurement by CP-MIMS-DEI yielded minimal signal suppression from sample matrix effects (81%-104%). We demonstrate the use of this analytical approach to directly monitor real-time changes in aqueous PAH concentrations with potential applications for continuous on-line monitoring strategies and binding/adsorption studies in heterogeneous samples.

  16. Increasing the sensitivity of NMR diffusion measurements by paramagnetic longitudinal relaxation enhancement, with application to ribosome–nascent chain complexes

    Chan, Sammy H. S.; Waudby, Christopher A.; Cassaignau, Anaïs M. E.; Cabrita, Lisa D.; Christodoulou, John, E-mail: j.christodoulou@ucl.ac.uk [University College London and Birkbeck College, Institute of Structural and Molecular Biology (United Kingdom)

    2015-10-15

    The translational diffusion of macromolecules can be examined non-invasively by stimulated echo (STE) NMR experiments to accurately determine their molecular sizes. These measurements can be important probes of intermolecular interactions and protein folding and unfolding, and are crucial in monitoring the integrity of large macromolecular assemblies such as ribosome–nascent chain complexes (RNCs). However, NMR studies of these complexes can be severely constrained by their slow tumbling, low solubility (with maximum concentrations of up to 10 μM), and short lifetimes resulting in weak signal, and therefore continuing improvements in experimental sensitivity are essential. Here we explore the use of the paramagnetic longitudinal relaxation enhancement (PLRE) agent NiDO2A on the sensitivity of 15N XSTE and SORDID heteronuclear STE experiments, which can be used to monitor the integrity of these unstable complexes. We exploit the dependence of the PLRE effect on the gyromagnetic ratio and electronic relaxation time to accelerate recovery of 1H magnetization without adversely affecting storage on Nz during diffusion delays or introducing significant transverse relaxation line broadening. By applying the longitudinal relaxation-optimized SORDID pulse sequence together with NiDO2A to 70S Escherichia coli ribosomes and RNCs, NMR diffusion sensitivity enhancements of up to 4.5-fold relative to XSTE are achieved, alongside ∼1.9-fold improvements in two-dimensional NMR sensitivity, without compromising the sample integrity. We anticipate these results will significantly advance the use of NMR to probe dynamic regions of ribosomes and other large, unstable macromolecular assemblies.

  17. Rotational study of the CH4–CO complex: Millimeter-wave measurements and ab initio calculations

    The rotational spectrum of the van der Waals complex CH4–CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 110–145 GHz. Newly observed and assigned transitions belong to the K = 2–1 subband correlating with the rotationless jCH4 = 0 ground state and the K = 2–1 and K = 0–1 subbands correlating with the jCH4 = 2 excited state of free methane. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the CH4–CO complex. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of CH4–CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations [CCSD(T)-F12a] and an augmented correlation-consistent triple zeta (aVTZ) basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the CH4 face closest to the CO subunit and binding energy De = 177.82 cm−1. The bound rovibrational levels of the CH4–CO complex were calculated for total angular momentum J = 0–6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 91.32, 94.46, and 104.21 cm−1 for A (jCH4 = 0), F (jCH4 = 1), and E (jCH4 = 2) nuclear spin modifications of CH4–CO, respectively
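
    For comparison with thermochemical data, spectroscopic energies quoted in cm-1 convert via E = h*c*nu*N_A; a quick check of the values above:

        # convert spectroscopic energies in cm^-1 to kJ/mol: E = h*c*nu*N_A
        h, c, N_A = 6.62607015e-34, 2.99792458e10, 6.02214076e23  # c in cm/s

        def wavenumber_to_kj_per_mol(nu_cm):
            return h * c * nu_cm * N_A / 1000.0

        for nu in (177.82, 91.32, 94.46, 104.21):  # De and D0 values above
            print(nu, round(wavenumber_to_kj_per_mol(nu), 3), 'kJ/mol')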

  18. Increasing the sensitivity of NMR diffusion measurements by paramagnetic longitudinal relaxation enhancement, with application to ribosome–nascent chain complexes

    The translational diffusion of macromolecules can be examined non-invasively by stimulated echo (STE) NMR experiments to accurately determine their molecular sizes. These measurements can be important probes of intermolecular interactions and protein folding and unfolding, and are crucial in monitoring the integrity of large macromolecular assemblies such as ribosome–nascent chain complexes (RNCs). However, NMR studies of these complexes can be severely constrained by their slow tumbling, low solubility (with maximum concentrations of up to 10 μM), and short lifetimes resulting in weak signal, and therefore continuing improvements in experimental sensitivity are essential. Here we explore the use of the paramagnetic longitudinal relaxation enhancement (PLRE) agent NiDO2A on the sensitivity of 15N XSTE and SORDID heteronuclear STE experiments, which can be used to monitor the integrity of these unstable complexes. We exploit the dependence of the PLRE effect on the gyromagnetic ratio and electronic relaxation time to accelerate recovery of 1H magnetization without adversely affecting storage on Nz during diffusion delays or introducing significant transverse relaxation line broadening. By applying the longitudinal relaxation-optimized SORDID pulse sequence together with NiDO2A to 70S Escherichia coli ribosomes and RNCs, NMR diffusion sensitivity enhancements of up to 4.5-fold relative to XSTE are achieved, alongside ∼1.9-fold improvements in two-dimensional NMR sensitivity, without compromising the sample integrity. We anticipate these results will significantly advance the use of NMR to probe dynamic regions of ribosomes and other large, unstable macromolecular assemblies.

  19. Analysis of spontaneous MEG activity in mild cognitive impairment and Alzheimer's disease using spectral entropies and statistical complexity measures

    Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto

    2012-06-01

    Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz–Mancini–Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
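
    The entropy and statistical complexity measures named here are straightforward to compute from a normalized power spectrum; a minimal sketch of the Shannon entropy and the LMC complexity (the Tsallis and Rényi variants follow analogously):

        import numpy as np

        def shannon_entropy(p):
            # normalized Shannon entropy of a discrete distribution
            n = len(p)
            q = p[p > 0]
            return -np.sum(q * np.log(q)) / np.log(n)

        def lmc_complexity(p):
            # LMC statistical complexity: entropy times disequilibrium,
            # the latter the Euclidean distance to the uniform distribution
            n = len(p)
            return shannon_entropy(p) * np.sum((p - 1.0 / n) ** 2)

        # normalized power spectrum of a signal as the distribution
        x = np.random.randn(1024)
        psd = np.abs(np.fft.rfft(x)) ** 2
        p = psd / psd.sum()
        print(shannon_entropy(p), lmc_complexity(p))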

  20. Fluorescent mode XAFS measurements of structure changes of new complex porous compounds in the act of annealing in vacuum

    The dehydration process in recently synthesized porous inorganic salts, formed by large cluster anions [Re6X8(CN)6]4- (X = S, Se) and transition metal cations and containing water molecules, has been investigated. The desolvation process of the complex Co(DMF)6[Mo6Br8(NCS)6], containing cluster Mo anions and cations with dimethylformamide molecules, has also been studied. Co K, Mo K and Re L3 XAFS measurements of these new complicated compounds before and after annealing in vacuum at temperatures up to 250 deg. C were performed. Changes in the electronic and spatial structure of these compounds during heating have been established, and adequate structural models of the amorphous compounds obtained are suggested and discussed.

  1. An Image Pattern Tracking Algorithm for Time-resolved Measurement of Mini- and Micro-scale Motion of Complex Object

    John M. Seiner

    2009-03-01

    An image pattern tracking algorithm is described in this paper for time-resolved measurements of mini- and micro-scale movements of complex objects. This algorithm works with a high-speed digital imaging system, which records thousands of successive image frames in a short time period. The image pattern of the observed object is tracked among successively recorded image frames with a correlation-based algorithm, so that the time histories of the position and displacement of the investigated object in the camera focus plane are determined with high accuracy. The speed, acceleration and harmonic content of the investigated motion are obtained by post-processing the position and displacement time histories. The described image pattern tracking algorithm is tested with synthetic image patterns and verified with tests on live insects.
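
    A minimal sketch of such correlation-based pattern tracking, using OpenCV's normalized cross-correlation (the sub-pixel refinement implied by the paper's accuracy claims is omitted):

        import numpy as np
        import cv2  # OpenCV

        def track_pattern(frames, template):
            # Correlation-based tracking: locate the best match of the
            # template in every frame of the high-speed recording.
            positions = []
            for frame in frames:
                res = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
                _, _, _, max_loc = cv2.minMaxLoc(res)
                positions.append(max_loc)
            return np.array(positions)

        # the displacement time history follows by differencing positions:
        # disp = np.diff(track_pattern(frames, template), axis=0)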

  2. Variability of precipitation in complex terrain and the investigation of representativeness of measurements for the Matre catchment area, Western Norway.

    Skjerdal, M.; Reuder, J.; Villanger, F.

    2009-04-01

    Orography strongly affects precipitation. Especially over complex terrain, precipitation fields can show high spatial variability even over very small scales. Along the western coast of Norway, with its large precipitation amounts of up to 3000 mm per year and above, an improved understanding of spatial precipitation patterns has a large socio-economic impact, as it can improve both the prediction of floods and landslides and the water management for hydro power plants. The producers of hydroelectric power want the water resources to be utilized in the best suited way; control and supervision of the water resources are therefore of the utmost economic importance. To get an overview of the water resource situation, it is essential to know the spatial and temporal distribution of precipitation. In cooperation with the Norwegian power company BKK, 20 HOBO rain gauges and two Aanderaa weather stations have been deployed between 22 and 898 meters above sea level in the catchment area of the Matre water system in Western Norway in the period May - October 2009. The main purpose of the project is to investigate the horizontal variability and the altitude dependence of precipitation in complex terrain under different synoptic conditions in this catchment area. Moreover, the representativeness of a few single point measurements for the total precipitation amount of the whole catchment area has been addressed. The total amount of precipitation recorded by the 20 rain gauges during the deployment period ranges between 535 mm and 1190 mm, which indicates the large variability within the catchment area. Analysis of the data with respect to wind direction shows that 75 % of the total precipitation amount during the measurement period arrives when the wind direction is S - SW. During a high precipitation event, which will be investigated in more detail, amounts of precipitation between 58 mm - 121 mm within a 24-hour period have been observed during a

  3. Measuring complexity with zippers

    Baronchelli, Andrea; Caglioti, Emanuele; Loreto, Vittorio

    2005-01-01

    Physics concepts have often been borrowed and independently developed by other fields of science. From this perspective, a significant example is that of entropy in Information Theory. The aim of this paper is to provide a short and pedagogical introduction to the use of data compression techniques for the estimation of entropy and other relevant quantities in Information Theory and Algorithmic Information Theory. We consider in particular the LZ77 algorithm as a case study and discuss how a zipper can be used for information extraction.
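
    The central idea can be demonstrated in a few lines: the length of the zipped sequence, in bits per symbol, upper-bounds the entropy rate of the source. A sketch using zlib, an LZ77-family compressor:

        import os
        import zlib

        def entropy_estimate(data: bytes) -> float:
            # bits per byte of the zlib-compressed (LZ77-family) sequence:
            # an upper bound on the entropy rate of the source
            return 8.0 * len(zlib.compress(data, 9)) / len(data)

        print(entropy_estimate(b'ab' * 5000))        # highly regular: low
        print(entropy_estimate(os.urandom(10000)))   # random: close to 8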

  4. Development of cortical shape in the human brain from 6 to 24 months of age via a novel measure of shape complexity.

    Kim, Sun Hyung; Lyu, Ilwoo; Fonov, Vladimir S; Vachet, Clement; Hazlett, Heather C; Smith, Rachel G; Piven, Joseph; Dager, Stephen R; Mckinstry, Robert C; Pruett, John R; Evans, Alan C; Collins, D Louis; Botteron, Kelly N; Schultz, Robert T; Gerig, Guido; Styner, Martin A

    2016-07-15

    The quantification of local surface morphology in the human cortex is important for examining population differences as well as developmental changes in neurodegenerative or neurodevelopmental disorders. We propose a novel cortical shape measure, referred to as the 'shape complexity index' (SCI), that represents localized shape complexity as the difference between the observed distribution of local surface topology, as quantified by the shape index (SI) measure, and its best-fitting simple topological model within a given neighborhood. We apply a relatively small, adaptive geodesic kernel to calculate the SCI. Due to the small size of the kernel, the proposed SCI measure captures fine differences of cortical shape. With this novel cortical feature, we aim to capture comparatively small local surface changes that reflect a) the widening versus deepening of sulcal and gyral regions, as well as b) the emergence and development of secondary and tertiary sulci. Current cortical shape measures, such as the gyrification index (GI) or intrinsic curvature measures, investigate the cortical surface at a different scale and are less well suited to capture these particular cortical surface changes. In our experiments, the proposed SCI demonstrates higher complexity in the gyral/sulcal wall regions, lower complexity in wider gyral ridges and lowest complexity in wider sulcal fundus regions. In early postnatal brain development, our experiments show that SCI reveals a pattern of increased cortical shape complexity with age, as well as sexual dimorphisms in the insula, middle cingulate, parieto-occipital sulcal and Broca's regions. Overall, sex differences were greatest at 6 months of age and were reduced at 24 months, with the difference pattern switching from higher complexity in males at 6 months to higher complexity in females at 24 months. This is the first study of longitudinal, cortical complexity maturation and sex differences, in the early postnatal period from 6 to 24 months

  5. Assessing dry density and gravimetric water content of soils in geotechnics with complex conductivity measurements: preliminary investigations

    Kaouane, C.; Beck, Y.; Fauchard, C.; Chouteau, M.

    2012-12-01

    We first evaluated the saturation factor n = 1.35 by fitting a power law ρ/ρw = a*Sr^n + b. The value a = 0.223 agreed with φ^(-n) = F, F being the formation factor. This leads to a mean tortuosity α = 1.47. The value b = 0.5 might be related to surface conductivity. An empirical Rhoades-Corwin model also fitted the data well. The Revil & Florsch model allows us to predict a phase peak in complex conductivity measurements. We predicted a frequency peak at 2.4 Hz. This peak is well located in the frequency range of SIP (from 1 mHz to ~10 Hz). At the frequency peak, this model allows the direct evaluation of saturation and porosity. Hence, complex conductivity measurements might be a fine alternative to nuclear probes. Still, driving electrodes into compacted soils remains difficult. Ongoing studies are looking to extend this model to a higher frequency range (5-200 kHz), where capacitively coupled resistivity arrays might be used, allowing continuous measurements.
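
    A sketch of that power-law fit, with hypothetical data points generated to be consistent with the reported values a = 0.223, n = 1.35 and b = 0.5:

        import numpy as np
        from scipy.optimize import curve_fit

        # hypothetical points consistent with the reported fit
        Sr = np.array([0.30, 0.45, 0.60, 0.70, 0.80])
        y = np.array([0.544, 0.576, 0.612, 0.638, 0.665])

        def saturation_law(Sr, a, n, b):
            return a * Sr ** n + b

        (a, n, b), _ = curve_fit(saturation_law, Sr, y, p0=(0.2, 1.0, 0.5))
        print(a, n, b)  # recovers approximately 0.223, 1.35, 0.5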

  6. Measurements of the Intensity and Polarization of the Anomalous Microwave Emission in the Perseus molecular complex with QUIJOTE

    Génova-Santos, R; Rebolo, R; Peláez-Santos, A; López-Caraballo, C H; Harper, S; Watson, R A; Ashdown, M; Barreiro, R B; Casaponsa, B; Dickinson, C; Diego, J M; Fernández-Cobos, R; Grainge, K J B; Herranz, D; Hoyland, R; Lasenby, A; López-Caniego, M; Martínez-González, E; McCulloch, M; Melhuish, S; Piccirillo, L; Perrott, Y C; Poidevin, F; Razavi-Ghods, N; Scott, P F; Titterington, D; Tramonte, D; Vielva, P; Vignaga, R

    2015-01-01

    Anomalous microwave emission (AME) has been observed in numerous sky regions, in the frequency range ~10-60 GHz. One of the most scrutinized regions is G159.6-18.5, located within the Perseus molecular complex. In this paper we present further observations of this region (194 hours in total over ~250 deg^2), both in intensity and in polarization. They span four frequency channels between 10 and 20 GHz, and were gathered with QUIJOTE, a new CMB experiment with the goal of measuring the polarization of the CMB and Galactic foregrounds. When combined with other publicly-available intensity data, we achieve the most precise spectrum of the AME measured to date, with 13 independent data points being dominated by this emission. The four QUIJOTE data points provide the first independent confirmation of the downturn of the AME spectrum at low frequencies, initially unveiled by the COSMOSOMAS experiment in this region. We accomplish an accurate fit of these data using models based on electric dipole emission from spin...

  7. Two-layered disc quasi-optical dielectric resonators: electrodynamics and application perspectives for complex permittivity measurements of lossy liquids

    Barannik, A. A.; Cherpak, N. T.; Prokopenko, Yu V.; Filipov, Yu F.; Shaforost, E. N.; Shipilova, I. A.

    2007-07-01

    Electromagnetic properties of novel quasi-optical resonators are studied theoretically and experimentally. The resonator is a radially two-layered dielectric disc sandwiched between conducting endplates. The internal layer can be filled with air or a lossy liquid. Whispering gallery modes are excited in such a resonator and the mode energy is concentrated near the inner side of the cylindrical surface of the external layer. The measurement data obtained in the Ka-band are compared with theoretical calculations of the eigenfrequencies and quality factors of a Teflon resonator filled with water, ethyl alcohol, benzene and aqueous solutions of ethyl alcohol. A number of 'anomalous' properties of the resonator can be described using Maxwell's equations. The experimental data on the complex permittivity of a binary water-ethyl alcohol mixture are compared with the values calculated in terms of Debye's function. An important feature of the proposed technique is that it holds promise for making first-principles microwave measurements of the permittivity of lossy liquids.
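
    Debye's function referred to here is the single-relaxation model for the complex permittivity of a polar liquid; a sketch with typical literature parameters for water (assumed values, not taken from the paper):

        import numpy as np

        def debye_permittivity(freq_hz, eps_s, eps_inf, tau):
            # single-relaxation Debye model:
            # eps(w) = eps_inf + (eps_s - eps_inf) / (1 + i*w*tau)
            w = 2 * np.pi * np.asarray(freq_hz)
            return eps_inf + (eps_s - eps_inf) / (1 + 1j * w * tau)

        # typical values for water near 25 C (not from the paper),
        # evaluated in the Ka-band used in the experiment
        f = np.array([26e9, 33e9, 40e9])
        print(debye_permittivity(f, 78.4, 5.2, 8.3e-12))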

  8. Building a measurement framework of burden of treatment in complex patients with chronic conditions: a qualitative study

    Eton DT

    2012-08-01

    David T Eton,1 Djenane Ramalho de Oliveira,2,3 Jason S Egginton,1 Jennifer L Ridgeway,1 Laura Odell,4 Carl R May,5 Victor M Montori1,6 1Division of Health Care Policy and Research, Department of Health Sciences Research, Mayo Clinic, Rochester, MN, USA; 2College of Pharmacy, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil; 3Medication Therapy Management Program, Fairview Pharmacy Services LLC, Minneapolis, MN, USA; 4Pharmacy Services, Mayo Clinic, Rochester, MN, USA; 5Faculty of Health Sciences, University of Southampton, Southampton, UK; 6Knowledge and Evaluation Research Unit, Mayo Clinic, Rochester, MN, USA. Background: Burden of treatment refers to the workload of health care as well as its impact on patient functioning and well-being. We set out to build a conceptual framework of issues descriptive of burden of treatment from the perspective of the complex patient, as a first step in the development of a new patient-reported measure. Methods: We conducted semistructured interviews with patients seeking medication therapy management services at a large, academic medical center. All patients had a complex regimen of self-care (including polypharmacy), and were coping with one or more chronic health conditions. We used framework analysis to identify and code themes and subthemes. A conceptual framework of burden of treatment was outlined from emergent themes and subthemes. Results: Thirty-two patients (20 female, 12 male, age 26-85 years) were interviewed. Three broad themes of burden of treatment emerged, including: the work patients must do to care for their health; problem-focused strategies and tools to facilitate the work of self-care; and factors that exacerbate the burden felt. The latter theme encompasses six subthemes, including challenges with taking medication, emotional problems with others, role and activity limitations, financial challenges, confusion about medical information, and health care delivery obstacles

  9. Rotational study of the NH3–CO complex: Millimeter-wave measurements and ab initio calculations

    The rotational spectrum of the van der Waals complex NH3–CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 112–139 GHz. Newly observed and assigned transitions belong to the K = 0–0, K = 1–1, K = 1–0, and K = 2–1 subbands correlating with the rotationless (jk)NH3 = 00 ground state of free ortho-NH3 and the K = 0–1 and K = 2–1 subbands correlating with the (jk)NH3 = 11 ground state of free para-NH3. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. Some of these transitions are continuations to higher J values of transition series observed previously [C. Xia et al., Mol. Phys. 99, 643 (2001)]; the other transitions constitute newly detected subbands. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the ortho-NH3–CO and para-NH3–CO complexes. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of NH3–CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations and an augmented correlation-consistent triple zeta basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the N atom closest to the CO subunit and binding energy De = 359.21 cm−1. The bound rovibrational levels of the NH3–CO complex were calculated for total angular momentum J = 0–6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 210.43 and 218.66 cm−1 for ortho-NH3–CO and para-NH3–CO, respectively

  10. Investigating Project Measurement Complexity from TO Perspectives%基于TO视角的项目复杂性测度研究

    何清华; 罗岚; 陆云波; 任俊山

    2013-01-01

    Based on an analysis of the traditional factors affecting project complexity, a TO conceptual model of the microcosmic factors influencing project complexity is discussed from the perspectives of objective tasks and subjective organization; a measurement method for project complexity, expressed as implicit workload, is established on the basis of ProjectSim; a model of the Expo AB-area project is then built to test the assumptions of the TO measurement method, confirming that the implicit-workload-based measure of project complexity is correct and effective. This study enriches and develops the theory of complex project management and offers important theoretical guidance for managing large complex projects. Projects have been growing in quantity, size, and complexity. Managing project complexity has become an important part of project management. However, traditional methods often measure project complexity from macro-perspectives, but largely ignore the potential influence of microcosmic factors on project complexity. Therefore, from the task and organizational (TO) perspective this paper explores a reasonable measurement model which can reflect the dynamic "emerging" effect of micro factors on project complexity. Based on the analysis of traditional factors affecting project complexity, the paper discusses microcosmic factors of project complexity from the perspectives of objective tasks and subjective organization, and establishes a method to measure project complexity expressed by implicit workload based on the tool ProjectSim. This tool effectively measures project complexity from the perspective of implicit workload. Project complexity is equal to implicit workload / dominant workload. Implicit workload, or ProjectSim(T, O), is equal to reworking workload + coordinating workload + waiting workload. According to the synchronous relationship of the implicit workload and the project complexity, the paper combines the measurement method TO with the micro factors of project complexity based on the implicit workload. We also propose hypothesized relationships among task complexity, organization structure, organization members and project complexity
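
    The stated relationship is a simple ratio; a toy calculation with hypothetical person-day figures:

        def project_complexity(rework, coordination, waiting, dominant):
            # complexity = implicit workload / dominant workload, with the
            # implicit workload the sum of reworking, coordinating and
            # waiting workloads (units: arbitrary effort, e.g. person-days)
            return (rework + coordination + waiting) / dominant

        # hypothetical simulation output, in person-days
        print(project_complexity(rework=120, coordination=80, waiting=40,
                                 dominant=1000))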

  11. Charge carrier effective mass and concentration derived from combination of Seebeck coefficient and 125Te NMR measurements in complex tellurides

    Levin, E. M.

    2016-06-01

    Thermoelectric materials utilize the Seebeck effect to convert heat to electrical energy. The Seebeck coefficient (thermopower), S, depends on the free (mobile) carrier concentration, n, and effective mass, m*, as S ~ m*/n^(2/3). The carrier concentration in tellurides can be derived from 125Te nuclear magnetic resonance (NMR) spin-lattice relaxation measurements. The NMR spin-lattice relaxation rate, 1/T1, depends on both n and m* as 1/T1 ~ (m*)^(3/2)·n (within classical Maxwell-Boltzmann statistics) or as 1/T1 ~ (m*)^2·n^(2/3) (within quantum Fermi-Dirac statistics), which challenges the correct determination of the carrier concentration in some materials by NMR. Here it is shown that the combination of the Seebeck coefficient and 125Te NMR spin-lattice relaxation measurements in complex tellurides provides a unique opportunity to derive the carrier effective mass and then to calculate the carrier concentration. This approach was used to study AgxSbxGe50-2xTe50, well-known GeTe-based high-efficiency tellurium-antimony-germanium-silver thermoelectric materials, where the replacement of Ge by [Ag+Sb] results in significant enhancement of the Seebeck coefficient. Values of both m* and n derived using this combination show that the enhancement of thermopower can be attributed primarily to an increase of the carrier effective mass and partially to a decrease of the carrier concentration when the [Ag+Sb] content increases.
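
    Because S and 1/T1 scale differently with m* and n, the two measurements can be combined to separate the two quantities; a sketch in relative units (the material-specific prefactors are set to 1, so only trends are meaningful):

        import numpy as np

        def effective_mass_and_density(S, R1, c1=1.0, c2=1.0):
            # Degenerate (Fermi-Dirac) case quoted above:
            #   S ~ c1 * m_eff / n^(2/3), R1 = 1/T1 ~ c2 * m_eff^2 * n^(2/3),
            # so the product S*R1 depends on m_eff alone. c1 and c2 are
            # material/unit constants (set to 1 here: relative units only).
            m_eff = (S * R1 / (c1 * c2)) ** (1.0 / 3.0)
            n = (c1 * m_eff / S) ** 1.5
            return m_eff, n

        # doubling S at fixed R1 implies a larger m_eff and a smaller n
        print(effective_mass_and_density(1.0, 1.0))
        print(effective_mass_and_density(2.0, 1.0))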

  12. Measurement issues associated with quantitative molecular biology analysis of complex food matrices for the detection of food fraud.

    Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy

    2016-01-01

    Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays. PMID:26631264

  13. Embedded Measures of Performance Validity in the Rey Complex Figure Test in a Clinical Sample of Veterans.

    Sugarman, Michael A; Holcomb, Erin M; Axelrod, Bradley N; Meyers, John E; Liethen, Philip C

    2016-01-01

    The purpose of this study was to determine how well scores from the Rey Complex Figure Test (RCFT) could serve as embedded measures of performance validity in a large, heterogeneous clinical sample at an urban-based Veterans' Affairs hospital. Participants were divided into credible performance (n = 244) and noncredible performance (n = 87) groups based on common performance validity tests during their respective clinical evaluations. We evaluated how well preselected RCFT scores could discriminate between the 2 groups using cut scores from single indexes as well as multivariate logistic regression prediction models. Additionally, we evaluated how well memory error patterns (MEPs) could discriminate between the 2 groups. Optimal discrimination occurred when indexes from the Copy and Recognition trials were simultaneous predictors in logistic regression models, with 91% specificity and at least 53% sensitivity. Logistic regression yielded superior discrimination compared with individual indexes and compared with the use of MEPs. Specific scores on the RCFT, including the Copy and Recognition trials, can serve as adequate indexes of performance validity, when using both cut scores and logistic regression prediction models. We provide logistic regression equations that can be applied in similar clinical settings to assist in determining performance validity. PMID:26384155
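
    A sketch of the approach, combining Copy and Recognition scores in a logistic model and fixing specificity near the reported 91% (the data below are simulated, not the study's):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # simulated Copy and Recognition scores (not the study's data)
        rng = np.random.default_rng(0)
        X_cred = rng.normal([32.0, 20.0], [3.0, 2.0], size=(200, 2))
        X_noncred = rng.normal([26.0, 14.0], [4.0, 3.0], size=(80, 2))
        X = np.vstack([X_cred, X_noncred])
        y = np.r_[np.zeros(200), np.ones(80)]  # 1 = noncredible

        clf = LogisticRegression().fit(X, y)

        # fix specificity near 0.91 by choosing the probability cutoff on
        # the credible group, then read off the resulting sensitivity
        p = clf.predict_proba(X)[:, 1]
        cutoff = np.quantile(p[y == 0], 0.91)
        print('sensitivity:', np.mean(p[y == 1] > cutoff))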

  14. Retrieval of aerosol complex refractive index from a synergy between lidar, sun photometer and in situ measurements during LISAIR experiment

    Particulate pollutant exchanges between the streets and the Planetary Boundary Layer (PBL), and their daily evolution linked to human activity, were studied in the framework of the Lidar pour la Surveillance de l'AIR (LISAIR) experiment. This program lasted from 10 to 30 May 2005. A synergetic approach combining dedicated active (lidar) and passive (sun photometer) remote sensors as well as ground-based in situ instrumentation (nephelometer, aethalometer and particle sizers) was used to investigate urban aerosol optical properties within Paris. Aerosol complex refractive indices were assessed to be 1.56 - 0.034i at 355 nm and 1.59 - 0.040i at 532 nm, thus leading to single-scattering albedo values between 0.80 and 0.88. These retrievals are consistent with soot components in the aerosol arising from traffic exhausts, indicating that these pollutants have a radiative impact on climate. We also discuss the influence of relative humidity on aerosol properties. A good agreement was found between the vertical extinction profile derived from the lidar backscattering signal and that retrieved from the coupling between radio sounding and ground in situ measurements. (authors)

  15. Aerosol Disinfection Capacity of Slightly Acidic Hypochlorous Acid Water Towards Newcastle Disease Virus in the Air: An In Vivo Experiment.

    Hakim, Hakimullah; Thammakarn, Chanathip; Suguro, Atsushi; Ishida, Yuki; Nakajima, Katsuhiro; Kitazawa, Minori; Takehara, Kazuaki

    2015-12-01

    The existence of bioaerosol contaminants in farms and outbreaks of some infectious organisms with the ability of transmission by air increase the need for enhanced biosecurity, especially for the application of aerosol disinfectants. Here we selected slightly acidic hypochlorous acid water (SAHW) as a candidate and evaluated its virucidal efficacy toward a virus in the air. Three-day-old conventional chicks were challenged with 25 doses of Newcastle disease live vaccine (B1 strain) by spray with a nebulizer (particle size <3 μm in diameter), while at the same time reverse osmosis water, as the control, or SAHW containing 50 or 100 parts per million (ppm) free available chlorine at pH 6 was sprayed on the treated chicks with other nebulizers. Exposed chicks were kept in separated cages in an isolator and observed for clinical signs. Oropharyngeal swab samples were collected from 2 to 5 days postexposure from each chick, and the samples were then titrated with primary chicken kidney cells to detect the virus. Cytopathic effects were observed, and a hemagglutination test was performed to confirm the result at 5 days postinoculation. Clinical signs (sneezing) were recorded, and the virus was isolated from the control and 50 ppm treatment groups, while no clinical signs were observed in, and no virus was isolated from, the 100 ppm treatment group. The virulent Newcastle disease virus (NDV) strain Sato, too, was immediately inactivated by SAHW containing 50 ppm chlorine in the aqueous phase. These data suggest that SAHW containing 100 ppm chlorine can be used for aerosol disinfection of NDV in farms. PMID:26629621

  16. The improvement of the energy resolution in epi-thermal neutron region of Bonner sphere using boric acid water solution moderator

    The Bonner sphere is useful for evaluating the neutron spectrum in detail. We are improving the energy resolution in the epi-thermal neutron region of the Bonner sphere, using boric acid water solution as the moderator. Its response function peak is narrower than that for a polyethylene moderator, and an improvement of the resolution is expected. The resolutions for the polyethylene moderator and the boric acid water solution moderator were compared by simulation calculation. The influence of uncertainty in the Bonner sphere configuration on spectrum estimation was also simulated. - Highlights: • Boric acid solution is useful to improve the energy resolution of the Bonner sphere. • Uncertainty of the device configuration is critical for neutron spectrometry. • It is important to reduce and evaluate the uncertainty

  17. Deciphering Jupiter's complex flow dynamics using the upcoming Juno gravity measurements and an adjoint based dynamical model

    Galanti, Eli; Kaspi, Yohai

    2015-11-01

    The nature of the large-scale flow on Jupiter below the cloud level is still unknown. The observed surface wind might be confined to the upper layers, or be a manifestation of deep cylindrical flow. Moreover, it is possible that in the case where the observed wind is superficial, there exists a deep flow that is completely separated from the surface. To date, all models linking the wind (via the induced density anomalies) to the gravity field to be measured by Juno consider only wind flow related to the observed cloud-level wind. Some assume full cylindrical flow while others allow the wind to decay with depth. Here we explore the possibility of complex wind dynamics that include both the upper-layer wind and a deep flow that is completely detached from the flow above it. The surface flow is based on the observed cloud-level flow and is set to decay with depth. The deep flow is constructed synthetically to produce cylindrical structures with variable width and magnitude, thus allowing for a wide range of possible setups of the unknown deep flow. This flow is also set to decay when approaching the surface flow, in coordination with the exponential decay rate. The combined 3D flow is then related to the density anomalies via a dynamical model, taking into account oblateness effects as well, and the resulting density field is then used to calculate the gravitational moments. An adjoint inverse model is constructed for the dynamical model, thus allowing backward integration of the dynamical model, from the expected observations of the gravity moments to the parameters controlling the setup of the deep and surface flows. We show that the model can be used for examination of various scenarios, including cases in which the deep flow is dominating over the surface wind. The novelty of our adjoint-based inversion approach is in the ability to identify complex dynamics including deep cylindrical flows that have no manifestation in the observed cloud-level wind. Furthermore

  18. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. Subsequently, as the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially if human operators have to carry out their tasks under a very stressful environment. That is, good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating in detail the tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that can properly evaluate the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity) that can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs has been developed. In this report, the technical background as well as practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since the validation studies show that the TACOM measure seems to properly quantify the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators

  19. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    Park, Jin Kyun; Jung, Won Dea

    2006-11-15

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. Subsequently, as the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially if human operators have to carry out their tasks under a very stressful environment. That is, good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating in detail the tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that can properly evaluate the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity) that can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs has been developed. In this report, the technical background as well as practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since the validation studies show that the TACOM measure seems to properly quantify the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators.

  20. Pseudo-stokes vector from complex signal representation of a speckle pattern and its applications to micro-displacement measurement

    Wang, W.; Ishijima, R.; Matsuda, A.;

    2010-01-01

    the intensity speckle pattern, which converts the original real-valued signal into a complex signal. In closest analogy to the polarisation of a vector wave, the Stokes-like vector constructed from the spatial derivative of the generated complex signal has been applied for correlation. Experimental...
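
    A hedged sketch of the pipeline suggested by the title and abstract: build a complex signal from the intensity speckle pattern, form a Stokes-like vector from its spatial derivatives, and correlate between frames to estimate displacement (the exact analytic-signal construction and vector correlation used by the authors may differ):

        import numpy as np
        from scipy.signal import fftconvolve, hilbert

        def pseudo_stokes(intensity):
            # complex signal via a row-wise Hilbert transform (the authors'
            # 2-D analytic-signal construction may differ), then a
            # Stokes-like vector from its spatial derivatives
            psi = hilbert(intensity, axis=1)
            gy, gx = np.gradient(psi)
            s0 = np.abs(gx) ** 2 + np.abs(gy) ** 2
            s1 = np.abs(gx) ** 2 - np.abs(gy) ** 2
            s2 = 2.0 * np.real(gx * np.conj(gy))
            s3 = 2.0 * np.imag(gx * np.conj(gy))
            return np.stack([s0, s1, s2, s3])

        def displacement(frame_a, frame_b):
            # simplified scalar correlation of the summed components;
            # the paper correlates the full Stokes-like vectors
            a = pseudo_stokes(frame_a).sum(axis=0)
            b = pseudo_stokes(frame_b).sum(axis=0)
            corr = fftconvolve(a - a.mean(), (b - b.mean())[::-1, ::-1],
                               mode='same')
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            return np.array(peak) - np.array(corr.shape) // 2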

  1. Water Accounting Plus (WA+) – a water accounting procedure for complex river basins based on satellite measurements

    P. Karimi

    2013-07-01

    Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets, including (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external (e.g., climate change) and internal (e.g., infrastructure building) influences can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of the required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  2. Water Accounting Plus (WA+) – a water accounting procedure for complex river basins based on satellite measurements

    D. Molden

    2012-11-01

    Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use, and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper we introduce Water Accounting Plus (WA+), a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets: (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external influences (e.g. climate change) and internal influences (e.g. infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for 3 out of the 4 sheets, but are not a precondition for implementing the WA+ framework; data from hydrological models and water allocation models can also be used as inputs to WA+.

  3. Formation of κ-carrageenan-gelatin polyelectrolyte complexes studied by (1)H NMR, UV spectroscopy and kinematic viscosity measurements.

    Voron'ko, Nicolay G; Derkach, Svetlana R; Vovk, Mikhail A; Tolstoy, Peter M

    2016-10-20

    The intermolecular interactions between an anionic polysaccharide from the red algae κ-carrageenan and a gelatin polypeptide, forming stoichiometric polysaccharide-polypeptide (bio)polyelectrolyte complexes in the aqueous phase, were examined. The major method of investigation was high-resolution (1)H NMR spectroscopy. Additional data were obtained by UV absorption spectroscopy, light scattering dispersion and capillary viscometry. Experimental data were interpreted in terms of the changing roles of electrostatic interactions, hydrophobic interactions and hydrogen bonds when κ-carrageenan-gelatin complexes are formed. At high temperatures, when biopolymer macromolecules in solution are in the state of random coil, hydrophobic interactions make a major contribution to complex stabilization. At the temperature of gelatin's coil→helix conformational transition and at lower temperatures, electrostatic interactions and hydrogen bonds play a defining role in complex formation. A proposed model of the κ-carrageenan-gelatin complex is discussed. PMID:27474666

  4. The complexities of measuring access to parks and physical activity sites in New York City: a quantitative and qualitative approach

    Sohler Nancy L

    2009-06-01

    Background: Proximity to parks and physical activity sites has been linked to an increase in active behaviors, and positive impacts on health outcomes such as lower rates of cardiovascular disease, diabetes, and obesity. Since populations with a low socio-economic status as well as racial and ethnic minorities tend to experience worse health outcomes in the USA, access to parks and physical activity sites may be an environmental justice issue. Geographic Information Systems were used to conduct quantitative and qualitative analyses of park accessibility in New York City, which included kernel density estimation, ordinary least squares (global) regression, geographically weighted (local) regression, and longitudinal case studies consisting of field work and archival research. Accessibility was measured by both density of park acreage and density of physical activity sites. Independent variables included percent non-Hispanic black, percent Hispanic, percent below poverty, percent of adults without a high school diploma, percent with limited English-speaking ability, and population density. Results: The ordinary least squares linear regression found weak relationships in both the park acreage density and the physical activity site density models (adjusted R² = 0.11 and 0.23, respectively; AIC = 7162 and 3529, respectively). Geographically weighted regression, however, suggested spatial non-stationarity in both models, indicating disparities in accessibility that vary over space with respect to magnitude and directionality of the relationships (AIC = 2014 and -1241, respectively). The qualitative analysis supported the findings of the local regression, confirming that although there is a geographically inequitable distribution of park space and physical activity sites, it is not globally predicted by race, ethnicity, or socio-economic status. Conclusion: The combination of quantitative and qualitative analyses demonstrated the complexity of the issues around
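
    The analytic pipeline named above (a global OLS fit whose AIC is then compared against a geographically weighted regression) can be sketched as follows. This is a minimal illustration on synthetic data using statsmodels for the OLS step; a GWR fit would come from a dedicated package such as mgwr, and the variable names are placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
# Placeholder predictors: e.g., % below poverty, % Hispanic, population density
X = rng.random((n, 3))
# Synthetic accessibility outcome (a park-acreage density proxy)
y = 0.5 * X[:, 2] + rng.normal(scale=0.3, size=n)

ols = sm.OLS(y, sm.add_constant(X)).fit()
print(f"adjusted R^2 = {ols.rsquared_adj:.2f}, AIC = {ols.aic:.0f}")
# A substantially lower AIC for a GWR fit of the same data would suggest
# spatial non-stationarity: relationships whose magnitude and direction
# vary over space, as reported in the record.
```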

  5. Temperature effect on sorption of cations onto clay minerals: complexation modeling and experimental measurements up to 150 deg. C

    Tertre, E. [LMTG, UMR UPS-CNRS-IRD 5563, 14 av. E. Belin, 31400 Toulouse (France)]|[ANDRA, Parc de la Croix Blanche - 1/7 rue Jean Monnet, 92298 Chatenay-Malabry (France)]|[EDF R and D, 77818 Moret sur Loing (France); Berger, G.; Castet, S.; Loubet, M. [LMTG, UMR UPS-CNRS-IRD 5563, 14 av. E. Belin, 31400 Toulouse (France); Giffaut, E. [ANDRA, Parc de la Croix Blanche - 1/7 rue Jean Monnet, 92298 Chatenay-Malabry (France); Simoni, E. [Universite Paris XI, Institut de Physique Nucleaire, Groupe de Radiochimie, Bat. 100, 91406 Orsay (France); Catalette, H. [EDF R and D, 77818 Moret sur Loing (France)

    2005-07-01

    clay minerals is not temperature dependent whereas the surface charges increase weakly when temperature rises from 25 to 60 deg. C [2]. A surface complexation model (DLM) integrating the temperature parameter was developed to explain our sorption data. This model takes into account the site densities and their associated pKa values obtained by our surface acid/base model [2]. [1] Experimental sorption of Ni²⁺, Cs⁺ and Ln³⁺ onto a montmorillonite up to 150 deg. C. E. Tertre, G. Berger, S. Castet, M. Loubet and E. Giffaut (submitted). [2] Acid/base surface chemistry of kaolinite and montmorillonite at 25 and 60 deg. C. Experimental measurements and modeling by CHESS®. E. Tertre, S. Castet, G. Berger, M. Loubet and E. Giffaut (in preparation). (authors)

  6. Development of X-ray Computed Tomography (CT) Imaging Method for the Measurement of Complex 3D Ice Shapes Project

    National Aeronautics and Space Administration — When ice accretes on a wing or other aerodynamic surface, it can produce extremely complex shapes. These are comprised of well-known shapes such as horns and...

  7. The importance and complexity of regret in the measurement of 'good' decisions: a systematic review and a content analysis of existing assessment instruments

    Joseph-Williams, N.; Edwards, A.; Elwyn, G.

    2011-01-01

    BACKGROUND OR CONTEXT: Regret is a common consequence of decisions, including those decisions related to individuals' health. Several assessment instruments have been developed that attempt to measure decision regret. However, recent research has highlighted the complexity of regret. Given its relev

  8. Complexity Plots

    Thiyagalingam, Jeyarajan

    2013-06-01

    In this paper, we present a novel visualization technique for assisting the observation and analysis of algorithmic complexity. In comparison with conventional line graphs, this new technique is not sensitive to the units of measurement, allowing multivariate data series of different physical quantities (e.g., time, space and energy) to be juxtaposed together conveniently and consistently. It supports multivariate visualization as well as uncertainty visualization. It enables users to focus on algorithm categorization by complexity classes, while reducing the visual impact caused by constants and algorithmic components that are insignificant to complexity analysis. It provides an effective means for observing the algorithmic complexity of programs with a mixture of algorithms and black-box software through visualization. Through two case studies, we demonstrate the effectiveness of complexity plots in complexity analysis in research, education and application.

  9. Chemical and spectroscopic characterizations, ESI-QTOF mass spectrometric measurements and DFT studies of new complexes of palladium(II) with tryptamine and mefenamic acid

    Carvalho, Marcos A.; Arruda, Eduardo G. R.; Profirio, Daniel M.; Gomes, Alexandre F.; Gozzo, Fábio C.; Formiga, André L. B.; Corbi, Pedro P.

    2015-11-01

    New palladium(II) complexes with tryptamine (Pd-tra) and mefenamic acid (Pd-mef) were prepared and characterized by chemical and spectroscopic methods. Elemental, ESI-QTOF mass spectrometric and thermogravimetric analyses of the compounds confirm the composition [PdCl2(tra)2] for Pd-tra and [Pd(mef)2(bipy)] for Pd-mef. Infrared data indicate the coordination of tryptamine to Pd(II) by the nitrogen atom of the amino group, while for mefenamic acid coordination occurs by the oxygen atom of the carboxylate group in a monodentate form. The 1H, 13C and {15N,1H} NMR spectroscopic data confirm the nitrogen coordination of the NH2 group of tryptamine to Pd(II) in the Pd-tra complex and also the oxygen coordination of the carboxylate group of mefenamic acid to Pd(II) in the Pd-mef complex. Density functional theory (DFT) studies were applied to determine the difference in energy between the geometric isomers (cis/trans) of Pd-tra and to optimize the structure of the Pd-mef complex. Raman spectroscopic measurements reinforce the nitrogen coordination of tryptamine to Pd(II) in the Pd-tra complex and confirm the presence of the cis-[PdCl2(tra)2] isomer in the solid state. The complexes are insoluble in water.

  10. The measurement and model construction of complex permittivity of corn leaves at the main frequency points of L/S/C/X-band

    The complex permittivity of a target has a crucial influence on its microwave radiation characteristics. In quantitative microwave remote sensing, studying the dielectric properties of vegetation to establish the relationship between its physical parameters and complex permittivity is fundamental work in this field. In this study, corn leaf samples of different types and heights were collected at the city of Zhangye, which is the key study area of the Heihe watershed allied telemetry experimental research and also the largest breeding base of hybrid corn seeds in China. The vector network analyzer E8362B was then used to measure the complex permittivity of these samples from 0.2 to 20 GHz by the coaxial probe technique. Based on these measurements, an empirical model of corn leaves was established that describes the relationship between gravimetric moisture and both the real and imaginary parts of the complex permittivity at the main frequency points of the L/S/C/X-bands. Finally, the empirical model and the classical Debye-Cole model were compared and validated against measured data collected from Huailai county in Hebei province. The results show that the empirical model has higher accuracy and is more practical than the traditional Debye-Cole model.
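
    The kind of empirical model described (permittivity as a function of gravimetric moisture at a fixed frequency point) can be sketched as a simple polynomial regression. The sample values and the quadratic form below are illustrative assumptions, not the paper's fitted coefficients.

```python
# Regress the real and imaginary parts of leaf permittivity on gravimetric
# moisture m_g at one frequency point (nominally L-band). Data are hypothetical.
import numpy as np

m_g = np.array([0.35, 0.45, 0.55, 0.65, 0.75])       # gravimetric moisture
eps_real = np.array([8.1, 12.3, 17.0, 22.4, 28.5])   # measured eps' (assumed)
eps_imag = np.array([2.0, 3.4, 5.1, 7.2, 9.6])       # measured eps'' (assumed)

coef_real = np.polyfit(m_g, eps_real, deg=2)
coef_imag = np.polyfit(m_g, eps_imag, deg=2)

def eps_model(m):
    """Empirical complex permittivity as a function of moisture."""
    return np.polyval(coef_real, m) + 1j * np.polyval(coef_imag, m)

print(f"predicted permittivity at m_g=0.6: {eps_model(0.6):.2f}")
```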

  11. Dynamics measured by neutron scattering correlates with the organization of bioenergetics complexes in natural membranes from hyperthermophile and mesophile bacteria.

    Peters, J; Giudici-Orticoni, M T; Zaccai, G; Guiral, M

    2013-07-01

    Various models of membrane structure and of the organization of proteins and complexes in natural membranes have emerged in recent years. However, the lack of systematic dynamical studies to complement structural investigations has hindered the establishment of a more complete picture of these systems. Elastic incoherent neutron scattering gives access to the dynamics on a molecular level and was applied to natural membranes extracted from the hyperthermophile Aquifex aeolicus and the mesophile Wolinella succinogenes bacteria. The results permitted the extraction of a hierarchy of dynamic flexibility and atomic resilience within the samples, which correlated with the organization of proteins in bioenergetic complexes and the functionality of the membranes. PMID:23880731

  12. Use of avidin-biotin-peroxidase complex for measurement of UV lesions in human DNA by microELISA

    The avidin/biotin system was introduced into the standard enzyme-linked immunosorbent assay (ELISA) to increase its sensitivity for detecting UV lesions in human DNA. Goat anti-rabbit IgG-peroxidase used in the standard ELISA as the second antibody was replaced by biotinylated goat anti-rabbit IgG plus the avidin-biotin-peroxidase complex (ABC) reagent. Sensitivity of detection of plate-fixed UV-DNA-antibody complexes was increased about 8-fold, and photolesions could be detected in human DNA samples irradiated with a dose as low as 1 J/m2 UVC or a suberythemal dose of UVB light. (Auth.)

  13. Resting and Task-Modulated High-Frequency Brain Rhythms Measured by Scalp Encephalography in Infants with Tuberous Sclerosis Complex

    Stamoulis, Catherine; Vogel-Farley, Vanessa; Degregorio, Geneva; Jeste, Shafali S.; Nelson, Charles A.

    2015-01-01

    The electrophysiological correlates of cognitive deficits in tuberous sclerosis complex (TSC) are not well understood, and modulations of neural dynamics by neuroanatomical abnormalities that characterize the disorder remain elusive. Neural oscillations (rhythms) are a fundamental aspect of brain function, and have dominant frequencies in a wide…

  14. Cutaneous noradrenaline measured by microdialysis in complex regional pain syndrome during whole-body cooling and heating

    Terkelsen, Astrid Juhl; Gierthmühlen, Janne; Petersen, Lars J.;

    2013-01-01

    Complex regional pain syndrome (CRPS) is characterised by autonomic, sensory, and motor disturbances. The underlying mechanisms of the autonomic changes in CRPS are unknown. However, it has been postulated that sympathetic inhibition in the acute phase with locally reduced levels of noradrenaline...

  15. Complex Problem Solving in Educational Contexts--Something beyond "g": Concept, Assessment, Measurement Invariance, and Construct Validity

    Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno

    2013-01-01

    Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have currently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…

  16. A Measure of Systems Engineering Effectiveness in Government Acquisition of Complex Information Systems: A Bayesian Belief Network-Based Approach

    Doskey, Steven Craig

    2014-01-01

    This research presents an innovative means of gauging Systems Engineering effectiveness through a Systems Engineering Relative Effectiveness Index (SE REI) model. The SE REI model uses a Bayesian Belief Network to map causal relationships in government acquisitions of Complex Information Systems (CIS), enabling practitioners to identify and…

  17. Comparative study on CoFe2O4 ultrafineparticles in liquid and dry specimens of the acidic water-based ferrofluids by STM

    2001-01-01

    In the acidic water-based ferrofluids, the ultrafine particles appeared as spheres with diameters in the range of 2-6 nm. The poly-groups of CoFe2O4 ultrafine particles are divided into two species, i.e. weak poly-groups and strong poly-groups, based on the degree to which the aggregated ultrafine particles can be resolved in STM observations. The ultrafine particles formed in the liquid and dry specimens have different ratios of the two species.

  18. Investigation of the model of the vibration measuring channel of the complex monitoring system of steel tanks

    Бурау, Надежда Ивановна; Цыбульник, Сергей Алексеевич; Шевчук, Дмитрий Владимирович

    2015-01-01

    The presence of defects and damage incurred during manufacture, installation and operation makes monitoring the technical condition of critical engineering and construction structures one of the foremost problems in the diagnosis of such objects. In modern practice worldwide, this problem is solved by using complex intelligent monitoring systems. Owing to their wide range of capabilities, these tools for functional diagnostics are widely used in various industries. The...

  19. Formation of p-cresol:piperazine complex in solution monitored by spin-lattice relaxation times and pulsed field gradient NMR diffusion measurements

    de Carvalho, Erika Martins; Velloso, Marcia Helena Rodrigues; Tinoco, Luzineide Wanderley; Figueroa-Villar, José Daniel

    2003-10-01

    A study of the nature of the anthelmintic p-cresol:piperazine complex in chloroform solution has been conducted using different NMR techniques: self-diffusion coefficients using DOSY; NOE, NULL, and double-selective T1 measurements to determine inter-molecular distances; and selective and non-selective T1 measurements to determine correlation times. The experimental results in solution and CP-MAS were compared to literature X-ray diffraction data using molecular modeling. It was shown that the p-cresol:piperazine complex exists in solution in a manner very similar to the solid state, with one p-cresol molecule hydrogen bonded through the hydroxyl hydrogen to each nitrogen atom of piperazine. The close correspondence between the X-ray diffraction data and the inter-proton distances obtained by the NULL and double-selective excitation techniques indicates that those methodologies can be used to determine inter-molecular distances in solution.

  20. On the adequacy of Adams-Bashforth sampled-data models for characterizing complex underwater-vehicle dynamics with noisy measurements

    Jordán, Mario A.; Bustamante, Jorge L.; Berger, Carlos E.

    2014-01-01

    In this paper the adequacy of high-order interpolation-based approaches for describing highly perturbed complex dynamics in discrete time was analyzed. The analysis establishes features of the approaches related to modularity, consistency with the model order and the sampling times, and accuracy in disturbed contexts with noisy measurements. A detailed study of the sensitivity of local prediction errors under a high signal-to-noise ratio is carried out with analytical expressions in dependenc...
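
    The class of sampled-data models discussed can be illustrated with a two-step Adams-Bashforth (AB2) discretization of a continuous dynamics dx/dt = f(x, u). The first-order dynamics, gains, and excitation below are placeholder assumptions, not the paper's vehicle model.

```python
import numpy as np

def f(x, u):
    """Placeholder continuous dynamics dx/dt = f(x, u)."""
    return -0.8 * x + 0.5 * u

def ab2_step(x, f_now, f_prev, h):
    """Adams-Bashforth 2: x_{k+1} = x_k + h*(3/2*f_k - 1/2*f_{k-1})."""
    return x + h * (1.5 * f_now - 0.5 * f_prev)

h = 0.05                        # sampling time
x = 1.0
f_prev = f(x, 0.0)              # bootstrap the one-step history
for k in range(100):
    u = np.sin(0.2 * k * h)     # excitation input
    f_now = f(x, u)
    x = ab2_step(x, f_now, f_prev, h)
    f_prev = f_now

print(f"state after 100 samples: {x:.4f}")
```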

  1. Latent-class analysis of recurrence risks for complex phenotypes with selection and measurement error: a twin and family history study of autism.

    Pickles, A; Bolton, P.; Macdonald, H.; Bailey, A; Le Couteur, A; Sim, C H; Rutter, M

    1995-01-01

    The use of the family history method to examine the pattern of recurrence risks for complex disorders such as autism is not straightforward. Problems such as uncertain phenotypic definition, unreliable measurement with increased error rates for more distant relatives, and selection due to reduced fertility all complicate the estimation of risk ratios. Using data from a recent family history study of autism, and a similar study of twins, this paper shows how a latent-class approach can be used...

  2. Translabial ultrasound assessment of the anal sphincter complex: normal measurements of the internal and external anal sphincters at the proximal, mid-, and distal levels.

    Hall, Rebecca J; Rogers, Rebecca G; Saiz, Lori; Qualls, C

    2007-08-01

    The purpose of this study was to measure the internal and external anal sphincters using translabial ultrasound (TLU) at the proximal, mid, and distal levels of the anal sphincter complex. Human review committee approval was obtained and all women gave written informed consent. Sixty women presenting for gynecologic ultrasound for symptoms other than pelvic organ prolapse or urinary or anal incontinence underwent TLU. Thirty-six (60%) were asymptomatic and intact, 13 symptomatic and intact, and 11 disrupted. Anterior-posterior diameters of the internal anal sphincter at all levels and of the external anal sphincter at the distal level were measured in four quadrants. Mean sphincter measurements are given for symptomatic and asymptomatic intact women and are comparable to previously reported endoanal MRI and ultrasound measurements. PMID:17221149

  3. A system for traceable measurement of the microwave complex permittivity of liquids at high pressures and temperatures

    A system has been developed for direct traceable dielectric measurements on liquids at high pressures and temperatures. The system consists of a coaxial reflectometric sensor terminated by a metallic cylindrical cell to contain the liquid. It has been designed for measurements on supercritical liquids, but as a first step measurements on dielectric reference liquids were performed. This paper reports on a full evaluation of the system up to 2.5 GHz using methanol, ethanol and n-propanol at pressures up to 9 MPa and temperatures up to 273 °C. A comprehensive approach to the evaluation of uncertainties using Monte Carlo modelling is used
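
    The Monte Carlo uncertainty evaluation mentioned in the record can be sketched generically: draw the input quantities from their assumed distributions, push them through the measurement model, and read off the spread of the output. The measurement model and uncertainties below are placeholders, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Input quantities with assumed standard uncertainties (placeholder model)
gamma = rng.normal(0.62, 0.01, N)    # reflection coefficient magnitude
k_cell = rng.normal(1.85, 0.02, N)   # cell calibration constant

# Placeholder measurement model mapping inputs to a permittivity value
eps = k_cell * (1.0 - gamma) / (1.0 + gamma)

mean, std = eps.mean(), eps.std(ddof=1)
lo, hi = np.percentile(eps, [2.5, 97.5])
print(f"eps = {mean:.3f} +/- {std:.3f} (95% interval {lo:.3f}..{hi:.3f})")
```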

  4. Kinetic measurements and quantum chemical calculations on low spin Ni(II)/(III) macrocyclic complexes in aqueous and sulphato medium

    Anuradha Sankaran; E J Padma Malar; Venkatapuram Ramanujam Vijayaraghavan

    2015-07-01

    Cu(II) ion catalyzed kinetics of oxidation of H2O2 by [NiIIIL2] (L2 = 1,8-bis(2-hydroxyethyl)-1,3,6,8,10,13-hexaazacyclotetradecane) was studied in aqueous acidic medium in the presence of sulphate ion. The rate of oxidation of H2O2 by [NiIIIL2] is faster than that by [NiIIIL1] (L1 = 1,4,8,11-tetraazacyclote-tradecane) in sulphate medium. DFT calculations at BP86/def2-TZVP level lead to different modes of bonding between [NiL]II/III and water ligands (L = L1 and L2). In aqueous medium, two water molecules interact with [NiL]II through weak hydrogen bonds with L and are tilted by ∼23° from the vertical axis forming the dihydrate [NiL]2+.2H2O. However, there is coordinate bond formation between [NiL1]III and two water molecules in aqueous medium and an aqua and a sulphato ligand in sulphate medium leading to the octahedral complexes [NiL1(H2O)2]3+ and [NiL1(SO4)(H2O)]+. In the analogous [NiL2]III, the water molecules are bound by hydrogen bonds resulting in [NiL2]3+.2H2O and [NiL2(SO4)]+.H2O. As the sulphato complex [NiL2(SO4)]+.H2O is less stable than [NiL1(SO4)(H2O)]+ in view of the weak H-bonding interactions in the former it can react faster. Thus the difference in the mode of bonding between Ni(III) and the water ligand can explain the rate of oxidation of H2O2 by [NiIIIL] complexes.

  5. Complexity measurement of a graphical programming language and comparison of a graphical and a textual design language

    Goff, Roger Allen

    1987-01-01

    For many years the software engineering community has been attacking the software reliability problem on two fronts. First via design methodologies, languages and tools as a precheck on quality and second by measuring the quality of produced software as a postcheck. This research attempts to unify the approach to creating reliable software by providing the ability to measure the quality of a design prior to its implementation. Also presented is a comparison of a graphical and a...

  6. The Relationship of 3D Translabial Ultrasound Anal Sphincter Complex Measurements to Postpartum Anal and Fecal Incontinence

    MERIWETHER, Kate V.; HALL, Rebecca J.; LEEMAN, Lawrence M.; MIGLIACCIO, Laura; QUALLS, Clifford; ROGERS, Rebecca G.

    2015-01-01

    Objective We aimed to determine whether ASC measurements on translabial ultrasound (TL-US) were related to anal incontinence (AI) or fecal incontinence (FI) symptoms six months postpartum. Methods A prospective cohort of primiparous women underwent TL-US six months after a vaginal birth (VB) or Cesarean delivery (CD). Muscle thickness was measured at the 3, 6, 9, and 12 o'clock positions of the external sphincter (EAS), in the same four quadrants of the internal sphincter (IAS) at the proximal, mid, and distal levels, and at the bilateral pubovisceralis muscle (PVM). Measurements were correlated to AI and FI on the Wexner Fecal Incontinence Scale, with sub-analyses by mode of delivery. The odds ratio (OR) of symptoms was calculated for every one millimeter increase in muscle thickness (E1MIT). Results 423 women (299 VB, 124 CD) had TL-US six months postpartum. Decreased AI risk was associated with thicker measurements at the 6 o'clock (OR 0.74 E1MIT) and 9 o'clock proximal IAS (OR 0.71 E1MIT) in the entire cohort. For CD women, thicker measurements of the 9 o'clock proximal IAS were associated with decreased risk of AI (OR 0.56 E1MIT) and thicker distal 6 o'clock IAS measurements were related to a decreased risk of FI (OR 0.37 E1MIT). For VB women, no sphincter measurements were significantly related to symptoms, but thicker PVM measurements were associated with increased risk of AI (right side OR 1.32 E1MIT; left side OR 1.21 E1MIT). Conclusions ASC anatomy is associated with AI and FI in certain locations; these locations vary based on the patient's mode of delivery. PMID:26085463

  7. Electronic speckle pattern interferometry technique for the measurement of complex mechanical structures for aero-spatial applications

    Restrepo, René; Uribe-Patarroyo, Néstor; Garranzo, Daniel; Pintado, José M.; Frovel, Malte; Belenguer, Tomás

    2010-09-01

    Using the electronic speckle pattern interferometry (ESPI) technique in the in-plane arrangement, the coefficient of thermal expansion (CTE) of a composite material that will be used in a passive focusing mechanism of an aerospace mission was measured. This ESPI measurement was compared with another interferometric method (differential interferometry), whose principal characteristic is its high accuracy, although its measurement is only local. As a final step, the results were used to provide feedback to the finite element analysis (FEA). Before the composite material measurements, a quality assessment of the technique was carried out by measuring the CTE of Aluminum 6061-T6; both techniques were compared with the datasheet delivered by the supplier. A review of the basic concepts was given, especially with regard to ESPI, and the considerations used to predict the quality of the fringe formation were explained. A review of the basic concepts for mechanical calculation in composite materials was also given. The CTE of the composite material was found to be 4.69×10⁻⁶ ± 3×10⁻⁶ K⁻¹. The most important advantage of ESPI over differential interferometry is that ESPI provides more information owing to its intrinsically extended-area surface deformation reconstruction, in comparison with the strictly local measurement of differential interferometry.

  8. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies.

    Peter M Wayne

    Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. Objectives: To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. Methods: A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. Results: At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Multiscale entropy offers a complementary
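
    The "multiscale entropy" used to quantify sway complexity is the sample entropy of progressively coarse-grained versions of the time series. Below is a naive Python sketch of that computation; the parameter choices (m = 2, r = 0.15·SD) follow common MSE practice and the sway trace is synthetic.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Naive O(n^2) sample entropy SampEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    def match_count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= r) - len(t)      # pairs within r, minus self-matches
    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2):
    r = 0.15 * np.std(x)                    # tolerance fixed from original series
    mse = []
    for tau in range(1, max_scale + 1):
        n = (len(x) // tau) * tau
        coarse = np.asarray(x)[:n].reshape(-1, tau).mean(axis=1)  # coarse-grain
        mse.append(sample_entropy(coarse, m=m, r=r))
    return mse

rng = np.random.default_rng(7)
sway = np.cumsum(rng.normal(size=500)) * 0.01   # toy anterior-posterior sway trace
print([f"{e:.2f}" for e in multiscale_entropy(sway)])
```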

  9. Measurement of Labile Cu, Pb and Their Complexation Capa-city in Yueqing Bay in Zhejiang Province, China

    王正方; 吕海燕; 傅和芳

    2004-01-01

    The complexation capacity of Cu and Pb and their labile and organic contents were determined separately for surface seawater samples from Yueqing Bay. The samples were prepared using the Nuclepore filtration method, yielding <1.0 μm, <0.4 μm and <0.2 μm particulate water samples. Our data indicated that the <0.2 μm colloidal fraction is a major carrier for the distribution of copper in seawater, and the affinity of Cu to marine microparticles plays an important role in the process. Pb, however, tends to be adsorbed by >0.2 μm particles. The complexation capacity of Pb with <0.2 μm particulates was smaller than that with 0.2-1.0 μm particulates, averaging 11.5 and 23.0 nmol/L respectively. The results suggested that colloidal particles were responsible for the distribution and concentration of Pb in seawater.

  10. Numerical calculations and RF characteristics measurement of complex-conjugate impedance antenna system for ICRF heating and current drive

    Characteristics of a complex-conjugate impedance antenna system for ion cyclotron resonance frequency (ICRF) heating and current drive (H and CD) are discussed in this paper. Large RF power is reflected during transitions such as the ELMy H-L mode transition, owing to the large change in plasma resistance during ICRF H and CD. In such a case the RF power injection into the plasma must be stopped in order to protect the tetrode vacuum tubes. The idea of a complex-conjugate impedance antenna system to mitigate the large reflected RF power, referred to as ELM tolerance, has recently been proposed. It has been proved that the reflected RF power fraction can be reduced even during a large change in plasma resistance, but not to the level allowable for the tetrode tubes. Therefore, an improved system adding a single stub tuner is proposed; using it, the reflected RF power fraction can be reduced to the allowable level. Numerical calculations and experiments were carried out for this system, and the experimental data were found to agree with the numerical calculations.

  11. Constraining complex aquifer geometry with geophysics (2-D ERT and MRS measurements) for stochastic modelling of groundwater flow

    Chaudhuri, A.; Sekhar, M.; Descloitres, M.; Godderis, Y.; Ruiz, L.; Braun, J. J.

    2013-11-01

    Stochastic modelling is a useful way of simulating complex hard-rock aquifers, as hydrological properties (permeability, porosity, etc.) can be described using random variables with known statistics. However, very few studies have assessed the influence of topological uncertainty (i.e. the variability of thickness of conductive zones in the aquifer), probably because it is not easy to retrieve accurate statistics of the aquifer geometry, especially in a hard-rock context. In this paper, we assessed the potential of using geophysical surveys to describe the geometry of a hard-rock aquifer in a stochastic modelling framework. The study site was a small experimental watershed in South India, where the aquifer consisted of a clayey to loamy-sandy zone (regolith) underlain by a conductive fissured rock layer (protolith) and the unweathered gneiss (bedrock) at the bottom. The spatial variability of the thickness of the regolith and fissured layers was estimated by electrical resistivity tomography (ERT) profiles, which were performed along a few cross sections in the watershed. For stochastic analysis using Monte Carlo simulation, the generated random layer thickness was made conditional on the available data from the geophysics. In order to simulate steady state flow in the irregular domain with variable geometry, we used an isoparametric finite element method to discretize the flow equation over an unstructured grid with irregular hexahedral elements. The results indicated that the spatial variability of the layer thickness had a significant effect on reducing the simulated effective steady seepage flux and that using the conditional simulations reduced the uncertainty of the simulated seepage flux. In conclusion, combining information on the aquifer geometry obtained from geophysical surveys with stochastic modelling is a promising methodology to improve the simulation of groundwater flow in complex hard-rock aquifers.

  12. Speed Isn't Everything: Complex Processing Speed Measures Mask Individual Differences and Developmental Changes in Executive Control

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2013-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of "processing speed" may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and…

  13. Topography, complex refractive index, and conductivity of graphene layers measured by correlation of optical interference contrast, atomic force, and back scattered electron microscopy

    The optical phase shift by reflection on graphene is measured by interference contrast microscopy. The height profile across graphene layers on 300 nm thick SiO2 on silicon is derived from the phase profile. The complex refractive index and conductivity of graphene layers on silicon with 2 nm thin SiO2 are evaluated from a phase profile, while the height profile of the layers is measured by atomic force microscopy. It is observed that the conductivity measured on thin SiO2 is significantly greater than on thick SiO2. Back scattered electron contrast of graphene layers is correlated to the height of graphene layers

  14. Structure and equilibria of Ca²⁺-complexes of glucose and sorbitol from multinuclear (¹H, ¹³C and ⁴³Ca) NMR measurements supplemented with molecular modelling calculations

    Pallagi, A.; Dudás, Cs.; Csendes, Z.; Forgó, P.; Pálinkó, I.; Sipos, P.

    2011-05-01

    Ca²⁺-complexation of D-glucose and D-sorbitol has been investigated with the aid of multinuclear (¹H, ¹³C and ⁴³Ca) NMR spectroscopy and ab initio quantum chemical calculations. Formation constants of the forming 1:1 complexes have been estimated from one-dimensional ¹³C NMR spectra obtained at constant ionic strength (1 M NaCl). Binding sites were identified from 2D ¹H-⁴³Ca NMR spectra. 2D NMR measurements and ab initio calculations indicated that Ca²⁺ ions were bound in a tridentate manner via the glycosidic OH, the ethereal oxygen in the ring and the OH on the terminal carbon for the α- and β-anomers of glucose; for sorbitol, simultaneous binding of four hydroxide moieties (C1, C2, C4 and C6) was suggested.

  15. Theory and calibration of non-nulling seven-hole cone probes for use in complex flow measurement

    Everett, K. N.; Durston, D. A.; Gerner, A. A.

    1982-01-01

    A seven-hole conical pressure probe capable of measuring flow conditions at angles up to 75 deg relative to its axis is described. The theoretical rationale of the seven-hole probe is developed and the calibration procedure outlined. Three-variable third order polynomials are used to represent local values of total pressure, static pressure, Mach number and relative flow angles. These flow conditions can be determined explicitly from measured probe pressures. Flow angles may be determined within 2.5 deg and Mach number within 0.05 with 95% certainty. The probe was calibrated in subsonic compressible and incompressible flows. Results of a calibration of four seven-hole probes are presented.
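
    The calibration idea named above (third-order polynomials in three variables that map measured port pressures to flow quantities) can be sketched as an ordinary least-squares fit over all monomials up to degree three. The pressure-coefficient definitions and calibration data below are illustrative assumptions.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly3_features(C):
    """All monomials of degree <= 3 in the columns of C (shape n x 3)."""
    cols = [np.ones(len(C))]
    for deg in (1, 2, 3):
        for idx in combinations_with_replacement(range(C.shape[1]), deg):
            cols.append(np.prod(C[:, idx], axis=1))
    return np.column_stack(cols)   # 1 + 3 + 6 + 10 = 20 terms

rng = np.random.default_rng(3)
C_cal = rng.uniform(-1, 1, (400, 3))   # calibration pressure coefficients
# Synthetic calibration data standing in for wind-tunnel measurements
mach_cal = (0.3 + 0.1 * C_cal[:, 0] - 0.05 * C_cal[:, 1] ** 2
            + rng.normal(scale=0.002, size=400))

beta, *_ = np.linalg.lstsq(poly3_features(C_cal), mach_cal, rcond=None)

# Evaluate the fitted polynomial for a new probe reading
C_new = np.array([[0.2, -0.4, 0.1]])
print(f"predicted Mach: {(poly3_features(C_new) @ beta)[0]:.3f}")
```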

  16. Development of a microwave transmission setup for time-resolved measurements of the transient complex conductivity in bulk samples

    Schins, J. M.; Prins, P.; Grozema, F. C.; Abellón, R. D.; de Haas, M. P.; Siebbeles, L. D. A.

    2005-08-01

    We describe and characterize a microwave transmission setup for the measurement of radiation-induced transient conductivities in the frequency range between 26 and 38 GHz (Q band). This technique combines the virtues of two already existing techniques. On one hand, the microwave transmission technique is well established for the determination of (quasi)static conductivities, but requires adaptations to be suitable for the determination of transient conductivities with 1 ns temporal resolution. On the other hand, the transient conductivity technique is well established too, but in its present form (using a reflection configuration) it suffers from a poor signal-to-noise ratio due to unwanted interferences. These interferences are due to the circulator, which diverts part of the incoming microwave flux directly to the detector. We characterized the transmission setup by measuring the real and imaginary components of the conductivity of pulse-irradiated CO2 gas at different pressures, and compared these results to predictions of the Drude model. CO2 was chosen as a test sample because of its well-characterized behavior when irradiated with MeV electron pulses, and because a wide range of ratios of the imaginary to real components of the conductivity is obtainable by simply controlling the pressure. For intrinsic bulk insulators (either powders or in solution), pulse-induced conductivity changes as small as 10⁻⁸ S/m can be measured with nanosecond time resolution. The ratio of the imaginary to real part of the conductivity can be measured in the range from 0.084 to 28, which means that the dynamic range has been increased more than 100-fold with respect to the customary reflection setup.
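
    The Drude-model comparison mentioned above can be sketched directly: for free electrons with collision time tau, sigma(omega) = n·e²·tau / (m·(1 - iω·tau)), so the imaginary-to-real ratio equals ω·tau and sweeps over a wide range as pressure (and hence tau) is varied. The electron density and collision-time scaling below are assumptions.

```python
import numpy as np
from scipy.constants import e, m_e

f_probe = 34e9                 # probe frequency in the Q band (Hz)
omega = 2 * np.pi * f_probe
n_e = 1e14                     # electron density, m^-3 (assumed)

for p_bar in (0.1, 1.0, 10.0):         # gas pressure, bar
    tau = 3e-12 / p_bar                # collision time ~ 1/pressure (assumed)
    # Drude complex conductivity; Im/Re reduces to omega*tau
    sigma = n_e * e**2 * tau / (m_e * (1 - 1j * omega * tau))
    print(f"p = {p_bar:5.1f} bar: Re(sigma) = {sigma.real:.2e} S/m, "
          f"Im/Re = {sigma.imag / sigma.real:.2f}")
```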

  17. Measurement of environmental impacts of telework adoption amidst change in complex organizations. AT&T survey methodology and results

    Atkyns, Robert; Blazek, Michele; Roitz, Joseph [AT&T, 179 Bothin Road, 94930 Fairfax, CA (United States)

    2002-10-01

    Telecommuting practices and their environmental and organizational performance impacts have stimulated research across academic disciplines. Although telecommuting trends and impact projections are reported, few true longitudinal studies involving large organizations have been conducted. Published studies typically lack the research design elements to control a major confounding variable: rapid and widespread organizational change. Yet social science 'Best Practices' and market research industry quality control procedures exist that can help manage organizational change effects and other common sources of measurement error. In 1992, AT&T established a formal, corporate-wide telecommuting policy. A research and statistical modeling initiative was implemented to measure how flexible work arrangements reduce automotive emissions. Annual employee surveys were begun in 1994. As telecommuting benefits have been increasingly recognized within AT&T, the essential construct has been redefined as 'telework.' The survey's scope has expanded to address broader organization issues and provide guidance to multiple internal constituencies. This paper focuses upon the procedures used to reliably measure the adoption of telework practices and model their environmental impact, and contrasts those procedures with other, less reliable methodologies.

  18. Statistical and Spectral Analysis of Wind Characteristics Relevant to Wind Energy Assessment Using Tower Measurements in Complex Terrain

    Radian Belu

    2013-01-01

    The main objective of the study was to investigate spatial and temporal characteristics of the wind speed and direction in complex terrain that are relevant to wind energy assessment and development, as well as to wind energy system operation, management, and grid integration. Wind data from five tall meteorological towers located in Western Nevada, USA, operated from August 2003 to March 2008, were used in the analysis. The multiannual average wind speeds did not show a significantly increasing trend with elevation, while the turbulence intensity slowly decreased with increasing average wind speed. The wind speed and direction were modeled using the Weibull and the von Mises distribution functions. The correlations show a strong coherence between the wind speed and direction, with slowly decreasing amplitude of the multiday periodicity for increasing lag periods. The spectral analysis shows significant annual periodicity with similar characteristics at all locations. The relatively high correlations between the towers and the small range of the computed turbulence intensity indicate that wind variability is dominated by the regional synoptic processes. Knowledge and information about daily, seasonal, and annual wind periodicities are very important for wind energy resource assessment, wind power plant operation, management, and grid integration.
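
    The distribution modelling named above can be sketched with scipy: fit a Weibull distribution to the wind speeds and a von Mises distribution to the wind directions. Synthetic data stand in for the tower measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic stand-ins for tower measurements
speeds = stats.weibull_min.rvs(2.0, scale=7.0, size=5000, random_state=rng)
directions = stats.vonmises.rvs(2.5, loc=np.pi, size=5000, random_state=rng)

# Weibull fit: fix the location at 0 m/s so shape k and scale c are returned
k, _, c = stats.weibull_min.fit(speeds, floc=0)
# von Mises fit: fix scale at 1 (circular data on [-pi, pi])
kappa, mu, _ = stats.vonmises.fit(directions, fscale=1)

print(f"Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")
print(f"von Mises mean direction = {np.degrees(mu):.0f} deg, kappa = {kappa:.2f}")
```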

  19. Determination of catecholamines based on the measurement of the metal nanoparticle-enhanced fluorescence of their terbium complexes

    We have developed a method for the determination of the three catecholamines (CAs) epinephrine (EP), norepinephrine (NE), and dopamine (DA) at sub-nanomolar levels. It is found that the luminescence of the complexes formed between the CAs and the Tb³⁺ ion is strongly enhanced in the presence of colloidal silver nanoparticles (Ag-NPs). The Ag-NPs cause a transfer of the resonance energy to the fluorophores through the interaction of the excited-state fluorophores and surface plasmon electrons in the Ag-NPs. Under the optimized conditions, the luminescence intensity of the system is linearly related to the concentration of the CAs. Linearity is observed in the concentration ranges of 2.5-110 nM for EP, 2.8-240 nM for NE, and 2.4-140 nM for DA, with limits of detection as low as 0.25 nM, 0.64 nM and 0.42 nM, respectively. Relative standard deviations were determined at 10 nM concentrations (for n = 10) and gave values of 0.98%, 1.05% and 0.96% for EP, NE and DA, respectively. Catecholamines were successfully determined in pharmaceutical preparations, and successful recovery experiments are demonstrated for urine and serum samples. (author)
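
    The calibration arithmetic behind reported detection limits can be sketched as follows: fit the linear luminescence-concentration response and estimate the limit of detection as LOD = 3.3·σ/slope (one common convention; the record does not state the paper's exact criterion). Data are synthetic.

```python
import numpy as np

conc_nM = np.array([5, 20, 40, 60, 80, 110])          # EP standards (nM, assumed)
signal = np.array([0.9, 3.1, 6.0, 9.2, 12.1, 16.4])   # luminescence (a.u., assumed)

slope, intercept = np.polyfit(conc_nM, signal, 1)
# Residual standard deviation as a stand-in for blank noise
residual_sd = np.std(signal - (slope * conc_nM + intercept), ddof=2)
lod = 3.3 * residual_sd / slope

print(f"slope = {slope:.3f} a.u./nM, LOD ~ {lod:.2f} nM")
```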

  20. Utilization of Methyl Proton Resonances in Cross-Saturation Measurement for Determining the Interfaces of Large Protein-Protein Complexes

    Cross-saturation experiments allow the identification of the contact residues of large protein complexes (MW > 50 K) more rigorously than conventional NMR approaches which involve chemical shift perturbations and hydrogen-deuterium exchange experiments [Takahashi et al. (2000) Nat. Struct. Biol., 7, 220-223]. In the amide proton-based cross-saturation experiment, the combined use of high deuteration levels for non-exchangeable protons of the ligand protein and a solvent with a low concentration of ¹H₂O greatly enhanced the selectivity of the intermolecular cross-saturation phenomenon. Unfortunately, experimental limitations caused losses in sensitivity. Furthermore, since main chain amide protons are not generally exposed to solvent, the efficiency of the saturation transfer directed to the main chain amide protons is not very high. Here we propose an alternative cross-saturation experiment which utilizes the methyl protons of the side chains of the ligand protein. Owing to the fast internal rotation along the methyl axis, we theoretically and experimentally demonstrated the enhanced efficiency of this approach. The methyl-utilizing cross-saturation experiment has clear advantages in sensitivity and saturation transfer efficiency over the amide proton-based approach.

  1. Analytical algorithm for modeling polarized solar radiation transfer through the atmosphere for application in processing complex lidar and radiometer measurements

    Inversion algorithms and program packages recently created for processing data from ground-based radiometer spectral measurements along with lidar multi-wavelength measurements are extremely multiparametric. It is therefore very important to develop an efficient program module for computing the functions that model sun-radiometer measurements inside the inversion procedure. In this paper, we present the analytical version of such an efficient algorithm and an analytical C++ code designed for algorithm testing. The code computes multiple scattering of sunlight in the atmosphere. The output data are the angular patterns of the radiance and linear polarization parameters at a preselected altitude. The atmosphere model with mixed aerosol and molecular scattering is approximated by a homogeneous atmosphere model. The algorithm was tested by comparing computed data with accurate data obtained from a discrete-ordinate code; errors of the downward radiance estimates above the Earth's surface turned out to be within 10%-15%. The concept of the analytical solution construction is taken from the scalar problem of solar radiation transfer in the atmosphere, for which an approximate analytical solution had been developed. Taking into account the fact that aerosol phase functions are highly forward-elongated, the multi-component method of solving vector transfer equations and the small-angle approximation have been used. The generalization of the scalar approach to the polarization parameters is described.

  2. Corrosion of (Th,U)O₂ fuel at room temperature and near 100 degrees C in near neutral and acidic water (pH 3)

    Sunder, S.; Miller, N.H

    1998-10-01

    Dissolution and corrosion of (Th,U)O₂ fuel was investigated at room temperature and near 100 degrees C in near neutral and acidic water (pH 3) to evaluate the suitability of irradiated UO₂-doped thoria as a waste form for direct geological disposal. X-ray photoelectron spectroscopy and X-ray diffraction were used to study oxidation of (Th,U)O₂ fuel. The uranium in the surface of (Th,U)O₂ fuel undergoes oxidation similar to that observed in UO₂ fuel under similar conditions. Nevertheless, the dissolution rate of uranium from (Th,U)O₂ fuel in aerated solutions is much lower than that from UO₂ fuel under similar conditions. (author)

  3. The distribution of acid, water, methanol, ethanol and acetone between mixed aqueous-organic nitric acid solutions of trilaurylammonium nitrate in cyclohexane

    The distribution of acid, water, methanol, ethanol and acetone between mixed aqueous-organic nitric acid solutions and solutions of trilaurylammonium nitrate in cyclohexane has been investigated. The distribution of acid rises with increasing concentrations of nitric acid, methanol, ethanol and acetone in the mixed aqueous-organic phase. The effect of the organic additives in increasing the distribution of the acid is methanol < ethanol < acetone. The concentration of nitric acid in the organic phase can be calculated by a formula similar to that describing the extraction from pure aqueous solutions. The distribution curves of water, methanol and ethanol resemble each other, all of them showing a minimum when the distribution ratio is plotted versus the nitric acid concentration in the mixed aqueous-organic phase. The acetone distribution decreases steadily with increasing nitric acid concentration. The shape of the curves is briefly discussed. (T.G.)

  5. Direct quantitative electrical measurement of many-body interactions in exciton complexes in InAs quantum dots.

    Labud, P A; Ludwig, A; Wieck, A D; Bester, G; Reuter, D

    2014-01-31

    We present capacitance-voltage spectra for the conduction band states of InAs quantum dots obtained under continuous illumination. The illumination leads to the appearance of additional charging peaks that we attribute to the charging of electrons into quantum dots containing a variable number of illumination-induced holes. By this we demonstrate an electrical measurement of excitonic states in quantum dots. Magnetocapacitance-voltage spectroscopy reveals that the electron always tunnels into the lowest electronic state. This allows us to directly extract, from the highly correlated many-body states, the correlation energy. The results are compared quantitatively to state of the art atomistic configuration interaction calculations, showing very good agreement for a lower level of excitations and also limitations of the approach for an increasing number of particles. Our experiments offer a rare benchmark to many-body theoretical calculations. PMID:24580478

  6. Experiment on measurement of energetic neutron and γ quanta fluxes at the Salyut-7 - Kosmos-1686 orbital complex

    A rather simple small-size device to measure the fluxes of energetic neutrons and gamma quanta was installed aboard the "Kosmos-1686" satellite, which was launched on September 27, 1985, mated to the "Salyut-7" orbital station on October 2, 1985, and operated within this orbital complex through April 1988. To register neutrons, their inelastic interactions with Cs and I nuclei, followed by the yield of heavy charged particles in the scintillation counter, were used. The device detector, a CsI(Tl) crystal of 7.5 cm diameter and height, was surrounded by an active charged-particle shield of plastic scintillator 2 cm thick; the system was read out by the photomultiplier FEhU-110. The thickness of the counter casing walls is 0.8 g·cm⁻² of Al. Event separation between CsI(Tl) and the plastic scintillator was carried out according to the difference in luminescence times by a conventional electronic circuit.

  7. Analysis of Zr in UZr (6%) alloy by spectrophotometric measurement of Zr-arsenazo III complexes

    Sample solutions were prepared by dissolving a UZr 6% ingot in 1 M HF and 1 M HNO3 solvent at 95°C. The Zr complex was formed by reacting UZr (6%) with arsenazo III (0.1%). Verification of the parameters for the formation and measurement of the Zr-arsenazo III complex was performed using a standard solution of SRM SPEK Zr at various concentrations, arsenazo III (0.1%) and HCl. The results showed that the optimum condition for the formation of the Zr-arsenazo III complex was at a wavelength of 666.3 nm and 9 N HCl, with the complex stable from 30 minutes to 3.5 hours at an arsenazo III (0.1%) concentration of 80 ppm. The analysis was done in the range of 0.04 ppm to 0.5 ppm with an accuracy of 1.846% and a precision of 0.868%. Separation of Zr from the UZr alloy is required before the analysis because the presence of uranium in the sample may affect the analysis result significantly. The separation of Zr from the uranium matrix was done by extraction using TBP/kerosene (7:3) with a contacting time of 10 minutes. The analysis of Zr in the aqueous phase shows that the Zr content is 5.28%, with 91.84% recovery and 0.08% method precision (RSD). (author)

  8. Measuring νμ disappearance with the MINOS experiment

    Mitchell, Jessica Sarah [Univ. of Cambridge (United Kingdom)

    2011-01-01

    The MINOS Experiment consists of two steel-scintillator calorimeters, sampling the long baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the 'atmospheric' neutrino mixing parameters, Δm²_atm and sin²(2θ_atm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations on the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a νμ disappearance analysis, and presents a new sophisticated fitting software framework, which employs a maximum likelihood method to extract the best fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors, and is capable of fitting multiple event samples (defined by the selections applied) in parallel, and any combination of energy dependent and independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset if the true mixing angle was non-maximal was also investigated, with an improved neutrino energy reconstruction for very low energy events. The best fit oscillation parameters, obtained by the fitting software and incorporating resolution information, were: |Δm²| = 2.32 (+0.12/-0.08) × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world best measurement of the atmospheric neutrino mass

  9. Measuring and predicting reservoir heterogeneity in complex deposystems. The fluvial-deltaic Big Injun Sandstone in West Virginia. Final report, September 20, 1991--October 31, 1993

    Hohn, M.E.; Patchen, D.G.; Heald, M.; Aminian, K.; Donaldson, A.; Shumaker, R.; Wilson, T.

    1994-05-01

    Non-uniform composition and permeability of a reservoir, commonly referred to as reservoir heterogeneity, is recognized as a major factor in the efficient recovery of oil during primary production and enhanced recovery operations. Heterogeneities are present at various scales and are caused by various factors, including folding and faulting, fractures, diagenesis and depositional environments. Thus, a reservoir consists of a complex flow system, or series of flow systems, dependent on lithology, sandstone genesis, and structural and thermal history. Ultimately, however, fundamental flow units are controlled by the distribution and type of depositional environments. Reservoir heterogeneity is difficult to measure and predict, especially in more complex reservoirs such as fluvial-deltaic sandstones. The Appalachian Oil and Natural Gas Research Consortium (AONGRC), a partnership of Appalachian basin state geological surveys in Kentucky, Ohio, Pennsylvania, and West Virginia, and West Virginia University, studied the Lower Mississippian Big Injun sandstone in West Virginia. The Big Injun research was multidisciplinary and designed to measure and map heterogeneity in existing fields and undrilled areas. The main goal was to develop an understanding of the reservoir sufficient to predict, in a given reservoir, optimum drilling locations versus high-risk locations for infill, outpost, or deeper-pool tests.

  10. Human brain mapping under increasing cognitive complexity using regional cerebral blood flow measurements and positron emission tomography.

    Law, Ian

    2007-11-01

    Measurement of the regional cerebral blood flow (rCBF) is an important parameter in the evaluation of cerebral function. With positron emission tomography (PET), rCBF has predominantly been quantified using the short-lived radiotracer oxygen-15-labelled water (H₂¹⁵O) and an adaptation of the Kety one-tissue compartment autoradiographic model. The values attained in putative grey matter, however, are systematically underestimated because of the limited scanner resolution. For this reason we applied a dynamic kinetic two-tissue compartment model including a fast and a slow flow component, each with a perfusable tissue fraction. In the fast component rCBF was 2-2.5 times greater than grey matter values obtained using traditional autoradiography in both humans and monkeys. Visual stimulation in humans gave a corrected rCBF increase of approximately 40%. Visual stimulation was also used to indirectly validate carbon-10-labelled carbon dioxide (¹⁰CO₂), a new very short-lived rCBF PET tracer with a half-life of only 19.3 seconds. This allowed an increase in the number of independent PET scans per subject from 12-14 using H₂¹⁵O to 64 using ¹⁰CO₂. The experiment demonstrated a maximal activation response in the visual cortex at a 10-15 Hz stimulation frequency. The use of the rCBF PET mapping technique is illustrated by studies of the organization of language and the oculomotor system. With respect to the former, we found confirmation of neuropsychological evidence of the involvement of the left supramarginal/angular gyrus in reading in Japanese of a phonologically based script system, Kana, and of the left posterior inferior temporal gyrus in reading of a morphogram-based script system, Kanji. Concerning the organization of the oculomotor system, we found overlapping areas in fronto-parietal cortex involved in maintaining visual fixation and performing visually guided and imagined eye movements. These data show that overt eye movements are not a prerequisite of the
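
    A minimal sketch of the Kety one-tissue compartment model referred to above, dCt/dt = f·Ca(t) − (f/λ)·Ct(t), integrated forward in time. The arterial input function and all parameter values below are invented for illustration.

      # Toy forward simulation of the Kety one-tissue compartment model:
      # dCt/dt = f*Ca(t) - (f/lam)*Ct(t). Input function and values invented.
      import numpy as np

      f = 0.5 / 60.0           # flow, ml blood per ml tissue per second
      lam = 0.9                # blood-tissue partition coefficient
      dt = 0.1                 # time step (s)
      t = np.arange(0.0, 120.0, dt)

      Ca = np.exp(-((t - 20.0) / 10.0) ** 2)   # invented arterial bolus input
      Ct = np.zeros_like(t)
      for k in range(1, t.size):               # explicit Euler integration
          Ct[k] = Ct[k - 1] + dt * (f * Ca[k - 1] - (f / lam) * Ct[k - 1])

      print(f"peak tissue activity {Ct.max():.4f} at t = {t[Ct.argmax()]:.1f} s")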

  11. Design of measuring machine for complex geometry based on form-free measurement mode

    石照耀; 张斌; 林家春

    2012-01-01

    To measure complex geometries without nominal mathematical models, a "form-free measurement mode" is introduced and its basic requirements for the measuring machine are analyzed. A fixed-column coordinate measuring machine with high accuracy and efficiency was designed based on this novel mode. The machine is driven by linear motors, and high-accuracy gratings are used as the measurement devices. To decrease the influence of work-piece weight, a closed aerostatic bearing with vacuum preload and an H-style two-dimensional co-planar structure were designed. A pneumatic-cylinder-balanced Z axis with a brake function and a vibration isolation assembly were also designed. The measurement span of the machine is 300 mm × 300 mm × 300 mm and the measurement uncertainty is 1.8 μm. It can be applied to measure complex geometries without nominal mathematical models.

  12. Randomness, Information, and Complexity

    Grassberger, Peter

    2012-01-01

    We review possible measures of complexity which might in particular be applicable to situations where the complexity seems to arise spontaneously. We point out that not all of them correspond to the intuitive (or "naive") notion, and that one should not expect a unique observable of complexity. One of the main problems is to distinguish complex from disordered systems. This and the fact that complexity is closely related to information requires that we also give a review of information measures. We finally concentrate on quantities which measure in some way or other the difficulty of classifying and forecasting sequences of discrete symbols, and study them in simple examples.
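
    One family of such quantities can be illustrated with block entropies: the sketch below estimates H_n over length-n words of a symbol sequence and the differences h_n = H_(n+1) − H_n, a finite-length estimate of the entropy rate (harder-to-forecast sequences have larger h_n). The toy sequence is invented.

      # Block entropies H_n over length-n words and h_n = H_(n+1) - H_n,
      # a finite-length estimate of the entropy rate. Toy sequence only.
      import random
      from collections import Counter
      from math import log2

      def block_entropy(seq, n):
          words = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
          total = sum(words.values())
          return -sum(c / total * log2(c / total) for c in words.values())

      random.seed(0)   # periodic sequence with 10% random "defects"
      seq = "".join("ab"[i % 2] if random.random() > 0.1 else "b"
                    for i in range(5000))

      for n in range(1, 5):
          h = block_entropy(seq, n + 1) - block_entropy(seq, n)
          print(f"h_{n} = {h:.3f} bits/symbol")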

  13. A comparison of the effects of 6 weeks of traditional resistance training, plyometric training, and complex training on measures of strength and anthropometrics.

    MacDonald, Christopher J; Lamont, Hugh S; Garner, John C

    2012-02-01

    Complex training (CT; alternating between heavy and lighter load resistance exercises with similar movement patterns within an exercise session) is a form of training that may potentially bring about a state of postactivation potentiation, resulting in increased dynamic power (Pmax) and rate of force development during the lighter load exercise. Such a method may be more effective than either modality, independently for developing strength. The purpose of this research was to compare the effects of resistance training (RT), plyometric training (PT), and CT on lower body strength and anthropometrics. Thirty recreationally trained college-aged men were trained using 1 of 3 methods: resistance, plyometric, or complex twice weekly for 6 weeks. The participants were tested pre, mid, and post to assess back squat strength, Romanian dead lift (RDL) strength, standing calf raise (SCR) strength, quadriceps girth, triceps surae girth, body mass, and body fat percentage. Diet was not controlled during this study. Statistical measures revealed a significant increase for squat strength (p = 0.000), RDL strength (p = 0.000), and SCR strength (p = 0.000) for all groups pre to post, with no differences between groups. There was also a main effect for time for girth measures of the quadriceps muscle group (p = 0.001), the triceps surae muscle group (p = 0.001), and body mass (p = 0.001; post hoc revealed no significant difference). There were main effects for time and group × time interactions for fat-free mass % (RT: p = 0.031; PT: p = 0.000). The results suggest that CT mirrors benefits seen with traditional RT or PT. Moreover, CT revealed no decrement in strength and anthropometric values and appears to be a viable training modality. PMID:22240547

  14. Direct sun and airborne MAX-DOAS measurements of the collision induced oxygen complex, O2O2 absorption with significant pressure and temperature differences

    E. Spinei

    2014-09-01

    The collision induced O2 complex, O2O2, is a very important trace gas in remote sensing measurements of aerosol and cloud properties. Some ground based MAX-DOAS measurements of O2O2 slant column density require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a near pure Rayleigh atmosphere. One of the potential causes of this discrepancy is believed to be uncertainty in laboratory measured O2O2 absorption cross section temperature and pressure dependence, due to difficulties in replicating atmospheric conditions in the laboratory environment. This paper presents direct-sun (DS) and airborne multi-axis (AMAX) DOAS measurements of O2O2 absorption optical depths under actual Earth atmospheric conditions in two wavelength regions (335–390 nm and 435–490 nm). DS irradiance measurements were made by the research grade MFDOAS instrument from 2007–2014 at seven sites with significant pressure (778–1013 hPa) and O2O2 profile weighted temperature (247–275 K) differences. Aircraft MAX-DOAS measurements were conducted by the University of Colorado AMAX-DOAS instrument on 29 January 2012 over the Southern Hemisphere subtropical Pacific Ocean. Scattered solar radiance spectra were collected at altitudes between 9 and 13.2 km, with O2O2 profile weighted temperatures of 231–244 K, and near pure Rayleigh scattering conditions. Due to the well defined DS air mass factors and extensively characterized atmospheric conditions during the AMAX-DOAS measurements, O2O2 "pseudo" absorption cross sections, σ, are derived from the observed optical depths and estimated O2O2 column densities. Vertical O2O2 columns are calculated from the atmospheric sounding temperature, pressure and specific humidity profiles. Based on the atmospheric DS observations, there is no pressure dependence of the O2O2 σ, within the measurement errors (3%). The two data sets are combined to derive peak σ temperature dependence of 360 and 477 nm

  15. Direct sun and airborne MAX-DOAS measurements of the collision induced oxygen complex, O2O2 absorption with significant pressure and temperature differences

    Spinei, E.; Cede, A.; Herman, J.; Mount, G. H.; Eloranta, E.; Morley, B.; Baidar, S.; Dix, B.; Ortega, I.; Koenig, T.; Volkamer, R.

    2014-09-01

    The collision induced O2 complex, O2O2, is a very important trace gas in remote sensing measurements of aerosol and cloud properties. Some ground based MAX-DOAS measurements of O2O2 slant column density require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a near pure Rayleigh atmosphere. One of the potential causes of this discrepancy is believed to be uncertainty in laboratory measured O2O2 absorption cross section temperature and pressure dependence, due to difficulties in replicating atmospheric conditions in the laboratory environment. This paper presents direct-sun (DS) and airborne multi-axis (AMAX) DOAS measurements of O2O2 absorption optical depths under actual Earth atmospheric conditions in two wavelength regions (335-390 nm and 435-490 nm). DS irradiance measurements were made by the research grade MFDOAS instrument from 2007-2014 at seven sites with significant pressure (778-1013 hPa) and O2O2 profile weighted temperature (247-275 K) differences. Aircraft MAX-DOAS measurements were conducted by the University of Colorado AMAX-DOAS instrument on 29 January 2012 over the Southern Hemisphere subtropical Pacific Ocean. Scattered solar radiance spectra were collected at altitudes between 9 and 13.2 km, with O2O2 profile weighted temperatures of 231-244 K, and near pure Rayleigh scattering conditions. Due to the well defined DS air mass factors and extensively characterized atmospheric conditions during the AMAX-DOAS measurements, O2O2 "pseudo" absorption cross sections, σ, are derived from the observed optical depths and estimated O2O2 column densities. Vertical O2O2 columns are calculated from the atmospheric sounding temperature, pressure and specific humidity profiles. Based on the atmospheric DS observations, there is no pressure dependence of the O2O2 σ, within the measurement errors (3%). The two data sets are combined to derive peak σ temperature dependence of 360 and 477 nm absorption bands from 231
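
    The vertical O2O2 column mentioned above is the altitude integral of the squared O2 number density. The sketch below (my own construction, not the authors' code) computes it from an invented temperature/pressure profile, with [O2] = 0.20946·p/(kB·T).

      # Vertical O2-O2 column as the integral of [O2]^2 over altitude, with
      # [O2] = 0.20946 * p / (kB * T). The sounding profile is invented.
      import numpy as np

      kB = 1.380649e-23                            # Boltzmann constant (J/K)
      z = np.linspace(0.0, 40e3, 401)              # altitude grid (m)
      p = 101325.0 * np.exp(-z / 8000.0)           # toy pressure profile (Pa)
      T = 288.0 - 6.5e-3 * np.minimum(z, 11e3)     # toy temperature profile (K)

      n_O2 = 0.20946 * p / (kB * T)                # O2 number density (m^-3)
      f = n_O2 ** 2                                # integrand (m^-6)
      column = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z))   # trapezoid rule
      print(f"O2-O2 column ~ {column * 1e-10:.3g} molec^2 cm^-5")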

  16. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer

    Rahman, Rezwanur; Taylor, P. C.; Scales, John A.

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit standing wave structure in the sample produced by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006), doi:10.1063/1.2172403; R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982), doi:10.1088/0022-3735/15/1/002; T. M. Hirovnen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996), doi:10.1109/19.516996]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10⁻⁵) the QO approach tends to break down due to loss of signal. In such a case it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we show applications of cavity perturbation to the dielectric characterization of semiconductor thin films of the type used in the manufacture of photovoltaics in the 100-350 GHz range. We measured the complex optical constants of a hot-wire chemical vapor deposition grown 1 μm thick amorphous silicon (a-Si:H) film on a borosilicate glass substrate. The real part of the refractive index and the dielectric constant of the glass substrate vary from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm⁻¹) down to 104 GHz (3.12 cm⁻¹).
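
    For the cavity perturbation step, one common textbook small-perturbation relation (assuming a small sample at an electric-field maximum; the paper's own analysis of thin films on substrates is more involved) gives ε′ and ε″ from the mode frequency shift and the change in Q. All numbers below are invented.

      # Textbook cavity-perturbation relations for a small dielectric sample
      # at an E-field maximum (simplified form; numbers below are invented).
      def cavity_perturbation(f0, fs, Q0, Qs, Vc, Vs):
          """Empty/loaded resonance (f0, Q0)/(fs, Qs); cavity/sample volumes."""
          eps_real = 1.0 + (f0 - fs) / fs * Vc / (2.0 * Vs)
          eps_imag = (1.0 / Qs - 1.0 / Q0) * Vc / (4.0 * Vs)
          return eps_real, eps_imag

      eps_r, eps_i = cavity_perturbation(f0=100.000e9, fs=99.990e9,
                                         Q0=2.0e5, Qs=1.6e5,
                                         Vc=1.0e-5, Vs=2.0e-10)  # volumes in m^3
      print(f"eps' ~ {eps_r:.2f}, eps'' ~ {eps_i:.3g}, "
            f"tan(delta) ~ {eps_i / eps_r:.1e}")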

  17. Environmental Assessment and Finding of No Significant Impact: Interim Measures for the Mixed Waste Management Facility Groundwater at the Burial Ground Complex at the Savannah River Site

    N/A

    1999-12-08

    The U. S. Department of Energy (DOE) prepared this environmental assessment (EA) to analyze the potential environmental impacts associated with the proposed interim measures for the Mixed Waste Management Facility (MWMF) groundwater at the Burial Ground Complex (BGC) at the Savannah River Site (SRS), located near Aiken, South Carolina. DOE proposes to install a small metal sheet pile dam to impound water around and over the BGC groundwater seepline. In addition, a drip irrigation system would be installed. Interim measures will also address the reduction of volatile organic compounds (VOCs) from "hot-spot" regions associated with the Southwest Plume Area (SWPA). This action is taken as an interim measure for the MWMF in cooperation with the South Carolina Department of Health and Environmental Control (SCDHEC) to reduce the amount of tritium seeping from the BGC southwest groundwater plume. The proposed action of this EA is being planned and would be implemented concurrent with a groundwater corrective action program under the Resource Conservation and Recovery Act (RCRA). On September 30, 1999, SCDHEC issued a modification to the SRS RCRA Part B permit that adds corrective action requirements for four plumes that are currently emanating from the BGC. One of those plumes is the southwest plume. The RCRA permit requires SRS to submit a corrective action plan (CAP) for the southwest plume by March 2000. The permit requires that the initial phase of the CAP prescribe a remedy that achieves a 70-percent reduction in the annual amount of tritium being released from the southwest plume area to Fourmile Branch, a nearby stream. Approval and actual implementation of the corrective measure in that CAP may take several years. As an interim measure, the actions described in this EA would manage the release of tritium from the southwest plume area until the final actions under the CAP can be implemented. This proposed action is expected to reduce the

  18. Modelling Complexity in Musical Rhythm

    Liou, Cheng-Yuan; Wu, Tai-Hei; Lee, Chia-Ying

    2007-01-01

    This paper constructs a tree structure for musical rhythm using the L-system. It models the structure as an automaton and derives its complexity. It also solves the complexity for the L-system. This complexity can resolve the similarity between trees and serves as a measure of psychological complexity for rhythms. It resolves the music complexity of various compositions including the Mozart effect K488. Keywords: music perception, psychological complexity, rhythm, L-system, autom...
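
    As a sketch of the rewriting machinery involved, an L-system generates such tree-like structures by repeatedly applying production rules to a string. The alphabet and rules below are invented for illustration, not taken from the paper.

      # Minimal L-system rewriting; alphabet and rules are invented here
      # ('Q' = quarter note, 'E' = eighth note), not taken from the paper.
      def lsystem(axiom, rules, depth):
          s = axiom
          for _ in range(depth):
              s = "".join(rules.get(ch, ch) for ch in s)
          return s

      rules = {"Q": "EE", "E": "EQ"}     # a quarter splits into two eighths
      for d in range(4):
          print(d, lsystem("Q", rules, d))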

  19. Evolution of biological complexity

    Adami, Christoph; Ofria, Charles; Collier, Travis C.

    2000-01-01

    In order to make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity...

  20. Complex Beauty

    Franceschet, Massimo

    2014-01-01

    Complex systems and their underlying convoluted networks are ubiquitous, all we need is an eye for them. They pose problems of organized complexity which cannot be approached with a reductionist method. Complexity science and its emergent sister network science both come to grips with the inherent complexity of complex systems with a holistic strategy. The relevance of complexity, however, transcends the sciences. Complex systems and networks are the focal point of a philosophical, cultural ...

  1. Considering sampling strategy and cross-section complexity for estimating the uncertainty of discharge measurements using the velocity-area method

    Despax, Aurélien; Perret, Christian; Garçon, Rémy; Hauet, Alexandre; Belleville, Arnaud; Le Coz, Jérôme; Favre, Anne-Catherine

    2016-02-01

    uncertainty component for any routine gauging, the four most similar gaugings among the reference stream-gaugings dataset are selected using an analog approach, where analogy includes both riverbed shape and flow distribution complexity. This new method was applied to 3185 stream-gaugings with various flow conditions and compared with the other methods (ISO 748, IVE, Q+ with a simple automated parametrization). Results show that FLAURE is overall consistent with the Q+ method but not with the ISO 748 and IVE methods, which produce clearly overestimated uncertainties for discharge measurements with fewer than 15 verticals. The FLAURE approach therefore appears to be a consistent method. An advantage is the explicit link made between the estimation of cross-sectional interpolation errors and the study of high-resolution reference gaugings.
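
    For reference, the velocity-area method itself is a simple sum: in the mid-section form used by ISO 748, discharge is the sum over verticals of mean velocity × depth × panel width. A minimal sketch with invented gauging data:

      # Mid-section velocity-area discharge (cf. ISO 748): vertical i with
      # mean velocity v[i] and depth d[i] represents a panel of width
      # (b[i+1] - b[i-1]) / 2. The gauging data below are invented.
      def midsection_discharge(b, d, v):
          q = 0.0
          for i in range(1, len(b) - 1):
              q += v[i] * d[i] * (b[i + 1] - b[i - 1]) / 2.0
          return q

      b = [0.0, 1.0, 2.5, 4.0, 5.5, 7.0, 8.0]   # distance from bank (m)
      d = [0.0, 0.6, 1.2, 1.5, 1.1, 0.5, 0.0]   # depth (m)
      v = [0.0, 0.3, 0.6, 0.8, 0.6, 0.2, 0.0]   # mean velocity (m/s)
      print(f"Q ~ {midsection_discharge(b, d, v):.2f} m^3/s")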

  2. Matrix effects during magnetic sector-field inductively coupled plasma mass spectrometry Uranium isotope ratio measurements in complex environmental/biological samples

    Sample matrix effects on mass discrimination during inductively coupled plasma mass spectrometry (ICP-MS) isotope ratio measurements are rarely reported. However, they can lead to errors larger than the uncertainty claimed on the ratio results when not properly taken into account or corrected for. For instance, up to 1% matrix-specific effects were experienced during an isotope dilution mass spectrometry campaign we carried out for the certification of the Cd amount content in some food digest samples (7% acidity and salt content around 450 μg g⁻¹). Specific nuclear safeguards programs were designed for the monitoring of declared and non-declared nuclear activities, and important efforts are currently deployed to better understand the consequences on human health of the dispersion of depleted uranium in the environment. The interest in developing and/or improving measurement capabilities for uranium isotope ratios and uranium content in environmental and biological samples has therefore considerably increased in the last decade. However, procedure validation is rarely addressed with these developments even though, for instance, non-disputable uncertainty statements are absolutely crucial to underpin correctly the important decisions of political, economical, military or medical nature that can arise from these results. This is why we produced simulated urine samples (a complex matrix made of organic and inorganic components) with certified n(234U)/n(238U), n(235U)/n(238U) and n(236U)/n(238U) ratios. These, which will eventually be commercially available for validation purposes, will first be used as test materials for an international interlaboratory comparison organised by IRMM; this exercise, named NUSIMEP-4, is open for participation to anyone. This presentation will introduce magnetic sector-field inductively coupled plasma mass spectrometry (ICP-MS) uranium isotope ratio measurements on real human urine samples and in the NUSIMEP-4 test materials. These were

  3. Direct sun and airborne MAX-DOAS measurements of the collision induced oxygen complex, O2O2 absorption with significant pressure and temperature differences

    Spinei, E.; A. Cede; J. Herman; G. H. Mount; E. Eloranta; Morley, B.; S. Baidar; Dix, B.; Ortega, I.; Koenig, T.; R. Volkamer

    2014-01-01

    The collision induced O2 complex, O2O2, is a very important trace gas in remote sensing measurements of aerosol and cloud properties. Some ground based MAX-DOAS measurements of O2O2 slant column density require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a near pure Rayleigh atmosphere. One of the potential causes of this discrepancy is believed to be uncertainty in laboratory measured ...

  4. Complex Role of Secondary Electron Emissions in Dust Grain Charging in Space Environments: Measurements on Apollo 11 and 17 Dust Grains

    Abbas, M. M.; Tankosic, D.; Spann, J. F.; LeClair, A. C.

    2010-01-01

    Dust grains in various astrophysical environments are generally charged electrostatically by photoelectric emissions with radiation from nearby sources, or by electron/ion collisions by sticking or secondary electron emissions. Knowledge of the dust grain charges and equilibrium potentials is important for understanding a variety of physical and dynamical processes in the interstellar medium (ISM), and heliospheric, interplanetary, planetary, and lunar environments. The high vacuum environment on the lunar surface leads to some unusual physical and dynamical phenomena involving dust grains with high adhesive characteristics, and levitation and transportation over long distances. It has been well recognized that the charging properties of individual micron/submicron size dust grains are expected to be substantially different from the corresponding values for bulk materials and theoretical models. In this paper we present experimental results on the charging of individual dust grains selected from Apollo 11 and Apollo 17 dust samples by exposing them to mono-energetic electron beams in the 10-400 eV energy range. The charging rates of positively and negatively charged particles of approximately 0.2 to 13 μm diameter are discussed in terms of the secondary electron emission (SEE) process, which is found to be a complex charging process at electron energies as low as 10-25 eV, with strong particle size dependence. The measurements indicate substantial differences between the dust charging properties of individual small size dust grains and those of bulk materials.

  5. QUIJOTE Scientific Results. II. Polarisation Measurements of the Microwave Emission in the Galactic molecular complexes W43 and W47 and supernova remnant W44

    Génova-Santos, R; Peláez-Santos, A; Poidevin, F; Rebolo, R; Vignaga, R; Artal, E; Harper, S; Hoyland, R; Lasenby, A; Martínez-González, E; Piccirillo, L; Tramonte, D; Watson, R A

    2016-01-01

    We present Q-U-I JOint TEnerife (QUIJOTE) intensity and polarisation maps at 10-20 GHz covering a region along the Galactic plane 24° < l < 45°. We detect anomalous microwave emission (AME) at high significance in the molecular complexes W43 (22-sigma) and W47 (8-sigma). We also detect at high significance (6-sigma) AME associated with W44, the first clear detection of this emission towards a SNR. The new QUIJOTE polarisation data, in combination with WMAP, are essential to: i) determine the spectral index of the synchrotron emission in W44, beta_sync = -0.62 +/- 0.03, in good agreement with the value inferred from the intensity spectrum once a free-free component is included in the fit; ii) trace the change in the polarisation angle associated with Faraday rotation in the direction of W44, with rotation measure -404 +/- 49 rad/m2; and iii)...

  6. Complex chemistry

    Kim, Bong Gon; Kim, Jae Sang; Kim, Jin Eun; Lee, Boo Yeon

    2006-06-15

    This book introduces complex chemistry in ten chapters, covering the historical development of complex chemistry, Werner's coordination theory and new developments in complex chemistry, the nomenclature of complexes (concepts and definitions), chemical formulas of coordination compounds, stereochemical notation, stereostructure and isomerism, electronic structure and bonding theory of complexes, structural characterization of complexes by methods such as NMR and XAFS, equilibria and reactions in solution, organometallic chemistry, bioinorganic chemistry, the materials chemistry of complexes, and complex design and computational chemistry.

  7. Complex chemistry

    This book introduces complex chemistry in ten chapters, covering the historical development of complex chemistry, Werner's coordination theory and new developments in complex chemistry, the nomenclature of complexes (concepts and definitions), chemical formulas of coordination compounds, stereochemical notation, stereostructure and isomerism, electronic structure and bonding theory of complexes, structural characterization of complexes by methods such as NMR and XAFS, equilibria and reactions in solution, organometallic chemistry, bioinorganic chemistry, the materials chemistry of complexes, and complex design and computational chemistry.

  8. A New Complete Class Complexity Metric

    Singh, Vinay; Bhattacherjee, Vandana

    2014-01-01

    Software complexity metrics are essential for minimizing the cost of software maintenance. Package-level and system-level complexity cannot be measured without class-level complexity. This research addresses class complexity metrics. This paper studies the existing class complexity metrics and proposes a new class complexity metric, CCC (Complete Class Complexity metric). The CCC metric is then analytically evaluated against Weyuker's properties.

  9. Thin film Z-scan measurements of the nonlinear response of novel conjugated silicon-ethynylene polymers and metal-containing complexes incorporated into polymeric matrices

    Douglas, William E.; Klapshina, Larisa G.; Rubinov, Anatoly N.; Domrachev, George A.; Bushuk, Boris A.; Antipov, Oleg L.; Semenov, Vladimir V.; Kuzhelev, Alexander S.; Bushuk, Sergey B.; Kalvinkovskaya, Julia A.

    2000-11-01

    The third-order optical nonlinearities of new conjugated poly[(arylene)(ethynylene)silylene]s, and of a variety of chromium, neodymium or cobalt complexes incorporated into polymeric matrices as thin sol-gel or polyacrylonitrile films, have been determined by using a single-beam Z-scan technique. The samples were pumped by a single ultrashort pulse of a mode-locked Nd-phosphate glass laser (wavelength 1054 nm) with a 5 ps pulse duration (full width at half-maximum), the repetition rate of the Gaussian beam being low (0.3 Hz) to avoid thermal effects. The spot radius of the focused pulse was ca. 60 μm, its beam waist being in the sample (intensity up to 4 × 10¹³ W m⁻²). Calibration was done with chloroform and benzene, the value of n2 for the latter (2 × 10⁻¹² esu) being similar to that previously reported. A small-aperture Z-scan (S = 0.03) was used to measure the magnitude and the sign of the nonlinear refractive index, n2. Very high nonlinear refractive indices were found for (a) a film containing a poly[(arylene)(ethynylene)silylene] with pentacoordinated silicon (c = 5 g l⁻¹) in a sol-gel matrix (n2 = 6 × 10⁻¹³ cm² W⁻¹), (b) a film containing a poly[(arylene)(ethynylene)silylene] with tetracoordinated silicon (c = 0.5 g l⁻¹) and a very small proportion of fullerene-C70 incorporated into an NH2-containing sol-gel matrix (n2 = 5 × 10⁻¹³ cm² W⁻¹), and (c) a thin polyacrylonitrile film of polycyanoethylate bis-arenechromium(I) hydroxide (n2 = -5 × 10⁻¹² cm² W⁻¹).

  10. "Product Complexity and Economic Development"

    Abdon, Arnelyn; Bacate, Marife; Felipe, Jesus; Kumar, Utsav

    2010-01-01

    We rank 5,107 products and 124 countries according to the Hidalgo and Hausmann (2009) measures of complexity. We find that: (1) the most complex products are in machinery, chemicals, and metals, while the least complex products are raw materials and commodities, wood, textiles, and agricultural products; (2) the most complex economies in the world are Japan, Germany, and Sweden, and the least complex, Cambodia, Papua New Guinea, and Nigeria; (3) the major exporters of the more complex product...

  11. Bucolic Complexes

    Brešar, Bostjan; Chepoi, Victor; Gologranc, Tanja; Osajda, Damian

    2012-01-01

    In this article, we introduce and investigate bucolic complexes, a common generalization of systolic complexes and of CAT(0) cubical complexes. This class of complexes is closed under Cartesian products and amalgamations over some convex subcomplexes. We study various approaches to bucolic complexes: from graph-theoretic and topological viewpoints, as well as from the point of view of geometric group theory. Bucolic complexes can be defined as locally-finite simply connected prism complexes satisfying some local combinatorial conditions. We show that bucolic complexes are contractible, and satisfy some nonpositive-curvature-like properties. In particular, we prove a version of the Cartan-Hadamard theorem, the fixed point theorem for finite group actions, and establish some results on groups acting geometrically on such complexes. We also characterize the 1-skeletons (which we call bucolic graphs) and the 2-skeletons of bucolic complexes. In particular, we prove that bucolic graphs are precisely retracts of Ca...

  12. Thermally driven circulation in a region of complex topography: comparison of wind-profiling radar measurements and MM5 numerical predictions

    Bianco, L.; Tomassetti, B.; Coppola, E.; A. Fracassi; Verdecchia, M.; Visconti, G.

    2006-01-01

    The diurnal variation of regional wind patterns in the complex terrain of Central Italy was investigated for summer fair-weather conditions and winter time periods using a radar wind profiler. The profiler is located on a site where interaction between the complex topography and land-surface produces a variety of thermally and dynamically driven wind systems. The observational data set, collected for a period of one year, was used first to describe the diurnal evolution of thermally driven wind...

  13. Multiscale Cross-Approximate Entropy Analysis as a Measurement of Complexity between ECG R-R Interval and PPG Pulse Amplitude Series among the Normal and Diabetic Subjects

    Hsien-Tsai Wu; Chih-Yuan Lee; Cyuan-Cin Liu; An-Bang Liu

    2013-01-01

    Physiological signals often show complex fluctuation (CF) under the dual influence of temporal and spatial scales, and CF can be used to assess the health of physiologic systems in the human body. This study applied multiscale cross-approximate entropy (MC-ApEn) to quantify the complex fluctuation between R-R interval series and photoplethysmography amplitude series. All subjects were then divided into the following two groups: healthy upper middle-aged subjects (Group 1, age range: 41–80 ye...
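
    A sketch of cross-approximate entropy at a single scale (the multiscale version applies the same statistic to coarse-grained series): templates of length m from one series are compared against the other, and XApEn = Φ^m(r) − Φ^(m+1)(r). The two synthetic signals below merely stand in for R-R interval and pulse amplitude series.

      # Cross-approximate entropy at a single scale; the multiscale variant
      # applies the same statistic to coarse-grained series. Synthetic data.
      import numpy as np

      def cross_apen(u, v, m=2, r=0.2):
          u = (u - u.mean()) / u.std()
          v = (v - v.mean()) / v.std()
          N = len(u)
          def phi(m):
              U = np.array([u[i:i + m] for i in range(N - m + 1)])
              V = np.array([v[j:j + m] for j in range(N - m + 1)])
              # Chebyshev distance between all template pairs
              dist = np.max(np.abs(U[:, None, :] - V[None, :, :]), axis=2)
              C = (dist <= r).mean(axis=1)
              return np.log(np.maximum(C, 1e-12)).mean()
          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(0)
      t = np.arange(1000)
      rr = np.sin(0.1 * t) + 0.1 * rng.standard_normal(1000)   # toy R-R series
      ppg = np.sin(0.1 * t + 0.5) + 0.1 * rng.standard_normal(1000)
      print(f"XApEn = {cross_apen(rr, ppg):.3f}")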

  14. How evolution guides complexity

    LARRY S. YAEGER

    2009-01-01

    Long-standing debates about the role of natural selection in the growth of biological complexity over geological time scales are difficult to resolve from the paleobiological record. Using an evolutionary model—a computational ecosystem subjected to natural selection—we investigate evolutionary trends in an information-theoretic measure of the complexity of the neural dynamics of artificial agents inhabiting the model. Our results suggest that evolution always guides complexity change, just n...

  15. Visual Complexity: A Review

    Donderi, Don C.

    2006-01-01

    The idea of visual complexity, the history of its measurement, and its implications for behavior are reviewed, starting with structuralism and Gestalt psychology at the beginning of the 20th century and ending with visual complexity theory, perceptual learning theory, and neural circuit theory at the beginning of the 21st. Evidence is drawn from…

  16. Synthesis and measurements of the optical bandgap of single crystalline complex metal oxide BaCuV2O7 nanowires by UV–VIS absorption

    Highlights: • Synthesis of single-crystalline complex metal oxide BaCuV2O7 nanowires. • Surfactant-free, economically favorable chemical solution deposition method. • Complex metal oxide nanowires with controlled stoichiometry. • By simply controlling the temperature and thickness of the coated film, high quality BaCuV2O7 nanowires are easily obtained. - Abstract: Single-crystalline complex metal oxide BaCuV2O7 nanowires were synthesized using a surfactant-free, economically favorable chemical solution deposition method. A thin layer of BaCuV2O7 nanocrystals is formed by the decomposition of the complex metal oxide solution at 150 °C to provide nucleation sites for the growth of nanowires. The synthesized nanowires were typically 1-5 μm long with diameters from 50 to 150 nm. We showed that by simply controlling the temperature and thickness of the coated film, we can easily obtain high quality BaCuV2O7 nanowires. The UV-VIS absorption spectra show an indirect bandgap of 2.65 ± 0.05 eV for the nanowires. The temperature-dependent resistances of the BaCuV2O7 nanowires follow an exponential correlation, supporting that the conducting carriers are quasi-free electrons. We believe that our methodology will provide a simple and convenient route for the synthesis of a variety of complex metal oxide nanowires with controlled stoichiometry

  17. Statistical complexity and disequilibrium

    We study the concept of disequilibrium as an essential ingredient of a family of statistical complexity measures. We find that Wootters' objections to the use of Euclidean distances for probability spaces become quite relevant to this endeavor. Replacing the Euclidean distance by the Wootters' one noticeably improves the behavior of the associated statistical complexity measure, as evidenced by its application to the dynamics of the logistic map
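
    A sketch of an LMC-style statistical complexity C = H·D for a discrete distribution, comparing a Euclidean disequilibrium with Wootters' statistical distance to the uniform distribution; the example distribution is arbitrary.

      # LMC-style statistical complexity C = H * D, with disequilibrium D as
      # Euclidean distance to the uniform distribution or Wootters' distance.
      import numpy as np

      def complexity(p, metric="euclid"):
          p = np.asarray(p, float)
          p = p / p.sum()
          n = p.size
          H = -np.sum(np.where(p > 0, p * np.log(p), 0.0)) / np.log(n)
          u = np.full(n, 1.0 / n)
          if metric == "euclid":
              D = np.sum((p - u) ** 2)
          else:  # Wootters' statistical distance arccos(sum sqrt(p_i u_i))
              D = np.arccos(np.clip(np.sum(np.sqrt(p * u)), -1.0, 1.0))
          return H * D

      p = [0.5, 0.25, 0.15, 0.10]
      print(f"C_euclid   = {complexity(p):.4f}")
      print(f"C_wootters = {complexity(p, metric='wootters'):.4f}")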

  18. Tropical complexes

    Cartwright, Dustin

    2013-01-01

    We introduce tropical complexes, which are Delta-complexes together with additional numerical data. On a tropical complex, we define divisors and linear equivalence between divisors, analogous to the notions for algebraic varieties, and generalizing previous work for graphs. We prove a comparison theorem showing that divisor-curve intersection numbers agree under certain conditions.

  19. Medical Complex

    Kumaraswamy, Mohan

    2002-01-01

    One element of the CIVCAL project: Web-based resources containing images, tables, texts and associated data on the construction of the Medical Complex. This project covers the construction of a new Hong Kong University Medical Complex on Sassoon Road, Pokfulam. The complex will comprise two buildings: one will house laboratories and a car park, while the other will contain lecture halls.

  20. Cadmium(II) complexes of cytosine

    Complexes of cadmium(II) with cytosine obtained from aqueous or physiological solutions at room temperature are reported. The complexes were characterized by spectroscopic, conductometric, 1H-NMR, and 13C-NMR measurements and also by thermogravimetry. (Authors)

  1. The Acute Effect of Upper-Body Complex Training on Power Output of Martial Art Athletes as Measured by the Bench Press Throw Exercise

    Liossis, Loudovikos Dimitrios; Forsyth, Jacky; Liossis, George; Tsolakis, Charilaos

    2013-01-01

    The purpose of this study was to examine the acute effect of upper body complex training on power output, as well as to determine the requisite preload intensity and intra-complex recovery interval needed to induce power output increases. Nine amateur-level combat/martial art athletes completed four distinct experimental protocols, which consisted of 5 bench press repetitions at either: 65% of one-repetition maximum (1RM) with a 4 min rest interval; 65% of 1RM with an 8 min rest; 85% of 1RM w...

  2. Communication complexity and information complexity

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result does not only strengthen the lower bound on communication complexity of disjointness by making it more exact, but it also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information

  3. Direct amination of benzene to aniline with several typical vanadium complexes

    Yu Fen Lv; Liang Fang Zhu; Qiu Yuan Liu; Bin Guo; Xiao Ke Hu; Chang Wei Hu

    2009-01-01

    The liquid-phase direct catalytic amination of benzene to aniline was performed in acetic acid-water solvent using a series of vanadium(III, IV, V) complexes with N,O- or O,O-ligands as catalysts and hydroxylamine hydrochloride as the aminating agent. The vanadium complexes exhibited much higher selectivity towards the production of aniline than NaVO3 or VOSO4. Under the optimized conditions, an aniline yield of 42.5% and a TON of 48 with a high selectivity of above 99.9% were obtained using 0.2 mmol of [VO(OAc)2] as the catalyst.

  4. Development of Oceanic Core Complexes on the Mid-Atlantic Ridge at 13-14N: Deep-Towed Geophysical Measurements and Detailed Seafloor Sampling

    Searle, R.; MacLeod, C.; Murton, B.; Mallows, C.; Casey, J.; Achenbach, K.; Unsworth, S.; Harris, M.

    2007-12-01

    The first scientific cruise of research vessel James Cook in March-April 2007 targeted the Mid-Atlantic Ridge at 13-14°N, to investigate details of lithospheric generation and development in a low-magmatic setting. Overall objectives were to 1) investigate the 3D pattern of mantle upwelling and melt focusing; 2) study how plate accretion and separation mechanisms differ between magma-rich and magma-poor areas; and 3) test mechanisms of detachment faulting and extensional strain localisation in the lower crust and upper mantle. Smith et al. (Nature 2006) had shown this to be an area of widespread detachment faulting and formation of oceanic core complexes (OCC), and published bathymetry showed an extensive area of blocky rather than lineated topography, which elsewhere has correlated with areas of low effusive magmatism. We conducted a TOBI deep-towed geophysical survey over a 70 km length of ridge extending to magnetic chron C2n (1.9 Ma) on each flank. This included sidescan sonar and high resolution bathymetry and magnetic measurements on 13 E-W tracks spaced 3-6 km apart. The area includes 1 active, 1 dying, and 1 defunct OCC and borders well-lineated, apparently magmatically robust seafloor to the north. The geophysical survey was complemented by the recovery of 7 oriented and 18 unoriented cores and 29 dredge samples, including some from a probable OCC south of the TOBI survey. Deep-towed sidescan, bathymetry and video show the OCCs typically comprise a steeply outward tilted volcanic ridge marking the breakaway (as suggested by Smith et al., 2006); a high, rugged central massif that is complexly deformed as a result of uplift and bending, and may be separated from the breakaway ridge by what we interpret as a late outward dipping normal fault; and a smooth, corrugated surface that generally dips c. 20° towards the ridge axis at the termination but gradually rotates to horizontal or gently outward dipping near its junction with the central massif. Older OCCs

  5. Spectroscopic and physical measurements on charge-transfer complexes: Interactions between norfloxacin and ciprofloxacin drugs with picric acid and 3,5-dinitrobenzoic acid acceptors

    Refat, Moamen S.; Elfalaky, A.; Elesh, Eman

    2011-03-01

    Charge-transfer complexes formed between norfloxacin (nor) or ciprofloxacin (cip) drugs as donors with picric acid (PA) and/or 3,5-dinitrobenzoic acid (DNB) as π-acceptors have been studied spectrophotometrically in methanol solvent at room temperature. The results indicated the formation of CT-complexes with a 1:1 molar ratio between donor and acceptor at the maximum CT-bands. The formation constant (KCT), molar extinction coefficient (εCT), standard free energy (ΔG°), oscillator strength (f), transition dipole moment (μ), resonance energy (RN) and ionization potential (ID) were estimated. IR, ¹H NMR and UV-Vis techniques, elemental analyses (CHN) and TG-DTG investigations were used to characterize the structures of the charge-transfer complexes. The results indicate that the CT interaction was associated with a proton migration from each acceptor to the nor or cip donors, followed by the appearance of an intermolecular hydrogen bond. In addition, an X-ray investigation was carried out to scrutinize the crystal structure of the resulting CT-complexes.
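
    Formation constants of such 1:1 CT complexes are commonly extracted from absorbance data with a Benesi-Hildebrand plot (not necessarily the exact treatment used in this paper). A sketch with invented data, assuming [D]0 >> [A]0 and a 1 cm path length:

      # Benesi-Hildebrand estimate of a 1:1 CT formation constant:
      # [A]0/A = 1/eps + 1/(K * eps * [D]0), valid for [D]0 >> [A]0 and a
      # 1 cm path. Absorbance data below are invented for illustration.
      import numpy as np

      A0 = 1.0e-4                                         # acceptor conc. (M)
      D0 = np.array([0.002, 0.004, 0.008, 0.016, 0.032])  # donor conc. (M)
      A = np.array([0.080, 0.135, 0.205, 0.275, 0.330])   # absorbance

      slope, intercept = np.polyfit(1.0 / D0, A0 / A, 1)  # linear BH plot
      eps = 1.0 / intercept      # molar absorptivity (M^-1 cm^-1)
      K = intercept / slope      # formation constant (M^-1)
      print(f"eps ~ {eps:.0f} M^-1 cm^-1, K_CT ~ {K:.0f} M^-1")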

  6. Measurements of Enthalpy Change of Reaction of Formation, Molar Heat Capacity and Constant-Volume Combustion Energy of Solid Complex Yb(Et2dtc)3(phen)

    Song Weiming; Hu Qilin; Chang Xuan; Chen Sanping; Xie Gang; Gao Shengli

    2006-01-01

    A ternary solid complex Yb(Et2dtc)3(phen) was obtained from the reaction of hydrous ytterbium chloride with sodium diethyldithiocarbamate (NaEt2dtc) and 1,10-phenanthroline (o-phen·H2O) in absolute ethanol. The bonding characteristics of the complex were characterized by IR. The results show that Yb3+ bonds with two sulfur atoms of the Et2dtc ligands and two nitrogen atoms of the o-phen. The enthalpy change of the liquid-phase reaction of formation of the complex, ΔrHθm(l), was determined to be (-24.838±0.114) kJ·mol-1 at 298.15 K by an RD-496 III type heat conduction microcalorimeter. The enthalpy change of the solid-phase reaction of formation of the complex, ΔrHθm(s), was calculated to be (108.015±0.479) kJ·mol-1 on the basis of an appropriate thermochemical cycle. The thermodynamics of the liquid-phase reaction of formation of the complex was investigated by changing the temperature of the liquid-phase reaction. Fundamental parameters, the activation enthalpy, ΔHθ≠, the activation entropy, ΔSθ≠, the activation free energy, ΔGθ≠, the apparent reaction rate constant k, the apparent activation energy E, the pre-exponential constant A, and the reaction order n, were obtained by combining the reaction thermodynamic and kinetic equations with the data from the thermokinetic experiments. At the same time, the molar heat capacity of the complex, cp,m, was determined to be (86.34±1.74) J·mol-1·K-1 by the same microcalorimeter. The constant-volume combustion energy of the complex, ΔcU, was determined to be (-17954.08±8.11) kJ·mol-1 by an RBC-II type rotating-bomb calorimeter at 298.15 K. Its standard enthalpy of combustion, ΔcHθm, and standard enthalpy of formation, ΔfHθm, were calculated to be (-17973.29±8.11) kJ·mol-1 and (-770.36±9.02) kJ·mol-1, respectively.

  7. Software Complexity Methodologies & Software Security

    Masoud Rafighi; Nasser Modiri

    2011-01-01

    It is broadly clear that complexity is one of software's natural features. Natural software complexity and required software functionality are inseparable, and each has its own range. In this paper, complexity measurement is explained using the McCabe and Halstead models, and software complexity is discussed with an example. The Henry-Kafura information flow metric, the Agresti-Card-Glass system complexity metric, and item-level design metrics are compared and reviewed, and then cate...
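
    McCabe's metric mentioned above reduces to counting over the control-flow graph: V(G) = E − N + 2P. A minimal sketch with a hypothetical control-flow graph:

      # McCabe's cyclomatic complexity from a control-flow graph:
      # V(G) = E - N + 2P. The CFG below (one if/else, one loop) is invented.
      def cyclomatic(edges, nodes, components=1):
          return len(edges) - len(nodes) + 2 * components

      nodes = ["entry", "cond", "then", "else", "loop", "exit"]
      edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
               ("then", "loop"), ("else", "loop"),
               ("loop", "loop"), ("loop", "exit")]
      print("V(G) =", cyclomatic(edges, nodes))   # 7 - 6 + 2 = 3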

  8. Comparison of net CO2 fluxes measured with open- and closed-path infrared gas analyzers in an urban complex environment

    Järvi, L.; Mammarella, I.; Eugster, W.;

    2009-01-01

    Simultaneous eddy covariance (EC) measurements of CO2 fluxes made with open-path and closed-path analyzers were done in the urban area of Helsinki, Finland, in July 2007-June 2008. Our purpose was to study the differences between the two analyzers, the necessary correction procedures and their suitability to accurately measure CO2 exchange in such a non-ideal landscape. In addition, this study examined the effect of open-path sensor heating on measured fluxes in urban terrain, and these results were compared with similar measurements made above a temperate beech forest in Denmark. The correlation between the two fluxes was good (R2 = 0.93) at the urban site, but during the measurement period the open-path net surface exchange (NSE) was 17% smaller than the closed-path NSE, indicating apparent additional uptake of CO2 by open-path measurements. At both sites, sensor heating corrections evidently...

  9. Interdisciplinary Symposium on Complex Systems

    Rössler, Otto; Zelinka, Ivan

    2015-01-01

    The book you hold in your hands is the outcome of the "2014 Interdisciplinary Symposium on Complex Systems" held in the historical city of Florence. The book consists of 37 chapters from 4 areas of Physical Modeling of Complex Systems, Evolutionary Computations, Complex Biological Systems and Complex Networks. All 4 parts contain contributions that give interesting points of view on complexity in different areas in science and technology. The book starts with a comprehensive overview and classification of complexity problems entitled "Physics in the World of Ideas: Complexity as Energy", followed by chapters about complexity measures and physical principles, their observation, modeling and applications to solving various problems, including real-life applications. Further chapters contain recent research about evolution, randomness and complexity, as well as complexity in biological systems and complex networks. All selected papers represent innovative ideas, philosophical overviews and state-of-the-...

  10. What is a complex graph?

    Kim, Jongkwang; Wilhelm, Thomas

    2008-04-01

    Many papers published in recent years show that real-world graphs G(n,m) (n nodes, m edges) are more or less "complex" in the sense that different topological features deviate from random graphs. Here we narrow the definition of graph complexity and argue that a complex graph contains many different subgraphs. We present different measures that quantify this complexity, for instance C1e, the relative number of non-isomorphic one-edge-deleted subgraphs (i.e. DECK size). However, because these different subgraph measures are computationally demanding, we also study simpler complexity measures focussing on slightly different aspects of graph complexity. We consider heuristically defined "product measures", the products of two quantities which are zero in the extreme cases of a path and clique, and "entropy measures" quantifying the diversity of different topological features. The previously defined network/graph complexity measures Medium Articulation and Offdiagonal complexity (OdC) belong to these two classes. We study OdC measures in some detail and compare them with our new measures. For all measures, the most complex graph G has a medium number of edges, between the edge numbers n-1 and n(n-1)/2 of the minimum and the maximum connected graph. The graph complexity measures are characterized with the help of different example graphs. For all measures the corresponding time complexity is given. Finally, we discuss the complexity of 33 real-world graphs of different biological, social and economic systems with the six computationally most simple measures (including OdC). The complexities of the real graphs are compared with average complexities of two different random graph versions: complete random graphs (just fixed n, m) and rewired graphs with fixed node degrees.
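
    The C1e measure can be approximated in a few lines: delete each edge in turn and count distinct one-edge-deleted subgraphs. The sketch below uses the Weisfeiler-Lehman graph hash from the networkx library (assumed available) as a stand-in isomorphism test; it is a heuristic, since non-isomorphic graphs can in principle share a hash.

      # Approximate C1e: the fraction of distinct one-edge-deleted subgraphs.
      # Uses the Weisfeiler-Lehman graph hash from networkx as a heuristic
      # isomorphism test (distinct graphs can, in principle, share a hash).
      import networkx as nx

      def c1e(G):
          hashes = set()
          for e in G.edges():
              H = G.copy()
              H.remove_edge(*e)
              hashes.add(nx.weisfeiler_lehman_graph_hash(H))
          return len(hashes) / G.number_of_edges()

      print("path  :", c1e(nx.path_graph(8)))
      print("clique:", c1e(nx.complete_graph(8)))          # minimal diversity
      print("random:", c1e(nx.gnm_random_graph(8, 14, seed=1)))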

  11. Engaging complexity

    Gys M. Loubser

    2014-01-01

    In this article, I discuss studies in complexity and its epistemological implications for systematic and practical theology. I argue that engagement with complexity does not necessarily assure a non-reductionist approach. However, if complexity is engaged transversally, it becomes possible to transcend reductionist approaches. Moreover, systematic and practical theologians can draw on complexity in developing new ways of understanding and, therefore, new ways of describing the focus, epistemic scope and heuristic structures of systematic and practical theology. Firstly, Edgar Morin draws a distinction between restricted and general complexity based on the epistemology drawn upon in studies in complexity. Moving away from foundationalist approaches to epistemology, Morin argues for a paradigm of systems. Secondly, I discuss Kees van Kooten Niekerk's distinction between epistemology, methodology and ontology in studies in complexity and offer an example of a theological argument that draws on complexity. Thirdly, I argue for the importance of transversality in engaging complexity by drawing on the work of Wentzel van Huyssteen and Paul Cilliers. In conclusion, I argue that theologians have to be conscious of the epistemic foundations of each study in complexity, and these studies illuminate the heart of Reformed theology. Intradisciplinary and/or interdisciplinary implications: this article has both intradisciplinary and interdisciplinary implications. When theologians engage studies in complexity, the epistemological roots of these studies need to be considered, seeing that researchers in complexity draw on different epistemologies. Drawing on transversality would enhance such considerations. Furthermore, Edgar Morin's and Paul Cilliers' approach to complexity will inform practical and theoretical considerations in church polity and unity.

  12. Ground-based direct-sun DOAS and airborne MAX-DOAS measurements of the collision-induced oxygen complex, O2O2, absorption with significant pressure and temperature differences

    Spinei, E.; A. Cede; J. Herman; G. H. Mount; E. Eloranta; Morley, B.; S. Baidar; Dix, B.; Ortega, I.; Koenig, T.; R. Volkamer

    2015-01-01

    The collision-induced O2 complex, O2O2, is a very important trace gas for understanding remote sensing measurements of aerosols, cloud properties and atmospheric trace gases. Many ground-based multi-axis differential optical absorption spectroscopy (MAX-DOAS) measurements of the O2O2 optical depth require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a nearly pure Rayleigh atmosphere. One of the potential causes of this discrepa...

  13. Carney Complex

    ... of Carney complex are Cushing’s syndrome and multiple thyroid nodules (tumors). Cushing’s syndrome features a combination of weight gain, ... with Carney complex include adrenocortical carcinoma , pituitary gland tumors , thyroid , colorectal , liver and pancreatic cancers . Ovarian cancer in ...

  14. Simplifying complexity

    Leemput, van de I.A.

    2016-01-01

    In this thesis I use mathematical models to explore the properties of complex systems ranging from microbial nitrogen pathways and coral reefs to the human state of mind. All are examples of complex systems, defined as systems composed of a number of interconnected parts, where the systemic behavior

  15. Hamiltonian complexity

    In recent years we have seen the birth of a new field known as Hamiltonian complexity lying at the crossroads between computer science and theoretical physics. Hamiltonian complexity is directly concerned with the question: how hard is it to simulate a physical system? Here I review the foundational results, guiding problems, and future directions of this emergent field.

  16. Syntactic Complexity as an Aspect of Text Complexity

    Frantz, Roger S.; Starr, Laura E.; Bailey, Alison L.

    2015-01-01

    Students' ability to read complex texts is emphasized in the Common Core State Standards (CCSS) for English Language Arts and Literacy. The standards propose a three-part model for measuring text complexity. Although the model presents a robust means for determining text complexity based on a variety of features inherent to a text as well as…

  17. Managing Complexity

    Maylath, Bruce; Vandepitte, Sonia; Minacori, Patricia;

    2013-01-01

    This article discusses the largest and most complex international learning-by-doing project to date: a project involving translation from Danish and Dutch into English and editing into American English, alongside a project involving writing, usability testing, and translation from English into Dutch and into French. The complexity of the undertaking proved to be a central element in the students' learning, as the collaboration closely resembles the complexity of international documentation workplaces of language service providers. © Association of Teachers of Technical Writing.

  18. Complex variables

    Fisher, Stephen D

    1999-01-01

    The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic

  19. Algorithmic Problem Complexity

    Burgin, Mark

    2008-01-01

    People solve different problems and know that some of them are simple, some are complex and some insoluble. The main goal of this work is to develop a mathematical theory of algorithmic complexity for problems. This theory is aimed at determining the abilities of computers to solve different problems and at estimating the resources that computers need to do this. Here we build the part of this theory related to static measures of algorithms. At first, we consider problems for finite words and stud...

  20. In the search for the low-complexity sequences in prokaryotic and eukaryotic genomes: how to derive a coherent picture from global and local entropy measures

    Acquisti, Claudia; Allegrini, Paolo E-mail: allegrip@ilc.cnr.it; Bogani, Patrizia; Buiatti, Marcello; Catanese, Elena; Fronzoni, Leone; Grigolini, Paolo; Mersi, Giuseppe; Palatella, Luigi

    2004-04-01

    We investigate a possible way to connect the presence of low-complexity sequences (LCS) in DNA genomes and the non-stationary properties of base correlations. Under the hypothesis that these variations signal a change in the DNA function, we use a new technique, called the non-stationarity entropic index (NSEI) method, and we prove that this technique is an efficient way to detect functional changes with respect to a random baseline. The remarkable aspect is that NSEI requires no training data or fitting parameters, the only arbitrariness being the choice of a marker in the sequence. We make this choice on the basis of biological information about LCS distributions in genomes. We show that there exists a correlation between the amount of LCS and the ratio of long- to short-range correlation.
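
    Low-complexity stretches of the kind discussed above are often flagged with a sliding-window Shannon entropy, a much simpler statistic than NSEI; the sketch below illustrates this on an invented sequence, where low-entropy windows mark candidate LCS.

      # Sliding-window Shannon entropy along a DNA string; low-entropy
      # windows flag candidate low-complexity sequences. Sequence invented.
      from collections import Counter
      from math import log2

      def window_entropy(seq, w=50, step=50):
          for i in range(0, len(seq) - w + 1, step):
              counts = Counter(seq[i:i + w])
              H = -sum(c / w * log2(c / w) for c in counts.values())
              yield i, H

      seq = ("ACGTACGGTCACGTAA" * 15          # mixed-composition stretch
             + "AAAAAAAATAAAAAAA" * 10        # low-complexity stretch
             + "GCGTACTAGCATGACT" * 15)
      for i, H in window_entropy(seq):
          print(f"pos {i:4d}: H = {H:.2f} bits")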

  1. Complex Covariance

    Frieder Kleefeld

    2013-01-01

    According to some generalized correspondence principle the classical limit of a non-Hermitian quantum theory describing quantum degrees of freedom is expected to be the well known classical mechanics of classical degrees of freedom in the complex phase space, i.e., some phase space spanned by complex-valued space and momentum coordinates. As special relativity was developed by Einstein merely for real-valued space-time and four-momentum, we will try to understand how special relativity and covariance can be extended to complex-valued space-time and four-momentum. Our considerations will lead us not only to some unconventional derivation of Lorentz transformations for complex-valued velocities, but also to the non-Hermitian Klein-Gordon and Dirac equations, which are to lay the foundations of a non-Hermitian quantum theory.

  2. Thermally driven circulation in a region of complex topography: comparison of wind-profiling radar measurements and MM5 numerical predictions

    L. Bianco

    2006-07-01

    The diurnal variation of regional wind patterns in the complex terrain of Central Italy was investigated for summer fair-weather conditions and wintertime periods using a radar wind profiler. The profiler is located on a site where interaction between the complex topography and the land surface produces a variety of thermally and dynamically driven wind systems. The observational data set, collected over a period of one year, was used first to describe the diurnal evolution of thermally driven winds, and second to validate the Mesoscale Model 5 (MM5), a three-dimensional numerical model. This analysis focused on near-surface wind observations, since thermally driven winds occur in the lower atmosphere. In accordance with valley wind theory, the site, located on the left sidewall of the valley (looking up valley), experiences a clockwise turning of the wind with time. The same behavior was found in both the experimental and the numerical results.

    Because the thermally driven flows can have some depth and may be sensitive to model errors, as a third step the analysis focuses on a subset of cases to explore four different MM5 Planetary Boundary Layer (PBL) parameterizations. The aim is to test how sensitive the results are to the selected PBL parameterization and, if possible, to identify the best-performing one. For this purpose we analysed the MM5 output at all PBL levels. The four PBL parameterizations are: (1) Gayno-Seaman; (2) Medium-Range Forecast; (3) the Mellor-Yamada scheme as used in the ETA model; and (4) Blackadar.

  3. Simplifying complexity

    Leemput, van de, J.C.H.

    2016-01-01

    In this thesis I use mathematical models to explore the properties of complex systems ranging from microbial nitrogen pathways and coral reefs to the human state of mind. All are examples of complex systems, defined as systems composed of a number of interconnected parts, where the systemic behavior leads to the emergence of properties that would not be expected from behavior or properties of the individual parts of the system. Although the full behavior of the systems I address will probably...

  4. Electronic structures of TiO2-TCNE, -TCNQ, and -2,6-TCNAQ surface complexes studied by ionization potential measurements and DFT calculations: Mechanism of the shift of interfacial charge-transfer bands

    Fujisawa, Jun-ichi; Hanaya, Minoru

    2016-06-01

    Interfacial charge-transfer (ICT) transitions between inorganic semiconductors and π-conjugated molecules allow direct charge separation without loss of energy. This feature is potentially useful for efficient photovoltaic conversion. Charge-transfer complexes of TiO2 nanoparticles with 7,7,8,8-tetracyanoquinodimethane (TCNQ) and its analogues (TCNX) show strong ICT absorption in the visible region. The ICT band was reported to be significantly red-shifted with extension of the π-conjugated system of TCNX. In order to clarify the mechanism of this red-shift, in this work we systematically study the electronic structures of the TiO2-TCNX surface complexes (TCNX = TCNE, TCNQ, 2,6-TCNAQ) by ionization potential measurements and density functional theory (DFT) calculations.

  5. Complexity regularized hydrological model selection

    Pande, S.; Arkesteijn, L.; Bastidas, L.A.

    2014-01-01

    This paper uses a recently proposed measure of hydrological model complexity in a model selection exercise. It demonstrates that a robust hydrological model is selected by penalizing model complexity while maximizing a model performance measure. This especially holds when limited data is available.
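
    In practice, such complexity-regularized selection reduces to maximizing a penalized score. The Python sketch below illustrates the idea with invented models, scores, and penalty weight; it is not the specific complexity measure proposed in the paper.

        # Pick the model maximizing (performance - lam * complexity).
        models = {
            "bucket":      {"performance": 0.62, "complexity": 1.0},
            "lumped":      {"performance": 0.74, "complexity": 2.5},
            "distributed": {"performance": 0.78, "complexity": 6.0},
        }
        lam = 0.03  # penalty weight; an assumed value

        def regularized_score(m):
            return m["performance"] - lam * m["complexity"]

        best = max(models, key=lambda name: regularized_score(models[name]))
        print(best)  # "lumped": the distributed model's extra skill does not pay for its complexity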

  6. I. Fundamental Practicum: Temperature Measurements of Falling Droplets, July, 1989. II. Industrial Practicum: Interaction and Effect of Adsorbed Organics on Reference Clays and Reservoir Rock, April, 1988. III. Apprenticeship Practicum: Studies of Group XIII Metal Inclusion Complexes, March, 1987

    Wells, Mark Richard

    The temperatures of 225 μm decane droplets falling through a hot, quiescent, oxygen-free environment were measured using laser-induced exciplex fluorescence thermometry. The droplet temperature was found to increase by approximately 0.42 °C per 1 °C increase in the environment temperature as the environment temperature was raised to 250 °C. Less than 10% evaporation of the droplets was observed at the highest environment temperatures. This represents one of the first successful applications of a remote-sensing technique for the temperature determination of droplets in a dynamic system. Industrial practicum. The industrial practicum report, entitled "Interaction and Effect of Adsorbed Organics on Reference Clays and Reservoir Rock," discusses the measurement of the effect that adsorbed organic material, especially from crude petroleum, has on the surface area, cation exchange capacity, and zeta potential of reference clay material and reservoir rock. In addition, the energetics of adsorption of a petroleum extract onto several reference clays and reservoir rock were measured using both flow and batch microcalorimetry. These results are very important in evaluating and understanding the wettability of reservoir rock and its impact on the recovery of crude oil from a petroleum reservoir. Apprenticeship practicum. "Studies of Group XIII Metal Inclusion Complexes" investigates the structure and dynamics of liquid inclusion complexes having the general formula (R4N)(Al2Me6I)·(C6H6)x. 1H and 13C spin-lattice relaxation times, nuclear Overhauser enhancements, and molecular correlation times were measured, as well as diffusion coefficients of the various species in solution. The dynamics of transfer between "guest" and free solvent molecules were measured using a variety of techniques. The inherent structure of liquid inclusion complexes as an ordered medium for homogeneous catalysis was studied using hydrogenation catalyzed by
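
    As a worked illustration of the reported sensitivity, the 0.42 °C/°C slope implies a simple linear estimate of droplet temperature versus environment temperature. The reference conditions in this Python sketch are assumptions for illustration only.

        def droplet_temp(env_temp_c, ref_env_c=25.0, ref_droplet_c=25.0, slope=0.42):
            """Linear droplet-temperature estimate from the reported 0.42 degC/degC slope."""
            return ref_droplet_c + slope * (env_temp_c - ref_env_c)

        print(droplet_temp(250.0))  # ~119.5 degC under these assumed reference conditions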

  7. Complex networks: Patterns of complexity

    Pastor-Satorras, Romualdo; Vespignani, Alessandro

    2010-07-01

    The Turing mechanism provides a paradigm for the spontaneous generation of patterns in reaction-diffusion systems. A framework that describes Turing-pattern formation in the context of complex networks should provide a new basis for studying the phenomenon.

  8. Clinical efficacy of acid water on distal lower extremity osteomyelitis

    刘智深; 牛纪娥; 牛志勇; 杜张荣

    2013-01-01

    Objective: To evaluate the clinical efficacy of acid water in the treatment of chronic osteomyelitis of the distal lower extremity. Methods: Between July 2011 and November 2012, 11 patients with chronic osteomyelitis and copious exudate were treated in our hospital. After the focus of infection was debrided completely, the wounds were irrigated and soaked with acid water. Once the wound surface was clean and covered with granulation tissue, the wounds were closed by delayed direct suture or by skin transplantation. Results: The exudate and the area of the wound both decreased after rinsing and soaking in acid water for 10-25 days, with an average of 19 days. The wounds were then covered by secondary suture or flap transplantation. After follow-up of 6-18 months, all 11 cases of chronic osteomyelitis had healed with no recurrence. Conclusions: Application of acid water in the treatment of osteomyelitis is effective and feasible; it has a low medical cost, offers good social and economic benefits, and provides a new option for the treatment of chronic osteomyelitis of the distal limbs.

  9. Complex analysis

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  10. Ganglion cell complex and retinal nerve fiber layer measured by fourier-domain optical coherence tomography for early detection of structural damage in patients with preperimetric glaucoma

    Rolle T

    2011-07-01

    Teresa Rolle, Cristina Briamonte, Daniela Curto, Federico Maria Grignolo; Eye Clinic, Section of Ophthalmology, Department of Clinical Physiopathology, University of Torino, Torino, Italy. Aims: To evaluate the capability of Fourier-domain optical coherence tomography (FD-OCT) to detect structural damage in patients with preperimetric glaucoma. Methods: A total of 178 Caucasian subjects were enrolled in this cohort study: 116 preperimetric glaucoma patients and 52 healthy subjects. Using three-dimensional FD-OCT, the participants underwent imaging of the ganglion cell complex (GCC) and the optic nerve head. Sensitivity, specificity, likelihood ratios, and predictive values were calculated for all parameters at the first and fifth percentiles. Areas under the curves (AUCs) were generated for all parameters and compared (DeLong test). For both the GCC and the optic nerve head protocols, the OR logical disjunction (Boolean logic operator) was calculated. Results: The AUCs did not differ significantly. Macular global loss volume had the largest AUC (0.81). Specificities were high at both the fifth and first percentiles (up to 97%), but sensitivities were low, especially at the first percentile (55%-27%). Conclusion: Macular and papillary diagnostic accuracies did not differ significantly based on the 95% confidence interval. Computing the Boolean OR operator was found to boost diagnostic accuracy. Using the software-provided classification, sensitivity and diagnostic accuracy were low for both the retinal nerve fiber layer and the GCC scans. FD-OCT does not seem to be decisive for early detection of structural damage in patients with no functional impairment, suggesting that the analysis software needs further refinement to enhance glaucoma diagnostic capability. Keywords: OCT, RNFL, GCC, diagnostic accuracy
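
    The Boolean OR combination reported to boost diagnostic accuracy simply calls a subject positive if either the GCC-based or the optic-nerve-head-based test is positive. The Python sketch below shows the mechanics on synthetic labels; the data are invented, not the study's.

        def sensitivity(pred, truth):
            """True-positive rate of binary predictions against binary truth."""
            tp = sum(p and t for p, t in zip(pred, truth))
            return tp / sum(truth)

        truth  = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = preperimetric glaucoma
        test_a = [1, 0, 1, 0, 0, 0, 1, 0]   # GCC-based call (synthetic)
        test_b = [0, 1, 1, 0, 0, 0, 0, 0]   # ONH-based call (synthetic)
        or_comb = [a or b for a, b in zip(test_a, test_b)]

        print(sensitivity(test_a, truth), sensitivity(test_b, truth), sensitivity(or_comb, truth))
        # 0.5 0.5 0.75: OR raises sensitivity, typically at some cost in specificity.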

  11. A simple measure with complex determinants: investigation of the correlates of self-rated health in older men and women from three continents

    French Davina J

    2012-08-01

    ...consider earlier life experiences of cohorts as well as national and individual factors in later life. Further research is required to understand the complex societal influences on perceptions of health.

  12. Near infrared-red models for the remote estimation of chlorophyll- a concentration in optically complex turbid productive waters: From in situ measurements to aerial imagery

    Gurlin, Daniela

    Today the water quality of many inland and coastal waters is compromised by cultural eutrophication, a consequence of increased agricultural and industrial activity, and remote sensing is widely applied to monitor the trophic state of these waters. This study explores near infrared-red models for the remote estimation of chlorophyll-a concentration in turbid productive waters and compares several near infrared-red models developed within the last 35 years. Three of these models were calibrated for a dataset with chlorophyll-a concentrations from 2.3 to 81.2 mg m-3 and validated for independent and statistically significantly different datasets with chlorophyll-a concentrations from 4.0 to 95.5 mg m-3 and 4.0 to 24.2 mg m-3 for the spectral bands of the MEdium Resolution Imaging Spectrometer (MERIS) and the Moderate-resolution Imaging Spectroradiometer (MODIS). The developed MERIS two-band algorithm estimated chlorophyll-a concentrations from 4.0 to 24.2 mg m-3, typical of many inland and coastal waters, very accurately, with a mean absolute error of 1.2 mg m-3. These results indicate a high potential of the simple MERIS two-band algorithm for reliable estimation of chlorophyll-a concentration without any loss in accuracy compared to more complex algorithms, even though more research seems required to analyze the sensitivity of this algorithm to differences in the chlorophyll-a specific absorption coefficient of phytoplankton. Three near infrared-red models were also calibrated and validated for a smaller dataset of atmospherically corrected multi-temporal aerial imagery collected by the hyperspectral airborne imaging spectrometer for applications (AisaEAGLE). The developed algorithms successfully captured the spatial and temporal variability of the chlorophyll-a concentrations and estimated chlorophyll-a concentrations from 2.3 to 81.2 mg m-3 with mean absolute errors from 4.4 mg m-3 for the AISA two-band algorithm to 5.2 mg m-3
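
    The two-band NIR-red model family referenced here relates chlorophyll-a linearly to the ratio of reflectance in a NIR band (~708 nm for MERIS) to reflectance in a red band (~665 nm). The calibration coefficients in this Python sketch are placeholders; in practice they are obtained by regression against in situ chlorophyll-a measurements.

        def chl_two_band(r_nir, r_red, a=61.3, b=-37.7):
            """Chlorophyll-a (mg m-3) from a linearly calibrated NIR/red reflectance ratio.

            a and b are placeholder coefficients, not the study's calibration.
            """
            return a * (r_nir / r_red) + b

        print(chl_two_band(r_nir=0.012, r_red=0.010))  # ratio 1.2 -> ~35.9 mg m-3 with these placeholders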

  13. Complex Networks

    Evsukoff, Alexandre; González, Marta

    2013-01-01

    In the last decade we have seen the emergence of a new interdisciplinary field focusing on the understanding of networks which are dynamic, large, open, and have a structure sometimes called random-biased. The field of Complex Networks is helping us better understand many complex phenomena such as the spread of diseases, protein interactions, and social relationships, to name but a few. Studies in Complex Networks are gaining attention due to some major scientific breakthroughs proposed by network scientists helping us understand and model interactions contained in large datasets. In fact, if we could point to one event leading to the widespread use of complex network analysis, it would be the availability of online databases. Theories of random graphs from Erdös and Rényi from the late 1950s led us to believe that most networks had random characteristics. The work on large online datasets told us otherwise. Starting with the work of Barabási and Albert as well as Watts and Strogatz in the late 1990s, we now know th...

  14. Management of complex fisheries

    Frost, Hans Staby; Andersen, Peder; Hoff, Ayoe

    2013-01-01

    The purpose of this paper is to demonstrate how fisheries economics management issues or problems can be analyzed by using a complex model based on conventional bioeconomic theory. Complex simulation models contain a number of details that make them suitable for practical management advice, including taking into account the response of the fishermen to implemented management measures. To demonstrate the use of complex management models, this paper assesses a number of second-best management schemes against a first-rank optimum (FRO), an ideal individual transferable quotas (ITQ) system. This is defined as the management scheme which produces the highest net present value over a 25-year period. The assessed management schemes (scenarios) are composed of several measures as used in the Common Fisheries Policy of the European Union for the cod fishery in the Baltic Sea. The scenarios are total...
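
    Ranking schemes by net present value over the 25-year horizon is straightforward once each scheme's annual net revenues have been simulated. The Python sketch below shows only the ranking step; the cash flows and discount rate are invented for illustration.

        def npv(cash_flows, rate=0.05):
            """Net present value of a stream of annual net revenues."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

        schemes = {
            "ITQ (FRO)":        [120] * 25,  # assumed annual net revenue
            "effort limits":    [95] * 25,
            "TAC + gear rules": [105] * 25,
        }
        ranked = sorted(schemes, key=lambda s: npv(schemes[s]), reverse=True)
        print(ranked[0])  # the scheme producing the highest 25-year NPV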

  15. On the Use of Molecular Weight Cutoff Cassettes to Measure Dynamic Relaxivity of Novel Gadolinium Contrast Agents: Example Using Hyaluronic Acid Polymer Complexes in Phosphate-Buffered Saline

    Nima Kasraie

    2011-01-01

    The aims of this study were to determine whether standard extracellular contrast agents of Gd(III) ions in combination with a polymeric entity susceptible to hydrolytic degradation over a finite period of time, such as Hyaluronic Acid (HA), have sufficient vascular residence time to obtain comparable vascular imaging to current conventional compounds, and to obtain sufficient data to show proof of concept that HA with Gd-DTPA ligands could be useful as vascular imaging agents. We assessed the dynamic relaxivity of the HA-bound DTPA compounds using a custom-made phantom, as well as relaxation rates at 10.72 MHz with concentrations ranging between 0.09 and 7.96 mM in phosphate-buffered saline. Linear dependences of the static longitudinal relaxation rate (R1) on concentration were found for most measured samples, and the HA samples continued to produce high signal strength 24 hours after injection into a dialysis cassette at 3T, showing superior dynamic relaxivity values compared to conventional contrast media such as Gd-DTPA-BMA.
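
    The linear dependence of R1 on concentration means the relaxivity r1 is simply the slope of R1 = R1(0) + r1·[Gd]. The Python sketch below fits that line; the R1 readings are hypothetical, and only the concentration range follows the abstract.

        import numpy as np

        conc = np.array([0.09, 0.5, 1.0, 2.0, 4.0, 7.96])     # mM, spanning the reported range
        r1_obs = np.array([0.45, 2.1, 4.2, 8.3, 16.5, 32.8])  # s^-1, hypothetical readings

        slope, intercept = np.polyfit(conc, r1_obs, 1)        # slope = relaxivity r1
        print(f"r1 ~ {slope:.2f} s^-1 mM^-1, R1(0) ~ {intercept:.2f} s^-1")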

  16. Discussion on City Tunnel Construction Control Measures under Complex Environment

    钱雪锋; 王永朕

    2013-01-01

    In modern urban road construction, city tunnels crossing hills often encounter objectively unfavorable conditions: poor geology, very adverse terrain, and a harsh external construction environment. Taking a city tunnel project as an example, this paper discusses measures and methods for controlling city tunnel construction under adverse geological, topographic, and environmental conditions.

  17. Measurement of high energy neutrons (E > 50 MeV) at electron accelerators of INDUS accelerator complex using bismuth fission detectors

    This paper reports the measurement of the high-energy neutron component (E > 50 MeV) carried out at the INDUS-I (450 MeV) and INDUS-II (2.5 GeV) electron accelerators (RRCAT, Indore, India). The study is based on the registration of neutron-induced fission fragments from bismuth films in solid polymeric track detectors. The BFD stacks were exposed at the injection septums of the booster synchrotron and the Indus-1 and Indus-2 storage rings, where the dose due to beam loss is expected to be greatest. The detection efficiency of the bismuth fission detector (BFD) could be enhanced by enlarging the detector surface area, and accordingly a large-area spark counter was fabricated for automatic and rapid counting of the track densities. The dose equivalent rates were found to be 11.0 ± 0.7 mrem/h (73 h total exposure time), 11.0 ± 2.6 mrem/h (35 h total exposure time), and 65.0 mrem/h (5 h total exposure time) for the injection septums of the booster synchrotron, Indus-1, and Indus-2, respectively. However, the values reported here were not corrected for the contribution from photofissions, if any. (author)
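
    As a quick arithmetic check, integrating each reported rate over its stated exposure time gives the accumulated high-energy-neutron dose equivalent at each location (uncorrected for photofissions, as the authors note). A minimal Python sketch:

        locations = {
            "booster injection septum": (11.0, 73),  # (mrem/h, hours)
            "Indus-1 injection septum": (11.0, 35),
            "Indus-2 injection septum": (65.0, 5),
        }
        for name, (rate, hours) in locations.items():
            print(f"{name}: {rate * hours:.0f} mrem accumulated over {hours} h")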

  18. On the Use of Molecular Weight Cutoff Cassettes to Measure Dynamic Relaxivity of Novel Gadolinium Contrast Agents: Example Using Hyaluronic Acid Polymer Complexes in Phosphate-Buffered Saline

    The aims of this study were to determine whether standard extracellular contrast agents of Gd(III) ions in combination with a polymeric entity susceptible to hydrolytic degradation over a finite period of time, such as Hyaluronic Acid (HA), have sufficient vascular residence time to obtain comparable vascular imaging to current conventional compounds, and to obtain sufficient data to show proof of concept that HA with Gd-DTPA ligands could be useful as vascular imaging agents. We assessed the dynamic relaxivity of the HA-bound DTPA compounds using a custom-made phantom, as well as relaxation rates at 10.72 MHz with concentrations ranging between 0.09 and 7.96 mM in phosphate-buffered saline. Linear dependences of the static longitudinal relaxation rate (R1) on concentration were found for most measured samples, and the HA samples continued to produce high signal strength 24 hours after injection into a dialysis cassette at 3T, showing superior dynamic relaxivity values compared to conventional contrast media such as Gd-DTPA-BMA.

  19. A competitive indirect enzyme-linked immunoassay for lead ion measurement using mAbs against the lead-DTPA complex

    Xiang Junjian; Zhai Yifan [Molecular Immunology and Antibody Engineering Center, Jinan University, Guangzhou 510632 (China); Tang Yong, E-mail: ty7926@qq.co [Molecular Immunology and Antibody Engineering Center, Jinan University, Guangzhou 510632 (China); Wang Hong; Liu Bin; Guo Changwei [Molecular Immunology and Antibody Engineering Center, Jinan University, Guangzhou 510632 (China)

    2010-05-15

    Immunoassays for quantitative measurement of environmental heavy metals offer several advantages over other traditional methods. To develop an immunoassay for lead, Balb/c mice were immunized with a lead-chelate-protein conjugate to allow maximum exposure of the metal to the immune system. Three stable hybridoma cell lines were obtained through fusion of spleen cells with Sp2/0 cells. One cell line, 2A11D11, produced mAbs with preferential selectivity and sensitivity for Pb-DTPA over DTPA, exhibiting an affinity constant of 3.34 ± 0.24 × 10^9 M^-1. Cross-reactivity (CR) with other metals was below 1%, except for Fe(III), with a CR of less than 5%. This quantitative indirect ELISA for the lead ion was used to detect environmental lead content in local water sources; importantly, the results from the immunoassay were in excellent agreement with those from ICP-MS. Development of immunoassays for metal ions may thus facilitate the detection and regulation of environmental pollution.
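
    In a competitive indirect ELISA of this kind, free Pb(II) competes with a plate-bound conjugate, so the measured signal falls as the analyte concentration rises; unknowns are read off an inhibition standard curve. The Python sketch below fits a common four-parameter logistic (4PL) curve to invented standards; none of the numbers are from this paper.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, top, bottom, ic50, slope):
            """4PL inhibition curve: signal decreases from top to bottom around ic50."""
            return bottom + (top - bottom) / (1.0 + (x / ic50) ** slope)

        std_conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])     # ug/L Pb, hypothetical
        od450    = np.array([1.82, 1.70, 1.35, 0.90, 0.48, 0.25]) # absorbance, hypothetical

        params, _ = curve_fit(four_pl, std_conc, od450, p0=[1.9, 0.2, 2.0, 1.0])
        print(f"IC50 ~ {params[2]:.2f} ug/L")  # unknowns are inverse-predicted from this curve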

  20. A competitive indirect enzyme-linked immunoassay for lead ion measurement using mAbs against the lead-DTPA complex

    Immunoassays for quantitative measurement of environmental heavy metals offer several advantages over other traditional methods. To develop an immunoassay for lead, Balb/c mice were immunized with a lead-chelate-protein conjugate to allow maximum exposure of the metal to the immune system. Three stable hybridoma cell lines were obtained through fusion of spleen cells with Sp2/0 cells. One cell line, 2A11D11, produced mAbs with preferential selectivity and sensitivity for Pb-DTPA over DTPA, exhibiting an affinity constant of 3.34 ± 0.24 × 10^9 M^-1. Cross-reactivity (CR) with other metals was below 1%, except for Fe(III), with a CR of less than 5%. This quantitative indirect ELISA for the lead ion was used to detect environmental lead content in local water sources; importantly, the results from the immunoassay were in excellent agreement with those from ICP-MS. Development of immunoassays for metal ions may thus facilitate the detection and regulation of environmental pollution.