WorldWideScience

Sample records for acid-water complexes measured

  1. The benzoic acid-water complex: a potential atmospheric nucleation precursor studied using microwave spectroscopy and ab initio calculations.

    Science.gov (United States)

    Schnitzler, Elijah G; Jäger, Wolfgang

    2014-02-14

    The pure rotational, high-resolution spectrum of the benzoic acid-water complex was measured in the range of 4-14 GHz, using a cavity-based molecular beam Fourier-transform microwave spectrometer. In all, 40 a-type transitions and 2 b-type transitions were measured for benzoic acid-water, and 12 a-type transitions were measured for benzoic acid-D2O. The equilibrium geometry of benzoic acid-water was determined with ab initio calculations, at the B3LYP, M06-2X, and MP2 levels of theory, with the 6-311++G(2df,2pd) basis set. The experimental rotational spectrum is most consistent with the B3LYP-predicted geometry. Narrow splittings were observed in the b-type transitions, and possible tunnelling motions were investigated using the B3LYP/6-311++G(d,p) level of theory. Rotation of the water moiety about the lone electron pair hydrogen-bonded to benzoic acid, across a barrier of 7.0 kJ mol(-1), is the most likely cause for the splitting. Wagging of the unbound hydrogen atom of water is barrier-less, and this large amplitude motion results in the absence of c-type transitions. The interaction and spectroscopic dissociation energies calculated using B3LYP and MP2 are in good agreement, but those calculated using M06-2X indicate excess stabilization, possibly due to dispersive interactions being over-estimated. The equilibrium constant of hydration was calculated by statistical thermodynamics, using ab initio results and the experimental rotational constants. This allowed us to estimate the changes in percentage of hydrated benzoic acid with variations in the altitude, region, and season. Using monitoring data from Calgary, Alberta, and the MP2-predicted dissociation energy, a yearly average of 1% of benzoic acid is expected to be present in the form of benzoic acid-water. However, this percentage depends sensitively on the dissociation energy. For example, when using the M06-2X-predicted dissociation energy, we find it increases to 18%.
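The hydrated-fraction estimate described above follows from elementary statistical thermodynamics: an equilibrium constant derived from the binding energy, multiplied by the water activity. A minimal sketch in Python, using hypothetical numbers rather than the paper's computed partition functions or monitoring data:

```python
import math

R = 8.314462618  # gas constant, J mol^-1 K^-1

def hydrated_fraction(dG_assoc_kJ, p_water_Pa, T=298.15, p_std=101325.0):
    """Fraction of the acid bound as the acid-water complex, assuming an
    ideal-gas equilibrium  acid + H2O <=> complex  with standard Gibbs
    energy of association dG_assoc_kJ (negative = binding favored)."""
    K = math.exp(-dG_assoc_kJ * 1000.0 / (R * T))  # dimensionless K_p
    x = K * (p_water_Pa / p_std)                   # K_p times water activity
    return x / (1.0 + x)

# Hypothetical inputs (NOT the paper's values): dG = -10 kJ/mol and a
# water partial pressure of 1500 Pa (roughly 50% relative humidity at 25 C).
f = hydrated_fraction(-10.0, 1500.0)
```

Because the equilibrium constant depends exponentially on the energy, the fraction is very sensitive to it, which is consistent with the 1% versus 18% spread quoted in the abstract for the MP2 and M06-2X energies.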

  2. Lewis acid-water/alcohol complexes as hydrogen atom donors in radical reactions.

    Science.gov (United States)

    Povie, Guillaume; Renaud, Philippe

    2013-01-01

Water and low-molecular-weight alcohols are, owing to their availability, low price, and low toxicity, ideal reagents for organic synthesis. Recently, it was reported that, despite the very high bond dissociation energy (BDE) of the O-H bond, they can be used as hydrogen atom donors in place of expensive and/or toxic group 14 metal hydrides when boron or titanium(III) Lewis acids are present. This finding represents a considerable innovation and opens a new perspective on the paradigm of hydrogen atom transfer to radicals. We discuss here the influence of complex formation and other association processes on the efficacy of the hydrogen transfer step. A delicate balance between activation by complex formation and deactivation by further hydrogen bonding is operative.

  3. Measuring Tax Complexity

    OpenAIRE

    David Ulph

    2014-01-01

This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity and operational complexity. It considers the consequences/costs of complexity, and then examines the rationale for measuring complexity. Finally it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).

  4. Measuring static complexity

    Directory of Open Access Journals (Sweden)

    Ben Goertzel

    1992-01-01

Full Text Available The concept of “pattern” is introduced, formally defined, and used to analyze various measures of the complexity of finite binary sequences and other objects. The standard Kolmogoroff-Chaitin-Solomonoff complexity measure is considered, along with Bennett's “logical depth”, Koppel's “sophistication”, and Chaitin's analysis of the complexity of geometric objects. The pattern-theoretic point of view illuminates the shortcomings of these measures and leads to specific improvements; it gives rise to two novel mathematical concepts, “orders” of complexity and “levels” of pattern; and it yields a new measure of complexity, the “structural complexity”, which measures the total amount of structure an entity possesses.
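The Kolmogoroff-Chaitin-Solomonoff measure discussed in the abstract is uncomputable, but in practice it is commonly approximated from above by the output size of a real compressor. A toy illustration of that approximation (not the paper's pattern-theoretic formalism):

```python
import random
import zlib

def compression_complexity(s: str) -> int:
    """Proxy for Kolmogorov complexity: the length in bytes of the
    zlib-compressed string (an upper bound, up to compressor overhead)."""
    return len(zlib.compress(s.encode("utf-8"), 9))

periodic = "01" * 500  # highly patterned: a short description exists

random.seed(0)
random_bits = "".join(random.choice("01") for _ in range(1000))  # patternless

# The patterned sequence admits a far shorter compressed description.
low, high = compression_complexity(periodic), compression_complexity(random_bits)
```

Structured strings compress well and score low; algorithmically random strings resist compression and score high, mirroring the behavior the exact (uncomputable) measure would have.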

  5. Quantum State Complexity Measure

    CERN Document Server

    Campbell, Yuri

    2011-01-01

The role of complexity measures has become much clearer in recent years, as they help us better understand the dynamical behavior of complex systems. Despite the large number of measures proposed for classical systems, for quantum systems only extensions of Kolmogorov's algorithmic complexity have been proposed. The present approach therefore takes a new, mathematically well-established complexity measure for classical systems and extends it to assess the complexity of quantum states as well. The proposed extension is then applied to a mixed state constructed from a W-state together with controlled white noise, showing a convex behavior of the quantum state complexity. This reinforces the differences from previously known quantum complexities.

  6. Electric dipole moments of nitric acid-water complexes measured by cluster beam deflection

    CERN Document Server

    Moro, Ramiro; Kresin, Vitaly V

    2009-01-01

Water clusters embedding a nitric acid molecule, HNO3(H2O)n with n = 1-10, are investigated via electrostatic deflection of a molecular beam. We observe large paraelectric susceptibilities that greatly exceed the electronic polarizability, revealing the contribution of permanent dipole moments. The moments derived from the data are also significantly higher than those of pure water clusters. An enhancement in the susceptibility for n = 5, 6 and a rise in cluster abundances setting in at n = 6 suggest that dissociation of the solvated acid molecule into ions takes place in this size range.

  7. Measuring importance in complex networks

    Science.gov (United States)

    Morrison, Greg; Dudte, Levi; Mahadevan, L.

    2013-03-01

    A variety of centrality measures can be defined on a network to determine the global `importance' of a node i. However, the inhomogeneity of complex networks implies that not all nodes j will consider i equally important. In this talk, we use a linearized form of the Generalized Erdos numbers [Morrison and Mahadevan EPL 93 40002 (2011)] to define a pairwise measure of the importance of a node i from the perspective of node j which incorporates the global network topology. This localized importance can be used to define a global measure of centrality that is consistent with other well-known centrality measures. We illustrate the use of the localized importance in both artificial and real-world networks with a complex global topology.

  8. Hierarchy measure for complex networks.

    Directory of Open Access Journals (Sweden)

    Enys Mones

Full Text Available Nature, technology and society are full of complexity arising from the intricate web of interactions among the units of the related systems (e.g., proteins, computers, people). Consequently, one of the most successful recent approaches to capturing the fundamental features of the structure and dynamics of complex systems has been the investigation of the networks associated with the above units (nodes) together with their relations (edges). Most complex systems have an inherently hierarchical organization and, correspondingly, the networks behind them also exhibit hierarchical features. Indeed, several papers have been devoted to describing this essential aspect of networks; however, no widely accepted, converging concept has emerged concerning the quantitative characterization of the level of their hierarchy. Here we develop an approach and propose a quantity (measure) which is simple enough to be widely applicable, reveals a number of universal features of the organization of real-world networks and, as we demonstrate, is capable of capturing the essential features of the structure and the degree of hierarchy in a complex network. The measure we introduce is based on a generalization of the m-reach centrality, which we first extend to directed/partially directed graphs. Then, we define the global reaching centrality (GRC), which is the difference between the maximum and the average value of the generalized reach centralities over the network. We investigate the behavior of the GRC considering both a synthetic model with an adjustable level of hierarchy and real networks. Results for real networks show that our hierarchy measure is related to the controllability of the given system. We also propose a visualization procedure for large complex networks that can be used to obtain an overall qualitative picture about the nature of their hierarchical structure.
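The global reaching centrality described in the abstract has a simple unweighted form: each node's local reaching centrality is the fraction of the other nodes it can reach along directed paths, and the GRC averages the gaps to the maximum. A sketch under that definition (assuming the normalization by N-1 used by Mones et al.):

```python
from collections import deque

def local_reach(adj, i, n):
    """Fraction of the other n-1 nodes reachable from node i (directed BFS).
    adj maps a node to the list of its out-neighbors."""
    seen = {i}
    queue = deque([i])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return (len(seen) - 1) / (n - 1)

def global_reaching_centrality(adj, n):
    """GRC: average gap between the maximum local reaching centrality
    and each node's value, over all n nodes."""
    c = [local_reach(adj, i, n) for i in range(n)]
    c_max = max(c)
    return sum(c_max - ci for ci in c) / (n - 1)

# Star (perfect hierarchy): the root reaches everyone, leaves reach no one.
star = {0: [1, 2, 3, 4]}
# Directed cycle (no hierarchy): every node reaches every other node.
cycle = {i: [(i + 1) % 5] for i in range(5)}
```

On these extremes the measure behaves as the abstract suggests: the star scores 1 (maximal hierarchy) and the cycle scores 0 (no hierarchy).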

  9. Turbulence measurements over complex terrain

    Science.gov (United States)

    Skupniewicz, Charles E.; Kamada, Ray F.; Schacher, Gordon E.

    1989-07-01

    Horizontal turbulence measurements obtained from 22 wind sensors located on 9 towers in a mountainous coastal area are described and categorized by stability and terrain. Vector wind time series are high-pass filtered, and lateral and longitudinal wind speed variance is calculated for averaging times ranging from 15 s to 2 h. Parameterizations of the functional dependence of variance on averaging time are discussed, and a modification of Panofsky's (1988) uniform terrain technique applicable to complex terrain is presented. The parameterization is applied to the data and shown to be more realistic than a less complicated power law technique. The parameter values are shown to be different than the flat terrain cases of Kaimal et al. (1972), and are primarily a function of sensor location within the complex terrain. The parameters are also examined in terms of their dependence upon season, stability, marine boundary-layer height, and measurement height.

  10. Acceptable Complexity Measures of Theorems

    OpenAIRE

    Grenet, Bruno

    2009-01-01

In 1931, Gödel presented in Königsberg his famous Incompleteness Theorem, stating that some true mathematical statements are unprovable. Yet, this result gives us no idea about those independent (that is, true and unprovable) statements, about their frequency, the reason they are unprovable, and so on. Calude and Jürgensen proved in 2005 Chaitin's "heuristic principle" for an appropriate measure: the theorems of a finitely-specified theory cannot be significantly more complex than the t...

  11. Study on the properties of complexes of polycarboxylic acid water-reducing agent and sodium citrate

    Institute of Scientific and Technical Information of China (English)

    李萍; 蔡其全; 陈军超; 李薇; 宋明健; 张建兵; 唐小刚

    2013-01-01

This paper used sodium citrate and polycarboxylic acid water-reducing agent in compounding studies, investigating the influence of the compounded product at different sodium citrate contents on the fluidity of cement paste, the setting time, and the compressive strength of mortar, and proposing an optimal compounding recipe for the polycarboxylic acid water-reducing agent and sodium citrate. The experiments showed that, at a polycarboxylic acid water-reducing agent dosage of 0.13%, and considering the combined effects of sodium citrate on mortar retardation, auxiliary plasticization, and its contribution to mortar strength, the appropriate dosage of sodium citrate is 0.02%~0.03%.

  12. Complexity measurement based on information theory and Kolmogorov complexity.

    Science.gov (United States)

    Lui, Leong Ting; Terrazas, Germán; Zenil, Hector; Alexander, Cameron; Krasnogor, Natalio

    2015-01-01

    In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
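A rough sense of how the two ingredients, Shannon entropy and compression-based (Kolmogorov-style) complexity, behave on elementary cellular automata can be had in a few lines; this illustrates the ingredients only, not the integrated measure the authors propose:

```python
import zlib
from collections import Counter
from math import log2

def eca_spacetime(rule, width=64, steps=64):
    """Evolve an elementary cellular automaton from a single ON cell
    (periodic boundary); return the space-time diagram as a 0/1 string."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = []
    for _ in range(steps):
        rows.append("".join(map(str, cells)))
        cells = [(rule >> ((cells[(i - 1) % width] << 2)
                           | (cells[i] << 1)
                           | cells[(i + 1) % width])) & 1
                 for i in range(width)]
    return "".join(rows)

def shannon_entropy(s):
    """Per-symbol Shannon entropy of the string's character distribution."""
    counts, n = Counter(s), len(s)
    return -sum(c / n * log2(c / n) for c in counts.values())

def compressed_size(s):
    """Kolmogorov-complexity proxy: zlib-compressed length in bytes."""
    return len(zlib.compress(s.encode(), 9))

trivial = eca_spacetime(0)       # rule 0: everything dies immediately
complex_ca = eca_spacetime(110)  # rule 110: rich, structured behavior
```

Rule 110 produces a space-time diagram that is both higher in entropy and harder to compress than the trivial rule 0, which is the kind of contrast a combined measure must quantify.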

  13. Complexity measures, emergence, and multiparticle correlations

    CERN Document Server

    Galla, Tobias

    2011-01-01

    We study correlation measures for complex systems. First, we investigate some recently proposed measures based on information geometry. We show that these measures can increase under local transformations as well as under discarding particles, thereby questioning their interpretation as a quantifier for complexity or correlations. We then propose a refined definition of these measures, investigate its properties and discuss its numerical evaluation. As an example, we study coupled logistic maps and study the behavior of the different measures for that case. Finally, we investigate other local effects during the coarse graining of the complex system.

  14. Complexity measures, emergence, and multiparticle correlations

    Science.gov (United States)

    Galla, Tobias; Gühne, Otfried

    2012-04-01

    We study correlation measures for complex systems. First, we investigate some recently proposed measures based on information geometry. We show that these measures can increase under local transformations as well as under discarding particles, thereby questioning their interpretation as a quantifier for complexity or correlations. We then propose a refined definition of these measures, investigate its properties, and discuss its numerical evaluation. As an example, we study coupled logistic maps and study the behavior of the different measures for that case. Finally, we investigate other local effects during the coarse graining of the complex system.

  15. Metric for Early Measurement of Software Complexity

    Directory of Open Access Journals (Sweden)

Ghazal Keshavarz

    2011-06-01

Full Text Available Software quality depends on several factors, such as on-time delivery, staying within budget, and fulfilling users' needs. Complexity is one of the most important factors that may affect quality. Therefore, measuring and controlling complexity results in improved quality. So far, most research has tried to identify and measure complexity in the design and code phases. However, when we have the code or design for software, it is too late to control complexity. In this article, with emphasis on the Requirement Engineering process, we analyze the causes of software complexity, particularly in the first phase of software development, and propose a requirement-based metric. This metric enables a software engineer to measure complexity before actual design and implementation and to choose strategies appropriate to the software's degree of complexity, thus saving cost and human resources and, more importantly, leading to lower maintenance costs.

  16. Measurement methods on the complexity of network

    Institute of Scientific and Technical Information of China (English)

    LIN Lin; DING Gang; CHEN Guo-song

    2010-01-01

Based on the size of a network and the number of paths in it, we propose a model of network topology complexity. Based on analyses of the effects of the number of pieces of equipment, the types of equipment, and the processing time of each node on the complexity of an equipment-constrained network, we construct a complexity model to measure the integrated complexity of such a network. Algorithms for the two models are also developed. An automatic generator of random single-label networks was developed to test the models. The results show that the models can correctly evaluate the topology complexity and the integrated complexity of the networks.

  17. Cardiac Aging Detection Using Complexity Measures

    CERN Document Server

    Balasubramanian, Karthi

    2016-01-01

    As we age, our hearts undergo changes which result in reduction in complexity of physiological interactions between different control mechanisms. This results in a potential risk of cardiovascular diseases which are the number one cause of death globally. Since cardiac signals are nonstationary and nonlinear in nature, complexity measures are better suited to handle such data. In this study, non-invasive methods for detection of cardiac aging using complexity measures are explored. Lempel-Ziv (LZ) complexity, Approximate Entropy (ApEn) and Effort-to-Compress (ETC) measures are used to differentiate between healthy young and old subjects using heartbeat interval data. We show that both LZ and ETC complexity measures are able to differentiate between young and old subjects with only 10 data samples while ApEn requires at least 15 data samples.
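Of the three measures, Lempel-Ziv complexity is the simplest to sketch: it counts the distinct phrases in an LZ76-style parsing of a symbolized signal. A minimal version (illustrative only; the study's preprocessing of heartbeat-interval data is not reproduced here):

```python
import random

def lempel_ziv_complexity(s: str) -> int:
    """LZ76 complexity: the number of phrases produced when the string is
    parsed left to right, each phrase being the shortest block that has
    not yet appeared in the preceding part of the string."""
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        # grow the phrase while it already occurs earlier in the string
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

regular = "01" * 50  # periodic: parses into very few phrases
random.seed(7)
noisy = "".join(random.choice("01") for _ in range(100))  # many phrases
```

A regular rhythm yields few phrases while an irregular one yields many, which is why such measures can separate signal classes even from short symbol sequences like the 10-sample windows mentioned above.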

  18. Artificial sequences and complexity measures

    Science.gov (United States)

    Baronchelli, Andrea; Caglioti, Emanuele; Loreto, Vittorio

    2005-04-01

In this paper we exploit concepts of information theory to address the fundamental problem of identifying and defining the most suitable tools for extracting, in an automatic and agnostic way, information from a generic string of characters. We introduce in particular a class of methods which use in a crucial way data compression techniques in order to define a measure of remoteness and distance between pairs of sequences of characters (e.g. texts) based on their relative information content. We also discuss in detail how specific features of data compression techniques could be used to introduce the notions of the dictionary of a given sequence and of artificial text, and we show how these new tools can be used for information extraction purposes. We point out the versatility and generality of our method, which applies to any kind of corpora of character strings independently of the type of coding behind them. As a case study, we consider linguistically motivated problems and present results for automatic language recognition, authorship attribution and self-consistent classification.
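Compression-based distances between texts of the kind the abstract describes are usually realized as the normalized compression distance (NCD). A compact sketch using zlib (the authors' exact dictionary-based method differs, and the texts below are made-up examples):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C() is the compressed length under a real compressor (zlib)."""
    cx, cy = len(zlib.compress(x, 9)), len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Made-up sample texts: two English-like strings and one Spanish-like string.
english_a = b"the quick brown fox jumps over the lazy dog " * 20
english_b = b"a lazy dog sleeps while the quick brown fox runs by " * 20
spanish = b"el veloz zorro marron salta sobre el perro perezoso " * 20
```

Because the compressor reuses shared substrings, texts in the same language compress better together, so same-language pairs come out "closer" than cross-language pairs; this is the mechanism behind compression-based language recognition and authorship attribution.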

  19. Measuring Complexity in an Aquatic Ecosystem

    OpenAIRE

    Fernandez, Nelson; Gershenson, Carlos

    2013-01-01

We apply formal measures of emergence, self-organization, homeostasis, autopoiesis and complexity to an aquatic ecosystem; in particular to the physiochemical component of an Arctic lake. These measures are based on information theory. Variables with a homogeneous distribution have higher values of emergence, while variables with a more heterogeneous distribution have a higher self-organization. Variables with a high complexity reflect a balance between change (emergence) and regularity/orde...

  20. A Simple Measure of Economic Complexity

    OpenAIRE

    Inoua, Sabiou

    2016-01-01

    We show from a simple model that a country's technological development can be measured by the logarithm of the number of products it makes. We show that much of the income gaps among countries are due to differences in technology, as measured by this simple metric. Finally, we show that the so-called Economic Complexity Index (ECI), a recently proposed measure of collective knowhow, is in fact an estimate of this simple metric (with correlation above 0.9).

  1. Bernoulli measure of complex admissible kneading sequences

    CERN Document Server

    Bruin, Henk

    2012-01-01

Iterated quadratic polynomials give rise to a rich collection of different dynamical systems that are parametrized by a simple complex parameter c. The different dynamical features are encoded by the kneading sequence, which is an infinite sequence over {0, 1}. Not every such sequence actually occurs in complex dynamics. The set of admissible kneading sequences was described by Milnor and Thurston for real quadratic polynomials, and by the authors in the complex case. We prove that the set of admissible kneading sequences has positive Bernoulli measure within the set of sequences over {0, 1}.

  2. Measurement of Diffusion in Flowing Complex Fluids

    OpenAIRE

    Leonard, Edward F.; Aucoin, Christian P.; Nanne, Edgar E.

    2006-01-01

    A microfluidic device for the measurement of solute diffusion as well as particle diffusion and migration in flowing complex fluids is described. The device is particularly suited to obtaining diffusivities in such fluids, which require a desired flow state to be maintained during measurement. A method based on the Loschmidt diffusion theory and short times of exposure is presented to allow calculation of diffusivities from concentration differences in the flow streams leaving the cell.

  3. Complexity measurement of natural and artificial languages

    CERN Document Server

    Febres, Gerardo; Gershenson, Carlos

    2013-01-01

We compared entropy for texts written in natural languages (English, Spanish) and artificial languages (computer software) based on a simple expression for the entropy as a function of message length and specific word diversity. Code text written in artificial languages showed higher entropy than text of similar length expressed in natural languages. Spanish texts exhibit more symbolic diversity than English ones. Results showed that algorithms based on complexity measures differentiate artificial from natural languages, and that text analysis based on complexity measures allows the unveiling of important aspects of their nature. We propose specific expressions to examine entropy-related aspects of texts and estimate the values of entropy, emergence, self-organization and complexity based on specific diversity and message length.

  4. Balancing model complexity and measurements in hydrology

    Science.gov (United States)

    Van De Giesen, N.; Schoups, G.; Weijs, S. V.

    2012-12-01

    The Data Processing Inequality implies that hydrological modeling can only reduce, and never increase, the amount of information available in the original data used to formulate and calibrate hydrological models: I(X;Z(Y)) ≤ I(X;Y). Still, hydrologists around the world seem quite content building models for "their" watersheds to move our discipline forward. Hydrological models tend to have a hybrid character with respect to underlying physics. Most models make use of some well established physical principles, such as mass and energy balances. One could argue that such principles are based on many observations, and therefore add data. These physical principles, however, are applied to hydrological models that often contain concepts that have no direct counterpart in the observable physical universe, such as "buckets" or "reservoirs" that fill up and empty out over time. These not-so-physical concepts are more like the Artificial Neural Networks and Support Vector Machines of the Artificial Intelligence (AI) community. Within AI, one quickly came to the realization that by increasing model complexity, one could basically fit any dataset but that complexity should be controlled in order to be able to predict unseen events. The more data are available to train or calibrate the model, the more complex it can be. Many complexity control approaches exist in AI, with Solomonoff inductive inference being one of the first formal approaches, the Akaike Information Criterion the most popular, and Statistical Learning Theory arguably being the most comprehensive practical approach. In hydrology, complexity control has hardly been used so far. There are a number of reasons for that lack of interest, the more valid ones of which will be presented during the presentation. For starters, there are no readily available complexity measures for our models. Second, some unrealistic simplifications of the underlying complex physics tend to have a smoothing effect on possible model
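The Akaike Information Criterion mentioned above trades goodness of fit against parameter count: for least-squares models, AIC = n ln(RSS/n) + 2k. A small illustration of complexity control, fitting a constant versus a line to noisy linear data (a synthetic example, not a hydrological model):

```python
import math
import random

def aic(n, rss, k):
    """Akaike Information Criterion for a least-squares fit with
    n points, residual sum of squares rss, and k free parameters."""
    return n * math.log(rss / n) + 2 * k

# Synthetic data: a straight line plus Gaussian noise.
random.seed(1)
xs = [i / 10 for i in range(100)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.3) for x in xs]
n = len(xs)

# Model 1: constant mean (k = 1).
mean_y = sum(ys) / n
rss1 = sum((y - mean_y) ** 2 for y in ys)

# Model 2: straight line (k = 2), closed-form least squares.
mean_x = sum(xs) / n
sxx = sum((x - mean_x) ** 2 for x in xs)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
slope = sxy / sxx
intercept = mean_y - slope * mean_x
rss2 = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))

aic_const, aic_line = aic(n, rss1, 1), aic(n, rss2, 2)
# AIC rewards the line's better fit despite its extra parameter; with a
# flat signal, the penalty term 2k would instead favor the constant model.
```

The 2k penalty is what keeps the criterion from always preferring the more complex model, which is exactly the complexity-control role the abstract argues is missing from hydrological practice.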

  5. Study on fluorescence spectra of molecular association of acetic acid-water

    Institute of Scientific and Technical Information of China (English)

    Caiqin Han; Ying Liu; Yang Yang; Xiaowu Ni; Jian Lu; Xiaosen Luo

    2009-01-01

    Fluorescence spectra of acetic acid-water solution excited by ultraviolet (UV) light are studied, and the relationship between fluorescence spectra and molecular association of acetic acid is discussed. The results indicate that when the exciting light wavelength is longer than 246 nm, there are two fluorescence peaks located at 305 and 334 nm, respectively. By measuring the excitation spectra, the optimal wavelengths of the two fluorescence peaks are obtained, which are 258 and 284 nm, respectively. Fluorescence spectra of acetic acid-water solution change with concentrations, which is primarily attributed to changes of molecular association of acetic acid in aqueous solution. Through theoretical analysis, three variations of molecular association have been obtained in acetic acid-water solution, which are the hydrated monomers, the linear dimers, and the water separated dimers. This research can provide references to studies of molecular association of acetic acid-water, especially studies of hydrogen bonds.

  6. Residual radioactivity measurements at Indus accelerator complex

    International Nuclear Information System (INIS)

    Indus-1 and Indus-2 are two Synchrotron Radiation Sources (SRS) operational at RRCAT, Indore. Indus-1 and Indus-2 are designed for maximum electron beam energy of 450 MeV and 2.5 GeV respectively. During shut down of these accelerators for maintenance purpose, residual radioactivity measurements were carried out. The residual radioactivity formation in various parts of the high energy electron accelerators is due to the beam loss taking place at these locations. The present paper describes the recent residual radioactivity measurements carried out at the electron accelerators of Indus Accelerator Complex and the radio-isotopes identified. The maximum dose rate due to induced activity obtained is 30 μSv/h, near dipole-5 of booster synchrotron after 12 h of cooling time. In case of Indus-1 and Indus-2 SRS the dose rate due to induced radioactivity is found to be of the order of 2 - 3 μSv/h. The radio isotopes identified at these beam loss locations are beta emitters that do not pose serious external hazard to the working personnel. However, precautions are to be observed while doing maintenance on activated components. The paper describes the measurements in detail with the results. (author)

  7. Consistently weighted measures for complex network topologies

    CERN Document Server

    Heitzig, Jobst; Zou, Yong; Marwan, Norbert; Kurths, Jürgen

    2011-01-01

    When network and graph theory are used in the study of complex systems, a typically finite set of nodes of the network under consideration is frequently either explicitly or implicitly considered representative of a much larger finite or infinite set of objects of interest. The selection procedure, e.g., formation of a subset or some kind of discretization or aggregation, typically results in individual nodes of the studied network representing quite differently sized parts of the domain of interest. This heterogeneity may induce substantial bias and artifacts in derived network statistics. To avoid this bias, we propose an axiomatic scheme based on the idea of {\\em node splitting invariance} to derive consistently weighted variants of various commonly used statistical network measures. The practical relevance and applicability of our approach is demonstrated for a number of example networks from different fields of research, and is shown to be of fundamental importance in particular in the study of climate n...

  8. Measuring multiple evolution mechanisms of complex networks.

    Science.gov (United States)

    Zhang, Qian-Ming; Xu, Xiao-Ke; Zhu, Yu-Xiao; Zhou, Tao

    2015-01-01

Numerous concise models such as preferential attachment have been put forward to reveal the evolution mechanisms of real-world networks, which show that real-world networks are usually jointly driven by a hybrid mechanism of multiplex features instead of a single pure mechanism. To simulate real networks accurately, some researchers have proposed hybrid models that mix multiple evolution mechanisms. Nevertheless, how a hybrid mechanism of multiplex features jointly influences network evolution is not very clear. In this study, we introduce two methods (link prediction and likelihood analysis) to measure multiple evolution mechanisms of complex networks. Through extensive experiments on artificial networks, which can be controlled to follow multiple mechanisms with different weights, we find that the method based on likelihood analysis performs much better and gives very accurate estimations. Finally, we apply this method to some real-world networks from different domains (including technology networks and social networks) and different countries (e.g., USA and China), to see how popularity and clustering co-evolve. We find most of them are affected by both popularity and clustering, but with quite different weights.

  9. A New Method for Measurement and Reduction of Software Complexity

    Institute of Scientific and Technical Information of China (English)

    SHI Yindun; XU Shiyi

    2007-01-01

This paper develops an improved structural software complexity metric, named information flow complexity, which is closely related to the reliability of software. Together with the three software complexity metrics, the total software complexity is measured, and some rules to reduce the complexity are presented in the paper. To illustrate and explain the process of measurement and reduction of software complexity, several examples and experiments are given. It is proposed that software complexity metrics can be measured earlier in software development and can provide substantial information about software systems whose reliability can be modeled and used in the determination of initial parameter estimation.

  10. Thermodynamic properties of citric acid and the system citric acid-water

    NARCIS (Netherlands)

    Kruif, C.G. de; Miltenburg, J.C. van; Sprenkels, A.J.J.; Stevens, G.; Graaf, W. de; Wit, H.G.M. de

    1982-01-01

    The binary system citric acid-water has been investigated with static vapour pressure measurements, adiabatic calorimetry, solution calorimetry, solubility measurements and powder X-ray measurements. The data are correlated by thermodynamics and a large part of the phase diagram is given. Molar heat

  11. Complexity measures in magnetoencephalography: measuring "disorder" in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Matthew J Brookes

    Full Text Available This paper details a methodology which, when applied to magnetoencephalography (MEG data, is capable of measuring the spatio-temporal dynamics of 'disorder' in the human brain. Our method, which is based upon signal entropy, shows that spatially separate brain regions (or networks generate temporally independent entropy time-courses. These time-courses are modulated by cognitive tasks, with an increase in local neural processing characterised by localised and transient increases in entropy in the neural signal. We explore the relationship between entropy and the more established time-frequency decomposition methods, which elucidate the temporal evolution of neural oscillations. We observe a direct but complex relationship between entropy and oscillatory amplitude, which suggests that these metrics are complementary. Finally, we provide a demonstration of the clinical utility of our method, using it to shed light on aberrant neurophysiological processing in schizophrenia. We demonstrate significantly increased task induced entropy change in patients (compared to controls in multiple brain regions, including a cingulo-insula network, bilateral insula cortices and a right fronto-parietal network. These findings demonstrate potential clinical utility for our method and support a recent hypothesis that schizophrenia can be characterised by abnormalities in the salience network (a well characterised distributed network comprising bilateral insula and cingulate cortices.

  12. Complexity measures in magnetoencephalography: measuring "disorder" in schizophrenia.

    Science.gov (United States)

    Brookes, Matthew J; Hall, Emma L; Robson, Siân E; Price, Darren; Palaniyappan, Lena; Liddle, Elizabeth B; Liddle, Peter F; Robinson, Stephen E; Morris, Peter G

    2015-01-01

    This paper details a methodology which, when applied to magnetoencephalography (MEG) data, is capable of measuring the spatio-temporal dynamics of 'disorder' in the human brain. Our method, which is based upon signal entropy, shows that spatially separate brain regions (or networks) generate temporally independent entropy time-courses. These time-courses are modulated by cognitive tasks, with an increase in local neural processing characterised by localised and transient increases in entropy in the neural signal. We explore the relationship between entropy and the more established time-frequency decomposition methods, which elucidate the temporal evolution of neural oscillations. We observe a direct but complex relationship between entropy and oscillatory amplitude, which suggests that these metrics are complementary. Finally, we provide a demonstration of the clinical utility of our method, using it to shed light on aberrant neurophysiological processing in schizophrenia. We demonstrate significantly increased task induced entropy change in patients (compared to controls) in multiple brain regions, including a cingulo-insula network, bilateral insula cortices and a right fronto-parietal network. These findings demonstrate potential clinical utility for our method and support a recent hypothesis that schizophrenia can be characterised by abnormalities in the salience network (a well characterised distributed network comprising bilateral insula and cingulate cortices).
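
The notion of an entropy time-course can be illustrated with a simple sliding-window estimator. This is only a generic sketch: the window length, bin count, and the use of amplitude-histogram Shannon entropy are illustrative assumptions, not the estimator used in the study.

```python
import numpy as np

def sliding_window_entropy(signal, window=256, step=64, bins=16):
    """Shannon entropy (bits) of the amplitude histogram in sliding windows,
    yielding an entropy time-course: one value per window position."""
    entropies = []
    for start in range(0, len(signal) - window + 1, step):
        counts, _ = np.histogram(signal[start:start + window], bins=bins)
        p = counts[counts > 0] / counts.sum()
        entropies.append(float(-np.sum(p * np.log2(p))))
    return np.array(entropies)

# A flat (ordered) segment followed by Gaussian noise (disordered):
rng = np.random.default_rng(0)
signal = np.concatenate([np.zeros(1024), rng.normal(size=1024)])
h = sliding_window_entropy(signal)   # entropy rises in the noisy half
```

Windows over the flat segment yield zero entropy, while windows over the noisy segment yield high entropy, mimicking a transient increase in "disorder".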

  13. Evaluating quantitative measures of grammatical complexity in spontaneous speech samples.

    Science.gov (United States)

    Blake, J; Quartaro, G; Onorati, S

    1993-02-01

    The validity of MLU and a measure of syntactic complexity were tested against LARSP on spontaneous speech samples from 87 children, ranging in age from 1;6 to 4;9. Change in some LARSP clausal measures was found across MLU stages up to MLU 4.5. For the measure of syntactic complexity, no such ceiling was found for the clausal connectivity score in LARSP or for average clausal complexity in LARSP. Neither MLU nor the measure of syntactic complexity indexed LARSP phrasal complexity. It is concluded that MLU is a valid measure of clausal complexity up to 4.5 and that our measure of syntactic complexity is more valid at more advanced stages.
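
MLU itself is straightforward to compute. The sketch below counts whitespace-separated tokens as a stand-in for morphemes (a simplifying assumption; real MLU requires morphemic segmentation), and the sample utterances are invented.

```python
def mean_length_of_utterance(utterances):
    """MLU: average number of morphemes per utterance. Here each
    whitespace-separated token counts as one morpheme (a simplification)."""
    counts = [len(u.split()) for u in utterances]
    return sum(counts) / len(counts)

sample = ["want cookie", "mommy go car", "I want the big cookie"]
mlu = mean_length_of_utterance(sample)   # (2 + 3 + 5) / 3
```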

  14. Measuring Customer Profitability in Complex Environments

    DEFF Research Database (Denmark)

    Holm, Morten; Kumar, V.; Rohde, Carsten

    2012-01-01

    Customer profitability measurement is an important element in customer relationship management and a lever for enhanced marketing accountability. Two distinct measurement approaches have emerged in the marketing literature: Customer Lifetime Value (CLV) and Customer Profitability Analysis (CPA...

  15. Laser beam complex amplitude measurement by phase diversity

    OpenAIRE

    Védrenne, Nicolas; Mugnier, Laurent M.; Michau, Vincent; Velluet, Marie-Thérèse; Bierent, Rudolph

    2014-01-01

    The control of the optical quality of a laser beam requires a complex amplitude measurement able to deal with strong modulus variations and potentially highly perturbed wavefronts. The method proposed here consists in an extension of phase diversity to complex amplitude measurements that is effective for highly perturbed beams. Named CAMELOT for Complex Amplitude MEasurement by a Likelihood Optimization Tool, it relies on the acquisition and processing of a few images of the beam section taken ...

  16. Complexity analysis in particulate matter measurements

    Directory of Open Access Journals (Sweden)

    Luciano Telesca

    2011-09-01

    Full Text Available We investigated the complex temporal fluctuations of particulate matter data recorded in the London area by using the Fisher-Shannon (FS) information plane. In the FS plane the PM10 and PM2.5 data are aggregated in two different clusters, characterized by different degrees of order and organization. These results could be related to different sources of the particulate matter.

  17. Reclamation of acid waters using sewage sludge.

    Science.gov (United States)

    Davison, W; Reynolds, C S; Tipping, E; Needham, R F

    1989-01-01

    An exhausted sand quarry which had filled with acid water (pH 3) from the oxidation of pyrite was treated with calcium hydroxide to neutralize the water (pH 8), and sewage sludge to prevent further ingress of acid. The water remained neutral for 2 years, an appreciable quantity of base being generated by the reduction of sulphate to sulphide in the anoxic sediment formed by the sewage sludge. After this time the water reverted to acid conditions, chiefly because the lake was too shallow to retain the sewage sludge over a sufficiently large area of its bed. Incubation experiments showed that the sewage sludge had a large capacity for sulphate reduction, which was equally efficient in acid or neutral waters and that the areal rate of consumption was sufficiently fast to neutralize all incoming acid, if at least 50% of the lake bed was covered with sludge. Throughout the course of the field investigations there was no foul smell and the lake was quickly colonized by phytoplankton, macrophytes and insects. Although nutrients associated with the sewage sludge stimulated photosynthesis and so caused the generation of additional organic matter, they were exhausted within two years. To ensure permanent reclamation, phosphate fertilizer could be added once the initial supply has been consumed. Neutralization removed trace metals from the system, presumably due to formation of insoluble oxyhydroxide and carbonates. The solubility of aluminium was apparently controlled by a basic aluminium sulphate (jurbanite).

  18. Comparative Analysis of EEG Signals Based on Complexity Measure

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The aim of this study is to identify the functions and states of the brain according to the values of the complexity measure of the EEG signals. The EEG signals of 30 normal samples and 30 patient samples are collected. After preprocessing of the raw data, a computational program for the complexity measure is compiled and the complexity measures of all samples are calculated. The mean value and standard error of the complexity measure for the control group are 0.33 and 0.10, and for the normal group 0.53 an...
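
The abstract does not specify which complexity measure was used; one common choice for EEG work is a Lempel-Ziv-style phrase count on a symbolized signal. The sketch below implements an LZ78-style greedy parsing purely as an illustration, not the study's actual metric.

```python
def lz_complexity(s):
    """LZ78-style complexity: number of distinct phrases produced by a
    greedy left-to-right parsing of the symbol sequence s. Regular
    sequences parse into few phrases; irregular ones into many."""
    phrases = set()
    phrase = ""
    for ch in s:
        phrase += ch
        if phrase not in phrases:
            # phrase is new: record it and start a fresh phrase
            phrases.add(phrase)
            phrase = ""
    return len(phrases)
```

For example, the highly regular `"a" * 16` parses into 5 phrases (a, aa, aaa, aaaa, aaaaa), while `"abcdefgh"` needs 8. In EEG practice the signal would first be binarized, e.g. by thresholding at the median.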

  19. An entropy based measure for comparing distributions of complexity

    Science.gov (United States)

    Rajaram, R.; Castellani, B.

    2016-07-01

    This paper is part of a series addressing the empirical/statistical distribution of the diversity of complexity within and amongst complex systems. Here, we consider the problem of measuring the diversity of complexity in a system, given its ordered range of complexity types i and their probability of occurrence p_i, with the understanding that larger values of i mean a higher degree of complexity. To address this problem, we introduce a new complexity measure called case-based entropy Cc, a modification of the Shannon-Wiener entropy measure H. The utility of this measure is that, unlike current complexity measures, which focus on the macroscopic complexity of a single system, Cc can be used to empirically identify and measure the distribution of the diversity of complexity within and across multiple natural and human-made systems, as well as the diversity contribution of complexity of any part of a system, relative to the total range of ordered complexity types.
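
The abstract does not give the exact form of Cc, so the sketch below computes only the quantities it builds on: the Shannon-Wiener entropy H over the probabilities p_i of the ordered complexity types, and the corresponding "true diversity" exp(H). The distribution used is hypothetical.

```python
import math

def shannon_wiener(p):
    """Shannon-Wiener entropy H = -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Ordered complexity types i = 1..4 with occurrence probabilities p_i:
p = [0.4, 0.3, 0.2, 0.1]
H = shannon_wiener(p)
D = math.exp(H)   # "true diversity": effective number of equally likely types
```

For a uniform distribution over k types, H = ln k and D = k; any skew toward particular types lowers both.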

  20. Unraveling chaotic attractors by complex networks and measurements of stock market complexity.

    Science.gov (United States)

    Cao, Hongduo; Li, Ying

    2014-03-01

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a similar meaning to the Kolmogorov complexity (calculated from the Lempel-Ziv complexity), and is an appropriate measure of a series' complexity. The proposed method is used to research the complexity of the world's major capital markets. None of these markets are completely random, and they have different degrees of complexity, both over the entire length of their time series and at a level of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price memory process.

  1. Unraveling chaotic attractors by complex networks and measurements of stock market complexity

    International Nuclear Information System (INIS)

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a similar meaning to the Kolmogorov complexity (calculated from the Lempel–Ziv complexity), and is an appropriate measure of a series' complexity. The proposed method is used to research the complexity of the world's major capital markets. None of these markets are completely random, and they have different degrees of complexity, both over the entire length of their time series and at a level of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price memory process

  2. Measuring control structure complexity through execution sequence grammars

    OpenAIRE

    MacLennan, Bruce J.

    1981-01-01

    A method for measuring the complexity of control structures is presented. It is based on the size of a grammar describing the possible execution sequences of the control structure. This method is applied to a number of control structures, including Pascal's control structures, Dijkstra's operators, and a structure recently proposed by Parnas. The verification of complexity measures is briefly discussed. (Author)

  3. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    Science.gov (United States)

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged the current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both off-line and online measures associated with complex problem solving. Forty-eight participants…

  4. SAT is a problem with exponential complexity measured by negentropy

    OpenAIRE

    Pan, Feng(Department of Physics, Liaoning Normal University, Dalian 116029, China)

    2014-01-01

    In this paper, the reason why entropy reduction (negentropy) can be used to measure the complexity of any computation was first elaborated from both the mathematical and the informational-physics points of view. At the same time, the equivalence of computation and information was clearly stated. Then the complexities of three specific problems: logical compare, sorting and SAT, were analyzed and measured. The result showed SAT was a problem with exponential complexity, which naturally leads to the conclusio...

  5. Clinical complexity in medicine: A measurement model of task and patient complexity

    Science.gov (United States)

    Islam, R.; Weir, C.; Fiol, G. Del

    2016-01-01

    Summary Background Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. Objective The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components, focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Methods Three clinical Infectious Disease teams were observed, audio-recorded and transcribed. Each team included an Infectious Diseases expert, one Infectious Diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial round of coding and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds from the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. Results The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. Conclusion The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare. PMID:26404626
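
Inter-rater reliability here was assessed with Cohen's kappa, which corrects raw agreement for chance agreement. A minimal sketch (the labels and ratings below are invented):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the agreement expected by chance from each rater's
    label marginals."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical coders labelling four transcript segments:
kappa = cohens_kappa(["low", "low", "high", "high"],
                     ["low", "low", "high", "low"])   # 0.5: moderate agreement
```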

  6. On the complexity of computing two nonlinearity measures

    DEFF Research Database (Denmark)

    Find, Magnus Gausdal

    2014-01-01

    We study the computational complexity of two Boolean nonlinearity measures: the nonlinearity and the multiplicative complexity. We show that if one-way functions exist, no algorithm can compute the multiplicative complexity in time 2^O(n) given the truth table of length 2^n; in fact, under the same assumption it is impossible to approximate the multiplicative complexity within a factor of (2−ϵ)^(n/2). When given a circuit, the problem of determining the multiplicative complexity is in the second level of the polynomial hierarchy. For nonlinearity, we show that it is #P hard to compute given a function...
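
The nonlinearity of a Boolean function, its Hamming distance to the nearest affine function, is computable from the truth table with a fast Walsh-Hadamard transform, as this sketch shows:

```python
def nonlinearity(truth_table):
    """Nonlinearity of a Boolean function given its truth table of
    length 2^n: NL = 2^(n-1) - max_a |W(a)| / 2, where W is the
    Walsh-Hadamard transform of the signed function (-1)^f(x)."""
    n = len(truth_table).bit_length() - 1
    assert len(truth_table) == 1 << n
    w = [1 - 2 * b for b in truth_table]   # signed values (-1)^f(x)
    # in-place fast Walsh-Hadamard transform
    h = 1
    while h < len(w):
        for i in range(0, len(w), 2 * h):
            for j in range(i, i + h):
                w[j], w[j + h] = w[j] + w[j + h], w[j] - w[j + h]
        h *= 2
    return (1 << (n - 1)) - max(abs(v) for v in w) // 2
```

For two variables, AND (truth table `[0, 0, 0, 1]`) has nonlinearity 1, while XOR (`[0, 1, 1, 0]`) is affine and has nonlinearity 0. The abstract's point is that the *multiplicative* complexity, by contrast, is not believed to be computable in time polynomial in the table length.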

  7. Complexity Measurement of Large-Scale Software System Based on Complex Network

    Directory of Open Access Journals (Sweden)

    Dali Li

    2014-05-01

    Full Text Available With the increase of software system complexity, traditional measurements cannot meet the requirements: developers need to control software quality effectively and guarantee the normal operation of the software system. Hence, how to measure the complexity of a large-scale software system has been a challenging problem. To solve this problem, developers first have to obtain a good method for measuring the complexity of the software system. Only through this work can the software quality and the software structure be controlled and optimized. Noting that complex network theory has offered a new theoretical understanding and a new perspective for solving this kind of complexity problem, this work discusses the complexity phenomenon in large-scale software systems. Based on this, some complexity measurements of large-scale software systems are put forward from static-structure and dynamic-structure perspectives. Furthermore, we find some potential complexity characteristics in large-scale software networks through numerical simulations. The proposed measurement methods have guiding significance for the development of today's large-scale software systems. In addition, this paper presents a new technique for the structural complexity measurement of large-scale software systems.

  8. Laser beam complex amplitude measurement by phase diversity.

    Science.gov (United States)

    Védrenne, Nicolas; Mugnier, Laurent M; Michau, Vincent; Velluet, Marie-Thérèse; Bierent, Rudolph

    2014-02-24

    The control of the optical quality of a laser beam requires a complex amplitude measurement able to deal with strong modulus variations and potentially highly perturbed wavefronts. The method proposed here consists in an extension of phase diversity to complex amplitude measurements that is effective for highly perturbed beams. Named CAMELOT for Complex Amplitude MEasurement by a Likelihood Optimization Tool, it relies on the acquisition and processing of a few images of the beam section taken along the optical path. The complex amplitude of the beam is retrieved from the images by the minimization of a Maximum a Posteriori error metric between the images and a model of the beam propagation. The analytical formalism of the method and its experimental validation are presented. The modulus of the beam is compared to a measurement of the beam profile, the phase of the beam is compared to a conventional phase diversity estimate. The precision of the experimental measurements is investigated by numerical simulations.

  9. A Complexity measure based on Requirement Engineering Document

    CERN Document Server

    Sharma, Ashish

    2010-01-01

    Research shows that the major issue in the development of quality software is precise estimation. Further, this estimation depends upon the degree of intricacy inherent in the software, i.e. its complexity. This paper attempts to empirically demonstrate the proposed complexity measure, which is based on the IEEE Requirement Engineering document. It is said that a high quality SRS is a prerequisite for high quality software. A Requirement Engineering document (SRS) is a specification for a particular software product, program or set of programs that performs certain functions for a specific environment. The various complexity measures given so far are based on code and cognitive metric values of software, i.e. they are code based. So these metrics provide no leverage to the developer of the code. Considering the shortcomings of code based approaches, the proposed approach identifies the complexity of software immediately after freezing the requirement in the SDLC process. The proposed complexity measure compares well with established complexity...

  10. A Simple Complexity Measurement for Software Verification and Software Testing

    OpenAIRE

    Cheng, Zheng; Monahan, Rosemary; Power, James F.

    2012-01-01

    In this paper, we used a simple metric (i.e. Lines of Code) to measure the complexity involved in software verification and software testing. The goal, then, is to argue for software verification over software testing, and to motivate a discussion of how to reduce the complexity involved in software verification. We propose to reduce this complexity by translating the software to a simple intermediate representation which can be verified using an efficient verifier, such as Boog...

  11. SYNAPTONEMAL COMPLEX DAMAGE AS A MEASURE OF GENOTOXICITY AT MEIOSIS

    Science.gov (United States)

    Synaptonemal complex aberrations can provide a sensitive measure of chemical-specific alterations to meiotic chromosomes. Mitomycin C, cyclophosphamide, amsacrine, ellipticine, colchicine, vinblastine sulfate, and cis-platin exposures in mice have been shown to cause various patt...

  12. Fisher Information and Complexity Measure of Generalized Morse Potential Model

    Science.gov (United States)

    Onate, C. A.; Idiodi, J. O. A.

    2016-09-01

    The spreading of the quantum-mechanical probability distribution density of the three-dimensional system is quantitatively determined by means of the local information-theoretic quantities of the Shannon information and information energy in both position and momentum spaces. The complexity measure, which is equivalent to the Cramer–Rao uncertainty product, is determined. We have obtained the information content stored, the concentration of the quantum system and the complexity measure numerically for n = 0, 1, 2 and 3 respectively.

  13. A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters

    Science.gov (United States)

    Johnsson, P.A.; Lord, D.G.

    1987-01-01

    ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductances are performed as checks on the analytical accuracy of chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses. (Author's abstract)
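
The ionic balance check that ARCHEM performs amounts to a charge balance error computation over the major ions. A minimal sketch (the ion list and concentrations below are hypothetical, and ARCHEM's actual implementation may differ):

```python
def charge_balance_error(cations_meq, anions_meq):
    """Charge balance error (%) from summed cation and anion concentrations
    in meq/L; |CBE| much beyond ~5% suggests analytical error or a
    missing species (e.g. an unmeasured organic anion)."""
    c, a = sum(cations_meq), sum(anions_meq)
    return 100.0 * (c - a) / (c + a)

# Hypothetical dilute acidic sample, concentrations already in meq/L:
cations = [0.05, 0.02, 0.01, 0.08]   # e.g. Ca2+, Mg2+, Na+, H+
anions = [0.09, 0.05, 0.015]         # e.g. SO4(2-), NO3-, Cl-
cbe = charge_balance_error(cations, anions)
```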

  14. High Dynamic Range Complex Impedance Measurement System for Petrophysical Usage

    Science.gov (United States)

    Chen, R.; He, X.; Yao, H.; Tan, S.; Shi, H.; Shen, R.; Yan, C.; Zeng, P.; He, L.; Qiao, N.; Xi, F.; Zhang, H.; Xie, J.

    2015-12-01

    Spectral induced polarization (SIP), or the complex resistivity method, is seeing increasing application in metalliferous ore exploration, hydrocarbon exploration, underground water exploration, monitoring of environmental pollution, and the evaluation of environmental remediation. The measurement of the complex resistivity or complex impedance of rock/ore samples and polluted water plays a fundamental role in improving the effectiveness and scope of SIP applications. However, current instruments cannot guarantee the accuracy of measurement when the resistance of the sample is less than 10 Ω or greater than 100 kΩ. Many samples, such as liquids, polluted sea water, igneous rock, limestone, and sandstone, cannot be measured with reliable complex resistivity results. This problem therefore casts a shadow over both basic and applied research on SIP. We design a high precision measurement system by studying the measurement principle, the sample holder, and the measurement instrument. We design input buffers on a single board. We adopt the operational amplifier AD549 in this system because of its ultra-high input impedance and ultra-low current noise; this buffer performs well in acquiring the potential signal across a high-impedance sample. By analyzing the sources of measurement error and the errors generated by the measurement system, we propose a correction method to remove the error in order to achieve high quality complex impedance measurements for rock and ore samples. This measurement system extends the measurement range of the complex impedance to 0.1 Ω ~ 10 GΩ with amplitude error less than 0.1% and phase error less than 0.1 mrad over the frequency range 0.01 Hz ~ 1 kHz. We tested our system on resistors with resistances of 0.1 Ω ~ 10 GΩ in the frequency range 1 Hz ~ 1000 Hz, and the measurement error is less than 0.1 mrad. We also compared the results with an LCR bridge and SCIP, and found that the bridge's measuring range only reaches 100 MΩ, SCIP's measuring range

  15. One Single Static Measurement Predicts Wave Localization in Complex Structures

    Science.gov (United States)

    Lefebvre, Gautier; Gondel, Alexane; Dubois, Marc; Atlan, Michael; Feppon, Florian; Labbé, Aimé; Gillot, Camille; Garelli, Alix; Ernoult, Maxence; Mayboroda, Svitlana; Filoche, Marcel; Sebbah, Patrick

    2016-08-01

    A recent theoretical breakthrough has brought a new tool, called the localization landscape, for predicting the localization regions of vibration modes in complex or disordered systems. Here, we report on the first experiment which measures the localization landscape and demonstrates its predictive power. Holographic measurement of the static deformation under uniform load of a thin plate with complex geometry provides direct access to the landscape function. When put in vibration, this system shows modes precisely confined within the subregions delineated by the landscape function. Also the maxima of this function match the measured eigenfrequencies, while the minima of the valley network give the frequencies at which modes become extended. This approach fully characterizes the low frequency spectrum of a complex structure from a single static measurement. It paves the way for controlling and engineering eigenmodes in any vibratory system, especially where a structural or microscopic description is not accessible.

  16. Network Decomposition and Complexity Measures: An Information Geometrical Approach

    Directory of Open Access Journals (Sweden)

    Masatoshi Funabashi

    2014-07-01

    Full Text Available We consider the graph representation of the stochastic model with n binary variables, and develop an information theoretical framework to measure the degree of statistical association existing between subsystems as well as that represented by each edge of the graph representation. In addition, we consider novel measures of complexity with respect to system decomposability, introduced via the geometric product of Kullback–Leibler (KL) divergences. The novel complexity measures satisfy the boundary condition of vanishing in the limits of completely random and completely ordered states, and also in the presence of an independent subsystem of any size. Such complexity measures based on geometric means are relevant to the heterogeneity of dependencies between subsystems, and the amount of information propagation shared entirely in the system.

  17. Measuring logic complexity can guide pattern discovery in empirical systems

    CERN Document Server

    Gherardi, Marco

    2016-01-01

    We explore a definition of complexity based on logic functions, which are widely used as compact descriptions of rules in diverse fields of contemporary science. Detailed numerical analysis shows that (i) logic complexity is effective in discriminating between classes of functions commonly employed in modelling contexts; (ii) it extends the notion of canalisation, used in the study of genetic regulation, to a more general and detailed measure; (iii) it is tightly linked to the resilience of a function's output to noise affecting its inputs. We demonstrate its utility by measuring it in empirical data on gene regulation, digital circuitry, and propositional calculus. Logic complexity is exceptionally low in these systems. The asymmetry between "on" and "off" states in the data correlates with the complexity in a non-null way; a model of random Boolean networks clarifies this trend and indicates a common hierarchical architecture in the three systems.

  18. Riemannian-geometric entropy for measuring network complexity

    Science.gov (United States)

    Franzosi, Roberto; Felice, Domenico; Mancini, Stefano; Pettini, Marco

    2016-06-01

    A central issue in the science of complex systems is the quantitative characterization of complexity. In the present work we address this issue by resorting to information geometry. Specifically, we propose a constructive way to associate with a network (in principle, any network) a differentiable object (a Riemannian manifold) whose volume is used to define the entropy. The effectiveness of the latter in measuring network complexity is successfully proved through its capability of detecting a classical phase transition occurring in both random graphs and scale-free networks, as well as of characterizing small exponential random graphs, configuration models, and real networks.

  19. A Measure of Learning Model Complexity by VC Dimension

    Institute of Scientific and Technical Information of China (English)

    WANG Wen-jian; ZHANG Li-xia; XU Zong-ben

    2002-01-01

    When developing models there is always a trade-off between model complexity and model fit. In this paper, a measure of learning model complexity based on VC dimension is presented, and some relevant mathematical theory surrounding the derivation and use of this metric is summarized. The measure allows modelers to control the amount of error that is returned from a modeling system and to state upper bounds on the amount of error that the modeling system will return on all future, as yet unseen and uncollected data sets. It is possible for modelers to use the VC theory to determine which type of model more accurately represents a system.

  20. A new complexity measure for time series analysis and classification

    Science.gov (United States)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature like Shannon entropy, Relative entropy, Lempel-Ziv, Kolmogorov and Algorithmic complexity are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure ETC and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with Lyapunov exponent than Shannon entropy even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
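
ETC can be sketched directly from its definition: count the NSRPS passes needed to reduce the sequence to a constant one, where each pass replaces the most frequent adjacent pair with a new symbol. This is a simplified implementation (tie-breaking among equally frequent pairs is arbitrary here, and the published algorithm may differ in details):

```python
def effort_to_compress(seq):
    """ETC: number of NSRPS iterations needed to transform seq into a
    constant (or single-symbol) sequence."""
    seq = list(seq)
    new_sym = 0
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        # count adjacent pairs and pick the most frequent one
        counts = {}
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] = counts.get((a, b), 0) + 1
        target = max(counts, key=counts.get)
        # replace non-overlapping occurrences, scanning left to right
        sym = ("#", new_sym)   # fresh symbol not in the input alphabet
        new_sym += 1
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == target:
                out.append(sym)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps
```

A constant sequence like "aaaa" needs 0 passes, "abab" collapses in 1 pass (replace "ab" twice), and "aab" needs 2.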

  1. On bias of kinetic temperature measurements in complex plasmas

    DEFF Research Database (Denmark)

    Kantor, M.; Moseev, D.; Salewski, Mirko

    2014-01-01

    The kinetic temperature in complex plasmas is often measured using particle tracking velocimetry. Here, we introduce a criterion which minimizes the probability of faulty tracking of particles with normally distributed random displacements in consecutive frames. Faulty particle tracking results in a measurement bias of the deduced velocity distribution function and hence the deduced kinetic temperature. For particles with a normal velocity distribution function, mistracking biases the obtained velocity distribution function towards small velocities at the expense of large velocities, i...

  2. The Generalization Complexity Measure for Continuous Input Data

    Directory of Open Access Journals (Sweden)

    Iván Gómez

    2014-01-01

    The Generalization Complexity measure, originally defined in Boolean space, quantifies the complexity of data in relationship to the prediction accuracy that can be expected when using a supervised classifier like a neural network, SVM, and so forth. We first extend the original measure for use with continuous functions and then, using an approach based on the set of Walsh functions, consider the case of having a finite number of data points (input/output pairs), that is, usually the practical case. Using a set of trigonometric functions, a model that gives a relationship between the size of the hidden layer of a neural network and the complexity is constructed. Finally, we demonstrate the application of the introduced complexity measure, by using the generated model, to the problem of estimating an adequate neural network architecture for real-world data sets.

  3. A SHARC based ROB Complex : design and measurement results

    CERN Document Server

    Boterenbrood, H; Kieft, G; Scholte, R; Slopsema, R; Vermeulen, J C

    2000-01-01

    ROB hardware, based on and exploiting the properties of the SHARC DSP and of FPGAs, and the associated software are described. Results from performance measurements and an analysis of the results for a single ROBIn as well as for a ROB Complex with up to 4 ROBIns are presented.

  4. Assessment of Complex Performances: Limitations of Key Measurement Assumptions.

    Science.gov (United States)

    Delandshere, Ginette; Petrosky, Anthony R.

    1998-01-01

    Examines measurement concepts and assumptions traditionally used in educational assessment, using the Early Adolescence/English Language Arts assessment developed for the National Board for Professional Teaching Standards as a context. The use of numerical ratings in complex performance assessment is questioned. (SLD)

  5. Measures for track complexity and robustness of operation at stations

    DEFF Research Database (Denmark)

    Landex, Alex; Jensen, Lars Wittrup

    2013-01-01

    Stations are often limiting the capacity of a railway network. However, most capacity analysis methods focus on open line capacity. This paper presents methods to analyse and describe stations by the use of complexity and robustness measures at stations. Five methods to analyse infrastructure…

  6. Measuring the Complexity of Self-organizing Traffic Lights

    CERN Document Server

    Zubillaga, Dario; Aguilar, Luis Daniel; Zapotecatl, Jorge; Fernandez, Nelson; Aguilar, Jose; Rosenblueth, David A; Gershenson, Carlos

    2014-01-01

    We apply measures of complexity, emergence and self-organization to an abstract city traffic model for comparing a traditional traffic coordination method with a self-organizing method in two scenarios: cyclic boundaries and non-orientable boundaries. We show that the measures are useful to identify and characterize different dynamical phases. It becomes clear that different operation regimes are required for different traffic demands. Thus, not only is traffic a non-stationary problem that requires controllers to adapt constantly; controllers must also change the complexity of their behavior drastically depending on the demand. Based on our measures, we can say that the self-organizing method achieves an adaptability level comparable to that of a living system.

  7. Self-dissimilarity as a High Dimensional Complexity Measure

    Science.gov (United States)

    Wolpert, David H.; Macready, William

    2005-01-01

    For many systems characterized as "complex", the patterns exhibited on different scales differ markedly from one another. For example, the biomass distribution in a human body "looks very different" depending on the scale at which one examines it. Conversely, the patterns at different scales in "simple" systems (e.g., gases, mountains, crystals) vary little from one scale to another. Accordingly, the degrees of self-dissimilarity between the patterns of a system at various scales constitute a complexity "signature" of that system. Here we present a novel quantification of self-dissimilarity. This signature can, if desired, incorporate a novel information-theoretic measure of the distance between probability distributions that we derive here. Whatever distance measure is chosen, our quantification of self-dissimilarity can be measured for many kinds of real-world data. This allows comparisons of the complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital computer vs. species densities in a rain forest vs. capital density in an economy). Moreover, in contrast to many other suggested complexity measures, evaluating the self-dissimilarity of a system does not require one to already have a model of the system. These facts may allow self-dissimilarity signatures to be used as the underlying observational variables of an eventual overarching theory relating all complex systems. To illustrate self-dissimilarity, we present several numerical experiments. In particular, we show that the underlying structure of the logistic map is picked out by the self-dissimilarity signature of time series produced by that map.

  8. Complexity-Entropy Causality Plane as a Complexity Measure for Two-dimensional Patterns

    CERN Document Server

    Ribeiro, H V; Lenzi, E K; Santoro, P A; Mendes, R S; 10.1371/journal.pone.0040689

    2012-01-01

    Complexity measures are essential to understand complex systems and there are numerous definitions to analyze one-dimensional data. However, extensions of these approaches to two or higher-dimensional data, such as images, are much less common. Here, we reduce this gap by applying the ideas of the permutation entropy combined with a relative entropic index. We build up a numerical procedure that can be easily implemented to evaluate the complexity of two or higher-dimensional patterns. We work out this method in different scenarios where numerical experiments and empirical data were taken into account. Specifically, we have applied the method to i) fractal landscapes generated numerically where we compare our measures with the Hurst exponent; ii) liquid crystal textures where nematic-isotropic-nematic phase transitions were properly identified; iii) 12 characteristic textures of liquid crystals where the different values show that the method can distinguish different phases; iv) and Ising surfaces where our m...

  9. A Method for Measuring the Structure Complexity of Web Application

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Precise and effective measurement of Web applications not only facilitates good comprehension of them, but also benefits the macro-management of software activities, such as testing, reverse engineering and reuse. This paper presents research on measuring the structure complexity of Web applications. Through a deep analysis of the configuration and object interactions of a Web system, two conclusions are drawn: ① a generic Web application consists of static Web pages, dynamic pages, components and database objects; ② the main interactions have only three styles, namely static links, dynamic links and call/return relations. Based on analysis and modeling of the content of a Web page (static or dynamic), complexity measures for both the control logic of scripts and the nesting of HTML code are further discussed. In addition, two methods for measuring the complexity of inter-page navigation are also addressed by modeling the inter-page navigation behavior of a Web application via a WNG graph.

  10. Applications of fidelity measures to complex quantum systems.

    Science.gov (United States)

    Wimberger, Sandro

    2016-06-13

    We revisit fidelity as a measure for the stability and the complexity of the quantum motion of single- and many-body systems. Within the context of cold atoms, we present an overview of applications of two fidelities, which we call static and dynamical fidelity, respectively. The static fidelity applies to quantum problems which can be diagonalized since it is defined via the eigenfunctions. In particular, we show that the static fidelity is a highly effective practical detector of avoided crossings characterizing the complexity of the systems and their evolutions. The dynamical fidelity is defined via the time-dependent wave functions. Focusing on the quantum kicked rotor system, we highlight a few practical applications of fidelity measurements in order to better understand the large variety of dynamical regimes of this paradigm of a low-dimensional system with mixed regular-chaotic phase space. PMID:27140967

  11. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-02-28

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.

  12. Measuring system complexity to support development cost estimates

    Science.gov (United States)

    Malone, P.; Wolfarth, L.

    Systems and System-of-Systems (SoS) are being used more frequently, either as design elements of stand-alone systems or as architectural frameworks. Consequently, a programmatic need has arisen to understand and measure system complexity in order to estimate development plans and life-cycle costs more accurately. In a prior paper, we introduced the System Readiness Level (SRL) concept as a composite function of both Technology Readiness Levels (TRLs) and Integration Readiness Levels (IRLs) and touched on system complexity. While the SRL approach provides a repeatable, process-driven method to assess the maturity of a system or SoS, it does not capture all aspects of system complexity. In this paper we assess the concept of cyclomatic complexity as a system complexity metric and consider its utility as an approach for estimating the life-cycle costs and cost growth of complex systems. We hypothesize that the greater the number of technologies and integration tasks, the more complex the system and the higher its cost to develop and maintain. We base our analysis on historical data from DoD programs that have experienced significant cost growth, including some that have been cancelled due to unsustainable cost (and schedule) growth. We begin by describing the original implementation of the cyclomatic method, which was developed to estimate the effort to maintain system software. We then describe how the method can be generalized and applied to systems. Next, we show how to estimate the cyclomatic number (CN) and show the statistical significance between a system's CN metric and its cost. We illustrate the method with an example. Last, we discuss opportunities for future research.
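The cyclomatic number the abstract refers to is McCabe's classical graph metric, CN = E - N + 2P (edges, nodes, connected components). The sketch below computes it for a hypothetical system graph in which nodes are technologies and edges are integration tasks; the example system itself is invented for illustration, not taken from the paper's DoD data.

```python
# Hedged sketch of the cyclomatic number CN = E - N + 2P, applied to a
# system integration graph (technologies as nodes, integration tasks as
# edges). The toy system below is hypothetical.

def cyclomatic_number(num_nodes, edges, num_components=1):
    """CN = E - N + 2P for a graph with E edges, N nodes, P connected parts."""
    return len(edges) - num_nodes + 2 * num_components

# A toy system of 5 technologies with 6 integration links:
links = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4)]
print(cyclomatic_number(5, links))  # E - N + 2P = 6 - 5 + 2 = 3
```

Under the paper's hypothesis, a higher CN (more integration tasks relative to technologies) would correlate with higher development and maintenance cost.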

  13. Measuring MAP kinase activity in immune complex assays.

    Science.gov (United States)

    Cherkasova, Vera A

    2006-11-01

    I present an overview of published methods for measuring mitogen-activated protein (MAP) kinase activity on endogenous associated substrates and exogenously added substrates, as well as the determination of activation loop phosphorylation as a read-out of kinase activity in vivo. Detailed procedures for these assays are given for two MAP kinases (MAPKs), Fus3 and Kss1, and compared with other published protocols, including those for the Hog1 and Mpk1 MAPKs. Measuring kinase activity in immune complex assays can serve as an approach for identifying potential substrates of protein kinases as well as for detecting other kinase-associated proteins. PMID:16890454

  14. Measurements of complex refractive indices of photoactive yellow protein

    CERN Document Server

    Lee, KyeoReh; Jung, JaeHwang; Ihee, Hyotcherl; Park, YongKeun

    2015-01-01

    A novel optical technique for measuring the complex refractive index (CRI) of photoactive proteins over a wide range of visible wavelengths is presented. Employing quantitative phase microscopy equipped with a wavelength-swept source, optical fields transmitted through a solution of photoactive proteins were precisely measured, from which the CRIs of the photoactive proteins were retrieved with the Fourier light scattering technique. Using the present method, both the real and imaginary RIs of a photoactive yellow protein (PYP) solution were precisely measured over a broad wavelength range (461-582 nm). The internal populations of the ground and excited states were switched by blue light excitation (445 nm center wavelength), and the broadband refractive index increments of each state were measured. The significant CRI deviation between the presence and absence of the blue excitation was quantified and explained based on the Kramers-Kronig relations.

  15. Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

    CERN Document Server

    Mihailovic, Dragutin T; Nikolic-Djoric, Emilija; Arsenic, Ilija

    2013-01-01

    We propose novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We consider the background of the Kolmogorov complexity and discuss the meaning of the physical as well as other complexities. To gain better insight into the complexity of complex systems and time series analysis, we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of river flow, long-term precipitation, indoor 222Rn concentration and UV radiation dose) and the economy (stock price dynamics). Re…

  16. OPEN PUBLIC SPACE ATTRIBUTES AND CATEGORIES – COMPLEXITY AND MEASURABILITY

    Directory of Open Access Journals (Sweden)

    Ljiljana Čavić

    2014-12-01

    Full Text Available Within the field of architectural and urban research, this work addresses the complexity of contemporary public space, both in a conceptual and a concrete sense. It aims at systematizing spatial attributes and their categories and discussing spatial complexity and measurability, in order to reach a more comprehensive understanding, description and analysis of public space. Our aim is to improve the everyday usage of open public space, and we acknowledge users as its crucial factor. There are numerous investigations of the complex urban and architectural reality of public space that recognise the importance of users. However, we did not find any that would holistically account for what users find essential in public space. Based on the incompleteness of existing approaches to open public space and the importance of users for its success, this paper proposes a user-orientated approach. Through an initial survey directed at users, we collected the most important aspects of public spaces as contemporary humans see them. The gathered data are analysed and coded into spatial attributes, from which their role in the complexity of open public space and their measurability are discussed. The work results in an inventory of attributes that users find salient in public spaces. It does not discuss their qualitative values or their contribution to generating spatial realities; it aims to define them clearly so that any further logical argumentation on open space concerning users may be solidly constructed. Finally, through a categorisation of the attributes, it proposes the disciplinary levels necessary for the analysis of complex urban-architectural reality.

  17. Compositional segmentation and complexity measurement in stock indices

    Science.gov (United States)

    Wang, Haifeng; Shang, Pengjian; Xia, Jianan

    2016-01-01

    In this paper, we introduce a complexity measure based on entropic segmentation, called sequence compositional complexity (SCC), into the analysis of financial time series. SCC was first used to deal directly with the complex heterogeneity in nonstationary DNA sequences, where it was found to be higher in sequences with strong long-range correlation than in those with weak long-range correlation. Applying this method to financial index data, we find that the SCC values of some mature stock indices, such as the S&P 500 (abbreviated S&P in the following) and the HSI, tend to be lower than the SCC value of Chinese index data (such as the SSE). Moreover, if we classify the indices by SCC, the financial market of Hong Kong has more similarities with mature foreign markets than with Chinese ones. We therefore believe that a good correspondence exists between the SCC of an index sequence and the complexity of the market involved.

  18. Aluminium speciation in streams and lakes of the UK Acid Waters Monitoring Network, modelled with WHAM.

    Science.gov (United States)

    Tipping, E; Carter, H T

    2011-03-15

    The Windermere Humic Aqueous Model (WHAM) incorporating Humic Ion-Binding Model VI was applied to analytical data from the United Kingdom Acid Waters Monitoring Network, collected for 22 streams and lakes over the period 1988-2007, to calculate the chemical speciation of monomeric aluminium (Al(mon)) in 3087 water samples. Model outputs were compared with analytical measurements of labile and non-labile Al(mon) concentrations, the former being equated with inorganic forms of Al(mon) and the latter with organically-complexed metal. Raw analytical data were used, and also data produced by applying a correction for the possible dissociation of organically-complexed Al(mon), and therefore its underestimation, during passage through the analytical cation-exchange column. Model calibration was performed by finding the conversion factor, F(FADOC), between the concentration of isolated fulvic acid, with default ion-binding properties, required by the model, and the measured concentration of dissolved organic carbon, [DOC]. For both uncorrected and corrected data, the value of F(FADOC) for streams was greater than for lakes, indicating greater binding activity towards aluminium. Model fits were better using uncorrected analytical data, but the values of F(FADOC) obtained from corrected data agreed more closely with previous estimates. The model provided reasonably good explanations of differences in aluminium speciation between sampling sites, and of temporal variations at individual sites. With total monomeric concentration as input, WHAM calculations might substitute for analytical speciation measurements, or aid analytical quality control. Calculated Al(3+) activities, a(Al3+), showed a pH-dependence similar to that previously found for other surface waters, and the modelling exercise identified differences between waters of up to two orders of magnitude in the value of a(Al3+) at a given pH. The model gives the net charge of dissolved organic matter, which is calculated…

  19. Atmospheric stability and complex terrain: comparing measurements and CFD

    DEFF Research Database (Denmark)

    Koblitz, Tilman; Bechmann, Andreas; Berg, Jacob;

    2014-01-01

    For wind resource assessment, the wind industry is increasingly relying on Computational Fluid Dynamics models that focus on modeling the airflow in a neutrally stratified surface layer. So far, physical processes that are specific to the atmospheric boundary layer, for example the Coriolis force, … non-neutral atmospheric flow over complex terrain including physical processes like stability and the Coriolis force. We examine the influence of these effects on the whole atmospheric boundary layer using the DTU Wind Energy flow solver EllipSys3D. To validate the flow solver, measurements from Benakanahalli hill, a field experiment that took place in India in early 2010, are used. The experiment was specifically designed to address the combined effects of stability and Coriolis force over complex terrain, and provides a dataset to validate flow solvers. Including those effects into EllipSys3D significantly improves…

  20. Increment entropy as a measure of complexity for time series

    CERN Document Server

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series in which each increment is mapped into a word of two letters, one letter corresponding to the direction and the other corresponding to the magnitude. The Shannon entropy of the words is termed increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals have demonstrated its ability to detect abrupt changes, whether energetic (e.g. spikes or bursts) or structural. The computation of IncrEn does not make any assumption about the time series, and it is applicable to arbitrary real-world data.

  1. Overcoming Problems in the Measurement of Biological Complexity

    CERN Document Server

    Cebrian, Manuel; Ortega, Alfonso

    2010-01-01

    In a genetic algorithm, fluctuations of the entropy of a genome over time are interpreted as fluctuations of the information that the genome's organism is storing about its environment, which is reflected in more complex organisms. The computation of this entropy presents technical problems due to the small population sizes used in practice. In this work, we propose and test an alternative way of measuring the entropy variation in a population by means of algorithmic information theory, where the entropy variation between two generational steps is the Kolmogorov complexity of the first step conditioned on the second one. As an example application of this technique, we report experimental differences in entropy evolution between systems in which sexual reproduction is present or absent.

  2. Increment Entropy as a Measure of Complexity for Time Series

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    2016-01-01

    Full Text Available Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce an increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign and the other corresponding to the magnitude. Increment entropy (IncrEn) is defined as the Shannon entropy of the words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn does not make any assumption about the time series, and it is applicable to arbitrary real-world data.
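The construction in the abstract can be sketched compactly: each increment becomes a two-letter word (sign, quantized magnitude), and IncrEn is the Shannon entropy of the word distribution. This is a simplified illustration only; the paper's exact quantization rule and any word length over multiple increments are not given in the abstract, so the resolution parameter `R` and the standard-deviation scaling below are assumptions.

```python
# Simplified sketch of increment entropy (IncrEn): words are
# (sign letter, quantized magnitude letter) per increment, and IncrEn is
# the Shannon entropy of the empirical word distribution. The quantization
# scheme (resolution R, std-based scaling) is an assumption, not the
# authors' exact definition.
import math
from collections import Counter

def increment_entropy(x, R=4):
    d = [b - a for a, b in zip(x, x[1:])]
    sd = (sum(v * v for v in d) / len(d)) ** 0.5 or 1.0  # guard against 0
    words = [(1 if v > 0 else -1 if v < 0 else 0,        # sign letter
              min(R, int(abs(v) * R / sd)))              # magnitude letter
             for v in d]
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A constant series yields a single word and hence zero entropy, while a strictly alternating series produces two equally likely words and an IncrEn of 1 bit.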

  3. A new measure of heterogeneity for complex networks

    CERN Document Server

    Jacob, Rinku; Misra, R; Ambika, G

    2016-01-01

    We propose a novel measure of heterogeneity for unweighted and undirected complex networks that can be derived from the degree distribution of the network instead of the degree sequences, as is done at present. We show that the proposed measure can be applied to all types of topology with ease and shows direct correlation with the diversity of node degrees in the network. The measure is mathematically well behaved and is normalised in the interval [0, 1]. The measure is applied to compute the heterogeneity of synthetic (both random and scale free) and real world networks. We specifically show that the heterogeneity of an evolving scale free network decreases as a power law with the size of the network N, implying a scale free character for the proposed measure. Finally, as a specific application, we show that the proposed measure can be used to compare the heterogeneity of recurrence networks constructed from the time series of several low dimensional chaotic attractors, thereby providing a single index to co...

  4. Fractal and complexity measures of heart rate variability.

    Science.gov (United States)

    Perkiömäki, Juha S; Mäkikallio, Timo H; Huikuri, Heikki V

    2005-01-01

    Heart rate variability has been analyzed conventionally with time and frequency domain methods, which measure the overall magnitude of RR interval fluctuations around its mean value or the magnitude of fluctuations in some predetermined frequencies. Analysis of heart rate dynamics by methods based on chaos theory and nonlinear system theory has gained recent interest. This interest is based on observations suggesting that the mechanisms involved in cardiovascular regulation likely interact with each other in a nonlinear way. Furthermore, recent observational studies suggest that some indexes describing nonlinear heart rate dynamics, such as fractal scaling exponents, may provide more powerful prognostic information than the traditional heart rate variability indexes. In particular, the short-term fractal scaling exponent measured by the detrended fluctuation analysis method has predicted fatal cardiovascular events in various populations. Approximate entropy, a nonlinear index of heart rate dynamics that describes the complexity of RR interval behavior, has provided information on the vulnerability to atrial fibrillation. Many other nonlinear indexes, e.g., the Lyapunov exponent and correlation dimensions, also give information on the characteristics of heart rate dynamics, but their clinical utility is not well established. Although the concepts of chaos theory, fractal mathematics, and complexity measures of heart rate behavior in relation to cardiovascular physiology or various cardiovascular events are still far from clinical medicine, they are a fruitful area for future research to expand our knowledge concerning the behavior of cardiovascular oscillations in normal healthy conditions as well as in disease states.

  5. Millimeter wave complex dielectric permittivity and complex magnetic permeability measurements of absorbing materials

    Science.gov (United States)

    Tkachov, Igor Ivanovich

    2000-09-01

    This dissertation presents new methods for the characterization of materials in the millimeter wave range. Historically, this has been the most difficult part of the electromagnetic spectrum for accurate measurements of material properties. New instrumentation has now been developed for operation in this frequency band. The new techniques developed in the course of this work allow precise measurement of dielectric properties as well as the separation of magnetic properties from dielectric ones in the millimeter wave range. A new quasi-optical spectrometer with a waveguide reference channel has been designed and built for the precision measurement of the real part of the dielectric permittivity of medium and highly absorbing materials over an extended W-band frequency range (70-118 GHz). A new method of phase measurement with this unique unbalanced quasi-optical waveguide bridge spectrometer has been developed. The phase of the electromagnetic wave transmitted through the specimen can be measured accurately, leading to the determination of the real part of the complex dielectric permittivity of moderately and highly absorbing dielectric materials with high precision. A simple quasi-optical transmission configuration of the spectrometer, a single free-space channel, provides transmittance data with high resolution, from which the spectra of the imaginary part of the dielectric permittivity of materials are evaluated accurately. A backward wave oscillator (BWO) is used as the source of tunable coherent radiation for the spectrometer. The high output power of the BWO and the high sensitivity of the receiver system, which employs a specially constructed liquid-helium-cooled InSb detector, enable adequate sensitivity in transmission for highly absorbing materials. A systematic study of the dielectric and magnetic properties of various materials has been performed with the quasi-optical free-space method in the millimeter wave range from 34 GHz to 117 GHz for the first time. Specific results…

  6. Automated imitating-measuring complex for designing and measuring characteristics of phased antenna arrays

    OpenAIRE

    Usin, V.; Markov, V.; Pomazanov, S.; Usina, A.; Filonenko, A.

    2011-01-01

    This article considers the design principles, structure and technical characteristics of an automated imitating-measuring complex, and describes variants of its hardware and software implementation for selecting the APD, justifying tolerances, and estimating the influence of manufacturing errors, the discrete nature of control, and the mutual influence of radiating elements on PAA parameters.

  7. Constructing robust phylogenetic trees for Galanthus L. using the relative complexity measure

    Directory of Open Access Journals (Sweden)

    Bakış Yasin

    2013-01-01

    Full Text Available Abstract Background Most phylogeny analysis methods based on molecular sequences use multiple alignment, where the quality of the alignment, which depends on the alignment parameters, determines the accuracy of the resulting trees. Different parameter combinations chosen for the multiple alignment may result in different phylogenies. A new non-alignment-based approach, the Relative Complexity Measure (RCM), has been introduced to tackle this problem and proven to work in fungi and mitochondrial DNA. Results In this work, we present an application of the RCM method to reconstruct robust phylogenetic trees using sequence data for the genus Galanthus obtained from different regions in Turkey. Phylogenies have been analyzed using nuclear and chloroplast DNA sequences. Results showed that the tree obtained from nuclear ribosomal RNA gene sequences was more robust, while the tree obtained from the chloroplast DNA showed a higher degree of variation. Conclusions Phylogenies generated by the Relative Complexity Measure were found to be robust, and the results of RCM were more reliable than those of the compared techniques. In particular, to overcome MSA-based problems, RCM seems to be a reasonable way and a good alternative to MSA-based phylogenetic analysis. We believe our method will become a mainstream phylogeny construction method, especially for highly variable sequence families where the accuracy of the MSA heavily depends on the alignment parameters.

  8. Digraph Complexity Measures and Applications in Formal Language Theory

    CERN Document Server

    Gruber, Hermann

    2011-01-01

    We investigate structural complexity measures on digraphs, in particular the cycle rank. This concept is intimately related to a classical topic in formal language theory, namely the star height of regular languages. We explore this connection, and obtain several new algorithmic insights regarding both cycle rank and star height. Among other results, we show that computing the cycle rank is NP-complete, even for sparse digraphs of maximum outdegree 2. Notwithstanding, we provide both a polynomial-time approximation algorithm and an exponential-time exact algorithm for this problem. The former algorithm yields an O((log n)^(3/2))- approximation in polynomial time, whereas the latter yields the optimum solution, and runs in time and space O*(1.9129^n) on digraphs of maximum outdegree at most two. Regarding the star height problem, we identify a subclass of the regular languages for which we can precisely determine the computational complexity of the star height problem. Namely, the star height problem for bidet...

  9. Measuring the complex behavior of the SO2 oxidation reaction

    Directory of Open Access Journals (Sweden)

    Muhammad Shahzad

    2015-09-01

    Full Text Available A two-step reversible chemical reaction involving five chemical species is investigated. The quasi-equilibrium manifold (QEM) and spectral quasi-equilibrium manifold (SQEM) are used as initial approximations to simplify the mechanisms, which we want to utilize in order to investigate the behavior of the desired species. They give a meaningful picture, but for maximum clarity, the method of invariant grid (MIG) is employed. These methods simplify the complex chemical kinetics and deduce a low-dimensional manifold (LDM) from the high-dimensional mechanism. The coverage of the species near the equilibrium point is investigated, and we then discuss movement along the equilibrium of the ODEs. The steady-state behavior is observed, and a Lyapunov function is utilized to study the stability of the ODEs. Graphical results are used to describe the physical aspects of the measurements.

  10. Measurement of complex supercontinuum light pulses using time domain ptychography

    CERN Document Server

    Heidt, Alexander M; Brügmann, Michael; Rohwer, Erich G; Feurer, Thomas

    2016-01-01

    We demonstrate that time-domain ptychography, a recently introduced ultrafast pulse reconstruction modality, has properties ideally suited for the temporal characterization of complex light pulses with large time-bandwidth products as it achieves temporal resolution on the scale of a single optical cycle using long probe pulses, low sampling rates, and an extremely fast and robust algorithm. In comparison to existing techniques, ptychography minimizes the data to be recorded and processed, and drastically reduces the computational time of the reconstruction. Experimentally we measure the temporal waveform of an octave-spanning, 3.5 ps long supercontinuum pulse generated in photonic crystal fiber, resolving features as short as 5.7 fs with sub-fs resolution and 30 dB dynamic range using 100 fs probe pulses and similarly large delay steps.

  11. Measuring robustness of community structure in complex networks

    CERN Document Server

    Li, Hui-Jia; Chen, Luonan

    2015-01-01

    The theory of community structure is a powerful tool for real networks, which can simplify their topological and functional analysis considerably. However, since community detection methods have random factors and real social networks obtained from complex systems always contain error edges, evaluating the robustness of community structure is an urgent and important task. In this letter, we employ the critical threshold of the resolution parameter in the Hamiltonian function, $\gamma_C$, to measure the robustness of a network. According to spectral theory, a rigorous proof shows that the proposed index is inversely proportional to the robustness of community structure. Furthermore, by utilizing the co-evolution model, we provide a new, efficient method for computing the value of $\gamma_C$. The research can be applied to broad clustering problems in network analysis and data mining due to its solid mathematical basis and experimental effectiveness.

  12. Analyzing complex networks through correlations in centrality measurements

    Science.gov (United States)

    Furlan Ronqui, José Ricardo; Travieso, Gonzalo

    2015-05-01

    Many real-world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real-world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example, we also show that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network.
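A minimal sketch of the centrality correlation profile idea in pure Python, using just one centrality pair (degree vs. closeness) on a toy star graph; the profile described above spans the correlation coefficients of all centrality pairs of interest.

```python
from collections import deque
from math import sqrt

def bfs_distances(adj, s):
    """Hop distances from s in an unweighted graph (adjacency dict)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

def centralities(adj):
    """Degree and closeness centrality; assumes a connected graph."""
    n = len(adj)
    degree = {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}
    closeness = {}
    for v in adj:
        dist = bfs_distances(adj, v)
        closeness[v] = (n - 1) / sum(dist.values())
    return degree, closeness

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# star graph: centre 0 connected to leaves 1..4
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
deg, clo = centralities(adj)
nodes = sorted(adj)
profile = pearson([deg[v] for v in nodes], [clo[v] for v in nodes])
```

On the star graph both centralities take only two values (centre vs. leaves), so this entry of the profile is exactly 1; on richer graphs the pairwise coefficients differ and together characterize the network.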

  13. Permutation Complexity and Coupling Measures in Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Taichi Haruna

    2013-09-01

    Full Text Available Recently, the duality between values (words) and orderings (permutations) has been proposed by the authors as a basis to discuss the relationship between information-theoretic measures for finite-alphabet stationary stochastic processes and their permutation analogues. It has been used to give a simple proof of the equality between the entropy rate and the permutation entropy rate for any finite-alphabet stationary stochastic process, and to show some results on the excess entropy and the transfer entropy for finite-alphabet stationary ergodic Markov processes. In this paper, we extend our previous results to hidden Markov models and show the equalities between various information-theoretic complexity and coupling measures and their permutation analogues. In particular, we show the following two results within the realm of hidden Markov models with ergodic internal processes: the two permutation analogues of the transfer entropy, the symbolic transfer entropy and the transfer entropy on rank vectors, are both equivalent to the transfer entropy if they are considered as rates, and the directed information theory can be captured by the permutation entropy approach.
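The paper concerns entropy rates of stationary processes; the basic finite-sample building block behind all permutation analogues, the Bandt-Pompe ordinal-pattern (permutation) entropy, can be sketched as follows.

```python
from collections import Counter
from math import log2

def permutation_entropy(series, order=2):
    """Shannon entropy (bits) of the ordinal-pattern distribution (Bandt-Pompe).

    Each length-`order` window is mapped to the permutation that sorts it.
    """
    windows = zip(*(series[i:] for i in range(order)))
    patterns = Counter(
        tuple(sorted(range(order), key=window.__getitem__))
        for window in windows
    )
    total = sum(patterns.values())
    return -sum((c / total) * log2(c / total) for c in patterns.values())

# classic Bandt-Pompe example: 4 ascending pairs and 2 descending pairs,
# giving the binary entropy of 1/3 (about 0.918 bits)
h = permutation_entropy([4, 7, 9, 10, 6, 11, 3], order=2)
```

A strictly monotone series produces a single pattern and hence zero permutation entropy.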

  14. A high accuracy broadband measurement system for time resolved complex bioimpedance measurements

    International Nuclear Information System (INIS)

    Bioimpedance measurements are useful tools in biomedical engineering and life science. Bioimpedance is the electrical impedance of living tissue and can be used in the analysis of various physiological parameters. Bioimpedance is commonly measured by injecting a small, well-known alternating current via surface electrodes into an object under test and measuring the resultant surface voltages. It is non-invasive, painless and has no known hazards. This work presents a field programmable gate array (FPGA) based high accuracy broadband bioimpedance measurement system for time resolved bioimpedance measurements. The system is able to measure the magnitude and phase of complex impedances under test in a frequency range of about 10–500 kHz with excitation currents from 10 µA to 5 mA. The overall measurement uncertainties stay below 1% for the impedance magnitude and below 0.5° for the phase in most measurement ranges. Furthermore, the described system has a sample rate of up to 3840 impedance spectra per second. The performance of the bioimpedance measurement system is demonstrated with a resistor based system calibration and with measurements on biological samples. (paper)

  15. Disassembling "evapotranspiration" in-situ with a complex measurement tool

    Science.gov (United States)

    Chormanski, Jaroslaw; Kleniewska, Malgorzata; Berezowski, Tomasz; Sporak-Wasilewska, Sylwia; Okruszko, Tomasz; Szatylowicz, Jan; Batelaan, Okke

    2014-05-01

    In this work we present a complex tool for measuring water fluxes in wetland ecosystems. The tool was designed to quantify processes related to interception storage on plant leaves. The measurements are conducted by combining readings from various instruments, including: an eddy covariance tower (EC), a field spectrometer, a SapFlow system, rain gauges above and under the canopy, soil moisture probes, and others. The idea of this set-up is to provide continuous measurement of the overall water flux from the ecosystem (EC tower), intercepted water volume and timing (field spectrometers), through-fall (rain gauges above and under the canopy), transpiration (SapFlow), and evaporation and soil moisture (soil moisture probes). Disassembling the water flux into the above components gives more insight into the interception-related processes and differentiates them from the total evapotranspiration. The measurements are conducted in the Upper Biebrza Basin (NE Poland). The study area is part of the valley, is covered by peat soils (mainly peat moss, with the exception of areas near the river) and receives no inundation waters from the Biebrza. The plant community of Agrostietum-Carici caninae has a dominant share here, creating an up to 0.6 km wide belt along the river. The area is also covered by Caricion lasiocarpae as well as the meadows and pastures Molinio-Arrhenatheretea and Phragmitetum communis. Sedges form a hummock pattern characteristic of sedge communities in natural river valleys with wetland vegetation. The main results of the measurement set-up will be the analyzed characteristics and dynamics of interception storage for sedge ecosystems and a methodology for interception monitoring by use of spectral reflectance techniques. This will give new insight into the processes of evapotranspiration in wetlands and its components: transpiration, evaporation from interception and evaporation from soil. Moreover, other important results of this project will be the estimation of energy and

  16. Range-limited centrality measures in complex networks

    Science.gov (United States)

    Ercsey-Ravasz, Mária; Lichtenwalter, Ryan N.; Chawla, Nitesh V.; Toroczkai, Zoltán

    2012-06-01

    Here we present a range-limited approach to centrality measures in both nonweighted and weighted directed complex networks. We introduce an efficient method that generates for every node and every edge its betweenness centrality based on shortest paths of lengths not longer than ℓ=1,...,L in the case of nonweighted networks, and for weighted networks the corresponding quantities based on minimum weight paths with path weights not larger than w_ℓ=ℓΔ, ℓ=1,2,...,L=R/Δ. These measures provide a systematic description of the positioning importance of a node (edge) with respect to its network neighborhoods: one step out, two steps out, etc., up to and including the whole network. They are more informative than traditional centrality measures, as network transport typically happens on all length scales, from transport to nearest neighbors to the farthest reaches of the network. We show that range-limited centralities obey universal scaling laws for large nonweighted networks. As the computation of traditional centrality measures is costly, this scaling behavior can be exploited to efficiently estimate centralities of nodes and edges for all ranges, including the traditional ones. The scaling behavior can also be exploited to show that the top list of nodes (edges) ranked by range-limited centrality quickly freezes as a function of the range, and hence the diameter-range top list can be efficiently predicted. We also show how to estimate the typical largest node-to-node distance for a network of N nodes, exploiting the aforementioned scaling behavior. These observations were made on model networks and on a large social network inferred from cell-phone trace logs (∼5.5×10^6 nodes and ∼2.7×10^7 edges). Finally, we apply these concepts to efficiently detect the vulnerability backbone of a network (defined as the smallest percolating cluster of the highest betweenness nodes and edges) and illustrate the importance of weight-based centrality measures in
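A minimal unweighted illustration of the idea (not the paper's efficient algorithm): Brandes' betweenness accumulation in which only source-target pairs within distance ℓ are credited.

```python
from collections import deque

def range_limited_betweenness(adj, ell):
    """Brandes' algorithm, crediting only source-target pairs with d(s,t) <= ell.

    `adj` is an undirected adjacency dict; returns unnormalized betweenness.
    """
    bc = {v: 0.0 for v in adj}
    for s in adj:
        dist = {s: 0}
        sigma = {v: 0 for v in adj}
        sigma[s] = 1
        preds = {v: [] for v in adj}
        order, q = [], deque([s])
        while q:                          # BFS with shortest-path counting
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            # a target contributes only if it lies within range ell
            counts = 1.0 if 0 < dist.get(w, ell + 1) <= ell else 0.0
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (counts + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# path graph a-b-c: b lies on the only a..c shortest path (length 2)
adj = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
```

With ℓ=2 the middle node of the path collects the full betweenness of the end-to-end pairs; with ℓ=1 only nearest-neighbor pairs count and its betweenness drops to zero.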

  17. Introducing a Space Complexity Measure for P Systems

    OpenAIRE

    Porreca, Antonio E.; Leporati, Alberto; Mauri, Giancarlo; Zandron, Claudio; Research Group on Natural Computing (Universidad de Sevilla) (Coordinador)

    2009-01-01

    We define space complexity classes in the framework of membrane computing, giving some initial results about their mutual relations and their connection with time complexity classes, and identifying some potentially interesting problems which require further research.

  18. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it at all. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, defined as the sum of the entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.

  19. Complex Squeezing and Force Measurement Beyond the Standard Quantum Limit

    CERN Document Server

    Buchmann, L F; Kohler, J; Spethmann, N; Stamper-Kurn, D M

    2016-01-01

    A continuous quantum field, such as a propagating beam of light, may be characterized by a squeezing spectrum that is inhomogeneous in frequency. We point out that homodyne detectors, which are commonly employed to detect quantum squeezing, are blind to squeezing spectra in which the correlation between amplitude and phase fluctuations is complex. We find theoretically that such complex squeezing is a component of ponderomotive squeezing of light through cavity optomechanics. We propose a detection scheme, called synodyne detection, which reveals complex squeezing and allows its use to improve force detection beyond the standard quantum limit.

  20. Measuring the complexity of viewers' television news interpretation: Differentiation

    OpenAIRE

    Schaap, G.J.; Konig, R.P.; Renckstorf, K.; Wester, F.P.J.

    2005-01-01

    If television news viewers are conceived as active audience members, their interpretations should be a crucial factor in the study of the ‘effects’ of television news. Here, viewers’ interpretations are understood as subjective (re)constructions of a news item. In a previous contribution, we argued that interpretations can vary both within and between viewers in regard to the level of complexity. Complexity is the degree to which interpretations are a) differentiated, and b) integrated. In th...

  1. Power Quality Measurement in a Modern Hotel Complex

    OpenAIRE

    Velimir Strugar; Vladimir Katić

    2010-01-01

    The paper presents the analysis of power quality characteristics of the 10 kV grid supplying a modern hotel complex on the Montenegrin Adriatic coast. The consumer is characterized by different types of loads, some of which have highly nonlinear characteristics: for example, smart rooms, lift drives, modern hotel kitchen equipment, public electric lighting, and audio, video and TV devices. Such loads in the hotel complex may be a source of negative effects regarding power quality at MV pu...

  2. Measuring the Level of Complexity of Scientific Inquiries: The LCSI Index

    Science.gov (United States)

    Eilam, Efrat

    2015-01-01

    The study developed and applied an index for measuring the level of complexity of full authentic scientific inquiry. Complexity is a fundamental attribute of real life scientific research. The level of complexity is an overall reflection of complex cognitive and metacognitive processes which are required for navigating the authentic inquiry…

  3. Combining complexity measures of EEG data: multiplying measures reveal previously hidden information [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Thomas Burns

    2015-06-01

    Full Text Available Many studies have noted significant differences among human electroencephalograph (EEG) results when participants or patients are exposed to different stimuli, undertake different tasks, or are affected by conditions such as epilepsy or Alzheimer's disease. Such studies often use only one or two measures of complexity and do not regularly justify their choice of measure beyond the fact that it has been used in previous studies. If more measures were added to such studies, however, more complete information might be found about these reported differences. Such information might be useful in confirming the existence or extent of such differences, or in understanding their physiological bases. In this study we analysed publicly available EEG data using a range of complexity measures to determine how well the measures correlated with one another. The complexity measures did not all significantly correlate, suggesting that different measures captured unique features of the EEG signals and thus revealed information which other measures were unable to detect. The results from this analysis therefore suggest that combinations of complexity measures reveal unique information beyond that captured by any single measure. For this reason, researchers using individual complexity measures for EEG data should consider using combinations of measures to more completely account for any differences they observe and to ensure the robustness of any relationships identified.

  4. Dynamic portfolio management based on complex quantile risk measures

    Directory of Open Access Journals (Sweden)

    Ekaterina V. Tulupova

    2011-05-01

    Full Text Available The article focuses on evaluating the effectiveness of combined financial risk measures, namely convex combinations of VaR, CVaR and their analogues for the right tail of the distribution function of a portfolio's returns.
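A hedged sketch of such a combined measure using historical (empirical) VaR and CVaR; the quantile index and tail-inclusion conventions below are illustrative choices, not necessarily the article's.

```python
from math import ceil

def var_cvar(returns, alpha=0.95):
    """Historical VaR and CVaR of the loss distribution L = -R.

    Convention (one of several in use): VaR is the empirical alpha-quantile
    of losses, and CVaR averages all losses from that quantile upward.
    """
    losses = sorted(-r for r in returns)
    k = ceil(alpha * len(losses)) - 1          # index of the alpha-quantile
    var = losses[k]
    tail = losses[k:]
    cvar = sum(tail) / len(tail)
    return var, cvar

def combined_risk(returns, weight=0.5, alpha=0.95):
    """Convex combination rho = w*VaR + (1-w)*CVaR, one member of the family."""
    var, cvar = var_cvar(returns, alpha)
    return weight * var + (1 - weight) * cvar

# toy sample: returns whose losses are exactly 1..100
returns = [-x for x in range(1, 101)]
```

On this sample the 95% VaR is 95, the 95% CVaR is the mean of the six largest losses (97.5), and the equally weighted combination is 96.25.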

  5. Information and complexity measures for hydrologic model evaluation

    Science.gov (United States)

    Hydrological models are commonly evaluated through the residual-based performance measures such as the root-mean square error or efficiency criteria. Such measures, however, do not evaluate the degree of similarity of patterns in simulated and measured time series. The objective of this study was to...

  6. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Full Text Available Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become key to project management. This paper first analyzes the current state of research on software complexity systematically and points out problems in existing work. It then proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and analyzes and validates the scientific soundness and rationality of this measurement method through a large number of cases.
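The entropy-based part can be illustrated generically. Assuming, purely hypothetically, that organizational complexity is scored from the headcount distribution across hierarchy levels, Shannon entropy distinguishes evenly spread organizations from concentrated ones.

```python
from math import log2

def shannon_entropy(counts):
    """Shannon entropy (bits) of a discrete distribution given as counts."""
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c)

# hypothetical headcounts per hierarchy level (illustrative numbers only)
flat_team  = shannon_entropy([5, 5, 5, 5])    # evenly spread levels: 2 bits
steep_team = shannon_entropy([17, 1, 1, 1])   # one dominant level: lower entropy
```

Under this reading, the more evenly staff (or communication volume) is spread across units, the higher the entropy and hence the measured organizational complexity.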

  7. Quantum mechanics with chaos correspondence principle, measurement and complexity

    CERN Document Server

    Kirilyuk, A P

    1995-01-01

    The true dynamical randomness is obtained as a natural fundamental property of deterministic quantum systems. It provides quantum chaos passing to the classical dynamical chaos under the ordinary semiclassical transition, which extends the correspondence principle to chaotic systems. In return one should accept the modified form of quantum formalism (exemplified by the Schrödinger equation) which, however, does not contradict the ordinary form, and the main postulates, of quantum mechanics. It introduces the principle of the fundamental dynamic multivaluedness extending the quantum paradigm to complex dynamical behaviour. Moreover, a causal solution to the well-known problems of the foundations of quantum mechanics, those of quantum indeterminacy and wave reduction, is also found using the same method. The concept of the fundamental dynamic uncertainty thus established is universal in character and provides a unified scheme of the complete description of arbitrary complex system of any origin. This scheme inc...

  8. Measurement and documentation of complex PTSD in treatment seeking traumatized refugees

    DEFF Research Database (Denmark)

    Palic, Sabina

    The aim of the thesis is to study complex traumatization and its measurement in treatment-seeking traumatized refugees. Historically there have been repeated attempts to create a diagnosis for complex posttraumatic stress disorder (complex PTSD) to capture the more diverse, trauma-related symptoms... and personality dysfunction following extreme traumatization. Importantly, patterns of severe traumatic exposure in refugees may represent a group vulnerable to complex PTSD. However, there are currently only a few validated psychiatric measures for the assessment of traumatized refugees, which are limited... to measuring symptoms of PTSD, anxiety, and depression. This renders documentation, measurement, and treatment of possible complex traumatic adaptations in traumatized refugees very difficult. The thesis comprises two studies using different measures and different samples. The first study investigated complex...

  9. 3-D profile measurement for complex micro-structures

    Institute of Scientific and Technical Information of China (English)

    HU Chun-guang; HU Xiao-dong; XU Lin-yan; GUO Tong; HU Xiao-tang

    2005-01-01

    Micro-structure 3-D profile measurement is important for research on micro-machining and the characterization of micro-dimensions. In this paper, a new method involving a 2-D structure template, which guides phase unwrapping, is proposed based on phase-shifting microscopic interferometry. It is suited not only to static measurement but also to dynamic measurement, especially the motion of MEMS devices. The 3-D profile of the active comb of a micro-resonator is obtained using the method. The theoretical precision in the out-of-plane direction is better than 0.5 nm. The in-plane theoretical precision within micro-structures is better than 0.5 μm, but at the edges of micro-structures it is on the level of micrometers, mainly caused by imprecise edge analysis. Finally, its disadvantages and future developments are discussed.

  10. Titan's Complex Neutral Composition as Measured by Cassini INMS

    Science.gov (United States)

    Waite, J. H.; Magee, B. A.; Gell, D. A.; Kasprzak, W. T.; Cravens, T.; Vuitton, V. S.; Yelle, R. V.

    2006-12-01

    The composition of Titan's complex neutral atmosphere above 1000 km, as observed by the Cassini Ion Neutral Mass Spectrometer on recent flybys of Titan, is presented. A rich mixture of hydrocarbons and nitriles is found, with mixing ratios that vary from 10^-4 to 10^-7: acetylene, ethylene, ethane, benzene, toluene, cyanogen, propyne, propene, propane, and various nitriles. The calibration and mass deconvolution processes are presented in order to establish clear boundaries on the systematic errors that can occur in the mass deconvolution process. The role of ion-neutral chemistry in forming these compounds will also be discussed.

  11. Block-based test data adequacy measurement criteria and test complexity metrics

    Institute of Scientific and Technical Information of China (English)

    陈卫东; 杨建军; 叶澄清; 潘云鹤

    2002-01-01

    On the basis of software testing tools we developed for programming languages, we first present a new control flowgraph model based on blocks. In view of the notion of a block, we extend the traditional program-based software test data adequacy measurement criteria, and empirically analyze the subsume relation between these measurement criteria. Then, we define four test complexity metrics based on blocks: J-complexity 0, J-complexity 1, J-complexity 1+, and J-complexity 2. Finally, we show the Kiviat diagram that makes software quality visible.

  13. Measuring Viscosity with a Levitating Magnet: Application to Complex Fluids

    Science.gov (United States)

    Even, C.; Bouquet, F.; Remond, J.; Deloche, B.

    2009-01-01

    As an experimental project proposed to students in the fourth year of university, a viscometer was developed, consisting of a small magnet levitating in a viscous fluid. The viscous force acting on the magnet is directly measured: viscosities in the range 10–10^6 mPa·s are obtained. This experiment is used as an introduction to complex…

  14. Resolving and measuring diffusion in complex interfaces: Exploring new capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Alam, Todd M. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This exploratory LDRD targeted the use of new high resolution spectroscopic diffusion capabilities developed at Sandia to resolve transport processes at interfaces in heterogeneous polymer materials. In particular, the combination of high resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) spectroscopy with pulsed field gradient (PFG) diffusion experiments was used to directly explore interface diffusion within heterogeneous polymer composites, including measuring diffusion for individual chemical species in multi-component mixtures. Several different types of heterogeneous polymer systems were studied using these HRMAS NMR diffusion capabilities to probe the resolution limitations, determine the spatial length scales involved, and explore the general applicability to specific heterogeneous systems. The investigations pursued included a) the direct measurement of diffusion for poly(dimethyl siloxane) (PDMS) polymer on nano-porous materials, b) measurement of penetrant diffusion in additively manufactured (3D-printed) PDMS composites, and c) measurement of diffusion in swollen polymer/penetrant mixtures within nano-confined aluminum oxide membranes. The NMR diffusion results obtained were encouraging and allowed for an improved understanding of diffusion and transport processes at the molecular level, while at the same time demonstrating that the spatial heterogeneity resolvable with the HRMAS NMR PFG diffusion experiment must be larger than ~µm length scales, except for polymer transport within nanoporous carbons, where additional chemical resolution improves the resolvable heterogeneous length scale to hundreds of nm.
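The abstract does not reproduce the analysis equations; as background, PFG echo attenuation in the simplest free-diffusion case follows the Stejskal-Tanner form E = exp(-b·D), so the diffusion coefficient D can be recovered from a linear fit of ln E against the gradient b-factor. A sketch with synthetic, noise-free data (the b values and D below are illustrative, not from this report):

```python
from math import exp, log

def fit_diffusion(b_values, attenuations):
    """Least-squares slope of ln(E) vs b; D = -slope for E = exp(-b*D)."""
    xs, ys = list(b_values), [log(e) for e in attenuations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope

D_true = 2.0e-9                                # m^2/s, typical small molecule
bs = [0.0, 2e8, 4e8, 6e8, 8e8]                 # gradient b-factors, s/m^2
es = [exp(-b * D_true) for b in bs]            # synthetic echo attenuations
D_fit = fit_diffusion(bs, es)
```

With noiseless data the fit returns D exactly; real multi-component mixtures show multi-exponential attenuation, which is where the chemical resolution of HRMAS becomes essential.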

  15. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo;

    2014-01-01

    on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measures the pressure field for three imaging schemes: a fixed focus, single...

  16. Measuring complexity, nonextensivity and chaos in the DNA sequence of the Major Histocompatibility Complex

    Science.gov (United States)

    Pavlos, G. P.; Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Xenakis, M. N.; Clark, Peter; Duke, Jamie; Monos, D. S.

    2015-11-01

    We analyze 4 Mb sequences of the Major Histocompatibility Complex (MHC), which is a DNA segment on chromosome 6 with high gene density, controlling many immunological functions and associated with many diseases. The analysis is based on modern theoretical and mathematical tools of complexity theory, such as nonlinear time series analysis and Tsallis non-extensive statistics. The results revealed that the DNA complexity and self-organization can be related to fractional dynamical nonlinear processes with low-dimensional deterministic chaotic and non-extensive statistical character, which generate the DNA sequences under the extremization of Tsallis q-entropy principle. While it still remains an open question as to whether the DNA walk is a fractional Brownian motion (FBM), a static anomalous diffusion process or a non-Gaussian dynamical fractional anomalous diffusion process, the results of this study testify for the latter, providing also a possible explanation for the previously observed long-range power law correlations of nucleotides, as well as the long-range correlation properties of coding and non-coding sequences present in DNA sequences.
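For reference, the Tsallis q-entropy used in this kind of non-extensive analysis is S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Shannon entropy (in nats) in the limit q → 1; a minimal sketch:

```python
from math import log

def tsallis_entropy(probs, q):
    """S_q = (1 - sum p_i^q) / (q - 1); Shannon entropy (nats) at q = 1."""
    if q == 1.0:
        return -sum(p * log(p) for p in probs if p)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

uniform = [0.25] * 4
s2 = tsallis_entropy(uniform, q=2.0)    # (1 - 4 * 1/16) / 1 = 0.75
s1 = tsallis_entropy(uniform, q=1.0)    # ln 4
```

Fitting the index q that extremizes (or best describes) the observed statistics is what characterizes the degree of non-extensivity of the sequence.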

  17. Comparison of task complexity measures for emergency operating procedures: Convergent validity and predictive validity

    International Nuclear Information System (INIS)

    Human performance while executing operating procedures is critically important for the safety of complex industrial systems. To predict and model human performance, several complexity measures have been developed. This study aims to compare the convergent validity and predictive validity of three existing complexity measures, step complexity (SC), task size, and task complexity (TC), using operator performance data collected from an emergency operating procedure (EOP) experiment. This comparative study shows that these measures have a high convergent validity with each other, most likely because all of them involve the size dimension of complexity. These measures and their sub-measures also have a high predictive validity for operation time and a moderate-to-high predictive validity for error rate, except the step logic complexity (SLC) measure, a component of the SC measure. SLC appears not to contribute to the predictive validity in the experimental EOPs. The use of visual, auditory, cognitive, and psychomotor (VACP) rating scales in the TC measure seems to be significantly beneficial for explaining the human error rate; however, these rating scales appear not to adequately reflect the complexity differences among the meta-operations in EOPs

  18. Urban sustainability : complex interactions and the measurement of risk

    Directory of Open Access Journals (Sweden)

    Lidia Diappi

    1999-05-01

    Full Text Available This paper focuses on the concept of a sustainable city and its theoretical implications for the urban system. Urban sustainability is based on positive interactions among three different urban sub-systems: social, economic and physical, where social well-being coexists with economic development and environmental quality. In reality this utopian scenario rarely appears: an affluent economy is often associated with poverty and criminality, and labour variety and urban efficiency coexist with pollution and congestion. The research subject is the analysis of local risk and opportunity conditions, based on the application of a special definition of risk elaborated and made operative through the production of a set of maps representing the multidimensional facets of spatial organisation in urban sustainability. The interactions among the economic/social and environmental systems are complex and unpredictable and present the opportunity for a new methodology of scientific investigation: the connectionist approach, processed by Self-Reflexive Neural Networks (SRNN). These networks are a useful instrument for investigation and analogic questioning of the database. Once the SRNN has learned the structure of the weights from the DB, by querying the network with the maximization or minimization of specific groups of attributes it is possible to read the related properties and to rank the areas. The survey scale assumed by the research is purposefully aimed at the micro-scale and concerns the Municipality of Milan, which is spatially divided into 144 zones.

  19. Measuring the complex field scattered by single submicron particles

    Energy Technology Data Exchange (ETDEWEB)

    Potenza, Marco A. C., E-mail: marco.potenza@unimi.it; Sanvito, Tiziano [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); CIMAINA, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); EOS s.r.l., viale Ortles 22/4, I-20139 Milan (Italy); Pullia, Alberto [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy)

    2015-11-15

    We describe a method for simultaneous measurements of the real and imaginary parts of the field scattered by single nanoparticles illuminated by a laser beam, exploiting a self-reference interferometric scheme relying on the fundamentals of the Optical Theorem. Results obtained with calibrated spheres of different materials are compared to the expected values obtained through a simplified analytical model without any free parameters, and the method is applied to a highly polydisperse water suspension of Poly(D,L-lactide-co-glycolide) nanoparticles. Advantages with respect to existing methods and possible applications are discussed.

  20. Reconstruction of Complex Materials by Integral Geometric Measures

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The goal of much research in computational materials science is to quantify necessary morphological information and then to develop stochastic models which both accurately reflect the material morphology and allow one to estimate macroscopic physical properties. A novel method of characterizing the morphology of disordered systems is presented based on the evolution of a family of integral geometric measures during erosion and dilation operations. The method is used to determine the accuracy of model reconstructions of random systems. It is shown that the use of erosion/dilation operations on the original image leads to a more accurate discrimination of morphology than previous methods.

  1. Complexity and Information: Measuring Emergence, Self-organization, and Homeostasis at Multiple Scales

    CERN Document Server

    Gershenson, Carlos

    2012-01-01

    Concepts used in the scientific study of complex systems have become so widespread that their use and abuse has led to ambiguity and confusion in their meaning. In this paper we use information theory to provide abstract and concise measures of complexity, emergence, self-organization, and homeostasis. The purpose is to clarify the meaning of these concepts with the aid of the proposed formal measures. In a simplified version of the measures (focussing on the information produced by a system), emergence becomes the opposite of self-organization, while complexity represents their balance. We use computational experiments on random Boolean networks and elementary cellular automata to illustrate our measures at multiple scales.
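The simplified version of the measures mentioned above can be made concrete. The sketch below assumes the forms used in related information-theoretic formulations: emergence as normalized Shannon entropy, self-organization as its complement, and complexity as their balance, with a factor of 4 (an assumption here) normalizing the maximum to 1:

```python
import math

def emergence(probs):
    """Normalized Shannon entropy in [0, 1]: the information produced
    by the system (simplified emergence measure)."""
    n = len(probs)
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(n) if n > 1 else 0.0

def self_organization(probs):
    """Defined as the opposite of emergence."""
    return 1.0 - emergence(probs)

def complexity(probs):
    """Balance of emergence and self-organization; the factor 4 scales
    the peak (at E = 0.5) to 1, an assumed normalization."""
    e = emergence(probs)
    return 4.0 * e * (1.0 - e)
```

A uniform distribution is pure emergence (C = 0), a deterministic one is pure self-organization (C = 0), and distributions between the two extremes score high complexity, mirroring the balance described in the abstract.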

  2. Design of New Complex Detector Used for Gross Beta Measuring

    International Nuclear Information System (INIS)

    The level of gross β activity of radioactive aerosols in the containment of a nuclear plant indicates how serious the radioactive pollution inside the shell is, and it can provide evidence of leaks from the primary coolant circuit equipment. In the process of measuring, the counting of gross β is influenced by γ radiation. To avoid this influence, a new method was introduced and a new detector was designed, using a plastic scintillator as the major detecting component and BGO as the sub-component. Based on the distinctive difference in light attenuation times, the signals induced in them can be discriminated. The γ background in the plastic scintillator was subtracted according to the γ counting in the BGO. The functions of absolute detection efficiency were obtained. Monte Carlo simulation shows that the influence of the γ background is decreased by about one order of magnitude. (authors)

  3. Prediction of Software Requirements Stability Based on Complexity Point Measurement Using Multi-Criteria Fuzzy Approach

    Directory of Open Access Journals (Sweden)

    D. Francis Xavier Christopher

    2012-12-01

    Full Text Available Many software projects fail due to unstable requirements and a lack of efficient management of requirements changes. The Software Requirements Stability Index (RSI) metric helps to evaluate the overall stability of requirements and also to keep track of the project status. The higher the stability, the fewer changes tend to propagate. Existing systems use Function Point modeling for measuring requirements stability. However, the main drawback of the existing modeling is that the complexity of non-functional requirements has not been measured for requirements stability, even though non-functional factors play a vital role in assessing it. Numerous measurement methods have been proposed for measuring software complexity. This paper proposes a multi-criteria fuzzy-based approach for determining the complexity weight based on requirement complexity attributes such as functional requirement complexity, non-functional requirement complexity, input-output complexity, and interface and file complexity. Based on the complexity weight, this paper computes the software complexity point and then predicts software requirements stability from software complexity point changes. The advantage of this model is that it is able to estimate software complexity early, which in turn predicts software requirements stability during the software development life cycle.

  4. What the complex joint probabilities observed in weak measurements can tell us about quantum physics

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, Holger F. [Graduate School of Advanced Sciences of Matter, Hiroshima University, Kagamiyama 1-3-1, Higashi Hiroshima 739-8530, Japan and JST, CREST, Sanbancho 5, Chiyoda-ku, Tokyo 102-0075 (Japan)

    2014-12-04

    Quantum mechanics does not permit joint measurements of non-commuting observables. However, it is possible to measure the weak value of a projection operator, followed by the precise measurement of a different property. The results can be interpreted as complex joint probabilities of the two non-commuting measurement outcomes. Significantly, it is possible to predict the outcome of completely different measurements by combining the joint probabilities of the initial state with complex conditional probabilities relating the new measurement to the possible combinations of measurement outcomes used in the characterization of the quantum state. We can therefore conclude that the complex conditional probabilities observed in weak measurements describe fundamental state-independent relations between non-commuting properties that represent the most fundamental form of universal laws in quantum physics.

  5. A comparison of LMC and SDL complexity measures on binomial distributions

    Science.gov (United States)

    Piqueira, José Roberto C.

    2016-02-01

    The concept of complexity has been widely discussed over the last forty years, with contributions from many areas of human knowledge, including philosophy, linguistics, history, biology, physics and chemistry, and with mathematicians trying to give it a rigorous treatment. In this sense, thermodynamics meets information theory and, using the entropy definition, López-Ruiz, Mancini and Calbet proposed a definition of complexity that is referred to as the LMC measure. Shiner, Davison and Landsberg, by slightly changing the LMC definition, proposed the SDL measure; both LMC and SDL are satisfactory measures of complexity for many problems. Here, the SDL and LMC measures are applied to the case of a binomial probability distribution, to clarify how the length of the data set implies complexity and how the success probability of the repeated trials determines how complex the whole set is.
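The two measures named in the abstract can be sketched concretely. The following is a minimal illustration, assuming the common textbook forms: LMC as Shannon entropy H times the disequilibrium D (squared distance from the uniform distribution), and SDL as a product of powers of normalized "disorder" and "order" (exponents of 1 are an assumption here; the original definitions allow other exponents):

```python
import math
from math import comb

def binomial_probs(n, p):
    """Probability mass function of a binomial(n, p) distribution."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def lmc(probs):
    """LMC complexity: Shannon entropy H times disequilibrium D,
    the squared distance from the uniform distribution."""
    n = len(probs)
    h = -sum(q * math.log(q) for q in probs if q > 0)
    d = sum((q - 1.0 / n) ** 2 for q in probs)
    return h * d

def sdl(probs, alpha=1.0, beta=1.0):
    """SDL complexity: product of powers of normalized 'disorder'
    (entropy over its maximum) and 'order' (its complement)."""
    n = len(probs)
    delta = -sum(q * math.log(q) for q in probs if q > 0) / math.log(n)
    return (delta ** alpha) * ((1.0 - delta) ** beta)
```

Both measures vanish for the uniform (maximum-entropy) distribution and for a fully ordered one, and peak at intermediate degrees of order, which is the behavior probed on binomial distributions in the paper.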

  6. Matrix Energy as a Measure of Topological Complexity of a Graph

    CERN Document Server

    Sinha, Kaushik

    2016-01-01

    The complexity of highly interconnected systems is rooted in the interwoven architecture defined by their connectivity structure. In this paper, we develop the matrix energy of the underlying connectivity structure as a measure of topological complexity and highlight interpretations of certain global features of underlying system connectivity patterns. The proposed complexity metric is shown to satisfy the Weyuker criteria, establishing its validity as a formal complexity metric. We also introduce the notion of the P point in the graph density space. The P point acts as a boundary between multiple connectivity regimes for finite-size graphs.
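The abstract does not reproduce the paper's exact definition, but a common instantiation of matrix energy is graph energy in Gutman's sense: the sum of the absolute values of the eigenvalues of the adjacency matrix. A minimal sketch under that assumption:

```python
import numpy as np

def matrix_energy(adj):
    """Graph energy: sum of absolute values of the eigenvalues of a
    symmetric adjacency matrix (one common reading of 'matrix energy')."""
    adj = np.asarray(adj, dtype=float)
    # eigvalsh assumes a symmetric matrix and returns real eigenvalues
    return float(np.sum(np.abs(np.linalg.eigvalsh(adj))))

# complete graph K4: eigenvalues 3, -1, -1, -1 -> energy 6
k4 = [[0, 1, 1, 1],
      [1, 0, 1, 1],
      [1, 1, 0, 1],
      [1, 1, 1, 0]]
```

Denser graphs generally carry higher energy, so a plot of energy against graph density is one way to visualize the regime boundary the abstract calls the P point.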

  7. Measurements of student understanding on complex scientific reasoning problems

    Science.gov (United States)

    Izumi, Alisa Sau-Lin

    While there has been much discussion of cognitive processes underlying effective scientific teaching, less is known about the response nature of assessments targeting processes of scientific reasoning specific to biology content. This study used multiple-choice (m-c) and short-answer essay student responses to evaluate progress in high-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, it was found that some students showed a pre-post gain on the m-c test version while showing no gain on a short-answer essay version of the same questions. This result led to a subsequent research project focused on differences between alternate versions of tests of scientific reasoning. Using m-c and written responses from biology tests targeted toward the skills of (1) reasoning with a model and (2) designing controlled experiments, test score frequencies, factor analysis, and regression models were analyzed to explore test format differences. Understanding the format differences in tests is important for the development of practical ways to identify student gains in scientific reasoning. The overall results suggested test format differences. Factor analysis revealed three interpretable factors: m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open-explanation portions of the hybrid items revealed that many students answered the m-c portion of an item correctly but gave inadequate explanations; in other instances students answered the m-c portion incorrectly yet gave a sufficient explanation, or answered the m-c correctly and also provided poor explanations. When trying to fit test score predictors for non-associated student measures (VSAT, MSAT, high school grade point average, or final course grade), the test scores accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing.

  8. Measurements of complex impedance in microwave high-power systems with a new Bluetooth integrated circuit.

    Science.gov (United States)

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit, which is marketed for Bluetooth applications, it is possible to simplify the measurement of the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices circuit AD 8302, which measures gain and phase up to 2.7 GHz, operates with variable-level input signals and is less sensitive to both amplitude and frequency fluctuations of industrial magnetrons than are mixers and AM crystal detectors. Therefore, accurate gain and phase measurements can be performed with low-stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented. PMID:15078067

  9. Measurement of Characteristic Self-Similarity and Self-Diversity for Complex Mechanical Systems

    Institute of Scientific and Technical Information of China (English)

    ZHOU Meili; LAI Jiangfeng

    2006-01-01

    Based on similarity science and complex system theory, a new concept of characteristic self-diversity and the corresponding relations between self-similarity and self-diversity for complex mechanical systems are presented in this paper. Methods of measuring self-similarity and self-diversity between a main system and its sub-systems are studied. Numerical calculations show that the characteristic self-similarity and self-diversity measurement method is valid. A new theory and method of self-similarity and self-diversity measurement for complex mechanical systems is thus presented.

  10. The Complex Trauma Questionnaire (ComplexTQ): development and preliminary psychometric properties of an instrument for measuring early relational trauma.

    Science.gov (United States)

    Maggiora Vergano, Carola; Lauriola, Marco; Speranza, Anna M

    2015-01-01

    Research on the etiology of adult psychopathology and its relationship with childhood trauma has focused primarily on specific forms of maltreatment. This study developed an instrument for the assessment of childhood and adolescence trauma that would aid in identifying the role of co-occurring childhood stressors and chronic adverse conditions. The Complex Trauma Questionnaire (ComplexTQ), in both clinician and self-report versions, is a measure for the assessment of multi-type maltreatment: physical, psychological, and sexual abuse; physical and emotional neglect; as well as other traumatic experiences, such as rejection, role reversal, witnessing domestic violence, separations, and losses. The four-point Likert scale allows raters to indicate specifically with which caregiver the traumatic experience occurred. A total of 229 participants, 79 nonclinical and 150 high-risk and clinical, were assessed with the ComplexTQ clinician version applied to Adult Attachment Interview (AAI) transcripts. Initial analyses indicate acceptable inter-rater reliability. A good fit to a 6-factor model regarding the experience with the mother and to a 5-factor model regarding the experience with the father was obtained; the internal consistency of the derived factors was good. Convergent validity was established with the AAI scales. ComplexTQ factors discriminated normative from high-risk and clinical samples. The findings suggest a promising, reliable, and valid measure of reported early relational trauma; furthermore, it is easy to complete and is useful for both research and clinical practice. PMID:26388820

  12. Liquid structure of acetic acid-water and trifluoroacetic acid-water mixtures studied by large-angle X-ray scattering and NMR.

    Science.gov (United States)

    Takamuku, Toshiyuki; Kyoshoin, Yasuhiro; Noguchi, Hiroshi; Kusano, Shoji; Yamaguchi, Toshio

    2007-08-01

    The structures of acetic acid (AA), trifluoroacetic acid (TFA), and their aqueous mixtures over the entire range of acid mole fraction xA have been investigated by using large-angle X-ray scattering (LAXS) and NMR techniques. The results from the LAXS experiments have shown that acetic acid molecules mainly form a chain structure via hydrogen bonding in the pure liquid. In acetic acid-water mixtures, hydrogen bonds of acetic acid-water and water-water gradually increase with decreasing xA, while the chain structure of acetic acid molecules is moderately ruptured; hydrogen bonds among water molecules form markedly at low xA. In contrast, TFA molecules form not a chain structure but cyclic dimers through hydrogen bonding in the pure liquid. In TFA-water mixtures, O···O hydrogen bonds among water molecules gradually increase as xA decreases and form significantly at low xA, where TFA molecules are considerably dissociated into hydrogen ions and trifluoroacetate. 1H, 13C, and 19F NMR chemical shifts of acetic acid and TFA molecules in acetic acid-water and TFA-water mixtures have indicated strong relationships between structural changes of the mixtures and the acid mole fraction. On the basis of both the LAXS and NMR results, the structural changes of acetic acid-water and TFA-water mixtures with decreasing acid mole fraction and the effects of fluorination of the methyl group on the structure are discussed at the molecular level. PMID:17628099

  13. MEASURING OF COMPLEX STRUCTURE TRANSFER FUNCTION AND CALCULATING OF INNER SOUND FIELD

    Institute of Scientific and Technical Information of China (English)

    Chen Yuan; Huang Qibai; Shi Hanmin

    2005-01-01

    In order to measure a complex structure's transfer function and calculate its inner sound field, the transfer function of integration is introduced. By establishing a virtual system, the transfer function of integration can be measured and the inner sound field can be calculated. In the experiment, the transfer function of integration of an automobile body is measured, and the experimental method of establishing the virtual system proves valid.

  14. Measurement of the total solar energy transmittance (g-value) for complex glazings

    DEFF Research Database (Denmark)

    Duer, Karsten

    1999-01-01

    Four different complex glazings have been investigated in the Danish experimental setup METSET. The purpose of the measurements is to increase the confidence in the calorimetric measurements and to perform measurements and corrections according to a method developed in the ALTSET project...

  15. Measurement of solubilities for rhodium complexes and phosphine ligands in supercritical carbon dioxide

    OpenAIRE

    Shimoyama, Yusuke; Sonoda, Masanori; Miyazaki, Kaoru; Higashi, Hidenori; Iwai, Yoshio; ARAI, Yasuhiko

    2008-01-01

    The solubilities of phosphine ligands and rhodium (Rh) complexes in supercritical carbon dioxide were measured with Fourier transform infrared (FT-IR) spectroscopy at 320 and 333 K and several pressures. Triphenylphosphine (TPP) and tris(p-trifluoromethylphenyl)-phosphine (TTFMPP) were selected as ligands for the Rh complex. The solubilities of the fluorinated ligands and complexes were compared with those of the non-fluorinated compounds. The solubilities of ligand increased up to 10 times b...

  16. Counterions release from electrostatic complexes of polyelectrolytes and proteins of opposite charge : a direct measurement

    OpenAIRE

    Gummel, Jérémie; Cousin, Fabrice; Boué, François

    2009-01-01

    Though often considered one of the main driving processes of the complexation of species of opposite charges, the release of counterions has never been directly measured experimentally on polyelectrolyte/protein complexes. We present here the first structural determination of such a release by Small Angle Neutron Scattering in complexes made of lysozyme, a positively charged protein, and of PSS, a negatively charged polyelectrolyte. Both components have the same neutron density length, so th...

  17. Normalized entropy of rank distribution: a novel measure of heterogeneity of complex networks

    Institute of Scientific and Technical Information of China (English)

    Wu Jun; Tan Yue-Jin; Deng Hong-Zhong; Zhu Da-Zhi

    2007-01-01

    Many unique properties of complex networks result from heterogeneity. The measurement and analysis of heterogeneity are important for research on the properties and functions of complex networks. In this paper, the rank distribution is proposed as a new statistical feature of complex networks. Based on the rank distribution, a novel measure of heterogeneity called the normalized entropy of rank distribution (NERD) is proposed. The NERD accords with the normal meaning of heterogeneity in the context of complex networks better than conventional measures do. The heterogeneity of scale-free networks is studied using the NERD. It is shown that scale-free networks become more heterogeneous as the scaling exponent decreases and that the NERD of scale-free networks is independent of the number of vertices, which indicates that the NERD is a suitable and effective measure of heterogeneity for networks of different sizes.
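The idea can be illustrated with a small sketch, assuming one plausible reading of the abstract: rank the vertices by degree, normalize the sorted degrees to a probability distribution, and take its Shannon entropy divided by the maximum log N (the paper's exact normalization may differ):

```python
import math

def nerd(degrees):
    """Normalized entropy of the rank (sorted-degree) distribution.
    Returns 1.0 for a perfectly homogeneous (regular) network and
    smaller values for more heterogeneous ones. A sketch only; the
    paper's exact normalization may differ."""
    ranked = sorted(degrees, reverse=True)
    total = float(sum(ranked))
    probs = [d / total for d in ranked if d > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(degrees))
```

A 3-regular graph on four vertices scores 1.0, while a star (one hub of degree 3, three leaves of degree 1) scores lower, matching the intended meaning of heterogeneity.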

  18. Information Measures of Complexity, Emergence, Self-organization, Homeostasis, and Autopoiesis

    OpenAIRE

    Fernandez, Nelson; Maldonado, Carlos; Gershenson, Carlos

    2013-01-01

    This chapter reviews measures of emergence, self-organization, complexity, homeostasis, and autopoiesis based on information theory. These measures are derived from proposed axioms and tested in two case studies: random Boolean networks and an Arctic lake ecosystem. Emergence is defined as the information a system or process produces. Self-organization is defined as the opposite of emergence, while complexity is defined as the balance between emergence and self-organization. Homeostasis refle...

  19. In Situ Fluorescence Microscopic Measurements of Complexation Reactions at Liquid/Liquid Interface

    OpenAIRE

    TSUKAHARA, Satoshi

    2005-01-01

    In situ microscopic measurement is a novel approach to clarify the intrinsic mechanism of complexation reactions occurring at liquid/liquid interfaces. The present review was mainly focused on recent three topics of methodology of in situ fluorescence microscopic observation and measurement of interfacial complexation reactions: (1) two kinds of self-assemblies of Pd2+ and 5,10,15,20-tetra(4-pyridyl)-21H, 23H-porphine complexes formed at the toluene/water interface, (2) microextraction of Eu3...

  20. Variances as order parameter and complexity measure for random Boolean networks

    Energy Technology Data Exchange (ETDEWEB)

    Luque, Bartolo [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Ballesteros, Fernando J [Observatori Astronomic, Universitat de Valencia, Ed. Instituts d' Investigacio, Pol. La Coma s/n, E-46980 Paterna, Valencia (Spain); Fernandez, Manuel [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain)

    2005-02-04

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems.
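The proposed order parameter is easy to reproduce in simulation. The sketch below, with hypothetical helper names and parameters, builds a random Boolean network with connectivity K, iterates it past a transient, and averages the per-node temporal variance of the Boolean time series:

```python
import random

def random_boolean_network(n, k, seed=0):
    """Random Boolean network: each node reads k random inputs and
    applies a random truth table over those inputs."""
    rng = random.Random(seed)
    inputs = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    tables = [[rng.randrange(2) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update of all nodes."""
    new = []
    for i in range(len(state)):
        idx = 0
        for inp in inputs[i]:
            idx = (idx << 1) | state[inp]
        new.append(tables[i][idx])
    return new

def temporal_variance(n=50, k=2, steps=200, transient=100, seed=1):
    """Average over nodes of the temporal variance of each node's
    Boolean time series, recorded after a transient."""
    rng = random.Random(seed)
    inputs, tables = random_boolean_network(n, k, seed)
    state = [rng.randrange(2) for _ in range(n)]
    series = [[] for _ in range(n)]
    for t in range(steps):
        state = step(state, inputs, tables)
        if t >= transient:
            for i, s in enumerate(state):
                series[i].append(s)
    variances = []
    for s in series:
        m = sum(s) / len(s)
        variances.append(sum((x - m) ** 2 for x in s) / len(s))
    return sum(variances) / len(variances)
```

For a Boolean series the variance per node lies in [0, 0.25]: frozen nodes contribute 0, while nodes flipping half the time contribute the maximum, so the average tracks the ordered-to-disordered transition as K grows.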

  1. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it has been reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals and operating procedures are among the most important factors in the aviation and manufacturing industries. In the case of NPPs, the importance of procedures is even more salient, because not only were over 50% of human errors due to procedures, but about 18% of accidents were also caused by failures to follow procedures. Thus, the provision of emergency operating procedures (EOPs) designed to reduce the possibility of human error is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity (SC) measure is developed based on three sub-measures: the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, quantitative validations (comparing SC scores with subjective evaluation results and with averaged step performance times) as well as qualitative validations to clarify the physical meaning of the SC measure are performed.
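The entropy idea behind the sub-measures can be sketched as follows. This is an illustration only: the `graph_entropy` function is a generic first-order graph-entropy calculation (partition vertices by degree, take the Shannon entropy of the partition), and the Euclidean-norm combination of the three sub-measures is an assumed form, not the paper's published formula:

```python
import math
from collections import Counter

def graph_entropy(degrees):
    """First-order graph entropy: classify vertices by degree and take
    the Shannon entropy (in bits) of the resulting partition."""
    n = len(degrees)
    counts = Counter(degrees)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def step_complexity(sic, slc, ssc, weights=(1.0, 1.0, 1.0)):
    """Hypothetical combination of the three entropy-based sub-measures
    as a weighted Euclidean norm (assumed form, for illustration)."""
    return math.sqrt(sum(w * m ** 2 for w, m in zip(weights, (sic, slc, ssc))))
```

A step whose graph representation has all vertices structurally alike yields zero entropy, so richer step structure (more distinct roles for actions and logic gates) drives the sub-measures, and hence SC, upward.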

  2. Update of a footprint-based approach for the characterisation of complex measurement sites

    DEFF Research Database (Denmark)

    Goeckede, M.; Markkanen, T.; Hasager, C.B.;

    2006-01-01

    Horizontal heterogeneity can significantly affect the flux data quality at monitoring sites in complex terrain. In heterogeneous conditions, the adoption of the eddy-covariance technique is contraindicated by the lack of horizontal homogeneity and presence of advective conditions. In addition...... of an existing footprint-based quality evaluation concept for flux measurement sites in complex terrain. The most significant modifications in the present version are the use of a forward Lagrangian stochastic trajectory model for the determination of the spatial context of the measurements...... the performance of a coordinate rotation procedure, and to check to what extent the measured fluxes are representative for a target land-use type....

  3. The Microcantilever: A Versatile Tool for Measuring the Rheological Properties of Complex Fluids

    Directory of Open Access Journals (Sweden)

    I. Dufour

    2012-01-01

    Full Text Available Silicon microcantilevers can be used to measure the rheological properties of complex fluids. In this paper, two different methods will be presented. In the first method, the microcantilever is used to measure the hydrodynamic force exerted by a confined fluid on a sphere that is attached to the microcantilever. In the second method, the measurement of the microcantilever's dynamic spectrum is used to extract the hydrodynamic force exerted by the surrounding fluid on the microcantilever. The originality of the proposed methods lies in the fact that not only may the viscosity of the fluid be measured, but also the fluid's viscoelasticity, that is, both viscous and elastic properties, which are key parameters in the case of complex fluids. In both methods, the use of analytical equations permits the fluid's complex shear modulus to be extracted and expressed as a function of shear stress and/or frequency.

  4. Spatial separation of individual substances in effloresced crystals of ternary ammonium sulphate/dicarboxylic acid/water aerosols.

    Science.gov (United States)

    Treuel, Lennart; Sandmann, Alice; Zellner, Reinhard

    2011-04-18

    This work examines the crystals resulting from the efflorescence of internally mixed aqueous aerosols comprising ammonium sulphate and different dicarboxylic acids. Most studies on the deliquescence of aerosols use previously effloresced aerosols in their experiments. However, during efflorescence a highly supersaturated solution crystallises in a kinetically controlled way, unlike the case of thermodynamically controlled crystallisation. Herein, the distribution of individual substances within the effloresced crystals is investigated using Raman scanning experiments. The data presented show an intriguingly complex behaviour of these ternary and quaternary aerosols. A spatial separation of substances in the crystals resulting from the efflorescence of previously internally mixed ternary salt/dicarboxylic acid/water aerosol droplets is demonstrated, and mechanistic aspects are discussed. PMID:21472958

  5. The Measurement of Financial System Complexity

    Institute of Scientific and Technical Information of China (English)

    邱奕奎

    2014-01-01

    The measurement of financial system complexity is an intrinsic requirement of formalizing financial system complexity, and it is one of the basic problems in theoretical research on the subject. According to the basic parameters of the system, financial system complexity can be divided into four aspects: structural complexity, environment complexity, function complexity and dynamic complexity. Structural complexity concerns the relationships among system components. Environment complexity concerns abrupt changes in the natural environment, fluctuations in the economic environment and changes in the regulatory environment. Function complexity concerns the uncertainty of system function and the complexity of its realization. Dynamic complexity concerns the complexity of financial activities and the uncertainty of financial system evolution. The relationships among system components can be divided into dependence and decomposition, and structural complexity can be measured using the concept of information entropy. Through classification and valuation of the metrics, measurement indices for the other three aspects of complexity can be obtained. By dividing financial system complexity into these four levels and studying the measurement of each level separately, the difficulty of measuring an overall complexity metric for the system is resolved, and a model of the overall complexity of the financial system can be constructed.

  6. The effect of electrode contact resistance and capacitive coupling on Complex Resistivity measurements

    DEFF Research Database (Denmark)

    Ingeman-Nielsen, Thomas

    2006-01-01

    The effect of electrode contact resistance and capacitive coupling on complex resistivity (CR) measurements is studied in this paper. An equivalent circuit model for the receiver is developed to describe the effects. The model shows that CR measurements are severely affected even at relatively lo...... the contact resistance artificially increased by resistors. The results emphasize the importance of keeping contact resistance low in CR measurements....

  7. Fast laser systems for measuring the geometry of complex-shaped objects

    Science.gov (United States)

    Galiulin, Ravil M.; Galiulin, Rishat M.; Bakirov, J. M.; Vorontsov, A. V.; Ponomarenko, I. V.

    1999-01-01

    The technical characteristics, advantages and applications of an automated optoelectronic measuring system designed by the 'Optel' company, State Aviation University of Ufa, are presented in this paper. The measuring apparatus can be applied in industrial development and research, for example in rapid prototyping, and for obtaining geometrical parameters in medicine and criminalistics. It is essentially a non-contact, rapid scanning system that allows measurement of complex-shaped objects such as metal and plastic workpieces or parts of the human body.

  8. Design and Functional Validation of a Complex Impedance Measurement Device for Characterization of Ultrasonic Transducers

    International Nuclear Information System (INIS)

    This paper presents the design and practical implementation of a complex impedance measurement device capable of characterizing ultrasonic transducers. The device works in the frequency range used by industrial ultrasonic transducers, which is below the measurement range of modern high-end network analyzers. The device uses the Goertzel algorithm instead of the more common FFT to calculate the magnitude and phase components of the impedance under test. A theoretical overview is given, followed by a practical approach and measurement results. (authors)
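    The Goertzel algorithm mentioned above evaluates a single DFT bin with one cheap recurrence per sample, which is why it beats a full FFT when only a few excitation frequencies matter. A minimal sketch (the function name and test signal are illustrative; the device's actual implementation is not given in the abstract):

```python
import cmath
import math

def goertzel(samples, sample_rate, target_freq):
    """Single-bin DFT via the Goertzel recurrence: returns the
    complex spectral component at target_freq, from which
    magnitude and phase of an impedance response can be taken."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Combine the two final recurrence states into the complex bin
    return s_prev - s_prev2 * cmath.exp(-1j * w)

# A 50 Hz unit-amplitude tone sampled at 1 kHz:
# the bin magnitude is N/2, so the normalized amplitude is ~1.0
fs, f0, n = 1000, 50, 1000
sig = [math.cos(2 * math.pi * f0 * i / fs) for i in range(n)]
X = goertzel(sig, fs, f0)
print(2 * abs(X) / n)
```

    Only one bin is computed per call, so evaluating magnitude and phase at a handful of sweep frequencies costs O(n) per frequency instead of an O(n log n) FFT over all bins.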

  9. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    Directory of Open Access Journals (Sweden)

    André Cavalcante

    Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios.

  10. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    Science.gov (United States)

    Cavalcante, André; Mansouri, Ahmed; Kacha, Lemya; Barros, Allan Kardec; Takeuchi, Yoshinori; Matsumoto, Naoji; Ohnishi, Noboru

    2014-01-01

    Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios. PMID:24498292
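    One of the local statistics involved can be sketched concretely: per-patch RMS contrast over a grayscale image. This is an assumption-laden stand-in for the authors' pipeline (which also segments spatial frequency and combines several statistics); the function name and patch size are illustrative choices.

```python
import numpy as np

def local_rms_contrast(img, patch=16):
    """Mean of per-patch RMS contrast (std/mean) over a grayscale
    image -- a simplified sketch of one local statistic from which
    a streetscape complexity score could be built."""
    h, w = img.shape
    vals = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            p = img[i:i + patch, j:j + patch].astype(float)
            m = p.mean()
            if m > 0:
                vals.append(p.std() / m)
    return float(np.mean(vals))

rng = np.random.default_rng(0)
flat = np.full((64, 64), 128.0)         # uniform gray: zero contrast
busy = rng.integers(0, 256, (64, 64))   # noisy scene: high contrast
print(local_rms_contrast(flat), local_rms_contrast(busy))
```

    A visually busy scene yields a much larger mean local contrast than a flat one, which is the direction of effect the study relies on.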

  11. Complexity

    CERN Document Server

    Gershenson, Carlos

    2011-01-01

    The term complexity derives etymologically from the Latin plexus, which means interwoven. Intuitively, this implies that something complex is composed of elements that are difficult to separate. This difficulty arises from the relevant interactions that take place between components. This lack of separability is at odds with the classical scientific method - which has been used since the times of Galileo, Newton, Descartes, and Laplace - and has also influenced philosophy and engineering. In recent decades, the scientific study of complexity and complex systems has proposed a paradigm shift in science and philosophy, with novel methods that take relevant interactions into account.

  12. Complex-optical-field lidar system for range and vector velocity measurement.

    Science.gov (United States)

    Gao, Shuang; O'Sullivan, Maurice; Hui, Rongqing

    2012-11-01

    A coherent lidar system based on measurement of the complex optical field is demonstrated for the first time. An electro-optic in-phase/quadrature (I/Q) modulator is used in the lidar transmitter to realize carrier-suppressed complex optical-field modulation in which the positive and negative optical sidebands can carry independent modulation waveforms. A fiber-optic 90° hybrid is used in the lidar receiver for coherent heterodyne detection and to recover the complex optical field. By loading a constant modulation frequency on the lower optical sideband and a wideband linear frequency chirp on the upper sideband, vector velocity and target distance can be measured independently. The wide modulation bandwidth of this lidar system also enables unprecedented range resolution and the capability of measuring high velocities unambiguously. PMID:23187404

  13. Classification of periodic, chaotic and random sequences using approximate entropy and Lempel–Ziv complexity measures

    Indian Academy of Sciences (India)

    Karthi Balasubramanian; Silpa S Nair; Nithin Nagaraj

    2015-03-01

    ‘Complexity’ has several definitions in diverse fields. Complexity measures are indicators of some aspects of the nature of a signal; they are used to analyse and classify signals and serve as diagnostic tools to distinguish between periodic, quasiperiodic, chaotic and random signals. Lempel–Ziv (LZ) complexity and approximate entropy (ApEn) are two such popular complexity measures that are also widely used for characterizing biological signals. In this paper, we compare the utility of ApEn, LZ complexity and Shannon’s entropy in characterizing data from a nonlinear chaotic map (the logistic map). We show that the LZ and ApEn complexity measures can characterize data complexity correctly for sequences as short as 20 symbols, while Shannon’s entropy fails for lengths less than 50. For noisy sequences with 10% uniform noise, Shannon’s entropy works only for lengths greater than 200, while LZ and ApEn are successful with sequences of lengths greater than 30 and 20, respectively.
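    The comparison described, LZ complexity versus Shannon entropy on logistic-map data, can be reproduced in miniature. The LZ76 phrase-counting parse below is a standard textbook implementation, not the authors' code, and binarizing the map at the threshold 0.5 is one common convention.

```python
import math

def lz76_complexity(s):
    """Number of distinct phrases in the Lempel-Ziv (1976)
    exhaustive parsing of string s -- the counter behind the
    LZ complexity measure."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it already occurs earlier
        # (overlap with the current phrase is allowed)
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def shannon_entropy(s):
    """Shannon entropy (bits/symbol) of the symbol distribution."""
    probs = [s.count(ch) / len(s) for ch in set(s)]
    return -sum(p * math.log2(p) for p in probs)

# Binary sequence from the chaotic logistic map x -> 4x(1-x)
x, bits = 0.1, []
for _ in range(200):
    x = 4 * x * (1 - x)
    bits.append('1' if x > 0.5 else '0')
chaotic = ''.join(bits)
periodic = '01' * 100

print(lz76_complexity(periodic), lz76_complexity(chaotic))
print(shannon_entropy(periodic), shannon_entropy(chaotic))
```

    Note the failure mode the paper exploits: the periodic string has near-maximal Shannon entropy (both symbols equally frequent) yet an LZ complexity of only 3, so entropy alone cannot separate regular from chaotic data.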

  14. An attractor-based complexity measurement for Boolean recurrent neural networks.

    Directory of Open Access Journals (Sweden)

    Jérémie Cabessa

    We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that represents an assessment of their computational power in terms of the significance of their attractor dynamics. This complexity measurement is achieved by first proving a computational equivalence between Boolean recurrent neural networks and some specific class of ω-automata, and then translating the most refined classification of ω-automata to the Boolean neural network context. As a result, a hierarchical classification of Boolean neural networks based on their attractive dynamics is obtained, thus providing a novel refined attractor-based complexity measurement for Boolean recurrent neural networks. These results provide new theoretical insights to the computational and dynamical capabilities of neural networks according to their attractive potentialities. An application of our findings is illustrated by the analysis of the dynamics of a simplified model of the basal ganglia-thalamocortical network simulated by a Boolean recurrent neural network. This example shows the significance of measuring network complexity, and how our results bear new founding elements for the understanding of the complexity of real brain circuits.

  15. An attractor-based complexity measurement for Boolean recurrent neural networks.

    Science.gov (United States)

    Cabessa, Jérémie; Villa, Alessandro E P

    2014-01-01

    We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that represents an assessment of their computational power in terms of the significance of their attractor dynamics. This complexity measurement is achieved by first proving a computational equivalence between Boolean recurrent neural networks and some specific class of ω-automata, and then translating the most refined classification of ω-automata to the Boolean neural network context. As a result, a hierarchical classification of Boolean neural networks based on their attractive dynamics is obtained, thus providing a novel refined attractor-based complexity measurement for Boolean recurrent neural networks. These results provide new theoretical insights to the computational and dynamical capabilities of neural networks according to their attractive potentialities. An application of our findings is illustrated by the analysis of the dynamics of a simplified model of the basal ganglia-thalamocortical network simulated by a Boolean recurrent neural network. This example shows the significance of measuring network complexity, and how our results bear new founding elements for the understanding of the complexity of real brain circuits.

  16. An Activation Force-based Affinity Measure for Analyzing Complex Networks

    OpenAIRE

    Jun Guo; Hanliang Guo; Zhanyi Wang

    2011-01-01

    Affinity measure is a key factor that determines the quality of the analysis of a complex network. Here, we introduce a type of statistics, activation forces, to weight the links of a complex network and thereby develop a desired affinity measure. We show that the approach is superior in facilitating the analysis through experiments on a large-scale word network and a protein-protein interaction (PPI) network consisting of ∼5,000 human proteins. The experiment on the word network verifies tha...

  17. Silicon Isotope Fractionation During Acid Water-Igneous Rock Interaction

    Science.gov (United States)

    van den Boorn, S. H.; van Bergen, M. J.; Vroon, P. Z.

    2007-12-01

    Silica enrichment by metasomatic/hydrothermal alteration is a widespread phenomenon in crustal environments where acid fluids interact with silicate rocks. High-sulfidation epithermal ore deposits and acid-leached residues at hot-spring settings are among the best known examples. Acid alteration acting on basalts has also been invoked to explain the relatively high silica contents of the surface of Mars. We have analyzed basaltic-andesitic lavas from the Kawah Ijen volcanic complex (East Java, Indonesia) that were altered by interaction with the highly acid (pH ~1) sulfate-chloride water of its crater lake and seepage stream. Quantitative removal of major elements during this interaction has led to a relative increase in SiO2 contents. Our silicon isotope data, obtained by HR-MC-ICPMS and reported relative to the NIST RM8546 (=NBS28) standard, show a systematic increase in δ30Si from -0.2‰ (±0.3, 2sd) for unaltered andesites and basalts to +1.5‰ (±0.3, 2sd) for the most altered/silicified rocks. These results demonstrate that silicification induced by pervasive acid alteration is accompanied by significant Si isotope fractionation, so that altered products become isotopically heavier than the precursor rocks. Despite the observed enrichment in SiO2, the rocks have experienced an overall net loss of silicon upon alteration, if Nb is considered perfectly immobile. The observed δ30Si values of the alteration products correlate well with the inferred amounts of silicon loss. These findings suggest that 28Si is preferentially leached during water-rock interaction, implying that dissolved silica in the ambient lake and stream water is isotopically light. However, layered opaline lake sediments, believed to represent precipitates from the silica-saturated water, show a conspicuous 30Si enrichment (+1.2 ± 0.2‰). Because inorganic precipitation is known to discriminate against the heavy isotope (e.g. Basile-Doelsch et al., 2006

  18. Solubilities of Isophthalic Acid in Acetic Acid + Water Solvent Mixtures

    Institute of Scientific and Technical Information of China (English)

    CHENG Youwei; HUO Lei; LI Xi

    2013-01-01

    The solubilities of isophthalic acid (1) in binary acetic acid (2) + water (3) solvent mixtures were determined in a pressurized vessel. The temperature range was from 373.2 to 473.2 K, and the mole fraction of acetic acid in the solvent mixtures ranged from x2 = 0 to 1. A new method to measure the solubility was developed, which solved the problem of sampling at high temperature. The experimental results indicated that, within the temperature range studied, the solubilities of isophthalic acid in all mixtures increased with increasing temperature. The experimental solubilities were correlated by the Buchowski equation, and the calculated results showed good agreement with the experimental solubilities. Furthermore, the mixed solvent systems were found to exhibit a maximum solubility effect, which may be attributed to intermolecular association between the solute and the solvent mixture. The maximum solubility effect was well modeled by the modified Wilson equation.

  19. [Investigation of the efficacy of electrolyzed acid water on the standard strains of some pathogenic microorganisms].

    Science.gov (United States)

    Ileri, Ciğdem; Sezen, Yavuz; Dimoglo, Anatoli

    2006-10-01

    Many studies have indicated that electrolyzed acid water (EAW) has strong microbicidal activity. In this study, EAW was obtained by exposing a mixture of NaCl (10 g/L) and tap water to a direct electric current (2 amperes) for 15 minutes, in an instrument designed by the study group. EAW was tested for its inactivation efficacy on standard strains of Staphylococcus aureus, Candida albicans and Pseudomonas aeruginosa at different concentrations and for different periods (0, 10, 30 and 60 seconds). The EAW dilutions were prepared using sterile deionized water at 100% (undiluted), 20%, 10%, 5%, 2% and 1%, while deionized water alone was used as the control. The oxidation-reduction potential, pH, and free chloride content were measured separately for the different EAW concentrations. The UNE-EN 1276 standard was used to investigate the inhibitory efficacy of EAW on S. aureus ATCC 29213, C. albicans ATCC 10231 and P. aeruginosa ATCC 9027 by the membrane filtration method. All of the microorganisms were completely inactivated within 10 seconds at all EAW concentrations except the 1% dilution. However, after treatment with 1% EAW for 60 seconds, average surviving populations of 4.09 log cfu/ml, 4.56 log cfu/ml, and 3.62 log cfu/ml were determined for S. aureus, C. albicans and P. aeruginosa, respectively. Our data showed that a 2% concentration of EAW had a bactericidal effect and may be used for surface disinfection in practice.

  20. Size Distribution Studies on Sulfuric Acid-Water Particles in a Photolytic Reactor

    Science.gov (United States)

    Abdullahi, H. U.; Kunz, J. C.; Hanson, D. R.; Thao, S.; Vences, J.

    2015-12-01

    The size distributions of particles composed of sulfuric acid and water were measured in a photolytic cylindrical flow reactor (PhoFR, inner diameter 5 cm, length ~100 cm). In the reactor, nitrous acid, water and sulfur dioxide gases, along with ultraviolet light, produced sulfuric acid. The particles formed from these vapors were detected with a scanning mobility particle spectrometer equipped with a diethylene glycol condensation particle counter (Jiang et al. 2011). For a set of standard conditions, particles attained a log-normal distribution with a peak diameter of 6 nm and a total number of about 3×10^5 cm^-3. The distributions show that ~70% of the particles are between 4 and 8 nm diameter (lnσ ~ 0.37). These standard conditions are: 296 K, 25% relative humidity, total flow = 3 sLpm, ~10 ppbv HONO, SO2 in excess. With variations of relative humidity, the total particle number varied strongly, following a power relationship of ~3.5, and the size distributions showed a slight increase in peak diameter with relative humidity, increasing about 1 nm from 8 to 33% relative humidity. Variations of HONO at constant light intensity (wavelength ~360 nm) changed particle size and total number dramatically. Size distributions also changed drastically with variations of light intensity, accomplished by turning some of the blacklight fluorescent bulbs that illuminated the flow reactor on and off. Comparisons of these size distributions to recently published nucleation experiments (e.g. Zollner et al., Glasoe et al.) as well as to simulations of PhoFR reveal important details about the levels of sulfuric acid present in PhoFR, as well as possible base contaminants.
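    The quoted numbers can be cross-checked: treating the 6 nm peak as the median of a lognormal distribution with ln σ ≈ 0.37 (an approximation, since the mode of a lognormal sits slightly below its median), the fraction of particles between 4 and 8 nm follows directly from the normal CDF. The result comes out near 0.65, broadly consistent with the ~70% quoted.

```python
import math

def lognormal_fraction(d_lo, d_hi, d_med, ln_sigma):
    """Fraction of a lognormal size distribution (median diameter
    d_med, geometric width ln_sigma) lying between d_lo and d_hi."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (phi(math.log(d_hi / d_med) / ln_sigma)
            - phi(math.log(d_lo / d_med) / ln_sigma))

# Parameters quoted in the abstract: peak ~6 nm, ln(sigma) ~0.37
frac = lognormal_fraction(4, 8, 6, 0.37)
print(round(frac, 2))
```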

  1. Comparing entropy with tests for randomness as a measure of complexity in time series

    CERN Document Server

    Gan, Chee Chun

    2015-01-01

    Entropy measures have become increasingly popular as an evaluation metric for complexity in the analysis of time series data, especially in physiology and medicine. Entropy measures the rate of information gain, or degree of regularity, in a time series, e.g. a heartbeat. Ideally, entropy should be able to quantify the complexity of any underlying structure in the series, as well as determine whether the variation arises from a random process. Unfortunately, most current entropy measures are unable to perform the latter differentiation; thus, a high entropy score indicates a random or chaotic series, whereas a low score indicates a high degree of regularity. This leads to the observation that current entropy measures are equivalent to evaluating how random a series is, or conversely its degree of regularity. This raises the possibility that existing tests for randomness, such as the runs test or permutation test, may have similar utility in diagnosing certain conditions. This paper compares various t...
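    One of the randomness tests mentioned, the runs test, is easy to state concretely: count the runs of identical symbols and compare with the count expected under randomness. The Wald-Wolfowitz z-score sketch below is illustrative, not the paper's implementation.

```python
import math

def runs_test_z(bits):
    """Wald-Wolfowitz runs test z-score for a binary sequence:
    compares the observed number of runs with its expectation
    under randomness (|z| far from 0 suggests non-random
    structure; the sign tells whether there are too many or
    too few runs)."""
    n1, n2 = bits.count(1), bits.count(0)
    n = n1 + n2
    runs = 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)
    mean = 2 * n1 * n2 / n + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return (runs - mean) / math.sqrt(var)

# A maximally regular alternating sequence has far too many runs,
# so the test flags it as non-random despite maximal entropy
alternating = [0, 1] * 50
print(round(runs_test_z(alternating), 2))
```

    Like the LZ comparison above it, this illustrates the complementary information such tests provide: the alternating sequence has maximal symbol entropy yet fails the runs test decisively.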

  2. Effects of lability of metal complex on free ion measurement using DMT.

    Science.gov (United States)

    Weng, Liping; Van Riemsdijk, Willem H; Temminghoff, Erwin J M

    2010-04-01

    Very low concentrations of free metal ion in natural samples can be measured using the Donnan membrane technique (DMT) based on ion transport kinetics. In this paper, the possible effects of slow dissociation of metal complexes on the interpretation of kinetic DMT are investigated both theoretically and experimentally. Expressions for the lability parameter, L, were derived for DMT. Analysis of new experimental studies using a synthetic solution containing NTA as the ligand and Cu(2+) ions shows that when the ionic strength is low (DMT measurement. In natural waters, dissolved organic matter (DOM) is the most important source of ligands that complex metals. By comparing the fraction of labile species measured using other dynamic sensors (DGT, GIME) in several freshwaters, it is concluded that in most waters ion transport in DMT is controlled by diffusion in the membrane. Only in very soft waters (DMT. In this case, neglecting this effect may lead to an underestimation of the free metal ion concentration.

  3. Nano Ferrites Microwave Complex Permeability and Permittivity Measurements by T/R Technique in Waveguide

    OpenAIRE

    Obol, Mahmut; Al-Moayed, Nawaf; Khan, Usman A.; Afsar, Mohammed N.

    2007-01-01

    There is huge demand to accurately determine the magneto-electrical properties of particles in the nano-sized regime, driven by modern IC technology and biomedical applications. In this paper, we present a microwave waveguide measurement technique for the complex permeability and permittivity of expensive nano-sized magnetic powder materials. In the measurement process, Agilent's 8510C vector network analyzer was used with a standard TRL calibration for free space inside the...

  4. An approach to measuring adolescents' perception of complexity for pictures of fruit and vegetable mixes

    DEFF Research Database (Denmark)

    Mielby, Line Holler; Bennedbæk-Jensen, Sidsel; Edelenbos, Merete;

    2013-01-01

    adolescents' perception of complexity of pictures of fruit and vegetable mixes. A sensory panel evaluated 10 descriptive attributes, including simplicity and complexity, for 24 pictures of fruit and vegetable mixes. The descriptive analysis found strong inverse correlation between complexity and simplicity....... An adolescent consumer group (n = 242) and an adult consumer group (n = 86) subsequently rated the pictures on simplicity and attractiveness. Pearson's correlation coefficients revealed strong correlations between the sensory panel and both consumer groups' usage of simplicity. This suggests that simplicity can...... be used to measure perceived complexity. In relation to attractiveness, different optimal levels of simplicity of pictures of fruit mixes were found for segments of the adolescent consumer group....

  5. The precision of visual memory for a complex contour shape measured by a freehand drawing task.

    Science.gov (United States)

    Osugi, Takayuki; Takeda, Yuji

    2013-03-01

    Contour information is an important source for object perception and memory. Three experiments examined the precision of visual short-term memory for complex contour shapes. All used a new procedure that assessed recall memory for holistic information in complex contour shapes: participants studied, then reproduced (without cues), a contoured shape by freehand drawing. In Experiment 1, memory precision was measured by comparing Fourier descriptors for studied and reproduced contours. Results indicated survival of lower (holistic) frequency information (i.e., ≤5 cycles/perimeter) and loss of higher (detail) frequency information. Secondary tasks placed demands on either verbal memory (Experiment 2) or visual spatial memory (Experiment 3). Neither secondary task interfered with recall of complex contour shapes, suggesting that the memory system maintaining holistic shape information is independent of both the verbal memory system and the visual spatial memory subsystem of visual short-term memory. The nature of memory for complex contour shape is discussed. PMID:23296198
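    Fourier descriptors of the kind used to score the drawings can be sketched as follows: treat the closed contour as a complex sequence, take its FFT, and keep only the low-frequency (holistic) band. The normalization here handles translation and scale only; rotation and starting-point invariance, which a real comparison would also need, are omitted, and the function name is illustrative.

```python
import numpy as np

def fourier_descriptors(xs, ys, n_keep=5):
    """Low-frequency Fourier descriptors of a closed contour,
    normalized for translation and scale -- a minimal sketch of
    the holistic-shape representation (assumes the first
    harmonic is nonzero, true for any non-degenerate contour)."""
    z = np.asarray(xs, float) + 1j * np.asarray(ys, float)
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                  # drop translation (DC term)
    coeffs /= np.abs(coeffs[1])      # scale by the first harmonic
    # keep n_keep positive and negative harmonics (holistic band,
    # matching the <=5 cycles/perimeter band found to survive)
    return np.concatenate([coeffs[1:n_keep + 1], coeffs[-n_keep:]])

# Example: a circle's energy sits entirely in the first harmonic
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
fd = fourier_descriptors(np.cos(t), np.sin(t))
print(np.round(np.abs(fd), 3))
```

    Memory precision can then be scored as a distance between the descriptor vectors of the studied and reproduced contours, band by band.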

  6. Counterions release from electrostatic complexes of polyelectrolytes and proteins of opposite charge : a direct measurement

    CERN Document Server

    Gummel, Jérémie; Boué, François

    2009-01-01

    Though often considered one of the main driving processes in the complexation of species of opposite charge, the release of counterions has never been directly measured experimentally on polyelectrolyte/protein complexes. We present here the first structural determination of such a release, by Small Angle Neutron Scattering, in complexes made of lysozyme, a positively charged protein, and PSS, a negatively charged polyelectrolyte. Both components have the same neutron scattering length density, so their scattering can be switched off simultaneously in an appropriate "matching" solvent; this enables determination of the spatial distribution of the single counterions within the complexes. The counterions (including those subject to Manning condensation) are expelled from the cores, where the species are at electrostatic stoichiometry.

  7. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    This paper investigates information-theoretic network complexity measures, which have already been used intensively in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far, but many of them lack a meaningful interpretation, e.g., it is unclear which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between selected information measures for graphs by performing a large-scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure, the topological information content of a graph, and others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, high uniqueness is an important and desirable property when designing novel topological descriptors with the potential to be applied to large chemical databases.

  8. Using measures of information content and complexity of time series as hydrologic metrics

    Science.gov (United States)

    Information theory has previously been used to develop metrics that characterize temporal patterns in soil moisture dynamics, and to evaluate and compare the performance of soil water flow models. The objective of this study was to apply information and complexity measures to characte...

  9. Effects of Lability of Metal Complex on Free Ion Measurement Using DMT

    NARCIS (Netherlands)

    Weng, L.P.; Riemsdijk, van W.H.; Temminghoff, E.J.M.

    2010-01-01

    Very low concentrations of free metal ion in natural samples can be measured using the Donnan membrane technique (DMT) based on ion transport kinetics. In this paper, the possible effects of slow dissociation of metal complexes on the interpretation of kinetic DMT are investigated both theoretically

  10. The Word Complexity Measure: Description and Application to Developmental Phonology and Disorders

    Science.gov (United States)

    Stoel-Gammon, Carol

    2010-01-01

    Miccio's work included a number of articles on the assessment of phonology in children with phonological disorders, typically using measures of correct articulation such as the PCC, or analyses of errors within the framework of phonological processes. This paper introduces an approach to assessing phonology by examining the phonetic complexity of…

  11. Complex decay patterns in atomic core photoionization disentangled by ion-recoil measurements

    Energy Technology Data Exchange (ETDEWEB)

    Guillemin, Renaud; Bomme, Cedric; Marin, Thierry; Journel, Loic; Marchenko, Tatiana; Kushawaha, Rajesh K.; Piancastelli, Maria Novella; Simon, Marc [Universite Pierre et Marie Curie, Universite Paris 06, Laboratoire de Chimie Physique Matiere et Rayonement, 11 rue Pierre et Marie Curie, FR-75231 Paris Cedex 05 (France); Centre National de la Recherche Scientifique, Laboratoire de Chimie Physique Matiere et Rayonement (UMR7614), 11 rue Pierre et Marie Curie, FR-75231 Paris Cedex 05 (France); Trcera, Nicolas [Synchrotron SOLEIL, l' Orme des Merisiers, Saint-Aubin, BP 48, FR-91192 Gif-sur-Yvette Cedex (France)

    2011-12-15

    Following core 1s ionization and resonant excitation of argon atoms, we measure the recoil energy of the ions due to momentum conservation during the emission of Auger electrons. We show that such ion momentum spectroscopy can be used to disentangle, to some degree, complex decay patterns involving both radiative and nonradiative decays.

  12. 2D and 3D endoanal and translabial ultrasound measurement variation in normal postpartum measurements of the anal sphincter complex

    Science.gov (United States)

    MERIWETHER, Kate V.; HALL, Rebecca J.; LEEMAN, Lawrence M.; MIGLIACCIO, Laura; QUALLS, Clifford; ROGERS, Rebecca G.

    2015-01-01

    Introduction Women may experience anal sphincter anatomy changes after vaginal or Cesarean delivery. Therefore, accurate and acceptable imaging options to evaluate the anal sphincter complex (ASC) are needed. ASC measurements may differ between translabial (TL-US) and endoanal ultrasound (EA-US) imaging and between 2D and 3D ultrasound. The objective of this analysis was to describe measurement variation between these modalities. Methods Primiparous women underwent 2D and 3D TL-US imaging of the ASC six months after a vaginal birth (VB) or Cesarean delivery (CD). A subset of women also underwent EA-US measurements. Measurements included the internal anal sphincter (IAS) thickness at proximal, mid, and distal levels and the external anal sphincter (EAS) at 3, 6, 9, and 12 o’clock positions as well as bilateral thickness of the pubovisceralis muscle (PVM). Results 433 women presented for US: 423 had TL-US and 64 had both TL-US and EA-US of the ASC. All IAS measurements were significantly thicker on TL-US than EA-US (all p0.20). On both TL-US and EA-US, there were multiple sites where significant asymmetry existed in left versus right measurements. Conclusion The ultrasound modality used to image the ASC introduces small but significant changes in measurements, and the direction of the bias depends on the muscle and location being imaged. PMID:25344221

  13. Measuring economic complexity of countries and products: which metric to use?

    Science.gov (United States)

    Mariani, Manuel Sebastian; Vidmer, Alexandre; Medo, Matúš; Zhang, Yi-Cheng

    2015-11-01

    Evaluating the economies of countries and their relations with products in the global market is a central problem in economics, with far-reaching implications for our theoretical understanding of international trade as well as for practical applications such as policy making and financial investment planning. The recent Economic Complexity approach aims to quantify the competitiveness of countries and the quality of exported products, based on the empirical observation that the most competitive countries have diversified exports, whereas developing countries export only a few low-quality products - typically those exported by many other countries. Two different metrics, Fitness-Complexity and the Method of Reflections, have been proposed to measure country and product scores in the Economic Complexity framework. We use international trade data and a recent ranking evaluation measure to quantitatively compare the ability of the two metrics to rank countries and products according to their importance in the network. The results show that the Fitness-Complexity metric outperforms the Method of Reflections in both the ranking of products and the ranking of countries. We also investigate a generalization of the Fitness-Complexity metric and show that it can produce improved rankings provided that the input data are reliable.
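    The Fitness-Complexity iteration has a compact standard form: a country's fitness sums the complexities of its products, while a product's complexity is the inverse of the summed inverse fitnesses of its exporters, with renormalization each round so only relative scores matter. A sketch on a toy triangular export matrix (the matrix, iteration count, and function name are illustrative):

```python
import numpy as np

def fitness_complexity(M, iters=50):
    """Fitness-Complexity iteration on a binary country-product
    matrix M (rows: countries, cols: products). Returns fitness
    per country and complexity per product, each normalized to
    mean 1 every round."""
    n_c, n_p = M.shape
    F, Q = np.ones(n_c), np.ones(n_p)
    for _ in range(iters):
        F_new = M @ Q                       # sum of product scores
        Q_new = 1.0 / (M.T @ (1.0 / F))     # penalize ubiquitous
        F, Q = F_new / F_new.mean(), Q_new / Q_new.mean()
    return F, Q

# Toy triangular matrix: country 0 exports everything (diversified),
# product 2 is exported only by that diversified country
M = np.array([[1, 1, 1],
              [1, 1, 0],
              [1, 0, 0]], float)
F, Q = fitness_complexity(M)
print(F.argmax(), Q.argmax())  # 0 2
```

    The nonlinear inverse in the product update is what distinguishes this metric from the linear Method of Reflections: a product exported by even one low-fitness country is heavily penalized, so the diversified country ends up fittest and the exclusive product most complex.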

  14. Microbial growth and biofilm formation in geologic media is detected with complex conductivity measurements

    Science.gov (United States)

    Davis, Caroline A.; Atekwana, Estella; Atekwana, Eliot; Slater, Lee D.; Rossbach, Silvia; Mormile, Melanie R.

    2006-09-01

    Complex conductivity measurements (0.1-1000 Hz) were obtained from biostimulated sand-packed columns to investigate the effect of microbial growth and biofilm formation on the electrical properties of porous media. Microbial growth was verified by direct microbial counts, pH measurements, and environmental scanning electron microscope imaging. Peaks in imaginary (interfacial) conductivity in the biostimulated columns were coincident with peaks in the microbial cell concentrations extracted from sands. However, the real conductivity component showed no discernible relationship to microbial cell concentration. We suggest that the observed dynamic changes in the imaginary conductivity (σ″) arise from the growth and attachment of microbial cells and biofilms to sand surfaces. We conclude that complex conductivity techniques, specifically imaginary conductivity measurements, are a proxy indicator for microbial growth and biofilm formation in porous media. Our results have implications for microbial enhanced oil recovery, CO2 sequestration, bioremediation, and astrobiology studies.

  15. Recurrence Plot Based Measures of Complexity and its Application to Heart Rate Variability Data

    CERN Document Server

    Marwan, N; Meyerfeldt, U; Schirdewan, A; Kurths, J

    2002-01-01

    In complex systems the knowledge of transitions between regular, laminar or chaotic behavior is essential to understand the processes going on there. Linear approaches are often not sufficient to describe these processes and several nonlinear methods require rather long time observations. To overcome these difficulties, we propose measures of complexity based on vertical structures in recurrence plots and apply them to the logistic map as well as to heart rate variability data. For the logistic map these measures enable us to detect transitions between chaotic and periodic states, as well as to identify additional laminar states, i.e. chaos-chaos transitions. Traditional recurrence quantification analysis fails to detect these latter transitions. Applying our new measures to the heart rate variability data, we are able to detect and quantify laminar phases before a life-threatening cardiac arrhythmia and, thus, to enable a prediction of such an event. Our findings could be of importance for the therapy of mal...
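The vertical-line measures this record proposes (laminarity and trapping time) are now standard in recurrence quantification analysis. A minimal sketch of laminarity on logistic-map data follows, assuming the usual recurrence-plot definition R[i][j] = 1 iff |x_i - x_j| <= eps; the thresholds and parameter values are illustrative, not those of the paper:

```python
def logistic_series(r, x0=0.4, n=200, burn=200):
    """Iterate the logistic map x -> r*x*(1-x), discarding a transient."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

def laminarity(x, eps, v_min=2):
    """Fraction of recurrent points lying in vertical lines of length >= v_min
    in the recurrence plot R[i][j] = 1 iff |x_i - x_j| <= eps."""
    n = len(x)
    R = [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)] for i in range(n)]
    total = sum(sum(row) for row in R)
    in_lines = 0
    for j in range(n):          # scan each column for vertical runs
        run = 0
        for i in range(n + 1):
            if i < n and R[i][j]:
                run += 1
            else:
                if run >= v_min:
                    in_lines += run
                run = 0
    return in_lines / total if total else 0.0

lam_periodic = laminarity(logistic_series(3.5), eps=0.01)  # period-4 regime
lam_chaotic = laminarity(logistic_series(4.0), eps=0.01)   # chaotic regime
```

In the periodic regime every recurrence is an isolated point in its column, so laminarity vanishes, whereas chaotic orbits can produce laminar (tangential) segments, which is what makes these measures sensitive to chaos-chaos transitions.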

  16. Complex hand dexterity: a review of biomechanical methods for measuring musical performance.

    Science.gov (United States)

    Metcalf, Cheryl D; Irvine, Thomas A; Sims, Jennifer L; Wang, Yu L; Su, Alvin W Y; Norris, David O

    2014-01-01

    Complex hand dexterity is fundamental to our interactions with the physical, social, and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress of the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods allow us insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks present the ultimate achievement in complex hand dexterity. This paper will review methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities. PMID:24860531

  17. Complex Hand Dexterity: A Review of Biomechanical Methods for Measuring Musical Performance

    Directory of Open Access Journals (Sweden)

    Cheryl Diane Metcalf

    2014-05-01

    Complex hand dexterity is fundamental to our interactions with the physical, social and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress of the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods allow us insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks present the ultimate achievement in complex hand dexterity. This paper will review methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities.

  18. Spin state behavior of iron(II)/dipyrazolylpyridine complexes. New insights from crystallographic and solution measurements

    OpenAIRE

    Kershaw Cook, L; R. Mohammed; Sherborne, G; Roberts, TD; Alvarez, S.; Halcrow, MA

    2015-01-01

    The isomeric complexes [Fe(1-bpp)2]2+ and [Fe(3-bpp)2]2+ (1-bpp=2,6-di[pyrazol-1-yl]pyridine; 3-bpp=2,6-di[1H-pyrazol-3-yl]pyridine) and their derivatives are some of the most widely investigated complexes in spin-crossover research. This article addresses two unique aspects of their spin-state chemistry. First, is an unusual structural distortion in the high-spin form that can inhibit spin-crossover in the solid state. A new analysis of these structures using continuous shape measures has ex...

  19. A quantitative measure, mechanism and attractor for self-organization in networked complex systems

    CERN Document Server

    Georgiev, Georgi Yordanov

    2012-01-01

    Quantity of organization in complex networks is measured here as the inverse of the average sum of physical actions of all elements per unit motion, multiplied by Planck's constant. The meaning of quantity of organization is the inverse of the number of quanta of action per one unit motion of an element. This definition can be applied to the organization of any complex system. Systems self-organize to decrease the average action per element per unit motion. This lowest action state is the attractor for the continuous self-organization and evolution of a dynamical complex system. Constraints increase this average action, and constraint minimization by the elements is a basic mechanism for action minimization. An increase in the number of elements in a network leads to faster constraint minimization through grouping, a decrease of the average action per element and motion, and therefore an accelerated rate of self-organization. Progressive development, as self-organization, is a process of minimization of action.

  20. Multi-attribute integrated measurement of node importance in complex networks

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    Measuring node importance in complex networks is central to research on network stability and robustness, and it also helps ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the obtained results reflect only certain aspects of the network, with a loss of information. Moreover, because network topologies differ, node importance should be described in combination with the topological character of the network. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, clustering coefficient, and topology potential, and proposes an integrated method to measure node importance. This method reflects the internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate and dolphin networks show that the integrated topology-based measure yields a smaller range of results than any single indicator and is more universally applicable. Attack experiments on the North American power grid and the Internet show that the method converges faster than other methods.
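An integrated importance score of this kind can be sketched as a weighted combination of single indicators. The sketch below combines degree centrality, closeness centrality, and the local clustering coefficient; the paper's topology-potential term is omitted, and the equal weights are an assumption:

```python
from collections import deque

def degree_centrality(adj):
    """Degree of each node, normalized by n-1 (adj: dict node -> set of neighbors)."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj, v):
    """(reachable nodes) / (sum of BFS distances) from v."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

def clustering_coefficient(adj, v):
    """Fraction of neighbor pairs of v that are themselves connected."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
    return 2.0 * links / (k * (k - 1))

def integrated_importance(adj, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Weighted sum of the three indicators for every node."""
    wd, wc, wk = weights
    dc = degree_centrality(adj)
    return {v: wd * dc[v]
               + wc * closeness_centrality(adj, v)
               + wk * clustering_coefficient(adj, v)
            for v in adj}

# Star graph: node 0 is the hub and should come out most important.
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
scores = integrated_importance(star)
```

The design point the abstract makes is visible here: any single indicator can be degenerate (clustering is zero everywhere on a star), but the combination still ranks the hub first.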

  1. Measurement and Hemodialysis Effect of Complex Relative Permittivity for Blood of Kidney Patients Using Open-Ended Coaxial Measurement Probe

    Science.gov (United States)

    Takeda, Akira; Takata, Kazuyuki; Nagao, Hirotomo; Wang, Jianqing; Fujiwara, Osamu

    Before evaluating the quality of hemodialysis from the limited volume of human blood using a commercially available open-ended coaxial probe, we previously measured the complex relative permittivity of pure water from 200 MHz to 6 GHz with respect to its measured liquid volume, and revealed that 1.9 ml water in a beaker with a diameter of 24 mm and a depth of 2 mm gives a variation within ±0.5 % for the real part and ±7 % for the imaginary part. Based on the above finding, we measured the dielectric properties of 2.5 ml whole blood at 25°C for 10 normal healthy subjects and 9 hemodialysis patients. The measured results on healthy subjects show good agreement with the data reported by Gabriel for human blood at 37°C, while they provide different dispersion characteristics of straight lines for their Cole-Cole plots. The measured results on the patients give further different dispersion characteristics in comparison with the healthy subjects. To investigate these differences statistically, Student's t-test was conducted, revealing that the permittivity at infinite frequency for the Cole-Cole plots differs significantly, at the 1 % level, between the averaged values for normal healthy subjects and patients before dialysis.
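The Cole-Cole analysis referred to above rests on the standard Cole-Cole relaxation model. A minimal sketch follows; the water-like parameter values (static and infinite-frequency permittivity, relaxation time) are approximate literature values used only as an illustration:

```python
import math

def cole_cole(f_hz, eps_inf, eps_s, tau_s, alpha):
    """Cole-Cole model of complex relative permittivity:
    eps*(w) = eps_inf + (eps_s - eps_inf) / (1 + (j*w*tau)**(1 - alpha)),
    with w = 2*pi*f. alpha = 0 reduces to a single Debye relaxation."""
    w = 2.0 * math.pi * f_hz
    return eps_inf + (eps_s - eps_inf) / (1.0 + (1j * w * tau_s) ** (1.0 - alpha))

# Approximate values for pure water near 25 C (illustrative assumption):
EPS_S, EPS_INF, TAU = 78.4, 5.2, 8.3e-12

low = cole_cole(1e3, EPS_INF, EPS_S, TAU, 0.0)    # well below relaxation
high = cole_cole(1e15, EPS_INF, EPS_S, TAU, 0.0)  # well above relaxation
```

In a Cole-Cole plot (imaginary vs. real part over frequency), a pure Debye process (alpha = 0) traces a semicircle, while alpha > 0 depresses the arc; departures from the expected arc shape are the kind of dispersion difference the study reports between healthy subjects and patients.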

  2. Long-lifetime Ru(II) complexes for the measurement of high molecular weight protein hydrodynamics.

    Science.gov (United States)

    Szmacinski, H; Castellano, F N; Terpetschnig, E; Dattelbaum, J D; Lakowicz, J R; Meyer, G J

    1998-03-01

    We describe the synthesis and characterization of two asymmetrical ruthenium(II) complexes, [Ru(dpp)2(dcbpy)]2+ and [Ru(dpp)2(mcbpy)]2+, as well as the water soluble sulfonated derivatives [Ru(dpp(SO3Na)2)2(dcbpy)]2+ and [Ru(dpp(SO3Na)2)2(mcbpy)]2+ (dpp is 4,7-diphenyl-1,10-phenanthroline, dcbpy is 4,4'-dicarboxylic acid-2,2'-bipyridine, mcbpy is 4-methyl,4'-carboxylic acid-2,2'-bipyridine, and dpp(SO3Na)2 is the disulfonated derivative of dpp) as probes for the measurement of the rotational motions of proteins. The spectral (absorption, emission, and anisotropy) and photophysical (time-resolved intensity and anisotropy decays) properties of these metal-ligand complexes were determined in solution, in both the presence and absence of human serum albumin (HSA). These complexes display lifetimes ranging from 345 ns to 3.8 microseconds in deoxygenated aqueous solutions under a variety of conditions. The carboxylic acid groups on these complexes were activated to form N-hydroxysuccinimide (NHS) esters, which were used to covalently label HSA, and were characterized spectroscopically in the same manner as above. Time-resolved anisotropy measurements were performed to demonstrate the utility of these complexes in measuring long rotational correlation times of bioconjugates between HSA and antibody to HSA. The potential usefulness of these probes in fluorescence polarization immunoassays was demonstrated by an association assay of the Ru(II)-labeled HSA with polyclonal antibody. PMID:9546056

  3. Crater size-frequency distribution measurements and age of the Compton-Belkovich Volcanic Complex

    Science.gov (United States)

    Shirley, K. A.; Zanetti, M.; Jolliff, B.; van der Bogert, C. H.; Hiesinger, H.

    2016-07-01

    The Compton-Belkovich Volcanic Complex (CBVC) is a 25 × 35 km feature on the lunar farside marked by elevated topography, high albedo, high thorium concentration, and high silica content. Morphologies indicate that the complex is volcanic in origin and compositions indicate that it represents rare silicic volcanism on the Moon. Constraining the timing of silicic volcanism at the complex is necessary to better understand the development of evolved magmas and when they were active on the lunar surface. We employ image analysis and crater size-frequency distribution (CSFD) measurements on several locations within the complex and at surrounding impact craters, Hayn (87 km diameter), and Compton (160 km diameter), to determine relative and absolute model ages of regional events. Using CSFD measurements, we establish a chronology dating regional resurfacing events and the earliest possible onset of CBVC volcanism at ∼3.8 Ga, the formation of Compton Crater at 3.6 Ga, likely resurfacing by volcanism at the CBVC at ∼3.5 Ga, and the formation of Hayn Crater at ∼1 Ga. For the CBVC, we find the most consistent results are obtained using craters larger than 300 m in diameter; the small crater population is affected by their approach to an equilibrium condition and by the physical properties of regolith at the CBVC.

  4. Theoretical Study on Measure of Hydrogen Bonding Strength: R-C≡N…pyrrole Complexes

    Institute of Scientific and Technical Information of China (English)

    史福强; 安静仪; 俞稼镛

    2005-01-01

    The R-C≡N…pyrrole (R=H, CH3, CH2F, CHF2, CF3, NH2, BH2, OH, F, CH2Cl, CHCl2, CCl3, Li, Na) complexes were considered as the simple sample for measure of hydrogen bonding strength. Density functional theory B3LYP/6-311 + + G** level was applied to the optimization of geometries of complexes and monomers. Measure of hydrogen bonding strength based on geometrical and topological parameters, which were derived from the AIM theory, was analyzed. Additionally, natural bond orbital (NBO) analysis and frequency calculations were performed.From the computation results it was found that the electronic density at N-H bond critical points was also strictly correlated with the hydrogen bonding strength.

  5. Progress on Simultaneous PLIF/PIV Measurements for a Turbulent Complex Fluid Interface

    Science.gov (United States)

    Reilly, David; Mohaghar, Mohammad; Carter, John; McFarland, Jacob; Ranjan, Devesh

    2015-11-01

    Experiments were performed at the inclined shock tube facility at Georgia Institute of Technology to study a Richtmyer-Meshkov unstable complex interface. The complex density stratification was achieved by counter flowing N2 over CO2 in order to create shear and buoyancy effects. The resulting Atwood number is 0.23 with an incident shock strength of Mach 1.55 and an angle of inclination of 80°. High-resolution, full-field simultaneous Planar Laser-Induced Fluorescence (PLIF) and Particle Image Velocimetry (PIV) was employed to measure density and velocity statistics, respectively. For the first time with the inclined interface, mixing parameters from the BHR (Besnard-Harlow-Rauenzahn) model, including the density self-correlation and turbulent mass flux, are determined from experiments. Secondary modes added to the interface result in markedly greater mixing compared to the simple inclined interface as measured by mixedness and mixed mass.

  6. Quantification of spatial structure of human proximal tibial bone biopsies using 3D measures of complexity

    DEFF Research Database (Denmark)

    Saparin, Peter I.; Thomsen, Jesper Skovhus; Prohaska, Steffen;

    2005-01-01

    Changes in trabecular bone composition during development of osteoporosis are used as a model for bone loss in microgravity conditions during a space flight. Symbolic dynamics and measures of complexity are proposed and applied to assess quantitatively the structural composition of bone tissue from...... 3D data sets of human tibia bone biopsies acquired by a micro-CT scanner. In order to justify the newly proposed approach, the measures of complexity of the bone architecture were compared with the results of traditional 2D bone histomorphometry. The proposed technique is able to quantify...... the structural loss of the bone tissue and may help to diagnose and to monitor changes in bone structure of patients on Earth as well as of the space-flying personnel....

  7. Information entropy to measure the spatial and temporal complexity of solute transport in heterogeneous porous media

    Science.gov (United States)

    Li, Weiyao; Huang, Guanhua; Xiong, Yunwu

    2016-04-01

    The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, while physical and chemical interactions between groundwater and the porous media make solute transport in the medium more complicated still. An appropriate method to describe these complex features is essential when studying solute transport and transformation in porous media. Information entropy can measure uncertainty and disorder; we therefore used information entropy theory to investigate this complexity and explore the connection between information entropy and the complexity of solute transport in heterogeneous porous media. Based on Markov theory, a two-dimensional stochastic field of hydraulic conductivity (K) was generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source, and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that the entropy increased with the complexity of the solute transport process. For the point source, the one-dimensional entropy of solute concentration first increased and then decreased along the X and Y directions. As time increased, the entropy peak value remained basically unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of the solute concentration increase, which results in increases of the second-order spatial moment and the two-dimensional entropy. The information entropy of the line source was higher than that of the point source, and the solute entropy obtained from continuous input was higher than that from instantaneous input. Due to the increase of the average lithofacies length, media continuity increased, flow and
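The entropy-based characterization used here can be sketched directly: treat the normalized concentration field as a probability distribution over grid cells and take its Shannon entropy. The toy 1-D fields below are illustrative, not the paper's simulated plumes:

```python
from math import log

def concentration_entropy(c):
    """Shannon entropy (nats) of a non-negative solute-concentration field,
    normalized to a probability distribution over grid cells."""
    total = sum(c)
    probs = [ci / total for ci in c if ci > 0.0]
    return -sum(p * log(p) for p in probs)

# A point-source plume spreading over a 1-D grid: entropy grows as the
# solute occupies more cells more evenly.
early = [0.0, 0.0, 1.0, 0.0, 0.0]   # concentrated slug: zero entropy
late = [0.1, 0.2, 0.4, 0.2, 0.1]    # spread-out plume: higher entropy
```

A fully concentrated slug has zero entropy and a uniform plume has the maximum, log(number of cells), which matches the abstract's observation that entropy grows with spreading and mixing.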

  8. Wettability of reservoir rock and fluid systems from complex resistivity measurements

    Energy Technology Data Exchange (ETDEWEB)

    Moss, A.K.; Jing, X.D.; Archer, J.S. [Department of Earth Science and Engineering, Imperial College of Science, Technology and Medicine, London (United Kingdom)

    2002-04-01

    Electrical resistivity measurements at a single low AC frequency have long been recognized as providing an indication of the wettability of reservoir rock and fluid systems. However, the resistivity response over a range of frequencies for samples of varying wettability is not so well characterized. Data are presented from reservoir core plugs of differing lithologies, permeabilities, and wettabilities. The complex resistivity response at differing saturations and wettability was measured. This research group has been investigating relationships between complex resistivity, permeability, and clay content, described in previous research papers. This study extends that work to include wettability. Electrical resistivity measurements in the low-frequency range (10 Hz-10 kHz) include an electrode polarization effect. At frequencies between 10 and 200 kHz, the electrode polarization effect is reduced and the bulk sample response measured. An Argand diagram analysis is employed to find the critical frequency (f_c) separating the electrode polarization from the bulk sample response. Samples are tested in a multi-sample rig at hydrostatic reservoir overburden stresses. The test equipment allows the measurement of resistivity in the two or four electrode configurations over a frequency range from 10 Hz to 1 MHz during drainage and imbibition cycles. Multi-electrodes down the sample length allow saturation monitoring and thus the detection of any saturation inhomogeneity throughout the samples. Sample wettability is evaluated using the Amott-Harvey wettability index (AHWI) on adjacent samples and the change in the Archie saturation exponent before and after aging in crude oil. The effect of frequency dispersion was analysed in relation to pore-scale fluid distribution and, hence, wettability. The results suggest complex resistivity measurements have potential as a non-invasive technique to evaluate reservoir wettability.

  9. Quantifying the improvement of surrogate indices of hepatic insulin resistance using complex measurement techniques.

    Directory of Open Access Journals (Sweden)

    John G Hattersley

    We evaluated the ability of simple and complex surrogate indices to identify individuals from an overweight/obese cohort with hepatic insulin resistance (HEP-IR). Five indices, one previously defined and four newly generated through step-wise linear regression, were created against a single-cohort sample of 77 extensively characterised participants with the metabolic syndrome (age 55.6 ± 1.0 years, BMI 31.5 ± 0.4 kg/m²; 30 males). HEP-IR was defined by measuring endogenous glucose production (EGP) with [6,6-²H₂]glucose during fasting and euglycemic-hyperinsulinemic clamps and expressed as EGP × fasting plasma insulin. Complex measures were incorporated into the model, including various non-standard biomarkers and the measurement of body-fat distribution and liver fat, to further improve the predictive capability of the index. Validation was performed against a data set of the same subjects after an isoenergetic dietary intervention (4 arms, diets varying in protein and fiber content versus control). All five indices produced comparable prediction of HEP-IR, explaining 39-56% of the variance, depending on regression variable combination. The validation of the regression equations showed little variation between the different proposed indices (r² = 27-32%) on a matched dataset. New complex indices encompassing advanced measurement techniques offered an improved correlation (r = 0.75, P<0.001). However, when validated against the alternative dataset all indices performed comparably with the standard homeostasis model assessment for insulin resistance (HOMA-IR) (r = 0.54, P<0.001). Thus, simple estimates of HEP-IR performed comparably to more complex indices and could be an efficient and cost-effective approach in large epidemiological investigations.
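The baseline comparator in this study, HOMA-IR, has a standard closed form; a minimal sketch is below (the study's own regression-based HEP-IR indices are not reproduced here, since their coefficients come from the cohort data):

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uU_ml):
    """Homeostasis model assessment of insulin resistance:
    HOMA-IR = fasting glucose (mmol/L) * fasting insulin (uU/mL) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

# e.g. a fasting glucose of 5.0 mmol/L with fasting insulin of 10 uU/mL
index = homa_ir(5.0, 10.0)
```

The study's finding is that such a simple fasting-state estimate tracked hepatic insulin resistance about as well as indices built from clamp-derived and imaging-derived measures.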

  10. An Assessment of Wind Plant Complex Flows Using Advanced Doppler Radar Measurements

    Science.gov (United States)

    Gunter, W. S.; Schroeder, J.; Hirth, B.; Duncan, J.; Guynes, J.

    2015-12-01

    As installed wind energy capacity continues to steadily increase, the need for comprehensive measurements of wind plant complex flows to further reduce the cost of wind energy has been well advertised by the industry as a whole. Such measurements serve diverse perspectives including resource assessment, turbine inflow and power curve validation, wake and wind plant layout model verification, operations and maintenance, and the development of future advanced wind plant control schemes. While various measurement devices have been matured for wind energy applications (e.g. meteorological towers, LIDAR, SODAR), this presentation will focus on the use of advanced Doppler radar systems to observe the complex wind flows within and surrounding wind plants. Advanced Doppler radars can provide the combined advantage of a large analysis footprint (tens of square kilometers) with rapid data analysis updates (a few seconds to one minute) using both single- and dual-Doppler data collection methods. This presentation demonstrates the utility of measurements collected by the Texas Tech University Ka-band (TTUKa) radars to identify complex wind flows occurring within and nearby operational wind plants, and provide reliable forecasts of wind speeds and directions at given locations (i.e. turbine or instrumented tower sites) 45+ seconds in advance. Radar-derived wind maps reveal commonly observed features such as turbine wakes and turbine-to-turbine interaction, high momentum wind speed channels between turbine wakes, turbine array edge effects, transient boundary layer flow structures (such as wind streaks, frontal boundaries, etc.), and the impact of local terrain. Operational turbine or instrumented tower data are merged with the radar analysis to link the observed complex flow features to turbine and wind plant performance.

  11. An entropy-based measure of hydrologic complexity and its applications

    OpenAIRE

    Castillo, Aldrich; Castelli, Fabio; Entekhabi, Dara

    2015-01-01

    Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope‐scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy‐based and discretization‐invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (most organization) and a uniform distribution (widest...

  12. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    OpenAIRE

    Szi-Wen Chen

    2007-01-01

    A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For ...

  13. Instrumentation measurement and testing complex for detection and identification of radioactive materials using the emitted radiation

    International Nuclear Information System (INIS)

    Simultaneous measurement of neutron and gamma radiation is a very useful method for effective nuclear materials identification and control. The gamma-ray-neutron complex described in the paper is based on two multi-layer 3He neutron detectors and two High Pressure Xenon gamma-ray spectrometers assembled in one unit. All these detectors were calibrated with neutron and gamma-ray sources. The main characteristics of the instrumentation, its testing results, and the gamma-ray and neutron radiation parameters which have been measured are presented in the paper. The capability of reliable detection and identification of gamma-neutron sources and fissile materials was demonstrated.

  14. The Born rule from a consistency requirement on hidden measurements in complex Hilbert space

    CERN Document Server

    Aerts, S

    2002-01-01

    We formalize the hidden measurement approach within the very general notion of an interactive probability model. We narrow down the model by assuming the state space of a physical entity is a complex Hilbert space and introduce the principle of consistent interaction which effectively partitions the space of apparatus states. The normalized measure of the set of apparatus states that interact with a pure state giving rise to a fixed outcome is shown to be in accordance with the probability obtained using the Born rule.
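The Born rule that the consistency requirement recovers assigns outcome probabilities |⟨e_i|ψ⟩|² to a measurement in an orthonormal basis. A minimal numerical sketch for a finite-dimensional state (the two-level state below is an arbitrary illustration):

```python
def born_probabilities(state, basis):
    """Born-rule outcome probabilities p_i = |<e_i|psi>|^2 for a normalized
    state vector and an orthonormal basis of C^n (lists of amplitudes)."""
    def inner(a, b):
        # Hermitian inner product <a|b>: conjugate the first argument.
        return sum(ai.conjugate() * bi for ai, bi in zip(a, b))
    return [abs(inner(e, state)) ** 2 for e in basis]

# Equal superposition with a relative phase, measured in the standard basis:
psi = [2 ** -0.5, 1j * 2 ** -0.5]
probs = born_probabilities(psi, [[1, 0], [0, 1]])
```

For a normalized state the probabilities sum to one, which is exactly the normalization property the hidden-measurement construction must reproduce for its measure over apparatus states.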

  15. An improvement on measure methods of the complexity theory and its applications

    Institute of Scientific and Technical Information of China (English)

    Wang Fu-Lai; Yang Hui-Huang

    2009-01-01

    A new method is proposed to transform the time series gained from a dynamic system into a symbolic series which extracts both overall and local information of the time series. Based on the transformation, two measures are defined to characterize the complexity of the symbolic series. The measures reflect the sensitive dependence of chaotic systems on initial conditions and the randomness of a time series, and thus can distinguish periodic or completely random series from chaotic time series even when the time series are not long. Finally, the logistic map and the two-parameter Hénon map are studied and the results are satisfactory.
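A symbolization-plus-complexity pipeline of the general kind described can be sketched as follows. The binary partition and the word-entropy rate used here are common choices taken for illustration; they are not necessarily the paper's two measures:

```python
from math import log2

def logistic_series(r, x0=0.3, n=300, burn=300):
    """Iterate the logistic map x -> r*x*(1-x), discarding a transient."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

def symbolize(series, threshold=0.5):
    # Binary partition of the state space (a common choice for the logistic map).
    return ''.join('1' if v >= threshold else '0' for v in series)

def word_entropy_rate(symbols, L=4):
    """Shannon entropy of length-L words, in bits per symbol: low for
    periodic series, near 1 for fully random binary series."""
    words = [symbols[i:i + L] for i in range(len(symbols) - L + 1)]
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    n = len(words)
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    return h / L

h_periodic = word_entropy_rate(symbolize(logistic_series(3.5)))  # period-4
h_chaotic = word_entropy_rate(symbolize(logistic_series(4.0)))   # chaotic
```

A periodic orbit produces only a handful of distinct words, so its entropy rate stays well below that of the chaotic regime even for short series, which is the discriminating behavior the abstract claims.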

  16. RNACompress: Grammar-based compression and informational complexity measurement of RNA secondary structure

    Directory of Open Access Journals (Sweden)

    Chen Chun

    2008-03-01

    Background: With the rapid emergence of RNA databases and newly identified non-coding RNAs, an efficient compression algorithm for RNA sequence and structural information is needed for the storage and analysis of such data. Although several algorithms for compressing DNA sequences have been proposed, none of them is suitable for the compression of RNA sequences together with their secondary structures. This kind of compression not only facilitates the maintenance of RNA data, but also supplies a novel way to measure the informational complexity of RNA structural data, raising the possibility of studying the relationship between the functional activities of RNA structures and their complexities, as well as various structural properties of RNA based on compression. Results: RNACompress employs an efficient grammar-based model to compress RNA sequences and their secondary structures. The main goals of this algorithm are twofold: (1) present a robust and effective way for RNA structural data compression; (2) design a suitable model to represent RNA secondary structure as well as derive the informational complexity of the structural data based on compression. Our extensive tests have shown that RNACompress achieves a universally better compression ratio compared with other sequence-specific or common text-specific compression algorithms, such as GenCompress, WinRAR and gzip. Moreover, a test of the activities of distinct GTP-binding RNAs (aptamers) compared with their structural complexity shows that our defined informational complexity can be used to describe how complexity varies with activity. These results lead to an objective means of comparing the functional properties of heteropolymers from the information perspective. Conclusion: A universal algorithm for the compression of RNA secondary structure as well as the evaluation of its informational complexity is discussed in this paper. We have developed RNACompress, as a useful tool
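Compression-based informational complexity can be illustrated with a generic compressor standing in for the paper's grammar-based coder. The sketch below uses zlib, and the byte strings are toy stand-ins for RNA sequence/structure data, not RNACompress itself:

```python
import zlib

def compression_complexity(data: bytes) -> float:
    """Informational complexity proxy: compressed size / original size.
    A low ratio means the input is highly regular; a ratio near (or above)
    1 means the compressor found little structure to exploit."""
    return len(zlib.compress(data, 9)) / len(data)

# Highly regular vs. pseudo-random toy "sequences" of equal length:
regular = b'ACGU' * 250                   # 1000 bytes, period 4
state, rnd = 12345, bytearray()
for _ in range(1000):                     # simple LCG, used only to make
    state = (1103515245 * state + 12345) % 2 ** 31   # incompressible-looking bytes
    rnd.append((state >> 16) & 0xFF)
```

Comparing the two ratios reproduces, in miniature, the paper's idea that compressibility orders inputs by structural complexity.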

  17. MEASURING OBJECT-ORIENTED SYSTEMS BASED ON THE EXPERIMENTAL ANALYSIS OF THE COMPLEXITY METRICS

    Directory of Open Access Journals (Sweden)

    J.S.V.R.S.SASTRY,

    2011-05-01

    Metrics are used to help a software engineer in quantitative analysis to assess the quality of a design before a system is built. The focus of object-oriented metrics is on the class, which is the fundamental building block of the object-oriented architecture. These metrics address internal and external object structure. Internal object structure reflects the complexity of each individual entity, such as methods and classes. External complexity measures the interaction among entities, such as coupling and inheritance. This paper mainly focuses on a set of object-oriented metrics that can be used to measure the quality of an object-oriented design, considering two types of complexity metrics in the object-oriented paradigm: the MOOD metrics and the Lorenz & Kidd metrics. The MOOD metrics consist of method inheritance factor (MIF), coupling factor (CF), attribute inheritance factor (AIF), method hiding factor (MHF), attribute hiding factor (AHF), and polymorphism factor (PF). The Lorenz & Kidd metrics consist of number of operations overridden (NOO), number of operations added (NOA), and specialization index (SI). MOOD and Lorenz & Kidd measurements are used mainly by designers and testers. Designers use these metrics to assess the software early in the process, making changes that will reduce complexity and improve the continuing capability of the design. Testers use them to examine the software's complexity, performance, and quality. This paper reviews how the MOOD and Lorenz & Kidd metrics are validated theoretically and empirically. In this paper, work has been done to explore the quality of design of software components using the object-oriented paradigm. A number of object-oriented metrics have been proposed in the literature for measuring design attributes such as inheritance, coupling, and polymorphism. In this paper, metrics have been used to analyze various features of software components. Complexity of methods
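One of the MOOD metrics named above, the method inheritance factor, can be sketched directly with Python class introspection: inherited (not locally defined) methods over all methods available in each class, summed over the system. The example classes are hypothetical:

```python
def _defined(cls):
    """Methods defined directly in the class body."""
    return {n for n, v in vars(cls).items() if callable(v)}

def _available(cls):
    """Every method reachable on the class, ignoring object's built-ins."""
    out = set()
    for k in cls.__mro__:
        if k is not object:
            out |= _defined(k)
    return out

def method_inheritance_factor(classes):
    """MOOD MIF: sum of inherited methods / sum of available methods."""
    inherited = sum(len(_available(c) - _defined(c)) for c in classes)
    available = sum(len(_available(c)) for c in classes)
    return inherited / available if available else 0.0

class Shape:                 # hypothetical example system
    def area(self): ...
    def name(self): ...

class Circle(Shape):
    def area(self): ...      # overridden, so it counts as locally defined
    def radius(self): ...

mif = method_inheritance_factor([Shape, Circle])
```

Here Circle inherits only name (area is overridden), so MIF = 1 inherited method out of 5 available ones across the two classes, i.e. 0.2.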

  18. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Science.gov (United States)

    Nakano, Yuichiro; Akamatsu, Norihiko; Mori, Tsuyoshi; Sano, Kazunori; Satoh, Katsuya; Nagayasu, Takeshi; Miyoshi, Yoshiaki; Sugio, Tomomi; Sakai, Hideyuki; Sakae, Eiji; Ichimiya, Kazuko; Hamada, Masahisa; Nakayama, Takehisa; Fujita, Yuhzo; Yanagihara, Katsunori; Nishida, Noriyuki

    2016-01-01

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound. PMID:27223116

  1. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Directory of Open Access Journals (Sweden)

    Yuichiro Nakano

    Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9) with sonication, and then with acidic water (pH 2.7) without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa) and a fungus (Candida albicans) were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound.

  2. In vivo and in vitro measurements of complex-type chromosomal exchanges induced by heavy ions.

    Science.gov (United States)

    George, K; Durante, M; Wu, H; Willingham, V; Cucinotta, F A

    2003-01-01

    Heavy ions are more efficient in producing complex-type chromosome exchanges than sparsely ionizing radiation, and this can potentially be used as a biomarker of radiation quality. We measured the induction of complex-type chromosomal aberrations in human peripheral blood lymphocytes exposed in vitro to accelerated H-, He-, C-, Ar-, Fe- and Au-ions in the LET range of approximately 0.4-1400 keV/micrometers. Chromosomes were analyzed either at the first post-irradiation mitosis, or in interphase, following premature condensation by phosphatase inhibitors. Selected chromosomes were then visualized after FISH-painting. The dose-response curve for the induction of complex-type exchanges by heavy ions was linear in the dose-range 0.2-1.5 Gy, while gamma-rays did not produce a significant increase in the yield of complex rearrangements in this dose range. The yield of complex aberrations after 1 Gy of heavy ions increased up to an LET around 100 keV/micrometers, and then declined at higher LET values. When mitotic cells were analyzed, the frequency of complex rearrangements after 1 Gy was about 10 times higher for Ar- or Fe- ions (the most effective ions, with LET around 100 keV/micrometers) than for 250 MeV protons, and values were about 35 times higher in prematurely condensed chromosomes. These results suggest that complex rearrangements may be detected in astronauts' blood lymphocytes after long-term space flight, because crews are exposed to HZE particles from galactic cosmic radiation. However, in a cytogenetic study of ten astronauts after long-term missions on the Mir or International Space Station, we found a very low frequency of complex rearrangements, and a significant post-flight increase was detected in only one out of the ten crewmembers. It appears that the use of complex-type exchanges as biomarker of radiation quality in vivo after low-dose chronic exposure in mixed radiation fields is hampered by statistical uncertainties. PMID:12971407

  3. Study of proton-transfer processes by the NMR method applied to various nuclei. VIII. The trifluoroacetic acid-water system

    International Nuclear Information System (INIS)

    It was shown earlier that the determination of the composition and type of the complexes is possible by the use of the NMR method applied to various nuclei. This method is based on the simultaneous solution of equations describing the concentration dependence of the NMR chemical shifts for the various nuclei in the system and material-balance equations. It has been applied to the investigation of complex-formation and proton-transfer processes in the nitric acid-water system. In the present work the authors studied aqueous solutions of an acid that is weaker than nitric acid, namely trifluoroacetic acid, both of the usual isotopic composition, and also a sample deuterated to the extent of 97.65%, in the concentration range of 0-100 mole %. The considerable changes in the chemical shifts of the 1H, 13C, and 19F nuclei, depending on the concentration, indicate the formation of complexes of various types and compositions

  4. An information complexity index for probability measures on ℝ with all moments

    Science.gov (United States)

    Accardi, Luigi; Barhoumi, Abdessatar; Rhaima, Mohamed

    2016-08-01

    We prove that each probability measure on ℝ with all moments is canonically associated with (i) a ∗-Lie algebra; (ii) a complexity index labeled by pairs of natural integers. The measures with complexity index (0,K) consist of two disjoint classes: that of all measures with finite support and the semi-circle-arcsine class (the discussion in Sec. 4.1 motivates this name). The class C(μ) = (0, 0) coincides with the δ-measures in the finite support case and includes the semi-circle laws in the infinite support case. In the infinite support case, the class C(μ) = (0, 1) includes the arcsine laws, and the class C(μ) = (0, 2) appeared in central limit theorems of quantum random walks in the sense of Konno. The classes C(μ) = (0,K), with K ≥ 3, do not seem to be present in the literature. The class (1, 0) includes the Gaussian and Poisson measures, and the associated ∗-Lie algebra is the Heisenberg algebra. The class (2, 0) includes the non-standard (i.e. neither Gaussian nor Poisson) Meixner distributions, and the associated ∗-Lie algebra is a central extension of sl(2, ℝ). Starting from n = 3, the ∗-Lie algebra associated to the class (n, 0) is infinite dimensional, and the corresponding classes include the higher powers of the standard Gaussian.

  5. A Measure for Brain Complexity: Relating Functional Segregation and Integration in the Nervous System

    Science.gov (United States)

    Tononi, Giulio; Sporns, Olaf; Edelman, Gerald M.

    1994-05-01

    In brains of higher vertebrates, the functional segregation of local areas that differ in their anatomy and physiology contrasts sharply with their global integration during perception and behavior. In this paper, we introduce a measure, called neural complexity (C_N), that captures the interplay between these two fundamental aspects of brain organization. We express functional segregation within a neural system in terms of the relative statistical independence of small subsets of the system and functional integration in terms of significant deviations from independence of large subsets. C_N is then obtained from estimates of the average deviation from statistical independence for subsets of increasing size. C_N is shown to be high when functional segregation coexists with integration and to be low when the components of a system are either completely independent (segregated) or completely dependent (integrated). We apply this complexity measure in computer simulations of cortical areas to examine how some basic principles of neuroanatomical organization constrain brain dynamics. We show that the connectivity patterns of the cerebral cortex, such as a high density of connections, strong local connectivity organizing cells into neuronal groups, patchiness in the connectivity among neuronal groups, and prevalent reciprocal connections, are associated with high values of C_N. The approach outlined here may prove useful in analyzing complexity in other biological domains such as gene regulation and embryogenesis.
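    For a system modeled as a multivariate Gaussian, C_N can be evaluated exactly from a covariance matrix by enumerating subsets. A hedged sketch, not from the paper (the Gaussian assumption and exact enumeration, feasible only for small n, are simplifications relative to the paper's cortical simulations):

```python
import math
from itertools import combinations

import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a k-variate Gaussian with covariance cov."""
    k = cov.shape[0]
    return 0.5 * math.log((2 * math.pi * math.e) ** k * np.linalg.det(cov))

def neural_complexity(cov):
    """C_N = sum over subset sizes k of <H(subset of size k)> - (k/n) H(X),
    computed under a Gaussian assumption by exact subset enumeration."""
    n = cov.shape[0]
    h_total = gaussian_entropy(cov)
    c = 0.0
    for k in range(1, n):  # the k = n term vanishes identically
        hs = [gaussian_entropy(cov[np.ix_(s, s)])
              for s in combinations(range(n), k)]
        c += sum(hs) / len(hs) - (k / n) * h_total
    return c
```

    Fully independent units (identity covariance) give C_N = 0, while a correlated pair gives a positive value, matching the qualitative behavior described in the abstract.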

  6. Application of a Dual-Arm Robot in Complex Sample Preparation and Measurement Processes.

    Science.gov (United States)

    Fleischer, Heidi; Drews, Robert Ralf; Janson, Jessica; Chinna Patlolla, Bharath Reddy; Chu, Xianghua; Klos, Michael; Thurow, Kerstin

    2016-10-01

    Automation systems with applied robotics have already been established in industrial applications for many years. In the field of life sciences, a comparable high level of automation can be found in the areas of bioscreening and high-throughput screening. Strong deficits still exist in the development of flexible and universal fully automated systems in the field of analytical measurement. Reasons are the heterogeneous processes with complex structures, which include sample preparation and transport, analytical measurements using complex sensor systems, and suitable data analysis and evaluation. Furthermore, the use of nonstandard sample vessels with various shapes and volumes results in an increased complexity. The direct use of existing automation solutions from bioscreening applications is not possible. A flexible automation system for sample preparation, analysis, and data evaluation is presented in this article. It is applied for the determination of cholesterol in biliary endoprosthesis using gas chromatography-mass spectrometry (GC-MS). A dual-arm robot performs both transport and active manipulation tasks to ensure human-like operation. This general robotic concept also enables the use of manual laboratory devices and equipment and is thus suitable in areas with a high standardization grade.

  7. Growing complex network of citations of scientific papers -- measurements and modeling

    CERN Document Server

    Golosovsky, M

    2016-01-01

    To quantify the mechanism of a complex network growth we focus on the network of citations of scientific papers and use a combination of the theoretical and experimental tools to uncover microscopic details of this network growth. Namely, we develop a stochastic model of citation dynamics based on copying/redirection/triadic closure mechanism. In a complementary and coherent way, the model accounts both for statistics of references of scientific papers and for their citation dynamics. Originating in empirical measurements, the model is cast in such a way that it can be verified quantitatively in every aspect. Such verification is performed by measuring citation dynamics of Physics papers. The measurements revealed nonlinear citation dynamics, the nonlinearity being intricately related to network topology. The nonlinearity has far-reaching consequences including non-stationary citation distributions, diverging citation trajectory of similar papers, runaways or "immortal papers" with infinite citation lifetime ...

  8. Application of FEM to estimate complex permittivity of dielectric material at microwave frequency using waveguide measurements

    Science.gov (United States)

    Deshpande, M. D.; Reddy, C. J.

    1995-01-01

    A simple waveguide measurement technique is presented to determine the complex dielectric constant of a dielectric material. The dielectric sample is loaded in a shorted X-band rectangular waveguide. Using a network analyzer, the reflection coefficient of the shorted waveguide (loaded with sample) is measured. Using the Finite Element Method (FEM), the exact reflection coefficient of the shorted waveguide (loaded with sample) is determined as a function of the dielectric constant. Matching the measured value of the reflection coefficient with the reflection coefficient calculated using the FEM, utilizing the Newton-Raphson method, an estimate of the dielectric constant of a dielectric material is obtained. A comparison of estimated values of dielectric constant obtained from simple waveguide modal theory and the present approach is presented.
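    The matching step can be illustrated without the FEM solver: the sketch below replaces the FEM forward model with ideal TE10 transmission-line theory for a sample backed by the short, and applies Newton-Raphson with a numerical derivative to recover a complex permittivity. The frequency, guide width, and sample thickness are illustrative assumptions, not values from the paper.

```python
import cmath
import math

C0 = 299_792_458.0        # speed of light (m/s)
MU0 = 4e-7 * math.pi      # vacuum permeability (H/m)

def reflection(eps_r, f=10e9, a=22.86e-3, d=3e-3):
    """Reflection coefficient of a shorted TE10 waveguide whose last
    d metres are filled with a dielectric of relative permittivity eps_r
    (ideal transmission-line model; illustrative stand-in for the FEM)."""
    w = 2 * math.pi * f
    k0 = w / C0
    kc = math.pi / a                          # TE10 cutoff wavenumber
    beta_air = cmath.sqrt(k0**2 - kc**2)
    beta_s = cmath.sqrt(k0**2 * eps_r - kc**2)
    z_air = w * MU0 / beta_air                # TE-mode wave impedances
    z_s = w * MU0 / beta_s
    z_in = 1j * z_s * cmath.tan(beta_s * d)   # shorted line of length d
    return (z_in - z_air) / (z_in + z_air)

def estimate_eps(gamma_meas, eps0=2.0 + 0.0j, tol=1e-12, h=1e-6):
    """Newton-Raphson on the holomorphic residual reflection(eps) - gamma_meas."""
    eps = eps0
    for _ in range(100):
        r = reflection(eps) - gamma_meas
        if abs(r) < tol:
            break
        dr = (reflection(eps + h) - reflection(eps)) / h  # numerical derivative
        eps -= r / dr
    return eps
```

    Generating a synthetic "measurement" from the forward model and inverting it recovers the assumed permittivity, which is the self-consistency check the paper performs with its FEM model.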

  9. Comparison of Different Measurement Techniques and a CFD Simulation in Complex Terrain

    Science.gov (United States)

    Schulz, Christoph; Hofsäß, Martin; Anger, Jan; Rautenberg, Alexander; Lutz, Thorsten; Cheng, Po Wen; Bange, Jens

    2016-09-01

    This paper deals with a comparison of data collected by measurements and a simulation for a complex terrain test site in southern Germany. Lidar, met mast, unmanned aerial vehicle (UAV) measurements of wind speed and direction and Computational Fluid Dynamics (CFD) data are compared to each other. The site is characterised regarding its flow features and the suitability for a wind turbine test field. A Delayed-Detached-Eddy-Simulation (DES) was employed using measurement data to generate generic turbulent inflow. A good agreement of the wind profiles between the different approaches was reached. The terrain slope leads to a speed-up, a change of turbulence intensity as well as to flow angle variations.

  10. Strategies to develop and evaluate soil conservation measures for complex mountainous farmland in South Korea

    Science.gov (United States)

    Arnhold, S.; Huwe, B.

    2012-04-01

    Soil erosion by water can cause serious damage in mountainous ecosystems through the irreversible loss of soil productivity and the degradation of surface water quality. Local land management has a substantial impact on the quantity of erosion and the amount of transported soil. The application of best management practices in regions affected by high soil erosion is the major goal of conservation planning. Management practices include tillage operations and crop cultivation on farmland, but also landscape structuring by field margins, forest patches and riparian areas. Developing proper management strategies for a certain area requires careful planning, because they are often associated with high costs and use restrictions for the local people. Different potential control measures not only vary strongly in their effectiveness, but in certain cases can even produce higher erosion rates. Therefore, effective conservation planning requires individual treatments depending on the local conditions, and it should consider all important factors controlling the impact of each management measure. The objective of this work is to derive possible management measures for mountainous farmland areas in the watershed of the Soyang Lake in South Korea, which are characterized by intense agriculture and heavy monsoonal rain events during the summer months. The complex topography and heterogeneous soil and land use conditions of those areas play a primary role in soil erosion processes and require special consideration when developing conservation measures. The complexity of factors governing erosion processes and the difficulties of evaluating erosion control measures are described on the basis of recent studies focusing on local farmland management and its effect on erosion in this region. We present the types of data bases needed to develop erosion control measures and show different methods that can be applied to obtain this information.
Possible strategies

  11. Measuring working memory in aphasia: Comparing performance on complex span and N-back tasks

    Directory of Open Access Journals (Sweden)

    Maria Ivanova

    2014-04-01

    No significant correlations were observed between performance on the complex span task and the N-back tasks. Furthermore, performance on the modified listening span was related to performance on the comprehension subtest of the QASA, while no relationship was found for the 2-back and 0-back tasks. Our results mirror studies in healthy controls that demonstrated no relationship between performance on the two tasks (Jaeggi et al., 2010; Kane et al., 2007). Thus, although N-back tasks seem similar to traditional complex span measures and may also index abilities related to cognitive processing, the evidence to date does not warrant their direct association with the construct of WM. Implications for future investigation of cognitive deficits in aphasia will be discussed.

  12. Simulation of complex glazing products; from optical data measurements to model based predictive controls

    Energy Technology Data Exchange (ETDEWEB)

    Kohler, Christian

    2012-08-01

    Complex glazing systems such as venetian blinds, fritted glass and woven shades require more detailed optical and thermal input data for their components than specular non light-redirecting glazing systems. Various methods for measuring these data sets are described in this paper. These data sets are used in multiple simulation tools to model the thermal and optical properties of complex glazing systems. The output from these tools can be used to generate simplified rating values or as an input to other simulation tools such as whole building annual energy programs, or lighting analysis tools. I also describe some of the challenges of creating a rating system for these products and which factors affect this rating. A potential future direction of simulation and building operations is model based predictive controls, where detailed computer models are run in real-time, receiving data for an actual building and providing control input to building elements such as shades.

  13. Positron life time and annihilation Doppler broadening measurements on transition metal complexes

    Energy Technology Data Exchange (ETDEWEB)

    Levay, B. (Eoetvoes Lorand Tudomanyegyetem, Budapest (Hungary). Fizikai Kemiai es Radiologiai Tanszek); Varhelyi, Cs. (Babes-Bolyai Univ., Cluj (Romania)); Burger, K. (Eoetvoes Lorand Tudomanyegyetem, Budapest (Hungary). Szervetlen es Analitikai Kemiai Intezet)

    1982-01-01

    Positron life time and annihilation Doppler broadening measurements have been carried out on 44 solid coordination compounds. Several correlations have been found between the annihilation life time (τ₁) and line shape parameters (L) and the chemical structure of the compounds. Halide ligands were the most active towards positrons. This fact supports the assumption of the possible formation of an (e⁺X⁻) positron-halide bound state. The life time decreased and the annihilation energy spectra broadened with the increasing negative character of the halides. The aromatic base ligands affected the positron-halide interaction according to their basicity and space requirement, and thus they indirectly affected the annihilation parameters, too. In the planar and tetrahedral complexes the electron density on the central metal ion affected the annihilation parameters directly, while in the octahedral mixed complexes it had only an indirect effect through the polarization of the halide ligands.

  14. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate

    Science.gov (United States)

    Pradines, Joël R.; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-04-01

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements.

  15. Detecting Microbial Growth and Metabolism in Geologic Media with Complex Conductivity Measurements

    Science.gov (United States)

    Davis, C. A.; Atekwana, E. A.; Slater, L. D.; Bottrell, P. M.; Chasten, L. E.; Heidenreich, J. D.

    2006-05-01

    Complex conductivity measurements between 0.1-1000 Hz were obtained from biostimulated sand-packed (coarse and mixed fine and medium grain) columns to investigate the effects of microbial growth, biofilm formation, and microbial metabolism on the electrical properties of porous media. Microbial growth and metabolism were verified by direct microbial counts, pH changes, and environmental scanning electron microscope imaging. Peaks in imaginary (interfacial) conductivity in the coarse grain columns occurred concurrently with peaks in the microbial cell concentrations. The magnitude of the imaginary conductivity response in the mixed fine and medium grain columns, however, was low compared to the coarse grain sand columns, consistent with lower microbial cell concentrations. It is possible that the pore size in the mixed fine and medium grain sand restricted bacterial cell division, inhibiting microbial growth and thus explaining the smaller imaginary conductivity response. The biostimulated columns for both grain sizes displayed similar trends and showed an increase in the real (electrolytic) conductivity and a decrease in pH over time. Dynamic changes in the imaginary conductivity arise from the growth and attachment of microbial cells and biofilms to surfaces, whereas changes in the real conductivity arise from the release of byproducts (ionic species) of microbial metabolism. We conclude that complex conductivity techniques are feasible sensors for detecting microbial growth (imaginary conductivity measurements) and metabolism (real conductivity measurements), with implications for bioremediation and astrobiology studies.

  16. Recurrence-plot-based measures of complexity and their application to heart-rate-variability data

    Science.gov (United States)

    Marwan, Norbert; Wessel, Niels; Meyerfeldt, Udo; Schirdewan, Alexander; Kurths, Jürgen

    2002-08-01

    The knowledge of transitions between regular, laminar or chaotic behaviors is essential to understand the underlying mechanisms behind complex systems. While several linear approaches are often insufficient to describe such processes, there are several nonlinear methods that, however, require rather long time observations. To overcome these difficulties, we propose measures of complexity based on vertical structures in recurrence plots and apply them to the logistic map as well as to heart-rate-variability data. For the logistic map these measures enable us not only to detect transitions between chaotic and periodic states, but also to identify laminar states, i.e., chaos-chaos transitions. The traditional recurrence quantification analysis fails to detect the latter transitions. Applying our measures to the heart-rate-variability data, we are able to detect and quantify the laminar phases before a life-threatening cardiac arrhythmia occurs thereby facilitating a prediction of such an event. Our findings could be of importance for the therapy of malignant cardiac arrhythmias.
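    Vertical-structure measures of this kind can be computed directly from a thresholded distance matrix. A minimal sketch of a recurrence matrix and the laminarity measure (the threshold and minimum line length are illustrative parameters; the phase-space embedding used in practice is omitted here):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Thresholded distance matrix: R[i, j] = 1 iff |x_i - x_j| <= eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def laminarity(R, vmin=2):
    """Fraction of recurrence points forming vertical lines of
    length >= vmin (a proxy for the laminar states discussed above)."""
    n = R.shape[0]
    total = int(R.sum())
    line_points = 0
    for j in range(n):
        run = 0
        for i in range(n + 1):            # extra step flushes the last run
            if i < n and R[i, j]:
                run += 1
            else:
                if run >= vmin:
                    line_points += run
                run = 0
    return line_points / total if total else 0.0
```

    A constant signal yields laminarity 1, while an alternating signal, whose recurrences are all isolated points, yields 0.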

  17. Operational Complexity of Supplier-Customer Systems Measured by Entropy—Case Studies

    Directory of Open Access Journals (Sweden)

    Ladislav Lukáš

    2016-04-01

    This paper discusses a unified entropy-based approach for the quantitative measurement of operational complexity of company supplier-customer relations. Classical Shannon entropy is utilized. Besides this quantification tool, we also explore the relations between Shannon entropy and (c,d)-entropy in more detail. An analytic description of so-called iso-quant curves is given, too. We present five case studies, albeit in an anonymous setting, describing various details of general procedures for measuring the operational complexity of supplier-customer systems. In general, we assume a problem-oriented database exists, which contains detailed records of all product forecasts, orders and deliveries both in quantity and time, scheduled and realized, too. Data processing detects important flow variations both in volumes and times, e.g., order—forecast, delivery—order, and actual production—scheduled one. The unifying quantity used for entropy computation is the time gap between actual delivery time and order issue time, which is nothing else but a lead time in inventory control models. After data consistency checks, histograms and empirical distribution functions are constructed. Finally, the entropy, an information-theoretic measure of supplier-customer operational complexity, is calculated. Basic steps of the algorithm are mentioned briefly, too. Results of supplier-customer system analysis from selected Czech small and medium-sized enterprises (SMEs) are presented in various computational and managerial decision making details. An enterprise is ranked as an SME if it has at most 250 employees and its turnover does not exceed 50 million USD per year, or alternatively its balance sheet total does not exceed 43 million USD per year.
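    The core computation, Shannon entropy of the empirical lead-time distribution, is compact. A hedged sketch (the lead-time definition as delivery time minus order-issue time follows the abstract; the function name and bin width are illustrative):

```python
import math
from collections import Counter

def lead_time_entropy(order_times, delivery_times, bin_width=1.0):
    """Shannon entropy (bits) of the empirical lead-time distribution,
    lead time being actual delivery time minus order issue time."""
    gaps = [d - o for o, d in zip(order_times, delivery_times)]
    bins = Counter(int(g // bin_width) for g in gaps)   # histogram classes
    total = sum(bins.values())
    return -sum((c / total) * math.log2(c / total) for c in bins.values())
```

    Identical lead times give zero entropy (a perfectly predictable system), while lead times spread uniformly over four histogram classes give 2 bits.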

  18. A Thorax Simulator for Complex Dynamic Bioimpedance Measurements With Textile Electrodes.

    Science.gov (United States)

    Ulbrich, Mark; Muhlsteff, Jens; Teichmann, Daniel; Leonhardt, Steffen; Walter, Marian

    2015-06-01

    Bioimpedance measurements on the human thorax are suitable for assessment of body composition or hemodynamic parameters, such as stroke volume; they are non-invasive, easy in application and inexpensive. When targeting personal healthcare scenarios, the technology can be integrated into textiles to increase ease, comfort and coverage of measurements. Bioimpedance is generally measured using two electrodes injecting low alternating currents (0.5-10 mA) and two additional electrodes to measure the corresponding voltage drop. The impedance is measured either spectroscopically (bioimpedance spectroscopy, BIS) between 5 kHz and 1 MHz or continuously at a fixed frequency around 100 kHz (impedance cardiography, ICG). A thorax simulator is being developed for testing and calibration of bioimpedance devices and other new developments. For the first time, it is possible to mimic the complete time-variant properties of the thorax during an impedance measurement. This includes the dynamic real part and dynamic imaginary part of the impedance with a peak-to-peak value of 0.2 Ω and an adjustable base impedance (24.6 Ω ≤ Z0 ≤ 51.6 Ω). Another novelty is adjustable complex electrode-skin contact impedances for up to 8 electrodes to evaluate bioimpedance devices in combination with textile electrodes. In addition, an electrocardiographic signal is provided for cardiographic measurements which is used in ICG devices. This provides the possibility to generate physiologic impedance changes, and in combination with an ECG, all parameters of interest such as stroke volume (SV), pre-ejection period (PEP) or extracellular resistance (Re) can be simulated. The speed of all dynamic signals can be altered. The simulator was successfully tested with commercially available BIS and ICG devices and the preset signals are measured with high correlation (r = 0.996). PMID:25148671

  19. Raman spectroscopy of the system iron(III)-sulfuric acid-water: an approach to Tinto River's (Spain) hydrogeochemistry.

    Science.gov (United States)

    Sobron, P; Rull, F; Sobron, F; Sanz, A; Medina, J; Nielsen, C J

    2007-12-15

Acid mine drainage is formed when pyrite (FeS2) is exposed and reacts with air and water to form sulfuric acid and dissolved iron. Tinto River (Huelva, Spain) is an example of this phenomenon. In this study, Raman spectroscopy has been used to investigate the speciation of the system iron(III)-sulfuric acid-water as an approach to Tinto River's aqueous solutions. The molalities of sulfuric acid (0.09 mol/kg) and iron(III) (0.01-1.5 mol/kg) were chosen to mimic the concentration of the species in Tinto River waters. Raman spectra of the solutions reveal a strong iron(III)-sulfate inner-sphere interaction through the ν1 sulfate band at 981 cm-1 and its shoulder at 1005 cm-1. Iron(III)-sulfate interaction may also be facilitated by hydrogen bonds and monitored in the Raman spectra through the symmetric stretching band of bisulfate at 1052 cm-1 and a shoulder at 1040 cm-1. Other bands in the low-frequency region of the Raman spectra are attributed to the formation of hydrogen-bonded complexes as well. PMID:17869164

  20. Fine-grained permutation entropy as a measure of natural complexity for time series

    Institute of Scientific and Technical Information of China (English)

    Liu Xiao-Feng; Wang Yue

    2009-01-01

In a recent paper [2002 Phys. Rev. Lett. 88 174102], Bandt and Pompe propose permutation entropy (PE) as a natural complexity measure for arbitrary time series which may be stationary or nonstationary, deterministic or stochastic. Their method is based on a comparison of neighbouring values. This paper further develops PE, and proposes the concept of fine-grained PE (FGPE), defined by the order pattern and magnitude of the difference between neighbouring values. This measure excludes the case where vectors with a distinct appearance are mistakenly mapped onto the same permutation type, and consequently FGPE becomes more sensitive to the dynamical change of time series than does PE, according to our simulation and experimental results.
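
    The Bandt-Pompe measure that FGPE refines can be sketched in a few lines. This is the classical PE only, not the authors' FGPE code; FGPE additionally weights the ordinal patterns by the magnitudes of the differences between neighbouring values:

    ```python
    import math

    def permutation_entropy(series, order=3):
        """Bandt-Pompe permutation entropy, normalized to [0, 1]."""
        counts = {}
        for i in range(len(series) - order + 1):
            window = series[i:i + order]
            # ordinal pattern: indices of the window sorted by value
            pattern = tuple(sorted(range(order), key=lambda k: window[k]))
            counts[pattern] = counts.get(pattern, 0) + 1
        total = sum(counts.values())
        h = -sum(c / total * math.log2(c / total) for c in counts.values())
        return h / math.log2(math.factorial(order))  # divide by max entropy
    ```

    A monotone series yields a single ordinal pattern and hence zero entropy, while an irregular series approaches 1.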

  1. Electrical Conductivity of Synthetic Quartz Crystals at High Temperature and Pressure from Complex Impedance Measurements

    Institute of Scientific and Technical Information of China (English)

    王多君; 李和平; 刘丛强; 易丽; 丁东业; 苏根利; 张卫刚

    2002-01-01

An electrical conductivity measurement system for high-pressure conditions, using a multi-anvil high-pressure apparatus and an ac complex impedance method, was set up. With this system, we have successfully measured the electrical conductivity of synthetic quartz at pressures up to approximately 1.0 GPa in the temperature range 661-987 K. The values of electrical conductivity decrease with increasing pressure and increase with increasing temperature. The activation enthalpies for the α-quartz crystals are 1.10-1.28 eV. The electrical conductivity of α-quartz is ionic, with Na ions moving in channels parallel to the c-axis being the predominant current carrier.
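
    Activation enthalpies like those quoted above are conventionally extracted by fitting conductivity data to an Arrhenius law. A minimal sketch with illustrative parameter values (the prefactor is an assumption, not a value from the paper):

    ```python
    import math

    K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

    def arrhenius_conductivity(t_kelvin, sigma0, ea_ev):
        """Ionic conductivity sigma(T) = sigma0 * exp(-Ea / (kB * T)).

        sigma0 is an illustrative prefactor; ea_ev corresponds to the
        activation enthalpy (here in the paper's 1.10-1.28 eV range).
        """
        return sigma0 * math.exp(-ea_ev / (K_B_EV * t_kelvin))
    ```

    Plotting log(sigma) against 1/T gives a straight line whose slope yields Ea, which is how such enthalpies are usually reported.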

  2. A new closeness centrality measure via effective distance in complex networks

    Science.gov (United States)

    Du, Yuxian; Gao, Cai; Chen, Xin; Hu, Yong; Sadiq, Rehan; Deng, Yong

    2015-03-01

Closeness centrality (CC), a well-known global measure, is widely applied in many complex networks. However, the classical CC presents problems for flow networks, since these networks are directed and weighted. To address these issues, we propose an effective distance based closeness centrality (EDCC), which uses effective distance to replace the conventional geographic distance and the binary distance obtained by Dijkstra's shortest path algorithm. The proposed EDCC considers not only the global structure of the network but also the local information of nodes, and it can be applied to directed or undirected, weighted or unweighted networks. A Susceptible-Infected model is utilized to evaluate the performance, using the spreading rate and the number of infected nodes. Numerical examples simulated on four real networks show the effectiveness of the proposed EDCC.
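
    A sketch of the underlying idea, assuming the Brockmann-Helbing form of effective distance, d = 1 - log(P), computed from flow fractions; the paper's exact EDCC definition may differ in detail:

    ```python
    import heapq
    import math

    def effective_distance_closeness(weights):
        """Closeness centrality over effective distances d = 1 - log(P).

        `weights` is a dict-of-dicts of positive outgoing flows; P[i][j] is
        the fraction of node i's outgoing flow that goes to j.
        """
        nodes = list(weights)
        eff = {}
        for i in nodes:
            out = sum(weights[i].values())
            eff[i] = {j: 1.0 - math.log(w / out) for j, w in weights[i].items()}

        def dijkstra(src):  # shortest paths over the effective distances
            dist = {src: 0.0}
            pq = [(0.0, src)]
            while pq:
                d, u = heapq.heappop(pq)
                if d > dist.get(u, math.inf):
                    continue
                for v, w in eff.get(u, {}).items():
                    nd = d + w
                    if nd < dist.get(v, math.inf):
                        dist[v] = nd
                        heapq.heappush(pq, (nd, v))
            return dist

        cc = {}
        for s in nodes:
            dist = dijkstra(s)
            others = [d for n, d in dist.items() if n != s]
            cc[s] = len(others) / sum(others) if others else 0.0
        return cc
    ```

    Because P depends on each node's local flow split while the shortest paths span the whole graph, the measure mixes local and global information, as the abstract describes.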

  3. Single-step stereolithography of complex anatomical models for optical flow measurements.

    Science.gov (United States)

    de Zélicourt, Diane; Pekkan, Kerem; Kitajima, Hiroumi; Frakes, David; Yoganathan, Ajit P

    2005-02-01

Transparent stereolithographic rapid prototyping (RP) technology has already been demonstrated in the literature to be a practical model construction tool for optical flow measurements such as digital particle image velocimetry (DPIV), laser Doppler velocimetry (LDV), and flow visualization. Here, we employ recently available transparent RP resins and eliminate time-consuming casting and chemical curing steps from the traditional approach. This note details our methodology with relevant material properties and highlights its advantages. Stereolithographic model printing with our procedure is now a direct single-step process, enabling faster geometric replication of complex computational fluid dynamics (CFD) models for exact experimental validation studies. This methodology is specifically applied to the in vitro flow modeling of patient-specific total cavopulmonary connection (TCPC) morphologies. The effects of RP machining grooves and surface quality on hydrodynamic performance measurements, as compared with smooth glass models, are also quantified.

  4. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Directory of Open Access Journals (Sweden)

    Chen Szi-Wen

    2007-01-01

A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.

  5. New method for improving angle measurement precision of laser collimation system under complex background

    Science.gov (United States)

    Zhao, Xiaofeng; Chen, He; Tan, Lilong; Zhang, Zhili; Cai, Wei

    2014-09-01

We propose a new method for improving the angle measurement precision of a CCD laser collimation system. First, by switching the laser on and off under Digital Signal Processor (DSP) control, the collimation light and the background light are sampled separately. Second, by comparing the sampled background light intensity with a threshold preset in the DSP, the DSP automatically directs a Complex Programmable Logic Device (CPLD) to adjust the CCD integration time to suit different background conditions, realizing an adaptive CCD scanning driver. Finally, digital filtering removes the influence of the background light on the collimation light. Through the combined application of adaptive scan driving, laser on/off control, A/D conversion and adaptive filtering, the integration time of the collimation system adjusts automatically to environmental changes and the impact of background light on the collimation system is effectively removed. Simulation results show that the new method achieves adaptive control under changing conditions and improves the measurement precision of the laser collimation system against complex backgrounds.

  6. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Science.gov (United States)

    Chen, Szi-Wen

    2006-12-01

A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.

  7. Complexity-Measure-Based Sequential Hypothesis Testing for Real-Time Detection of Lethal Cardiac Arrhythmias

    Directory of Open Access Journals (Sweden)

    Szi-Wen Chen

    2007-01-01

A novel approach that employs a complexity-based sequential hypothesis testing (SHT) technique for real-time detection of ventricular fibrillation (VF) and ventricular tachycardia (VT) is presented. A dataset consisting of a number of VF and VT electrocardiogram (ECG) recordings drawn from the MIT-BIH database was adopted for such an analysis. It was split into two smaller datasets for algorithm training and testing, respectively. Each ECG recording was measured in a 10-second interval. For each recording, a number of overlapping windowed ECG data segments were obtained by shifting a 5-second window by a step of 1 second. During the windowing process, the complexity measure (CM) value was calculated for each windowed segment and the task of pattern recognition was then sequentially performed by the SHT procedure. A preliminary test conducted using the database produced an optimal overall predictive accuracy of 96.67%. The algorithm was also implemented on a commercial embedded DSP controller, permitting a hardware realization of real-time ventricular arrhythmia detection.
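
    The complexity measure (CM) in this line of VF/VT detection work is commonly the Lempel-Ziv complexity of a binarized ECG segment; assuming that choice (the abstract does not name the CM), the classic Kaspar-Schuster phrase counting can be sketched as:

    ```python
    def lempel_ziv_complexity(s):
        """Kaspar-Schuster count of Lempel-Ziv phrases in a symbol string."""
        n = len(s)
        i, k, l = 0, 1, 1      # i: candidate copy start, k: match length, l: phrase start
        c, k_max = 1, 1        # c: phrase count, k_max: longest match for this phrase
        while True:
            if s[i + k - 1] == s[l + k - 1]:
                k += 1
                if l + k > n:  # current phrase runs to the end of the string
                    c += 1
                    break
            else:
                k_max = max(k_max, k)
                i += 1
                if i == l:     # no earlier copy exists: close the phrase
                    c += 1
                    l += k_max
                    if l + 1 > n:
                        break
                    i, k, k_max = 0, 1, 1
                else:
                    k = 1
        return c
    ```

    For example, "aabab" parses into the phrases a | ab | ab, giving a complexity of 3; a highly irregular (e.g. fibrillating) segment yields many more phrases than a regular one.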

  8. The complex ion structure of warm dense carbon measured by spectrally resolved x-ray scattering

    Energy Technology Data Exchange (ETDEWEB)

    Kraus, D.; Barbrel, B.; Falcone, R. W. [Department of Physics, University of California, Berkeley, California 94720 (United States); Vorberger, J. [Max-Planck-Institut für Physik komplexer Systeme, Nöthnitzer Straße 38, 01187 Dresden (Germany); Helfrich, J.; Frydrych, S.; Ortner, A.; Otten, A.; Roth, F.; Schaumann, G.; Schumacher, D.; Siegenthaler, K.; Wagner, F.; Roth, M. [Institut für Kernphysik, Technische Universität Darmstadt, Schlossgartenstraße 9, 64289 Darmstadt (Germany); Gericke, D. O.; Wünsch, K. [Centre for Fusion, Space and Astrophysics, Department of Physics, University of Warwick, Coventry CV4 7AL (United Kingdom); Bachmann, B.; Döppner, T. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Bagnoud, V.; Blažević, A. [GSI Helmholtzzentrum für Schwerionenforschung GmbH, Planckstraße 1, 64291 Darmstadt (Germany); and others

    2015-05-15

We present measurements of the complex ion structure of warm dense carbon close to the melting line at pressures around 100 GPa. High-pressure samples were created by laser-driven shock compression of graphite and probed by intense laser-generated x-ray sources with photon energies of 4.75 keV and 4.95 keV. High-efficiency crystal spectrometers allow for spectrally resolving the scattered radiation. Comparing the ratio of elastically and inelastically scattered radiation, we find evidence for a complex bonded liquid that is predicted by ab initio quantum simulations showing the influence of chemical bonds under these conditions. Using graphite samples of different initial densities, we demonstrate the capability of spectrally resolved x-ray scattering to monitor the carbon solid-liquid transition at a relatively constant pressure of 150 GPa. Showing the first single-pulse scattering spectra from cold graphite of unprecedented quality recorded at the Linac Coherent Light Source, we demonstrate the outstanding possibilities for future high-precision measurements at 4th generation light sources.

  9. [Sample pretreatment for the measurement of phthalate esters in complex matrices].

    Science.gov (United States)

    Liang, Jing; Zhuang, Wan'e; Lin, Fang; Yao, Wensong

    2014-11-01

Sample pretreatment methods for the measurement of phthalate esters (PAEs) by gas chromatography-mass spectrometry (GC-MS) in various complex matrices, including sediment, soil, suspended particle matter, urban surface dust, Sinonovacula constricta, cosmetic, leather, plastic and coastal/estuarine seawater, were proposed. The pretreatment appropriate for GC-MS detection was focused on the investigation and optimization of operating parameters for the extraction and purification, such as the extraction solvent, the eluent and the adsorbent for solid-phase extraction. The results for the various complex matrices showed that methylene chloride was the best solvent for ultrasonic extraction when solid-liquid extraction was used; silica gel was an economical and practical adsorbent for solid-phase extraction cleanup; C18 was the most commonly used adsorbent for preconcentration of PAEs in coastal/estuarine seawater samples; and a mixture of n-hexane and ethyl acetate in a certain proportion was the suitable SPE eluent. Under the optimized conditions, the spiked recoveries were above 58% and the relative standard deviations (RSDs) were less than 10.5% (n = 6). The detection limits (DL, 3σ) were in the range of 0.3 μg/kg (dibutyl phthalate) to 5.2 μg/kg (diisononyl phthalate) for sediment, and 6 ng/L (dipropyl phthalate) to 67 ng/L (diisodecyl phthalate) for coastal/estuarine seawater. The pretreatment methods for these complex matrices are well suited to the measurement of the 16 PAEs with GC-MS. PMID:25764660

  10. Measurement of unsteady convection in a complex fenestration using laser interferometry

    Energy Technology Data Exchange (ETDEWEB)

    Poulad, M.E.; Naylor, D. [Ryerson Univ., Toronto, ON (Canada). Dept. of Mechanical and Industrial Engineering; Oosthuizen, P.H. [Queen' s Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering

    2009-06-15

    Complex fenestration involving windows with between-panes louvered blinds is gaining interest as a means to control solar gains in buildings. However, the heat transfer performance of this type of shading system is not well understood, especially at high Rayleigh numbers. A Mach-Zehnder interferometer was used in this study to measure the unsteady convective heat transfer in a tall enclosure with between-panes blind that was heated to simulate absorbed solar radiation. Digital cinematography was combined with laser interferometry to make time-averaged measurements of unsteady and turbulent free convective heat transfer. This paper described the procedures used to measure the time-average local heat flux. Under strongly turbulent conditions, the average Nusselt number for the enclosure was found to compare well with empirical correlations. A total sampling time of about ten seconds was needed in this experiment to obtain a stationary time-average heat flux. The time-average heat flux was found to be relatively insensitive to the camera frame rate. The local heat flux was found to be unsteady and periodic. Heating of the blind made the flow more unstable, producing a higher amplitude heat flux variation than for the unheated blind condition. This paper reported on only a small set of preliminary measurements. This study is being extended to other blind angles and glazing spacings. The next phase will focus on flow visualization studies to characterize the nature of the flow. 8 refs., 2 tabs., 7 figs.

  11. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    Directory of Open Access Journals (Sweden)

    J.-C. Raut

    2008-02-01

A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF) campaign, enabling the retrieval of vertical profiles of the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The value of the ACRI averaged over the entire planetary boundary layer (PBL) is close to 1.51(±0.02)-i0.017(±0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. Good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of ACRI in the atmospheric column enabled the retrieval of vertical profiles of the extinction coefficient in accordance with lidar measurements.

  12. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    International Nuclear Information System (INIS)

A synergy between lidar, sun photometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF) campaign, enabling the retrieval of vertical profiles of the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The value of the ACRI averaged over the entire planetary boundary layer (PBL) is close to 1.51(± 0.02)-i0.017(± 0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. Good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of ACRI in the atmospheric column enabled the retrieval of vertical profiles of the extinction coefficient in accordance with lidar measurements. (authors)

  13. Cortical complexity as a measure of age-related brain atrophy.

    Science.gov (United States)

    Madan, Christopher R; Kensinger, Elizabeth A

    2016-07-01

The structure of the human brain changes in a variety of ways as we age. While a sizeable literature has examined age-related differences in cortical thickness and, to a lesser degree, gyrification, here we examined differences in cortical complexity, as indexed by fractal dimensionality, in a sample of over 400 individuals across the adult lifespan. While prior studies have shown differences in fractal dimensionality between patient populations and age-matched, healthy controls, it is unclear how well this measure relates to age-related cortical atrophy. Initially computing a single measure for the entire cortical ribbon, i.e., unparcellated gray matter, we found fractal dimensionality to be more sensitive to age-related differences than either cortical thickness or gyrification index. We additionally observed regional differences in age-related atrophy between the three measures, suggesting that they may index distinct differences in cortical structure. We also provide a freely available MATLAB toolbox for calculating fractal dimensionality. PMID:27103141
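
    Fractal dimensionality is typically estimated by box counting: count the boxes N(s) that intersect the structure at several box sizes s, then fit the slope of log N against log s. A 2-D sketch of the idea (the study itself computes it for the 3-D cortical ribbon from structural MRI, via its MATLAB toolbox):

    ```python
    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16)):
        """Box-counting estimate of the fractal dimension of a binary 2-D mask."""
        counts = []
        for s in sizes:
            # trim so the grid divides evenly, then pool s-by-s blocks
            h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(blocks.any(axis=(1, 3)).sum())
        # least-squares slope of log N(s) vs log s; dimension is its negative
        slope = np.polyfit(np.log(sizes), np.log(counts), 1)[0]
        return -slope
    ```

    A completely filled square recovers dimension 2 exactly, while a convoluted boundary such as the cortical ribbon yields a non-integer value between the topological and embedding dimensions.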

  14. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    Directory of Open Access Journals (Sweden)

    J.-C. Raut

    2007-07-01

    Full Text Available A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF, enabling the retrieval of vertical profiles for the aerosol complex refractive index (ACRI and single-scattering albedo with a vertical resolution of 200 m over Paris area. The averaged value over the entire planetary boundary layer (PBL for the ACRI is close to 1.51(±0.02–i0.017(±0.003 at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. A good agreement is found with previous studies for urban aerosols. A comparison of vertical profiles of ACRI with simulations combining in situ measurements and relative humidity (RH profiles has highlighted a modification in aerosol optical properties linked to their history and the origin of the air mass. The determination of ACRI in the atmospheric column enabled to retrieve vertical profiles of extinction coefficient in accordance with lidar profiles measurements.

  15. Measuring spatial patterns in floodplains: A step towards understanding the complexity of floodplain ecosystems: Chapter 6

    Science.gov (United States)

    Murray Scown,; Martin Thoms,; DeJager, Nathan R.; Gilvear, David J.; Greenwood, Malcolm T.; Thoms, Martin C.; Wood, Paul J.

    2016-01-01

Floodplains can be viewed as complex adaptive systems (Levin, 1998) because they are comprised of many different biophysical components, such as morphological features, soil groups and vegetation communities, as well as being sites of key biogeochemical processing (Stanford et al., 2005). Interactions and feedbacks among the biophysical components often result in additional phenomena occurring over a range of scales, often in the absence of any controlling factors (sensu Hallet, 1990). This emergence of new biophysical features and rates of processing can lead to alternative stable states which feed back into floodplain adaptive cycles (cf. Hughes, 1997; Stanford et al., 2005). Interactions between different biophysical components, feedbacks, self-emergence and scale are all key properties of complex adaptive systems (Levin, 1998; Phillips, 2003; Murray et al., 2014) and therefore will influence the manner in which we study and view spatial patterns. Measuring the spatial patterns of floodplain biophysical components is a prerequisite to examining and understanding these ecosystems as complex adaptive systems. Elucidating relationships between pattern and process, which are intrinsically linked within floodplains (Ward et al., 2002), is dependent upon an understanding of spatial pattern. This knowledge can help river scientists determine the major drivers, controllers and responses of floodplain structure and function, as well as the consequences of altering those drivers and controllers (Hughes and Cass, 1997; Whited et al., 2007). Interactions and feedbacks between physical, chemical and biological components of floodplain ecosystems create and maintain a structurally diverse and dynamic template (Stanford et al., 2005). This template influences subsequent interactions between components that consequently affect system trajectories within floodplains (sensu Bak et al., 1988). Constructing and evaluating models used to predict floodplain ecosystem responses to

  16. Measuring The Influence of TAsk COMplexity on Human Error Probability: An Empirical Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Podofillini, Luca; Dang, Vinh N. [Paul Scherrer Institute, Villigen (Switzerland)

    2013-04-15

A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent, objectively and quantitatively, task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors are available from the International HRA Empirical Study. The empirical evaluation has shown promising results. The TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished when related to different difficulty categories (e.g., 'easy' vs. 'somewhat difficult'), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of the few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i.e., influencing the error probability), and 2) the quantitative relationships among PSFs and error

  17. Rotational study of the CH4-CO complex: Millimeter-wave measurements and ab initio calculations

    Science.gov (United States)

    Surin, L. A.; Tarabukin, I. V.; Panfilov, V. A.; Schlemmer, S.; Kalugina, Y. N.; Faure, A.; Rist, C.; van der Avoird, A.

    2015-10-01

    The rotational spectrum of the van der Waals complex CH4-CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 110-145 GHz. Newly observed and assigned transitions belong to the K = 2-1 subband correlating with the rotationless jCH4 = 0 ground state and the K = 2-1 and K = 0-1 subbands correlating with the jCH4 = 2 excited state of free methane. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the CH4-CO complex. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of CH4-CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations [CCSD(T)-F12a] and an augmented correlation-consistent triple zeta (aVTZ) basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the CH4 face closest to the CO subunit and binding energy De = 177.82 cm-1. The bound rovibrational levels of the CH4-CO complex were calculated for total angular momentum J = 0-6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 91.32, 94.46, and 104.21 cm-1 for A (jCH4 = 0), F (jCH4 = 1), and E (jCH4 = 2) nuclear spin modifications of CH4-CO, respectively.

  18. Evoked potential correlates of intelligence: some problems with Hendrickson's string measure of evoked potential complexity and error theory of intelligence.

    Science.gov (United States)

    Vetterli, C F; Furedy, J J

    1985-07-01

    The string measure of evoked potential (EP) complexity is based on a new error theory of intelligence, which differs from the older speed-based formulations which focus on EP latency rather than complexity. In this note we first raise a methodological problem of arbitrariness with respect to one version of the string measure. We then provide a comparative empirical assessment of EP-IQ correlations with respect to a revised string measure (which does not suffer from the methodological problem), a latency measure, and another measure of EP complexity: average voltage. This assessment indicates that the string measure, in particular, yields quite disorderly results, and that, in general, the results favor the speed over the error formulation.

  19. PAFit: A Statistical Method for Measuring Preferential Attachment in Temporal Complex Networks.

    Directory of Open Access Journals (Sweden)

    Thong Pham

Preferential attachment is a stochastic process that has been proposed to explain certain topological features characteristic of complex networks from diverse domains. The systematic investigation of preferential attachment is an important area of research in network science, not only for the theoretical matter of verifying whether this hypothesized process is operative in real-world networks, but also for the practical insights that follow from knowledge of its functional form. Here we describe a maximum-likelihood-based estimation method for the measurement of preferential attachment in temporal complex networks. We call the method PAFit, and implement it in an R package of the same name. PAFit constitutes an advance over previous methods primarily because we based it on a nonparametric statistical framework that enables attachment kernel estimation free of any assumptions about its functional form. We show this results in PAFit outperforming the popular methods of Jeong and Newman in Monte Carlo simulations. What is more, we found that the application of PAFit to a publicly available Flickr social network dataset yielded clear evidence for a deviation of the attachment kernel from the popularly assumed log-linear form. Independent of our main work, we provide a correction to a consequential error in Newman's original method which had evidently gone unnoticed since its publication over a decade ago.

  20. Entropy-based complexity measures for gait data of patients with Parkinson's disease

    Science.gov (United States)

    Afsar, Ozgur; Tirnakli, Ugur; Kurths, Juergen

    2016-02-01

    Shannon, Kullback-Leibler, and Klimontovich's renormalized entropies are applied as three different complexity measures on gait data of patients with Parkinson's disease (PD) and healthy control group. We show that the renormalized entropy of variability of total reaction force of gait is a very efficient tool to compare patients with respect to disease severity. Moreover, it is a good risk predictor such that the sensitivity, i.e., the percentage of patients with PD who are correctly identified as having PD, increases from 25% to 67% while the Hoehn-Yahr stage increases from 2.5 to 3.0 (this stage goes from 0 to 5 as the disease severity increases). The renormalized entropy method for stride time variability of gait is found to correctly identify patients with a sensitivity of 80%, while the Shannon entropy and the Kullback-Leibler relative entropy can do this with a sensitivity of only 26.7% and 13.3%, respectively.
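
    Of the three entropies compared, the Shannon entropy of a binned variability signal is the simplest; a minimal sketch (the bin count is an illustrative choice, and the paper's Kullback-Leibler and renormalized entropies additionally require a reference distribution):

    ```python
    import math
    from collections import Counter

    def shannon_entropy(samples, bins=10):
        """Shannon entropy (bits) of a histogram of a 1-D signal, e.g. stride times."""
        lo, hi = min(samples), max(samples)
        width = (hi - lo) / bins or 1.0  # degenerate constant signal -> one bin
        counts = Counter(min(int((s - lo) / width), bins - 1) for s in samples)
        n = len(samples)
        return -sum(c / n * math.log2(c / n) for c in counts.values())
    ```

    A perfectly regular stride-time series collapses into one bin (zero entropy), while the irregular gait of a severely affected patient spreads across bins and raises the entropy.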

  1. Measuring mixing patterns in complex networks by Spearman rank correlation coefficient

    Science.gov (United States)

    Zhang, Wen-Yao; Wei, Zong-Wen; Wang, Bing-Hong; Han, Xiao-Pu

    2016-06-01

    In this paper, we utilize the Spearman rank correlation coefficient to measure mixing patterns in complex networks. Compared with the widely used Pearson coefficient, the Spearman coefficient is rank-based, nonparametric, and size-independent. It is therefore more effective for assessing the linking patterns of diverse networks, especially large ones. We demonstrate this point by testing a variety of empirical and artificial networks. Moreover, we show that the normalized Spearman ranks of stubs obey an interesting linear rule whose correlation coefficient is just the Spearman coefficient. This compelling linear relationship allows us to directly produce networks with any prescribed Spearman coefficient. Our method thus has an edge over the well-known uncorrelated configuration model.
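
The edge-wise rank correlation described above can be sketched as follows: rank the degrees found at the two ends of every edge (each edge contributing both stub orderings) and take the Pearson correlation of the ranks. The stub construction and the star-graph example are our own illustration, not the paper's code.

```python
from collections import defaultdict

def ranks(values):
    """1-based average ranks; tied values receive the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(x, y):
    """Pearson correlation (degenerate for regular graphs, where every
    degree ties and the variance vanishes)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def spearman_assortativity(edges):
    """Spearman rank correlation of the degrees at the two ends of each
    edge; both stub orderings are included, as in degree mixing."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    return pearson(ranks(xs), ranks(ys))

# A star is maximally disassortative: the hub links only to leaves.
star = [(0, i) for i in range(1, 6)]
r_star = spearman_assortativity(star)   # close to -1
```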

  2. Response to Disturbance and Abundance of Final State: a Measure for Complexity?

    Institute of Scientific and Technical Information of China (English)

    SHEN Dan; WANG Wen-Xiu; JIANG Yu-Mei; HE Yue; HE Da-Ren

    2007-01-01

    We propose a new definition of complexity. The definition shows that when a system evolves to a final state via a transient state, its complexity depends on the abundance of both the final state and the transient state. The abundance of the transient state may be described by the diversity of the response to disturbance. We hope that this definition can draw a clear boundary between simple systems and complex systems by showing that all simple systems have zero complexity and all complex systems have positive complexity. Some examples of complexity calculations are presented, which support our hope.

  3. Separation of copper, iron, and zinc from complex aqueous solutions for isotopic measurement

    Science.gov (United States)

    Borrok, D.M.; Wanty, R.B.; Ridley, W.I.; Wolf, R.; Lamothe, P.J.; Adams, M.

    2007-01-01

    The measurement of Cu, Fe, and Zn isotopes in natural samples may provide valuable information about biogeochemical processes in the environment. However, the widespread application of stable Cu, Fe, and Zn isotope chemistry to natural water systems remains limited by our ability to efficiently separate these trace elements from the greater concentrations of matrix elements. In this study, we present a new method for the isolation of Cu, Fe, and Zn from complex aqueous solutions using a single anion-exchange column with hydrochloric acid media. Using this method we are able to quantitatively separate Cu, Fe, and Zn from each other and from matrix elements in a single column elution. Elution of the elements of interest, as well as all other elements, through the anion-exchange column is a function of the speciation of each element in the various concentrations of HCl. We highlight the column chemistry by comparing our observations with published studies that have investigated the speciation of Cu, Fe, and Zn in chloride solutions. The functionality of the column procedure was tested by measuring Cu, Fe, and Zn isotopes in a variety of stream water samples impacted by acid mine drainage. The accuracy and precision of the Zn isotopic measurements were tested by doping Zn-free stream water with the Zn isotopic standard. The reproducibility of the entire column separation process and the overall precision of the isotopic measurements were also evaluated. The isotopic results demonstrate that the Cu, Fe, and Zn column separates from the tested stream waters are of sufficient purity to be analyzed directly using a multicollector inductively coupled plasma mass spectrometer (MC-ICP-MS), and that the measurements are fully reproducible, accurate, and precise. Although limited in scope, these isotopic measurements reveal significant variations in δ65Cu (-1.41 to +0.30‰), δ56Fe (-0.56 to +0.34‰), and δ66Zn (0.31 to 0.49‰) among samples collected from different

  4. Advances in Measuring the Apparent Optical Properties (AOPs) of Optically Complex Waters

    Science.gov (United States)

    Morrow, John H.; Hooker, Stanford B.; Booth, Charles R.; Bernhard, Germar; Lind, Randall N.; Brown, James W.

    2010-01-01

    This report documents new technology used to measure the apparent optical properties (AOPs) of optically complex waters. The principal objective is to be prepared for the launch of next-generation ocean color satellites with the most capable commercial off-the-shelf (COTS) instrumentation. An enhanced COTS radiometer was the starting point for designing and testing the new sensors. The follow-on steps were to apply the lessons learned towards a new in-water profiler based on a kite-shaped backplane for mounting the light sensors. The next level of sophistication involved evaluating new radiometers emerging from a development activity based on so-called microradiometers. The exploitation of microradiometers resulted in an in-water profiling system, which includes a sensor networking capability to control ancillary sensors like a shadowband or global positioning system (GPS) device. A principal advantage of microradiometers is their flexibility in producing, interconnecting, and maintaining instruments. The full problem set for collecting sea-truth data--whether in coastal waters or the open ocean--involves other aspects of data collection that were improved for instruments measuring both AOPs and inherent optical properties (IOPs), if the uncertainty budget is to be minimized. New capabilities associated with deploying solar references were developed, as well as a compact solution for recovering in-water instrument systems from small boats.

  5. Investigation of the Ionic conductivity and dielectric measurements of poly (N-vinyl pyrrolidone)-sulfamic acid polymer complexes

    International Nuclear Information System (INIS)

    Polymer electrolyte complexes of poly (N-vinyl pyrrolidone) (PVP)-sulfamic acid (NH2SO3H) were prepared by a familiar solution casting method with different molar concentrations of PVP and sulfamic acid. The interaction between PVP and NH2SO3H was confirmed by Fourier transform infrared spectroscopy. Laser microscopy was used to study the surface morphology of the polymer complexes. The glass transition temperature (Tg) and the melting temperature (Tm) of the polymer complexes were determined from differential scanning calorimetry studies. AC impedance spectroscopy measurements revealed that the polymer complex with 97 mol% PVP-3 mol% NH2SO3H shows the highest ionic conductivity, with two different activation energies above and below the glass transition temperature (Tg). Dielectric studies confirmed that the dc conduction mechanism dominates in the polymer complexes. The value of the power law exponent (n) confirmed the translational motion of ions from one site to another vacant site in these complexes.

  6. Complex Correlation Measure: a novel descriptor for Poincaré plot

    Directory of Open Access Journals (Sweden)

    Gubbi Jayavardhana

    2009-08-01

    Full Text Available Abstract Background The Poincaré plot is one of the important techniques used for visually representing heart rate variability. It is valuable due to its ability to display nonlinear aspects of the data sequence. However, the problem lies in capturing the temporal information of the plot quantitatively. The standard descriptors used in quantifying the Poincaré plot (SD1, SD2) measure the gross variability of the time series data. Determining advanced methods for capturing temporal properties poses a significant challenge. In this paper, we propose a novel descriptor, the "Complex Correlation Measure" (CCM), to quantify the temporal aspect of the Poincaré plot. In contrast to SD1 and SD2, the CCM incorporates point-to-point variation of the signal. Methods First, we derived expressions for CCM. Then the sensitivity of the descriptors was shown by measuring all descriptors before and after surrogation of the signal. For each case study, lag-1 Poincaré plots were constructed for three groups of subjects (Arrhythmia, Congestive Heart Failure (CHF), and those with Normal Sinus Rhythm (NSR)), and the new measure CCM was computed along with SD1 and SD2. ANOVA was used to define the level of significance of the mean and variance of SD1, SD2 and CCM for the different groups of subjects. Results CCM is defined based on the autocorrelation at different lags of the time series, hence giving an in-depth measurement of the correlation structure of the Poincaré plot. A surrogate analysis was performed, and the sensitivity of the proposed descriptor was found to be higher than that of the standard descriptors. Two case studies were conducted for recognizing arrhythmia and congestive heart failure (CHF) subjects from those with NSR, using the Physionet database, and demonstrated the usefulness of the proposed descriptors in biomedical applications. CCM was found to be a more significant (p = 6.28E-18) parameter than SD1 and SD2 in discriminating
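
The standard descriptors SD1 and SD2 that CCM is contrasted with can be computed by rotating the Poincaré cloud (RR(n), RR(n+1)) by 45 degrees. The RR series below is invented for illustration, and CCM itself, which requires the lagged autocorrelation structure, is not reproduced here.

```python
import math

def poincare_sd(rr):
    """Standard Poincare descriptors: SD1 is the dispersion across the
    line of identity (short-term variability), SD2 the dispersion
    along it (long-term variability)."""
    x, y = rr[:-1], rr[1:]                               # RR(n), RR(n+1)
    across = [(b - a) / math.sqrt(2) for a, b in zip(x, y)]
    along = [(b + a) / math.sqrt(2) for a, b in zip(x, y)]
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((vi - m) ** 2 for vi in v) / len(v))
    return sd(across), sd(along)

# Invented RR-interval series (seconds), for illustration only.
rr = [0.80, 0.85, 0.78, 0.90, 0.82, 0.88, 0.79, 0.86]
sd1, sd2 = poincare_sd(rr)
```

A perfectly alternating series has all its variability across the line of identity, so SD2 collapses to zero; this is the kind of temporal structure the gross descriptors cannot distinguish from noise, motivating CCM.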

  7. Measurements of key offensive odorants in a fishery industrial complex in Korea

    Science.gov (United States)

    Seo, Seong-Gyu; Ma, Zhong-Kun; Jeon, Jun-Min; Jung, Sang-Chul; Lee, Woo-Bum

    2011-06-01

    This study was carried out to measure the concentrations of offensive odorants with an emphasis on nitrogenous compounds [NC: ammonia (NH 3) and trimethylamine (TMA)] and reduced sulfur compounds [RSC: hydrogen sulfide (H 2S), methyl mercaptan (CH 3SH), dimethyl sulfide (DMS), and dimethyl disulfide (DMDS)] from various sources in a fishery industrial complex in Yeosu, Korea. Samples were collected from a total of 18 sampling sites including the major fishery facilities (C-1˜C-5) and the border areas (O-1˜O-8) of this fishery industrial complex during spring, summer, and fall. The mean concentrations of odorants at the major fishery facilities were found in the order of NH 3 (638 ppb), H 2S (291 ppb), CH 3SH (123 ppb), TMA (20.6 ppb), DMDS (7.71 ppb), and DMS (5.25 ppb). On the other hand, the mean concentrations of odorants at the border areas were NH 3 (85.3 ppb), TMA (1.75 ppb), H 2S (0.25 ppb), CH 3SH (0.18 ppb), DMS (0.07 ppb), and DMDS (0.06 ppb). The mean concentrations of H 2S, CH 3SH and TMA in the major fishery facilities greatly exceeded the Odorant Emission Guideline (OEG) applied to an industrial area. The concentration gradient of RSC between the major fishery facilities and border areas was more prominent than that of NC. From the correlation analyses, the highest correlation coefficient of 0.976 ( p = 3.99E-40, n = 60) was found between DMS and DMDS at the major fishery facilities, while NH 3 had a strong correlation with the sum of odorant concentrations (SOC) at the border areas ( r = 0.997, p = 4.83E-54, n = 48). The results of this study thus confirmed that CH 3SH and TMA were the major odorants at the major fishery facilities and the border areas, respectively.

  8. Selective extraction of metals from products of mine acidic water treatment

    International Nuclear Information System (INIS)

    A study was made of the possibility of processing the foam products obtained during the flotation purification of mine acidic waters, for the purpose of selectively extracting non-ferrous metals (Co, Ni) and rare earth elements (REE) and separating them from the main macrocomponent of the waters, iron. The optimal conditions for selective metal extraction from the foam flotation products are the following: T = 333 K, pH = 3.0-3.5, solid-to-liquid phase ratio of 1:4-1:7, and a sulfuric acid leaching duration of 30 min. Rare earth extraction under these conditions is 87.6-93.0%. The degree of concentration of the valuable components is ∼10. The rare earths are separated from iron by extraction methods.

  9. Complexity Measures, Task Type, and Analytic Evaluations of Speaking Proficiency in a School-Based Assessment Context

    Science.gov (United States)

    Gan, Zhengdong

    2012-01-01

    This study, which is part of a large-scale study of using objective measures to validate assessment rating scales and assessment tasks in a high-profile school-based assessment initiative in Hong Kong, examined how grammatical complexity measures relate to task type and analytic evaluations of students' speaking proficiency in a classroom-based…

  10. Network complexity as a measure of information processing across resting-state networks: Evidence from the Human Connectome Project

    Directory of Open Access Journals (Sweden)

    Ian M Mcdonough

    2014-06-01

    Full Text Available An emerging field of research focused on fluctuations in brain signals has provided evidence that the complexity of those signals, as measured by entropy, conveys important information about network dynamics (e.g., local and distributed processing). While much research has focused on how neural complexity differs in populations of different age groups or with clinical disorders, substantially less research has focused on the basic understanding of neural complexity in populations with young and healthy brain states. The present study used resting-state fMRI data from the Human Connectome Project (Van Essen et al., 2013) to test the extent to which neural complexity in the BOLD signal, as measured by multiscale entropy, (1) would differ from random noise, (2) would differ between four major resting-state networks previously associated with higher-order cognition, and (3) would be associated with the strength and extent of functional connectivity, a complementary method of estimating information processing. We found that complexity in the BOLD signal exhibited patterns different from white, pink, and red noise and that neural complexity was differentially expressed between resting-state networks, including the default mode, cingulo-opercular, and left and right frontoparietal networks. Lastly, neural complexity across all networks was negatively associated with functional connectivity at fine scales, but positively associated with functional connectivity at coarse scales. The present study is the first to characterize neural complexity in BOLD signals at a high temporal resolution and across different networks, and it might help clarify the inconsistencies between neural complexity and functional connectivity, thus informing the mechanisms underlying neural complexity.
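
The multiscale entropy measure named above can be sketched as coarse-graining followed by sample entropy. The parameters (m = 2, r = 0.2) and the white-noise test signal are illustrative choices, not the study's preprocessing of BOLD data.

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts template pairs of length m
    within Chebyshev tolerance r, A the same for length m + 1."""
    def count(mm):
        c, n = 0, len(x) - mm + 1
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, scale):
    """Average non-overlapping windows: the multiscale coarse-graining step."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]

rng = random.Random(0)
white = [rng.gauss(0, 1) for _ in range(300)]
mse = multiscale_entropy(white)
# White noise loses entropy under coarse-graining (averaging smooths it);
# the study contrasts such curves with those of pink and red noise.
```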

  11. Inspection of Complex Internal Surface Shape with Fiber-optic Sensor II: for Specular Tilted Surface Measurement

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Complex surface shape measurement has been a focus topic in the CAD/CAM field. A popular method for measuring dimensional information is using a 3D coordinate measuring machine (CMM)with a touch trigger probe. The measurement set up with CMM, however, is a time consuming task and the accuracy of the measurement deteriorates as the speed of measurement increase. Non-contact measurement is favored since high speed measurement can be achieved and problems with vibration and friction can be eliminated. Although much research has been conducted in non-contact measurement using image capturing and processing schemes, accuracy is poor and measurement is limited. Some optical technologies developed provide a good accuracy but the dynamic range and versatility is very limited. A novel fiber-optic sensor used for the inspection of complex internal contours is presented in this paper, which is able to measure a surface shape in a non-contact manner with high accuracy and high speed, and is compact and flexible to be incorporated into a CMM. Modulation functions for tilted surface shape measurement, based on the Gaussian distribution of the emitting beam from single-mode fiber (SMF), were derived for specular reflection. The feasibility of the proposed measurement principle was verified by simulations.

  12. An entropy-based measure of hydrologic complexity and its applications

    Science.gov (United States)

    Castillo, Aldrich; Castelli, Fabio; Entekhabi, Dara

    2015-07-01

    Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope-scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy-based and discretization-invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (maximum organization) and a uniform distribution (widest distribution). Applying the distributed hydrologic model MOBIDIC to seven test basins with areas ranging from 10^0 to 10^3 km^2 and representing semiarid and temperate climates, H is shown to capture distributional characteristics of soil moisture fields. It can also track the temporal evolution of the distributional features. Furthermore, this paper explores how basin attributes affect the characteristic H, and how H can be used to explain interbasin variability in hydrologic response. Relationships are found only by grouping basins with the same climate or size. For the semiarid basins, H scales with catchment area, topographic wetness, infiltration ratio, and base flow index; while H is inversely related to relief ratio.
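
An index with the two limiting behaviors described above (zero at a Dirac delta, maximal for a uniform spread) can be sketched as a normalized Shannon entropy of the binned soil-moisture field. This is an illustrative stand-in, not the paper's discretization-invariant MOBIDIC formulation; the bin count and field values are our own assumptions.

```python
import math

def complexity_index(values, n_bins=20, lo=0.0, hi=1.0):
    """Normalized Shannon entropy of a soil-moisture field: 0 when all
    mass sits in one bin (Dirac-delta-like organization), 1 when mass
    is spread uniformly over the bins."""
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for v in values:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(n_bins)   # divide by the maximum, ln(n_bins)

organized = [0.30] * 1000                  # near-Dirac field, index -> 0
spread = [i / 1000 for i in range(1000)]   # evenly spread field, index -> 1
```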

  13. An entropy‐based measure of hydrologic complexity and its applications

    Science.gov (United States)

    Castelli, Fabio; Entekhabi, Dara

    2015-01-01

    Abstract Basin response and hydrologic fluxes are functions of hydrologic states, most notably of soil moisture. However, characterization of hillslope-scale soil moisture is challenging since it is both spatially heterogeneous and dynamic. This paper introduces an entropy-based and discretization-invariant dimensionless index of hydrologic complexity H that measures the distance of a given distribution of soil moisture from a Dirac delta (maximum organization) and a uniform distribution (widest distribution). Applying the distributed hydrologic model MOBIDIC to seven test basins with areas ranging from 10^0 to 10^3 km^2 and representing semiarid and temperate climates, H is shown to capture distributional characteristics of soil moisture fields. It can also track the temporal evolution of the distributional features. Furthermore, this paper explores how basin attributes affect the characteristic H, and how H can be used to explain interbasin variability in hydrologic response. Relationships are found only by grouping basins with the same climate or size. For the semiarid basins, H scales with catchment area, topographic wetness, infiltration ratio, and base flow index; while H is inversely related to relief ratio. PMID:26937055

  14. [Evaluation of a complex trace element composition and bioutilization using isotope technics and total body measurement].

    Science.gov (United States)

    Balogh, L; Kerekes, A; Bodó, K; Körösi, L; Jánoki, G A

    1998-05-24

    Modified mineral and trace element solutions were prepared containing Zn-65, Co-57, Mn-54, Fe-59, Mo-99 and Ni-63 isotopes, which were physico-chemically identical to the original solution. Bioutilization examinations were carried out on animals receiving their normal feed, after per os application of the complex trace element composition (CTEC): whole-body retention studies, bioassays, and scintigraphic and excretion examinations in altogether 180 Wistar rats, 6 Beagle and 2 mongrel dogs, using a whole-body counter, gamma and beta counters, a gamma camera and metabolic cages. Extremely high whole-body retention was measured in the case of iron (8-30%), high utilization in the cases of zinc (4-5%), cobalt (4-6%), molybdenum (3-4%) and manganese (2-4%), and a lower value in the case of nickel. Bioassay and scintigraphic evaluations showed marked liver, kidney, and muscle uptake and moderate blood uptake. Excretion was mainly (more than 90%) via the faeces in the cases of zinc, manganese, iron and nickel, although cobalt was excreted 8% and molybdenum 52% via the urinary tract. Our results show that the isotope technique combined with whole-body counting and excretion studies is a suitable method for trace element bioutilization studies. PMID:9632924

  15. Method for Determining the Activation Energy Distribution Function of Complex Reactions by Sieving and Thermogravimetric Measurements.

    Science.gov (United States)

    Bufalo, Gennaro; Ambrosone, Luigi

    2016-01-14

    A method for studying the kinetics of thermal degradation of complex compounds is suggested. Although the method is applicable to any matrix whose grain size can be measured, herein we focus our investigation on thermogravimetric analysis, under a nitrogen atmosphere, of ground soft wheat and ground maize. The thermogravimetric curves reveal two well-distinct jumps of mass loss: they correspond to the volatilization region, in the temperature range 298-433 K, and the decomposition region, from 450 to 1073 K. Thermal degradation is schematized as a reaction in the solid state whose kinetics is analyzed separately in each of the two regions. By means of a sieving analysis, different size fractions of the material are separated and studied. A quasi-Newton fitting algorithm is used to obtain the grain size distribution as a best fit to the experimental data. The individual fractions are analyzed thermogravimetrically to derive the functional relationship between the activation energy of the degradation reactions and the particle size. This functional relationship turns out to be crucial for evaluating the moments of the activation energy distribution, which is otherwise unknown, in terms of the distribution calculated by sieve analysis. From knowledge of the moments one can reconstruct the reaction conversion. The method is applied first to the volatilization region, then to the decomposition region. Comparison with the experimental data reveals that the method reproduces the experimental conversion with an accuracy of 5-10% in the volatilization region and of 3-5% in the decomposition region. PMID:26671287

  16. Airborne Measurements of Aerosol Emissions From the Alberta Oil Sands Complex

    Science.gov (United States)

    Howell, S. G.; Clarke, A. D.; McNaughton, C. S.; Freitag, S.

    2012-12-01

    The Alberta oil sands contain a vast reservoir of fossil hydrocarbons. The extremely viscous bitumen requires significant energy to extract and upgrade to make a fluid product suitable for pipelines and further refinement. The mining and upgrading processes constitute a large industrial complex in an otherwise sparsely populated area of Canada. During the ARCTAS project in June/July 2008, while studying forest fire plumes, the NASA DC-8 and P-3B flew through the plume a total of 5 times. One was a coordinated visit by both aircraft; the other 3 were fortuitous passes downwind. One study has been published about gas emissions from the complex. Here we concentrate on aerosol emissions and aging. As previously reported, there appear to be at least 2 types of plumes produced. One is an industrial-type plume with vast numbers of ultrafine particles, SO2, sulfate, black carbon (BC), CO, and NO2. The other, probably from the mining, has more organic aerosol and BC, together with dust-like aerosols at 3 μm and a 1 μm mode of unknown origin. The DC-8 crossed the plume about 10 km downwind of the industrial site, giving time for the boundary layer to mix and enabling a very crude flux calculation suggesting that sulfate and organic aerosols were each produced at about 500 g/s (estimated errors are a factor of 2, chiefly due to concerns about vertical mixing). Since this was a single flight during a project dedicated to other purposes, and operating conditions and weather may change fluxes considerably, this may not be a typical flux. As the plume progresses downwind, the ultrafine particles grow to sizes effective as cloud condensation nuclei (CCN), SO2 is converted to sulfate, and organic aerosol is produced. During fair weather in the summer, as was the case during these flights, cloud convection pumps aerosol above the mixed layer. While the aerosol plume is difficult to detect from space, NO2 is measured by the OMI instrument on the Aura satellite and the oil sands plume

  17. Characterization of Nuclear Materials Using Complex of Non-Destructive and Mass-Spectroscopy Methods of Measurements

    International Nuclear Information System (INIS)

    The Information and Analytical Centre for nuclear materials investigations was established in the Russian Federation on February 2, 2009 by the ROSATOM State Atomic Energy Corporation (order #80). Its purpose is to prevent unauthorized access to nuclear materials and to exclude their illicit traffic. The Information and Analytical Centre includes an analytical laboratory to determine the composition and properties of nuclear materials of unknown origin for their identification. According to its Regulation, the Centre deals with: · identification of nuclear materials of unknown origin to provide information about their composition and properties; · arbitration analyses of nuclear materials; · comprehensive research on nuclear and radioactive materials to develop techniques for materials characterization; · interlaboratory measurements; · measurements for control and accounting; · confirmatory measurements. A complex of non-destructive and mass-spectroscopy techniques was developed for the measurements. The complex consists of: · gamma-ray techniques based on the MGAU, MGA and FRAM codes for uranium and plutonium isotopic composition; · a gravimetric technique, supplemented by gamma spectroscopy, for uranium content; · a calorimetric technique for plutonium mass; · a neutron multiplicity technique for plutonium mass; · a mass-spectroscopy measurement technique for uranium isotopic composition; · a mass-spectroscopy measurement technique for metallic impurities. The complex satisfies the state regulation requirements for ensuring the uniformity of measurements, including the Russian Federation Federal Law on Ensuring the Uniformity of Measurements #102-FZ, Interstate Standard GOST R ISO/IEC 17025-2006, National Standards of the Russian Federation GOST R 8.563-2009 and GOST R 8.703-2010, and Federal Regulations NRB-99/2009 and OSPORB 99/2010. The complex is provided with reference materials, equipment and certified techniques. The complex is included in accredited

  18. Complex bounds and microstructural recovery from measurements of sea ice permittivity

    International Nuclear Information System (INIS)

    Sea ice is a porous composite of pure ice with brine, air, and salt inclusions. The polar sea ice packs play a key role in the earth's ocean-climate system, and they host robust algal and bacterial communities that support the Arctic and Antarctic ecosystems. Monitoring the sea ice packs on global or regional scales is an increasingly important problem, typically involving the interaction of an electromagnetic wave with sea ice. In the quasistatic regime where the wavelength is much longer than the composite microstructural scale, the electromagnetic behavior is characterized by the effective complex permittivity tensor ε*. In assessing the impact of climate change on the polar sea ice covers, current satellites and algorithms can predict ice extent, but the thickness distribution remains an elusive, yet most important feature. In recent years, electromagnetic induction devices using low frequency waves have been deployed on ships, helicopters and planes to obtain thickness data. Here we compare two sets of theoretical bounds to extensive outdoor tank and in situ field data on ε* at 50 MHz taken in the Arctic and Antarctic. The sea ice is assumed to be a two phase composite of ice and brine with known constituent permittivities. The first set of bounds assumes only knowledge of the brine volume fraction or porosity, and the second set further assumes statistical isotropy of the microstructure. We obtain excellent agreement between theory and experiment, and are able to observe the apparent violation of the isotropic bounds as the vertically oriented microstructure becomes increasingly connected for higher porosities. Moreover, these bounds are inverted to obtain estimates of the porosity from the measurements of ε*. We find that the temporal variations of the reconstructed porosity, which is directly related to temperature, closely follow the actual behavior
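
The bound-then-invert logic described above can be illustrated with the elementary first-order (Wiener) bounds on a two-phase mixture. The constituent permittivities below are placeholders, not the paper's 50 MHz sea-ice values, and the paper's actual volume-fraction and isotropic bounds are tighter than this sketch.

```python
def wiener_bounds(f, eps_ice, eps_brine):
    """First-order (Wiener) bounds on the effective complex permittivity
    of a two-phase ice/brine mixture with brine volume fraction f: the
    arithmetic mean (parallel slabs) and harmonic mean (series slabs)
    of the constituent permittivities."""
    upper = f * eps_brine + (1 - f) * eps_ice
    lower = 1 / (f / eps_brine + (1 - f) / eps_ice)
    return lower, upper

def porosity_from_arithmetic(eps_eff, eps_ice, eps_brine):
    """Invert the arithmetic bound for the brine volume fraction,
    mirroring how measured ε* can be mapped back to porosity."""
    return ((eps_eff - eps_ice) / (eps_brine - eps_ice)).real

# Placeholder constituent permittivities (illustrative, not measured data).
eps_ice = 3.15 + 0.002j
eps_brine = 80.0 + 1000.0j
lower, upper = wiener_bounds(0.05, eps_ice, eps_brine)
f_rec = porosity_from_arithmetic(upper, eps_ice, eps_brine)   # recovers 0.05
```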

  19. Risk reduction measures applied to horizontal directional drilling of a complex pipeline river crossing in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Cocciolo, P.P.; Zeleny, B. [Terasen Pipelines Inc., Calgary, AB (Canada)

    2004-07-01

    The Trans Mountain Pipe Line (TMPL) system transports crude oil and refined products from Edmonton, Alberta to Burnaby, British Columbia (BC) and northwest Washington State. The lower mainland of BC along Canada's west coast is classed as seismic zone 4, the highest-risk earthquake classification. However, construction of the NPS 24, 1142 km pipeline was completed in 1953, when both seismic design knowledge and seismic risk awareness were limited. A seismic overview assessment, completed in 1998, of the 140 km segment of the pipeline from Hope station to the Burnaby terminal revealed that the crossing of the Fraser River in southwestern BC is vulnerable to damage in the 1/475 year return and 1/2000 year return earthquakes. The pipeline configurations at the north and south banks of the Fraser River are at risk from lateral soil spreading caused by liquefaction during a large earthquake. Mitigating seismic risk in liquefiable areas includes straightening the pipeline and aligning it parallel to the direction of lateral spreading, and minimizing the length of pipe in liquefiable soils. The most effective alternative from a cost and seismic-risk-mitigation perspective was found to be the replacement of the pipeline crossing by horizontal directional drilling (HDD). In 2003, the HDD replacement was completed. The complexity and risk of the HDD crossing were high due to the level of urban development and the existence of major linear infrastructure on both sides of the river. Measures were applied to reduce construction risk, environmental damage, and delays or inability to complete the crossings. Strategies included proper pipeline route selection, site-specific geotechnical investigations, HDD annular pressure monitoring, HDD electronic drilling recording, and a contractor prequalification process. Over a period of one month, a 1293 m long HDD crossing was installed without incident. 4 refs., 1 tab., 6 figs.

  20. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    Science.gov (United States)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2013-07-01

    Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets including (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external (e.g., climate change) and internal influences (e.g., infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of the required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  1. Binary, ternary and quaternary liquid-liquid equilibria in 1-butanol, oleic acid, water and n-heptane mixtures

    NARCIS (Netherlands)

    Winkelman, J. G. M.; Kraai, G. N.; Heeres, H. J.

    2009-01-01

This work reports on liquid-liquid equilibria in the system 1-butanol, oleic acid, water and n-heptane used for biphasic, lipase catalysed esterifications. The literature was studied on the mutual solubility in binary systems of water and each of the organic components. Experimental results were obtained on the composition of the coexisting phases of a series of ternary and quaternary mixtures of the components at 301, 308 and 313 K. The data were correlated successfully with the UNIQUAC model...

  2. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    OpenAIRE

    Couach, O.; Balin, I.; R. Jiménez; Ristori, P.; Perego, S.; Kirchner, F.; Simeonov, V.; B. Calpini; Bergh, H.

    2003-01-01

    This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days, during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations ...

  3. Validation of ASTER Surface Temperature Data with In Situ Measurements to Evaluate Heat Islands in Complex Urban Areas

    OpenAIRE

    Bonggeun Song; Kyunghun Park

    2014-01-01

This study compared Advanced Spaceborne Thermal Emission Reflection Radiometer (ASTER) surface temperature data with in situ measurements to validate the use of ASTER data for studying heat islands in urban settings with complex spatial characteristics. Eight sites in Changwon, Korea, were selected for analyses. Surface temperature data were extracted from the thermal infrared (TIR) band of ASTER on four dates during the summer and fall of 2012, and corresponding in situ measurements of temperature were also collected...

  4. Inactivation of bacteria on surfaces by sprayed slightly acidic hypochlorous acid water: in vitro experiments.

    Science.gov (United States)

    Hakim, Hakimullah; Alam, Md Shahin; Sangsriratanakul, Natthanan; Nakajima, Katsuhiro; Kitazawa, Minori; Ota, Mari; Toyofuku, Chiharu; Yamada, Masashi; Thammakarn, Chanathip; Shoham, Dany; Takehara, Kazuaki

    2016-08-01

The capacity of slightly acidic hypochlorous acid water (SAHW), in both liquid and spray form, to inactivate bacteria was evaluated as a potential candidate for biosecurity enhancement in poultry production. SAHW (containing 50 or 100 ppm chlorine, pH 6) was able to inactivate Escherichia coli and Salmonella Infantis in liquid to below detectable levels (≤2.6 log10 CFU/ml) within 5 sec of exposure. In addition, the antibacterial capacity of SAHW was evaluated by spraying it with a nebulizer into a box containing these bacteria, present on the surfaces of glass plates and rayon sheets. SAHW was able to inactivate both bacterial species on the glass plates (dry condition) and rayon sheets within 5 min of spraying and 5 min of contact time, with the exception of 50 ppm SAHW on the rayon sheets. Furthermore, a corrosivity test determined that SAHW does not corrode metallic objects, even at the longest exposure times (83 days). Our findings demonstrate that SAHW is a good candidate for biosecurity enhancement in the poultry industry. Spraying it on the surfaces of objects, eggshells, egg incubators and transport cages could reduce the chances of contamination and disease transmission. These results augment previous findings demonstrating the competence of SAHW as an anti-viral disinfectant. PMID:27052464

  5. Inception of Acetic Acid/Water Cluster Growth in Molecular Beams.

    Science.gov (United States)

    Bende, Attila; Perretta, Giuseppe; Sementa, Paolo; Di Palma, Tonia M

    2015-10-01

The influence of carboxylic acids on water nucleation in the gas phase has been explored in the supersonic expansion of water vapour mixed with acetic acid (AcA) at various concentrations. The sodium-doping method has been used to detect clusters produced in supersonic expansions by using UV photoionisation. The mass spectra obtained at lower acid concentrations show well-detected Na(+)-AcA(H2O)n and Na(+)-AcA2(H2O)n clusters up to 200 Da and, in the best cooling expansions, emerging Na(+)-AcAm(H2O)n signals at higher masses and unresolved signals that extend beyond m/e values >1000 Da. These signals, which increase with increasing acid content in water vapour, are an indication that the cluster growth taking place arises from mixed water-acid clusters. Theoretical calculations show that small acid-water clusters are stable and their formation is even thermodynamically favoured with respect to pure water clusters, especially at lower temperatures. These findings suggest that acetic acid may play a significant role as a pre-nucleation embryo in the formation of aerosols in wet environments. PMID:26296812

  6. Measurement of the speed of sound by observation of the Mach cones in a complex plasma under microgravity conditions

    International Nuclear Information System (INIS)

We report the first observation of the Mach cones excited by a larger microparticle (projectile) moving through a cloud of smaller microparticles (dust) in a complex plasma with neon as a buffer gas under microgravity conditions. A collective motion of the dust particles occurs as propagation of the contact discontinuity. The corresponding speed of sound was measured by a special method of Mach cone visualization. The measurement results are incompatible with the theory of ion acoustic waves. The estimate for the pressure in a strongly coupled Coulomb system and a scaling law for the complex plasma make it possible to derive an estimate of the speed of sound, which is in reasonable agreement with the experiments in complex plasmas

  7. Modeling and measuring Business/IT Alignment by using a complex-network approach

    OpenAIRE

    Sousa, José Luís da Rocha

    2014-01-01

Business/IT Alignment is an information systems research field with a long existence and a high number of researchers and represents a central thinking direction over the entanglement between business and information systems. It aims to achieve a paradigm on which there is a high degree of visibility and availability of information about the information systems sociomateriality. Complex-networks constitute an approach to the study of the emergent properties of complex-sys...

  8. Measuring and Perceiving Changes in Oral Complexity, Accuracy and Fluency: Examining Instructed Learners' Short-Term Gains

    Science.gov (United States)

    Tonkyn, Alan Paul

    2012-01-01

    This paper reports a case study of the nature and extent of progress in speaking skills made by a group of upper intermediate instructed learners, and also assessors' perceptions of that progress. Initial and final interview data were analysed using several measures of Grammatical and Lexical Complexity, Language Accuracy and Fluency. These…

  9. Comment on 'Interpretation of the Lempel-Ziv Complexity Measure in the context of Biomedical Signal Analysis'

    CERN Document Server

    Balasubramanian, Karthi

    2013-01-01

In this Communication, we express our reservations on some aspects of the interpretation of the Lempel-Ziv Complexity measure (LZ) by Mateo et al. in "Interpretation of the Lempel-Ziv complexity measure in the context of biomedical signal analysis," IEEE Trans. Biomed. Eng., vol. 53, no. 11, pp. 2282-2288, Nov. 2006. In particular, we comment on the dependence of the LZ complexity measure on the number of harmonics, frequency content and amplitude modulation. We disagree with the following statements made by Mateo et al.: 1. "LZ is not sensitive to the number of harmonics in periodic signals." 2. "LZ increases as the frequency of a sinusoid increases." 3. "Amplitude modulation of a signal does not result in an increase in LZ." We show the dependence of the LZ complexity measure on harmonics and amplitude modulation by using a modified version of the synthetic signal that has been used in the original paper. Also, the second statement is a generic statement which is not entirely true. This is true only in the low freque...
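The sensitivity of LZ complexity to signal content that this comment debates is easy to probe numerically. The sketch below uses an LZ78-style phrase count over a median-binarized signal; `lz_complexity` and `binarize` are illustrative names of ours, not code from either paper:

```python
import random

def lz_complexity(sequence):
    """Count the distinct phrases in an LZ78-style parse of a symbol string.

    More regular signals parse into fewer phrases, hence lower complexity.
    """
    phrases = set()
    phrase = ""
    for sym in sequence:
        phrase += sym
        if phrase not in phrases:
            phrases.add(phrase)  # new phrase completed; start the next one
            phrase = ""
    return len(phrases)

def binarize(samples):
    """Threshold a signal at its median: the usual step before computing LZ."""
    median = sorted(samples)[len(samples) // 2]
    return "".join("1" if x > median else "0" for x in samples)

random.seed(0)
periodic = "01" * 500                                  # strictly periodic sequence
noisy = "".join(random.choice("01") for _ in range(1000))
print(lz_complexity(periodic) < lz_complexity(noisy))  # True: periodicity lowers LZ
```

Feeding `binarize`d versions of a plain sinusoid, a multi-harmonic signal, and an amplitude-modulated signal into `lz_complexity` reproduces the kind of comparison at issue in the comment.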

  10. Carleson Measures for the Drury-Arveson Hardy space and other Besov-Sobolev spaces on Complex Balls

    OpenAIRE

    Arcozzi, N.; Rochberg, R.; Sawyer, E

    2007-01-01

    We characterize the Carleson measures for the Drury-Arveson Hardy space and other Hilbert spaces of analytic functions of several complex variables. This provides sharp estimates for Drury's generalization of Von Neumann's inequality. The characterization is in terms of a geometric condition, the "split tree condition", which reflects the nonisotropic geometry underlying the Drury-Arveson Hardy space.

  11. Finalizing a measurement framework for the burden of treatment in complex patients with chronic conditions

    Directory of Open Access Journals (Sweden)

    Eton DT

    2015-03-01

% were coping with multiple chronic conditions. A preliminary conceptual framework using data from the first 32 interviews was evaluated and was modified using narrative data from 18 additional interviews with a racially and socioeconomically diverse sample of patients. The final framework features three overarching themes with associated subthemes: (1) the work patients must do to care for their health (eg, taking medications, keeping medical appointments, monitoring health); (2) challenges/stressors that exacerbate perceived burden (eg, financial, interpersonal, provider obstacles); and (3) impacts of burden (eg, role limitations, mental exhaustion). All themes and subthemes were subsequently confirmed in focus groups. Conclusion: The final conceptual framework can be used as a foundation for building a patient self-report measure to systematically study treatment burden for research and analytical purposes, as well as to promote meaningful clinic-based dialogue between patients and providers about the challenges inherent in maintaining complex self-management of health. Keywords: treatment burden, conceptual framework, adherence, questionnaire, self-management, multi-morbidity

  12. Complexity measures of the central respiratory networks during wakefulness and sleep

    Science.gov (United States)

    Dragomir, Andrei; Akay, Yasemin; Curran, Aidan K.; Akay, Metin

    2008-06-01

    Since sleep is known to influence respiratory activity we studied whether the sleep state would affect the complexity value of the respiratory network output. Specifically, we tested the hypothesis that the complexity values of the diaphragm EMG (EMGdia) activity would be lower during REM compared to NREM. Furthermore, since REM is primarily generated by a homogeneous population of neurons in the medulla, the possibility that REM-related respiratory output would be less complex than that of the awake state was also considered. Additionally, in order to examine the influence of neuron vulnerabilities within the rostral ventral medulla (RVM) on the complexity of the respiratory network output, we inhibited respiratory neurons in the RVM by microdialysis of GABAA receptor agonist muscimol. Diaphragm EMG, nuchal EMG, EEG, EOG as well as other physiological signals (tracheal pressure, blood pressure and respiratory volume) were recorded from five unanesthetized chronically instrumented intact piglets (3-10 days old). Complexity of the diaphragm EMG (EMGdia) signal during wakefulness, NREM and REM was evaluated using the approximate entropy method (ApEn). ApEn values of the EMGdia during NREM and REM sleep were found significantly (p < 0.05 and p < 0.001, respectively) lower than those of awake EMGdia after muscimol inhibition. In the absence of muscimol, only the differences between REM and wakefulness ApEn values were found to be significantly different.
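The approximate entropy (ApEn) statistic used in this study follows Pincus's standard formulation. The sketch below is a minimal illustrative implementation; the default template length m and tolerance r are common choices, not necessarily those of the study:

```python
import math
import random

def approximate_entropy(x, m=2, r=0.2):
    """ApEn(m, r) of a 1-D series; lower values indicate a more regular signal.

    r is interpreted as a fraction of the series' standard deviation.
    """
    n = len(x)
    mean = sum(x) / n
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    tol = r * sd

    def phi(m):
        # Log-averaged fraction of templates within tolerance of each template.
        templates = [x[i:i + m] for i in range(n - m + 1)]
        logs = []
        for t1 in templates:
            matches = sum(1 for t2 in templates
                          if max(abs(a - b) for a, b in zip(t1, t2)) <= tol)
            logs.append(math.log(matches / len(templates)))
        return sum(logs) / len(templates)

    return phi(m) - phi(m + 1)

random.seed(1)
regular = [0.0, 1.0] * 60                      # perfectly repeating pattern
irregular = [random.random() for _ in range(120)]
print(approximate_entropy(regular) < approximate_entropy(irregular))  # True
```

The plain-Python double loop is O(n²), which is fine for short epochs like the EMG windows analyzed here; vectorized implementations are preferred for long recordings.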

  13. Eddy-correlation measurements of benthic fluxes under complex flow conditions: Effects of coordinate transformations and averaging time scales

    DEFF Research Database (Denmark)

    Lorke, Andreas; McGinnis, Daniel F.; Maeck, Andreas

    2013-01-01

Eddy-correlation measurements of sediment oxygen uptake rates in aquatic systems are increasingly used to obtain areal-averaged fluxes with a high temporal resolution. Here we discuss the effects of coordinate rotation and averaging time scale for Reynolds decomposition on flux estimates. Using 119 hours of continuous eddy-correlation measurements of sediment oxygen fluxes in an impounded river, we demonstrate that rotation of measured current velocities into streamline coordinates can be a crucial and necessary step in data processing under complex flow conditions in non-flat environments with complex topography. We found that under these conditions neither time series detrending nor coordinate rotation can remove low-frequency velocity variations completely. These variations result in spurious flux contributions and in a pronounced dependence of the derived fluxes on averaging time scales...

  14. Application of the modified Wheeler cap method for radiation efficiency measurement of balanced electrically small antennas in complex environment

    DEFF Research Database (Denmark)

    Zhang, Jiaying; Pivnenko, Sergey; Breinbjerg, Olav

    2010-01-01

In this paper, application of a modified Wheeler cap method for the radiation efficiency measurement of balanced electrically small antennas is presented. It is shown that the limitations on the cavity dimension can be overcome and thus measurement in a large cavity is possible. The cavity loss is investigated, and a modified radiation efficiency formula that includes the cavity loss is introduced. Moreover, a modification of the technique is proposed that involves the antenna working in a complex environment inside the Wheeler cap, and thus makes possible measurement of an antenna close to a hand or head...

  15. Matrix measure method for global exponential stability of complex-valued recurrent neural networks with time-varying delays.

    Science.gov (United States)

    Gong, Weiqiang; Liang, Jinling; Cao, Jinde

    2015-10-01

In this paper, based on the matrix measure method and the Halanay inequality, the global exponential stability problem is investigated for complex-valued recurrent neural networks with time-varying delays. Without constructing any Lyapunov functions, several sufficient criteria are obtained to ascertain the global exponential stability of the addressed complex-valued neural networks under different activation functions. Here, the activation functions are no longer assumed to be differentiable, as is always demanded in related references. In addition, the obtained results are easy to verify and implement in practice. Finally, two examples are given to illustrate the effectiveness of the obtained results.
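For a concrete sense of the matrix measure underlying this approach: for the 2-norm, μ₂(A) is the largest eigenvalue of the symmetric part (A + Aᵀ)/2, and μ₂(A) < 0 certifies exponential contraction of x′ = Ax without constructing a Lyapunov function. A minimal real-valued 2×2 sketch (ours, not the paper's complex-valued delayed setting):

```python
import math

def matrix_measure_2(a):
    """mu_2(A) for a real 2x2 matrix A (nested lists): the largest
    eigenvalue of the symmetric part S = (A + A^T) / 2."""
    s11, s22 = a[0][0], a[1][1]
    s12 = 0.5 * (a[0][1] + a[1][0])   # off-diagonal entry of the symmetric part
    trace = s11 + s22
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    disc = math.sqrt((s11 - s22) ** 2 + 4.0 * s12 ** 2)
    return 0.5 * (trace + disc)

# mu_2(A) < 0 certifies exponential stability of x' = A x, since
# ||x(t)|| <= exp(mu_2(A) * t) * ||x(0)|| for every initial condition.
print(matrix_measure_2([[-3.0, 1.0], [0.0, -2.0]]))  # negative -> contracting
```

The paper combines such a measure (on complex matrices, using the conjugate transpose) with the Halanay inequality to handle the time-varying delays.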

  16. Inferring a Drive-Response Network from Time Series of Topological Measures in Complex Networks with Transfer Entropy

    Directory of Open Access Journals (Sweden)

    Xinbo Ai

    2014-11-01

    Full Text Available Topological measures are crucial to describe, classify and understand complex networks. Lots of measures are proposed to characterize specific features of specific networks, but the relationships among these measures remain unclear. Taking into account that pulling networks from different domains together for statistical analysis might provide incorrect conclusions, we conduct our investigation with data observed from the same network in the form of simultaneously measured time series. We synthesize a transfer entropy-based framework to quantify the relationships among topological measures, and then to provide a holistic scenario of these measures by inferring a drive-response network. Techniques from Symbolic Transfer Entropy, Effective Transfer Entropy, and Partial Transfer Entropy are synthesized to deal with challenges such as time series being non-stationary, finite sample effects and indirect effects. We resort to kernel density estimation to assess significance of the results based on surrogate data. The framework is applied to study 20 measures across 2779 records in the Technology Exchange Network, and the results are consistent with some existing knowledge. With the drive-response network, we evaluate the influence of each measure by calculating its strength, and cluster them into three classes, i.e., driving measures, responding measures and standalone measures, according to the network communities.
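A minimal plug-in estimator for transfer entropy on symbol sequences (history length 1) can sketch the idea behind such a drive-response framework. The function name and the toy drive-response pair below are ours; the paper's symbolic, effective, and partial refinements are not reproduced:

```python
import math
import random
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in transfer entropy T(src -> dst) in bits, history length 1:
    how much src's present adds to predicting dst's next symbol
    beyond dst's own present."""
    n = len(dst) - 1
    triple = Counter()  # (dst_next, dst_now, src_now)
    pair = Counter()    # (dst_next, dst_now)
    cond = Counter()    # (dst_now, src_now)
    hist = Counter()    # (dst_now,)
    for t in range(n):
        triple[(dst[t + 1], dst[t], src[t])] += 1
        pair[(dst[t + 1], dst[t])] += 1
        cond[(dst[t], src[t])] += 1
        hist[dst[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triple.items():
        p_joint = c / n
        p_cond_both = c / cond[(y0, x0)]
        p_cond_self = pair[(y1, y0)] / hist[y0]
        te += p_joint * math.log2(p_cond_both / p_cond_self)
    return te

# Toy drive-response pair: y simply copies x with a one-step lag,
# so information should flow x -> y and not y -> x.
random.seed(2)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # True
```

Repeating this in both directions over every pair of measure time series, and keeping only significant asymmetries, yields a directed drive-response network of the kind the paper infers.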

  17. Measuring Conceptual Complexity: A Content-Analytic Model Using the Federal Income Tax Laws.

    Science.gov (United States)

    Karlinsky, Stewart S.; Andrews, J. Douglas

    1986-01-01

    Concludes that more than 15 percent of the federal income tax law's complexity is attributable to the capital gains sections. Confirms the idea that the capital gain and loss provisions substantially complicate the law in both absolute and relative terms. (FL)

  18. Using the Solution Space Diagram in Measuring the Effect of Sector Complexity During Merging Scenarios

    NARCIS (Netherlands)

    Abdul Rahman, S.M.B.; Van Paassen, M.M.; Mulder, M.

    2011-01-01

When designing Air Traffic Control (ATC) sectors and procedures, traffic complexity and workload are important issues. For predicting ATC workload, metrics based on the Solution Space Diagram (SSD) have been proposed. This paper studies the effect of sector design on workload and SSD metrics. When c...

  19. Reaction time and rapid serial processing measures of information processing speed in multiple sclerosis: complexity, compounding, and augmentation.

    Science.gov (United States)

    Hughes, Abbey J; Denney, Douglas R; Lynch, Sharon G

    2011-11-01

    Information processing speed is frequently cited as the primary cognitive domain impacted by multiple sclerosis (MS) and is usually evaluated with reaction time (RT) or rapid serial processing (RSP) measures. The present study compared the efficacy of RT and RSP measures to distinguish between patients with MS (N = 42) and healthy controls (N = 40). The RT measure was patterned after the Computerized Tests of Information Processing and included measures of simple, choice, and semantic RT. The RSP measures consisted of the Symbol Digit Modalities Test (SDMT) and the Stroop Test. Substantial differences in information processing speed between patients and controls were found on all tests, with slightly larger effect sizes for RSP measures than RT measures and for the SDMT than the Stroop Test. Binary logistic regression analyses showed RSP measures performed better than RT measures at distinguishing patients from controls, and likewise, the SDMT score performed better than the scores derived from the Stroop Test. Results are discussed in the context of three effects associated with common measures of processing speed: complexity, compounding, and augmentation. PMID:22040901

  20. Instrumentation Suite for Acoustic Propagation Measurements in Complex Shallow Water Environments

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Obtain at-sea measurements to test theoretical and modeling predictions of acoustic propagation in dynamic, inhomogeneous, and nonisotropic shallow water...

  1. Force and complexity of tongue task training influences behavioral measures of motor learning

    DEFF Research Database (Denmark)

    Kothari, Mohit; Svensson, Peter; Huo, Xueliang;

    2012-01-01

Relearning of motor skills is important in neurorehabilitation. We investigated the improvement of training success during simple tongue protrusion (two force levels) and a more complex tongue-training paradigm using the Tongue Drive System (TDS). We also compared subject-based reports of fun, pain, fatigue, and motivation between paradigms. Three randomized sessions and one control experiment were performed. Sixteen healthy subjects completed two different 1-h sessions of simple tongue training with 1 N and 3 N, respectively, and one TDS session. After 1 wk, six out of 16 subjects participated ... the experienced group performed equal to the last 5 min of their first TDS session and neither group improved during rest. Training with the TDS was rated as more fun, less painful, less fatiguing, and more motivating compared with simple tongue training. In conclusion, force level and complexity of tongue...

  2. Measuring social complexity and the emergence of cooperation from entropic principles

    CERN Document Server

    López-Corona, O; Huerta, A; Mustri-Trejo, D; Perez, K; Ruiz, A; Valdés, O; Zamudio, F

    2015-01-01

Assessing quantitatively the state and dynamics of a social system is a very difficult problem. It is of great importance for both practical and theoretical reasons, such as establishing the efficiency of social action programs, detecting possible community needs or allocating resources. In this paper we propose a new general theoretical framework for the study of social complexity, based on the relation of complexity and entropy in combination with evolutionary dynamics to assess the dynamics of the system. Imposing the second law of thermodynamics, we study the conditions under which cooperation emerges and demonstrate that it depends on the relative importance of local and global fitness. As cooperation is a central concept in sustainability, this thermodynamic-informational approach allows new insights and means to assess it using the concept of Helmholtz free energy. Finally we introduce a new set of equations that consider the more general case where the social system changes both in time and space, and relate ...

  3. Binary, ternary and quaternary liquid-liquid equilibria in 1-butanol, oleic acid, water and n-heptane mixtures

    OpenAIRE

    Winkelman, J.G.M.; Kraai, G. N.; Heeres, H. J.

    2009-01-01

This work reports on liquid-liquid equilibria in the system 1-butanol, oleic acid, water and n-heptane used for biphasic, lipase catalysed esterifications. The literature was studied on the mutual solubility in binary systems of water and each of the organic components. Experimental results were obtained on the composition of the coexisting phases of a series of ternary and quaternary mixtures of the components at 301, 308 and 313 K. The data were correlated successfully with the UNIQUAC model...

  4. Multiscale Cross-Approximate Entropy Analysis as a Measure of Complexity among the Aged and Diabetic

    OpenAIRE

    Hsien-Tsai Wu; Cyuan-Cin Liu; Men-Tzung Lo; Po-Chun Hsu; An-Bang Liu; Kai-Yu Chang; Chieh-Ju Tang

    2013-01-01

    Complex fluctuations within physiological signals can be used to evaluate the health of the human body. This study recruited four groups of subjects: young healthy subjects (Group 1, n = 32), healthy upper middle-aged subjects (Group 2, n = 36), subjects with well-controlled type 2 diabetes (Group 3, n = 31), and subjects with poorly controlled type 2 diabetes (Group 4, n = 24). Data acquisition for each participant lasted 30 minutes. We obtained data related to consecutive time series with R...

  5. High-precision optical measuring instruments and their application as part of mobile diagnostic complexes

    OpenAIRE

    Igor Miroshnichenko

    2014-01-01

The article presents the results of applying laser technologies and methods of optical interferometry for recording information in the quality control and diagnostics of construction materials and power elements by acoustic non-destructive testing, and describes new technical solutions that allow the results obtained to be applied to practical problems of diagnosing products in operation as part of mobile diagnostic complexes.

  6. High-precision optical measuring instruments and their application as part of mobile diagnostic complexes

    Directory of Open Access Journals (Sweden)

    Igor Miroshnichenko

    2014-04-01

Full Text Available The article presents the results of applying laser technologies and methods of optical interferometry for recording information in the quality control and diagnostics of construction materials and power elements by acoustic non-destructive testing, and describes new technical solutions that allow the results obtained to be applied to practical problems of diagnosing products in operation as part of mobile diagnostic complexes.

  7. Efficacy of dynamic traffic management measures: the influence of complexity and situational awareness

    NARCIS (Netherlands)

    Hoogendoorn, R.; Vreeswijk, J.D.; Hoogendoorn, S.P.; Brookhuis, K.A.; Arem, van B.; Berkum, van E.C.

    2012-01-01

Behavior of road users (e.g. route choice, driving behavior) is a critical factor in the efficacy of measures applied in the context of Dynamic Traffic Management (DTM). In order for drivers to make well-informed decisions, it is required that information provided by DTM measures is perceived. In th...

  8. MEASURING ACCURACY AND COMPLEXITY OF AN L2 LEARNER’S ORAL PRODUCTION

    Directory of Open Access Journals (Sweden)

    Teguh Khaerudin

    2015-03-01

Full Text Available This paper aims at examining the influence of different tasks on the degree of task performance in a second language learner's oral production. The underlying assumption is that among the three aspects of language performance in L2, i.e. fluency, accuracy, and complexity, learners may prioritize only one of them (Ellis & Barkhuizen, 2005, p. 150) and that their decision to prioritize one particular area of language performance may be determined by the characteristics of the task given to the learners (Skehan & Foster, 1997). Having a written record of an oral production, the writer focuses this study on determining the degree of complexity and accuracy, and analyzing whether the different tasks change the level of the learner's oral performance. The results show that the learner's accuracy from both tasks remains at the same level. However, both task conditions, which do not allow speech planning, result in no improvement in the accuracy level and a minor improvement in the complexity level.

  9. Is the habitation of acidic-water sanctuaries by galaxiid fish facilitated by natural organic matter modification of sodium metabolism?

    Science.gov (United States)

    Glover, Chris N; Donovan, Katherine A; Hill, Jonathan V

    2012-01-01

    Acidic waters of New Zealand's West Coast are hypothesized to be a refuge for native galaxiid fish, allowing them to escape predation from acid-sensitive invasive salmonid species. To determine the mechanisms by which galaxiids tolerate low pH, we investigated sodium metabolism in inanga Galaxias maculatus in response to water pH, short-term acclimation to acidic waters, the presence and source of natural organic matter (NOM), and fish life history. Contrary to expectation, inanga were physiologically sensitive to acid exposure, displaying inhibited sodium influx and exacerbated sodium efflux. Short-term (144 h) acclimation to acid did not modify this effect, and NOM did not exert a protective effect on sodium metabolism at low pH. Inanga sourced from naturally acidic West Coast waters did, however, display a sodium influx capacity (J(max)) that was significantly elevated when compared with that of fish collected from neutral waters. All inanga, independent of source, exhibited exceptionally high sodium uptake affinities (18-40 μM) relative to previously studied freshwater teleosts. Although inanga displayed relatively poor physiological tolerance to acidic waters, their high sodium influx affinity coupled with their occupation of near-coastal waters with elevated sodium levels may permit habitation of low-pH freshwaters. PMID:22902374
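The uptake kinetics referred to here (J_max and uptake affinity K_m) follow the Michaelis-Menten form J = J_max·[Na]/(K_m + [Na]). A small sketch with hypothetical numbers (not the study's measured values) shows why a low K_m matters in dilute fresh water:

```python
def sodium_influx(na_um, j_max=1.0, km_um=30.0):
    """Michaelis-Menten uptake rate at external sodium na_um (micromolar).

    j_max and km_um are illustrative placeholders, not values
    measured for inanga.
    """
    return j_max * na_um / (km_um + na_um)

# At [Na] equal to K_m the transporter runs at half its maximum rate;
# a high-affinity (low K_m) system stays closer to saturation in
# dilute fresh water than a low-affinity one at the same [Na].
for km in (30.0, 300.0):
    rate = sodium_influx(100.0, j_max=1.0, km_um=km)
    print(f"Km={km:.0f} uM -> {100 * rate:.0f}% of J_max at 100 uM Na")
```

With these illustrative parameters the high-affinity transporter runs at 77% of J_max at 100 μM Na, against 25% for the low-affinity one, which is the sense in which the reported 18-40 μM affinities could offset the sodium losses imposed by low pH.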

  10. An improved technique for the measurement of the complex susceptibility of magnetic colloids in the microwave region

    OpenAIRE

    FANNIN, PAUL; COUPER, COLIN STUART

    2010-01-01

Measurements by means of the short-circuit (S/C) and open-circuit (O/C) transmission line techniques are well-established methods for investigating the magnetic and dielectric properties of magnetic colloids, respectively. In particular, the S/C technique has been used in the investigation of the resonant properties of ferrofluids; resonance being indicated by the transition of the real component of the magnetic complex susceptibility, χ(ω) = χ′(ω) − iχ″(ω), from a positive to a negative value...

  11. Complexity-based measures inform tai chi’s impact on standing postural control in older adults with peripheral neuropathy

    OpenAIRE

    Manor, Bradley David; Lipsitz, Lewis Arnold; Wayne, Peter Michael; Peng, Chung-Kang; Li, Li

    2013-01-01

    Background: Tai Chi training enhances physical function and may reduce falls in older adults with and without balance disorders, yet its effect on postural control as quantified by the magnitude or speed of center-of-pressure (COP) excursions beneath the feet is less clear. We hypothesized that COP metrics derived from complex systems theory may better capture the multi-component stimulus that Tai Chi has on the postural control system, as compared with traditional COP measures. Methods: We p...

  12. A convenient method for complex permittivity measurement of thin materials at microwave frequencies

    Science.gov (United States)

    Chung, B. K.

    2006-05-01

A practical problem in the reflection method for measuring the permittivity of thin materials is the difficulty of ensuring the sample is placed exactly at the waveguide flange. A small position offset of the dielectric slab will give rise to significant errors in calculating the permittivity. To circumvent this problem, a measurement method using a waveguide partially filled with a thin material slab has been developed. The material sample can be easily prepared and inserted into the guide through a longitudinal slot on the broad wall of the waveguide. Multiple material slabs can be measured rapidly because one does not have to disconnect the waveguide system for sample placement. The method is verified with measurements of Teflon, glass and FR4 fibreglass. The measured permittivity shows good agreement with published data. Subsequently, the permittivity of a vegetation leaf was measured. The method presented in this paper is particularly useful in measuring the permittivity of a thin and narrow slab of natural materials such as a paddy leaf.

  13. A convenient method for complex permittivity measurement of thin materials at microwave frequencies

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B K [Faculty of Engineering, Multimedia University, 63100 Cyberjaya (Malaysia)

    2006-05-07

A practical problem in the reflection method for measuring the permittivity of thin materials is the difficulty of ensuring the sample is placed exactly at the waveguide flange. A small position offset of the dielectric slab will give rise to significant errors in calculating the permittivity. To circumvent this problem, a measurement method using a waveguide partially filled with a thin material slab has been developed. The material sample can be easily prepared and inserted into the guide through a longitudinal slot on the broad wall of the waveguide. Multiple material slabs can be measured rapidly because one does not have to disconnect the waveguide system for sample placement. The method is verified with measurements of Teflon, glass and FR4 fibreglass. The measured permittivity shows good agreement with published data. Subsequently, the permittivity of a vegetation leaf was measured. The method presented in this paper is particularly useful in measuring the permittivity of a thin and narrow slab of natural materials such as a paddy leaf.

  14. Validation of ASTER Surface Temperature Data with In Situ Measurements to Evaluate Heat Islands in Complex Urban Areas

    Directory of Open Access Journals (Sweden)

    Bonggeun Song

    2014-01-01

    This study compared Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) surface temperature data with in situ measurements to validate the use of ASTER data for studying heat islands in urban settings with complex spatial characteristics. Eight sites in Changwon, Korea, were selected for analysis. Surface temperature data were extracted from the thermal infrared (TIR) band of ASTER on four dates during the summer and fall of 2012, and corresponding in situ measurements of temperature were also collected. Comparisons showed that ASTER-derived temperatures were generally 4.27°C lower than temperatures collected by in situ measurements during the daytime, except on cloudy days. However, ASTER temperatures were higher by 2.23–2.69°C on two dates during the nighttime. Temperature differences between a city park and a paved area were insignificant. Differences between ASTER-derived temperatures and on-site measurements are caused by a variety of factors, including the application of emissivity values that do not consider the complex spatial characteristics of urban areas. Therefore, to improve the accuracy of surface temperatures extracted from infrared satellite imagery, we propose a revised model whereby temperature data are obtained from ASTER and emissivity values for various land covers are extracted based on in situ measurements.

  15. Measurement of reflected second harmonics and nonlinearity parameter using a transducer with complex structure

    Institute of Scientific and Technical Information of China (English)

    MA Qingyu; LU Rongrong; ZHANG Dong; GONG Xiufen; LIU Xiaozhou

    2003-01-01

    Measurement of the nonlinearity parameter using the second-harmonic reflective model is studied. A new kind of compound transducer is designed and fabricated for this purpose. With this transducer and the finite-amplitude insert-substitution method, an experimental system to measure the nonlinearity parameter using the reflective model is developed. B/A values of some liquids and biological tissues are obtained, and the results coincide well with those presented in the literature.
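    The second-harmonic approach above can be sketched numerically. This is a minimal illustration, assuming the standard lossless plane-wave pre-shock relation p2 = βωp1²z/(2ρc³) with β = 1 + B/2A; the function name, water-like medium constants and drive parameters are illustrative, not values from the paper, and a reflective setup would additionally fold in the round-trip path and reflection losses.

```python
import math

def b_over_a(p1, p2, f, z, rho=1000.0, c=1500.0):
    """Invert the lossless plane-wave second-harmonic relation
    p2 = beta * omega * p1**2 * z / (2 * rho * c**3), beta = 1 + B/(2A),
    to estimate the acoustic nonlinearity parameter B/A."""
    omega = 2 * math.pi * f
    beta = 2 * rho * c**3 * p2 / (omega * p1**2 * z)
    return 2 * (beta - 1)

# Round trip with water-like values: beta = 3.5 corresponds to B/A = 5
p1, f, z, rho, c = 1.0e5, 2.0e6, 0.05, 1000.0, 1500.0
p2 = 3.5 * 2 * math.pi * f * p1**2 * z / (2 * rho * c**3)
ba = b_over_a(p1, p2, f, z)
```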

  16. Assessment of the microclimatic and human comfort conditions in a complex urban environment: Modelling and measurements

    Energy Technology Data Exchange (ETDEWEB)

    Gulyas, Agnes; Unger, Janos [University of Szeged, Szeged (Hungary). Department of Climatology and Landscape Ecology; Matzarakis, Andreas [Meteorological Institute, University of Freiburg, Freiburg im Breisgau (Germany)

    2006-12-15

    Several complex thermal indices (e.g. Predicted Mean Vote and Physiological Equivalent Temperature) were developed in the last decades to describe and quantify the thermal environment of humans and the energy fluxes between body and environment. Compared to open spaces/landscapes the complex surface structure of urban areas creates an environment with special microclimatic characteristics, which have a dominant effect on the energy balance of the human body. In this study, outdoor thermal comfort conditions are examined through two field-surveys in Szeged, a South-Hungarian city (population 160,000). The intensity of radiation fluxes is dependent on several factors, such as surface structure and housing density. Since our sample area is located in a heavily built-up city centre, radiation fluxes are mainly influenced by narrow streets and several 20-30-year-old (20-30 m tall) trees. Special emphasis is given to the human-biometeorological assessment of the microclimate of complex urban environments through the application of the thermal index PET. The analysis is carried out by the utilization of the RayMan model. Firstly, bioclimatic conditions of sites located close to each other but shaded differently by buildings and plants are compared. The results show that differences in the PET index amongst these places can be as high as 15-20 °C due to the different irradiation. Secondly, the investigation of different modelled environments by RayMan (only buildings, buildings+trees and only trees) shows significant alterations in the human comfort sensation between the situations. (author)

  17. Application of Image Measurement and Continuum Mechanics to the Direct Measurement of Two-Dimensional Finite Strain in a Complex Fibro-Porous Material

    Science.gov (United States)

    Britton, Paul; Loughran, Jeff

    This paper outlines a computational procedure that has been implemented for the direct measurement of finite material strains from digital images taken of a material surface during plane-strain process experiments. The selection of both hardware and software components of the image processing system is presented, and the numerical procedures developed for measuring the 2D material deformations are described. The algorithms are presented with respect to two-roll milling of sugar cane bagasse, a complex fibro-porous material that undergoes large strains during processing to extract the sucrose-rich liquid. Elaborations are made regarding numerical developments for other forms of experimentation, algorithm calibration and measurement improvement. Finite 2D strain results are shown for both confined uniaxial compression and two-roll milling experiments.

  18. A Graphical Aid for the Complex Permittivity Measurement at Microwave and Millimeter Wavelengths

    OpenAIRE

    Silveirinha, M. G.; Fernandes, C. A.; Costa, Jorge R.

    2014-01-01

    WOS:000337131400021 (Web of Science accession number) We introduce a novel procedure to retrieve the complex permittivity ϵ'-jϵ'' of dielectric materials. It is a variant of the well-known waveguide method, and uses as input the one-port reflection data from a vector network analyzer connected to a short-circuited rectangular waveguide filled with a dielectric sample of known length. Here, it is shown that for low to moderate loss materials, the locus of the reflection coefficient in the compl...
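    The short-circuited waveguide measurement can be sketched with the forward model that such retrieval procedures invert: the TE10 reflection coefficient of a shorted guide section filled with a candidate permittivity. This is a hedged sketch, not the authors' retrieval code; the WR-90 dimensions, sample length and FR4-like permittivity below are illustrative assumptions.

```python
import cmath, math

C0 = 2.99792458e8  # speed of light (m/s)

def reflection_shorted_guide(eps_r, f, a=22.86e-3, d=10e-3):
    """Reflection coefficient, at the empty/filled interface, of a
    short-circuited rectangular waveguide (TE10 mode) filled over
    length d with complex relative permittivity eps_r (e' - je'')."""
    k0 = 2 * math.pi * f / C0
    kc = math.pi / a                          # TE10 cutoff wavenumber
    kz0 = cmath.sqrt(k0**2 - kc**2)           # empty-guide propagation constant
    kz1 = cmath.sqrt(eps_r * k0**2 - kc**2)   # filled-section propagation constant
    # TE wave impedances are proportional to 1/kz; the shorted filled
    # section presents the input impedance Zin = j*Z1*tan(kz1*d)
    z0, z1 = 1 / kz0, 1 / kz1
    zin = 1j * z1 * cmath.tan(kz1 * d)
    return (zin - z0) / (zin + z0)

# A lossless (air) filling reflects everything: |Gamma| = 1
g_air = reflection_shorted_guide(1.0, 10e9)
# A lossy dielectric absorbs part of the wave: |Gamma| < 1
g_fr4 = reflection_shorted_guide(4.4 - 0.09j, 10e9)
```

Retrieval then amounts to searching for the ϵ'-jϵ'' whose predicted reflection coefficient matches the measured one, which is where the locus plot described in the paper helps.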

  19. Towards a methodology for validation of centrality measures in complex networks.

    Directory of Open Access Journals (Sweden)

    Komal Batool

    BACKGROUND: Living systems are associated with social networks: networks made up of nodes, some of which may be more important in various respects than others. While different quantitative measures labeled as "centralities" have previously been used in the network analysis community to find influential nodes in a network, it is debatable how valid the centrality measures actually are. In other words, the research question that remains unanswered is: how exactly do these measures perform in the real world? So, as an example, if the centrality of a particular node identifies it as important, is the node actually important? PURPOSE: The goal of this paper is not just to perform a traditional social network analysis but rather to evaluate different centrality measures by conducting an empirical study analyzing how network centralities correlate with data from published multidisciplinary network data sets. METHOD: We take standard published network data sets while using a random network to establish a baseline. These data sets included the Zachary Karate Club network, the dolphin social network and a neural network of the nematode Caenorhabditis elegans. Each of the data sets was analyzed in terms of different centrality measures and compared with existing knowledge from associated published articles to review the role of each centrality measure in the determination of influential nodes. RESULTS: Our empirical analysis demonstrates that in the chosen network data sets, nodes which had a high Closeness Centrality also had a high Eccentricity Centrality. Likewise, a high Degree Centrality correlated closely with a high Eigenvector Centrality. Betweenness Centrality, by contrast, varied according to network topology and did not demonstrate any noticeable pattern. In terms of identification of key nodes, we discovered that, compared with other centrality measures, Eigenvector and Eccentricity Centralities were better able to identify
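    Two of the centralities compared in the study are easy to state in code. The sketch below implements degree centrality and power-iteration eigenvector centrality from scratch on a small made-up undirected graph (not one of the cited data sets):

```python
def degree_centrality(adj):
    """Fraction of the other nodes each node is connected to."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def eigenvector_centrality(adj, iters=200):
    """Power iteration: repeatedly multiply by the adjacency matrix and
    renormalise; converges to the principal eigenvector for a connected,
    non-bipartite graph."""
    x = {v: 1.0 for v in adj}
    for _ in range(iters):
        x_new = {v: sum(x[u] for u in adj[v]) for v in adj}
        norm = max(x_new.values())
        x = {v: val / norm for v, val in x_new.items()}
    return x

# Toy undirected graph as an adjacency dict (node names are arbitrary)
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c", "e"},
    "e": {"d"},
}
dc = degree_centrality(adj)
ec = eigenvector_centrality(adj)
```

On this graph the two measures agree on the extremes (node "a" high, "e" low), echoing the correlations the paper reports between some centrality pairs.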

  20. Noise exposure assessment with task-based measurement in complex noise environment

    Institute of Scientific and Technical Information of China (English)

    LI Nan; YANG Qiu-ling; ZENG Lin; ZHU Liang-liang; TAO Li-yuan; ZHANG Hua; ZHAO Yi-ming

    2011-01-01

    Background Task-based measurement (TBM) is a method, besides the dosimeter, for assessing the eight-hour A-weighted equivalent noise exposure level (LAeq.8h). TBM can be more easily used in factories by non-professional workers and staff. However, it is still not clear whether TBM is equivalent or similar to the dosimeter for LAeq.8h measurement in general. This study considered the measurement with a dosimeter as the real personal noise exposure level (PNEL) and assessed the accuracy of TBM by comparing the consistency of TBM and dosimeter LAeq.8h measurements. Methods The study was conducted in one automobile firm among 387 workers exposed to unstable noise. Dosimeters and TBM were used to compare the two strategies and assess the degree of agreement and causes of disagreement. Workers' PNELs were measured via TBM for noise; the real PNELs were also recorded. The TBM for noise was computed from task/position noise levels measured via sound level meter and workers' exposure information collected via working diary forms (WDF) filled in by the participants themselves. Full-shift noise exposure measurements via personal noise dosimeters were taken as the real PNEL. A general linear model (GLM) was built to analyze the accuracy of TBM for noise and the sources of difference between TBM for noise and the real PNEL. Results The LAeq.8h with TBM were slightly higher than the real PNELs, except for the electricians. Differences between the two values were statistically significant for stamping workers (P<0.001), assembly workers (P=0.015) and welding workers (P=0.001). The correlation coefficient of LAeq.8h with TBM and real PNELs was 0.841. Differences between the two results were mainly affected by real PNEL (F=11.27, P=0.001); work groups (F=3.11, P<0.001) divided by jobs and workshops were also independent factors. The PNEL of workers with a fixed task/position ((86.53±8.82) dB(A)) was higher than that of those without ((75.76±9.92) dB(A)) (t=8.84, P<0.01). Whether workers had a fixed task/position was another factor on the
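    The task-based LAeq.8h itself is an energy average of task levels weighted by task durations over the 8-h shift. A minimal sketch with made-up task levels and durations (not data from the study):

```python
import math

def laeq_8h(tasks):
    """Task-based LAeq,8h: energy-average the per-task A-weighted levels
    over an 8-hour reference shift. `tasks` is a list of
    (level_dBA, duration_hours) pairs."""
    energy = sum(h * 10 ** (L / 10) for L, h in tasks)
    return 10 * math.log10(energy / 8.0)

# e.g. 4 h stamping at 88 dB(A), 2 h assembly at 82 dB(A), 2 h office at 65 dB(A)
level = laeq_8h([(88, 4), (82, 2), (65, 2)])
```

The logarithmic average is dominated by the loudest task, which is why errors in the diary-reported durations of the noisiest tasks matter most for TBM accuracy.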

  1. ELF field in the proximity of complex power line configuration measurement procedures.

    Science.gov (United States)

    Benes, M; Comelli, M; Villalta, R

    2006-01-01

    The issue of how to measure magnetic induction fields generated by various power line configurations, when several power lines run across the same exposure area, has become a matter of interest and study within the Regional Environment Protection Agency of Friuli Venezia Giulia. In classifying the various power line typologies, the definition of a double-circuit line was given: in this instance the magnetic field is determined by knowing the electrical and geometric parameters of the line. In the case of independent lines, instead, the field is undetermined. It is therefore pointed out that, in the latter case, extracting previsional information from a set of measurements of the magnetic field alone is impossible. Making measurements throughout the territory of service has in several cases offered the opportunity to define standard operational procedures. PMID:16410292
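    For the determined (double-circuit) case the article mentions, the field follows from the electrical and geometric parameters by superposition of long straight conductors. A hedged sketch under a 2-D quasi-static approximation, with illustrative currents, phases and geometry (not a real line configuration):

```python
import cmath, math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def b_field(lines, x, y):
    """Magnitude of the quasi-static magnetic induction at (x, y) from
    infinitely long horizontal conductors, by phasor superposition.
    Each line is (current_A, phase_rad, x0, y0)."""
    bx = by = 0j
    for i_amp, phase, x0, y0 in lines:
        dx, dy = x - x0, y - y0
        r2 = dx * dx + dy * dy
        i_ph = i_amp * cmath.exp(1j * phase)
        # B = mu0*I/(2*pi*r), directed perpendicular to the radius vector
        bx += -MU0 * i_ph * dy / (2 * math.pi * r2)
        by += MU0 * i_ph * dx / (2 * math.pi * r2)
    return math.sqrt(abs(bx) ** 2 + abs(by) ** 2)

# Balanced three-phase circuit: conductors 10 m up, 2 m apart, 300 A
circuit = [(300.0, 0.0, -2.0, 10.0),
           (300.0, 2 * math.pi / 3, 0.0, 10.0),
           (300.0, 4 * math.pi / 3, 2.0, 10.0)]
b_ground = b_field(circuit, 0.0, 1.0)  # 1 m above ground, under the line
```

For independent lines the phase relationship between the circuits is unknown, which is exactly why the field cannot be reconstructed from geometry alone, as the abstract notes.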

  2. Comparisons of Snowfall Measurements in Complex Terrain Made During the 2010 Winter Olympics in Vancouver

    Science.gov (United States)

    Boudala, Faisal S.; Isaac, George A.; Rasmussen, Roy; Cober, Stewart G.; Scott, Bill

    2014-01-01

    Solid precipitation (SP) intensity (Rs) using four automatic gauges, Pluvio, PARSIVEL (PArticle SIze and VELocity), FD12P and POSS, and radar reflectivity factor (Ze) using the POSS and PARSIVEL were measured at a naturally sheltered station (VOA) located at high level (1,640 m) on Whistler Mountain in British Columbia, Canada. The Rs and other standard meteorological parameters were collected from March 2009 and from November 2009 to February 2010. The wind speed (ws) measured during this period ranged from 0 to 4.5 m s-1, with a mean value of 0.5 m s-1. The temperature varied from 4 to -17 °C. The SP amount reported by the PARSIVEL was higher than that reported by the Pluvio by more than a factor of 2, while the FD12P and POSS measured relatively smaller amounts, but much closer to those reported by the Pluvio and manual measurements. The dependence of Rs from the PARSIVEL on wind speed was examined, but no significant dependence was found. The PARSIVEL's precipitation retrieval algorithm was modified and tested using three different snow density-size relationships (ρs-D) reported in the literature. It was found that after modification of the algorithm, the derived Rs amounts using the raw data agreed reasonably well with the Pluvio. Statistical analysis shows that more than 95% of the data measured by the POSS correlate well with the reflectivity factors determined using the three ρs-D relationships. The automated Pluvio accumulation and manually determined daily SP amounts (SPm) measured during five winter months were compared. The mean ratio (MR), the mean difference (MD) and the correlation coefficient (r) calculated using the data collected with the two methods were found to be 0.96, 0.4 and 0.6, respectively, indicating respectable agreement between the two methods, with the Pluvio underestimating the amount by about 4%.
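    The agreement statistics quoted above (mean ratio, mean difference, Pearson correlation) are straightforward to compute for any gauge/manual pair. A sketch with invented accumulation values, not the Whistler data:

```python
import math

def comparison_stats(gauge, manual):
    """Mean ratio, mean difference and Pearson r between two equal-length
    precipitation series."""
    n = len(gauge)
    mr = sum(g / m for g, m in zip(gauge, manual)) / n
    md = sum(g - m for g, m in zip(gauge, manual)) / n
    mg, mm = sum(gauge) / n, sum(manual) / n
    cov = sum((g - mg) * (m - mm) for g, m in zip(gauge, manual))
    sg = math.sqrt(sum((g - mg) ** 2 for g in gauge))
    sm = math.sqrt(sum((m - mm) ** 2 for m in manual))
    return mr, md, cov / (sg * sm)

# Toy daily accumulations (mm): automated gauge vs manual observation
pluvio = [12.0, 8.5, 20.1, 5.2, 15.3]
manual = [12.5, 9.0, 20.0, 6.0, 15.0]
mr, md, r = comparison_stats(pluvio, manual)
```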

  3. Measuring the pollutant transport capacity of dissolved organic matter in complex matrixes

    DEFF Research Database (Denmark)

    Persson, L.; Alsberg, T.; Odham, G.;

    2003-01-01

    Dissolved organic matter (DOM)-facilitated transport in contaminated groundwater was investigated through the measurement of the binding capacity of landfill leachate DOM (Vejen, Denmark) towards two model pollutants (pyrene and phenanthrene). Three different methods for measuring binding capacity… It was further concluded that DOM-facilitated transport should be taken into account for non-ionic PAHs with lg KOW above 5, at DOM concentrations above 250 mg C/L. The total DOM concentration was found to be more important for the potential of facilitated transport than differences in the DOM binding capacity…

  4. Overview of Approaches to Incorporate Dynamics into the Measurement of Complex Phenomena with the Use of Composite Indices

    Directory of Open Access Journals (Sweden)

    Anna Łatuszyńska

    2012-06-01

    Composite indices have substantially gained in popularity in recent years. Despite their alleged disadvantages, they appear to be very useful in measuring the level of certain phenomena that are too complex to express with a single indicator. Most rankings based on composite indicators are created at regular intervals, such as every month, every quarter or every year. A common approach is to base rankings solely on the most current values of the single indicators, making no reference to previous results. The absence of dynamics from such measurements deprives studies of information on change in these phenomena and may limit the stability of classifications. This article presents the possibility of creating reliable, dynamic rankings of measured items and measuring complex phenomena with the use of composite indices. Potential solutions are presented on the basis of a review of the international literature. Some advantages and disadvantages of the presented solutions are described, and an example of a new approach is shown.

  5. Measurement of net electric charge and dipole moment of dust aggregates in a complex plasma

    CERN Document Server

    Yousefi, Razieh; Carmona-Reyes, Jorge; Matthews, Lorin S; Hyde, Truell W

    2014-01-01

    Understanding the agglomeration of dust particles in complex plasmas requires a knowledge of the basic properties such as the net electrostatic charge and dipole moment of the dust. In this study, dust aggregates are formed from gold coated mono-disperse spherical melamine-formaldehyde monomers in a radio-frequency (rf) argon discharge plasma. The behavior of observed dust aggregates is analyzed both by studying the particle trajectories and by employing computer models examining 3D structures of aggregates and their interactions and rotations as induced by torques arising from their dipole moments. These allow the basic characteristics of the dust aggregates, such as the electrostatic charge and dipole moment, to be determined. It is shown that the experimental results support the predicted values from computer models for aggregates in these environments.

  6. Measuring the complex permittivity tensor of uniaxial biological materials with coplanar waveguide transmission line

    Science.gov (United States)

    A simple and accurate technique is described for measuring the uniaxial permittivity tensor of biological materials with a coplanar waveguide transmission-line configuration. Permittivity tensor results are presented for several chicken and beef fresh meat samples at 2.45 GHz....

  7. Online measurement of mental representations of complex spatial decision problems : Comparison of CNET and hard laddering

    NARCIS (Netherlands)

    O. Horeni (Oliver); T.A. Arentze (Theo); B.G.C. Dellaert (Benedict); H.J.P. Timmermans (Harry)

    2013-01-01

    This paper introduces the online Causal Network Elicitation Technique (CNET) as a technique for measuring components of mental representations of choice tasks and compares it with the more common technique of online ‘hard’ laddering (HL). While CNET works in basically two phases, one in

  8. Measures of Causality in Complex Datasets with Application to Financial Data

    Directory of Open Access Journals (Sweden)

    Anna Zaremba

    2014-04-01

    This article investigates the causality structure of financial time series. We concentrate on three main approaches to measuring causality: linear Granger causality, kernel generalisations of Granger causality (based on ridge regression and the Hilbert–Schmidt norm of the cross-covariance operator) and transfer entropy, examining each method and comparing their theoretical properties, with special attention given to the ability to capture nonlinear causality. We also present the theoretical benefits of applying non-symmetrical rather than symmetrical measures of dependence. We apply the measures to a range of simulated and real data. The simulated data sets were generated with linear and several types of nonlinear dependence, using bivariate as well as multivariate settings. An application to real-world financial data highlights the practical difficulties, as well as the potential of the methods. We use two real data sets: (1) U.S. inflation and one-month Libor; (2) S&P data and exchange rates for the following currencies: AUDJPY, CADJPY, NZDJPY, AUDCHF, CADCHF, NZDCHF. Overall, we reach the conclusion that no single method can be recognised as the best in all circumstances, and each of the methods has its domain of best applicability. We also highlight areas for improvement and future research.
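    Of the three causality measures compared, transfer entropy is the easiest to sketch compactly. Below is a plug-in estimator for symbolic (e.g. binarised) series with history length 1; the toy driver/response series are invented for illustration, and the estimator ignores the bias corrections a serious study would need:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(Y -> X) for symbolic series with
    history length 1: how much knowing y_t reduces the uncertainty of
    x_{t+1} beyond knowing x_t. Returned in bits."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    singles = Counter(x[:-1])                       # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te

# y drives x with a one-step lag, so TE(y -> x) should be clearly positive
y = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0] * 8
x = [0] + y[:-1]              # x copies y with lag 1
te_yx = transfer_entropy(x, y)
te_xy = transfer_entropy(y, x)
```

Note the asymmetry: unlike correlation, TE(y→x) and TE(x→y) generally differ, which is the non-symmetry property the article argues for.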

  9. The Texas Projection Measure: Ignoring Complex Ecologies in a Changing World

    Science.gov (United States)

    Roane, Warren

    2010-01-01

    The Texas Projection Measure (TPM) has grown out of the state's need to meet the requirements of No Child Left Behind (NCLB). An examination of the state's method of predicting 8th grade mathematics scores reveals that several factors have been ignored in the process of developing the model, including assumptions in its underlying statistical…

  10. Microwave generation and complex microwave responsivity measurements on small Dayem bridges

    DEFF Research Database (Denmark)

    Pedersen, Niels Falsig; Sørensen, O; Mygind, Jesper;

    1977-01-01

    Measurements of the active properties of a Dayem micro-bridge at X-band frequencies are described. The bridge was mounted in a microwave cavity designed to match the bridge properly, and the microwave output from the cavity was detected using a sensitive X-band spectrometer. Microwave power

  11. Multi-agent based modeling and execution framework for complex simulation, control and measuring tasks

    NARCIS (Netherlands)

    Papp, Z.; Hoeve, H.J.

    2000-01-01

    The paper presents a modeling concept and a supporting runtime environment, which enables running simulation, control and measuring (data processing) tasks on distributed implementation platforms. Its main features: (1) it is scaleable in various application domains; (2) it has a model based system

  12. Quality analysis applied on eddy covariance measurements at complex forest sites using footprint modelling

    NARCIS (Netherlands)

    Rebmann, C.; Göckede, M.; Foken, T.; Aubinet, M.; Aurela, M.; Berbigier, P.; Bernhofer, C.; Buchmann, N.; Carrara, A.; Cescatti, A.; Ceulemans, R.; Clement, R.; Elbers, J.A.; Granier, A.; Grünwald, T.; Guyon, D.; Havránková, K.; Heinesch, B.; Knohl, A.; Laurila, T.; Longdoz, B.; Marcolla, B.; Markkanen, T.; Miglietta, F.; Moncrieff, J.; Montagnani, L.; Moors, E.J.; Nardino, M.; Ourcival, J.M.; Rambal, S.; Rannik, Ü.; Rotenberg, E.; Sedlak, P.; Unterhuber, G.; Vesala, T.; Yakir, D.

    2005-01-01

    Measuring turbulent fluxes with the eddy covariance method has become a widely accepted and powerful tool for the determination of long term data sets for the exchange of momentum, sensible and latent heat, and trace gases such as CO2 between the atmosphere and the underlying surface. Several flux n

  13. Multi-sensor data fusion for measurement of complex freeform surfaces

    Science.gov (United States)

    Ren, M. J.; Liu, M. Y.; Cheung, C. F.; Yin, Y. H.

    2016-01-01

    Along with the rapid development of science and technology in fields such as space optics, multi-scale enriched freeform surfaces are widely used to enhance the performance of optical systems in both functionality and size reduction. Multi-sensor technology is considered one of the most promising methods for measuring and characterizing these surfaces at multiple scales. This paper presents a multi-sensor data fusion based measurement method to extract the geometric information of the components at different scales, which is used to establish a holistic geometry of the surface via data fusion. To address the key problems of multi-sensor data fusion, an intrinsic feature pattern based surface registration method is developed to transform the measured datasets to a common coordinate frame. A Gaussian zero-order regression filter is then used to separate each measured dataset into different scales, and the datasets are fused based on an edge intensity data fusion algorithm within the same wavelength. The fused data at different scales are then merged to form a new surface with holistic multiscale information. An experimental study is presented to verify the effectiveness of the proposed method.

  14. Measuring The Impact Of Innovations On Efficiency In Complex Hospital Settings

    Directory of Open Access Journals (Sweden)

    Bonća Petra Došenović

    2015-12-01

    In this paper the authors propose an approach for measuring the impact of innovations on hospital efficiency. The suggested methodology can be applied to any type of innovation, including technology-based innovations as well as consumer-focused and business-model innovations. The authors apply the proposed approach to measure the impact of transcanalicular diode laser-assisted dacryocystorhinostomy (DCR), an innovation introduced in the surgical procedure for treating a tear duct blockage, on the efficiency of general hospitals in Slovenia. They demonstrate that the impact of an innovation on hospital efficiency depends not only on the features of the studied innovation but also on the characteristics of the hospitals adopting the innovation and their external environment, represented by a set of comparable hospitals.

  15. Serum, urinary, and salivary nitric oxide in rheumatoid arthritis: complexities of interpreting nitric oxide measures

    OpenAIRE

    Weinberg, J. Brice; Lang, Thomas; Wilkinson, William E.; Pisetsky, David S.; St Clair, E. William

    2006-01-01

    Nitric oxide (NO) may play important roles in rheumatoid arthritis (RA). RA is an inflammatory disease involving joints and other systems including salivary glands. To assess NO production in RA patients, we compared levels of serum, urine, and salivary nitrite and nitrate (NOx) in patients with RA and normal subjects, and we examined the relationships of these measures to disease activity. Serum, urine, and NOx levels as well as renal creatinine, NOx clearance and fractional excretion rates ...

  16. Developing palaeolimnological records of organic content (DOC and POC) using the UK Acid Water Monitoring Network sites

    Science.gov (United States)

    Russell, Fiona; Chiverrell, Richard; Boyle, John

    2016-04-01

    Monitoring programmes have shown increases in concentrations of dissolved organic matter (DOM) in the surface waters of northern and central Europe (Monteith et al. 2007), and negative impacts of the browning of river waters have been reported for fish populations (Jonsson et al. 2012; Ranaker et al. 2012) and for ecosystem services such as water treatment (Tuvendal and Elmqvist 2011). Still, the exact causes of the recent browning remain uncertain, the main contenders being climate change (Evans et al. 2005) and reduced ionic strength in surface water resulting from declines in anthropogenic sulphur and sea salt deposition (Monteith et al. 2007). There is a need to better understand the pattern, drivers and trajectory of these increases in DOC and POC in both recent and longer-term (Holocene) contexts to improve the understanding of carbon cycling within lakes and their catchments. In Britain there are some ideal sites for testing whether these trends are preserved and for developing methods for reconstructing organic fluxes from lake sedimentary archives. There is a suite of lakes distributed across the country, the UK Acid Waters Monitoring Network (UKAWMN) sites, which have been monitored monthly for dissolved organic carbon and other aqueous species since 1988. These 12 lakes have well-studied recent and in some cases whole-Holocene sediment records. Here four of those lakes (Grannoch, Chon, Scoat Tarn and Cwm Mynach) are revisited, with sampling focused on the sediment-water interface and very recent sediments (approx. 150 years). At Scoat Tarn (approx. 1000 years) and Llyn Mynach (11.5k years) longer records have been obtained to assess equivalent patterns through the Holocene. Analyses of the gravity cores have focused on measuring and characterising the organic content for comparison with recorded surface water DOC measurements (UKAWMN). Data from pyrolysis measurements (TGA/DSC) in an N2 atmosphere show that the mass loss between 330-415°C correlates well with

  17. Measurement of the formation of complexes in tyrosine kinase-mediated signal transduction

    Energy Technology Data Exchange (ETDEWEB)

    Ladbury, John E., E-mail: j.ladbury@biochem.ucl.ac.uk [Department of Biochemistry and Molecular Biology, University College London, Gower Street, London WC1E 6BT (United Kingdom)

    2007-01-01

    The use of isothermal titration calorimetry (ITC) provides a full thermodynamic characterization of an interaction in one experiment. The affinity is an important value; however, the additional layer of information provided by the changes in enthalpy and entropy can help in understanding the biology. This is demonstrated with respect to tyrosine kinase-mediated signal transduction. Isothermal titration calorimetry (ITC) provides highly complementary data to high-resolution structural detail. An overview of the methodology of the technique is provided. Ultimately, the correlation of the thermodynamic parameters determined by ITC with the structural perturbation observed on going from the free to the bound state should be possible at an atomic level. Currently, thermodynamic data provide some insight as to potential changes occurring on complex formation. Here, this is demonstrated in the context of in vitro quantification of intracellular tyrosine kinase-mediated signal transduction and the issue of specificity of the important interactions. The apparent lack of specificity in the interactions of domains of proteins involved in early signalling from membrane-bound receptors is demonstrated using data from ITC.
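    The "full thermodynamic characterization in one experiment" follows from ΔG = -RT ln K = ΔH - TΔS: the fit to the titration curve yields K and ΔH directly, and ΔG and -TΔS follow. A small sketch with illustrative numbers (not data from the article):

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol*K)

def itc_thermodynamics(K, dH, T=298.15):
    """From an ITC fit, binding constant K (M^-1) and enthalpy dH
    (kJ/mol) give dG and -T*dS via dG = -R*T*ln(K) = dH - T*dS."""
    dG = -R * T * math.log(K)
    minus_TdS = dG - dH
    return dG, minus_TdS

# Illustrative numbers for a micromolar protein-domain interaction
dG, mTdS = itc_thermodynamics(K=1.0e6, dH=-40.0)
```

Here the binding is enthalpy-driven (ΔH more favourable than ΔG), with an entropic penalty; it is exactly this partitioning, invisible in the affinity alone, that the abstract argues helps in understanding the biology.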

  18. Cleaner production and methodological proposal of eco-efficiency measurement in a Mexican petrochemical complex.

    Science.gov (United States)

    Morales, M A; Herrero, V M; Martínez, S A; Rodríguez, M G; Valdivieso, E; Garcia, G; de los Angeles Elías, Maria

    2006-01-01

    In the frame of the Petróleos Mexicanos Institutional Program for Sustainable Development, processes in the manufacturing operations of the petrochemical industry were evaluated with the purpose of reducing their ecological footprint. Thirteen cleaner production opportunities were registered in six process plants: ethylene oxide and glycols, acetaldehyde, ethylene, high-density polyethylene, polypropylene switch and acrylonitrile, and 45 recommendations in the wastewater treatment plant. Morelos is the second most important petrochemical complex in the Mexican and Latin American petrochemical industry. A tool was developed to obtain eco-efficiency indicators in operation processes, and as a result potential savings were identified based on best performance, as well as the integrated distribution of Sankey diagrams. Likewise, a calculation mechanism is proposed to obtain economic savings based on the reduction of residues throughout the productive process. These improvement opportunities and recommendations will result in economic and environmental benefits, minimising water use, making efficient use of energy and raw materials, and reducing residues at the source, thereby generating fewer environmental impacts during the process.

  19. Determining Wind Turbine Gearbox Model Complexity Using Measurement Validation and Cost Comparison: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    LaCava, W.; Xing, Y.; Guo, Y.; Moan, T.

    2012-04-01

    The Gearbox Reliability Collaborative (GRC) has conducted extensive field and dynamometer test campaigns on two heavily instrumented wind turbine gearboxes. In this paper, data from the planetary stage are used to evaluate the accuracy and computation time of numerical models of the gearbox. First, planet-bearing load and motion data are analyzed to characterize planetary stage behavior in different environments and to derive requirements for gearbox models and life calculations. Second, a set of models is constructed that represent different levels of fidelity. Simulations of the test conditions are compared to the test data, and the computational costs of the models are compared. The test data suggest that the planet-bearing life calculations should be made separately for each bearing in a row due to unequal load distribution. They also show that tilting of the gear axes is related to planet load share. The modeling study concluded that fully flexible models were needed to predict planet-bearing loading in some cases, although less complex models were able to achieve good correlation in the field-loading case. Significant differences in planet load share were found in simulation and were dependent on the scope of the model and the bearing stiffness model used.

  20. Amines are likely to enhance neutral and ion-induced sulfuric acid-water nucleation in the atmosphere more effectively than ammonia

    Directory of Open Access Journals (Sweden)

    T. Kurtén

    2008-04-01

    We have studied the structure and formation thermodynamics of dimer clusters containing H2SO4 or HSO4− together with ammonia and seven different amines possibly present in the atmosphere, using the high-level ab initio methods RI-MP2 and RI-CC2. As expected from, e.g., proton affinity data, the binding of all studied amine–H2SO4 complexes is significantly stronger than that of NH3•H2SO4, while most amine–HSO4− complexes are only somewhat more strongly bound than NH3•HSO4−. Further calculations on larger cluster structures containing dimethylamine or ammonia together with two H2SO4 molecules, or one H2SO4 molecule and one HSO4− ion, demonstrate that amines, unlike ammonia, significantly assist the growth of not only neutral but also ionic clusters along the H2SO4 coordinate. A sensitivity analysis indicates that the difference in complexation free energies for amine- and ammonia-containing clusters is large enough to overcome the mass-balance effect caused by the fact that the concentration of amines in the atmosphere is probably 2 or 3 orders of magnitude lower than that of ammonia. This implies that amines might be more important than ammonia in enhancing neutral and especially ion-induced sulfuric acid-water nucleation in the atmosphere.

  2. Using Complexity Metrics With R-R Intervals and BPM Heart Rate Measures

    DEFF Research Database (Denmark)

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian;

    2013-01-01

    on variability of the data, different choices regarding the kind of measures can have a substantial impact on the results. In this article we compare linear and non-linear statistics on two prominent types of heart beat data, beat-to-beat intervals (R-R interval) and beats-per-minute (BPM). As a proof...... dynamics, but their power to do so critically depends on the type of data employed: while R-R intervals are very amenable to nonlinear analyses, the success of nonlinear methods for BPM data critically depends on their construction. Generally, ‘oversampled’ BPM time-series can be recommended...

  3. System Architecture for measuring and monitoring Beam Losses in the Injector Complex at CERN

    CERN Document Server

    Zamantzas, C; Dehning, B; Jackson, S; Kwiatkowski, M; Vigano, W

    2012-01-01

    The strategy for beam setup and machine protection of the accelerators at the European Organisation for Nuclear Research (CERN) is mainly based on its Beam Loss Monitoring (BLM) systems. For their upgrade to higher beam energies and intensities, a new BLM system is under development with the aim of providing faster measurement updates with higher dynamic range and the ability to accept more types of detectors as input compared to its predecessors. In this paper, the architecture of the complete system is explored, giving insight into the design choices made to provide a highly reconfigurable system that is able to fulfil the different requirements of each accelerator using reprogrammable devices.

  4. Ultrasound temporal-spatial phase-interference in complex composite media; a comparison of experimental measurement and simulation prediction.

    Science.gov (United States)

    Al-Qahtani, Saeed M; Langton, Christian M

    2016-09-01

    The propagation of ultrasound through solid:liquid complex composite media such as cancellous bone suffers from a lack of a comprehensive understanding of the dependence upon density and structure. Assuming that a propagating ultrasound wave may be considered as an array of parallel sonic rays, we may determine the transit time of each by the relative proportion of the two constituents. A transit time spectrum (TTS) describes the proportion of sonic rays having a particular transit time between the minimum (tmin) and maximum (tmax) values; representing, for example, entire bone tissue and marrow respectively in the case of cancellous bone. Langton has proposed that the primary ultrasound attenuation mechanism in such media is phase-interference. The phase-interference of two or more ultrasound pulses detected at a phase-sensitive transducer has both temporal and spatial components. The temporal component is primarily dependent upon the transit time difference (dt) between the pulses and the propagating pulse-length (PL). The spatial component is primarily dependent upon the lateral separation (ds) of the detected pulses of differing transit time and the lateral dimension of the ultrasound receive transducer aperture (dL). The aim of the paper was to explore these temporal and spatial dependencies through a comparison of experimental measurement and computer simulation in solid:liquid models of varying temporal and spatial complexity. Transmission measurements at nominal ultrasound frequencies of 1 MHz and 5 MHz were performed, thereby investigating the dependency upon period. The results demonstrated an overall agreement between experimental measurement and computer simulation of 87±16% and 85±12% for temporal and spatial components respectively. It is envisaged that a comprehensive understanding of ultrasound propagation through complex structures such as cancellous bone could provide an improved non-invasive tool for osteoporosis assessment.

  5. The direct and indirect measurement of boundary stress and drag on individual and complex arrays of elements

    Science.gov (United States)

    Tinoco, Rafael O.; Cowen, Edwin A.

    2013-04-01

    Motivated by the study of drag on plant canopies, a novel non-intrusive drag measurement device was developed—its design, calibration, and validation are presented. The device is based on isolating a region of a test facility, a section of the bed of an open channel flume in the present case, from the facility itself. The drag plate, sufficiently large to allow for spatial averaging over multiple elements, is constrained to move on essentially frictionless rails in the direction of flow, and the force applied to the plate by the interaction of objects on the plate with the flow is monitored. In contrast to force balances used in wind tunnels, our design allows for easy mounting of multiple elements in different configurations; it holds large vertical loads with negligible effect on the horizontal forces measured, does not require intrusive frames to hold the elements within the flow, locates all of its components externally at the bottom of the flume, providing immediate access for adjustments, and its mounted load cell is easily interchangeable to increase the measurement dynamic range without system modifications. The measurement of two canonical, well-studied cases is used to validate the drag plate approach: drag induced by a turbulent boundary layer and the drag on a rigid cylinder. A third series of experiments, flow through arrays of rigid cylinders, is presented to show the applicability of the drag plate to more complex flows. The experimental results confirm the drag plate approach to be suitable for the accurate direct measurement of drag on simple and complex arrays of objects, which makes it ideal for studies of vegetated flows, natural rough boundary layers, coastal structures, and urban canopies, to name a few possibilities.

  6. Circular dichroism measured on single chlorosomal light-harvesting complexes of green photosynthetic bacteria

    KAUST Repository

    Furumaki, Shu

    2012-12-06

    We report results on circular dichroism (CD) measured on single immobilized chlorosomes of a triple mutant of the green sulfur bacterium Chlorobaculum tepidum. The CD signal is measured by monitoring chlorosomal bacteriochlorophyll c fluorescence excited by alternate left and right circularly polarized laser light with a fixed wavelength of 733 nm. The excitation wavelength is close to a maximum of the negative CD signal of a bulk solution of the same chlorosomes. The average CD dissymmetry parameter obtained from an ensemble of individual chlorosomes was gs = -0.025, with an intrinsic standard deviation (due to variations between individual chlorosomes) of 0.006. The dissymmetry value is about 2.5 times larger than that obtained at the same wavelength in the bulk solution. The difference can be satisfactorily explained by taking into account the orientation factor in the single-chlorosome experiments. The observed distribution of the dissymmetry parameter reflects the well-ordered nature of the mutant chlorosomes. © 2012 American Chemical Society.

  7. Complexities of particulate matter measurement in parenteral formulations of small-molecule amphiphilic drugs.

    Science.gov (United States)

    Hickey, Magali B; Waggener, Sara; Gole, Dilip; Jimidar, Ilias; Vermeersch, Hans; Ratanabanangkoon, Poe; Tinke, Arjen P; Almarsson, Örn

    2011-03-01

    Reconstituted parenteral solutions of three surface-active anti-infective small-molecule drugs and solutions of sodium dodecyl sulfate (SDS, a model surfactant) were studied to quantify the impact of sample preparation and handling on particle counts. Turbidimetry and light obscuration profiles were recorded as a function of agitation and shearing with and without the introduction of foam into the solutions. SDS solutions at concentrations above the critical micelle concentration (CMC) show significantly greater sensitivity to shear and foam presence than SDS solutions below the CMC: counts of >10 μm particles increased 8-fold over control (an unsheared sample) in the micellar solution vs. a 4-fold particle count increase over control at a sub-micellar concentration. An even more significant increase in the ratio of particle counts in sheared/unsheared solution is seen for >25 μm unit counts, due to the increased interference of foam with the measurement. Two commercial products, injection formulations of teicoplanin and cefotaxime sodium, as well as an investigational compound 1, showed an increase in scattering as a function of foam production. The impact of foaming was significant, resulting in an increase of turbidity and light obscuration measurements in all solutions. The results illustrate some of the challenges that are inherent to optically clear, homogeneous pharmaceutical injections containing compounds which have a tendency toward self-association and surfactant-like behavior. PMID:21234824

  8. Quantification of soil pore network complexity with X-ray computed tomography and gas transport measurements

    DEFF Research Database (Denmark)

    Katuwal, Sheela; Arthur, Emmanuel; Tuller, M.;

    2015-01-01

    Flow and transport of gases through soils are largely controlled by pore structural attributes. The quantification of pore network characteristics is therefore essential for accurate prediction of air permeability and gas diffusivity. In this study, the pore network characteristics of seven...... different soils subjected to 22 mo of field regeneration were quantified with X-ray computed tomography (CT) and compared with functional pore characteristics estimated from measurements of air permeability and gas diffusivity. Furthermore, predictive models for air permeability and gas diffusivity were...... equivalent pore diameter in predictive gas diffusivity and air permeability models significantly improved their performance. The obtained results suggest that the application of X-ray CT-derived pore-structural parameters has great potential for predicting gas diffusivity and air permeability....

  9. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    Directory of Open Access Journals (Sweden)

    O. Couach

    2003-01-01

    This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations and model calculations are in good agreement with atmospheric measurements obtained with an instrumented aircraft (METAIR). Ozone fluxes were calculated using lidar measurements of vertical ozone concentration profiles and the horizontal wind speeds measured with a radar Doppler wind profiler (DEGREANE). The ozone flux patterns indicate that the diurnal cycle of ozone production is controlled by local thermal winds. The convective PBL maximum height was some 2700 m above the land surface, while the nighttime residual ozone layer was generally found between 1200 and 2200 m. Finally, we evaluate the magnitudes of the ozone processes at different altitudes in order to estimate the photochemical ozone production due to the primary pollutant emissions of Grenoble city and the regional network of automobile traffic.

  10. Multi-frequency color-marked fringe projection profilometry for fast 3D shape measurement of complex objects.

    Science.gov (United States)

    Jiang, Chao; Jia, Shuhai; Dong, Jun; Bao, Qingchen; Yang, Jia; Lian, Qin; Li, Dichen

    2015-09-21

    We propose a novel multi-frequency color-marked fringe projection profilometry approach to measure the 3D shape of objects with depth discontinuities. A digital micromirror device projector is used to project a color map consisting of a series of different-frequency color-marked fringe patterns onto the target object. We use a chromaticity curve to calculate the color change caused by the height of the object. The related algorithm to measure the height is also described in this paper. To improve the measurement accuracy, a chromaticity curve correction method is presented. This correction method greatly reduces the influence of color fluctuations and measurement error on the chromaticity curve and the calculation of the object height. The simulation and experimental results validate the utility of our method. Our method avoids the conventional phase shifting and unwrapping process, as well as the independent calculation of the object height required by existing techniques. Thus, it can be used to measure complex and dynamic objects with depth discontinuities. These advantages are particularly promising for industrial applications. PMID:26406621

  11. Direct measurement of the Mn(II) hydration state in metal complexes and metalloproteins through 17O NMR line widths.

    Science.gov (United States)

    Gale, Eric M; Zhu, Jiang; Caravan, Peter

    2013-12-11

    Here we describe a simple method to estimate the inner-sphere hydration state of the Mn(II) ion in coordination complexes and metalloproteins. The line width of bulk H2(17)O is measured in the presence and absence of Mn(II) as a function of temperature, and transverse (17)O relaxivities are calculated. It is demonstrated that the maximum (17)O relaxivity is directly proportional to the number of inner-sphere water ligands (q). Using a combination of literature data and experimental data for 12 Mn(II) complexes, we show that this method provides accurate estimates of q with an uncertainty of ±0.2 water molecules. The method can be implemented on commercial NMR spectrometers working at fields of 7 T and higher. The hydration number can be obtained for micromolar Mn(II) concentrations. We show that the technique can be extended to metalloproteins or complex:protein interactions. For example, Mn(II) binds to the multimetal binding site A on human serum albumin with two inner-sphere water ligands that undergo rapid exchange (1.06 × 10(8) s(-1) at 37 °C). The possibility of extending this technique to other metal ions such as Gd(III) is discussed.
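The calculation the abstract describes can be sketched numerically. The block below is a minimal illustration, not the authors' code: it assumes FWHM line widths in Hz, a Mn(II) concentration in mM, and a hypothetical per-water calibration slope obtained from standards of known hydration number.

```python
import numpy as np

def o17_transverse_relaxivity(linewidth_mn_hz, linewidth_blank_hz, conc_mM):
    """Transverse 17O relaxivity (s^-1 mM^-1) from the excess line
    broadening of bulk H2(17)O: r2 = pi * (dv_obs - dv_blank) / [Mn]."""
    dv = np.asarray(linewidth_mn_hz, dtype=float) - linewidth_blank_hz
    return np.pi * dv / conc_mM

def hydration_number(r2_vs_temperature, r2_max_per_water):
    """Estimate q from the maximum relaxivity over the temperature series,
    assuming direct proportionality to a per-water-ligand calibration value
    (hypothetical, derived from complexes of known q)."""
    return float(np.max(r2_vs_temperature)) / r2_max_per_water
```

With line widths measured at several temperatures, the maximum of the relaxivity profile divided by the calibration slope gives q to roughly the ±0.2 water molecules quoted in the abstract.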

  12. Analysis of full scale measurements for the investigation of the turbulence structure acting on a rotor disk over complex terrain

    Energy Technology Data Exchange (ETDEWEB)

    Glinou, G.L.; Morfiadakis, E.E.; Koulouvari, M.J. [Centre for Renewable Energy Sources, Wind Energy Dept., Pikermi (Greece)

    1996-12-31

    In the framework of the MOUNTURB project, contract no. JOU2-CT93-0378, co-funded by the European Union, full scale measurements have been carried out at CRES's test site, situated in complex terrain, for the investigation of the wind and turbulence structure acting on the rotor disk of a wind turbine. The analysis of the deterministic characteristics shows evidence of terrain-induced effects on both the longitudinal and the vertical velocity component. The analysis of the stochastic characteristics of the wind field suggests anisotropic turbulence that decreases with height above ground level. The measured coherence exhibited the typical exponential decay with increasing turbulence frequency, and the decay rate increases with wind speed. The horizontal coherence is slightly higher than the vertical coherence. (Author)

  13. A complexity measure based method for studying the dependence of 222Rn concentration time series on indoor air temperature and humidity

    CERN Document Server

    Mihailovic, Dragutin T; Krmar, Miodrag; Arsenić, Ilija

    2013-01-01

    We suggest a complexity-measure-based method for studying the dependence of measured 222Rn concentration time series on indoor air temperature and humidity. This method is based on the Kolmogorov complexity (KL). We introduce (i) the sequence of the KL, (ii) the highest value of the KL in the sequence (KLM) and (iii) the KL of the product of time series. The observed loss of the KLM complexity of the 222Rn concentration time series can be attributed to the indoor air humidity that keeps the radon daughters in the air.
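Kolmogorov complexity of a measured time series is commonly approximated by the Lempel-Ziv (LZ76) phrase count of the series binarized around its median. The sketch below illustrates that standard estimator; it is not the authors' implementation, and the normalization choice is one common convention.

```python
import numpy as np

def lempel_ziv_complexity(binary_seq):
    """Count the phrases in the Lempel-Ziv (1976) parsing of a 0/1 sequence."""
    s = "".join(str(int(b)) for b in binary_seq)
    i, n, count = 0, len(s), 0
    while i < n:
        l = 1
        # extend the current phrase while it already occurs earlier in the string
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        count += 1
        i += l
    return count

def normalized_kl(series):
    """Binarize around the median, then normalize the phrase count by n/log2(n),
    so a random sequence scores near 1 and a regular one near 0."""
    x = np.asarray(series, dtype=float)
    b = (x > np.median(x)).astype(int)
    n = len(b)
    return lempel_ziv_complexity(b) * np.log2(n) / n
```

Applied to concurrent 222Rn, temperature, and humidity series, drops in the normalized value flag intervals where the radon dynamics become more regular.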

  14. Low Charge and Reduced Mobility of Membrane Protein Complexes Has Implications for Calibration of Collision Cross Section Measurements.

    Science.gov (United States)

    Allison, Timothy M; Landreh, Michael; Benesch, Justin L P; Robinson, Carol V

    2016-06-01

    Ion mobility mass spectrometry of integral membrane proteins provides valuable insights into their architecture and stability. Here we show that, due to their lower charge, the average mobility of native-like membrane protein ions is approximately 30% lower than that of soluble proteins of similar mass. This has implications for drift time measurements, made on traveling wave ion mobility mass spectrometers, which have to be calibrated to extract collision cross sections (Ω). Common calibration strategies employ unfolded or native-like soluble protein standards with masses and mobilities comparable to the protein of interest. We compare Ω values for membrane proteins, derived from standard calibration protocols using soluble proteins, to values measured using an RF-confined drift tube. Our results demonstrate that, while common calibration methods underestimate Ω for native-like or unfolded membrane protein complexes, higher-mass soluble calibration standards consistently yield more accurate Ω values. These findings enable us to directly obtain structural information for highly charge-reduced complexes by traveling wave ion mobility mass spectrometry. PMID:27153188

  15. Unraveling the complexity of protein backbone dynamics with combined (13)C and (15)N solid-state NMR relaxation measurements.

    Science.gov (United States)

    Lamley, Jonathan M; Lougher, Matthew J; Sass, Hans Juergen; Rogowski, Marco; Grzesiek, Stephan; Lewandowski, Józef R

    2015-09-14

    Typically, protein dynamics involve a complex hierarchy of motions occurring on different time scales between conformations separated by a range of different energy barriers. NMR relaxation can in principle provide a site-specific picture of both the time scales and amplitudes of these motions, but independent relaxation rates sensitive to fluctuations in different time scale ranges are required to obtain a faithful representation of the underlying dynamic complexity. This is especially pertinent for relaxation measurements in the solid state, which report on dynamics in a window of time scales broader by more than 3 orders of magnitude compared to solution NMR relaxation. To aid in unraveling the intricacies of biomolecular dynamics we introduce (13)C spin-lattice relaxation in the rotating frame (R1ρ) as a probe of backbone nanosecond-microsecond motions in proteins in the solid state. We present measurements of (13)C' R1ρ rates in fully protonated crystalline protein GB1 at 600 and 850 MHz (1)H Larmor frequencies and compare them to (13)C' R1, (15)N R1 and R1ρ measured under the same conditions. The addition of carbon relaxation data to the model-free analysis of nitrogen relaxation data leads to greatly improved characterization of the time scales of protein backbone motions, minimizing the occurrence of fitting artifacts that may be present when (15)N data is used alone. We also discuss how internal motions characterized by different time scales contribute to (15)N and (13)C relaxation rates in the solid state and solution state, leading to fundamental differences between them, as well as phenomena such as underestimation of picosecond-range motions in the solid state and nanosecond-range motions in solution.

  16. Robust estimation of fractal measures for characterizing the structural complexity of the human brain: optimization and reproducibility.

    Science.gov (United States)

    Goñi, Joaquín; Sporns, Olaf; Cheng, Hu; Aznárez-Sanado, Maite; Wang, Yang; Josa, Santiago; Arrondo, Gonzalo; Mathews, Vincent P; Hummer, Tom A; Kronenberger, William G; Avena-Koenigsberger, Andrea; Saykin, Andrew J; Pastor, María A

    2013-12-01

    High-resolution isotropic three-dimensional reconstructions of human brain gray and white matter structures can be characterized to quantify aspects of their shape, volume and topological complexity. In particular, methods based on fractal analysis have been applied in neuroimaging studies to quantify the structural complexity of the brain in both healthy and impaired conditions. The usefulness of such measures for characterizing individual differences in brain structure critically depends on their within-subject reproducibility in order to allow the robust detection of between-subject differences. This study analyzes key analytic parameters of three fractal-based methods that rely on the box-counting algorithm with the aim to maximize within-subject reproducibility of the fractal characterizations of different brain objects, including the pial surface, the cortical ribbon volume, the white matter volume and the gray matter/white matter boundary. Two separate datasets originating from different imaging centers were analyzed, comprising 50 subjects with three and 24 subjects with four successive scanning sessions per subject, respectively. The reproducibility of fractal measures was statistically assessed by computing their intra-class correlations. Results reveal differences between different fractal estimators and allow the identification of several parameters that are critical for high reproducibility. Highest reproducibility with intra-class correlations in the range of 0.9-0.95 is achieved with the correlation dimension. Further analyses of the fractal dimensions of parcellated cortical and subcortical gray matter regions suggest robustly estimated and region-specific patterns of individual variability. These results are valuable for defining appropriate parameter configurations when studying changes in fractal descriptors of human brain structure, for instance in studies of neurological diseases that do not allow repeated measurements or for disease
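The box-counting algorithm at the core of such fractal estimators can be sketched compactly. The function below is an illustrative 2-D version (the study works with 3-D brain reconstructions and compares several estimators, including the correlation dimension); the function name and box sizes are choices for this sketch, not the paper's parameters.

```python
import numpy as np

def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a 2-D binary mask.

    For each box size s, count the boxes containing at least one foreground
    pixel; the dimension is the slope of log(count) versus log(1/s).
    """
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in sizes:
        # trim so the array tiles evenly, then pool s x s blocks with any()
        h = (mask.shape[0] // s) * s
        w = (mask.shape[1] // s) * s
        pooled = mask[:h, :w].reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(pooled.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A filled region yields a slope near 2 and a thin curve a slope near 1; the reproducibility question the study addresses is how stable such slopes are across repeated scans of the same subject.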

  17. Increasing the sensitivity of NMR diffusion measurements by paramagnetic longitudinal relaxation enhancement, with application to ribosome–nascent chain complexes

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Sammy H. S.; Waudby, Christopher A.; Cassaignau, Anaïs M. E.; Cabrita, Lisa D.; Christodoulou, John, E-mail: j.christodoulou@ucl.ac.uk [University College London and Birkbeck College, Institute of Structural and Molecular Biology (United Kingdom)

    2015-10-15

    The translational diffusion of macromolecules can be examined non-invasively by stimulated echo (STE) NMR experiments to accurately determine their molecular sizes. These measurements can be important probes of intermolecular interactions and protein folding and unfolding, and are crucial in monitoring the integrity of large macromolecular assemblies such as ribosome–nascent chain complexes (RNCs). However, NMR studies of these complexes can be severely constrained by their slow tumbling, low solubility (with maximum concentrations of up to 10 μM), and short lifetimes resulting in weak signal, and therefore continuing improvements in experimental sensitivity are essential. Here we explore the use of the paramagnetic longitudinal relaxation enhancement (PLRE) agent NiDO2A on the sensitivity of (15)N XSTE and SORDID heteronuclear STE experiments, which can be used to monitor the integrity of these unstable complexes. We exploit the dependence of the PLRE effect on the gyromagnetic ratio and electronic relaxation time to accelerate recovery of (1)H magnetization without adversely affecting storage on Nz during diffusion delays or introducing significant transverse relaxation line broadening. By applying the longitudinal relaxation-optimized SORDID pulse sequence together with NiDO2A to 70S Escherichia coli ribosomes and RNCs, NMR diffusion sensitivity enhancements of up to 4.5-fold relative to XSTE are achieved, alongside ∼1.9-fold improvements in two-dimensional NMR sensitivity, without compromising the sample integrity. We anticipate these results will significantly advance the use of NMR to probe dynamic regions of ribosomes and other large, unstable macromolecular assemblies.

  18. Rotational study of the CH{sub 4}–CO complex: Millimeter-wave measurements and ab initio calculations

    Energy Technology Data Exchange (ETDEWEB)

    Surin, L. A., E-mail: surin@ph1.uni-koeln.de [I. Physikalisches Institut, University of Cologne, Zülpicher St. 77, 50937 Cologne (Germany); Institute of Spectroscopy, Russian Academy of Sciences, Fizicheskaya St. 5, 142190 Troitsk, Moscow (Russian Federation); Tarabukin, I. V.; Panfilov, V. A. [Institute of Spectroscopy, Russian Academy of Sciences, Fizicheskaya St. 5, 142190 Troitsk, Moscow (Russian Federation); Schlemmer, S. [I. Physikalisches Institut, University of Cologne, Zülpicher St. 77, 50937 Cologne (Germany); Kalugina, Y. N. [Department of Optics and Spectroscopy, Tomsk State University, 36 Lenin Ave., 634050 Tomsk (Russian Federation); Faure, A.; Rist, C. [University Grenoble Alpes, IPAG, F-38000 Grenoble (France); CNRS, IPAG, F-38000 Grenoble (France); Avoird, A. van der, E-mail: A.vanderAvoird@theochem.ru.nl [Theoretical Chemistry, Institute for Molecules and Materials, Radboud University, Heyendaalseweg 135, 6525 AJ Nijmegen (Netherlands)

    2015-10-21

    The rotational spectrum of the van der Waals complex CH4–CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 110–145 GHz. Newly observed and assigned transitions belong to the K = 2–1 subband correlating with the rotationless jCH4 = 0 ground state and the K = 2–1 and K = 0–1 subbands correlating with the jCH4 = 2 excited state of free methane. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the CH4–CO complex. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of CH4–CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations [CCSD(T)-F12a] and an augmented correlation-consistent triple zeta (aVTZ) basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the CH4 face closest to the CO subunit and binding energy De = 177.82 cm−1. The bound rovibrational levels of the CH4–CO complex were calculated for total angular momentum J = 0–6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 91.32, 94.46, and 104.21 cm−1 for the A (jCH4 = 0), F (jCH4 = 1), and E (jCH4 = 2) nuclear spin modifications of CH4–CO, respectively.

  19. Implementation of a complex of measures to fulfill the planetary protection requirements of the ExoMars-2016 mission

    Science.gov (United States)

    Khamidullina, Natalia; Novikova, Nataliya; Deshevaya, Elena; Orlov, Oleg; Guridov, Alexander; Zakharenko, Dmitry; Zaytseva, Olga

    2016-07-01

    The major purpose of the planetary protection program in the ExoMars-2016 mission is to forestall contamination of Mars by terrestrial microorganisms. Since the Martian descent module is not intended for biological experiments, the ExoMars-2016 mission falls under COSPAR category IVa. Within the joint project co-sponsored by ESA and Roscosmos, the European side holds full responsibility for ensuring the prescribed level of SC microbiological purity, while the Russian side is charged with compliance of the launch services provided at the Baikonur technical complex with the planetary protection requirements, specifically the prevention of SC recontamination. To this end, a complex of measures was executed to keep microbial contamination of cosmodrome facilities at the prescribed level, which included: - regular decontamination of clean rooms using an effective disinfectant and pulsed ultraviolet radiation, creating favorable conditions for reliable functioning of the ESA clean tent; - replacement of airline filters in the Thermal Conditioning Unit (TCU) air duct for conditioning the SC with pure air. The results of microbiological tests performed in the period 2015–2016 lead to the conclusion that the Baikonur clean rooms (ISO class 8), TCU air ducts and Air Thermal Control System (ATCS) at the launch site are ready for the launch campaign, and that the Russian side fulfilled the planetary protection requirements of the ExoMars-2016 mission.

  20. An Image Pattern Tracking Algorithm for Time-resolved Measurement of Mini- and Micro-scale Motion of Complex Object

    Directory of Open Access Journals (Sweden)

    John M. Seiner

    2009-03-01

    An image pattern tracking algorithm is described in this paper for time-resolved measurement of mini- and micro-scale movements of complex objects. The algorithm works with a high-speed digital imaging system, which records thousands of successive image frames in a short time period. The image pattern of the observed object is tracked among successively recorded image frames with a correlation-based algorithm, so that the time histories of the position and displacement of the investigated object in the camera focal plane are determined with high accuracy. The speed, acceleration, and harmonic content of the investigated motion are obtained by post-processing the position and displacement time histories. The described image pattern tracking algorithm is tested with synthetic image patterns and verified with tests on live insects.
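The correlation-based tracking step can be illustrated with a minimal brute-force normalized cross-correlation search. This is a sketch, not the authors' implementation; the function name `track_template` and the synthetic frames are hypothetical.

```python
import numpy as np

def track_template(frame, template):
    """Locate `template` in `frame` by normalized cross-correlation.

    Returns (row, col) of the best match. Brute-force scan for clarity;
    a time-resolved system would use FFT-based correlation for speed.
    """
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            w = frame[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            if denom == 0:
                continue
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic check: a distinctive pattern displaced by (3, 5) pixels
# between two frames should be recovered at its new position.
rng = np.random.default_rng(0)
frame0 = rng.random((40, 40))
frame0[10:15, 10:15] += 5.0
template = frame0[10:15, 10:15].copy()
frame1 = rng.random((40, 40))
frame1[13:18, 15:20] = template
print(track_template(frame1, template))  # (13, 15)
```

Repeating this search over thousands of frames yields the position time history from which speed and acceleration are derived.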

  1. Complex surface three-dimensional shape measurement method based on defocused Gray code plus phase-shifting

    Science.gov (United States)

    Zeng, Zhuohuan; Fu, Yanjun; Li, Biao; Chai, Minggang

    2016-08-01

    The binary pattern defocused projection method can overcome the nonlinear gamma of the projector, as well as filter out high harmonics and high-frequency noise. However, high-accuracy three-dimensional (3-D) shape measurement of complex surfaces with it still remains a challenge. Therefore, a novel Gray code plus phase-shifting method based on defocusing is proposed to solve the problem. The edges of the Gray code patterns become blurred owing to defocus, which makes the recovery of accurate Gray code patterns difficult. To solve this problem, positive and inverse Gray code patterns are projected to obtain threshold values, which are used to binarize the Gray code patterns. This method is robust and suitable for different defocus levels. Compared with the traditional Gray code plus phase-shifting method, the experimental results prove the effectiveness and feasibility of the proposed method.
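The positive/inverse binarization idea lends itself to a short sketch. Assuming `pos_img` and `inv_img` are camera captures of a Gray-code pattern and its inverse, a pixel-wise comparison implements the per-pixel threshold; the decoding helper is likewise illustrative, not taken from the paper.

```python
import numpy as np

def binarize_gray_code(pos_img, inv_img):
    """Binarize a defocused Gray-code capture using its inverse capture.

    A pixel is '1' where the positive capture is brighter than the
    inverse one. Comparing the two captures is equivalent to applying
    the per-pixel threshold (pos + inv) / 2, so no global threshold is
    needed and the result tolerates different defocus levels.
    """
    return (pos_img > inv_img).astype(np.uint8)

def gray_to_binary(bit_planes):
    """Decode a list of Gray-code bit planes (MSB first) to integer codes.

    Standard Gray-to-binary recurrence: b_i = b_{i-1} XOR g_i.
    """
    b = bit_planes[0].astype(np.int64)
    codes = b.copy()
    for g in bit_planes[1:]:
        b = b ^ g
        codes = (codes << 1) | b
    return codes

# Tiny example: Gray code 1-1-1 decodes to binary 101 = 5.
planes = [np.array([1]), np.array([1]), np.array([1])]
print(gray_to_binary(planes))  # [5]
```

The decoded codes identify the projector stripe for each camera pixel, which the phase-shifting step then refines to sub-stripe accuracy.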

  2. Assessing dry density and gravimetric water content of soils in geotechnics with complex conductivity measurements : preliminary investigations

    Science.gov (United States)

    Kaouane, C.; Beck, Y.; Fauchard, C.; Chouteau, M.

    2012-12-01

    …/ρw = a·Sr^n + b. The fitted a = 0.223 agreed with φ^(−n) = F, F being the formation factor, which leads to a mean tortuosity α = 1.47; b = 0.5 might be related to surface conductivity. An empirical Rhoades–Corwin model also fit the data well. The Revil & Florsch model allows us to predict a phase peak in the case of complex conductivity measurements; we predicted a frequency peak at 2.4 Hz, well located within the frequency range of SIP (from 1 mHz to ~10 Hz). At the frequency peak, this model allows the direct evaluation of saturation and porosity. Hence, complex conductivity measurements might be a fine alternative to nuclear probes. Still, driving electrodes into compacted soils remains difficult. Ongoing studies are looking to extend this model to a higher frequency range (5–200 kHz), where capacitively coupled resistivity arrays might be used, allowing continuous measurements.
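The quoted Archie-type fit can be sketched numerically. The left-hand side of the ratio is truncated in this record, so the code simply calls it `sigma_ratio`; the saturation exponent `n` is an assumed value (the abstract does not state its fitted one), while a = 0.223 and b = 0.5 are taken from the text.

```python
# Archie-type model quoted in the abstract: ratio = a * Sr**n + b,
# with fitted a = 0.223 and b = 0.5. The exponent n = 2 is an
# assumption here (a common default for the saturation exponent).
a, b, n = 0.223, 0.5, 2.0

def sigma_ratio(sr):
    """Forward model: conductivity ratio for a given saturation Sr."""
    return a * sr ** n + b

def saturation(ratio):
    """Invert the model to recover Sr from a measured ratio."""
    return ((ratio - b) / a) ** (1.0 / n)

# Round trip: a measured ratio generated at Sr = 0.8 inverts back to 0.8.
print(round(saturation(sigma_ratio(0.8)), 6))  # 0.8
```

In practice the inversion would be applied to complex conductivity data at the phase-peak frequency, where the model separates saturation and porosity.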

  3. Measurements of the Intensity and Polarization of the Anomalous Microwave Emission in the Perseus molecular complex with QUIJOTE

    CERN Document Server

    Génova-Santos, R; Rebolo, R; Peláez-Santos, A; López-Caraballo, C H; Harper, S; Watson, R A; Ashdown, M; Barreiro, R B; Casaponsa, B; Dickinson, C; Diego, J M; Fernández-Cobos, R; Grainge, K J B; Herranz, D; Hoyland, R; Lasenby, A; López-Caniego, M; Martínez-González, E; McCulloch, M; Melhuish, S; Piccirillo, L; Perrott, Y C; Poidevin, F; Razavi-Ghods, N; Scott, P F; Titterington, D; Tramonte, D; Vielva, P; Vignaga, R

    2015-01-01

    Anomalous microwave emission (AME) has been observed in numerous sky regions, in the frequency range ~10-60 GHz. One of the most scrutinized regions is G159.6-18.5, located within the Perseus molecular complex. In this paper we present further observations of this region (194 hours in total over ~250 deg^2), both in intensity and in polarization. They span four frequency channels between 10 and 20 GHz, and were gathered with QUIJOTE, a new CMB experiment with the goal of measuring the polarization of the CMB and Galactic foregrounds. When combined with other publicly available intensity data, we achieve the most precise spectrum of the AME measured to date, with 13 independent data points being dominated by this emission. The four QUIJOTE data points provide the first independent confirmation of the downturn of the AME spectrum at low frequencies, initially unveiled by the COSMOSOMAS experiment in this region. We accomplish an accurate fit of these data using models based on electric dipole emission from spinning dust grains.

  4. Building a measurement framework of burden of treatment in complex patients with chronic conditions: a qualitative study

    Directory of Open Access Journals (Sweden)

    Eton DT

    2012-08-01

    David T Eton,1 Djenane Ramalho de Oliveira,2,3 Jason S Egginton,1 Jennifer L Ridgeway,1 Laura Odell,4 Carl R May,5 Victor M Montori1,6; 1Division of Health Care Policy and Research, Department of Health Sciences Research, Mayo Clinic, Rochester, MN, USA; 2College of Pharmacy, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil; 3Medication Therapy Management Program, Fairview Pharmacy Services LLC, Minneapolis, MN, USA; 4Pharmacy Services, Mayo Clinic, Rochester, MN, USA; 5Faculty of Health Sciences, University of Southampton, Southampton, UK; 6Knowledge and Evaluation Research Unit, Mayo Clinic, Rochester, MN, USA. Background: Burden of treatment refers to the workload of health care as well as its impact on patient functioning and well-being. We set out to build a conceptual framework of issues descriptive of burden of treatment from the perspective of the complex patient, as a first step in the development of a new patient-reported measure. Methods: We conducted semistructured interviews with patients seeking medication therapy management services at a large academic medical center. All patients had a complex regimen of self-care (including polypharmacy) and were coping with one or more chronic health conditions. We used framework analysis to identify and code themes and subthemes. A conceptual framework of burden of treatment was outlined from emergent themes and subthemes. Results: Thirty-two patients (20 female, 12 male, age 26–85 years) were interviewed. Three broad themes of burden of treatment emerged, including: the work patients must do to care for their health; problem-focused strategies and tools to facilitate the work of self-care; and factors that exacerbate the burden felt. The latter theme encompasses six subthemes: challenges with taking medication, emotional problems with others, role and activity limitations, financial challenges, confusion about medical information, and health care delivery obstacles.

  5. PREPARATION OF XYLOSE AND KRAFT PULP FROM POPLAR BASED ON FORMIC/ACETIC ACID/WATER SYSTEM HYDROLYSIS

    Directory of Open Access Journals (Sweden)

    Junping Zhuang

    2009-08-01

    A formic/acetic acid/water system was used in the ratios of 30:60:10, 20:60:20, and 30:50:20 for efficient hydrolysis and bioconversion of poplar chips, at a solid/liquid ratio of 1:12 (g/ml) and 105 °C for 30, 45, 60, 75, and 90 min, respectively. The highest yield of 69.89% was obtained at a formic/acetic acid/water ratio of 30:50:20 (v/v/v), with a solid/liquid ratio of 1:12 (g/ml), at 105 °C for 90 min. A lower kappa number and similar yield were achieved when hydrolytic residual woodchips were used for kraft pulping with over 2% Na2O and a temperature 5 °C lower compared to untreated chips. Pulps from prehydrolysis-treated chips were easy to beat, but the tensile index, tear index, and burst index of the handsheets obtained from the lowest-kappa pulp from prehydrolysis-treated poplar chips were lower than those of the pulp from the untreated chips. Considerable xylose could be obtained from the prehydrolysis stage following kraft pulping under the same conditions for prehydrolysis-treated and untreated chips. However, by building on the mature kraft pulping and xylitol processes, large amounts of xylose from the hemicellulose were obtained in prehydrolysis, allowing production of high-value products via biorefinery pathways. An economical balance of chemical dosage, energy consumption, pulp properties, and xylose value for prehydrolysis with organic acid should be reached with further investigation.

  6. Precipitation of arsenic sulphide from acidic water in a fixed-film bioreactor.

    Science.gov (United States)

    Battaglia-Brunet, Fabienne; Crouzet, Catherine; Burnol, André; Coulon, Stéphanie; Morin, Dominique; Joulian, Catherine

    2012-08-01

    Arsenic (As) is a toxic element frequently present in acid mine waters and effluents. Precipitation of trivalent arsenic sulphide in sulphate-reducing conditions at low pH has been studied with the aim of removing this hazardous element in a waste product with high As content. To achieve this, a 400 mL fixed-film column bioreactor was fed continuously with a synthetic solution containing 100 mg L⁻¹ As(V), glycerol and/or hydrogen, at pH values between 2.7 and 5. The highest global As removal rate obtained during these experiments was close to 2.5 mg L⁻¹ h⁻¹. A switch from glycerol to hydrogen when the biofilm was mature induced an abrupt increase in the sulphate-reducing activity, resulting in a dramatic mobilisation of arsenic due to the formation of soluble thioarsenic complexes. A new analytical method, based on ion chromatography, was used to evaluate the proportion of As present as thioarsenic complexes in the bioreactor. Profiles of pH, total As and sulphate concentrations suggest that As removal efficiency was linked to the solubility of orpiment (As₂S₃), depending on pH conditions. Molecular fingerprints revealed fairly homogeneous bacterial colonisation throughout the reactor. The bacterial community was diverse and included fermenting bacteria and Desulfosporosinus-like sulphate-reducing bacteria. arrA genes, involved in dissimilatory reduction of As(V), were found, and the retrieved sequences suggested that As(V) was reduced by a Desulfosporosinus-like organism. This study was the first to show that As can be removed by bioprecipitation of orpiment from acidic solutions containing up to 100 mg L⁻¹ As(V) in a bioreactor.

  7. Rotational study of the NH₃–CO complex: Millimeter-wave measurements and ab initio calculations

    Energy Technology Data Exchange (ETDEWEB)

    Surin, L. A., E-mail: surin@ph1.uni-koeln.de [I. Physikalisches Institut, University of Cologne, Zülpicher Str. 77, 50937 Cologne (Germany); Institute of Spectroscopy, Russian Academy of Sciences, Fizicheskaya Str. 5, 142190 Troitsk, Moscow (Russian Federation); Potapov, A.; Schlemmer, S. [I. Physikalisches Institut, University of Cologne, Zülpicher Str. 77, 50937 Cologne (Germany); Dolgov, A. A.; Tarabukin, I. V.; Panfilov, V. A. [Institute of Spectroscopy, Russian Academy of Sciences, Fizicheskaya Str. 5, 142190 Troitsk, Moscow (Russian Federation); Kalugina, Y. N. [Department of Optics and Spectroscopy, Tomsk State University, 36 Lenin av., 634050 Tomsk (Russian Federation); Faure, A. [Université de Grenoble Alpes, IPAG, F-38000 Grenoble (France); CNRS, IPAG, F-38000 Grenoble (France); Avoird, A. van der, E-mail: A.vanderAvoird@theochem.ru.nl [Theoretical Chemistry, Institute for Molecules and Materials, Radboud University Nijmegen, Heyendaalseweg 135, 6525 AJ Nijmegen (Netherlands)

    2015-03-21

    The rotational spectrum of the van der Waals complex NH₃–CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 112–139 GHz. Newly observed and assigned transitions belong to the K = 0–0, K = 1–1, K = 1–0, and K = 2–1 subbands correlating with the rotationless (j_k)NH₃ = 0₀ ground state of free ortho-NH₃ and the K = 0–1 and K = 2–1 subbands correlating with the (j_k)NH₃ = 1₁ ground state of free para-NH₃. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. Some of these transitions are continuations to higher J values of transition series observed previously [C. Xia et al., Mol. Phys. 99, 643 (2001)]; the other transitions constitute newly detected subbands. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the ortho-NH₃–CO and para-NH₃–CO complexes. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of NH₃–CO have been carried out at the explicitly correlated coupled-cluster level of theory with single, double, and perturbative triple excitations and an augmented correlation-consistent triple-zeta basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the N atom closest to the CO subunit and a binding energy Dₑ = 359.21 cm⁻¹. The bound rovibrational levels of the NH₃–CO complex were calculated for total angular momentum J = 0–6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D₀ are 210.43 and 218.66 cm⁻¹ for ortho-NH₃–CO and para-NH₃–CO, respectively.

  8. Reprint of The improvement of the energy resolution in epi-thermal neutron region of Bonner sphere using boric acid water solution moderator.

    Science.gov (United States)

    Ueda, H; Tanaka, H; Sakurai, Y

    2015-12-01

    The Bonner sphere is useful for evaluating the neutron spectrum in detail. We are improving the energy resolution of the Bonner sphere in the epi-thermal neutron region by using a boric acid water solution as the moderator. Its response-function peak is narrower than that for a polyethylene moderator, so an improvement in resolution is expected. The resolutions obtained with the polyethylene moderator and the boric acid water solution moderator were compared by simulation. The influence of uncertainty in the Bonner sphere configuration on the spectrum estimation was also simulated.

  9. The improvement of the energy resolution in epi-thermal neutron region of Bonner sphere using boric acid water solution moderator.

    Science.gov (United States)

    Ueda, H; Tanaka, H; Sakurai, Y

    2015-10-01

    The Bonner sphere is useful for evaluating the neutron spectrum in detail. We are improving the energy resolution of the Bonner sphere in the epi-thermal neutron region by using a boric acid water solution as the moderator. Its response-function peak is narrower than that for a polyethylene moderator, so an improvement in resolution is expected. The resolutions obtained with the polyethylene moderator and the boric acid water solution moderator were compared by simulation. The influence of uncertainty in the Bonner sphere configuration on the spectrum estimation was also simulated.

  10. Investigating Project Complexity Measurement from TO Perspectives

    Institute of Scientific and Technical Information of China (English)

    何清华; 罗岚; 陆云波; 任俊山

    2013-01-01

    Projects have been growing in quantity, size, and complexity, and managing project complexity has become an important part of project management. However, traditional methods often measure project complexity from macro perspectives and largely ignore the potential influence of microcosmic factors. Therefore, from the task and organization (TO) perspective, this paper explores a measurement model that can reflect the dynamic "emerging" effect of micro factors on project complexity. Based on an analysis of the traditional factors affecting project complexity, the paper discusses microcosmic factors of project complexity from the perspectives of objective tasks and subjective organization, and establishes a method to measure project complexity, expressed as implicit workload, based on the ProjectSim tool. Project complexity is equal to implicit workload divided by dominant (explicit) workload, where the implicit workload, ProjectSim(T, O), is the sum of the reworking, coordinating, and waiting workloads. According to the synchronous relationship between the implicit workload and project complexity, the paper combines the TO measurement method with the micro factors of project complexity based on the implicit workload, and verifies the method's hypotheses on a model of the Expo AB-area project, confirming that the implicit-workload-based measurement of project complexity is valid and effective. The study enriches and develops complex project management theory and offers important theoretical guidance for the management of large complex projects.
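The stated definition reduces to a one-line ratio. A minimal sketch, with illustrative names and person-hour figures (none of which come from the paper):

```python
def project_complexity(rework, coordination, waiting, dominant):
    """TO-based project complexity as defined in the abstract:
    implicit workload (rework + coordination + waiting) divided by
    the dominant (explicit) workload. All inputs share one unit,
    e.g. person-hours. Names and values are illustrative only.
    """
    implicit = rework + coordination + waiting
    return implicit / dominant

# Example: 240 person-hours of implicit work against 600 explicit.
print(project_complexity(120.0, 80.0, 40.0, 600.0))  # 0.4
```

In the paper the implicit workload is produced by ProjectSim simulation rather than supplied directly; this sketch only shows the final ratio.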

  11. The importance of chemical buffering for pelagic and benthic colonization in acidic waters

    International Nuclear Information System (INIS)

    In poorly buffered areas, acidification may occur for two reasons: through atmospheric deposition of acidifying substances and, in mining districts, through pyrite weathering. These different sources of acidity lead to clearly distinct geochemistry in lakes and rivers. In general, the geochemistry is the major determinant of the planktonic composition of the acidified water bodies, whereas the nutrient status mainly determines the level of biomass. A number of acidic mining lakes in Eastern Germany have to be neutralized to meet the water quality goals of the European Union Directives and to overcome the ecological degradation. This neutralization process is, limnologically, a short-term maturation of lakes, in which biological succession must overcome two different geochemical buffer systems. First, the iron buffer system characterizes the initial state, when colonization starts: there is low organismic diversity and productivity, and clear net heterotrophy in most cases. Organic carbon that serves as fuel for the food web derives mainly from allochthonous sources. In the second, less acidic state, aluminum is the buffer. This state is found exceptionally among the hard-water mining lakes, often as a result of deposition of acidifying substances onto soft-water systems. Colonization in aluminum-buffered lakes is more complex and is controlled by the sensitivity of the organisms towards both protons and inorganic reactive aluminum species. In soft-water systems, calcium may act as an antidote against acid and aluminum; however, this function is lost in hard-water post-mining lakes of similar proton concentrations. Nutrient limitations may occur, but these do not usually control the qualitative and quantitative plankton composition. In these lakes, total pelagic biomass is controlled by the bioavailability of nutrients, particularly phosphorus.

  12. The importance of chemical buffering for pelagic and benthic colonization in acidic waters

    Energy Technology Data Exchange (ETDEWEB)

    Nixdorf, B., E-mail: b.nixdorf@t-online.de; Lessmann, D. [Brandenburg University of Technology at Cottbus, Chair of Water Conservation, Faculty of Environmental Sciences (Germany); Steinberg, C. E. W. [Leibniz-Institute of Freshwater Ecology and Inland Fisheries (Germany)

    2003-01-15

    In poorly buffered areas, acidification may occur for two reasons: through atmospheric deposition of acidifying substances and, in mining districts, through pyrite weathering. These different sources of acidity lead to clearly distinct geochemistry in lakes and rivers. In general, the geochemistry is the major determinant of the planktonic composition of the acidified water bodies, whereas the nutrient status mainly determines the level of biomass. A number of acidic mining lakes in Eastern Germany have to be neutralized to meet the water quality goals of the European Union Directives and to overcome the ecological degradation. This neutralization process is, limnologically, a short-term maturation of lakes, in which biological succession must overcome two different geochemical buffer systems. First, the iron buffer system characterizes the initial state, when colonization starts: there is low organismic diversity and productivity, and clear net heterotrophy in most cases. Organic carbon that serves as fuel for the food web derives mainly from allochthonous sources. In the second, less acidic state, aluminum is the buffer. This state is found exceptionally among the hard-water mining lakes, often as a result of deposition of acidifying substances onto soft-water systems. Colonization in aluminum-buffered lakes is more complex and is controlled by the sensitivity of the organisms towards both protons and inorganic reactive aluminum species. In soft-water systems, calcium may act as an antidote against acid and aluminum; however, this function is lost in hard-water post-mining lakes of similar proton concentrations. Nutrient limitations may occur, but these do not usually control the qualitative and quantitative plankton composition. In these lakes, total pelagic biomass is controlled by the bioavailability of nutrients, particularly phosphorus.

  13. Charge carrier effective mass and concentration derived from combination of Seebeck coefficient and 125Te NMR measurements in complex tellurides

    Science.gov (United States)

    Levin, E. M.

    2016-06-01

    Thermoelectric materials utilize the Seebeck effect to convert heat to electrical energy. The Seebeck coefficient (thermopower), S, depends on the free (mobile) carrier concentration, n, and the effective mass, m*, as S ∼ m*/n^(2/3). The carrier concentration in tellurides can be derived from ¹²⁵Te nuclear magnetic resonance (NMR) spin-lattice relaxation measurements. The NMR spin-lattice relaxation rate, 1/T₁, depends on both n and m* as 1/T₁ ∼ (m*)^(3/2) n (within classical Maxwell-Boltzmann statistics) or as 1/T₁ ∼ (m*)² n^(2/3) (within quantum Fermi-Dirac statistics), which challenges the correct determination of the carrier concentration in some materials by NMR. Here it is shown that the combination of Seebeck coefficient and ¹²⁵Te NMR spin-lattice relaxation measurements in complex tellurides provides a unique opportunity to derive the carrier effective mass and then to calculate the carrier concentration. This approach was used to study Ag_x Sb_x Ge_(50−2x) Te_50, the well-known GeTe-based high-efficiency tellurium-antimony-germanium-silver thermoelectric materials, where the replacement of Ge by [Ag+Sb] results in a significant enhancement of the Seebeck coefficient. Values of both m* and n derived using this combination show that the enhancement of thermopower can be attributed primarily to an increase of the carrier effective mass and partially to a decrease of the carrier concentration as the [Ag+Sb] content increases.
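The combination argument can be made concrete: writing S = C_S·m*·n^(-2/3) and 1/T1 = C_R·(m*)²·n^(2/3) (the Fermi-Dirac forms quoted above), the product of the two eliminates n, giving m*, and n then follows from the Seebeck relation. A sketch with assumed unit prefactors C_S and C_R; the paper derives the actual physical constants.

```python
# Sketch: within Fermi-Dirac statistics the abstract quotes
#   S    = C_S * m * n**(-2.0/3.0)
#   1/T1 = C_R * m**2 * n**(2.0/3.0)
# so S*(1/T1) = C_S*C_R*m**3 eliminates n, giving m; n then follows
# from the Seebeck relation. C_S and C_R are assumed calibration
# prefactors here, set to 1 for the round-trip demonstration.

def solve_mass_and_concentration(S, R, C_S=1.0, C_R=1.0):
    """Return (m, n) from Seebeck coefficient S and NMR rate R = 1/T1."""
    m = (S * R / (C_S * C_R)) ** (1.0 / 3.0)
    n = (C_S * m / S) ** 1.5
    return m, n

# Round-trip check with synthetic values m = 2, n = 8:
S = 2.0 * 8.0 ** (-2.0 / 3.0)      # forward model, = 0.5
R = 2.0 ** 2 * 8.0 ** (2.0 / 3.0)  # forward model, = 16.0
m, n = solve_mass_and_concentration(S, R)
print(round(m, 6), round(n, 6))  # 2.0 8.0
```

With the real prefactors in place, the same two-equation elimination yields m* and n in physical units from measured S and 1/T₁.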

  14. Measurement issues associated with quantitative molecular biology analysis of complex food matrices for the detection of food fraud.

    Science.gov (United States)

    Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy

    2016-01-01

    Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays. PMID:26631264

  15. Measurement issues associated with quantitative molecular biology analysis of complex food matrices for the detection of food fraud.

    Science.gov (United States)

    Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy

    2016-01-01

    Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.

  16. Expanding the scope of CE reactor to ssDNA-binding protein-ssDNA complexes as exemplified for a tool for direct measurement of dissociation kinetics of biomolecular complexes.

    Science.gov (United States)

    Takahashi, Toru; Ohtsuka, Kei-Ichirou; Tomiya, Yoriyuki; Iki, Nobuhiko; Hoshino, Hitoshi

    2009-09-01

    The CE reactor (CER), which was developed as a tool for direct measurement of the dissociation kinetics of metal complexes, was successfully applied to complexes of Escherichia coli ssDNA-binding protein (SSB) with ssDNA. The basic concept of the CER is to use the CE separation process as a dissociation kinetic reactor for the complex and to observe the on-capillary dissociation reaction profile of the complex as the decrease of the peak height of the complex with increasing migration time. The peak height of [SSB-ssDNA] decreases as the migration time increases, since the loss of [SSB-ssDNA] through the on-capillary dissociation reaction is directly reflected in the decrease of its peak height. The dissociation degree-time profiles for the complexes are quantitatively described by analyzing a set of electropherograms with different migration times. The dissociation rate constants of [SSB-ssDNA] complexes formed with 20-mer, 25-mer, and 31-mer ssDNA were directly determined to be 3.99 × 10⁻⁴, 4.82 × 10⁻⁴, and 1.50 × 10⁻³ s⁻¹, respectively. The CER is a concise and effective tool for dissociation kinetic analysis of biomolecular complexes.
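Assuming first-order decay of the complex on-capillary, h(t) = h0·exp(-k_d·t), the rate constant follows from a log-linear fit of peak height versus migration time. A sketch of that fit, not the authors' exact analysis:

```python
import math

def dissociation_rate(times, heights):
    """Least-squares slope of ln(height) vs migration time.

    Returns k_d (s^-1) assuming first-order decay h = h0*exp(-k_d*t),
    so the fitted slope is -k_d.
    """
    ys = [math.log(h) for h in heights]
    npts = len(times)
    mx = sum(times) / npts
    my = sum(ys) / npts
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -slope

# Synthetic peak heights with k_d = 1.5e-3 s^-1
# (the order of the 31-mer value reported above):
k = 1.5e-3
t = [100.0, 200.0, 300.0, 400.0]
h = [10.0 * math.exp(-k * ti) for ti in t]
print(round(dissociation_rate(t, h), 6))  # 0.0015
```

In the CER the `times` would be the migration times of the complex peak across a set of electropherograms, and `heights` the corresponding peak heights.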

  17. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    International Nuclear Information System (INIS)

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. As the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks in a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity), which can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs, has been developed. In this report, the technical background as well as practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since the validation studies show that the TACOM measure properly quantifies the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators

  18. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-15

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. As the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks in a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity), which can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs, has been developed. In this report, the technical background as well as practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since the validation studies show that the TACOM measure properly quantifies the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators.

  19. Water Accounting Plus (WA+) – a water accounting procedure for complex river basins based on satellite measurements

    Directory of Open Access Journals (Sweden)

    P. Karimi

    2013-07-01

    Full Text Available Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use, and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets: (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external (e.g., climate change) and internal (e.g., infrastructure building) influences can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of the required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  20. Measuring and predicting reservoir heterogeneity in complex deposystems: The fluvial-deltaic Big Injun sandstone in West Virginia

    Energy Technology Data Exchange (ETDEWEB)

    Patchen, D.G.; Hohn, M.E.; Aminian, K.; Donaldson, A.; Shumaker, R.; Wilson, T.

    1993-04-01

    The purpose of this research is to develop techniques to measure and predict heterogeneities in oil reservoirs that are the products of complex deposystems. The unit chosen for study is the Lower Mississippian Big Injun sandstone, a prolific oil producer (nearly 60 fields) in West Virginia. This research effort has been designed and is being implemented as an integrated effort involving stratigraphy, structural geology, petrology, seismic study, petroleum engineering, modeling and geostatistics. Sandstone bodies are being mapped within their regional depositional systems, and then sandstone bodies are being classified in a scheme of relative heterogeneity to determine heterogeneity across depositional systems. Facies changes are being mapped within given reservoirs, and the environments of deposition responsible for each facies are being interpreted to predict the inherent relative heterogeneity of each facies. Structural variations will be correlated both with production, where the availability of production data will permit, and with variations in geologic and engineering parameters that affect production. A reliable seismic model of the Big Injun reservoirs in Granny Creek field is being developed to help interpret physical heterogeneity in that field. Pore types are being described and related to permeability, fluid flow and diagenesis, and petrographic data are being integrated with facies and depositional environments to develop a technique to use diagenesis as a predictive tool in future reservoir development. Another objective in the Big Injun study is to determine the effect of heterogeneity on fluid flow and efficient hydrocarbon recovery in order to improve reservoir management. Graphical methods will be applied to Big Injun production data and new geostatistical methods will be developed to detect regional trends in heterogeneity.

  1. Water Accounting Plus (WA+) – a water accounting procedure for complex river basins based on satellite measurements

    Directory of Open Access Journals (Sweden)

    D. Molden

    2012-11-01

    Full Text Available Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use, and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper we introduce Water Accounting Plus (WA+), a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets: (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external (e.g. climate change) and internal (e.g. infrastructure building) influences can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for three out of the four sheets, but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.
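
The four accounting sheets and their indicators lend themselves to a simple tabular representation. A minimal sketch in Python, using the sheet names from the abstract; the indicator names and values are hypothetical, not taken from the WA+ papers:

```python
# Minimal sketch of the four WA+ sheets as a data structure.
# Sheet names follow the abstract; the indicators and their
# values are hypothetical, not from any real basin.

WA_PLUS_SHEETS = {
    "resource base": {"gross inflow": 1000.0, "net inflow": 950.0},
    "consumption": {"landscape ET": 600.0, "incremental ET": 120.0},
    "productivity": {"biomass per unit ET": 1.8},
    "withdrawal": {"surface withdrawal": 200.0, "groundwater withdrawal": 80.0},
}

def summarise(sheets):
    """Return one headline number per sheet (here: the sum of its indicators)."""
    return {name: sum(ind.values()) for name, ind in sheets.items()}

print(summarise(WA_PLUS_SHEETS))
```

Comparing these indicator values between two accounting periods would then quantify the impact of external or internal influences, as the abstract describes.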

  2. The complexities of measuring access to parks and physical activity sites in New York City: a quantitative and qualitative approach

    Directory of Open Access Journals (Sweden)

    Sohler Nancy L

    2009-06-01

    Full Text Available Abstract Background Proximity to parks and physical activity sites has been linked to an increase in active behaviors, and positive impacts on health outcomes such as lower rates of cardiovascular disease, diabetes, and obesity. Since populations with a low socio-economic status as well as racial and ethnic minorities tend to experience worse health outcomes in the USA, access to parks and physical activity sites may be an environmental justice issue. Methods Geographic Information Systems were used to conduct quantitative and qualitative analyses of park accessibility in New York City, which included kernel density estimation, ordinary least squares (global) regression, geographically weighted (local) regression, and longitudinal case studies consisting of field work and archival research. Accessibility was measured by both density of park acreage and density of physical activity sites. Independent variables included percent non-Hispanic black, percent Hispanic, percent below poverty, percent of adults without a high school diploma, percent with limited English-speaking ability, and population density. Results The ordinary least squares linear regression found weak relationships in both the park acreage density and the physical activity site density models (adjusted R2 = 0.11 and 0.23, respectively; AIC = 7162 and 3529, respectively). Geographically weighted regression, however, suggested spatial non-stationarity in both models, indicating disparities in accessibility that vary over space with respect to the magnitude and directionality of the relationships (AIC = 2014 and -1241, respectively). The qualitative analysis supported the findings of the local regression, confirming that although there is a geographically inequitable distribution of park space and physical activity sites, it is not globally predicted by race, ethnicity, or socio-economic status. Conclusion The combination of quantitative and qualitative analyses demonstrated the complexity of the issues around
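
The global-regression step above can be illustrated with ordinary least squares and an AIC comparison. A sketch on synthetic data (the predictor, response, and coefficients are invented; this is not the NYC dataset, and the geographically weighted step is omitted):

```python
import numpy as np

# OLS with AIC model comparison on synthetic data -- illustrative only.
rng = np.random.default_rng(0)
n = 500
pct_poverty = rng.uniform(0, 60, n)                         # hypothetical predictor
park_density = 0.05 * pct_poverty + rng.normal(0, 1.0, n)   # synthetic response

def ols_aic(X, y):
    """Fit OLS with an intercept and return the Gaussian-likelihood AIC."""
    X = np.column_stack([np.ones(len(y)), X])   # add intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)           # residual sum of squares
    k = X.shape[1] + 1                          # coefficients + error variance
    return len(y) * np.log(rss / len(y)) + 2 * k

aic_null = ols_aic(np.empty((n, 0)), park_density)           # intercept-only model
aic_full = ols_aic(pct_poverty.reshape(-1, 1), park_density) # with the predictor
print(aic_full < aic_null)  # the informative predictor should lower AIC
```

Lower AIC marks the better-fitting model after penalising for parameters, which is how the global and local models above are compared.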

  3. Formation of κ-carrageenan-gelatin polyelectrolyte complexes studied by (1)H NMR, UV spectroscopy and kinematic viscosity measurements.

    Science.gov (United States)

    Voron'ko, Nicolay G; Derkach, Svetlana R; Vovk, Mikhail A; Tolstoy, Peter M

    2016-10-20

    The intermolecular interactions between an anionic polysaccharide from the red algae κ-carrageenan and a gelatin polypeptide, forming stoichiometric polysaccharide-polypeptide (bio)polyelectrolyte complexes in the aqueous phase, were examined. The major method of investigation was high-resolution (1)H NMR spectroscopy. Additional data were obtained by UV absorption spectroscopy, light scattering dispersion and capillary viscometry. Experimental data were interpreted in terms of the changing roles of electrostatic interactions, hydrophobic interactions and hydrogen bonds when κ-carrageenan-gelatin complexes are formed. At high temperatures, when biopolymer macromolecules in solution are in the state of random coil, hydrophobic interactions make a major contribution to complex stabilization. At the temperature of gelatin's coil→helix conformational transition and at lower temperatures, electrostatic interactions and hydrogen bonds play a defining role in complex formation. A proposed model of the κ-carrageenan-gelatin complex is discussed. PMID:27474666

  4. Development of InP solid state detector and liquid scintillator containing metal complex for measurement of pp/7Be solar neutrinos and neutrinoless double beta decay

    Science.gov (United States)

    Fukuda, Yoshiyuki; Moriyama, Shigetaka

    2012-07-01

    A large-volume solid-state detector using a semi-insulating indium phosphide (InP) wafer has been developed for the measurement of pp/7Be solar neutrinos. Basic performance figures, such as the charge collection efficiency and the energy resolution, were measured to be 60% and 20%, respectively. In order to detect the two gammas (115 keV and 497 keV) from neutrino capture, we have designed a hybrid detector consisting of an InP detector and a liquid xenon scintillator for the IPNOS experiment. A new InP detector with a thin electrode (Cr 50 Å - Au 50 Å) was also developed. As another possibility, organic liquid scintillators containing an indium complex and a zirconium complex were studied for the measurement of low-energy solar neutrinos and neutrinoless double beta decay, respectively. Benzonitrile was chosen as the solvent because of its good solubility for the quinolinolato complexes (2 wt%) and the good light yield of the scintillation induced by gamma-ray irradiation. The photoluminescence emission spectra of InQ3 and ZrQ4 in benzonitrile were measured, and a liquid scintillator cocktail using InQ3 or ZrQ4 (50 mg) in benzonitrile solution (20 mL) with the secondary scintillators PPO (100 mg) and POPOP (10 mg) was made. The energy spectra of incident gammas were measured; these are the first gamma-ray energy spectra obtained using the luminescence of metal complexes.

  5. Aerosol Disinfection Capacity of Slightly Acidic Hypochlorous Acid Water Towards Newcastle Disease Virus in the Air: An In Vivo Experiment.

    Science.gov (United States)

    Hakim, Hakimullah; Thammakarn, Chanathip; Suguro, Atsushi; Ishida, Yuki; Nakajima, Katsuhiro; Kitazawa, Minori; Takehara, Kazuaki

    2015-12-01

    The existence of bioaerosol contaminants in farms and outbreaks of some infectious organisms that can be transmitted by air increase the need for enhanced biosecurity, especially for the application of aerosol disinfectants. Here we selected slightly acidic hypochlorous acid water (SAHW) as a candidate and evaluated its virucidal efficacy toward a virus in the air. Three-day-old conventional chicks were challenged with 25 doses of Newcastle disease live vaccine (B1 strain) sprayed with a nebulizer (particle size …); water as the control and SAHW containing 50 or 100 parts per million (ppm) of free available chlorine at pH 6 were sprayed on the treated chicks with other nebulizers. Exposed chicks were kept in separate cages in an isolator and observed for clinical signs. Oropharyngeal swab samples were collected from 2 to 5 days post-exposure from each chick, and the samples were then titrated with primary chicken kidney cells to detect the virus. Cytopathic effects were observed, and a hemagglutination test was performed to confirm the result at 5 days post-inoculation. Clinical signs (sneezing) were recorded, and the virus was isolated from the control and 50 ppm treatment groups, while no clinical signs were observed in, and no virus was isolated from, the 100 ppm treatment group. The virulent Newcastle disease virus (NDV) strain Sato, too, was immediately inactivated by SAHW containing 50 ppm chlorine in the aqueous phase. These data suggest that SAHW containing 100 ppm chlorine can be used for aerosol disinfection of NDV in farms.

  6. Temperature effect on sorption of cations onto clay minerals: complexation modeling and experimental measurements up to 150 deg. C

    Energy Technology Data Exchange (ETDEWEB)

    Tertre, E. [LMTG, UMR UPS-CNRS-IRD 5563, 14 av. E. Belin, 31400 Toulouse (France)]|[ANDRA, Parc de la Croix Blanche - 1/7 rue Jean Monnet, 92298 Chatenay-Malabry (France)]|[EDF R and D, 77818 Moret sur Loing (France); Berger, G.; Castet, S.; Loubet, M. [LMTG, UMR UPS-CNRS-IRD 5563, 14 av. E. Belin, 31400 Toulouse (France); Giffaut, E. [ANDRA, Parc de la Croix Blanche - 1/7 rue Jean Monnet, 92298 Chatenay-Malabry (France); Simoni, E. [Universite Paris XI, Institut de Physique Nucleaire, Groupe de Radiochimie, Bat. 100, 91406 Orsay (France); Catalette, H. [EDF R and D, 77818 Moret sur Loing (France)

    2005-07-01

    clay minerals is not temperature dependent, whereas the surface charges increase weakly when temperature rises from 25 to 60 deg. C [2]. A surface complexation model (DLM) integrating the temperature parameter was applied to explain our sorption data. This model takes into account the site densities and their associated pK{sub a} obtained by our surface acid/base model [2]. [1] Experimental sorption of Ni{sup 2+}, Cs{sup +} and Ln{sup 3+} onto a montmorillonite up to 150 deg. C. E. Tertre, G. Berger, S. Castet, M. Loubet and E. Giffaut (submitted). [2] Acid/base surface chemistry of kaolinite and montmorillonite at 25 and 60 deg. C. Experimental measurements and modeling by CHESS{sup R}. E. Tertre, S. Castet, G. Berger, M. Loubet and E. Giffaut (in preparation). (authors)

  7. Development of X-ray Computed Tomography (CT) Imaging Method for the Measurement of Complex 3D Ice Shapes Project

    Data.gov (United States)

    National Aeronautics and Space Administration — When ice accretes on a wing or other aerodynamic surface, it can produce extremely complex shapes. These are comprised of well-known shapes such as horns and...

  8. The importance and complexity of regret in the measurement of 'good' decisions: a systematic review and a content analysis of existing assessment instruments

    NARCIS (Netherlands)

    Joseph-Williams, N.; Edwards, A.; Elwyn, G.

    2011-01-01

    BACKGROUND OR CONTEXT: Regret is a common consequence of decisions, including those decisions related to individuals' health. Several assessment instruments have been developed that attempt to measure decision regret. However, recent research has highlighted the complexity of regret. Given its relev

  9. Complexity Plots

    KAUST Repository

    Thiyagalingam, Jeyarajan

    2013-06-01

    In this paper, we present a novel visualization technique for assisting the observation and analysis of algorithmic complexity. In comparison with conventional line graphs, this new technique is not sensitive to the units of measurement, allowing multivariate data series of different physical quantities (e.g., time, space and energy) to be juxtaposed together conveniently and consistently. It supports multivariate visualization as well as uncertainty visualization. It enables users to focus on algorithm categorization by complexity classes, while reducing the visual impact caused by constants and algorithmic components that are insignificant to complexity analysis. It provides an effective means for observing the algorithmic complexity of programs with a mixture of algorithms and black-box software through visualization. Through two case studies, we demonstrate the effectiveness of complexity plots in complexity analysis in research, education and application. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and Blackwell Publishing Ltd.

  10. Dynamics measured by neutron scattering correlates with the organization of bioenergetics complexes in natural membranes from hyperthermophile and mesophile bacteria.

    Science.gov (United States)

    Peters, J; Giudici-Orticoni, M T; Zaccai, G; Guiral, M

    2013-07-01

    Various models of membrane structure and of the organization of proteins and complexes in natural membranes have emerged in recent years. However, the lack of systematic dynamical studies to complement structural investigations has hindered the establishment of a more complete picture of these systems. Elastic incoherent neutron scattering gives access to dynamics on a molecular level and was applied to natural membranes extracted from the hyperthermophile Aquifex aeolicus and the mesophile Wolinella succinogenes bacteria. The results made it possible to extract a hierarchy of dynamic flexibility and atomic resilience within the samples, which correlated with the organization of proteins in bioenergetics complexes and the functionality of the membranes. PMID:23880731

  11. A Measure of Systems Engineering Effectiveness in Government Acquisition of Complex Information Systems: A Bayesian Belief Network-Based Approach

    Science.gov (United States)

    Doskey, Steven Craig

    2014-01-01

    This research presents an innovative means of gauging Systems Engineering effectiveness through a Systems Engineering Relative Effectiveness Index (SE REI) model. The SE REI model uses a Bayesian Belief Network to map causal relationships in government acquisitions of Complex Information Systems (CIS), enabling practitioners to identify and…

  12. Cutaneous noradrenaline measured by microdialysis in complex regional pain syndrome during whole-body cooling and heating

    DEFF Research Database (Denmark)

    Terkelsen, Astrid Juhl; Gierthmühlen, Janne; Petersen, Lars J.;

    2013-01-01

    Complex regional pain syndrome (CRPS) is characterised by autonomic, sensory, and motor disturbances. The underlying mechanisms of the autonomic changes in CPRS are unknown. However, it has been postulated that sympathetic inhibition in the acute phase with locally reduced levels of noradrenaline...

  13. Complex Problem Solving in Educational Contexts--Something beyond "g": Concept, Assessment, Measurement Invariance, and Construct Validity

    Science.gov (United States)

    Greiff, Samuel; Wustenberg, Sascha; Molnar, Gyongyver; Fischer, Andreas; Funke, Joachim; Csapo, Beno

    2013-01-01

    Innovative assessments of cross-curricular competencies such as complex problem solving (CPS) have currently received considerable attention in large-scale educational studies. This study investigated the nature of CPS by applying a state-of-the-art approach to assess CPS in high school. We analyzed whether two processes derived from cognitive…

  14. Investigation of the model of the vibration measuring channel of the complex monitoring system of steel tanks

    OpenAIRE

    Бурау, Надежда Ивановна; Цыбульник, Сергей Алексеевич; Шевчук, Дмитрий Владимирович

    2015-01-01

    The presence of defects and damage incurred during manufacture, installation, and operation makes monitoring the technical condition of critical engineering and construction structures one of the most important problems in the diagnosis of such objects. In modern practice, this problem is solved by using complex intelligent monitoring systems. Owing to their wide range of capabilities, these tools for functional diagnostics are widely used in various industries. The...

  15. Formation of p-cresol:piperazine complex in solution monitored by spin-lattice relaxation times and pulsed field gradient NMR diffusion measurements

    Science.gov (United States)

    de Carvalho, Erika Martins; Velloso, Marcia Helena Rodrigues; Tinoco, Luzineide Wanderley; Figueroa-Villar, José Daniel

    2003-10-01

    A study of the nature of the anthelmintic p-cresol:piperazine complex in chloroform solution has been conducted using different NMR techniques: self-diffusion coefficients using DOSY; NOE, NULL, and double-selective T1 measurements to determine inter-molecular distances; and selective and non-selective T1 measurements to determine correlation times. The experimental results in solution and CP-MAS were compared to literature X-ray diffraction data using molecular modeling. It was shown that the p-cresol:piperazine complex exists in solution in a very similar manner as it does in the solid state, with one p-cresol molecule hydrogen bonded through the hydroxyl hydrogen to each nitrogen atom of piperazine. The close correspondence between the X-ray diffraction data and the inter-proton distances obtained by NULL and double selective excitation techniques indicate that those methodologies can be used to determine inter-molecular distances in solution.

  16. Db复小波在超高频局部放电测量中的应用%Application of Complex Daubechies Wavelet in UHF Partial Discharge Measurements

    Institute of Scientific and Technical Information of China (English)

    谢颜斌; 唐炬; 张晓星

    2008-01-01

    On-line partial discharge (PD) detection remains a very challenging task because of strong electromagnetic interference. In this paper, a new de-noising method using the complex Daubechies wavelet (CDW) transform is proposed. The CDW transform is a relatively recent enhancement of the real-valued wavelet transform with two important properties: it is nearly shift-invariant, and it provides phase information. These properties give the CDW transform superiority over real-valued wavelet transforms; the construction algorithm of the CDW is then introduced in detail. Secondly, based on the threshold algorithms used with real-valued wavelet transforms, a complex threshold algorithm is devised. This algorithm takes the different characteristics of the real and imaginary parts of the complex wavelet coefficients into account and modifies them separately. Thirdly, to obtain a real de-noised signal, a new combined information series is devised. By applying different combinations of the real and imaginary parts of the de-noised complex signal, a real de-noised signal can be restored with a higher peak signal-to-noise ratio (PSNR) and less distortion of the original signal. Finally, on-site applications of extracting PD signals from a noisy background with the optimal de-noising scheme based on the CDW are illustrated. The on-site experimental results show that the optimal de-noising scheme is an effective way to suppress white noise in PD measurements.
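
The de-noising idea can be sketched with a real-valued, single-level Haar transform and universal soft thresholding — a simplified stand-in for the complex Daubechies scheme described above (the CDW construction and the complex threshold algorithm are not reproduced here):

```python
import numpy as np

# Real-valued Haar stand-in for wavelet threshold de-noising.
# This is a sketch of the general technique, not the authors' CDW algorithm.

def haar_denoise(x, sigma):
    """One-level orthonormal Haar DWT + universal soft threshold."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)      # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)      # detail coefficients
    t = sigma * np.sqrt(2.0 * np.log(len(x)))   # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)  # soft thresholding
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)            # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

rng = np.random.default_rng(1)
clean = np.where(np.arange(256) < 128, 0.0, 1.0)   # toy step-like pulse
noisy = clean + rng.normal(0, 0.3, 256)            # white noise, as in PD data
denoised = haar_denoise(noisy, sigma=0.3)
mse = lambda u: float(np.mean((u - clean) ** 2))
print(mse(denoised) < mse(noisy))  # thresholding should reduce the MSE
```

The CDW version applies the same threshold-and-reconstruct pattern, but to the real and imaginary coefficient parts separately, which is what preserves phase information.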

  17. Latent-class analysis of recurrence risks for complex phenotypes with selection and measurement error: a twin and family history study of autism.

    OpenAIRE

    Pickles, A; Bolton, P.; Macdonald, H.; Bailey, A; Le Couteur, A; Sim, C H; Rutter, M

    1995-01-01

    The use of the family history method to examine the pattern of recurrence risks for complex disorders such as autism is not straightforward. Problems such as uncertain phenotypic definition, unreliable measurement with increased error rates for more distant relatives, and selection due to reduced fertility all complicate the estimation of risk ratios. Using data from a recent family history study of autism, and a similar study of twins, this paper shows how a latent-class approach can be used...

  18. A system for traceable measurement of the microwave complex permittivity of liquids at high pressures and temperatures

    International Nuclear Information System (INIS)

    A system has been developed for direct traceable dielectric measurements on liquids at high pressures and temperatures. The system consists of a coaxial reflectometric sensor terminated by a metallic cylindrical cell to contain the liquid. It has been designed for measurements on supercritical liquids, but as a first step measurements on dielectric reference liquids were performed. This paper reports on a full evaluation of the system up to 2.5 GHz using methanol, ethanol and n-propanol at pressures up to 9 MPa and temperatures up to 273 °C. A comprehensive approach to the evaluation of uncertainties using Monte Carlo modelling is used
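
The Monte Carlo approach to uncertainty evaluation mentioned above can be sketched generically: draw the input quantities from their assumed distributions, push each draw through the measurement model, and summarise the spread of the output. The measurement model and input uncertainties below are hypothetical placeholders, not the paper's values:

```python
import numpy as np

# Generic Monte Carlo uncertainty propagation (GUM Supplement 1 style).
# The model and input uncertainties are invented for illustration.

rng = np.random.default_rng(42)
M = 100_000                            # number of Monte Carlo trials

# Hypothetical inputs with assumed standard uncertainties.
gamma = rng.normal(0.62, 0.01, M)      # reflection-coefficient magnitude
length = rng.normal(10.0, 0.05, M)     # cell length / mm

# Hypothetical measurement model mapping the inputs to a derived quantity.
permittivity_proxy = (1 + gamma) / ((1 - gamma) * length)

mean = permittivity_proxy.mean()
u = permittivity_proxy.std(ddof=1)     # standard uncertainty from the MC spread
lo, hi = np.percentile(permittivity_proxy, [2.5, 97.5])  # 95 % coverage interval
print(f"{mean:.4f} +/- {u:.4f} (95 % interval {lo:.4f}..{hi:.4f})")
```

The attraction of the Monte Carlo route is that it handles nonlinear models and non-Gaussian outputs without the linearisation step of the classical uncertainty budget.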

  19. Translabial ultrasound assessment of the anal sphincter complex: normal measurements of the internal and external anal sphincters at the proximal, mid-, and distal levels.

    Science.gov (United States)

    Hall, Rebecca J; Rogers, Rebecca G; Saiz, Lori; Qualls, C

    2007-08-01

    The purpose of this study was to measure the internal and external anal sphincters using translabial ultrasound (TLU) at the proximal, mid-, and distal levels of the anal sphincter complex. Human review committee approval was obtained, and all women gave written informed consent. Sixty women presenting for gynecologic ultrasound for symptoms other than pelvic organ prolapse or urinary or anal incontinence underwent TLU. Thirty-six (60%) were asymptomatic and intact, 13 were symptomatic and intact, and 11 were disrupted. Anterior-posterior diameters of the internal anal sphincter at all levels, and of the external anal sphincter at the distal level, were measured in four quadrants. Mean sphincter measurements are given for symptomatic and asymptomatic intact women and are comparable to previously reported endoanal MRI and ultrasound measurements. PMID:17221149

  20. Complexity measurement of a graphical programming language and comparison of a graphical and a textual design language

    OpenAIRE

    Goff, Roger Allen

    1987-01-01

    For many years the software engineering community has been attacking the software reliability problem on two fronts: first, via design methodologies, languages, and tools as a pre-check on quality, and second, by measuring the quality of produced software as a post-check. This research attempts to unify the approach to creating reliable software by providing the ability to measure the quality of a design prior to its implementation. Also presented is a comparison of a graphical and a...

  1. The Relationship of 3D Translabial Ultrasound Anal Sphincter Complex Measurements to Postpartum Anal and Fecal Incontinence

    Science.gov (United States)

    MERIWETHER, Kate V.; HALL, Rebecca J.; LEEMAN, Lawrence M.; MIGLIACCIO, Laura; QUALLS, Clifford; ROGERS, Rebecca G.

    2015-01-01

    Objective We aimed to determine whether ASC measurements on translabial ultrasound (TL-US) were related to anal incontinence (AI) or fecal incontinence (FI) symptoms six months postpartum. Methods A prospective cohort of primiparous women underwent TL-US six months after a vaginal birth (VB) or Cesarean delivery (CD). Muscle thickness was measured at the 3, 6, 9, and 12 o'clock positions of the external sphincter (EAS), in the same four quadrants of the internal sphincter (IAS) at the proximal, mid, and distal levels, and at the bilateral pubovisceralis muscle (PVM). Measurements were correlated to AI and FI on the Wexner Fecal Incontinence Scale, with sub-analyses by mode of delivery. The odds ratio (OR) of symptoms was calculated for every one millimeter increase in muscle thickness (E1MIT). Results 423 women (299 VB, 124 CD) had TL-US six months postpartum. Decreased AI risk was associated with thicker measurements at the 6 o'clock (OR 0.74 E1MIT) and 9 o'clock proximal IAS (OR 0.71 E1MIT) in the entire cohort. For CD women, thicker measurements of the 9 o'clock proximal IAS were associated with a decreased risk of AI (OR 0.56 E1MIT), and thicker distal 6 o'clock IAS measurements were related to a decreased risk of FI (OR 0.37 E1MIT). For VB women, no sphincter measurements were significantly related to symptoms, but thicker PVM measurements were associated with an increased risk of AI (right side OR 1.32 E1MIT; left side OR 1.21 E1MIT). Conclusions ASC anatomy is associated with AI and FI at certain locations; these locations vary based on the patient's mode of delivery. PMID:26085463
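
Because the odds ratios are reported per one-millimeter increase in thickness (E1MIT), they compose multiplicatively over larger thickness differences under the usual logistic-regression assumption of linearity in log-odds. A small sketch using the cohort's reported OR of 0.74:

```python
import math

# How a "per 1 mm" odds ratio scales to a k-mm difference:
# OR(k mm) = OR(1 mm)**k, since the logistic model is linear in log-odds.

def or_over_k_mm(or_per_1mm: float, k_mm: float) -> float:
    """Odds ratio for a k-mm thickness difference, given the per-1-mm OR."""
    beta = math.log(or_per_1mm)    # logistic regression coefficient per mm
    return math.exp(beta * k_mm)   # equivalent to or_per_1mm ** k_mm

# OR 0.74 per 1 mm at the 6 o'clock proximal IAS (entire cohort):
print(round(or_over_k_mm(0.74, 2.0), 3))  # odds ratio for a 2-mm difference
```

This is only an interpretation aid for the reported coefficients; it does not reproduce the study's model fitting.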

  2. Kinetic measurements and quantum chemical calculations on low spin Ni(II)/(III) macrocyclic complexes in aqueous and sulphato medium

    Indian Academy of Sciences (India)

    Anuradha Sankaran; E J Padma Malar; Venkatapuram Ramanujam Vijayaraghavan

    2015-07-01

    The Cu(II) ion-catalyzed kinetics of oxidation of H2O2 by [NiIIIL2] (L2 = 1,8-bis(2-hydroxyethyl)-1,3,6,8,10,13-hexaazacyclotetradecane) was studied in aqueous acidic medium in the presence of sulphate ion. The rate of oxidation of H2O2 by [NiIIIL2] is faster than that by [NiIIIL1] (L1 = 1,4,8,11-tetraazacyclotetradecane) in sulphate medium. DFT calculations at the BP86/def2-TZVP level reveal different modes of bonding between [NiL]II/III and water ligands (L = L1 and L2). In aqueous medium, two water molecules interact with [NiL]II through weak hydrogen bonds with L and are tilted by ∼23° from the vertical axis, forming the dihydrate [NiL]2+.2H2O. However, there is coordinate bond formation between [NiL1]III and two water molecules in aqueous medium, and between [NiL1]III and an aqua and a sulphato ligand in sulphate medium, leading to the octahedral complexes [NiL1(H2O)2]3+ and [NiL1(SO4)(H2O)]+. In the analogous [NiL2]III, the water molecules are bound by hydrogen bonds, resulting in [NiL2]3+.2H2O and [NiL2(SO4)]+.H2O. As the sulphato complex [NiL2(SO4)]+.H2O is less stable than [NiL1(SO4)(H2O)]+ in view of the weak H-bonding interactions in the former, it can react faster. Thus, the difference in the mode of bonding between Ni(III) and the water ligand can explain the rate of oxidation of H2O2 by [NiIIIL] complexes.

  3. Electronic speckle pattern interferometry technique for the measurement of complex mechanical structures for aero-spatial applications

    Science.gov (United States)

    Restrepo, René; Uribe-Patarroyo, Néstor; Garranzo, Daniel; Pintado, José M.; Frovel, Malte; Belenguer, Tomás

    2010-09-01

    Using the electronic speckle pattern interferometry (ESPI) technique in an in-plane arrangement, the coefficient of thermal expansion (CTE) of a composite material that will be used in a passive focusing mechanism of an aerospace mission was measured. The ESPI measurement was compared with another interferometric method (differential interferometry), whose principal characteristic is its high accuracy, although its measurement is only local. As a final step, the results were used to provide feedback to the finite element analysis (FEA). Before the composite material measurements, a quality assessment of the technique was carried out by measuring the CTE of aluminum 6061-T6. Both techniques were compared against the datasheet delivered by the supplier. A review of the basic concepts was given, especially with regard to ESPI, and the considerations needed to predict the quality of the fringe formation were explained. A review of the basic concepts of mechanical calculation in composite materials was also given. The CTE of the composite material was found to be 4.69×10-6 ± 3×10-6 K-1. The most important advantage of ESPI over differential interferometry is that ESPI provides more information, owing to its intrinsically extended-area surface deformation reconstruction, in comparison with the strictly local measurement of differential interferometry.
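
For in-plane ESPI, the standard sensitivity relation d = Nλ / (2 sin θ) links the fringe count N to in-plane displacement, from which a CTE follows as strain per kelvin. A back-of-envelope sketch; the wavelength, illumination half-angle, fringe count, gauge length, and temperature step are all assumed values, not the paper's:

```python
import math

# CTE from an in-plane ESPI fringe count, using the standard in-plane
# sensitivity d = N * lambda / (2 sin(theta)). All numbers are assumed.

wavelength = 532e-9        # laser wavelength / m (assumed)
theta = math.radians(30)   # illumination half-angle (assumed)
N_fringes = 8.8            # fringes counted over the gauge length (assumed)
L0 = 0.10                  # gauge length / m (assumed)
dT = 10.0                  # temperature step / K (assumed)

d = N_fringes * wavelength / (2 * math.sin(theta))  # in-plane displacement / m
alpha = d / (L0 * dT)                               # CTE = strain per kelvin
print(f"alpha = {alpha:.3e} 1/K")
```

The point of the sketch is the conversion chain fringes → displacement → strain → CTE; the actual measurement also requires phase unwrapping and thermal-drift control, which are outside this illustration.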

  4. Automatic and accurate measurements of P-wave and S-wave polarisation properties with a weighted multi-station complex polarisation analysis

    Science.gov (United States)

    de Meersman, K.; van der Baan, M.; Kendall, J.-M.; Jones, R. H.

    2003-04-01

    We present a weighted multi-station complex polarisation analysis to determine P-wave and S-wave polarisation properties of three-component seismic array data. Complex polarisation analysis of particle motion on seismic data was first introduced by Vidale (1986). In its original form, the method is an interpretation of the eigenvalue decomposition of a 3-by-3 complex data-covariance matrix. We have extended the definition of the data-covariance matrix (C) to C = X^H W^-1 X, where C is now a 3n-by-3n Hermitian complex covariance matrix, with n the number of included three-component (3C) stations. X is the data matrix, whose columns are the analytic signals of the northern, eastern and vertical components of the successive 3C stations; X^H is the conjugate transpose of X, and W is a diagonal weighting matrix containing the pre-arrival noise levels of all components and all stations. The signals used in the data matrix are corrected for arrival-time differences. The eigenvectors and eigenvalues of C then describe the polarisation properties within the selected analysis window for all included stations. The main advantages of this approach are a better separation of signal and noise in the covariance matrix and the measurement of signal polarisation properties that are not influenced by the presence of polarised white noise. The technique was incorporated into an automated routine to measure the P-wave and S-wave polarisation properties of a microseismic dataset. The data were recorded in the Valhall oilfield in 1998 with a six-level 3C vertical linear array with geophones at 20 m intervals between depths of 2100 m and 2200 m. In total, 303 microseismic events were analysed and the results compared with manual interpretations. This comparison showed the advantage and high accuracy of the method.

  5. Measurement of Labile Cu, Pb and Their Complexation Capacity in Yueqing Bay in Zhejiang Province, China

    Institute of Scientific and Technical Information of China (English)

    王正方; 吕海燕; 傅和芳

    2004-01-01

    The complexation capacity of Cu and Pb and their labile and organic contents were determined separately for surface seawater samples from Yueqing Bay. The samples were prepared using the Nuclepore filtration method, yielding <1.0 μm, <0.4 μm and <0.2 μm particulate water samples. Our data indicated that the <0.2 μm colloidal fraction is a major carrier for the distribution of copper in seawater, and that the affinity of Cu to marine microparticles plays an important role in this process. Pb, however, tends to be adsorbed by >0.2 μm particles. The complexation capacity of Pb with <0.2 μm particulates was smaller than that with 0.2-1.0 μm particulates, averaging 11.5 and 23.0 nmol/L respectively. The results suggested that colloidal particles were responsible for the distribution and concentration of Pb in seawater.

  6. Simultaneous Measurement of Antenna Gain and Complex Permittivity of Liquid in Near-Field Region Using Weighted Regression

    Science.gov (United States)

    Ishii, Nozomu; Shiga, Hiroki; Ikarashi, Naoto; Sato, Ken-Ichi; Hamada, Lira; Watanabe, Soichi

    As a technique for calibrating electric-field probes used in standardized SAR (Specific Absorption Rate) assessment, we have studied a method based on the Friis transmission formula in tissue-equivalent liquid. It is difficult to measure power transmission between two reference antennas in the far-field region because of the large attenuation in the liquid; the conventional Friis transmission formula therefore cannot be applied to our measurement, so we developed an extension of the formula that is valid in the near-field region. In this paper, the method of weighted least squares is introduced to reduce the effect of noise in the measurement system when the gain of the antenna operated in the liquid is determined by curve fitting, and we examine how to choose the fitting range so as to reduce the uncertainty of the estimated gain.

  7. Fractal dimension of trabecular bone: comparison of three histomorphometric computed techniques for measuring the architectural two-dimensional complexity.

    Science.gov (United States)

    Chappard, D; Legrand, E; Haettich, B; Chalès, G; Auvinet, B; Eschard, J P; Hamelin, J P; Baslé, M F; Audran, M

    2001-11-01

    Trabecular bone has been reported as having two-dimensional (2-D) fractal characteristics at the histological level, a finding correlated with biomechanical properties. However, several fractal dimensions (D) are known and computational ways to obtain them vary considerably. This study compared three algorithms on the same series of bone biopsies, to obtain the Kolmogorov, Minkowski-Bouligand, and mass-radius fractal dimensions. The relationships with histomorphometric descriptors of the 2-D trabecular architecture were investigated. Bone biopsies were obtained from 148 osteoporotic male patients. Bone volume (BV/TV), trabecular characteristics (Tb.N, Tb.Sp, Tb.Th), strut analysis, star volumes (marrow spaces and trabeculae), inter-connectivity index, and Euler-Poincaré number were computed. The box-counting method was used to obtain the Kolmogorov dimension (D(k)), the dilatation method for the Minkowski-Bouligand dimension (D(MB)), and the sandbox for the mass-radius dimension (D(MR)) and lacunarity (L). Logarithmic relationships were observed between BV/TV and the fractal dimensions. The best correlation was obtained with D(MR) and the lowest with D(MB). Lacunarity was correlated with descriptors of the marrow cavities (ICI, star volume, Tb.Sp). Linear relationships were observed among the three fractal techniques which appeared highly correlated. A cluster analysis of all histomorphometric parameters provided a tree with three groups of descriptors: for trabeculae (Tb.Th, strut); for marrow cavities (Euler, ICI, Tb.Sp, star volume, L); and for the complexity of the network (Tb.N and the three D's). A sole fractal dimension cannot be used instead of the classic 2-D descriptors of architecture; D rather reflects the complexity of branching trabeculae. Computation time is also an important determinant when choosing one of these methods. PMID:11745685
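    As a concrete illustration of the box-counting approach behind the Kolmogorov dimension D(k), the sketch below (our own minimal code, not the study's implementation) counts occupied boxes N(s) at dyadic box sizes s on a binary image and estimates D as the negative slope of log N(s) versus log s.

```python
import numpy as np

def box_counting_dimension(binary_image):
    """Kolmogorov (box-counting) fractal dimension of a 2-D binary image.

    Counts boxes of side s containing foreground pixels for dyadic s and
    fits log N(s) = -D * log s + c by least squares.
    """
    img = np.asarray(binary_image, dtype=bool)
    sizes, counts = [], []
    s = min(img.shape) // 2
    while s >= 1:
        # Tile the image with s-by-s boxes and count the occupied ones.
        h = (img.shape[0] // s) * s
        w = (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
        sizes.append(s)
        s //= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

A filled region recovers D close to 2 and a single trabecula-like line close to 1, which is the sanity check usually applied before running such code on segmented biopsy sections.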

  8. Speed Isn't Everything: Complex Processing Speed Measures Mask Individual Differences and Developmental Changes in Executive Control

    Science.gov (United States)

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2013-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of "processing speed" may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and…

  9. Pseudo-stokes vector from complex signal representation of a speckle pattern and its applications to micro-displacement measurement

    DEFF Research Database (Denmark)

    Wang, W.; Ishijima, R.; Matsuda, A.;

    2010-01-01

    As an improvement on the intensity correlation used widely in conventional electronic speckle photography, we propose a new technique for displacement measurement based on correlating derivatives of Stokes-like parameters of transformed speckle patterns. The method is based on a Riesz transform of ...

  10. Theory and calibration of non-nulling seven-hole cone probes for use in complex flow measurement

    Science.gov (United States)

    Everett, K. N.; Durston, D. A.; Gerner, A. A.

    1982-01-01

    A seven-hole conical pressure probe capable of measuring flow conditions at angles up to 75 deg relative to its axis is described. The theoretical rationale of the seven-hole probe is developed and the calibration procedure outlined. Three-variable third order polynomials are used to represent local values of total pressure, static pressure, Mach number and relative flow angles. These flow conditions can be determined explicitly from measured probe pressures. Flow angles may be determined within 2.5 deg and Mach number within 0.05 with 95% certainty. The probe was calibrated in subsonic compressible and incompressible flows. Results of a calibration of four seven-hole probes are presented.
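    The calibration fit described here, three-variable third-order polynomials relating each flow quantity to measured probe pressures, can be sketched as an ordinary least-squares problem. The code below is illustrative only; the choice of pressure coefficients and all names are our assumptions, not taken from the report.

```python
import itertools
import numpy as np

def poly3_design_matrix(p):
    """All monomials of total degree <= 3 in three pressure coefficients.

    p: (n_samples, 3) array of dimensionless probe-pressure coefficients.
    Returns an (n_samples, 20) design matrix.
    """
    cols = []
    for i, j, k in itertools.product(range(4), repeat=3):
        if i + j + k <= 3:
            cols.append(p[:, 0]**i * p[:, 1]**j * p[:, 2]**k)
    return np.column_stack(cols)

def fit_calibration(p, flow_quantity):
    """Least-squares polynomial coefficients for one flow quantity
    (e.g. Mach number or a relative flow angle) from calibration data."""
    A = poly3_design_matrix(p)
    coeffs, *_ = np.linalg.lstsq(A, flow_quantity, rcond=None)
    return coeffs

def evaluate(p, coeffs):
    """Explicit evaluation of the fitted quantity from measured pressures."""
    return poly3_design_matrix(p) @ coeffs
```

Once the coefficients are fitted per calibration sector, flow conditions follow explicitly from measured pressures with no iterative nulling, which is the point of the non-nulling design.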

  11. Development of a microwave transmission setup for time-resolved measurements of the transient complex conductivity in bulk samples

    Science.gov (United States)

    Schins, J. M.; Prins, P.; Grozema, F. C.; Abellón, R. D.; de Haas, M. P.; Siebbeles, L. D. A.

    2005-08-01

    We describe and characterize a microwave transmission setup for the measurement of radiation-induced transient conductivities in the frequency range between 26 and 38 GHz (Q band). This technique combines the virtues of two existing techniques. On the one hand, the microwave transmission technique is well established for the determination of (quasi)static conductivities, but requires adaptations to be suitable for the determination of transient conductivities with 1 ns temporal resolution. On the other hand, the transient conductivity technique is also well established, but in its present form (using a reflection configuration) it suffers from a poor signal-to-noise ratio due to unwanted interferences caused by the circulator, which diverts part of the incoming microwave flux directly to the detector. We characterized the transmission setup by measuring the real and imaginary components of the conductivity of pulse-irradiated CO2 gas at different pressures, and compared these results with predictions of the Drude model. CO2 was chosen as a test sample because of its well-characterized behavior when irradiated with MeV electron pulses, and because a wide range of ratios of the imaginary to the real component of the conductivity is obtainable simply by controlling the pressure. For intrinsic bulk insulators (either powders or in solution), pulse-induced conductivity changes as small as 10⁻⁸ S/m can be measured with nanosecond time resolution. The ratio of the imaginary to the real part of the conductivity can be measured in the range from 0.084 to 28, which means that the dynamic range has been increased more than 100-fold with respect to the customary reflection setup.
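    For reference, the Drude model used as the benchmark gives a complex AC conductivity σ(ω) = σ₀ / (1 − iωτ), whose imaginary-to-real ratio is simply ωτ, so varying the gas pressure (and hence the collision time τ) sweeps the measurable ratio. A minimal numerical check, with illustrative parameter values of our own choosing:

```python
import numpy as np

def drude_conductivity(sigma0, tau, omega):
    """Drude AC conductivity: sigma(omega) = sigma0 / (1 - 1j*omega*tau)."""
    return sigma0 / (1.0 - 1j * omega * tau)

# Im/Re of the Drude conductivity equals omega*tau exactly, so two
# illustrative collision times bracket a wide ratio range at 30 GHz
# (inside the 26-38 GHz Q band): roughly 0.085 and 28.
omega = 2 * np.pi * 30e9
for tau in (4.5e-13, 1.5e-10):
    s = drude_conductivity(1.0, tau, omega)
    print(tau, s.imag / s.real)
```

This is why measuring both quadratures of the transmitted field at several pressures provides a stringent test of the setup's dynamic range.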

  12. Structure and equilibria of Ca2+-complexes of glucose and sorbitol from multinuclear (1H, 13C and 43Ca) NMR measurements supplemented with molecular modelling calculations

    Science.gov (United States)

    Pallagi, A.; Dudás, Cs.; Csendes, Z.; Forgó, P.; Pálinkó, I.; Sipos, P.

    2011-05-01

    Ca2+-complexation of D-glucose and D-sorbitol has been investigated with the aid of multinuclear (1H, 13C and 43Ca) NMR spectroscopy and ab initio quantum chemical calculations. Formation constants of the 1:1 complexes formed have been estimated from one-dimensional 13C NMR spectra obtained at constant ionic strength (1 M NaCl). Binding sites were identified from 2D 1H-43Ca NMR spectra. The 2D NMR measurements and ab initio calculations indicated that Ca2+ ions were bound in a tridentate manner, via the glycosidic OH, the ethereal oxygen in the ring and the OH on the terminal carbon, for the α- and β-anomers of glucose; for sorbitol, simultaneous binding of four hydroxyl moieties (C1, C2, C4 and C6) was suggested.
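    Formation constants of 1:1 complexes are typically extracted from fast-exchange NMR shifts, where the observed shift is the population-weighted average of the free and bound ligand shifts. The sketch below is our own generic illustration with synthetic numbers, not the paper's data; it also assumes Ca2+ is in sufficient excess that its free concentration can be taken equal to its total concentration.

```python
import numpy as np
from scipy.optimize import curve_fit

def observed_shift(ca, K, d_free, d_bound):
    """Fast-exchange observed shift for 1:1 complexation.

    The bound fraction of ligand is f = K*[Ca] / (1 + K*[Ca]); the observed
    shift is the weighted average of the free and bound shifts.
    """
    f_bound = K * ca / (1.0 + K * ca)
    return d_free + (d_bound - d_free) * f_bound

# Synthetic titration: recover K from noisy 13C-shift data.
rng = np.random.default_rng(1)
ca = np.linspace(0.0, 0.5, 12)                    # mol/L, illustrative
delta = observed_shift(ca, 2.5, 72.0, 74.0)       # "true" K = 2.5 M^-1
delta = delta + rng.normal(0.0, 0.005, ca.size)   # small measurement noise
(K_fit, d_f, d_b), _ = curve_fit(observed_shift, ca, delta,
                                 p0=(1.0, 71.0, 75.0))
```

The same binding isotherm underlies the constant-ionic-strength titration described in the abstract; with real data, K and the bound-state shift are correlated parameters and benefit from a wide concentration range.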

  13. Measuring $\

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Jessica Sarah [Univ. of Cambridge (United Kingdom)

    2011-01-01

    The MINOS Experiment consists of two steel-scintillator calorimeters sampling the long-baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the 'atmospheric' neutrino mixing parameters, Δm²atm and sin²(2θatm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations on the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a νμ disappearance analysis, and presents a new sophisticated fitting software framework, which employs a maximum-likelihood method to extract the best-fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors, and is capable of fitting multiple event samples (defined by the selections applied) in parallel, with any combination of energy-dependent and energy-independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset, if the true mixing angle were non-maximal, was also investigated, with an improved neutrino energy reconstruction for very low energy events. The best-fit oscillation parameters obtained by the fitting software, incorporating resolution information, were: |Δm²| = 2.32 (+0.12/−0.08) × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world-best measurement of the atmospheric neutrino mass

  14. Determination of catecholamines based on the measurement of the metal nanoparticle-enhanced fluorescence of their terbium complexes

    International Nuclear Information System (INIS)

    We have developed a method for the determination of the three catecholamines (CAs) epinephrine (EP), norepinephrine (NE), and dopamine (DA) at sub-nanomolar levels. It is found that the luminescence of the complexes formed between the CAs and the Tb3+ ion is strongly enhanced in the presence of colloidal silver nanoparticles (Ag-NPs). The Ag-NPs cause a transfer of resonance energy to the fluorophores through the interaction of the excited-state fluorophores and surface plasmon electrons in the Ag-NPs. Under optimized conditions, the luminescence intensity of the system is linearly related to the concentration of the CAs. Linearity is observed in the concentration ranges of 2.5-110 nM for EP, 2.8-240 nM for NE, and 2.4-140 nM for DA, with limits of detection as low as 0.25 nM, 0.64 nM and 0.42 nM, respectively. Relative standard deviations were determined at 10 nM concentrations (for n = 10) and gave values of 0.98%, 1.05% and 0.96% for EP, NE and DA, respectively. Catecholamines were successfully determined in pharmaceutical preparations, and successful recovery experiments are demonstrated for urine and serum samples. (author)

  15. Statistical and Spectral Analysis of Wind Characteristics Relevant to Wind Energy Assessment Using Tower Measurements in Complex Terrain

    Directory of Open Access Journals (Sweden)

    Radian Belu

    2013-01-01

    The main objective of this study was to investigate spatial and temporal characteristics of wind speed and direction in complex terrain that are relevant to wind energy assessment and development, as well as to wind energy system operation, management, and grid integration. Wind data from five tall meteorological towers located in Western Nevada, USA, operated from August 2003 to March 2008, were used in the analysis. The multiannual average wind speeds did not show a significantly increasing trend with elevation, while the turbulence intensity slowly decreased with increasing average wind speed. The wind speed and direction were modeled using the Weibull and von Mises distribution functions. The correlations show a strong coherence between wind speed and direction, with the amplitude of the multiday periodicity slowly decreasing at longer lag periods. The spectral analysis shows significant annual periodicity with similar characteristics at all locations. The relatively high correlations between the towers and the small range of the computed turbulence intensity indicate that wind variability is dominated by regional synoptic processes. Knowledge of daily, seasonal, and annual wind periodicities is very important for wind energy resource assessment, wind power plant operation, management, and grid integration.
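    The two distribution fits mentioned above can be sketched as follows. This is our own minimal illustration, not the study's procedure: a fixed-location maximum-likelihood Weibull fit for speeds, and circular-moment estimates with Fisher's approximation for the von Mises concentration parameter.

```python
import numpy as np
from scipy import stats

def fit_weibull_speed(speed):
    """Weibull fit to wind speeds with the location fixed at zero.

    Returns (shape k, scale c) of the fitted distribution."""
    k, _, c = stats.weibull_min.fit(speed, floc=0.0)
    return k, c

def fit_vonmises_direction(theta):
    """Von Mises fit to wind directions (radians) from circular moments.

    The mean direction comes from the first trigonometric moment; the
    concentration kappa uses Fisher's closed-form approximation."""
    C, S = np.cos(theta).mean(), np.sin(theta).mean()
    mu = np.arctan2(S, C)                  # mean direction
    R = np.hypot(C, S)                     # mean resultant length
    kappa = R * (2.0 - R**2) / (1.0 - R**2)
    return kappa, mu
```

With long tower records, such fits per season and per tower are the usual basis for comparing the multiannual wind regimes at the five sites.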

  16. Utilization of Methyl Proton Resonances in Cross-Saturation Measurement for Determining the Interfaces of Large Protein-Protein Complexes

    International Nuclear Information System (INIS)

    Cross-saturation experiments allow the identification of the contact residues of large protein complexes (MW > 50K) more rigorously than conventional NMR approaches involving chemical shift perturbations and hydrogen-deuterium exchange experiments [Takahashi et al. (2000) Nat. Struct. Biol., 7, 220-223]. In the amide proton-based cross-saturation experiment, the combined use of high deuteration levels for non-exchangeable protons of the ligand protein and a solvent with a low concentration of 1H2O greatly enhanced the selectivity of the intermolecular cross-saturation phenomenon. Unfortunately, experimental limitations caused losses in sensitivity. Furthermore, since main-chain amide protons are not generally exposed to solvent, the efficiency of the saturation transfer directed to the main-chain amide protons is not very high. Here we propose an alternative cross-saturation experiment which utilizes the methyl protons of the side chains of the ligand protein. Owing to the fast internal rotation about the methyl axis, we theoretically and experimentally demonstrated the enhanced efficiency of this approach. The methyl-utilizing cross-saturation experiment has clear advantages in sensitivity and saturation transfer efficiency over the amide proton-based approach.

  17. Comparative study on CoFe2O4 ultrafine particles in liquid and dry specimens of acidic water-based ferrofluids by STM

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the acidic water-based ferrofluids, the ultrafine particles appear in the form of spheres with diameters in the range of 2-6 nm. The poly-groups of CoFe2O4 ultrafine particles are divided into two species, i.e. the weak poly-group and the strong poly-group, based on the degree to which the aggregated particles can be resolved in STM observations. The ultrafine particles in the liquid and dry specimens have different ratios of the two species.

  18. Approximate Entropy as a measure of complexity in sap flow temporal dynamics of two tropical tree species under water deficit

    Directory of Open Access Journals (Sweden)

    Gustavo M. Souza

    2004-09-01

    Approximate Entropy (ApEn), a model-independent statistic that quantifies serial irregularity, was used to evaluate changes in the temporal dynamics of sap flow in two tropical tree species subjected to water deficit. Water deficit induced a decrease in sap flow of G. ulmifolia, whereas C. legalis maintained stable sap flow levels. Slight increases in time-series complexity were observed in both species under drought conditions. This study showed that ApEn can be used as a helpful tool to assess slight changes in the temporal dynamics of physiological data, and to uncover patterns in plant physiological responses to environmental stimuli.
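    For readers unfamiliar with the statistic, ApEn(m, r) compares the regularity of templates of length m and m+1. The sketch below is a generic textbook implementation (not the authors' code), using the common defaults m = 2 and r = 0.2 times the series standard deviation.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate Entropy ApEn(m, r) of a 1-D series.

    Pincus' statistic: ApEn = Phi(m) - Phi(m+1), where Phi(m) is the mean
    log fraction of length-m templates lying within tolerance r of each
    template (Chebyshev distance, self-matches included)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()              # common default tolerance

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distances between every pair of templates.
        d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
        c = (d <= r).mean(axis=1)      # fraction of matches per template
        return np.log(c).mean()

    return phi(m) - phi(m + 1)
```

Higher ApEn indicates a less predictable series, which is how a slight increase under drought maps onto greater complexity of the sap-flow dynamics.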

  19. Complex permittivity measurements during high temperature recycling of space shuttle antenna window and dielectric heat shield materials

    Science.gov (United States)

    Bassett, H. L.; Bomar, S. H., Jr.

    1973-01-01

    The research performed and the data obtained on candidate space shuttle antenna window and heat shield materials are presented. The measurement technique employs a free-space focused beam microwave bridge for obtaining RF transmission data, and a device which rotates a sample holder which is heated on one side by natural gas-air flames. The surface temperature of each sample is monitored by IR pyrometry; embedded and rear surface thermocouples are also used in obtaining temperature data. The surface of the sample undergoing test is subjected to approximately the same temperature/time profile that occurs at a proposed antenna position on the space shuttle as it re-enters. The samples are cycled through ten of these temperature profiles to determine the recycling effects. Very little change was noted in the materials due to the recycling.

  20. Direct quantitative electrical measurement of many-body interactions in exciton complexes in InAs quantum dots.

    Science.gov (United States)

    Labud, P A; Ludwig, A; Wieck, A D; Bester, G; Reuter, D

    2014-01-31

    We present capacitance-voltage spectra for the conduction band states of InAs quantum dots obtained under continuous illumination. The illumination leads to the appearance of additional charging peaks that we attribute to the charging of electrons into quantum dots containing a variable number of illumination-induced holes. By this we demonstrate an electrical measurement of excitonic states in quantum dots. Magnetocapacitance-voltage spectroscopy reveals that the electron always tunnels into the lowest electronic state. This allows us to directly extract, from the highly correlated many-body states, the correlation energy. The results are compared quantitatively to state of the art atomistic configuration interaction calculations, showing very good agreement for a lower level of excitations and also limitations of the approach for an increasing number of particles. Our experiments offer a rare benchmark to many-body theoretical calculations. PMID:24580478

  1. Intentional cargo disruption by nefarious means: Examining threats, systemic vulnerabilities and securitisation measures in complex global supply chains.

    Science.gov (United States)

    McGreevy, Conor; Harrop, Wayne

    2015-01-01

    Global trade and commerce require products to be securely contained and transferred in a timely way across great distances and between national boundaries. Throughout the process, cargo and containers are stored, handled and checked by a range of authorities and authorised agents. Intermodal transportation involves the use of container ships, planes, railway systems, land bridges, road networks and barges. This paper examines the threats and systemic risks associated with the intentional disruption of cargo and container freight by nefarious means. It explores the main threats, vulnerabilities and security measures relevant to significant intermodal transit risk issues such as theft, piracy, terrorism, contamination, counterfeiting and product tampering. Three risk and vulnerability models are examined, and basic standards and regulations relevant to the safe and secure transit of container goods across international supply networks are outlined. PMID:25990978

  2. Characterization of the low-temperature triplet state of chlorophyll in photosystem II core complexes: Application of phosphorescence measurements and Fourier transform infrared spectroscopy.

    Science.gov (United States)

    Zabelin, Alexey A; Neverov, Konstantin V; Krasnovsky, Alexander A; Shkuropatova, Valentina A; Shuvalov, Vladimir A; Shkuropatov, Anatoly Ya

    2016-06-01

    Phosphorescence measurements at 77 K and light-induced FTIR difference spectroscopy at 95 K were applied to the study of the triplet state of chlorophyll a ((3)Chl) in photosystem II (PSII) core complexes isolated from spinach. With both methods, (3)Chl was observed in core preparations with a doubly reduced primary quinone acceptor QA. The spectral parameters of the Chl phosphorescence resemble those in isolated PSII reaction centers (RCs). The main spectral maximum and the lifetime of the phosphorescence were 955±1 nm and 1.65±0.05 ms, respectively; in the excitation spectrum, the absorption maxima of all core-complex pigments (Chl, pheophytin a (Pheo), and β-carotene) were observed. A differential signal at 1667(-)/1628(+) cm(-1), reflecting a downshift of the stretching frequency of the 13(1)-keto C=O group of Chl, was found to dominate the triplet-minus-singlet FTIR difference spectrum of the core complexes. Based on the FTIR results and literature data, it is proposed that (3)Chl is mostly localized on the accessory chlorophyll that is in triplet equilibrium with P680. Analysis of the data suggests that the Chl triplet state responsible for the phosphorescence and the FTIR difference spectrum is mainly generated by charge recombination in the reaction-center radical pair P680(+)PheoD1(-), and that the energy and temporal parameters of this triplet state, as well as the molecular environment and interactions of the triplet-bearing Chl molecule, are similar in the PSII core complexes and isolated PSII RCs.

  3. Measuring and predicting reservoir heterogeneity in complex deposystems. The fluvial-deltaic Big Injun Sandstone in West Virginia. Final report, September 20, 1991--October 31, 1993

    Energy Technology Data Exchange (ETDEWEB)

    Hohn, M.E.; Patchen, D.G.; Heald, M.; Aminian, K.; Donaldson, A.; Shumaker, R.; Wilson, T.

    1994-05-01

    Non-uniform composition and permeability of a reservoir, commonly referred to as reservoir heterogeneity, is recognized as a major factor in the efficient recovery of oil during primary production and enhanced recovery operations. Heterogeneities are present at various scales and are caused by various factors, including folding and faulting, fractures, diagenesis and depositional environments. Thus, a reservoir consists of a complex flow system, or series of flow systems, dependent on lithology, sandstone genesis, and structural and thermal history. Ultimately, however, fundamental flow units are controlled by the distribution and type of depositional environments. Reservoir heterogeneity is difficult to measure and predict, especially in more complex reservoirs such as fluvial-deltaic sandstones. The Appalachian Oil and Natural Gas Research Consortium (AONGRC), a partnership of Appalachian basin state geological surveys in Kentucky, Ohio, Pennsylvania, and West Virginia, and West Virginia University, studied the Lower Mississippian Big Injun sandstone in West Virginia. The Big Injun research was multidisciplinary and designed to measure and map heterogeneity in existing fields and undrilled areas. The main goal was to develop an understanding of the reservoir sufficient to predict, in a given reservoir, optimum drilling locations versus high-risk locations for infill, outpost, or deeper-pool tests.

  4. Soil-vegetation interaction on slopes with shrub encroachment in the central Alps - simple measurements for complex slopes?

    Science.gov (United States)

    Caviezel, Chatrina; Hunziker, Matthias; Kuhn, Nikolaus J.

    2013-04-01

    In the European Alps, many high-mountain grasslands that were traditionally used for summer pasturing and haying have been abandoned during recent decades. Abandonment of mown or grazed grasslands causes a shift in vegetation composition and hence a change in landscape ecology and geomorphology. From a short-term perspective, alpine areas are very fragile ecosystems and are highly sensitive to changing environmental conditions. Land use change can affect runoff and water erosion rates, snow gliding and avalanches, as well as mass wasting in high-energy mountain environments. The effect of land use intensification on surface processes is well documented; however, the effect of land abandonment on surface resistance to eroding processes is discussed controversially in the literature, particularly in relation to its short-term and long-term consequences. Generally, perennial vegetation is considered to improve the mechanical anchoring of loose surface material and the regulation of the soil water budget, including control over the generation of runoff. This study aimed at determining the effect of green alder encroachment in the Unteralp valley in the Swiss Alps. Measurements of the mechanical strength of the soil were conducted under green alder stands ranging from 15 to 90 years of age and at a control site still used for grazing. Contrary to what the literature on the effects of perennial vegetation suggests, the data presented in this study show that soil shear strength decreases along the sampled chronosequence, including in comparison with the grazed reference site. A possible explanation for this decline in soil stability with shrub encroachment is the loosening effect of the green alder roots on the soil structure, which causes an increase in porosity and thus less friction between soil particles. As a consequence, rates of water erosion may decline with shrub encroachment, but the frequency of creeping and sliding may increase.

  5. Human brain mapping under increasing cognitive complexity using regional cerebral blood flow measurements and positron emission tomography.

    Science.gov (United States)

    Law, Ian

    2007-11-01

    Measurement of the regional cerebral blood flow (rCBF) is an important parameter in the evaluation of cerebral function. With positron emission tomography (PET), rCBF has predominantly been quantified using the short-lived radiotracer oxygen-15-labelled water (H₂¹⁵O) and an adaptation of the Kety one-tissue-compartment autoradiographic model. The values attained in putative grey matter, however, are systematically underestimated because of the limited scanner resolution. For this reason we applied a dynamic kinetic two-tissue-compartment model including a fast and a slow flow component, each with a perfusable tissue fraction. In the fast component, rCBF was 2-2.5 times greater than grey-matter values obtained with traditional autoradiography in both human and monkey. Visual stimulation in humans gave a corrected rCBF increase of approximately 40%. Visual stimulation was also used to indirectly validate carbon-10-labelled carbon dioxide (¹⁰CO₂), a new very short-lived rCBF PET tracer with a half-life of only 19.3 seconds. This allowed an increase in the number of independent PET scans per subject from 12-14 using H₂¹⁵O to 64 using ¹⁰CO₂. The experiment demonstrated a maximal activation response in the visual cortex at a 10-15 Hz stimulation frequency. The use of the rCBF PET mapping technique is illustrated by studies of the organization of language and the oculomotor system. With respect to the former, we found confirmation of neuropsychological evidence of the involvement of the left supramarginal/angular gyrus in reading Kana, a phonologically based Japanese script system, and of the left posterior inferior temporal gyrus in reading Kanji, a morphogram-based script system. Concerning the organization of the oculomotor system, we found overlapping areas in fronto-parietal cortex involved in maintaining visual fixation and in performing visually guided and imagined eye movements. These data show that overt eye movements are not a prerequisite of the

  6. Design of measuring machine for complex geometry based on form-free measurement mode

    Institute of Scientific and Technical Information of China (English)

    石照耀; 张斌; 林家春

    2012-01-01

    To measure complex geometries without nominal mathematical models, a "form-free measurement mode" is introduced and its basic requirements for a measuring machine are analyzed. A fixed-column-structure coordinate measuring machine with high accuracy and efficiency was designed based on the novel mode. The machine is driven by linear motors, and high-accuracy gratings are used as measurement devices. To decrease the influence of work-piece weight, a closed aerostatic bearing with vacuum preload and an H-style two-dimensional co-planar structure are designed. A pneumatic-cylinder-balanced Z-axis assembly with a brake function is designed, along with a vibration isolation assembly. The measurement span of the machine is 300 mm × 300 mm × 300 mm and the measurement uncertainty is 1.8 μm. It can be applied to measure complex geometries without nominal mathematical models.

  7. Electromyography Analysis Based on the Entropy Theory and Complexity Measures

    Institute of Scientific and Technical Information of China (English)

    肖毅; 陈善广; 王春慧

    2009-01-01

    Electromyography (EMG) finds more and more application areas, such as workload evaluation and robotic control; the key is feature analysis and extraction. Traditional signal-processing methods have limitations for EMG analysis and cannot adequately describe the complexity of the signal, so nonlinear methods based on entropy theory and complexity measures are increasingly applied to EMG and other physiological signals. These measures compute quickly, yield distinctive numerical features, and characterize signal complexity well. The entropy methods include various entropy algorithms, of which the Renyi entropy and wavelet entropy are typical. In this paper, EMG was recorded from the upper limbs of two volunteers performing push-up exercise, and the Renyi entropy, wavelet entropy and LZC complexity were calculated to characterize the complexity of the signal at different stages; the EMG was segmented to achieve better analysis results, validating both the use of these measures to partition the signal into distinct regions and the effectiveness of the method. The analysis shows that segments with large wavelet entropy correspond to high-energy regions of the EMG; physiologically, these are periods of concentrated muscle-fibre discharge. Regions where the EMG has a single component show low complexity, while larger Renyi entropy and complexity values correspond to more complex signal content, in good agreement with theory, and the three measures corroborate one another. These results indicate that the method is feasible for EMG analysis, that nonlinear methods may be a future direction for EMG and other physiological signals, and that the approach can also be applied to physical-fatigue evaluation.
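    The entropy and complexity measures named in the abstract can be sketched in a few lines. Below, a histogram-based Renyi entropy and a simple LZ78-style phrase-count complexity (stand-ins for the study's exact estimators, whose parameters are not given) are applied to a noisy and a periodic surrogate signal; the irregular signal scores higher on the phrase count.

```python
import numpy as np

def renyi_entropy(signal, alpha=2, bins=16):
    """Renyi entropy of order alpha from a histogram estimate of the
    amplitude distribution (the bin count is an illustrative choice)."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / len(signal)
    if alpha == 1:  # Shannon limit
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1 - alpha))

def lz_complexity(bits):
    """LZ78-style complexity: number of distinct phrases produced by a
    greedy left-to-right parse of the binary sequence."""
    s = ''.join(map(str, bits))
    phrases, i, k = set(), 0, 1
    while i + k <= len(s):
        phrase = s[i:i + k]
        if phrase in phrases:
            k += 1          # extend until the phrase is new
        else:
            phrases.add(phrase)
            i += k          # start the next phrase
            k = 1
    return len(phrases)

rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)                 # irregular surrogate signal
tone = np.sin(np.linspace(0, 40 * np.pi, 2048))   # simple periodic signal
for sig in (noise, tone):
    bits = (sig > np.median(sig)).astype(int)     # median binarization
    print(renyi_entropy(sig), lz_complexity(bits))
```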

  8. Multivariate curve resolution based chromatographic peak alignment combined with parallel factor analysis to exploit second-order advantage in complex chromatographic measurements.

    Science.gov (United States)

    Parastar, Hadi; Akvan, Nadia

    2014-03-13

    In the present contribution, a new combination of multivariate curve resolution-correlation optimized warping (MCR-COW) with trilinear parallel factor analysis (PARAFAC) is developed to exploit second-order advantage in complex chromatographic measurements. In MCR-COW, the complexity of the chromatographic data is reduced by arranging the data in a column-wise augmented matrix, analyzing using MCR bilinear model and aligning the resolved elution profiles using COW in a component-wise manner. The aligned chromatographic data is then decomposed using trilinear model of PARAFAC in order to exploit pure chromatographic and spectroscopic information. The performance of this strategy is evaluated using simulated and real high-performance liquid chromatography-diode array detection (HPLC-DAD) datasets. The obtained results showed that the MCR-COW can efficiently correct elution time shifts of target compounds that are completely overlapped by coeluted interferences in complex chromatographic data. In addition, the PARAFAC analysis of aligned chromatographic data has the advantage of unique decomposition of overlapped chromatographic peaks to identify and quantify the target compounds in the presence of interferences. Finally, to confirm the reliability of the proposed strategy, the performance of the MCR-COW-PARAFAC is compared with the frequently used methods of PARAFAC, COW-PARAFAC, multivariate curve resolution-alternating least squares (MCR-ALS), and MCR-COW-MCR. In general, in most cases the MCR-COW-PARAFAC showed an improvement in terms of lack of fit (LOF), relative error (RE) and spectral correlation coefficients in comparison to the PARAFAC, COW-PARAFAC, MCR-ALS and MCR-COW-MCR results.
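    The column-wise augmentation step described above can be sketched with numpy: the individual runs are stacked along the elution-time axis so that all of them share a single spectral (column) space, which is what lets one bilinear MCR model describe them jointly. The run sizes below are mock values.

```python
import numpy as np

# Three mock HPLC-DAD runs: rows = elution times, columns = wavelengths.
rng = np.random.default_rng(42)
runs = [rng.random((50, 30)) for _ in range(3)]

# Column-wise augmentation for the bilinear MCR model: stack runs along
# the elution-time axis so all of them share one spectral dimension.
augmented = np.vstack(runs)
print(augmented.shape)  # (150, 30)
```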

  9. WIND VELOCITIES AND SAND FLUXES IN MESQUITE DUNE-LANDS IN THE NORTHERN CHIHUAHUAN DESERT: A COMPARISON BETWEEN FIELD MEASUREMENTS AND THE QUIC (QUICK URBAN AND INDUSTRIAL COMPLEX) MODEL

    Science.gov (United States)

    The poster shows comparisons of wind velocities and sand fluxes between field measurements and a computer model, called QUIC (Quick Urban & Industrial Complex). The comparisons were made for a small desert region in New Mexico.

  10. NET formation induced by Pseudomonas aeruginosa cystic fibrosis isolates measured as release of myeloperoxidase-DNA and neutrophil elastase-DNA complexes.

    Science.gov (United States)

    Yoo, Dae-goon; Floyd, Madison; Winn, Matthew; Moskowitz, Samuel M; Rada, Balázs

    2014-08-01

    Cystic fibrosis (CF) airway disease is characterized by Pseudomonas aeruginosa infection and recruitment of neutrophil granulocytes. Neutrophil granule components (myeloperoxidase (MPO), human neutrophil elastase (HNE)), extracellular DNA and P. aeruginosa can all be found in the CF respiratory tract and have all been associated with worsening CF lung function. Pseudomonas-induced formation of neutrophil extracellular traps (NETs) offers a likely mechanism for release of MPO, HNE and DNA from neutrophils. NETs are composed of a DNA backbone decorated with granule proteins like MPO and HNE. Here we sought to examine whether CF clinical isolates of Pseudomonas are capable of inducing NET release from human neutrophil granulocytes. We used two methods to quantify NETs. We modified a previously employed ELISA that detects MPO-DNA complexes and established a new HNE-DNA ELISA. We show that these methods reliably quantify MPO-DNA and HNE-DNA complexes, measures of NET formation. We have found that CF isolates of P. aeruginosa stimulate robust respiratory burst and NET release in human neutrophils. By comparing paired "early" and "late" bacterial isolates obtained from the same CF patient we have found that early isolates induced significantly more NET release than late isolates. Our data support that Pseudomonas-induced NET release represents an important mechanism for release of neutrophil-derived CF inflammatory mediators, and confirm that decreased induction of NET formation is required for long-term adaptation of P. aeruginosa to CF airways.

  11. Complexity and Dynamical Depth

    OpenAIRE

    Terrence Deacon; Spyridon Koutroufinis

    2014-01-01

    We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information theoretic, or thermodynamic conceptions of structural complexity. What we call a system’s dynamical depth is a separate dimension of system c...

  12. Na+ transport in gram-positive bacteria defective in the Mrp antiporter complex measured with 23Na nuclear magnetic resonance.

    Science.gov (United States)

    Górecki, Kamil; Hägerhäll, Cecilia; Drakenberg, Torbjörn

    2014-01-15

    (23)Na nuclear magnetic resonance (NMR) has previously been used to monitor Na(+) translocation across membranes in gram-negative bacteria and in various other organelles and liposomes using a membrane-impermeable shift reagent to resolve the signals resulting from internal and external Na(+). In this work, the (23)Na NMR method was adapted for measurements of internal Na(+) concentration in the gram-positive bacterium Bacillus subtilis, with the aim of assessing the Na(+) translocation activity of the Mrp (multiple resistance and pH) antiporter complex, a member of the cation proton antiporter-3 (CPA-3) family. The sodium-sensitive growth phenotype observed in a B. subtilis strain with the gene encoding MrpA deleted could indeed be correlated to the inability of this strain to maintain a lower internal Na(+) concentration than an external one. PMID:24139955

  13. Direct sun and airborne MAX-DOAS measurements of the collision induced oxygen complex, O2O2 absorption with significant pressure and temperature differences

    Directory of Open Access Journals (Sweden)

    E. Spinei

    2014-09-01

    The collision induced O2 complex, O2O2, is a very important trace gas in remote sensing measurements of aerosol and cloud properties. Some ground based MAX-DOAS measurements of O2O2 slant column density require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a near pure Rayleigh atmosphere. One of the potential causes of this discrepancy is believed to be uncertainty in laboratory measured O2O2 absorption cross section temperature and pressure dependence, due to difficulties in replicating atmospheric conditions in the laboratory environment. This paper presents direct-sun (DS) and airborne multi-axis (AMAX) DOAS measurements of O2O2 absorption optical depths under actual Earth atmospheric conditions in two wavelength regions (335–390 nm and 435–490 nm). DS irradiance measurements were made by the research grade MFDOAS instrument from 2007–2014 at seven sites with significant pressure (778–1013 hPa) and O2O2 profile weighted temperature (247–275 K) differences. Aircraft MAX-DOAS measurements were conducted by the University of Colorado AMAX-DOAS instrument on 29 January 2012 over the Southern Hemisphere subtropical Pacific Ocean. Scattered solar radiance spectra were collected at altitudes between 9 and 13.2 km, with O2O2 profile weighted temperatures of 231–244 K, and near pure Rayleigh scattering conditions. Due to the well defined DS air mass factors and extensively characterized atmospheric conditions during the AMAX-DOAS measurements, O2O2 "pseudo" absorption cross sections, σ, are derived from the observed optical depths and estimated O2O2 column densities. Vertical O2O2 columns are calculated from the atmospheric sounding temperature, pressure and specific humidity profiles. Based on the atmospheric DS observations, there is no pressure dependence of the O2O2 σ, within the measurement errors (3%). The two data sets are combined to derive peak σ temperature dependence of 360 and 477 nm

  14. Direct sun and airborne MAX-DOAS measurements of the collision induced oxygen complex, O2O2 absorption with significant pressure and temperature differences

    Science.gov (United States)

    Spinei, E.; Cede, A.; Herman, J.; Mount, G. H.; Eloranta, E.; Morley, B.; Baidar, S.; Dix, B.; Ortega, I.; Koenig, T.; Volkamer, R.

    2014-09-01

    The collision induced O2 complex, O2O2, is a very important trace gas in remote sensing measurements of aerosol and cloud properties. Some ground based MAX-DOAS measurements of O2O2 slant column density require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a near pure Rayleigh atmosphere. One of the potential causes of this discrepancy is believed to be uncertainty in laboratory measured O2O2 absorption cross section temperature and pressure dependence, due to difficulties in replicating atmospheric conditions in the laboratory environment. This paper presents direct-sun (DS) and airborne multi-axis (AMAX) DOAS measurements of O2O2 absorption optical depths under actual Earth atmospheric conditions in two wavelength regions (335-390 nm and 435-490 nm). DS irradiance measurements were made by the research grade MFDOAS instrument from 2007-2014 at seven sites with significant pressure (778-1013 hPa) and O2O2 profile weighted temperature (247-275 K) differences. Aircraft MAX-DOAS measurements were conducted by the University of Colorado AMAX-DOAS instrument on 29 January 2012 over the Southern Hemisphere subtropical Pacific Ocean. Scattered solar radiance spectra were collected at altitudes between 9 and 13.2 km, with O2O2 profile weighted temperatures of 231-244 K, and near pure Rayleigh scattering conditions. Due to the well defined DS air mass factors and extensively characterized atmospheric conditions during the AMAX-DOAS measurements, O2O2 "pseudo" absorption cross sections, σ, are derived from the observed optical depths and estimated O2O2 column densities. Vertical O2O2 columns are calculated from the atmospheric sounding temperature, pressure and specific humidity profiles. Based on the atmospheric DS observations, there is no pressure dependence of the O2O2 σ, within the measurement errors (3%). The two data sets are combined to derive peak σ temperature dependence of 360 and 477 nm absorption bands from 231

  15. A system for measuring complex dielectric properties of thin films at submillimeter wavelengths using an open hemispherical cavity and a vector network analyzer

    Science.gov (United States)

    Rahman, Rezwanur; Taylor, P. C.; Scales, John A.

    2013-08-01

    Quasi-optical (QO) methods of dielectric spectroscopy are well established in the millimeter and submillimeter frequency bands. These methods exploit standing wave structure in the sample produced by a transmitted Gaussian beam to achieve accurate, low-noise measurement of the complex permittivity of the sample [e.g., J. A. Scales and M. Batzle, Appl. Phys. Lett. 88, 062906 (2006), doi:10.1063/1.2172403; R. N. Clarke and C. B. Rosenberg, J. Phys. E 15, 9 (1982), doi:10.1088/0022-3735/15/1/002; T. M. Hirovnen, P. Vainikainen, A. Lozowski, and A. V. Raisanen, IEEE Trans. Instrum. Meas. 45, 780 (1996), doi:10.1109/19.516996]. In effect the sample itself becomes a low-Q cavity. On the other hand, for optically thin samples (films of thickness much less than a wavelength) or extremely low loss samples (loss tangents below 10^-5) the QO approach tends to break down due to loss of signal. In such a case it is useful to put the sample in a high-Q cavity and measure the perturbation of the cavity modes. Provided that the average mode frequency divided by the shift in mode frequency is less than the Q (quality factor) of the mode, the perturbation should be resolvable. Cavity perturbation techniques are not new, but there are technological difficulties in working in the millimeter/submillimeter wave region. In this paper we show applications of cavity perturbation to the dielectric characterization of semiconductor thin films of the type used in the manufacture of photovoltaics in the 100 and 350 GHz range. We measured the complex optical constants of a hot-wire chemical-vapor-deposition grown 1-μm thick amorphous silicon (a-Si:H) film on a borosilicate glass substrate. The real part of the refractive index and dielectric constant of the glass substrate varies from frequency-independent to linearly frequency-dependent. We also see power-law behavior of the frequency-dependent optical conductivity from 316 GHz (9.48 cm^-1) down to 104 GHz (3.12 cm^-1).
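    The resolvability criterion stated in the abstract (mode frequency over frequency shift less than Q) can be checked directly. The 300 GHz mode, Q value, and shifts below are illustrative numbers, not measurements from the paper.

```python
def shift_resolvable(f0_hz, df_hz, q):
    """Cavity-perturbation resolvability check: the mode shift is
    resolvable when (mode frequency / frequency shift) is below Q."""
    return f0_hz / df_hz < q

# Illustrative numbers: a 300 GHz mode with Q = 1e5 resolves shifts
# larger than about 3 MHz.
print(shift_resolvable(300e9, 5e6, 1e5))  # True
print(shift_resolvable(300e9, 1e6, 1e5))  # False
```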

  16. Response of the Cu(II) ion selective electrode to Cu titration in artificial and natural shore seawater and in the measurement of the Cu complexation capacity.

    Science.gov (United States)

    Rivera-Duarte, Ignacio; Zirino, Alberto

    2004-06-01

    The Orion 94-29 Cu(II) jalpaite ion selective electrode (Cu-ISE) was used to measure both the concentration of the aqueous free Cu(II) ion ([Cu(II)aq]) and its changes due to additions of Cu, in artificial seawater (ASW) and in seawater from San Diego Bay, CA. The range of free copper ion (i.e., pCu, -log [Cu(II)aq]) determined in seawater samples from the San Diego Bay area (11.3-12.6, 11.9 +/- 0.4, average +/- SD) is consistent with that previously reported for estuarine and coastal areas (10.9-14.1). The changes in [Cu(II)aq] as a result of the additions of Cu were used to determine the Cu complexation capacity (Cu-CC), which has a measured range (2.7 x 10(-8)-2.0 x 10(-7) M; 7.6 x 10(-8) +/- 4.8 x 10(-8) M) comparable to the range of values previously reported for estuarine and coastal zones (i.e., L1+L2, 1.1 x 10(-8)-2.0 x 10(-7) M). The narrow range of pCu at the Cu-CC (pCuCu-CC, 11.1-11.9, 11.5 +/- 0.2) indicates the predominant role of the Cu-CC in regulating the concentration of ambient Cu(II)aq to a level ≤1 x 10(-11) M Cu(II)aq. These results attest to the capability of the Cu-ISE to measure pCu and Cu-CC in aquatic coastal environments with relatively high total Cu concentrations and organic loads, such as those from heavily used coasts and bays.
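    pCu is simply the negative decadic logarithm of the free Cu(II) concentration, so the ambient level quoted above, at most 1 x 10(-11) M, corresponds to pCu of at least 11. A one-line sketch:

```python
import math

def pCu(cu_free_molar):
    """pCu: negative base-10 logarithm of the free Cu(II) concentration."""
    return -math.log10(cu_free_molar)

print(pCu(1e-11))  # 11.0, the boundary value quoted in the abstract
print(pCu(5e-12))  # a lower free-Cu level gives a higher pCu
```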

  17. Environmental Assessment and Finding of No Significant Impact: Interim Measures for the Mixed Waste Management Facility Groundwater at the Burial Ground Complex at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    N/A

    1999-12-08

    The U.S. Department of Energy (DOE) prepared this environmental assessment (EA) to analyze the potential environmental impacts associated with the proposed interim measures for the Mixed Waste Management Facility (MWMF) groundwater at the Burial Ground Complex (BGC) at the Savannah River Site (SRS), located near Aiken, South Carolina. DOE proposes to install a small metal sheet pile dam to impound water around and over the BGC groundwater seepline. In addition, a drip irrigation system would be installed. Interim measures will also address the reduction of volatile organic compounds (VOCs) from "hot-spot" regions associated with the Southwest Plume Area (SWPA). This action is taken as an interim measure for the MWMF in cooperation with the South Carolina Department of Health and Environmental Control (SCDHEC) to reduce the amount of tritium seeping from the BGC southwest groundwater plume. The proposed action of this EA is being planned and would be implemented concurrently with a groundwater corrective action program under the Resource Conservation and Recovery Act (RCRA). On September 30, 1999, SCDHEC issued a modification to the SRS RCRA Part B permit that adds corrective action requirements for four plumes that are currently emanating from the BGC. One of those plumes is the southwest plume. The RCRA permit requires SRS to submit a corrective action plan (CAP) for the southwest plume by March 2000. The permit requires that the initial phase of the CAP prescribe a remedy that achieves a 70-percent reduction in the annual amount of tritium being released from the southwest plume area to Fourmile Branch, a nearby stream. Approval and actual implementation of the corrective measure in that CAP may take several years. As an interim measure, the actions described in this EA would manage the release of tritium from the southwest plume area until the final actions under the CAP can be implemented. This proposed action is expected to reduce the

  18. Temporal linear mode complexity as a surrogate measure of the effect of remifentanil on the central nervous system in healthy volunteers

    Science.gov (United States)

    Choi, Byung-Moon; Shin, Da-Huin; Noh, Moon-Ho; Kim, Young-Hac; Jeong, Yong-Bo; Lee, Soo-Han; Lee, Eun-Kyung; Noh, Gyu-Jeong

    2011-01-01

    AIMS Previously, electroencephalographic approximate entropy (ApEn) effectively described both depression of central nervous system (CNS) activity and rebound during and after remifentanil infusion. ApEn is heavily dependent on the record length. Linear mode complexity, which is algorithmically independent of the record length, was investigated to characterize the effect of remifentanil on the CNS using the combined effect and tolerance, feedback and sigmoid Emax models. METHODS The remifentanil blood concentrations and electroencephalographic data obtained in our previous study were used. With the recording of the electroencephalogram, remifentanil was infused at a rate of 1, 2, 3, 4, 5, 6, 7 or 8 µg kg−1 min−1 for 15–20 min. The areas below (AUCeffect) or above (AACrebound) the effect vs. time curve of temporal linear mode complexity (TLMC) and ApEn were calculated to quantitate the decrease of CNS activity and rebound. The coefficients of variation (CV) of median baseline (E0), maximal (Emax), and individual median E0 minus Emax values of TLMC were compared with those of ApEn. The concentration–TLMC relationship was characterized by population analysis using non-linear mixed effects modelling. RESULTS Median AUCeffect and AACrebound were 1016 and 5.3 (TLMC), 787 and 4.5 (ApEn). The CVs of individual median E0 minus Emax were 35.6 and 32.5% (TLMC, ApEn). The combined effect and tolerance model demonstrated the lowest Akaike information criterion value and the highest positive predictive value of rebound in tolerance. CONCLUSIONS The combined effect and tolerance model effectively characterized the time course of TLMC as a surrogate measure of the effect of remifentanil on the CNS. PMID:21223358
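    Approximate entropy, the record-length-dependent measure that TLMC is compared against, can be sketched as follows. The tolerance convention r = 0.2 x SD and the toy signals are common defaults assumed here, not the study's settings.

```python
import numpy as np

def apen(x, m=2, r_frac=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series, using the common
    convention r = r_frac * std(x) (an assumption, not the paper's value)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])  # template vectors
        # Chebyshev distance between all pairs of templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)                       # match ratios (self-match included)
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
print(apen(rng.standard_normal(300)))                 # irregular: larger ApEn
print(apen(np.sin(np.linspace(0, 12 * np.pi, 300))))  # regular: smaller ApEn
```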

  19. Considering sampling strategy and cross-section complexity for estimating the uncertainty of discharge measurements using the velocity-area method

    Science.gov (United States)

    Despax, Aurélien; Perret, Christian; Garçon, Rémy; Hauet, Alexandre; Belleville, Arnaud; Le Coz, Jérôme; Favre, Anne-Catherine

    2016-02-01

    uncertainty component for any routine gauging, the four most similar gaugings among the reference stream-gaugings dataset are selected using an analog approach, where analogy includes both riverbed shape and flow distribution complexity. This new method was applied to 3185 stream-gaugings with various flow conditions and compared with the other methods (ISO 748, IVE, and Q+ with a simple automated parametrization). Results show that FLAURE is overall consistent with the Q+ method but not with the ISO 748 and IVE methods, which produce clearly overestimated uncertainties for discharge measurements with fewer than 15 verticals. The FLAURE approach therefore appears to be a consistent method. An advantage is the explicit link made between the estimation of cross-sectional interpolation errors and the study of high-resolution reference gaugings.
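    The velocity-area method whose uncertainty is being estimated can be illustrated with the textbook mid-section formula: each vertical contributes its point velocity times its depth times half the distance between its neighbouring verticals. The gauging data below are hypothetical.

```python
def midsection_discharge(positions, depths, velocities):
    """Mid-section velocity-area discharge estimate: vertical i contributes
    v_i * d_i * (half the distance between its neighbouring verticals).
    Shown only to illustrate the method the uncertainty analysis targets."""
    q = 0.0
    n = len(positions)
    for i in range(n):
        left = positions[max(i - 1, 0)]
        right = positions[min(i + 1, n - 1)]
        width = (right - left) / 2.0
        q += velocities[i] * depths[i] * width
    return q

# Hypothetical gauging: 5 verticals across an 8 m wide section.
x = [0.0, 2.0, 4.0, 6.0, 8.0]   # positions across the section, m
d = [0.0, 1.2, 1.8, 1.1, 0.0]   # depths, m
v = [0.0, 0.6, 0.9, 0.5, 0.0]   # mean velocities at each vertical, m/s
print(midsection_discharge(x, d, v))  # discharge in m^3/s
```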

  20. QUIJOTE Scientific Results. II. Polarisation Measurements of the Microwave Emission in the Galactic molecular complexes W43 and W47 and supernova remnant W44

    CERN Document Server

    Génova-Santos, R; Peláez-Santos, A; Poidevin, F; Rebolo, R; Vignaga, R; Artal, E; Harper, S; Hoyland, R; Lasenby, A; Martínez-González, E; Piccirillo, L; Tramonte, D; Watson, R A

    2016-01-01

    We present Q-U-I JOint TEnerife (QUIJOTE) intensity and polarisation maps at 10-20 GHz covering a region along the Galactic plane 24° ≲ l ≲ 45°, |b| ≲ 8°. Our intensity data are crucial to confirm the presence of anomalous microwave emission (AME) towards the two molecular complexes W43 (22-sigma) and W47 (8-sigma). We also detect at high significance (6-sigma) AME associated with W44, the first clear detection of this emission towards a SNR. The new QUIJOTE polarisation data, in combination with WMAP, are essential to: i) Determine the spectral index of the synchrotron emission in W44, beta_sync=-0.62+/-0.03, in good agreement with the value inferred from the intensity spectrum once a free-free component is included in the fit. ii) Trace the change in the polarisation angle associated with Faraday rotation in the direction of W44 with rotation measure -404+/-49 rad/m2. And iii)...

  1. Direct sun and airborne MAX-DOAS measurements of the collision induced oxygen complex, O2O2 absorption with significant pressure and temperature differences

    OpenAIRE

    E. Spinei; A. Cede; Herman, J.; G. H. Mount; Eloranta, E; B. Morley; S. Baidar; Dix, B.; I. Ortega; Koenig, T; Volkamer, R.

    2014-01-01

    The collision induced O2 complex, O2O2, is a very important trace gas in remote sensing measurements of aerosol and cloud properties. Some ground based MAX-DOAS measurements of O2O2 slant column density require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a near pure Rayleigh atmosphere. One of the potential causes of this discrepancy is believed to be uncertainty in laboratory measured ...

  2. Modelling Complexity in Musical Rhythm

    OpenAIRE

    Liou, Cheng-Yuan; Wu, Tai-Hei; Lee, Chia-Ying

    2007-01-01

    This paper constructs a tree structure for musical rhythm using the L-system. It models the structure as an automaton and derives its complexity. It also solves the complexity for the L-system. This complexity can resolve the similarity between trees and serves as a measure of psychological complexity for rhythms. It resolves the music complexity of various compositions, including the Mozart effect K488. Keywords: music perception, psychological complexity, rhythm, L-system, autom...

  3. Complex Beauty

    OpenAIRE

    Franceschet, Massimo

    2014-01-01

    Complex systems and their underlying convoluted networks are ubiquitous; all we need is an eye for them. They pose problems of organized complexity which cannot be approached with a reductionist method. Complexity science and its emergent sister, network science, both come to grips with the inherent complexity of complex systems with a holistic strategy. The relevance of complexity, however, transcends the sciences. Complex systems and networks are the focal point of a philosophical, cultural ...

  4. Thin film Z-scan measurements of the nonlinear response of novel conjugated silicon-ethynylene polymers and metal-containing complexes incorporated into polymeric matrices

    Science.gov (United States)

    Douglas, William E.; Klapshina, Larisa G.; Rubinov, Anatoly N.; Domrachev, George A.; Bushuk, Boris A.; Antipov, Oleg L.; Semenov, Vladimir V.; Kuzhelev, Alexander S.; Bushuk, Sergey B.; Kalvinkovskaya, Julia A.

    2000-11-01

    The third-order optical nonlinearities of new conjugated poly[(arylene)(ethynylene)silylene]s, and of a variety of chromium, neodymium or cobalt complexes incorporated into polymeric matrices as thin sol-gel or polyacrylonitrile films, have been determined using a single-beam Z-scan technique. The samples were pumped by a single ultrashort pulse of a mode-locked Nd-phosphate glass laser (wavelength 1054 nm) with a 5 ps pulse duration (full width at half-maximum), the repetition rate of the Gaussian beam being low (0.3 Hz) to avoid thermal effects. The spot radius of the focused pulse was ca. 60 μm, with the beam waist in the sample (intensity up to 4 x 10^13 W m^-2). Calibration was done with chloroform and benzene, the value of n2 for the latter (2 x 10^-12 esu) being similar to that previously reported. A small-aperture Z-scan (S = 0.03) was used to measure the magnitude and the sign of the nonlinear refractive index, n2. Very high nonlinear refractive indices were found for (a) a film containing a poly[(arylene)(ethynylene)silylene] with pentacoordinated silicon (c 5 g l^-1) in a sol-gel matrix (n2 = 6 x 10^-13 cm^2 W^-1), (b) a film containing a poly[(arylene)(ethynylene)silylene] with tetracoordinated silicon (c 0.5 g l^-1) and a very small proportion of fullerene-C70 incorporated into an NH2-containing sol-gel matrix (n2 = 5 x 10^-13 cm^2 W^-1), and (c) a thin polyacrylonitrile film of polycyanoethylate bis-arenechromium(I) hydroxide (n2 = -5 x 10^-12 cm^2 W^-1).

  5. QUIJOTE Scientific Results. II. Polarisation Measurements of the Microwave Emission in the Galactic molecular complexes W43 and W47 and supernova remnant W44

    Science.gov (United States)

    Génova-Santos, R.; Rubiño-Martín, J. A.; Peláez-Santos, A.; Poidevin, F.; Rebolo, R.; Vignaga, R.; Artal, E.; Harper, S.; Hoyland, R.; Lasenby, A.; Martínez-González, E.; Piccirillo, L.; Tramonte, D.; Watson, R. A.

    2016-10-01

    We present Q-U-I JOint TEnerife (QUIJOTE) intensity and polarisation maps at 10 - 20 GHz covering a region along the Galactic plane 24° ≲ l ≲ 45°, |b| ≲ 8°. These maps result from 210 h of data, have a sensitivity in polarisation of ≈40 μK beam^-1 and an angular resolution of ≈1°. Our intensity data are crucial to confirm the presence of anomalous microwave emission (AME) towards the two molecular complexes W43 (22σ) and W47 (8σ). We also detect at high significance (6σ) AME associated with W44, the first clear detection of this emission towards a SNR. The new QUIJOTE polarisation data, in combination with WMAP, are essential to: i) Determine the spectral index of the synchrotron emission in W44, βsync = -0.62 ± 0.03, in good agreement with the value inferred from the intensity spectrum once a free-free component is included in the fit. ii) Trace the change in the polarisation angle associated with Faraday rotation in the direction of W44 with rotation measure -404 ± 49 rad m^-2. And iii) set upper limits on the polarisation of W43 of ΠAME < 0.39 per cent (95 per cent C.L.) from QUIJOTE 17 GHz, and <0.22 per cent from WMAP 41 GHz data, which are the most stringent constraints ever obtained on the polarisation fraction of the AME. For typical physical conditions (grain temperature and magnetic field strengths), and in the case of perfect alignment between the grains and the magnetic field, the models of electric or magnetic dipole emissions predict higher polarisation fractions.
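    The Faraday-rotation result can be checked with the standard relation chi = RM x lambda^2: with the fitted rotation measure of -404 rad m^-2, the polarisation angle swings by roughly 17° at 11 GHz but only about 1° at 41 GHz. The frequencies below are chosen for illustration.

```python
import math

C = 299792458.0  # speed of light, m/s

def faraday_rotation_deg(rm_rad_m2, freq_hz):
    """Polarisation-angle rotation chi = RM * lambda^2 (relative to the
    infinite-frequency angle), converted to degrees."""
    lam = C / freq_hz
    return math.degrees(rm_rad_m2 * lam ** 2)

for f_ghz in (11.0, 17.0, 41.0):
    print(f_ghz, faraday_rotation_deg(-404.0, f_ghz * 1e9))
```

    The strong lambda-squared dependence is why the low QUIJOTE frequencies are the sensitive ones for tracing this rotation.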

  6. The measurement of complex network based on motif

    Institute of Scientific and Technical Information of China (English)

    韩华; 刘婉璐; 吴翎燕

    2013-01-01

    According to the existence of motifs in complex network topology, motif-based node degree and edge degree are proposed, on the basis of the traditional node degree and edge clustering coefficient, to measure the importance of nodes and edges in a network. The Rand-ESU algorithm is used for motif detection in eight networks of different scales, and the results demonstrate the existence of motifs. The Rand-ESU algorithm is also used to analyze the motif structures and characteristics in the Karate and Dolphin networks. The Pearson correlation coefficient is used to measure the correlation between motif-based node degree and traditional node degree, and between motif-based edge degree and the edge clustering coefficient. Simulation results show that these correlations depend on the motif species. The definitions of motif-based node degree and edge degree are an improvement and extension of the original definitions, and they more comprehensively characterize the importance of nodes and edges in a network.
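    A minimal version of a motif-based node degree can be built with the triangle as the motif (the simplest stand-in; the paper's actual motif set is not specified here), together with its Pearson correlation against the ordinary degree:

```python
import numpy as np

def motif_node_degree(adj):
    """Triangle-motif participation per node: diag(A^3)/2 for an
    undirected 0/1 adjacency matrix (each closed 3-walk traverses a
    triangle in one of two directions, hence the /2)."""
    a = np.asarray(adj)
    return np.diagonal(a @ a @ a) / 2

# Small undirected toy graph: triangle 0-1-2 plus a pendant node 3.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]])
deg = A.sum(axis=1)               # traditional node degree
tri = motif_node_degree(A)        # motif-based (triangle) node degree
r = np.corrcoef(deg, tri)[0, 1]   # Pearson correlation of the two measures
print(deg, tri, r)
```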

  7. A New Complete Class Complexity Metric

    OpenAIRE

    Singh, Vinay; Bhattacherjee, Vandana

    2014-01-01

    Software complexity metrics are essential for minimizing the cost of software maintenance. Package-level and system-level complexity cannot be measured without class-level complexity. This research addresses class complexity metrics. This paper studies the existing class complexity metrics and proposes a new class complexity metric, CCC (Complete Class Complexity metric). The CCC metric is then analytically evaluated against Weyuker's properties.

  8. Multiscale Cross-Approximate Entropy Analysis as a Measurement of Complexity between ECG R-R Interval and PPG Pulse Amplitude Series among the Normal and Diabetic Subjects

    OpenAIRE

    Hsien-Tsai Wu; Chih-Yuan Lee; Cyuan-Cin Liu; An-Bang Liu

    2013-01-01

    Physiological signals often show complex fluctuation (CF) under the dual influence of temporal and spatial scales, and CF can be used to assess the health of physiologic systems in the human body. This study applied multiscale cross-approximate entropy (MC-ApEn) to quantify the complex fluctuation between R-R intervals series and photoplethysmography amplitude series. All subjects were then divided into the following two groups: healthy upper middle-aged subjects (Group 1, age range: 41–80 ye...

  9. Bucolic Complexes

    CERN Document Server

    Brešar, Bostjan; Chepoi, Victor; Gologranc, Tanja; Osajda, Damian

    2012-01-01

    In this article, we introduce and investigate bucolic complexes, a common generalization of systolic complexes and of CAT(0) cubical complexes. This class of complexes is closed under Cartesian products and amalgamations over some convex subcomplexes. We study various approaches to bucolic complexes: from graph-theoretic and topological viewpoints, as well as from the point of view of geometric group theory. Bucolic complexes can be defined as locally-finite simply connected prism complexes satisfying some local combinatorial conditions. We show that bucolic complexes are contractible, and satisfy some nonpositive-curvature-like properties. In particular, we prove a version of the Cartan-Hadamard theorem, the fixed point theorem for finite group actions, and establish some results on groups acting geometrically on such complexes. We also characterize the 1-skeletons (which we call bucolic graphs) and the 2-skeletons of bucolic complexes. In particular, we prove that bucolic graphs are precisely retracts of Ca...

  10. Synthesis and measurements of the optical bandgap of single crystalline complex metal oxide BaCuV2O7 nanowires by UV–VIS absorption

    International Nuclear Information System (INIS)

    Highlights: • Synthesis of single-crystalline complex metal oxide BaCuV2O7 nanowires. • Surfactant-free, economically favorable chemical solution deposition method. • Complex metal oxide nanowires with controlled stoichiometry. • Simply by controlling the temperature and thickness of the coated film, high-quality BaCuV2O7 nanowires are easily obtained. - Abstract: Single-crystalline complex metal oxide BaCuV2O7 nanowires were synthesized using a surfactant-free, economically favorable chemical solution deposition method. A thin layer of BaCuV2O7 nanocrystals is formed by decomposition of the complex metal oxide solution at 150 °C to provide nucleation sites for nanowire growth. The synthesized nanowires were typically 1–5 μm long with diameters from 50 to 150 nm. We showed that by simply controlling the temperature and thickness of the coated film, high-quality BaCuV2O7 nanowires are easily obtained. The UV–VIS absorption spectra show an indirect bandgap of 2.65 ± 0.05 eV for the nanowires. The temperature-dependent resistance of the BaCuV2O7 nanowires follows an exponential correlation, supporting that the conducting carriers are quasi-free electrons. We believe that our methodology provides a simple and convenient route for the synthesis of a variety of complex metal oxide nanowires with controlled stoichiometry

  11. How evolution guides complexity

    OpenAIRE

    LARRY S. YAEGER

    2009-01-01

    Long-standing debates about the role of natural selection in the growth of biological complexity over geological time scales are difficult to resolve from the paleobiological record. Using an evolutionary model—a computational ecosystem subjected to natural selection—we investigate evolutionary trends in an information-theoretic measure of the complexity of the neural dynamics of artificial agents inhabiting the model. Our results suggest that evolution always guides complexity change, just n...

  12. Complexity Through Nonextensivity

    CERN Document Server

    Bialek, W; Tishby, N; Bialek, William; Nemenman, Ilya; Tishby, Naftali

    2001-01-01

    The problem of defining and studying the complexity of a time series has interested people for years. In the context of dynamical systems, Grassberger has suggested that a slow approach of the entropy to its extensive asymptotic limit is a sign of complexity. We investigate this idea further by information-theoretic and statistical mechanics techniques and show that these arguments can be made precise, and that they generalize many previous approaches to complexity, in particular unifying ideas from the physics literature with ideas from learning and coding theory; there are even connections of this statistical approach to algorithmic or Kolmogorov complexity. Moreover, a set of simple axioms similar to those used by Shannon in his development of information theory allows us to prove that the divergent part of the subextensive component of the entropy is a unique complexity measure. We classify time series by their complexities and demonstrate that beyond the `logarithmic' complexity classes widely anticipated in...
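
    The decomposition the abstract refers to can be sketched as follows (a hedged reading in the usual predictive-information notation; the paper's exact definitions should be consulted):

```latex
S(N) = s_0 N + S_1(N), \qquad \lim_{N \to \infty} \frac{S_1(N)}{N} = 0
```

    Here S(N) is the entropy of N consecutive observations, s_0 N is the extensive part, and S_1(N) is the subextensive component. The claim is that the divergent part of S_1(N) (growing like log N for processes described by finitely many parameters, or like N^ζ with 0 < ζ < 1 for richer model classes) is the unique complexity measure, the logarithmic case corresponding to the `logarithmic' complexity classes mentioned above.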

  13. Treatment and reuse of acidic water from sulphuric acid production based on nickel and copper smelter off-gases

    Institute of Scientific and Technical Information of China (English)

    冯臻; 甘宪福; 董尚志; 王家蓉; 唐兴东

    2012-01-01

    Treatment and reuse of acidic water from sulphuric acid production based on nickel and copper smelter off-gases at Jinchuan Group are described. The acidic water from the nickel-sulphuric acid systems features a high suspended-solids content, so filtration and settling technologies were adopted. The acidic water from the copper-sulphuric acid systems has a much more complicated composition, so a two-stage technology was used: copper removal and initial arsenic removal with sodium sulphide, followed by deep arsenic removal with EX2000. The plants have been operating stably since start-up, with a copper removal efficiency of more than 98% and an arsenic removal efficiency of more than 94%. 70% of the acidic water from the 7 sulphuric acid plants is reused, and the wastewater discharge has been reduced from the original 550 m³/h to 157 m³/h, a marked reduction in emissions.

  14. Visual Complexity: A Review

    Science.gov (United States)

    Donderi, Don C.

    2006-01-01

    The idea of visual complexity, the history of its measurement, and its implications for behavior are reviewed, starting with structuralism and Gestalt psychology at the beginning of the 20th century and ending with visual complexity theory, perceptual learning theory, and neural circuit theory at the beginning of the 21st. Evidence is drawn from…

  15. Preparation and Durability of Concrete for Bridge Pile Foundation in Acidic Water Environment

    Institute of Scientific and Technical Information of China (English)

    刘江; 曾逢春; 蔡老虎; 王在喜; 邵然; 李北星

    2012-01-01

    Aiming at the preparation of bridge pile foundation concrete for acidic water environments, the effects of mineral admixtures, water-binder (w/b) ratio, and the pH value of the erosion environment on the acid resistance of concrete were studied. Cement was replaced by fly ash (FA) at dosages from 20% to 50%, or by ground granulated blast-furnace slag (GGBS) at dosages from 35% to 65%. The w/b ratios were 0.35, 0.39, and 0.43. Acidic environmental conditions were simulated using sulfuric acid solutions with pH values of 1, 2, and 4, and the corrosion process was monitored by measuring the compressive strength over a period of one year. The results showed that reducing the w/b ratio improved the acid resistance of the concrete, and that the improvement provided by a mineral admixture depends on its type and dosage and on the pH value of the simulated erosion solution. High-volume fly ash mixtures performed better than high-volume ground granulated blast-furnace slag mixtures.

  16. Development of Oceanic Core Complexes on the Mid-Atlantic Ridge at 13-14N: Deep-Towed Geophysical Measurements and Detailed Seafloor Sampling

    Science.gov (United States)

    Searle, R.; MacLeod, C.; Murton, B.; Mallows, C.; Casey, J.; Achenbach, K.; Unsworth, S.; Harris, M.

    2007-12-01

    The first scientific cruise of the research vessel James Cook, in March-April 2007, targeted the Mid-Atlantic Ridge at 13-14°N to investigate details of lithospheric generation and development in a low-magmatic setting. The overall objectives were to 1) investigate the 3D pattern of mantle upwelling and melt focusing; 2) study how plate accretion and separation mechanisms differ between magma-rich and magma-poor areas; and 3) test mechanisms of detachment faulting and extensional strain localisation in the lower crust and upper mantle. Smith et al. (Nature 2006) had shown this to be an area of widespread detachment faulting and formation of oceanic core complexes (OCC), and published bathymetry showed an extensive area of blocky rather than lineated topography, which elsewhere has correlated with areas of low effusive magmatism. We conducted a TOBI deep-towed geophysical survey over a 70 km length of ridge extending to magnetic chron C2n (1.9 Ma) on each flank. This included sidescan sonar, high-resolution bathymetry, and magnetic measurements on 13 E-W tracks spaced 3-6 km apart. The area includes 1 active, 1 dying, and 1 defunct OCC and borders well-lineated, apparently magmatically robust seafloor to the north. The geophysical survey was complemented by the recovery of 7 oriented and 18 unoriented core samples and 29 dredge samples, including some from a probable OCC south of the TOBI survey. Deep-towed sidescan, bathymetry and video show that the OCCs typically comprise a steeply outward-tilted volcanic ridge marking the breakaway (as suggested by Smith et al., 2006); a high, rugged central massif that is complexly deformed as a result of uplift and bending, and may be separated from the breakaway ridge by what we interpret as a late outward-dipping normal fault; and a smooth, corrugated surface that generally dips c. 20° towards the ridge axis at the termination but gradually rotates to horizontal or gently outward-dipping near its junction with the central massif. Older OCCs

  17. Spectroscopic and physical measurements on charge-transfer complexes: Interactions between norfloxacin and ciprofloxacin drugs with picric acid and 3,5-dinitrobenzoic acid acceptors

    Science.gov (United States)

    Refat, Moamen S.; Elfalaky, A.; Elesh, Eman

    2011-03-01

    Charge-transfer complexes formed between the norfloxacin (nor) or ciprofloxacin (cip) drugs as donors and picric acid (PA) and/or 3,5-dinitrobenzoic acid (DNB) as π-acceptors have been studied spectrophotometrically in methanol at room temperature. The results indicated the formation of CT complexes with a 1:1 donor:acceptor molar ratio at the maximum CT bands. The formation constant (KCT), molar extinction coefficient (εCT), standard free energy (ΔG°), oscillator strength (f), transition dipole moment (μ), resonance energy (RN) and ionization potential (ID) were estimated. IR, 1H NMR and UV-Vis techniques, elemental analyses (CHN) and TG-DTG investigations were used to characterize the structures of the charge-transfer complexes. They indicate that the CT interaction is associated with a proton migration from each acceptor to the nor or cip donor, followed by the appearance of an intermolecular hydrogen bond. In addition, an X-ray investigation was carried out to scrutinize the crystal structure of the resulting CT-complexes.
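
    A common spectrophotometric route to KCT and εCT for a 1:1 complex is the Benesi-Hildebrand linearization; the sketch below fits it by ordinary least squares on synthetic data. This is an illustration of the general technique, not the paper's exact fitting procedure, and the constants used are invented for the demo.

```python
def benesi_hildebrand(donor_conc, absorbance, acceptor_conc):
    """Least-squares fit of the Benesi-Hildebrand relation for a 1:1
    complex with the donor in large excess:
        [A]0 / Abs = 1/eps + 1/(K * eps * [D]0)
    Returns (K, eps): formation constant and molar extinction coefficient."""
    xs = [1.0 / d for d in donor_conc]
    ys = [acceptor_conc / a for a in absorbance]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    eps = 1.0 / intercept    # intercept = 1/eps
    K = intercept / slope    # slope = 1/(K*eps)
    return K, eps

# Synthetic data generated from the same relation (assumed values:
# K = 500 L/mol, eps = 2000 L/(mol*cm), [A]0 = 1e-4 M), so the fit
# recovers them exactly up to floating-point error.
K_true, eps_true, A0 = 500.0, 2000.0, 1e-4
D = [0.001, 0.002, 0.005, 0.01]
Abs = [A0 / (1.0 / eps_true + 1.0 / (K_true * eps_true * d)) for d in D]
K_fit, eps_fit = benesi_hildebrand(D, Abs, A0)
```

    With K and ε in hand, quantities such as ΔG° follow from standard relations (ΔG° = -RT ln K).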

  18. When physics is not "just physics": complexity science invites new measurement frames for exploring the physics of cognitive and biological development.

    Science.gov (United States)

    Kelty-Stephen, Damian; Dixon, James A

    2012-01-01

    The neurobiological sciences have struggled to resolve the physical foundations for biological and cognitive phenomena with a suspicion that biological and cognitive systems, capable of exhibiting and contributing to structure within themselves and through their contexts, are fundamentally distinct or autonomous from purely physical systems. Complexity science offers new physics-based approaches to explaining biological and cognitive phenomena. In response to controversy over whether complexity science might seek to "explain away" biology and cognition as "just physics," we propose that complexity science serves as an application of recent advances in physics to phenomena in biology and cognition without reducing or undermining the integrity of the phenomena to be explained. We highlight that physics is, like the neurobiological sciences, an evolving field and that the threat of reduction is overstated. We propose that distinctions between biological and cognitive systems from physical systems are pretheoretical and thus optional. We review our own work applying insights from post-classical physics regarding turbulence and fractal fluctuations to the problems of developing cognitive structure. Far from hoping to reduce biology and cognition to "nothing but" physics, we present our view that complexity science offers new explanatory frameworks for considering physical foundations of biological and cognitive phenomena.

  19. In Defense of Simulating Complex and Tragic Historical Episodes: A Measured Response to the Outcry over a New England Slavery Simulation

    Science.gov (United States)

    Wright-Maley, Cory

    2014-01-01

    A slavery simulation that took place as part of a field trip for students of a Hartford junior high academy led a father to file a human rights suit against the school district, and for one official to comment that simulations of complex and tragic human phenomena have "no place in an educational system." In light of these conclusions,…

  20. Measurements of Enthalpy Change of Reaction of Formation, Molar Heat Capacity and Constant-Volume Combustion Energy of Solid Complex Yb(Et2dtc)3(phen)

    Institute of Scientific and Technical Information of China (English)

    Song Weiming; Hu Qilin; Chang Xuan; Chen Sanping; Xie Gang; Gao Shengli

    2006-01-01

    A ternary solid complex Yb(Et2dtc)3(phen) was obtained from the reaction of hydrous ytterbium chloride with sodium diethyldithiocarbamate (NaEt2dtc) and 1,10-phenanthroline (o-phen·H2O) in absolute ethanol. The bonding characteristics of the complex were characterized by IR; the results show that Yb3+ bonds with two sulfur atoms of the Et2dtc ligands and two nitrogen atoms of the o-phen. The enthalpy change of the liquid-phase reaction of formation of the complex, ΔrHθm(l), was determined to be (-24.838±0.114) kJ·mol-1 at 298.15 K with an RD-496 III heat-conduction microcalorimeter. The enthalpy change of the solid-phase reaction of formation of the complex, ΔrHθm(s), was calculated to be (108.015±0.479) kJ·mol-1 on the basis of an appropriate thermochemical cycle. The thermodynamics of the liquid-phase reaction of formation of the complex was investigated by varying the temperature of the liquid-phase reaction. Fundamental parameters (the activation enthalpy ΔHθ≠, the activation entropy ΔSθ≠, the activation free energy ΔGθ≠, the apparent reaction rate constant k, the apparent activation energy E, the pre-exponential constant A, and the reaction order n) were obtained by combining the reaction thermodynamic and kinetic equations with the data from the thermokinetic experiments. At the same time, the molar heat capacity of the complex, cp,m, was determined to be (86.34±1.74) J·mol-1·K-1 with the same microcalorimeter. The constant-volume combustion energy of the complex, ΔcU, was determined to be (-17954.08±8.11) kJ·mol-1 with an RBC-II rotating-bomb calorimeter at 298.15 K. Its standard enthalpy of combustion, ΔcHθm, and standard enthalpy of formation, ΔfHθm, were calculated to be (-17973.29±8.11) kJ·mol-1 and (-770.36±9.02) kJ·mol-1, respectively.
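
    The step from constant-volume combustion energy to standard enthalpy of combustion is the standard ΔcH = ΔcU + Δn(gas)·RT conversion. A minimal sketch, using an illustrative Δn(gas) of -2 since the balanced combustion equation (and hence the paper's actual Δn) is not given in the abstract:

```python
R = 8.314462618e-3  # molar gas constant in kJ/(mol*K)

def combustion_enthalpy(delta_cU, delta_n_gas, T=298.15):
    """Constant-pressure enthalpy of combustion from constant-volume
    combustion energy: Delta_cH = Delta_cU + Delta_n(gas) * R * T.
    delta_n_gas is the change in moles of gaseous species in the
    balanced combustion reaction (assumed, not taken from the paper)."""
    return delta_cU + delta_n_gas * R * T

# Using the abstract's Delta_cU and an assumed Delta_n(gas) = -2:
dH = combustion_enthalpy(-17954.08, -2.0)  # kJ/mol, illustrative only
```

    With the correct Δn(gas) for the actual reaction, the same one-liner reproduces the paper's ΔcHθm; the standard enthalpy of formation then follows via Hess's law from the formation enthalpies of the combustion products.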

  1. Cadmium(II) complexes of cytosine

    International Nuclear Information System (INIS)

    Complexes of cadmium(II) with cytosine obtained from aqueous or physiological solutions at room temperature are reported. The complexes were characterized by spectroscopic, conductometric, 1H-NMR, and 13C-NMR measurements, and also by thermogravimetry. (Authors)

  2. Comparison of net CO2 fluxes measured with open- and closed-path infrared gas analyzers in an urban complex environment

    DEFF Research Database (Denmark)

    Järvi, L.; Mammarella, I.; Eugster, W.;

    2009-01-01

    Simultaneous eddy covariance (EC) measurements of CO2 fluxes with open-path and closed-path analyzers were made in an urban area of Helsinki, Finland, from July 2007 to June 2008. Our purpose was to study the differences between the two analyzers, the necessary correction procedures...... and their suitability for accurately measuring CO2 exchange in such a non-ideal landscape. In addition, this study examined the effect of open-path sensor heating on measured fluxes in urban terrain, and these results were compared with similar measurements made above a temperate beech forest in Denmark. The correlation...... between the two fluxes was good (R2 = 0.93) at the urban site, but during the measurement period the open-path net surface exchange (NSE) was 17% smaller than the closed-path NSE, indicating apparent additional uptake of CO2 by the open-path measurements. At both sites, sensor heating corrections evidently...

  3. Measurement of homonuclear three-bond J(HNHα) coupling constants in unlabeled peptides complexed with labeled proteins: Application to a decapeptide inhibitor bound to the proteinase domain of the NS3 protein of hepatitis C virus (HCV)

    Energy Technology Data Exchange (ETDEWEB)

    Cicero, Daniel O.; Barbato, Gaetano; Koch, Uwe; Ingallinella, Paolo; Bianchi, Elisabetta; Sambucini, Sonia; Neddermann, Petra; De Francesco, Raffaele; Pessi, Antonello; Bazzo, Renzo

    2001-05-15

    A new isotope-filtered experiment has been designed to measure homonuclear three-bond J(HNHα) coupling constants of unlabeled peptides complexed with labeled proteins. The new experiment is based on the 3D HNHA pulse scheme, and belongs to the 'quantitative J-correlation' type. It has been applied to a decapeptide inhibitor bound to the proteinase domain of the NS3 protein of human hepatitis C virus (HCV)

  4. Ground-based direct-sun DOAS and airborne MAX-DOAS measurements of the collision-induced oxygen complex, O2O2, absorption with significant pressure and temperature differences

    OpenAIRE

    E. Spinei; A. Cede; Herman, J.; G. H. Mount; Eloranta, E; B. Morley; S. Baidar; Dix, B.; I. Ortega; Koenig, T; Volkamer, R.

    2015-01-01

    The collision-induced O2 complex, O2O2, is a very important trace gas for understanding remote sensing measurements of aerosols, cloud properties and atmospheric trace gases. Many ground-based multi-axis differential optical absorption spectroscopy (MAX-DOAS) measurements of the O2O2 optical depth require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a nearly pure Rayleigh atmosphere. One of the potential causes of this discrepa...

  5. Interdisciplinary Symposium on Complex Systems

    CERN Document Server

    Rössler, Otto; Zelinka, Ivan

    2015-01-01

    The book you hold in your hands is the outcome of the "2014 Interdisciplinary Symposium on Complex Systems" held in the historical city of Florence. The book consists of 37 chapters in 4 areas: Physical Modeling of Complex Systems, Evolutionary Computations, Complex Biological Systems and Complex Networks. All 4 parts contain contributions that give interesting points of view on complexity in different areas of science and technology. The book starts with a comprehensive overview and classification of complexity problems, entitled "Physics in the World of Ideas: Complexity as Energy", followed by chapters about complexity measures and physical principles, their observation, modeling and applications to solving various problems, including real-life applications. Further chapters contain recent research on evolution, randomness and complexity, as well as complexity in biological systems and complex networks. All selected papers represent innovative ideas, philosophical overviews and state-of-the-...

  6. The acute effect of upper-body complex training on power output of martial art athletes as measured by the bench press throw exercise.

    Science.gov (United States)

    Liossis, Loudovikos Dimitrios; Forsyth, Jacky; Liossis, Ceorge; Tsolakis, Charilaos

    2013-12-18

    The purpose of this study was to examine the acute effect of upper-body complex training on power output, as well as to determine the requisite preload intensity and intra-complex recovery interval needed to induce power output increases. Nine amateur-level combat/martial art athletes completed four distinct experimental protocols, which consisted of 5 bench press repetitions at either: 65% of one-repetition maximum (1RM) with a 4 min rest interval; 65% of 1RM with an 8 min rest; 85% of 1RM with a 4 min rest; or 85% of 1RM with an 8 min rest interval, performed on different days. Before (pre-conditioning) and after (post-conditioning) each experimental protocol, three bench press throws at 30% of 1RM were performed. Significant differences in power output pre-post conditioning were observed across all experimental protocols (F=26.489, partial eta2=0.768, p=0.001). Mean power output significantly increased when the preload stimulus of 65% 1RM was matched with 4 min of rest (p=0.001), and when the 85% 1RM preload stimulus was matched with 8 min of rest (p=0.001). Moreover, a statistically significant difference in power output was observed between the four conditioning protocols (F=21.101, partial eta2=0.913, p=0.001). It was concluded that, in complex training, matching a heavy preload stimulus with a longer rest interval, and a lighter preload stimulus with a shorter rest interval, is important for athletes wishing to increase their power production before training or competition.

  7. Direct amination of benzene to aniline with several typical vanadium complexes

    Institute of Scientific and Technical Information of China (English)

    Yu Fen Lv; Liang Fang Zhu; Qiu Yuan Liu; Bin Guo; Xiao Ke Hu; Chang Wei Hu

    2009-01-01

    The liquid-phase direct catalytic amination of benzene to aniline was performed in an acetic acid-water solvent using a series of vanadium(III, IV, V) complexes with N,O- or O,O-ligands as catalysts and hydroxylamine hydrochloride as the aminating agent. The vanadium complexes exhibited much higher selectivity towards the production of aniline than NaVO3 or VOSO4. Under the optimized conditions, an aniline yield of 42.5% and a TON of 48, with a selectivity above 99.9%, were obtained using 0.2 mmol of [VO(OAc)2] as the catalyst.

  8. Engaging complexity

    Directory of Open Access Journals (Sweden)

    Gys M. Loubser

    2014-01-01

    In this article, I discuss studies in complexity and their epistemological implications for systematic and practical theology. I argue that engagement with complexity does not necessarily assure a non-reductionist approach. However, if complexity is engaged transversally, it becomes possible to transcend reductionist approaches. Moreover, systematic and practical theologians can draw on complexity in developing new ways of understanding and, therefore, new ways of describing the focus, epistemic scope and heuristic structures of systematic and practical theology. Firstly, Edgar Morin draws a distinction between restricted and general complexity based on the epistemology drawn upon in studies in complexity. Moving away from foundationalist approaches to epistemology, Morin argues for a paradigm of systems. Secondly, I discuss Kees van Kooten Niekerk's distinction between epistemology, methodology and ontology in studies in complexity and offer an example of a theological argument that draws on complexity. Thirdly, I argue for the importance of transversality in engaging complexity by drawing on the work of Wentzel van Huyssteen and Paul Cilliers. In conclusion, I argue that theologians have to be conscious of the epistemic foundations of each study in complexity, and these studies illuminate the heart of Reformed theology. Intradisciplinary and/or interdisciplinary implications: This article has both intradisciplinary and interdisciplinary implications. When theologians engage studies in complexity, the epistemological roots of these studies need to be considered, seeing that researchers in complexity draw on different epistemologies. Drawing on transversality would enhance such considerations. Furthermore, Edgar Morin's and Paul Cilliers' approaches to complexity will inform practical and theoretical considerations in church polity and unity.

  9. Simplifying complexity

    NARCIS (Netherlands)

    Leemput, van de I.A.

    2016-01-01

    In this thesis I use mathematical models to explore the properties of complex systems ranging from microbial nitrogen pathways and coral reefs to the human state of mind. All are examples of complex systems, defined as systems composed of a number of interconnected parts, where the systemic behavior

  10. Carney Complex

    Science.gov (United States)

    ... of Carney complex are Cushing’s syndrome and multiple thyroid nodules (tumors). Cushing’s syndrome features a combination of weight gain, ... with Carney complex include adrenocortical carcinoma , pituitary gland tumors , thyroid , colorectal , liver and pancreatic cancers . Ovarian cancer in ...

  11. Syntactic Complexity as an Aspect of Text Complexity

    Science.gov (United States)

    Frantz, Roger S.; Starr, Laura E.; Bailey, Alison L.

    2015-01-01

    Students' ability to read complex texts is emphasized in the Common Core State Standards (CCSS) for English Language Arts and Literacy. The standards propose a three-part model for measuring text complexity. Although the model presents a robust means for determining text complexity based on a variety of features inherent to a text as well as…

  12. In the search for the low-complexity sequences in prokaryotic and eukaryotic genomes: how to derive a coherent picture from global and local entropy measures

    Energy Technology Data Exchange (ETDEWEB)

    Acquisti, Claudia; Allegrini, Paolo E-mail: allegrip@ilc.cnr.it; Bogani, Patrizia; Buiatti, Marcello; Catanese, Elena; Fronzoni, Leone; Grigolini, Paolo; Mersi, Giuseppe; Palatella, Luigi

    2004-04-01

    We investigate a possible way to connect the presence of low-complexity sequences (LCS) in DNA genomes with the non-stationary properties of base correlations. Under the hypothesis that these variations signal a change in DNA function, we use a new technique, called the non-stationarity entropic index (NSEI) method, and we show that it is an efficient way to detect functional changes relative to a random baseline. The remarkable aspect is that NSEI requires no training data or fitting parameters, the only arbitrariness being the choice of a marker in the sequence. We make this choice on the basis of biological information about LCS distributions in genomes. We show that there exists a correlation between changes in the amount of LCS and the ratio of long- to short-range correlation.
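
    As background for how local entropy measures flag low-complexity sequences, the sketch below computes Shannon entropy of base composition in a sliding window. This is a deliberately minimal probe, not the NSEI method itself, which builds a non-stationarity index on top of such measures; the window size is an arbitrary assumption.

```python
import math
from collections import Counter

def windowed_entropy(seq, window=20):
    """Shannon entropy (bits) of base composition in each sliding window.
    Low values flag low-complexity (compositionally biased) regions;
    the maximum for a 4-letter alphabet is 2 bits."""
    hs = []
    for i in range(len(seq) - window + 1):
        counts = Counter(seq[i:i + window])
        hs.append(-sum((c / window) * math.log2(c / window)
                       for c in counts.values()))
    return hs

# A homopolymer run scores 0 bits per window; a maximally mixed
# 4-letter window scores the full 2 bits.
low = windowed_entropy("A" * 10, window=5)
high = windowed_entropy("ACGTACGTACGT", window=4)
```

    Scanning a genome with such a profile and marking windows below a threshold is the usual starting point for LCS detection; NSEI then asks how the statistics change around a chosen marker relative to a random baseline.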

  13. Thermally driven circulation in a region of complex topography: comparison of wind-profiling radar measurements and MM5 numerical predictions

    Directory of Open Access Journals (Sweden)

    L. Bianco

    2006-07-01

    The diurnal variation of regional wind patterns in the complex terrain of Central Italy was investigated for summer fair-weather conditions and wintertime periods using a radar wind profiler. The profiler is located at a site where interaction between the complex topography and the land surface produces a variety of thermally and dynamically driven wind systems. The observational data set, collected over a period of one year, was used first to describe the diurnal evolution of thermally driven winds, and second to validate the Mesoscale Model 5 (MM5), a three-dimensional numerical model. This analysis focused on near-surface wind observations, since thermally driven winds occur in the lower atmosphere. In accordance with valley-wind theory, the site (located on the left sidewall of the valley, looking up-valley) experiences a clockwise turning of the wind with time. The same behavior was established in both the experimental and numerical results.

    Because the thermally driven flows can have some depth and may be influenced mainly by model errors, as a third step the analysis focuses on a subset of cases to explore four different MM5 Planetary Boundary Layer (PBL) parameterizations, to test how sensitive the results are to the selected PBL parameterization and, if possible, to identify the best one. For this purpose we analysed the MM5 output at all PBL levels. The chosen PBL parameterizations are: 1) Gayno-Seaman; 2) Medium-Range Forecast; 3) the Mellor-Yamada scheme as used in the ETA model; and 4) Blackadar.

  14. Managing Complexity

    DEFF Research Database (Denmark)

    Maylath, Bruce; Vandepitte, Sonia; Minacori, Patricia;

    2013-01-01

    This article discusses the largest and most complex international learning-by-doing project to date: a project involving translation from Danish and Dutch into English and editing into American English alongside a project involving writing, usability testing, and translation from English into Dutch... and into French. The complexity of the undertaking proved to be a central element in the students' learning, as the collaboration closely resembles the complexity of international documentation workplaces of language service providers. © Association of Teachers of Technical Writing.

  15. Complex variables

    CERN Document Server

    Fisher, Stephen D

    1999-01-01

    The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic

  16. Algorithmic Problem Complexity

    OpenAIRE

    Burgin, Mark

    2008-01-01

    People solve different problems and know that some of them are simple, some are complex, and some are insoluble. The main goal of this work is to develop a mathematical theory of algorithmic complexity for problems. This theory aims to determine computers' abilities to solve different problems and to estimate the resources computers need to do so. Here we build the part of this theory related to static measures of algorithms. At first, we consider problems for finite words and stud...

  17. Electronic structures of TiO2-TCNE, -TCNQ, and -2,6-TCNAQ surface complexes studied by ionization potential measurements and DFT calculations: Mechanism of the shift of interfacial charge-transfer bands

    Science.gov (United States)

    Fujisawa, Jun-ichi; Hanaya, Minoru

    2016-06-01

    Interfacial charge-transfer (ICT) transitions between inorganic semiconductors and π-conjugated molecules allow direct charge separation without loss of energy. This feature is potentially useful for efficient photovoltaic conversion. Charge-transferred complexes of TiO2 nanoparticles with 7,7,8,8-tetracyanoquinodimethane (TCNQ) and its analogues (TCNX) show strong ICT absorption in the visible region. The ICT band was reported to be significantly red-shifted with extension of the π-conjugated system of TCNX. In order to clarify the mechanism of the red-shift, in this work, we systematically study the electronic structures of the TiO2-TCNX surface complexes (TCNX = TCNE, TCNQ, 2,6-TCNAQ) by ionization potential measurements and density functional theory (DFT) calculations.

  18. I. Fundamental Practicum: Temperature Measurements of Falling Droplets, July, 1989. II. Industrial Practicum: Interaction and Effect of Adsorbed Organics on Reference Clays and Reservoir Rock, April, 1988. III. Apprenticeship Practicum: Studies of Group XIII Metal Inclusion Complexes, March, 1987

    Science.gov (United States)

    Wells, Mark Richard

    The temperature of 225 μm decane droplets falling through a hot, quiescent, oxygen-free environment was measured using laser-induced exciplex fluorescence thermometry. The droplet temperature was found to increase by approximately 0.42 °C per °C increase in the environment temperature as the environment temperature was increased to 250 °C. Less than 10% evaporation of the droplets was observed at the highest environment temperatures. This represents one of the first successful applications of a remote-sensing technique for the temperature determination of droplets in a dynamic system. Industrial practicum. The industrial practicum report, entitled "Interaction and Effect of Adsorbed Organics on Reference Clays and Reservoir Rock," discusses the measurement of the effect that adsorbed organic material, especially from crude petroleum, has on the surface area, cation exchange capacity, and zeta potential of reference clay material and reservoir rock. In addition, the energetics of adsorption of a petroleum extract onto several reference clays and reservoir rock were measured using both flow and batch microcalorimetry. These results are very important in evaluating and understanding the wettability of reservoir rock and its impact on the recovery of crude oil from a petroleum reservoir. Apprenticeship practicum. "Studies of Group XIII Metal Inclusion Complexes" investigates the structure and dynamics of liquid inclusion complexes having the general formula (R₄N)(Al₂Me₆I)·(C₆H₆)ₓ. ¹H and ¹³C spin-lattice relaxation times, nuclear Overhauser enhancements, and molecular correlation times were measured, as well as diffusion coefficients of the various species in solution. The dynamics of transfer between "guest" and free solvent molecules were measured using a variety of techniques. The inherent structure of liquid inclusion complexes as an ordered medium for homogeneous catalysis was studied using hydrogenation catalyzed by

  19. Management of complex fisheries

    DEFF Research Database (Denmark)

    Frost, Hans Staby; Andersen, Peder; Hoff, Ayoe

    2013-01-01

    The purpose of this paper is to demonstrate how fisheries economics management issues or problems can be analyzed by using a complex model based on conventional bioeconomic theory. Complex simulation models contain a number of details that make them suitable for practical management advice, including taking into account the response of the fishermen to implemented management measures. To demonstrate the use of complex management models, this paper assesses a number of second-best management schemes against a first rank optimum (FRO), an ideal individual transferable quotas (ITQ) system...

  20. Complex Covariance

    Directory of Open Access Journals (Sweden)

    Frieder Kleefeld

    2013-01-01

    According to some generalized correspondence principle the classical limit of a non-Hermitian quantum theory describing quantum degrees of freedom is expected to be the well known classical mechanics of classical degrees of freedom in the complex phase space, i.e., some phase space spanned by complex-valued space and momentum coordinates. As special relativity was developed by Einstein merely for real-valued space-time and four-momentum, we will try to understand how special relativity and covariance can be extended to complex-valued space-time and four-momentum. Our considerations will lead us not only to some unconventional derivation of Lorentz transformations for complex-valued velocities, but also to the non-Hermitian Klein-Gordon and Dirac equations, which are to lay the foundations of a non-Hermitian quantum theory.

  1. Analysis of WAsP (Wind Atlas Analysis and Application Program) in complex topographical conditions using measured production from a large scale wind farm

    Science.gov (United States)

    Sveinbjornsson, Stefan

    The wind energy industry is leading the charge for renewable energy in the United States. In 2012, 13,124 MW of wind power capacity was installed, almost double that of 2011. Micro-model analysis of wind farms in the pre-construction phase is vital to ensure the feasibility of every project. As wind farms take advantage of increased wind speeds due to complex topographical features, their modeling becomes more complicated and expensive. WAsP is a linear numerical model that has become an industry standard for wind farm siting in Europe. It uses topographical inputs along with on-site meteorology data to project wind speed and direction over a pre-defined grid. The accuracy of WAsP was examined at all sites, both for un-edited projections and for the WAsP-recommended user-defined alterations to the wind speed at hub height. The unedited projections yielded the lowest deviations for the net annual production (-1.2%), whereas user corrections significantly over- or underestimated power production. Some sites within the wind farm layout had over-estimations of wind speed, due both to the ruggedness of the terrain and to close proximity to forests. WAsP shows significant promise in projections across the grid layout. A combination of unedited and user corrections is recommended for future wind farm siting analysis.

  2. Simplifying complexity

    OpenAIRE

    Leemput, van de, J.C.H.

    2016-01-01

    In this thesis I use mathematical models to explore the properties of complex systems ranging from microbial nitrogen pathways and coral reefs to the human state of mind. All are examples of complex systems, defined as systems composed of a number of interconnected parts, where the systemic behavior leads to the emergence of properties that would not be expected from behavior or properties of the individual parts of the system. Although the full behavior of the systems I address will probably...

  3. Measurement of complex permittivity of NiZn-ferrite by resonant cavity perturbation method

    Institute of Scientific and Technical Information of China (English)

    王翠平; 叶柳; 李爱侠; 张子云; 刘晨; 张利飞

    2012-01-01

    The cavity perturbation technique is widely used for measuring microwave dielectric properties. Compared with conventional measurement methods, it has the advantages of convenient experimental measurement, small sample dimensions, and simple computation formulas, and it is of practical value for the approximate determination of the resonant frequency and quality factor of the cavity and the dielectric constant of the material. The complex permittivity of NiZn ferrite in the microwave frequency band was measured by the resonant cavity perturbation method for several compositions. The imaginary part ε″ and the real part ε′ of the dielectric constant were obtained, and the effects of the Ni and Zn content on the complex permittivity were analyzed.
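    For a small sample placed at the electric-field maximum of the cavity, the standard textbook perturbation formulas relate the permittivity to the measured shifts in resonant frequency and quality factor. The sketch below is a generic illustration of those formulas (shape and field-distribution factors omitted); the function name and the example numbers are illustrative assumptions, not values from the paper.

```python
def permittivity_from_perturbation(f0, fs, q0, qs, vc, vs):
    """Complex permittivity from cavity-perturbation measurements.

    Small-perturbation formulas for a sample at the E-field maximum:
        eps' - 1 = (f0 - fs)/fs * Vc/(2*Vs)
        eps''    = Vc/(4*Vs) * (1/Qs - 1/Q0)
    f0, q0: empty-cavity resonant frequency and quality factor
    fs, qs: loaded-cavity values; vc, vs: cavity and sample volumes.
    """
    eps_real = 1.0 + (f0 - fs) / fs * vc / (2.0 * vs)
    eps_imag = vc / (4.0 * vs) * (1.0 / qs - 1.0 / q0)
    return eps_real, eps_imag

# Illustrative numbers only (not from the paper)
er, ei = permittivity_from_perturbation(f0=9.800e9, fs=9.785e9,
                                        q0=5000.0, qs=3800.0,
                                        vc=5.0e-6, vs=1.0e-9)
print(f"eps' = {er:.2f}, eps'' = {ei:.3f}")
```

Note that the downward frequency shift sets the real part and the drop in quality factor sets the loss, which is why both quantities must be measured.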

  4. Cyclomatic Complexity: theme and variations

    Directory of Open Access Journals (Sweden)

    Brian Henderson-Sellers

    1993-11-01

    Focussing on the "McCabe family" of measures for the decision/logic structure of a program leads to an evaluation of extensions to modularization, nesting and, potentially, to object-oriented program structures. A comparison of rated, operating and essential complexities of programs suggests two new metrics: "inessential complexity" as a measure of unstructuredness and "product complexity" as a potential objective measure of structural complexity. Finally, nesting and abstraction levels are considered, especially as to how metrics from the "McCabe family" might be applied in an object-oriented systems development environment.
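    McCabe's cyclomatic complexity for a control-flow graph is V(G) = E - N + 2P; for a single structured routine this equals the number of binary decision points plus one. A minimal sketch of that counting rule for Python source follows; it is illustrative only and ignores constructs that real measurement tools handle.

```python
import ast

def cyclomatic_complexity(source):
    """Approximate V(G) of Python code as (decision points + 1).

    Counts branching constructs (if/while/for and boolean operators);
    a production tool would also weigh try/except, comprehensions, etc.
    """
    tree = ast.parse(source)
    decisions = sum(isinstance(node, (ast.If, ast.While, ast.For, ast.BoolOp))
                    for node in ast.walk(tree))
    return decisions + 1

src = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(src))  # elif parses as a nested If -> prints 3
```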

  5. Complexity regularized hydrological model selection

    NARCIS (Netherlands)

    Pande, S.; Arkesteijn, L.; Bastidas, L.A.

    2014-01-01

    This paper uses a recently proposed measure of hydrological model complexity in a model selection exercise. It demonstrates that a robust hydrological model is selected by penalizing model complexity while maximizing a model performance measure. This especially holds when limited data is available.

  6. Ganglion cell complex and retinal nerve fiber layer measured by fourier-domain optical coherence tomography for early detection of structural damage in patients with preperimetric glaucoma

    Directory of Open Access Journals (Sweden)

    Rolle T

    2011-07-01

    Teresa Rolle, Cristina Briamonte, Daniela Curto, Federico Maria Grignolo. Eye Clinic, Section of Ophthalmology, Department of Clinical Physiopathology, University of Torino, Torino, Italy. Aims: To evaluate the capability of Fourier-domain optical coherence tomography (FD-OCT) to detect structural damage in patients with preperimetric glaucoma. Methods: A total of 178 Caucasian subjects were enrolled in this cohort study: 116 preperimetric glaucoma patients and 52 healthy subjects. Using three-dimensional FD-OCT, the participants underwent imaging of the ganglion cell complex (GCC) and the optic nerve head. Sensitivity, specificity, likelihood ratios, and predictive values were calculated for all parameters at the first and fifth percentiles. Areas under the curves (AUCs) were generated for all parameters and were compared (DeLong test). For both the GCC and the optic nerve head protocols, the OR logical disjunction (Boolean logic operator) was calculated. Results: The AUCs did not differ significantly. Macular global loss volume had the largest AUC (0.81). Specificities were high at both the fifth and first percentiles (up to 97%), but sensitivities were low, especially at the first percentile (55%–27%). Conclusion: Macular and papillary diagnostic accuracies did not differ significantly based on the 95% confidence interval. The computation of the Boolean OR operator was found to boost diagnostic accuracy. Using the software-provided classification, sensitivity and diagnostic accuracy were low for both the retinal nerve fiber layer and the GCC scans. FD-OCT does not seem to be decisive for early detection of structural damage in patients with no functional impairment. This suggests that the analysis software needs further refinement to enhance glaucoma diagnostic capability. Keywords: OCT, RNFL, GCC, diagnostic accuracy

  7. Effects of Grinding Method on Complex Pb/Zn Ore Flotation

    Institute of Scientific and Technical Information of China (English)

    魏以和; 周高云

    2007-01-01

    The flotation performance of a complex Pb/Zn ore was investigated with three kinds of mill (steel, stainless steel, and ceramic) in laboratory batch flotation tests. It was found that the lead recovery is increased when grinding in a ceramic mill, i.e., a non-ferrous, oxidizing environment, but the selectivity is decreased due to the activation of sphalerite by copper ions produced via the oxidation of the ore. On the contrary, the lead recovery is decreased when grinding in a traditional iron mill, but the selectivity is increased. The low recovery with iron grinding media is probably caused by the coating of iron oxy-hydroxides on sulphide surfaces. Coarse particles are more sensitive to this kind of depression.

  8. Identification of the properties of a complex layer deposited on the surface of a quartz crystal microbalance (QCM) by the impedance measurement

    Science.gov (United States)

    Wakatsuki, N.; Kagawa, Y.

    2007-01-01

    The quartz crystal microbalance (QCM) is a quartz crystal plate resonator for measuring a minute mass according to the resonant frequency change. In some applications, an adsorbing layer must be formed on the resonator surface, which adsorbs the material whose mass is measured. This layer affects not only the resonant frequency but also its damping as the layer is viscoelastic. Its presence cannot simply be ignored but should be included in the modelling. In our previous work, the algorithm to characterize the viscoelastic layer's properties was developed, in which the multiple resonant frequencies and the corresponding resonant resistances were considered including the overtone operation of the quartz crystal plate (Wakatsuki N, Wada S, Kagawa Y and Haba M 2003 Inverse Problems in Engineering Mechanics vol 4 (Oxford: Elsevier) pp 121-6). It is unrealistic however to assume that the layer's properties are unchanged for such a wide frequency range. In the present paper, no overtone resonance is considered. Mass of the adsorbed material and the thickness of the viscoelastic layer are identified by means of Newton's method.
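    In the ideal rigid-film limit that underlies QCM mass sensing, the frequency shift maps to adsorbed mass through the Sauerbrey relation; the viscoelastic layer treated in this paper violates that assumption, which is why the fuller identification procedure is needed. A minimal sketch of the rigid-film baseline follows, with illustrative numbers of our own, not from the paper.

```python
import math

# Material constants of AT-cut quartz (SI units)
RHO_Q = 2648.0   # density, kg/m^3
MU_Q = 2.947e10  # shear modulus, Pa

def sauerbrey_mass(delta_f, f0, area):
    """Adsorbed mass (kg) from a QCM frequency shift via the Sauerbrey
    relation: delta_f = -2 f0^2 dm / (A sqrt(rho_q mu_q)).

    Valid only for a thin, rigid film; a viscoelastic layer also damps
    the resonance and requires a fuller model, as the paper discusses.
    """
    return -delta_f * area * math.sqrt(RHO_Q * MU_Q) / (2.0 * f0 ** 2)

# Example: 5 MHz crystal, 1 cm^2 electrode, -100 Hz shift
dm = sauerbrey_mass(-100.0, 5.0e6, 1.0e-4)
print(f"adsorbed mass: {dm * 1e9:.2f} ug")
```

The quadratic dependence on f0 is why overtone operation, mentioned in the authors' earlier work, changes the mass sensitivity.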

  9. Techniques of Measurement Lofting with a Total Station in Complex Engineering Survey Areas

    Institute of Scientific and Technical Information of China (English)

    王雪春

    2014-01-01

    Based on the author's personal experience and starting from the characteristics and working principle of the total station, this article discusses how to carry out measurement lofting with a total station in complex engineering survey areas where the environment is poor and visibility is limited, and presents specific techniques.

  10. Nanoscale quantitative measurement of the potential of charged nanostructures by electrostatic and Kelvin probe force microscopy: unraveling electronic processes in complex materials.

    Science.gov (United States)

    Liscio, Andrea; Palermo, Vincenzo; Samorì, Paolo

    2010-04-20

    In microelectronics and biology, many fundamental processes involve the exchange of charges between small objects, such as nanocrystals in photovoltaic blends or individual proteins in photosynthetic reactions. Because these nanoscale electronic processes strongly depend on the structure of the electroactive assemblies, a detailed understanding of these phenomena requires unraveling the relationship between the structure of the nano-object and its electronic function. Because of the fragility of the structures involved and the dynamic variance of the electric potential of each nanostructure during the charge generation and transport processes, understanding this structure-function relationship represents a great challenge. This Account discusses how our group and others have exploited scanning probe microscopy based approaches beyond imaging, particularly Kelvin probe force microscopy (KPFM), to map the potential of different nanostructures with a spatial and voltage resolution of a few nanometers and millivolts, respectively. We describe in detail how these techniques can provide researchers several types of chemical information. First, KPFM allows researchers to visualize the photogeneration and splitting of several unitary charges between well-defined nano-objects having complementary electron-acceptor and -donor properties. In addition, this method maps charge injection and transport in thin layers of polycrystalline materials. Finally, KPFM can monitor the activity of immobilized chemical components of natural photosynthetic systems. In particular, researchers can use KPFM to measure the electric potential without physical contact between the tip and the nanostructure studied. These measurements exploit long-range electrostatic interactions between the scanning probe and the sample, which scale with the square of the probe-sample distance, d. 
While allowing minimal perturbation, these long-range interactions limit the resolution attainable in the measurement.

  11. Complex networks: Patterns of complexity

    Science.gov (United States)

    Pastor-Satorras, Romualdo; Vespignani, Alessandro

    2010-07-01

    The Turing mechanism provides a paradigm for the spontaneous generation of patterns in reaction-diffusion systems. A framework that describes Turing-pattern formation in the context of complex networks should provide a new basis for studying the phenomenon.

  12. Near infrared-red models for the remote estimation of chlorophyll- a concentration in optically complex turbid productive waters: From in situ measurements to aerial imagery

    Science.gov (United States)

    Gurlin, Daniela

    Today the water quality of many inland and coastal waters is compromised by cultural eutrophication as a consequence of increased human agricultural and industrial activities, and remote sensing is widely applied to monitor the trophic state of these waters. This study explores near infrared-red models for the remote estimation of chlorophyll-a concentration in turbid productive waters and compares several near infrared-red models developed within the last 35 years. Three of these near infrared-red models were calibrated for a dataset with chlorophyll-a concentrations from 2.3 to 81.2 mg m-3 and validated for independent and statistically significantly different datasets with chlorophyll-a concentrations from 4.0 to 95.5 mg m-3 and 4.0 to 24.2 mg m-3 for the spectral bands of the MEdium Resolution Imaging Spectrometer (MERIS) and Moderate-resolution Imaging Spectroradiometer (MODIS). The developed MERIS two-band algorithm estimated chlorophyll-a concentrations from 4.0 to 24.2 mg m-3, which are typical for many inland and coastal waters, very accurately, with a mean absolute error of 1.2 mg m-3. These results indicate a high potential of the simple MERIS two-band algorithm for the reliable estimation of chlorophyll-a concentration without any reduction in accuracy compared to more complex algorithms, even though more research seems required to analyze the sensitivity of this algorithm to differences in the chlorophyll-a specific absorption coefficient of phytoplankton. Three near infrared-red models were calibrated and validated for a smaller dataset of atmospherically corrected multi-temporal aerial imagery collected by the hyperspectral airborne imaging spectrometer for applications (AisaEAGLE). The developed algorithms successfully captured the spatial and temporal variability of the chlorophyll-a concentrations and estimated chlorophyll-a concentrations from 2.3 to 81.2 mg m-3 with mean absolute errors from 4.4 mg m-3 for the AISA two-band algorithm to 5.2 mg m-3
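    A two-band NIR-red algorithm of the kind described above regresses chlorophyll-a against a NIR-to-red reflectance band ratio (for MERIS, bands centered near 708.75 nm and 665 nm). A minimal sketch follows; the slope and intercept are hypothetical placeholders, since the abstract does not give the fitted coefficients.

```python
def two_band_chla(r_red, r_nir, slope=61.3, intercept=-37.7):
    """Chlorophyll-a concentration (mg m^-3) from a NIR-red band ratio.

    Implements the generic two-band form  chl-a = a * R(NIR)/R(red) + b.
    The default slope/intercept are hypothetical illustration values,
    NOT the coefficients calibrated in the study.
    """
    return slope * (r_nir / r_red) + intercept

# Example with assumed reflectances for a moderately productive water body
print(f"{two_band_chla(r_red=0.01, r_nir=0.008):.2f} mg m^-3")
```

The ratio form cancels much of the backscattering signal common to both bands, which is what makes such simple models workable in optically complex waters.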

  13. A simple measure with complex determinants: investigation of the correlates of self-rated health in older men and women from three continents

    Directory of Open Access Journals (Sweden)

    French Davina J

    2012-08-01

    consider earlier life experiences of cohorts as well as national and individual factors in later life. Further research is required to understand the complex societal influences on perceptions of health.

  14. Field Measurements

    CERN Document Server

    Bottura, L

    2004-01-01

    The measurement of the magnetic field is often the final verification of the complex design and fabrication process of a magnetic system. In several cases, when seeking high accuracy, the measurement technique and its realization can require considerable effort. This note describes the most commonly used measurement techniques, such as nuclear magnetic resonance, fluxmeters and Hall generators, and their typical range of application. In addition, some of the less commonly used techniques, such as magneto-optical, SQUID, or particle beam methods, are listed.

  15. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  16. Measurement of the complex refractive index of particles based on Mie theory and the transmission method

    Institute of Scientific and Technical Information of China (English)

    刘晓东; 戴景民

    2009-01-01

    The optical constants of particles are not equal to those of the bulk material of which they are made. Studying the radiative properties of a particle, and of assemblies of particles, through its spectral complex refractive index is significant both theoretically and for a wide range of practical applications. The complex refractive index of a particle cannot be measured directly, since no instrument exists for direct measurement; it must be inverted from other experimentally measured quantities combined with an appropriate theoretical model, which makes the solution an inverse problem. Using the simplified Mie scattering theory and the Kramers-Kronig relation, the transmissivity of Al₂O₃ particles and coal ash particles was measured with an FTIR spectrometer and, combined with the corresponding theoretical model, their complex refractive indexes were retrieved by inversion. The influence of the transmissivity measurement error on the inversion results was analyzed.

  17. Discussion on City Tunnel Construction Control Measures in Complex Environments

    Institute of Scientific and Technical Information of China (English)

    钱雪锋; 王永朕

    2013-01-01

    In modern city road construction, urban tunnels through hills often encounter objectively unfavorable factors for tunnel construction: poor geological conditions, very adverse terrain, and a harsh construction environment subject to external influences. Taking a city tunnel project as an example, this paper discusses measures and methods for controlling city tunnel construction under adverse geological, topographic, and environmental conditions.

  18. Complex Laparoscopic Myomectomy with Severe Adhesions Performed with Proper Preventive Measures and Power Morcellation Provides a Safe Choice in Certain Infertility Cases

    Science.gov (United States)

    Alfaro-Alfaro, Jaime; Flores-Manzur, María de los Ángeles; Nevarez-Bernal, Roberto

    2016-01-01

    Laparoscopic myomectomy offers a real benefit to infertile patients with uterine fibroids and peritoneal adhesions. The procedure requires a skilled surgeon and laparoscopic technique to minimize adhesion formation, among other proven benefits. Restrictions arise since this procedure requires power morcellation for fibroid tissue extraction. Two years ago, the Food and Drug Administration in the United States of America (FDA) issued an alert on power morcellation for uterine leiomyomas, addressing the risk of malignant cell spreading within the abdominal cavity (actual risk assessment from 1 in 360 to 1 in 7400 cases). We review the case of a 30-year-old female without previous gestations, presenting with hypermenorrhea, intermenstrual bleeding, and chronic pelvic pain. Transvaginal ultrasound reported multiple fibroids in the right portion of a bicornuate uterus. Relevant history includes an open myomectomy 6 years before and a complicated appendectomy, developing peritonitis within a year. Laparoscopy revealed multiple adhesions blocking uterine access, a bicornuate uterus, and myomas in the expected site. Myomectomy was performed utilizing power morcellation with good results. The FDA recommendations have diminished selection of this procedure, converting many cases to open variants. This particular case was technically challenging, requiring morcellation, and safety device deployment was impossible, yet the infertility issue was properly addressed. Patient evaluation, safety measures, and the benefits of laparoscopy may outweigh the risks in particular cases such as this one. PMID:27668110

  19. Measurement of high energy neutrons (E > 50 MeV) at electron accelerators of INDUS accelerator complex using bismuth fission detectors

    International Nuclear Information System (INIS)

    This paper reports the measurement of high energy neutron component (E > 50 MeV) carried out at INDUS-I (450 MeV) and INDUS-II (2.5 GeV) electron accelerators (RRCAT, Indore, India). The study is based on the registration of neutron induced fission fragments from bismuth films in solid polymeric track detectors. These BFD stacks were exposed at the injection septums of booster synchrotron, Indus-1 and Indus-2 storage rings, where the possibility of dose due to beam loss is expected to be maximum. The detection efficiency of the bismuth fission detector (BFD) could be enhanced by enlarging the detector surface area and accordingly a large area spark counter was fabricated for automatic and rapid counting of the track densities. The dose equivalent rates were found to be 11.0 ± 0.7 mrem/h (73 h total exposure time), 11.0 ± 2.6 mrem/h (35 h total exposure time) and 65.0 mrem/h (5 h total exposure time) for the injection septums of booster synchrotron, Indus-1 and Indus-2 respectively. However, the values reported here were not corrected for the contribution from photo fissions, if any. (author)

  20. On the Use of Molecular Weight Cutoff Cassettes to Measure Dynamic Relaxivity of Novel Gadolinium Contrast Agents: Example Using Hyaluronic Acid Polymer Complexes in Phosphate-Buffered Saline

    Directory of Open Access Journals (Sweden)

    Nima Kasraie

    2011-01-01

    The aims of this study were to determine whether standard extracellular contrast agents of Gd(III) ions in combination with a polymeric entity susceptible to hydrolytic degradation over a finite period of time, such as Hyaluronic Acid (HA), have sufficient vascular residence time to obtain vascular imaging comparable to current conventional compounds, and to obtain sufficient data to show proof of concept that HA with Gd-DTPA ligands could be useful as a vascular imaging agent. We assessed the dynamic relaxivity of the HA-bound DTPA compounds using a custom-made phantom, as well as relaxation rates at 10.72 MHz with concentrations ranging between 0.09 and 7.96 mM in phosphate-buffered saline. Linear dependences of the static longitudinal relaxation rate (R1) on concentration were found for most measured samples, and the HA samples continued to produce high signal strength 24 hours after injection into a dialysis cassette at 3 T, showing superior dynamic relaxivity values compared to conventional contrast media such as Gd-DTPA-BMA.
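    The "linear dependence of R1 on concentration" reported above is exactly how relaxivity is conventionally extracted: r1 is the slope of R1 = R1,0 + r1·C. The sketch below fits that line by ordinary least squares; the data points are synthetic illustrations, not the study's measurements.

```python
# Synthetic, perfectly linear illustration data (NOT from the study):
# generated as R1 = 0.4 + 3.9 * C
conc = [0.1, 0.5, 1.0, 2.0, 4.0]          # Gd concentration, mM
r1_obs = [0.79, 2.35, 4.30, 8.20, 16.00]  # longitudinal rate R1 = 1/T1, s^-1

n = len(conc)
mean_c = sum(conc) / n
mean_r = sum(r1_obs) / n

# Ordinary least-squares slope (relaxivity r1) and intercept (solvent R1,0)
slope = (sum((c - mean_c) * (r - mean_r) for c, r in zip(conc, r1_obs))
         / sum((c - mean_c) ** 2 for c in conc))
intercept = mean_r - slope * mean_c

print(f"relaxivity r1 = {slope:.2f} s^-1 mM^-1, R1(0) = {intercept:.2f} s^-1")
```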

  1. A competitive indirect enzyme-linked immunoassay for lead ion measurement using mAbs against the lead-DTPA complex

    Energy Technology Data Exchange (ETDEWEB)

    Xiang Junjian; Zhai Yifan [Molecular Immunology and Antibody Engineering Center, Jinan University, Guangzhou 510632 (China); Tang Yong, E-mail: ty7926@qq.co [Molecular Immunology and Antibody Engineering Center, Jinan University, Guangzhou 510632 (China); Wang Hong; Liu Bin; Guo Changwei [Molecular Immunology and Antibody Engineering Center, Jinan University, Guangzhou 510632 (China)

    2010-05-15

    Immunoassays for quantitative measurement of environmental heavy metals offer several advantages over other traditional methods. To develop an immunoassay for lead, Balb/c mice were immunized with a lead-chelate-protein conjugate to allow maximum exposure of the metal to the immune system. Three stable hybridoma cell lines were obtained through spleen cell fusion with Sp2/0 cells. One cell line, 2A11D11, produced mAbs with preferential selectivity and sensitivity for Pb-DTPA over DTPA, exhibiting an affinity constant of 3.34 ± 0.24 × 10⁹ M⁻¹. Cross-reactivity (CR) with other metals was below 1%, except for Fe(III), with a CR of less than 5%. This quantitative indirect ELISA for the lead ion was used to detect environmental lead content in local water sources; importantly, the results from the immunoassay were in excellent agreement with those from ICP-MS. Development of immunoassays for metal ions may thus facilitate the detection and regulation of environmental pollution.

  2. On the Use of Molecular Weight Cutoff Cassettes to Measure Dynamic Relaxivity of Novel Gadolinium Contrast Agents: Example Using Hyaluronic Acid Polymer Complexes in Phosphate-Buffered Saline

    International Nuclear Information System (INIS)

    The aims of this study were to determine whether standard extracellular contrast agents of Gd(III) ions in combination with a polymeric entity susceptible to hydrolytic degradation over a finite period of time, such as Hyaluronic Acid (HA), have sufficient vascular residence time to obtain comparable vascular imaging to current conventional compounds and to obtain sufficient data to show proof of concept that HA with Gd-DTPA ligands could be useful as vascular imaging agents. We assessed the dynamic relaxivity of the HA bound DTPA compounds using a custom-made phantom, as well as relaxation rates at 10.72 MHz with concentrations ranging between 0.09 and 7.96 mM in phosphate-buffered saline. Linear dependences of static longitudinal relaxation rate (R1) on concentration were found for most measured samples, and the HA samples continued to produce high signal strength after 24 hours after injection into a dialysis cassette at 3T, showing superior dynamic relaxivity values compared to conventional contrast media such as Gd-DTPA-BMA

  3. Complexity and Dynamical Depth

    Directory of Open Access Journals (Sweden)

    Terrence Deacon

    2014-07-01

    We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information theoretic, or thermodynamic conceptions of structural complexity. What we call a system’s dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.

  4. Pain in patients with multiple sclerosis: a complex assessment including quantitative and qualitative measurements provides for a disease-related biopsychosocial pain model

    Directory of Open Access Journals (Sweden)

    Michalski D

    2011-08-01

    Full Text Available Dominik Michalski1,*, Stefanie Liebig1,*, Eva Thomae1,2, Andreas Hinz3, Florian Then Bergh1,2; 1Department of Neurology, 2Translational Centre for Regenerative Medicine (TRM), 3Department of Medical Psychology and Medical Sociology, University of Leipzig, Leipzig, Germany; *These authors contributed equally. Background: Pain of various causes is a common phenomenon in patients with multiple sclerosis (MS). A biopsychosocial perspective has proven a useful theoretical construct in other chronic pain conditions, and such an approach has also been initiated in MS. To support it, we aimed to investigate pain in MS with special emphasis on separating quantitative and qualitative aspects, and on their interrelation with behavioral and physical aspects. Materials and methods: Pain intensity (NRS) and quality (SES) were measured in 38 consecutive outpatients with MS (mean age, 42.0 ± 11.5 years; 82% women). Pain-related behavior (FSR), health care utilization, bodily complaints (GBB-24), and fatigue (WEIMuS) were assessed by questionnaires, and MS-related neurological impairment by a standardized neurological examination (EDSS). Results: Mean pain intensity was 4.0 (range, 0–10) and mean EDSS 3.7 (range, 0–8) in the overall sample. Currently present pain was reported by 81.6% of all patients. Disease duration and EDSS did not differ between patients with and without pain and were not correlated with quality or intensity of pain. Patients with pain had significantly higher scores for musculoskeletal complaints, but equal scores for exhaustion and gastrointestinal and cardiovascular complaints. Pain intensity correlated only with physical aspects, whereas quality of pain was additionally associated with increased avoidance, resignation, and cognitive fatigue. Conclusion: As in other conditions, pain in MS must be assessed in a multidimensional way. Further research should be devoted to adapting existing models into an MS-specific model of pain. Keywords: pain intensity, quality of pain, pain

  5. Complexity, Information and Biological Organisation

    Directory of Open Access Journals (Sweden)

    Attila Grandpierre

    2005-12-01

    Full Text Available Regarding the widespread confusion about the concepts and nature of complexity, information and biological organization, we look for coordinated conceptual considerations corresponding to quantitative measures suitable for grasping the main characteristics of biological complexity. Quantitative measures of the algorithmic complexity of supercomputers like Blue Gene/L are compared with the complexity of the brain. We show that both the computer and the brain have a more fundamental, dynamic complexity measure corresponding to the number of operations per second. Recent insights suggest that the origin of complexity may go back to simplicity at a deeper level, corresponding to algorithmic complexity. We point out that for physical systems Ashby’s Law, Kahre’s Law and the causal closure of the physical exclude the generation of information, and since genetic information corresponds to instructions, we are faced with a controversy: the algorithmic complexity of physics is much lower than the instructions' complexity of the human DNA, I_algorithmic(physics) ~ 10^3 bit << I_instructions(DNA) ~ 10^9 bit. Analyzing genetic complexity, we find that genetic information actually corresponds to a level of complexity deeper than the algorithmic one, putting an even greater emphasis on the information paradox. We show that the resolution of this fundamental information paradox may lie either in the chemical evolution of inheritance in abiogenesis, or in the existence of an autonomous biological principle allowing the production of information beyond physics.

  6. Complex Networks

    CERN Document Server

    Evsukoff, Alexandre; González, Marta

    2013-01-01

    In the last decade we have seen the emergence of a new interdisciplinary field focusing on the understanding of networks which are dynamic, large, open, and have a structure sometimes called random-biased. The field of Complex Networks is helping us better understand many complex phenomena such as the spread of diseases, protein interactions, and social relationships, to name but a few. Studies in Complex Networks are gaining attention due to some major scientific breakthroughs proposed by network scientists helping us understand and model interactions contained in large datasets. In fact, if we could point to one event leading to the widespread use of complex network analysis, it would be the availability of online databases. Theories of random graphs from Erdös and Rényi from the late 1950s led us to believe that most networks had random characteristics. The work on large online datasets told us otherwise. Starting with the work of Barabási and Albert as well as Watts and Strogatz in the late 1990s, we now know th...

  7. Managing complexity of aerospace systems

    Science.gov (United States)

    Tamaskar, Shashank

    Growing complexity of modern aerospace systems has exposed the limits of conventional systems engineering tools and challenged our ability to design them in a timely and cost-effective manner. According to the US Government Accountability Office (GAO), in 2009 nearly half of the defense acquisition programs were expecting a 25% or greater increase in unit acquisition cost. Increase in technical complexity has been identified as one of the primary drivers behind cost-schedule overruns. Thus, to assure the affordability of future aerospace systems, it is increasingly important to develop tools and capabilities for managing their complexity. We propose an approach for managing the complexity of aerospace systems to address this pertinent problem. To this end, we develop a measure that improves upon the state-of-the-art metrics and incorporates key aspects of system complexity. We address the problem of system decomposition by presenting an algorithm for module identification that generates modules to minimize integration complexity. We demonstrate the framework on diverse spacecraft and show the impact of design decisions on integration cost. The measure and the algorithm together help the designer track and manage complexity in different phases of system design. We next investigate how complexity can be used as a decision metric in the model-based design (MBD) paradigm. We propose a framework for complexity-enabled design space exploration that introduces the idea of using complexity as a non-traditional design objective. We also incorporate complexity with the component-based design paradigm (a sub-field of MBD) and demonstrate it on several case studies. The approach for managing complexity is a small but significant contribution to the vast field of complexity management. We envision our approach being used in concert with a suite of complexity metrics to provide an ability to measure and track complexity through different stages of design and development.

  8. The pervasive reach of resource-bounded Kolmogorov complexity in computational complexity theory

    OpenAIRE

    Allender, E.; Koucký, M.; Ronneburger, D.; Roy, S.

    2011-01-01

    We continue an investigation into resource-bounded Kolmogorov complexity, which highlights the close connections between circuit complexity and Levin's time-bounded Kolmogorov complexity measure Kt (and other measures with a similar flavor), and also exploits derandomization techniques to provide new insights regarding Kolmogorov complexity. The Kolmogorov measures that have been introduced have many advantages over other approaches to defining resource-bounded Kolmogorov complexity. Here, we...
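
For reference, Levin's time-bounded measure Kt mentioned in this abstract is conventionally defined as follows (standard textbook definition, not quoted from the paper), with U a fixed universal machine:

```latex
\mathrm{Kt}(x) \;=\; \min_{p}\bigl\{\, |p| + \log t \;:\; U(p) \text{ outputs } x \text{ within } t \text{ steps} \,\bigr\}
```

The log t term is what ties this Kolmogorov-style measure to circuit and time complexity.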

  9. Managing Complexity

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Posse, Christian; Malard, Joel M.

    2004-08-01

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today’s most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and for deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot otherwise be understood.

  10. PS complex

    CERN Multimedia

    1972-01-01

    A view of the present PS complex taken at the end of 1972. The earth embankment covering the Ring is clearly visible; in the foreground are the North and South Experimental Halls; to the right is the East Hall, and to the left the Booster surface buildings. The West Hall is too far to the left to be seen, and also invisible is the SPS, then under construction.

  11. Increase of Organization in Complex Systems

    OpenAIRE

    Georgiev, Georgi Yordanov; Daly, Michael; Gombos, Erin; Vinod, Amrit; Hoonjan, Gajinder

    2013-01-01

    Measures of complexity and entropy have not converged to a single quantitative description of levels of organization of complex systems. The need for such a measure is increasingly necessary in all disciplines studying complex systems. To address this problem, starting from the most fundamental principle in Physics, here a new measure for quantity of organization and rate of self-organization in complex systems based on the principle of least (stationary) action is applied to a model system -...

  12. The complex microelectrode for pH measurement of plaque

    Institute of Scientific and Technical Information of China (English)

    刘大力; 袁诗芬; 应太林; 朱邦尚; 王小平

    2001-01-01

    Objective To fabricate a microelectrode for pH measurement of plaque. Methods The pH-indicating electrode was made of tungsten wire modified with polyaniline by cyclic voltammetry (CV). The reference electrode was made of Ag-AgCl wire. Both were placed in a plastic casing to form the composite pH microelectrode used to measure the pH value of plaque. Results Within the pH range of 3 to 12, the electrode presented a Nernst response, with a response time of less than 2 min. The electrode was used to measure the pH value of plaque, and the pH curve after a sucrose solution rinse could be obtained. Conclusion This electrode can be applied to pH measurement of plaque and can also be expected to measure the pH of other microenvironments.
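
The "Nernst response" reported above means the electrode potential shifts linearly with pH at the ideal Nernstian slope. A quick back-of-the-envelope check of that slope (standard constants; our illustration, not the paper's procedure):

```python
import math

# Ideal Nernst slope for a pH electrode: |dE/dpH| = 2.303 * R * T / F.
# At 25 degC this is about 59.2 mV per pH unit; a real electrode, such as
# the polyaniline-modified tungsten wire above, may deviate from ideal.

R = 8.314462618   # gas constant, J mol^-1 K^-1
F = 96485.33212   # Faraday constant, C mol^-1

def nernst_slope_mV(temp_C):
    """Magnitude of the ideal Nernst slope in mV per pH unit."""
    T = temp_C + 273.15
    return 1000.0 * math.log(10) * R * T / F

print(round(nernst_slope_mV(25.0), 1))  # 59.2
```

Measuring the actual mV-per-pH slope against buffers over pH 3-12 is how a response like the one reported is verified.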

  13. Complementary Use of Information from Space-Based Dinsar and Field Measuring Systems for Operational Monitoring Purposes in Open Pit Iron Mines of Carajas Mining Complex (brazilian Amazon Region)

    Science.gov (United States)

    Paradella, W. R.; Mura, J. C.; Gama, F. F.; Santos, A. R.; Silva, G. G.; Galo, M.; Camargo, P. O.; Silva, A. Q.

    2015-04-01

    Now spanning five simultaneous open-pit operations, with exploitation carried out through open-pit benching, the Carajas complex encompasses the world's largest iron reserves. Open-pit mining operations in the area can lead to slope instabilities, with risks to personnel, equipment and production, due to intense excavation in rock products of low geomechanical quality, blasting practices and heavy precipitation. Thus, effective prediction and management of surface deformation should be a key concern for the mining operations. The ground displacement monitoring techniques in Carajas include surface measurement at discrete points (total station/reflective prisms) and over areas using the SSR (Slope Stability Radar, a ground-based radar). DInSAR techniques, meanwhile, are gaining relevance in the mining industry for reasons such as synoptic and continuous coverage without the need for ground instrumentation, and good point-to-point accuracy of displacement measurements (millimeter to centimeter scale) over a dense grid. Using a stack of 33 StripMap TerraSAR-X images acquired over Carajas covering the time span from March 2012 to April 2013, a monitoring approach is discussed based on the complementary use of information provided by DInSAR (DInSAR time series and persistent scatterer interferometry) and surface measuring techniques (total station/prisms, ground-based radar).

  14. Clinical efficacy of acid water on distal lower extremity osteomyelitis

    Institute of Scientific and Technical Information of China (English)

    刘智深; 牛纪娥; 牛志勇; 杜张荣

    2013-01-01

    Objective To evaluate the clinical efficacy of acid water on distal lower extremity chronic osteomyelitis. Methods Between July 2011 and November 2012, 11 patients with heavily exudative chronic osteomyelitis were treated in our hospital. After the focus of infection was debrided completely, the wounds were treated with acid water. Once the wound surface was clean and covered with granulation tissue, the wounds were closed by delayed direct suture or skin grafting. Results The exudates and the area of the wound both decreased after rinsing and soaking in acid water for 10-25 days (average, 19 days). The wounds were then closed by secondary suture or flap transfer. After follow-up of 6-18 months, all 11 cases of chronic osteomyelitis had healed with no recurrence. Conclusions Application of acid water in the treatment of osteomyelitis is effective and feasible; it has low medical cost, with good social and economic benefits, and provides a new option for the treatment of chronic osteomyelitis of the distal limbs.

  15. Complexity, time and music

    OpenAIRE

    JEAN PIERRE BOON

    2009-01-01

    The concept of complexity, as considered in terms of the algorithmic definition proposed by G.J. Chaitin and A.N. Kolmogorov, is revisited for the dynamical complexity of music. When music pieces are cast in the form of time series of pitch variations, concepts of dynamical systems theory can be used to define new quantities such as the dimensionality, as a measure of the global temporal dynamics of a music piece, and the Shannon entropy, as an evaluation of its local dynami...

  16. Phase equilibria at low temperature for light hydrocarbons-methanol-water-acid gases mixtures: measurements and modelling; Equilibres de phases a basse temperature de systemes complexes CO{sub 2} - hydrocarbures legers - methanol - eau: mesures et modelisation

    Energy Technology Data Exchange (ETDEWEB)

    Ruffine, L.

    2005-10-15

    The need to develop and improve natural gas treatment processes is real. The petroleum industry usually uses separation processes which rely on phase equilibrium phenomena. Yet, the complexity of the phase equilibria involved results in a lack of data, which in turn limits the development of thermodynamic models. The first part of this work is devoted to experimental investigations of systems containing light hydrocarbons, methanol, water and acid gases. We present a new apparatus that was developed to measure vapor-liquid and vapor-liquid-liquid equilibria. It allowed us to obtain new phase-composition data for the methanol-ethane binary system and different mixtures, and also to determine part of the three-phase equilibrium envelope of the same systems. In the second part of this work, we have developed a thermodynamic model based on the CPA equation of state. This choice is justified by the presence of associating components such as methanol, hydrogen sulfide and water in the systems. Such a model is necessary for the design of gas treatment plants. Our model provides good results for phase equilibrium calculations for binary systems without a binary interaction parameter in many cases, and correctly describes the vapor-liquid and vapor-liquid-liquid equilibria for complex mixtures. (author)

  17. Complex dynamics

    CERN Document Server

    Carleson, Lennart

    1993-01-01

    Complex dynamics is today very much a focus of interest. Though several fine expository articles were available, by P. Blanchard and by M. Yu. Lyubich in particular, until recently there was no single source where students could find the material with proofs. For anyone in our position, gathering and organizing the material required a great deal of work going through preprints and papers and in some cases even finding a proof. We hope that the results of our efforts will be of help to others who plan to learn about complex dynamics and perhaps even lecture. Meanwhile books in the field are beginning to appear. The Stony Brook course notes of J. Milnor were particularly welcome and useful. Still we hope that our special emphasis on the analytic side will satisfy a need. This book is a revised and expanded version of notes based on lectures of the first author at UCLA over several Winter Quarters, particularly 1986 and 1990. We owe Chris Bishop a great deal of gratitude for supervising the production of cour...

  18. Complex Systems

    Directory of Open Access Journals (Sweden)

    Yi Zhao

    2012-01-01

    Full Text Available The quantum instanton (QI) approximation was recently proposed for the evaluation of chemical reaction rate constants with the use of full-dimensional potential energy surfaces. Its strategy is to use the instanton mechanism and to approximate time-dependent quantum dynamics by the imaginary-time propagation of the quantities of the partition function. It thus incorporates the properties of the instanton idea and the quantum effect of the partition function and can be applied to chemical reactions of complex systems. In this paper, we present the QI approach and its applications to several complex systems, mainly done by us. The concrete systems include: (1) the reaction H+CH4→H2+CH3; (2) the reaction H+SiH4→H2+SiH3; (3) H diffusion on the Ni(100) surface; and (4) surface-subsurface transport and interior migration for H/Ni. Available experimental and other theoretical data are also presented for the purpose of comparison.

  19. Ground-based direct-sun DOAS and airborne MAX-DOAS measurements of the collision-induced oxygen complex, O2O2, absorption with significant pressure and temperature differences

    Science.gov (United States)

    Spinei, E.; Cede, A.; Herman, J.; Mount, G. H.; Eloranta, E.; Morley, B.; Baidar, S.; Dix, B.; Ortega, I.; Koenig, T.; Volkamer, R.

    2015-02-01

    The collision-induced O2 complex, O2O2, is a very important trace gas for understanding remote sensing measurements of aerosols, cloud properties and atmospheric trace gases. Many ground-based multi-axis differential optical absorption spectroscopy (MAX-DOAS) measurements of the O2O2 optical depth require correction factors of 0.75 ± 0.1 to reproduce radiative transfer modeling (RTM) results for a nearly pure Rayleigh atmosphere. One of the potential causes of this discrepancy is uncertainty in laboratory-measured O2O2 absorption cross section temperature and pressure dependencies due to difficulties in replicating atmospheric conditions in the laboratory environment. This paper presents ground-based direct-sun (DS) and airborne multi-axis (AMAX) DOAS measurements of O2O2 absorption optical depths under actual atmospheric conditions in two wavelength regions (335-390 and 435-490 nm). DS irradiance measurements were made by the Washington State University research-grade Multi-Function Differential Spectroscopy Instrument from 2007 to 2014 at seven sites with significant pressure (778 to 1013 hPa) and O2O2 profile-weighted temperature (247 to 275 K) differences. Aircraft MAX-DOAS measurements were conducted by the University of Colorado (CU) AMAX-DOAS instrument on 29 January 2012 over the Southern Hemispheric subtropical Pacific Ocean. Scattered solar radiance spectra were collected at altitudes between 9 and 13.2 km, with O2O2 profile-weighted temperatures of 231 to 244 K and nearly pure Rayleigh scattering conditions. Due to the well-defined DS air-mass factors during ground-based measurements and extensively characterized atmospheric conditions during the aircraft AMAX-DOAS measurements, O2O2 "pseudo" absorption cross sections, σ, are derived from the observed optical depths and estimated O2O2 column densities. Vertical O2O2 columns are calculated from the atmospheric sounding temperature, pressure and specific humidity profiles. Based on the ground
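
The derivation step described above, a pseudo cross section from an observed optical depth and an estimated column density, reduces to σ = τ / N. A minimal sketch with hypothetical numbers (our illustration, not the authors' code):

```python
# Pseudo absorption cross section from a measured optical depth tau and an
# estimated O2O2 column density N: sigma = tau / N. For the collision pair
# O2O2 the column carries units of molecules^2 cm^-5, so sigma is in cm^5.

def pseudo_cross_section(optical_depth, column_density):
    """Return sigma = tau / N; rejects a non-physical column density."""
    if column_density <= 0:
        raise ValueError("column density must be positive")
    return optical_depth / column_density

# Hypothetical illustration values only
tau = 2.0e-3        # dimensionless slant optical depth
N = 1.9e43          # O2O2 column, molec^2 cm^-5
print(f"{pseudo_cross_section(tau, N):.2e}")  # 1.05e-46
```

In the paper the column N itself comes from sounding temperature, pressure and humidity profiles, which is what makes the derived σ "pseudo" rather than laboratory-measured.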

  20. Fault Diagnosis Based on Nonlinear Complexity Measure for Reciprocating Compressor

    Institute of Scientific and Technical Information of China (English)

    唐友福; 刘树林; 刘颖慧; 姜锐红

    2012-01-01

    The vibration signal of a reciprocating compressor mainly contains multi-source nonlinear impacts, so it is difficult to extract fault characteristics from the signal with traditional methods. A novel fault diagnosis approach based on a nonlinear complexity measure for reciprocating compressors is proposed. The gas valve signals of a reciprocating compressor in four different states, including normal valve sheets, gapped valve sheets, fractured valve sheets and a damaged spring, are used as the experimental data. The signals are denoised with threshold-based wavelets to reduce noise interference. The normalized Lempel-Ziv complexity (LZC) indexes are then calculated using a mean-symbolization method. The LZC characteristic interval for each state is estimated, and the effective-value, power-spectrum-energy and LZC features of the signals in each state are trained and tested with a BP artificial neural network. The results show that the LZC method can distinguish the different fault states of the reciprocating compressor gas valve accurately, supplying an effective measure for fault diagnosis and maintenance strategy for reciprocating compressors.
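
The pipeline described above, mean symbolization followed by a normalized Lempel-Ziv complexity, can be sketched as follows. This is our reconstruction under stated assumptions (binary alphabet, LZ76 parsing, c(n)·log2(n)/n normalization); the paper's exact variant may differ, and the wavelet denoising step is omitted:

```python
import math
import random

def symbolize_by_mean(signal):
    """Binary mean symbolization: '1' where a sample exceeds the mean."""
    m = sum(signal) / len(signal)
    return "".join("1" if x > m else "0" for x in signal)

def lz76_phrase_count(s):
    """Count the phrases in the Lempel-Ziv (LZ76) parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the current phrase while it already occurs in the prefix
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c

def normalized_lzc(signal):
    """Normalized LZC, c(n) * log2(n) / n: low for regular signals."""
    s = symbolize_by_mean(signal)
    n = len(s)
    return lz76_phrase_count(s) * math.log2(n) / n

# A regular (periodic) signal should score lower than an irregular one,
# which is the basis for separating valve states by their LZC intervals.
periodic = [math.sin(0.2 * t) for t in range(512)]
random.seed(0)
noisy = [random.uniform(-1.0, 1.0) for _ in range(512)]
print(normalized_lzc(periodic) < normalized_lzc(noisy))  # True
```

In the paper's scheme, each valve state occupies its own interval of this index, and the LZC value is one of the features fed to the BP neural network.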