WorldWideScience

Sample records for acid-water complexes measured

  1. Comparison of the SAWNUC model with CLOUD measurements of sulphuric acid-water nucleation.

    Science.gov (United States)

    Ehrhart, Sebastian; Ickes, Luisa; Almeida, Joao; Amorim, Antonio; Barmet, Peter; Bianchi, Federico; Dommen, Josef; Dunne, Eimear M; Duplissy, Jonathan; Franchin, Alessandro; Kangasluoma, Juha; Kirkby, Jasper; Kürten, Andreas; Kupc, Agnieszka; Lehtipalo, Katrianne; Nieminen, Tuomo; Riccobono, Francesco; Rondo, Linda; Schobesberger, Siegfried; Steiner, Gerhard; Tomé, António; Wimmer, Daniela; Baltensperger, Urs; Wagner, Paul E; Curtius, Joachim

    2016-10-27

    Binary nucleation of sulphuric acid-water particles is expected to be an important process in the free troposphere at low temperatures. SAWNUC (Sulphuric Acid Water Nucleation) is a model of binary nucleation that is based on laboratory measurements of the binding energies of sulphuric acid and water in charged and neutral clusters. Predictions of SAWNUC are compared for the first time comprehensively with experimental binary nucleation data from the CLOUD chamber at the European Organization for Nuclear Research (CERN). The experimental measurements span a temperature range of 208-292 K, sulphuric acid concentrations from 1·10⁶ to 1·10⁹ cm⁻³, and distinguish between ion-induced and neutral nucleation. Good agreement, within a factor of 5, is found between the experimental and modeled formation rates for ion-induced nucleation at 278 K and below and for neutral nucleation at 208 and 223 K. Differences at warm temperatures are attributed to ammonia contamination, which was indicated by the presence of ammonia-sulphuric acid clusters detected by an Atmospheric Pressure Interface Time of Flight (APi-TOF) mass spectrometer. APi-TOF measurements of the sulphuric acid ion cluster distributions ((H2SO4)i·HSO4⁻ with i = 0, 1, ..., 10) show qualitative agreement with the SAWNUC ion cluster distributions. Remaining differences between the measured and modeled distributions are most likely due to fragmentation in the APi-TOF. The CLOUD results are in good agreement with previously measured cluster binding energies and show the SAWNUC model to be a good representation of ion-induced and neutral binary nucleation of sulphuric acid-water clusters in the middle and upper troposphere.

  2. Comparison of the SAWNUC model with CLOUD measurements of sulphuric acid-water nucleation

    CERN Document Server

    Ehrhart, Sebastian; Almeida, Joao; Amorim, Antonio; Barmet, Peter; Bianchi, Federico; Dommen, Josef; Dunne, Eimear M; Duplissy, Jonathan; Franchin, Alessandro; Kangasluoma, Juha; Kirkby, Jasper; Kürten, Andreas; Kupc, Agnieszka; Lehtipalo, Katrianne; Nieminen, Tuomo; Riccobono, Francesco; Rondo, Linda; Schobesberger, Siegfried; Steiner, Gerhard; Tomé, António; Wimmer, Daniela; Baltensperger, Urs; Wagner, Paul E; Curtius, Joachim

    2016-01-01

    Binary nucleation of sulphuric acid-water particles is expected to be an important process in the free troposphere at low temperatures. SAWNUC (Sulphuric Acid Water Nucleation) is a model of binary nucleation that is based on laboratory measurements of the binding energies of sulphuric acid and water in charged and neutral clusters. Predictions of SAWNUC are compared for the first time comprehensively with experimental binary nucleation data from the CLOUD chamber at the European Organization for Nuclear Research (CERN). The experimental measurements span a temperature range of 208–292 K, sulphuric acid concentrations from 1·10⁶ to 1·10⁹ cm⁻³, and distinguish between ion-induced and neutral nucleation. Good agreement, within a factor of 5, is found between the experimental and modeled formation rates for ion-induced nucleation at 278 K and below and for neutral nucleation at 208 and 223 K. Differences at warm temperatures are attributed to ammonia contamination, which was indicated by the presence of ammonia-sulphu...

  3. Kinetic stability of the dysprosium(III) complex with tetraazaporphine in acetic acid-water and acetic acid-methanol mixtures

    International Nuclear Information System (INIS)

    Khelevina, O.G.; Vojnov, A.A.

    1999-01-01

    Water-soluble dysprosium tetraazaporphine with an acetylacetonate ion as an extra ligand is synthesized for the first time. Its kinetic stability in acetic acid solutions is investigated. It is shown that the complex dissociates with the formation of free tetraazaporphine. Kinetic parameters of the dissociation reaction are determined.

  4. Simulations with complex measure

    International Nuclear Information System (INIS)

    Markham, J.K.; Kieu, T.D.

    1997-01-01

    A method is proposed to handle the sign problem in the simulation of systems having indefinite or complex-valued measures. In general, this new approach, which is based on renormalisation blocking, is shown to yield statistical errors smaller than those of the crude Monte Carlo method using absolute values of the original measures. The improved method is applied to the 2D Ising model with temperature generalised to take on complex values. It is also adapted to implement Monte Carlo Renormalisation Group calculations of the magnetic and thermal critical exponents. 10 refs., 4 tabs., 7 figs

  5. On convex complexity measures

    Czech Academy of Sciences Publication Activity Database

    Hrubeš, P.; Jukna, S.; Kulikov, A.; Pudlák, Pavel

    2010-01-01

    Roč. 411, 16-18 (2010), s. 1842-1854 ISSN 0304-3975 R&D Projects: GA AV ČR IAA1019401 Institutional research plan: CEZ:AV0Z10190503 Keywords : boolean formula * complexity measure * combinatorial rectangle * convexity Subject RIV: BA - General Mathematics Impact factor: 0.838, year: 2010 http://www.sciencedirect.com/science/article/pii/S0304397510000885

  6. Complexity measures of music

    Science.gov (United States)

    Pease, April; Mahmoodi, Korosh; West, Bruce J.

    2018-03-01

    We present a technique to search for the presence of crucial events in music, based on the analysis of the music volume. Earlier work on this issue was based on the assumption that crucial events correspond to the change of music notes, with the interesting result that the complexity index of the crucial events is μ ~ 2, which is the same inverse power-law index of the dynamics of the brain. The search technique analyzes music volume and confirms the results of the earlier work, thereby contributing to the explanation as to why the brain is sensitive to music, through the phenomenon of complexity matching. Complexity matching has recently been interpreted as the transfer of multifractality from one complex network to another. For this reason we also examine the multifractality of music, with the observation that the multifractal spectrum of a computer performance is significantly narrower than the multifractal spectrum of a human performance of the same musical score. We conjecture that although crucial events are demonstrably important for information transmission, they alone are not sufficient to define musicality, which is more adequately measured by the multifractality spectrum.

  7. Measurement of complex surfaces

    International Nuclear Information System (INIS)

    Brown, G.M.

    1993-05-01

    Several of the components used in coil fabrication involve complex surfaces and dimensions that are not well suited to measurement using conventional dimensional measuring equipment. Some relatively simple techniques that are in use in the SSCL Magnet Systems Division (MSD) for incoming inspection will be described, with discussion of their suitability for specific applications. Components that are submitted for MSD Quality Assurance (QA) dimensional inspection may be divided into two distinct categories: the first involves components for which there is an approved drawing and for which all nominal dimensions are known; the second involves parts for which 'reverse engineering' is required, where the part is available but there are no drawings or dimensions. This second category typically occurs during development of coil end parts and coil turn filler parts, where it is necessary to manually shape the part and then measure it to develop the information required to prepare a drawing for the part.

  8. Measuring Complexity of SAP Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-10-01

    The paper discusses the reasons for the rise in complexity of the ERP system SAP R/3 and proposes a method for measuring the complexity of SAP. Based on this method, a computer program in ABAP for measuring the complexity of a particular SAP implementation is proposed as a tool for keeping ERP complexity under control. The main principle of the measurement method is counting the number of items or relations in the system. The proposed computer program is based on counting records in organization tables in SAP.

  9. Hierarchy Measure for Complex Networks

    Science.gov (United States)

    Mones, Enys; Vicsek, Lilla; Vicsek, Tamás

    2012-01-01

    Nature, technology and society are full of complexity arising from the intricate web of the interactions among the units of the related systems (e.g., proteins, computers, people). Consequently, one of the most successful recent approaches to capturing the fundamental features of the structure and dynamics of complex systems has been the investigation of the networks associated with the above units (nodes) together with their relations (edges). Most complex systems have an inherently hierarchical organization and, correspondingly, the networks behind them also exhibit hierarchical features. Indeed, several papers have been devoted to describing this essential aspect of networks, however, without resulting in a widely accepted, converging concept concerning the quantitative characterization of the level of their hierarchy. Here we develop an approach and propose a quantity (measure) which is simple enough to be widely applicable, reveals a number of universal features of the organization of real-world networks and, as we demonstrate, is capable of capturing the essential features of the structure and the degree of hierarchy in a complex network. The measure we introduce is based on a generalization of the m-reach centrality, which we first extend to directed/partially directed graphs. Then, we define the global reaching centrality (GRC), which is the difference between the maximum and the average value of the generalized reach centralities over the network. We investigate the behavior of the GRC considering both a synthetic model with an adjustable level of hierarchy and real networks. Results for real networks show that our hierarchy measure is related to the controllability of the given system. We also propose a visualization procedure for large complex networks that can be used to obtain an overall qualitative picture about the nature of their hierarchical structure. PMID:22470477
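
The global reaching centrality described in this record is simple enough to implement directly. A minimal Python sketch, assuming an unweighted directed graph stored as an adjacency list (the function and variable names are illustrative, not from the paper):

```python
from collections import deque

def reach_fraction(adj, start):
    # C_R(start): fraction of the other nodes reachable from `start`
    # along directed edges (computed by breadth-first search)
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return (len(seen) - 1) / (len(adj) - 1)

def global_reaching_centrality(adj):
    # GRC: average gap between the best-reaching node and every other node
    cr = [reach_fraction(adj, u) for u in adj]
    cmax = max(cr)
    return sum(cmax - c for c in cr) / (len(adj) - 1)

star = {0: [1, 2, 3], 1: [], 2: [], 3: []}   # perfect out-hierarchy
cycle = {0: [1], 1: [2], 2: [0]}             # no hierarchy at all
print(global_reaching_centrality(star))   # 1.0
print(global_reaching_centrality(cycle))  # 0.0
```

A perfectly hierarchical out-tree yields GRC = 1, while a directed cycle, in which every node reaches every other, yields 0, matching the adjustable-hierarchy behaviour the abstract describes.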

  10. Measuring distances between complex networks

    International Nuclear Information System (INIS)

    Andrade, Roberto F.S.; Miranda, Jose G.V.; Pinho, Suani T.R.; Lobao, Thierry Petit

    2008-01-01

    A previously introduced concept of higher order neighborhoods in complex networks [R.F.S. Andrade, J.G.V. Miranda, T.P. Lobao, Phys. Rev. E 73 (2006) 046101] is used to define a distance between networks with the same number of nodes. With such a measure, expressed in terms of the matrix elements of the neighborhood matrices of each network, it is possible to compare, in a quantitative way, how far apart in the space of neighborhood matrices two networks are. The distance between these matrices depends on both the network topologies and the adopted node numberings. While the numbering of one network is fixed, a Monte Carlo algorithm is used to find the best numbering of the other network, in the sense that it minimizes the distance between the matrices. The minimal value found for the distance reflects differences in the neighborhood structures of the two networks that arise only from distinct topologies. This procedure ends up providing a projection of the first network on the pattern of the second one. Examples are worked out allowing for a quantitative comparison of distances among distinct networks, as well as among distinct realizations of random networks.
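
The distance can be sketched for a fixed node numbering, i.e. omitting the Monte Carlo renumbering step the abstract describes. In this illustrative Python sketch the neighborhood matrix is taken to be the matrix of BFS shortest-path distances (the neighborhood order of each node pair), and the network distance is the sum of squared element-wise differences; these are assumptions in the spirit of the abstract, not the authors' exact definitions:

```python
from collections import deque

def neighborhood_matrix(adj):
    # M[i][j] = neighborhood order of the pair (i, j): the BFS
    # shortest-path distance from i to j (0 on the diagonal)
    n = len(adj)
    M = [[0] * n for _ in range(n)]
    for s in range(n):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for j, d in dist.items():
            M[s][j] = d
    return M

def network_distance(adj_a, adj_b):
    # squared-difference distance between neighborhood matrices of two
    # networks sharing the same node numbering
    A, B = neighborhood_matrix(adj_a), neighborhood_matrix(adj_b)
    n = len(A)
    return sum((A[i][j] - B[i][j]) ** 2 for i in range(n) for j in range(n))

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}            # path 0-1-2-3
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}      # 4-cycle
print(network_distance(path, path))  # 0
print(network_distance(path, ring))  # 8: only the (0,3) neighborhood order differs
```

The two example graphs differ only in the extra 0-3 edge, so the matrices disagree in just the two symmetric entries for that pair.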

  11. On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy

    Directory of Open Access Journals (Sweden)

    Mikołaj Morzy

    2017-01-01

    One of the most popular methods of estimating the complexity of networks is to measure the entropy of network invariants, such as adjacency matrices or degree sequences. Unfortunately, entropy and all entropy-based information-theoretic measures have several vulnerabilities. These measures are neither independent of a particular representation of the network nor able to capture the properties of the generative process that produces the network. Instead, we advocate the use of the algorithmic entropy as the basis for a complexity definition for networks. Algorithmic entropy (also known as Kolmogorov complexity, or K-complexity for short) evaluates the complexity of the description required for a lossless recreation of the network. This measure is not affected by a particular choice of network features and it does not depend on the method of network representation. We perform experiments on Shannon entropy and K-complexity for gradually evolving networks. The results of these experiments point to K-complexity as the more robust and reliable measure of network complexity. The original contribution of the paper includes the introduction of several new entropy-deceiving networks and the empirical comparison of entropy and K-complexity as fundamental quantities for constructing complexity measures for networks.
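
The contrast the record draws can be reproduced in miniature: Shannon entropy of a degree sequence versus compressed size as a stand-in for K-complexity. Kolmogorov complexity is uncomputable, so the zlib-compressed length below is only a crude upper-bound proxy, and the whole snippet is an illustrative sketch rather than the authors' method:

```python
import math
import random
import zlib

def degree_entropy(adj_rows):
    # Shannon entropy (bits) of the degree distribution: a typical
    # network-invariant, representation-dependent entropy measure
    degs = [sum(row) for row in adj_rows]
    n = len(degs)
    probs = [degs.count(d) / n for d in set(degs)]
    return -sum(p * math.log2(p) for p in probs)

def k_complexity_proxy(adj_rows):
    # compressed size of the flattened adjacency matrix: a crude
    # upper-bound stand-in for Kolmogorov complexity
    bits = "".join(str(b) for row in adj_rows for b in row)
    return len(zlib.compress(bits.encode()))

n = 64
# ring lattice: every node linked to its two neighbours -- highly regular
ring = [[1 if (j - i) % n in (1, n - 1) else 0 for j in range(n)] for i in range(n)]
random.seed(1)
rand = [[random.randint(0, 1) for _ in range(n)] for _ in range(n)]

print(degree_entropy(ring) == 0.0)                          # True: all degrees equal
print(k_complexity_proxy(ring) < k_complexity_proxy(rand))  # True: the regular graph compresses far better
```

The regular ring lattice has zero degree entropy and also compresses well; for "entropy-deceiving" constructions, the two quantities can disagree, which is the paper's point.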

  12. Measurement methods on the complexity of network

    Institute of Scientific and Technical Information of China (English)

    LIN Lin; DING Gang; CHEN Guo-song

    2010-01-01

    Based on the size of the network and the number of paths in the network, we proposed a model of the topology complexity of a network, to measure the topology complexity of the network. Based on analyses of the effects of the number of pieces of equipment, the types of equipment, and the processing time of each node on the complexity of an equipment-constrained network, a complexity model of the equipment-constrained network was constructed to measure its integrated complexity. Algorithms for the two models were also developed. An automatic generator of random single-label networks was developed to test the models. The results show that the models can correctly evaluate the topology complexity and the integrated complexity of the networks.

  13. Minimal classical communication and measurement complexity for ...

    Indian Academy of Sciences (India)

    Minimal classical communication and measurement complexity for quantum ... Entanglement; teleportation; secret sharing; information splitting. ... Ahmedabad 380 009, India; Birla Institute of Technology and Science, Pilani 333 031, India ...

  14. A SVD Based Image Complexity Measure

    DEFF Research Database (Denmark)

    Gustafsson, David Karl John; Pedersen, Kim Steenstrup; Nielsen, Mads

    2009-01-01

    Images are composed of geometric structures and texture, and different image processing tools - such as denoising, segmentation and registration - are suitable for different types of image contents. Characterization of the image content in terms of geometric structure and texture is an important problem that one is often faced with. We propose a patch based complexity measure, based on how well the patch can be approximated using singular value decomposition. As such the image complexity is determined by the complexity of the patches. The concept is demonstrated on sequences from the newly collected DIKU Multi-Scale image database.
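
The idea of scoring a patch by how well a low-rank approximation captures it can be sketched without the paper's exact formula: below, a patch is scored by the share of its energy (squared Frobenius norm) that the best rank-1 approximation fails to explain, so flat or simple geometric patches score near 0 and texture-like patches score high. The rank-1 baseline and the function names are assumptions; the top singular value is estimated by plain power iteration to keep the sketch dependency-free:

```python
import math
import random

def top_singular_value(A, iters=200):
    # power iteration on A^T A: estimates the largest singular value of A
    m, n = len(A), len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]  # w = A v
        u = [sum(A[i][j] * w[i] for i in range(m)) for j in range(n)]  # u = A^T w
        norm = math.sqrt(sum(x * x for x in u))
        v = [x / norm for x in u]
    w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
    return math.sqrt(sum(x * x for x in w))

def patch_complexity(A):
    # share of the patch's energy NOT explained by the best rank-1
    # approximation (note: ||A||_F^2 is the sum of squared singular values)
    energy = sum(x * x for row in A for x in row)
    s1 = top_singular_value(A)
    return 1.0 - (s1 * s1) / energy

flat = [[1.0] * 8 for _ in range(8)]  # constant patch: exactly rank 1
random.seed(0)
texture = [[random.gauss(0, 1) for _ in range(8)] for _ in range(8)]  # noise-like patch

print(abs(patch_complexity(flat)) < 1e-9)                  # True
print(patch_complexity(flat) < patch_complexity(texture))  # True
```

Applying such a score per patch and aggregating over the image gives a complexity map in the spirit of the abstract.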

  15. Measuring Customer Profitability in Complex Environments

    DEFF Research Database (Denmark)

    Holm, Morten; Kumar, V.; Rohde, Carsten

    2012-01-01

    Customer profitability measurement is an important element in customer relationship management and a lever for enhanced marketing accountability. Two distinct measurement approaches have emerged in the marketing literature: Customer Lifetime Value (CLV) and Customer Profitability Analysis (CPA). It is argued that the degree of sophistication deployed when implementing customer profitability measurement models is determined by the type of complexity encountered in firms’ customer environments. This gives rise to a contingency framework for customer profitability measurement model selection and five research propositions. Additionally, the framework provides design and implementation guidance for managers seeking to implement customer profitability measurement models for resource allocation purposes.

  16. Wind turbine wake measurement in complex terrain

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.; Menke, Robert

    2016-01-01

    SCADA data from a wind farm and high frequency time series measurements obtained with remote scanning systems have been analysed with focus on identification of wind turbine wake properties in complex terrain. The analysis indicates that within the flow regime characterized by medium to large dow...

  17. Measuring complexity in Brazilian economic crises.

    Directory of Open Access Journals (Sweden)

    Letícia P D Mortoza

    Capital flows strongly influence macroeconomic parameters such as foreign exchange rates and stock prices. In volatile economies, capital flows can change due to several types of social, political and economic events, provoking oscillations in these parameters, which are recognized as economic crises. This work aims to investigate how these two macroeconomic variables are related to crisis events by using the traditional complexity measures due to Lopez-Mancini-Calbet (LMC) and to Shiner-Davison-Landsberg (SDL), which can be applied to any temporal series. Here, Ibovespa (the Bovespa Stock Exchange main index) and the "dollar-real" parity are the background for calculating the LMC and SDL complexity measures. By analyzing the temporal evolution of these measures, it is shown that they might be related to important events that occurred in the Brazilian economy.

  18. Measuring complexity in Brazilian economic crises.

    Science.gov (United States)

    Mortoza, Letícia P D; Piqueira, José R C

    2017-01-01

    Capital flows strongly influence macroeconomic parameters such as foreign exchange rates and stock prices. In volatile economies, capital flows can change due to several types of social, political and economic events, provoking oscillations in these parameters, which are recognized as economic crises. This work aims to investigate how these two macroeconomic variables are related to crisis events by using the traditional complexity measures due to Lopez-Mancini-Calbet (LMC) and to Shiner-Davison-Landsberg (SDL), which can be applied to any temporal series. Here, Ibovespa (the Bovespa Stock Exchange main index) and the "dollar-real" parity are the background for calculating the LMC and SDL complexity measures. By analyzing the temporal evolution of these measures, it is shown that they might be related to important events that occurred in the Brazilian economy.
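
The LMC measure used in this record has a compact definition: C = H · D, the normalized Shannon entropy H multiplied by the disequilibrium D, the squared distance of the observed symbol distribution from the uniform one. A minimal sketch on an already-symbolized series (the binning of real Ibovespa or exchange-rate data into symbols is not shown, and the function name is an assumption):

```python
import math
from collections import Counter

def lmc_complexity(symbols):
    # LMC statistical complexity C = H * D: normalized Shannon entropy H
    # times the disequilibrium D. C vanishes both for perfect order
    # (H = 0) and for full randomness (D = 0).
    counts = Counter(symbols)
    n = len(counts)
    total = len(symbols)
    probs = [c / total for c in counts.values()]
    H = -sum(p * math.log(p) for p in probs) / math.log(n) if n > 1 else 0.0
    D = sum((p - 1.0 / n) ** 2 for p in probs)
    return H * D

print(lmc_complexity("aaaaaaaa"))      # 0.0: fully ordered (H = 0)
print(lmc_complexity("abababab"))      # 0.0: both symbols equiprobable (D = 0)
print(lmc_complexity("aababbbb") > 0)  # True: intermediate order
```

The measure peaks between the two extremes, which is why it is useful for flagging periods when a market departs from both calm regularity and featureless noise.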

  19. Mesoscale meteorological measurements characterizing complex flows

    International Nuclear Information System (INIS)

    Hubbe, J.M.; Allwine, K.J.

    1993-09-01

    Meteorological measurements are an integral and essential component of any emergency response system for addressing accidental releases from nuclear facilities. An important element of the US Department of Energy's (DOE's) Atmospheric Studies in Complex Terrain (ASCOT) program is the refinement and use of state-of-the-art meteorological instrumentation. ASCOT is currently making use of ground-based remote wind sensing instruments such as doppler acoustic sounders (sodars). These instruments are capable of continuously and reliably measuring winds up to several hundred meters above the ground, unattended. Two sodars are currently measuring the winds, as part of ASCOT's Front Range Study, in the vicinity of DOE's Rocky Flats Plant (RFP) near Boulder, Colorado. A brief description of ASCOT's ongoing Front Range Study is given followed by a case study analysis that demonstrates the utility of the meteorological measurement equipment and the complexity of flow phenomena that are experienced near RFP. These complex flow phenomena can significantly influence the transport of the released material and consequently need to be identified for accurate assessments of the consequences of a release

  20. Measuring situation awareness in complex systems: Comparison of measures study

    OpenAIRE

    Salmon, PM; Stanton, NA; Walker, GH; Jenkins, DP; Ladva, D; Rafferty, L; Young, MS

    2008-01-01

    Situation Awareness (SA) is a distinct critical commodity for teams working in complex industrial systems and its measurement is a key provision in system, procedural and training design efforts. This article describes a study that was undertaken in order to compare three different SA measures (a freeze probe recall approach, a post trial subjective rating approach and a critical incident interview technique) when used to assess participant SA during a military planning task. The results indi...

  1. The step complexity measure for emergency operating procedures: measure verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Ha, Jaejoo; Park, Changkue

    2002-01-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. Therefore, to prevent an occurrence of accidents or to ensure system safety, extensive effort has been made to identify significant factors that can cause human errors. According to related studies, written manuals or operating procedures are revealed as one of the most important factors, and the understandability is pointed out as one of the major reasons for procedure-related human errors. Many qualitative checklists are suggested to evaluate emergency operating procedures (EOPs) of NPPs. However, since qualitative evaluations using checklists have some drawbacks, a quantitative measure that can quantify the complexity of EOPs is very necessary to compensate for them. In order to quantify the complexity of steps included in EOPs, Park et al. suggested the step complexity (SC) measure. In addition, to ascertain the appropriateness of the SC measure, averaged step performance time data obtained from emergency training records for the loss of coolant accident and the excess steam dump event were compared with estimated SC scores. Although averaged step performance time data show good correlation with estimated SC scores, conclusions for some important issues that have to be clarified to ensure the appropriateness of the SC measure were not properly drawn because of a lack of backup data. In this paper, to clarify remaining issues, additional activities to verify the appropriateness of the SC measure are performed using averaged step performance time data obtained from emergency training records. The total number of available records is 36, and training scenarios are the steam generator tube rupture and the loss of all feedwater. The number of scenarios is 18 each. From these emergency training records, averaged step performance time data for 30 steps are retrieved. As a result, the SC measure shows statistically meaningful

  2. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing solutions for complex event processing (CEP) platforms. CEP platforms generally serve to process and/or predict high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed method of decision making. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics into consideration: processing in real time, the possibility of processing high-volume data from multiple sources, platform independence, the possibility of integration with a user solution, and an open license. At first we discuss existing CEP tools and their specific uses in practice. Then we describe the design of a method for the formalization of business rules used for decision making. Afterwards, we focus on two platforms which seem to be the best fit for integration of our solution and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated method for decision support.

  3. Measure of robustness for complex networks

    Science.gov (United States)

    Youssef, Mina Nabil

    Critical infrastructures are repeatedly attacked by external triggers causing tremendous amounts of damage. Any infrastructure can be studied using the powerful theory of complex networks. A complex network is composed of extremely large number of different elements that exchange commodities providing significant services. The main functions of complex networks can be damaged by different types of attacks and failures that degrade the network performance. These attacks and failures are considered as disturbing dynamics, such as the spread of viruses in computer networks, the spread of epidemics in social networks, and the cascading failures in power grids. Depending on the network structure and the attack strength, every network differently suffers damages and performance degradation. Hence, quantifying the robustness of complex networks becomes an essential task. In this dissertation, new metrics are introduced to measure the robustness of technological and social networks with respect to the spread of epidemics, and the robustness of power grids with respect to cascading failures. First, we introduce a new metric called the Viral Conductance (VCSIS) to assess the robustness of networks with respect to the spread of epidemics that are modeled through the susceptible/infected/susceptible (SIS) epidemic approach. In contrast to assessing the robustness of networks based on a classical metric, the epidemic threshold, the new metric integrates the fraction of infected nodes at steady state for all possible effective infection strengths. Through examples, VCSIS provides more insights about the robustness of networks than the epidemic threshold. In addition, both the paradoxical robustness of Barabasi-Albert preferential attachment networks and the effect of the topology on the steady state infection are studied, to show the importance of quantifying the robustness of networks. Second, a new metric VCSIR is introduced to assess the robustness of networks with respect

  4. Thermodynamic properties of citric acid and the system citric acid-water

    NARCIS (Netherlands)

    Kruif, C.G. de; Miltenburg, J.C. van; Sprenkels, A.J.J.; Stevens, G.; Graaf, W. de; Wit, H.G.M. de

    1982-01-01

    The binary system citric acid-water has been investigated with static vapour pressure measurements, adiabatic calorimetry, solution calorimetry, solubility measurements and powder X-ray measurements. The data are correlated by thermodynamics and a large part of the phase diagram is given. Molar heat

  5. Treatments of acid waters; Tratamientos pasivos de aguas acidas

    Energy Technology Data Exchange (ETDEWEB)

    Delgado Fernandez, J. L.

    2000-07-01

    The exploitation of coal mining sites produces acid effluents, known as acid waters, due to the oxidation of the sulphurous minerals contained in the rocks. Pyritic materials, pyrites and sulphates are associated with acid waters; in the presence of water, oxygen and certain bacteria (mainly Thiobacillus ferrooxidans), they are oxidized by means of a chemical reaction, yielding different products. (Author)

  6. Complex Fuzzy Set-Valued Complex Fuzzy Measures and Their Properties

    Science.gov (United States)

    Ma, Shengquan; Li, Shenggang

    2014-01-01

    Let F*(K) be the set of all fuzzy complex numbers. In this paper some classical and measure-theoretical notions are extended to the case of complex fuzzy sets. They are fuzzy complex number-valued distance on F*(K), fuzzy complex number-valued measure on F*(K), and some related notions, such as null-additivity, pseudo-null-additivity, null-subtraction, pseudo-null-subtraction, autocontinuity from above, autocontinuity from below, and autocontinuity of the defined fuzzy complex number-valued measures. Properties of fuzzy complex number-valued measures are studied in detail. PMID:25093202

  7. Measuring Syntactic Complexity in Spontaneous Spoken Swedish

    Science.gov (United States)

    Roll, Mikael; Frid, Johan; Horne, Merle

    2007-01-01

    Hesitation disfluencies after phonetically prominent stranded function words are thought to reflect the cognitive coding of complex structures. Speech fragments following the Swedish function word "att" "that" were analyzed syntactically, and divided into two groups: one with "att" in disfluent contexts, and the other with "att" in fluent…

  8. Techniques to measure complex-plane fields

    CSIR Research Space (South Africa)

    Dudley, Angela L

    2014-09-25

    In this work we construct coherent superpositions of Gaussian and vortex modes which can be described as occupying the complex plane. We demonstrate how these fields can be experimentally constructed in a digital, controllable manner with a spatial...

  9. Weak convergence to isotropic complex symmetric α-stable random measure.

    Science.gov (United States)

    Wang, Jun; Li, Yunmeng; Sang, Liheng

    2017-01-01

    In this paper, we prove that an isotropic complex symmetric α-stable random measure can be approximated by a complex process constructed from integrals based on a Poisson process with random intensity.

  10. Clinical Complexity in Medicine: A Measurement Model of Task and Patient Complexity.

    Science.gov (United States)

    Islam, R; Weir, C; Del Fiol, G

    2016-01-01

    Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Three clinical infectious disease teams were observed, audio-recorded and transcribed. Each team included an infectious diseases expert, one infectious diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial set of coding processes and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds from the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare.

  11. Complexity analysis in particulate matter measurements

    Directory of Open Access Journals (Sweden)

    Luciano Telesca

    2011-09-01

    We investigated the complex temporal fluctuations of particulate matter data recorded in the London area using the Fisher-Shannon (FS) information plane. In the FS plane, the PM10 and PM2.5 data aggregate into two different clusters, characterized by different degrees of order and organization. These results could be related to different sources of the particulate matter.
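The Fisher-Shannon plane locates a signal by two quantities estimated from its distribution: the Fisher Information Measure (sensitive to local order) and the Shannon entropy power (global disorder). A rough histogram-based sketch; the bin count and discrete formulas are common textbook choices, not necessarily those used in the study:

```python
import math

def fisher_shannon(values, bins=10):
    """Histogram estimates of the Fisher Information Measure (FIM)
    and the Shannon entropy power N_X for a scalar series."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    p = [c / len(values) for c in counts]
    # FIM: squared gradient of the discrete density (local order).
    fim = sum((p[i + 1] - p[i]) ** 2 / p[i]
              for i in range(bins - 1) if p[i] > 0)
    # Entropy power: exp(2H) / (2*pi*e), from the Shannon entropy H.
    h = -sum(q * math.log(q) for q in p if q > 0)
    n_x = math.exp(2 * h) / (2 * math.pi * math.e)
    return fim, n_x

# A flat (uniform) series has zero FIM and maximal entropy power.
print(fisher_shannon(list(range(100))))
```

Signals from different sources then fall in different regions of the (FIM, N_X) plane.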

  12. Complex technique for materials hardness measurement

    Energy Technology Data Exchange (ETDEWEB)

    Krashchenko, V P; Oksametnaya, O B

    1984-01-01

    A review of existing national and foreign methods for measuring material hardness has been made. The need to improve hardness-measurement techniques over a wide temperature range is noted, including ensuring controlled load change during indentation, continuity of imprint application, smooth variation of temperature along the sample length, and control of the deformation rate.

  13. Unraveling chaotic attractors by complex networks and measurements of stock market complexity.

    Science.gov (United States)

    Cao, Hongduo; Li, Ying

    2014-03-01

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a similar meaning to the Kolmogorov complexity (calculated from the Lempel-Ziv complexity) and is an appropriate measure of a series' complexity. The proposed method is used to investigate the complexity of the world's major capital markets. None of these markets is completely random, and they have different degrees of complexity, both over the entire length of their time series and at finer levels of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price memory process.
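The Kolmogorov-style estimate mentioned above is conventionally obtained from the Lempel-Ziv (LZ76) phrase count of a symbolized series: the number of new phrases found while scanning the string. A minimal sketch of that counting step (the paper's network-based index R itself is more involved):

```python
def lempel_ziv_complexity(s):
    """Number of distinct phrases in the LZ76 parsing of a string:
    the counting step behind Lempel-Ziv complexity estimates."""
    i, c, n = 0, 0, len(s)
    while i < n:
        length = 1
        # Grow the phrase until it no longer occurs in the prefix before it.
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        c += 1
        i += length
    return c

# Classic example: parses as 0 | 001 | 10 | 100 | 1000 | 101
print(lempel_ziv_complexity("0001101001000101"))  # 6
```

A returns series would first be coded as a binary string (e.g. up/down moves) before applying the count.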

  14. Unraveling chaotic attractors by complex networks and measurements of stock market complexity

    International Nuclear Information System (INIS)

    Cao, Hongduo; Li, Ying

    2014-01-01

    We present a novel method for measuring the complexity of a time series by unraveling a chaotic attractor modeled on complex networks. The complexity index R, which can potentially be exploited for prediction, has a similar meaning to the Kolmogorov complexity (calculated from the Lempel-Ziv complexity) and is an appropriate measure of a series' complexity. The proposed method is used to investigate the complexity of the world's major capital markets. None of these markets is completely random, and they have different degrees of complexity, both over the entire length of their time series and at finer levels of detail. However, developing markets differ significantly from mature markets. Specifically, the complexity of mature stock markets is stronger and more stable over time, whereas developing markets exhibit relatively low and unstable complexity over certain time periods, implying a stronger long-term price memory process.

  15. Measuring the 'complexity' of sound

    Indian Academy of Sciences (India)

    Sounds in the natural environment form an important class of biologically relevant nonstationary signals. We propose a dynamic spectral measure to characterize the spectral dynamics of such non-stationary sound signals and classify them based on rate of change of spectral dynamics. We categorize sounds with slowly ...

  16. On the complexity of computing two nonlinearity measures

    DEFF Research Database (Denmark)

    Find, Magnus Gausdal

    2014-01-01

    We study the computational complexity of two Boolean nonlinearity measures: the nonlinearity and the multiplicative complexity. We show that if one-way functions exist, no algorithm can compute the multiplicative complexity in time 2^O(n) given the truth table of length 2^n; in fact, under the same ...

  17. A Game Map Complexity Measure Based on Hamming Distance

    Science.gov (United States)

    Li, Yan; Su, Pan; Li, Wenliang

    With the booming PC game market, game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in its scenarios. Moreover, path-finding efficiency in a game is also affected by the complexity of the map used. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure is able to estimate the complexity of a binary tileworld. We experimentally demonstrate that Hamming complexity is strongly correlated with the efficiency of the A* algorithm, and it is therefore a useful reference for designers when developing a game map.
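As an illustration of a Hamming-distance map statistic, the sketch below averages the Hamming distance between adjacent rows of a binary tile map. The paper's exact Hamming complexity definition may differ, so treat this as an assumption:

```python
def hamming(row_a, row_b):
    """Number of positions at which two equal-length rows differ."""
    return sum(a != b for a, b in zip(row_a, row_b))

def map_hamming_complexity(grid):
    """Mean Hamming distance between adjacent rows of a binary grid:
    0 for a uniform map, larger for more fragmented maps."""
    pairs = list(zip(grid, grid[1:]))
    return sum(hamming(a, b) for a, b in pairs) / len(pairs)

open_map = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]  # no obstacles
maze_map = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # alternating walls
print(map_hamming_complexity(open_map))  # 0.0
print(map_hamming_complexity(maze_map))  # 3.0
```

Intuitively, a fragmented map forces A* to expand more nodes, which is the correlation the paper reports.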

  18. Cognitive Agility Measurement in a Complex Environment

    Science.gov (United States)

    2017-04-01

    An experiment using psychological tests and a military decision computer game called Make Goal attempted to measure cognitive agility in military leaders, validated by the corresponding psychological tests in the experiment. The work was carried out by NPS thesis students; this document discusses the experimental design and the results from one of those theses. Subject terms: cognitive agility.

  19. Solving Complex Problems: A Convergent Approach to Cognitive Load Measurement

    Science.gov (United States)

    Zheng, Robert; Cook, Anne

    2012-01-01

    The study challenged current practices in cognitive load measurement involving complex problem solving by manipulating the presence of pictures in multiple rule-based problem-solving situations and examining the cognitive load resulting from both offline and online measures associated with complex problem solving. Forty-eight participants…

  20. Measurement complexity of adherence to medication

    Directory of Open Access Journals (Sweden)

    Galato D

    2012-04-01

    Dayani Galato, Fabiana Schuelter-Trevisol, Anna Paula Piovezan, Master Program in Health Sciences, University of Southern Santa Catarina (Unisul), Tubarão, Santa Catarina, Brazil. Adherence to pharmacologic therapy is a major challenge for the rational use of medicines, particularly when it comes to antiretroviral drugs that require adherence to at least 95% of prescribed doses.1 Studies in this area are always important and contribute to the understanding of medication adherence, even though there is no reference test for measuring it. Recently, an article was published in this journal that proposes the determination of lamivudine plasma concentration to validate patient self-reported adherence to antiretroviral treatment.2 In that study, serum levels obtained 3 hours after ingestion of the last dose of the drug were compared with patient reports that were classified into different levels of adherence, based on their recall of missed doses in the previous 7 days. It was hypothesized by the authors that the use of a biological marker for drug adherence was extremely important, given the relevance of the topic. However, we would like to draw attention to some points that may determine the success of similar methods for this purpose. The formation of groups with similar anthropometric characteristics is relevant, since the dose of lamivudine may have to be changed depending, for example, on sex, weight, and age.3 Even information considered important by the authors of that study was not provided. There is a need for greater clarity on the eligibility criteria, especially with regard to the clinical stage of the disease, CD4 counts and viral load, associated diseases and comorbidity, as well as the evaluation of kidney function and other medications used that can affect lamivudine pharmacokinetics.3 View original paper by Minzi and colleagues.

  1. Laser beam complex amplitude measurement by phase diversity.

    Science.gov (United States)

    Védrenne, Nicolas; Mugnier, Laurent M; Michau, Vincent; Velluet, Marie-Thérèse; Bierent, Rudolph

    2014-02-24

    The control of the optical quality of a laser beam requires a complex amplitude measurement able to deal with strong modulus variations and potentially highly perturbed wavefronts. The method proposed here is an extension of phase diversity to complex amplitude measurements that is effective for highly perturbed beams. Named CAMELOT (Complex Amplitude MEasurement by a Likelihood Optimization Tool), it relies on the acquisition and processing of a few images of the beam section taken along the optical path. The complex amplitude of the beam is retrieved from the images by minimization of a maximum a posteriori error metric between the images and a model of the beam propagation. The analytical formalism of the method and its experimental validation are presented. The modulus of the beam is compared to a measurement of the beam profile, and the phase of the beam is compared to a conventional phase diversity estimate. The precision of the experimental measurements is investigated by numerical simulations.

  2. Measuring the Complexity of Urban Form and Design

    OpenAIRE

    Boeing, Geoff

    2017-01-01

    Complex systems have become a popular lens for conceptualizing cities, and complexity has substantial implications for urban performance and resilience. This paper develops a typology of methods and measures for assessing the complexity of the built form at the scale of urban design. It extends quantitative methods from urban planning, network science, ecosystems studies, fractal geometry, and information theory to the physical urban form and the analysis of qualitative human experience. Metr...

  3. Measurement of complex permittivity of composite materials using waveguide method

    NARCIS (Netherlands)

    Tereshchenko, O.V.; Buesink, Frederik Johannes Karel; Leferink, Frank Bernardus Johannes

    2011-01-01

    The complex dielectric permittivity of four different composite materials has been measured using the transmission-line method. A waveguide fixture in L, S, C and X band was used for the measurements. Measurement accuracy is influenced by air gaps between the test fixtures and the materials tested. One of the

  4. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-01-01

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a

  5. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    Science.gov (United States)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  6. Confidence bounds of recurrence-based complexity measures

    International Nuclear Information System (INIS)

    Schinkel, Stefan; Marwan, N.; Dimigen, O.; Kurths, J.

    2009-01-01

    In the recent past, recurrence quantification analysis (RQA) has gained an increasing interest in various research areas. The complexity measures the RQA provides have been useful in describing and analysing a broad range of data. It is known to be rather robust to noise and nonstationarities. Yet, one key question in empirical research concerns the confidence bounds of measured data. In the present Letter we suggest a method for estimating the confidence bounds of recurrence-based complexity measures. We study the applicability of the suggested method with model and real-life data.
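Recurrence-based measures such as those RQA provides start from a thresholded distance matrix of the series. A minimal sketch of two of them, recurrence rate (RR) and determinism (DET), for a scalar series without embedding (which the full method would add), and with simplified diagonal-line handling:

```python
def rqa_measures(x, eps):
    """Recurrence rate and determinism from a 1-D series.
    RR: fraction of recurrent pairs (main diagonal excluded);
    DET: fraction of recurrent points on diagonal lines (length >= 2)."""
    n = len(x)
    R = [[abs(x[i] - x[j]) <= eps and i != j for j in range(n)]
         for i in range(n)]
    rec = sum(sum(row) for row in R)
    rr = rec / (n * (n - 1))
    on_lines = 0
    for i in range(n):
        for j in range(n):
            if R[i][j]:
                prev = i > 0 and j > 0 and R[i - 1][j - 1]
                nxt = i + 1 < n and j + 1 < n and R[i + 1][j + 1]
                if prev or nxt:
                    on_lines += 1
    det = on_lines / rec if rec else 0.0
    return rr, det

# A periodic signal is fully deterministic in this sense.
print(rqa_measures([0, 1, 0, 1, 0, 1], eps=0.1))  # (0.4, 1.0)
```

The confidence bounds discussed in the article would then come from resampling such measures, e.g. bootstrapping the line structures.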

  7. A complex network-based importance measure for mechatronics systems

    Science.gov (United States)

    Wang, Yanhui; Bi, Lifeng; Lin, Shuai; Li, Man; Shi, Hao

    2017-01-01

    In view of the negative impact of functional dependency, this paper attempts to provide an alternative importance measure, called Improved-PageRank (IPR), for measuring the importance of components in mechatronics systems. IPR is a meaningful extension of centrality measures in complex networks, which considers the usage reliability of components and the functional dependency between components to increase the usefulness of importance measures. Our work makes two important contributions. First, this paper integrates the literature on mechatronic architecture and complex network theory to define a component network. Second, based on the notion of a component network, a meaningful IPR is brought into the identification of important components. In addition, the IPR component importance measures, and an algorithm to perform stochastic ordering of components due to the time-varying nature of usage reliability and functional dependency, are illustrated with a component network of a bogie system that consists of 27 components.

  8. Investigating dynamical complexity in the magnetosphere using various entropy measures

    Science.gov (United States)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos

    2009-09-01

    The complex system of the Earth's magnetosphere corresponds to an open spatially extended nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has been recently introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) to the investigation of dynamical complexity in the magnetosphere. We show that as the magnetic storm approaches there is clear evidence of significant lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously, from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet time to the storm time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: significant complexity decrease and accession of persistency in Dst time series can be confirmed as the magnetic storm approaches, which can be used as diagnostic tools for the magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of Dst index can provide convenience for space weather
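Of the entropy measures listed, approximate entropy is the most compact to sketch: it compares the regularity of length-m and length-(m+1) templates, with lower values for a more organized signal (as claimed above for the pre-storm magnetosphere). A minimal implementation using the conventional formulas; the parameters m and r are illustrative, not those of the study:

```python
import math

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r); lower means more regular."""
    def phi(m):
        n = len(x) - m + 1
        total = 0.0
        for i in range(n):
            # Fraction of templates within tolerance r of template i
            # (self-matches included, as in the standard definition).
            c = sum(
                max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r
                for j in range(n)
            ) / n
            total += math.log(c)
        return total / n
    return phi(m) - phi(m + 1)

# A strictly alternating signal is almost perfectly regular.
print(approx_entropy([0, 1] * 10, m=2, r=0.5))
```

Applied to a sliding window over the Dst index, a drop in such a measure would signal the increased organization reported here.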

  9. Combining complexity measures of EEG data: multiplying measures reveal previously hidden information.

    Science.gov (United States)

    Burns, Thomas; Rajan, Ramesh

    2015-01-01

    Many studies have noted significant differences among human electroencephalograph (EEG) results when participants or patients are exposed to different stimuli, undertake different tasks, or are affected by conditions such as epilepsy or Alzheimer's disease. Such studies often use only one or two measures of complexity and do not regularly justify their choice of measure beyond the fact that it has been used in previous studies. If more measures were added to such studies, however, more complete information might be found about these reported differences. Such information might be useful in confirming the existence or extent of such differences, or in understanding their physiological bases. In this study we analysed publicly available EEG data using a range of complexity measures to determine how well the measures correlated with one another. The complexity measures did not all correlate significantly, suggesting that different measures capture unique features of the EEG signals and thus reveal information which other measures are unable to detect. The results from this analysis therefore suggest that combinations of complexity measures reveal unique information beyond that captured by individual measures. For this reason, researchers using individual complexity measures for EEG data should consider using combinations of measures to more completely account for any differences they observe and to ensure the robustness of any relationships identified.
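The correlation analysis described can be reproduced with a plain Pearson coefficient across recordings: apply each complexity measure to every signal, then correlate the resulting score vectors. A sketch with two stand-in measures (variance and mean absolute successive difference, chosen for brevity, not the study's set):

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length score vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Two stand-in "complexity measures" applied to each signal.
variance = lambda x: sum((v - sum(x) / len(x)) ** 2 for v in x) / len(x)
roughness = lambda x: sum(abs(b - a) for a, b in zip(x, x[1:])) / (len(x) - 1)

signals = [[0, 1, 0, 1], [0, 2, 0, 2], [1, 1, 1, 1], [0, 3, 1, 2]]
scores_a = [variance(s) for s in signals]
scores_b = [roughness(s) for s in signals]
print(pearson(scores_a, scores_b))
```

A low off-diagonal correlation between two measures is the study's signal that each captures information the other misses.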

  10. The Generalization Complexity Measure for Continuous Input Data

    Directory of Open Access Journals (Sweden)

    Iván Gómez

    2014-01-01

    The Generalization Complexity measure, defined in Boolean space, quantifies the complexity of data in relation to the prediction accuracy that can be expected when using a supervised classifier like a neural network, SVM, and so forth. We first extend the original measure for use with continuous functions and then, using an approach based on the set of Walsh functions, consider the case of a finite number of data points (input/output pairs), which is the usual practical case. Using a set of trigonometric functions, a model is constructed that relates the size of the hidden layer of a neural network to the complexity. Finally, we demonstrate the application of the introduced complexity measure, using the generated model, to the problem of estimating an adequate neural network architecture for real-world data sets.

  11. Measuring viscosity with a levitating magnet: application to complex fluids

    International Nuclear Information System (INIS)

    Even, C; Bouquet, F; Deloche, B; Remond, J

    2009-01-01

    As an experimental project proposed to students in the fourth year of university, a viscometer was developed, consisting of a small magnet levitating in a viscous fluid. The viscous force acting on the magnet is directly measured: viscosities in the range 10 to 10^6 mPa·s are obtained. This experiment is used as an introduction to complex fluids and soft matter physics.

  12. Justification of a complex of design measures for protection against fire

    International Nuclear Information System (INIS)

    Kryuger, V.

    1983-01-01

    The impact of fire on NPP radiation safety is analyzed. The general industry requirements for protection against fire are shown to be insufficient for NPPs. A complex of fire protection measures is suggested that should be taken into account in NPP designs [ru]

  13. Measuring the Complexity of Self-Organizing Traffic Lights

    Directory of Open Access Journals (Sweden)

    Darío Zubillaga

    2014-04-01

    We apply measures of complexity, emergence, and self-organization to an urban traffic model, comparing a traditional traffic-light coordination method with a self-organizing method in two scenarios: cyclic boundaries and non-orientable boundaries. We show that the measures are useful for identifying and characterizing different dynamical phases. It becomes clear that different operation regimes are required for different traffic demands. Thus, not only is traffic a non-stationary problem requiring controllers to adapt constantly; controllers must also drastically change the complexity of their behavior depending on the demand. Based on our measures, and extending Ashby's law of requisite variety, we can say that the self-organizing method achieves an adaptability level comparable to that of a living system.
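For discrete data, emergence, self-organization, and complexity in this line of work are commonly computed from Shannon entropy; the parameterization below (emergence E as normalized entropy, self-organization S = 1 - E, complexity C = 4ES, maximal when change and regularity balance) is an assumption here, sketched for illustration:

```python
import math
from collections import Counter

def eso_complexity(symbols, alphabet_size=None):
    """Emergence E (normalized Shannon entropy), self-organization
    S = 1 - E, and complexity C = 4*E*S of a symbol sequence."""
    n = len(symbols)
    counts = Counter(symbols)
    k = alphabet_size or len(counts)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    e = h / math.log2(k) if k > 1 else 0.0
    return e, 1 - e, 4 * e * (1 - e)

print(eso_complexity("0101010101"))     # maximal change: E=1, C=0
print(eso_complexity("0000000000", 2))  # fully ordered: S=1, C=0
print(eso_complexity("0001000100", 2))  # mixed regime: C ~ 0.80
```

Applied to, say, a binary record of whether each vehicle waits at an intersection, C peaks in the regimes where order and variability coexist.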

  14. A generalized complexity measure based on Rényi entropy

    Science.gov (United States)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or pattern, far beyond what entropy measures capture. They are intuitively constructed to be minimal at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not, in general, minimal for maximal randomness. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications illustrating this fact are given.
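One published LMC-Rényi family takes the form C = exp(R_α - R_β) with α < β, where R_α is the Rényi entropy; the generalization proposed in this paper may differ, so the form below is an assumption used for illustration. It equals 1 for the uniform distribution and grows with structure:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy R_alpha of a discrete distribution (alpha != 1;
    alpha -> 1 recovers the Shannon entropy)."""
    return math.log(sum(q ** alpha for q in p if q > 0)) / (1 - alpha)

def lmc_renyi_complexity(p, alpha=0.5, beta=2.0):
    """LMC-Renyi-type complexity C = exp(R_alpha - R_beta), alpha < beta
    (one form from the literature, assumed here for illustration)."""
    return math.exp(renyi_entropy(p, alpha) - renyi_entropy(p, beta))

print(lmc_renyi_complexity([0.25] * 4))            # uniform -> 1.0
print(lmc_renyi_complexity([0.7, 0.1, 0.1, 0.1]))  # structured -> > 1
```

The paper's contribution is a variant of such a family that stays minimal in the maximal-randomness limit.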

  15. Complex Susceptibility Measurement Using Multi-frequency Slingram EMI Instrument

    OpenAIRE

    Simon , François Xavier; Tabbagh , Alain; Thiesson , Julien; Donati , J.C.; Sarris , A.

    2014-01-01

    Complex magnetic susceptibility is a well-known property, both theoretically and experimentally. To achieve this measurement, different approaches have been tested, such as TDEM or multi-frequency measurement on soil samples. In this study we carry out the measurements using a multi-frequency EMI Slingram instrument to collect data quickly and in situ. The use of multi-frequency data is also a way to correct for effects of the conductivity on the in-phase component and ef...

  16. Self-dissimilarity as a High Dimensional Complexity Measure

    Science.gov (United States)

    Wolpert, David H.; Macready, William

    2005-01-01

    For many systems characterized as "complex", the patterns exhibited on different scales differ markedly from one another. For example, the biomass distribution in a human body "looks very different" depending on the scale at which one examines it. Conversely, the patterns at different scales in "simple" systems (e.g., gases, mountains, crystals) vary little from one scale to another. Accordingly, the degrees of self-dissimilarity between the patterns of a system at various scales constitute a complexity "signature" of that system. Here we present a novel quantification of self-dissimilarity. This signature can, if desired, incorporate a novel information-theoretic measure of the distance between probability distributions that we derive here. Whatever distance measure is chosen, our quantification of self-dissimilarity can be computed for many kinds of real-world data. This allows comparisons of the complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital computer, species densities in a rain forest, capital density in an economy, etc.). Moreover, in contrast to many other suggested complexity measures, evaluating the self-dissimilarity of a system does not require one to already have a model of the system. These facts may allow self-dissimilarity signatures to be used as the underlying observational variables of an eventual overarching theory relating all complex systems. To illustrate self-dissimilarity we present several numerical experiments. In particular, we show that the underlying structure of the logistic map is picked out by the self-dissimilarity signature of time series produced by that map.

  17. A Proposal for Cardiac Arrhythmia Classification using Complexity Measures

    Directory of Open Access Journals (Sweden)

    AROTARITEI, D.

    2017-08-01

    Cardiovascular diseases are one of the major problems of humanity, and therefore one of their components, arrhythmia detection and classification, has drawn increased attention worldwide. The presence of randomness in discrete time series, like those arising in electrophysiology, is firmly connected with computational complexity measures. This connection can be used, for instance, in the analysis of RR intervals of the electrocardiographic (ECG) signal, coded as binary strings, to detect and classify arrhythmia. Our approach uses three algorithms (Lempel-Ziv, Sample Entropy, and T-Code) to compute the information complexity, and a classification tree to detect 13 types of arrhythmia, with encouraging results. To reduce the computational effort required for the complexity calculus, a cloud computing solution with executable code deployment is also proposed.
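Of the three algorithms named, Sample Entropy is the most compact to sketch: SampEn = -ln(A/B), where B counts pairs of length-m templates matching within tolerance r and A the corresponding length-(m+1) pairs, excluding self-matches. A simplified version for a binary-coded RR series (edge handling differs slightly from the canonical definition):

```python
import math

def sample_entropy(x, m=2, r=0.5):
    """Simplified SampEn(m, r) = -ln(A/B) for a numeric series."""
    def match_pairs(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        return sum(
            max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r
            for i in range(len(templates))
            for j in range(i + 1, len(templates))
        )
    b, a = match_pairs(m), match_pairs(m + 1)
    return -math.log(a / b) if a and b else float("inf")

# A binary-coded RR-interval sequence (illustrative, not patient data).
rr_code = [0, 1] * 10
print(sample_entropy(rr_code))
```

In a pipeline like the one described, such scores (together with Lempel-Ziv and T-Code values) would feed the classification tree as features.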

  18. The step complexity measure - its meaning and applications

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea; Kim, Jae Whan; Ha, Jae Joo

    2003-01-01

    According to related studies, it was revealed that procedural deviation plays a significant role in initiating accidents or incidents. This means that, to maximize safety, it is indispensable to be able to answer the question 'why do operators deviate from procedures?' In this study, the SC (Step Complexity) measure is introduced to investigate its applicability to studying procedural deviation, since it was shown that the change in the operators' performance is strongly correlated with the change in SC scores. This means that the SC measure could play an important role in research related to procedural deviation, since it is strongly believed that complicated procedures affect both the operators' performance and the possibility of procedural deviation. Thus, to confirm this expectation, the meaning of the SC measure is investigated through brief explanations of the necessity, theoretical basis, and verification activities of the SC measure. As a result, it is confirmed that the SC measure can be used to explain the change in the operators' performance due to the task complexity implied by procedures. In addition, the SC measure may be useful for various purposes, particularly for scrutinizing the relationship between procedural deviation and complicated procedures.

  19. Statistical analysis of complex systems with nonclassical invariant measures

    KAUST Repository

    Fratalocchi, Andrea

    2011-02-28

    I investigate the problem of finding a statistical description of a complex many-body system whose invariant measure cannot be constructed stemming from classical thermodynamics ensembles. By taking solitons as a reference system and by employing a general formalism based on the Ablowitz-Kaup-Newell-Segur scheme, I demonstrate how to build an invariant measure and, within a one-dimensional phase space, how to develop a suitable thermodynamics. A detailed example is provided with a universal model of wave propagation, with reference to a transparent potential sustaining gray solitons. The system shows a rich thermodynamic scenario, with a free-energy landscape supporting phase transitions and controllable emergent properties. I finally discuss the origin of such behavior, trying to identify common denominators in the area of complex dynamics.

  20. Measuring cognitive load: performance, mental effort and simulation task complexity.

    Science.gov (United States)

    Haji, Faizal A; Rojas, David; Childs, Ruth; de Ribaupierre, Sandrine; Dubrowski, Adam

    2015-08-01

    Interest in applying cognitive load theory in health care simulation is growing. This line of inquiry requires measures that are sensitive to changes in cognitive load arising from different instructional designs. Recently, mental effort ratings and secondary task performance have shown promise as measures of cognitive load in health care simulation. We investigate the sensitivity of these measures to predicted differences in intrinsic load arising from variations in task complexity and learner expertise during simulation-based surgical skills training. We randomly assigned 28 novice medical students to simulation training on a simple or complex surgical knot-tying task. Participants completed 13 practice trials, interspersed with computer-based video instruction. On trials 1, 5, 9 and 13, knot-tying performance was assessed using time and movement efficiency measures, and cognitive load was assessed using subjective rating of mental effort (SRME) and simple reaction time (SRT) on a vibrotactile stimulus-monitoring secondary task. Significant improvements in knot-tying performance (F(1.04,24.95) = 41.1) and significant changes in cognitive load (F(2.3,58.5) = 57.7) were observed among novices engaged in simulation-based learning. These measures can be used to track cognitive load during skills training. Mental effort ratings are also sensitive to small differences in intrinsic load arising from variations in the physical complexity of a simulation task. The complementary nature of these subjective and objective measures suggests their combined use is advantageous in simulation instructional design research. © 2015 John Wiley & Sons Ltd.

  1. On the extension of Importance Measures to complex components

    International Nuclear Information System (INIS)

    Dutuit, Yves; Rauzy, Antoine

    2015-01-01

    Importance Measures are indicators of the risk significance of the components of a system. They are widely used in various applications of Probabilistic Safety Analyses, off-line and on-line, in decision making for preventive and corrective purposes, as well as to rank components according to their contribution to the global risk. They are primarily defined for the case where the support model is a coherent fault tree and failures of components are described by basic events of this fault tree. In this article, we study their extension to complex components, i.e. components whose failures are modeled by a gate rather than just a basic event. Although quite natural, such an extension has not received much attention in the literature. We show that it raises a number of problems. The Birnbaum Importance Measure and the notion of Critical States concentrate these difficulties. We present alternative solutions for the extension of these notions. We discuss their respective advantages and drawbacks. This article gives a new point of view on the mathematical foundations of Importance Measures and helps to clarify their physical meaning. - Highlights: • We propose an extension of Importance Measures to complex components. • We define our extension in terms of minterms, i.e. states of the system. • We discuss the physical interpretation of Importance Measures in light of this interpretation

  2. Reliability of surface EMG measurements from the suprahyoid muscle complex

    DEFF Research Database (Denmark)

    Kothari, Mohit; Stubbs, Peter William; Pedersen, Asger Roer

    2017-01-01

    of using the suprahyoid muscle complex (SMC) using surface electromyography (sEMG) to assess changes to neural pathways by determining the reliability of measurements in healthy participants over days. Methods: Seventeen healthy participants were recruited. Measurements were performed twice with one week...... on stimulus type/intensity) had significantly different MEP values between day 1 and day 2 for single pulse and paired pulse TMS. A large stimulus artefact resulted in MEP responses that could not be assessed in four participants. Conclusions: The assessment of the SMC using sEMG following TMS was poorly...... reliable for ≈50% of participants. Although using sEMG to assess swallowing musculature function is easier to perform clinically and more comfortable to patients than invasive measures, as the measurement of muscle activity using TMS is unreliable, the use of sEMG for this muscle group is not recommended...

  3. Step Complexity Measure for Emergency Operating Procedures - Determining Weighting Factors

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Kim, Jaewhan; Ha, Jaejoo

    2003-01-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human error has been regarded as the primary cause of many events. Therefore, to ensure system safety, extensive effort has been made to identify the significant factors that can cause human error. According to related studies, written manuals or operating procedures are revealed as one of the important factors, and the understandability is pointed out as one of the major reasons for procedure-related human errors. Many qualitative checklists have been suggested to evaluate emergency operating procedures (EOPs) of NPPs so as to minimize procedure-related human errors. However, since qualitative evaluations using checklists have some drawbacks, a quantitative measure that can quantify the complexity of EOPs is indispensable. From this necessity, Park et al. suggested the step complexity (SC) measure to quantify the complexity of procedural steps included in EOPs. To verify the appropriateness of the SC measure, averaged step performance time data obtained from emergency training records of the loss-of-coolant accident (LOCA) and the excess steam demand event were compared with estimated SC scores. However, although averaged step performance time data and estimated SC scores show meaningful correlation, some important issues such as determining proper weighting factors have to be clarified to ensure the appropriateness of the SC measure. These were not properly dealt with due to a lack of backup data. In this paper, to resolve one of the important issues, emergency training records are additionally collected and analyzed in order to determine proper weighting factors. The total number of collected records is 66, and the training scenarios cover five emergency conditions including the LOCA, the steam generator tube rupture, the loss of all feedwater, the loss of off-site power, and the station blackout. From these records, average step performance time data are retrieved, and new

  4. Compositional segmentation and complexity measurement in stock indices

    Science.gov (United States)

    Wang, Haifeng; Shang, Pengjian; Xia, Jianan

    2016-01-01

    In this paper, we introduce a complexity measure based on entropic segmentation, called sequence compositional complexity (SCC), into the analysis of financial time series. SCC was first used to deal directly with the complex heterogeneity of nonstationary DNA sequences, where it was found to be higher in sequences with strong long-range correlations than in those with weak long-range correlations. Applying this method to financial index data, we find that the SCC values of some mature stock indices, such as the S&P 500 (hereafter S&P) and the HSI, tend to be lower than those of Chinese index data (such as the SSE). Moreover, when the indices are classified by SCC, the financial market of Hong Kong shows more similarity to mature foreign markets than to Chinese ones. We therefore believe there is a good correspondence between the SCC of an index sequence and the complexity of the market involved.

  5. Determination of complex microcalorimeter parameters with impedance measurements

    International Nuclear Information System (INIS)

    Saab, T.; Bandler, S.R.; Chervenak, J.; Figueroa-Feliciano, E.; Finkbeiner, F.; Iyomoto, N.; Kelley, R.L.; Kilbourne, C.A.; Lindeman, M.A.; Porter, F.S.; Sadleir, J.

    2006-01-01

    The proper understanding and modeling of a microcalorimeter's response requires accurate knowledge of a handful of parameters, such as C, G, and α. While a few of these parameters are directly determined from the IV characteristics, others, notably the heat capacity (C) and α, appear in degenerate combinations in most measurable quantities. The consideration of a complex microcalorimeter leads to added ambiguity in the determination of the parameters. In general, the dependence of the microcalorimeter's complex impedance on these various parameters varies with frequency. This dependence allows us to determine individual parameters by fitting the prediction of the microcalorimeter model to impedance data. In this paper we describe efforts at characterizing the Goddard X-ray microcalorimeters. With the parameters determined by this method, we compare the pulse shape and noise spectra predictions to data taken with the same devices.

  6. Simultaneous Rheoelectric Measurements of Strongly Conductive Complex Fluids

    Science.gov (United States)

    Helal, Ahmed; Divoux, Thibaut; McKinley, Gareth H.

    2016-12-01

    We introduce a modular fixture designed for stress-controlled rheometers to perform simultaneous rheological and electrical measurements on strongly conductive complex fluids under shear. By means of a nontoxic liquid metal at room temperature, the electrical connection to the rotating shaft is completed with minimal additional mechanical friction, allowing for simultaneous stress measurements at values as low as 1 Pa. Motivated by applications such as flow batteries, we use the capabilities of this design to perform an extensive set of rheoelectric experiments on gels formulated from attractive carbon-black particles, at concentrations ranging from 4 to 15 wt %. First, experiments on gels at rest prepared with different shear histories show a robust power-law scaling between the elastic modulus G0' and the conductivity σ0 of the gels, i.e., G0' ~ σ0^α, with α = 1.65 ± 0.04, regardless of the gel concentration. Second, we report conductivity measurements performed simultaneously with creep experiments. Changes in conductivity in the early stage of the experiments, also known as the Andrade-creep regime, reveal for the first time that plastic events take place in the bulk, while the shear rate γ̇ decreases as a weak power law of time. The subsequent evolution of the conductivity and the shear rate allows us to propose a local yielding scenario that is in agreement with previous velocimetry measurements. Finally, to establish a set of benchmark data, we determine the constitutive rheological and electrical behavior of carbon-black gels. Corrections first introduced for mechanical measurements regarding shear inhomogeneity and wall slip are carefully extended to electrical measurements to accurately distinguish between bulk and surface contributions to the conductivity. As an illustrative example, we examine the constitutive rheoelectric properties of five different grades of carbon-black gels and we demonstrate the relevance of this rheoelectric apparatus as a

  7. Entropies from Markov Models as Complexity Measures of Embedded Attractors

    Directory of Open Access Journals (Sweden)

    Julián D. Arias-Londoño

    2015-06-01

    This paper addresses the problem of measuring complexity from embedded attractors as a way to characterize changes in the dynamical behavior of different types of systems with a quasi-periodic behavior by observing their outputs. With the aim of measuring the stability of the trajectories of the attractor along time, this paper proposes three new estimations of entropy that are derived from a Markov model of the embedded attractor. The proposed estimators are compared with traditional nonparametric entropy measures, such as approximate entropy, sample entropy and fuzzy entropy, which only take into account the spatial dimension of the trajectory. The method proposes the use of an unsupervised algorithm to find the principal curve, which is considered as the “profile trajectory”, that will serve to adjust the Markov model. The new entropy measures are evaluated using three synthetic experiments and three datasets of physiological signals. In terms of consistency and discrimination capabilities, the results show that the proposed measures perform better than the other entropy measures used for comparison purposes.
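As a point of reference for the baselines mentioned above, a minimal sketch of sample entropy (one of the traditional nonparametric measures the paper compares against) is shown below; this is a simplified textbook formulation, not the authors' implementation.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts template matches of length m and
    A matches of length m+1 (Chebyshev distance < r*std, self-matches excluded).
    A simplified formulation for illustration."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            c += np.sum(d < tol) - 1  # exclude the self-match
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = rng.standard_normal(500)
print(sample_entropy(regular), sample_entropy(noisy))  # regular signal scores lower
```

A low value indicates many length-m template matches that persist at length m+1, i.e. a regular, predictable trajectory; white noise scores much higher.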

  8. Upper bounds on quantum uncertainty products and complexity measures

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, Angel; Sanchez-Moreno, Pablo; Dehesa, Jesus S. [Department of Atomic, Molecular and Nuclear Physics, University of Granada, Granada (Spain); Department of Applied Mathematics, University of Granada, Granada (Spain) and Institute Carlos I for Computational and Theoretical Physics, University of Granada, Granada (Spain); Department of Atomic, Molecular and Nuclear Physics, University of Granada, Granada (Spain); Institute Carlos I for Computational and Theoretical Physics, University of Granada, Granada (Spain)

    2011-10-15

    The position-momentum Shannon and Renyi uncertainty products of general quantum systems are shown to be bounded not only from below (through the known uncertainty relations), but also from above in terms of the Heisenberg-Kennard product. Moreover, the Cramer-Rao, Fisher-Shannon, and Lopez-Ruiz, Mancini, and Calbet shape measures of complexity (whose lower bounds have recently been found) are also bounded from above. The improvement of these bounds for systems subject to spherically symmetric potentials is also explicitly given. Finally, applications to hydrogenic and oscillator-like systems are presented.
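For reference, the standard lower bounds alluded to in the abstract, written for an n-dimensional system (the upper bounds derived in the paper itself are not reproduced here):

```latex
% Heisenberg--Kennard uncertainty product and the entropic (Shannon) lower bound
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
S_x + S_p \;\ge\; n\,(1 + \ln \pi) \quad (\hbar = 1)
```

The second inequality is the Bialynicki-Birula and Mycielski entropic uncertainty relation, which is the usual starting point for Shannon uncertainty products of the kind the paper bounds from above.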

  9. Multifractality as a Measure of Complexity in Solar Flare Activity

    Science.gov (United States)

    Sen, Asok K.

    2007-03-01

    In this paper we use the notion of multifractality to describe the complexity in Hα flare activity during the solar cycles 21, 22, and 23. Both northern and southern hemisphere flare indices are analyzed. Multifractal behavior of the flare activity is characterized by calculating the singularity spectrum of the daily flare index time series in terms of the Hölder exponent. The broadness of the singularity spectrum gives a measure of the degree of multifractality or complexity in the flare index data. The broader the spectrum, the richer and more complex is the structure with a higher degree of multifractality. Using this broadness measure, complexity in the flare index data is compared between the northern and southern hemispheres in each of the three cycles, and among the three cycles in each of the two hemispheres. Other parameters of the singularity spectrum can also provide information about the fractal properties of the flare index data. For instance, an asymmetry to the left or right in the singularity spectrum indicates a dominance of high or low fractal exponents, respectively, reflecting a relative abundance of large or small fluctuations in the total energy emitted by the flares. Our results reveal that in the even (22nd) cycle the singularity spectra are very similar for the northern and southern hemispheres, whereas in the odd cycles (21st and 23rd) they differ significantly. In particular, we find that in cycle 21, the northern hemisphere flare index data have higher complexity than their southern counterpart, with an opposite pattern prevailing in cycle 23. Furthermore, small-scale fluctuations in the flare index time series are predominant in the northern hemisphere in the 21st cycle and are predominant in the southern hemisphere in the 23rd cycle. Based on these findings one might suggest that, from cycle to cycle, there exists a smooth switching between the northern and southern hemispheres in the multifractality of the flaring process. This new

  10. Wind Resources in Complex Terrain investigated with Synchronized Lidar Measurements

    Science.gov (United States)

    Mann, J.; Menke, R.; Vasiljevic, N.

    2017-12-01

    The Perdigao experiment was performed by a number of European and American universities in Portugal in 2017, and it is probably the largest field campaign focusing on wind energy resources in complex terrain ever conducted. 186 sonic anemometers on 50 masts, 20 scanning wind lidars and a host of other instruments were deployed. The experiment is part of an effort to make a new European wind atlas. In this presentation we investigate whether scanning the wind speed over ridges in this complex terrain with multiple Doppler lidars can lead to an efficient mapping of the wind resources at relevant positions. We do that by having pairs of Doppler lidars scanning 80 m above the ridges in Perdigao. We compare wind resources obtained from the lidars and from the mast-mounted sonic anemometers at 80 m on two 100 m masts, one on each of the two ridges. In addition, the scanning lidar measurements are also compared to profiling lidars on the ridges. We take into account the fact that the profiling lidars may be biased due to the curvature of the streamlines over the instrument, see Bingol et al., Meteorol. Z., vol. 18, pp. 189-195 (2009). We also investigate the impact of interruptions of the lidar measurements on the estimated wind resource. We calculate the relative differences of wind along the ridge from the lidar measurements and compare those to the same obtained from various micro-scale models. A particular subject investigated is how stability affects the wind resources. We often observe internal gravity waves with the scanning lidars during the night and we quantify how these affect the relative wind speed on the ridges.

  11. Measuring complexity with multifractals in texts. Translation effects

    International Nuclear Information System (INIS)

    Ausloos, M.

    2012-01-01

    Highlights: ► Two texts in English and one in Esperanto are transformed into 6 time series. ► D(q) and f(alpha) of such (and shuffled) time series are obtained. ► A model for text construction is presented based on a parametrized Cantor set. ► The model parameters can also be used when examining machine translated texts. ► Suggested extensions to higher dimensions: in 2D image analysis and on hypertexts. - Abstract: Should quality be almost a synonym of complexity? To measure quality appears to be audacious, even very subjective. It is hereby proposed to use a multifractal approach in order to quantify quality, thus through complexity measures. A one-dimensional system is examined. It is known that (all) written texts can be one-dimensional nonlinear maps. Thus, several written texts by the same author are considered, together with their translations into an unusual language, Esperanto, and, as a baseline, their corresponding shuffled versions. Different one-dimensional time series can be used: e.g. (i) one based on word lengths, (ii) the other based on word frequencies; both are used for studying, comparing and discussing the map structure. It is shown that a variety in style can be measured through the D(q) and f(α) curves characterizing multifractal objects. This makes it possible to observe, on the one hand, whether natural and artificial languages significantly influence the writing and the translation, and whether one author’s texts differ technically from each other. In fact, the f(α) curves of the original texts are similar to each other, but the translated text shows marked differences. However, in each case, the f(α) curves are far from parabolic, in contrast to those of the shuffled texts. Moreover, the Esperanto text has more extreme values. Criteria are thereby suggested for estimating a text quality, as if it is a time series only. A model is introduced in order to substantiate the findings: it consists in considering a text as a random Cantor set
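The D(q) and f(α) formalism used above has a classic closed-form example: the binomial multiplicative cascade. The sketch below computes its spectrum numerically via a Legendre transform; the weight p = 0.7 is an arbitrary illustrative choice, not a parameter from the paper.

```python
import numpy as np

def binomial_spectrum(p, qs):
    """D(q) and f(alpha) for a binomial multiplicative cascade with weights p, 1-p:
    tau(q) = -log2(p^q + (1-p)^q); alpha = tau'(q); f(alpha) = q*alpha - tau(q)."""
    tau = -np.log2(p**qs + (1 - p)**qs)
    alpha = np.gradient(tau, qs)       # numerical Legendre transform
    f = qs * alpha - tau
    return alpha, f

qs = np.linspace(-10, 10, 2001)
alpha, f = binomial_spectrum(0.7, qs)
width = alpha.max() - alpha.min()      # broadness = degree of multifractality
print(round(width, 2))
```

For this cascade the spectrum width approaches log2(0.7/0.3) ≈ 1.22 as the q-range grows; a monofractal would collapse to a single point (width 0), which is the sense in which broadness measures complexity.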

  12. Measuring pair-wise molecular interactions in a complex mixture

    Science.gov (United States)

    Chakraborty, Krishnendu; Varma, Manoj M.; Venkatapathi, Murugesan

    2016-03-01

    Complex biological samples such as serum contain thousands of proteins and other molecules spanning up to 13 orders of magnitude in concentration. Present measurement techniques do not permit the analysis of all pair-wise interactions between the components of such a complex mixture and a given target molecule. In this work we explore the use of nanoparticle tags that encode the identity of the molecule to obtain the statistical distribution of pair-wise interactions from their Localized Surface Plasmon Resonance (LSPR) signals. The nanoparticle tags are chosen such that the binding between two molecules conjugated to their respective nanoparticle tags can be recognized by the coupling of their LSPR signals. A numerical simulation using the discrete dipole approximation (DDA) investigates this approach with a reduced system consisting of three nanoparticles (a gold ellipsoid with aspect ratio 2.5 and short axis 16 nm, and two silver ellipsoids with aspect ratios 3 and 2 and short axes 8 nm and 10 nm, respectively) and the set of all possible dimers formed between them. Incident light was circularly polarized and all possible particle and dimer orientations were considered. We observed that the minimum peak separation between two spectra is 5 nm, while the maximum is 184 nm.

  13. Measuring the complex behavior of the SO2 oxidation reaction

    Directory of Open Access Journals (Sweden)

    Muhammad Shahzad

    2015-09-01

    The two-step reversible chemical reaction involving five chemical species is investigated. The quasi-equilibrium manifold (QEM) and spectral quasi-equilibrium manifold (SQEM) are used as initial approximations to simplify the mechanism and to investigate the behavior of the desired species. They give a meaningful picture, but for maximum clarity the method of invariant grid (MIG) is employed. These methods simplify the complex chemical kinetics and deduce a low-dimensional manifold (LDM) from the high-dimensional mechanism. The coverage of the species near the equilibrium point is investigated, followed by the motion along the equilibrium manifold of the ODEs. The steady-state behavior is observed, and a Lyapunov function is utilized to study the stability of the ODEs. Graphical results are used to describe the physical aspects of the measurements.
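The relaxation of reversible kinetics to a stable equilibrium, as studied above, can be sketched with a generic stand-in scheme A ⇌ B ⇌ C (the actual five-species SO2 mechanism and its rate constants are not given here; the constants below are hypothetical).

```python
import numpy as np

def simulate(k, c0, dt=1e-3, steps=20000):
    """Forward-Euler integration of a generic two-step reversible scheme
    A <-> B <-> C, a simplified stand-in for the paper's five-species mechanism."""
    k1f, k1r, k2f, k2r = k
    c = np.array(c0, dtype=float)
    for _ in range(steps):
        a, b, cc = c
        r1 = k1f * a - k1r * b          # net rate of A -> B
        r2 = k2f * b - k2r * cc         # net rate of B -> C
        c += dt * np.array([-r1, r1 - r2, r2])
    return c

# Hypothetical rate constants; start with pure A
eq = simulate((1.0, 0.5, 2.0, 1.0), (1.0, 0.0, 0.0))
print(eq)  # detailed balance at equilibrium: b/a = k1f/k1r, c/b = k2f/k2r
```

With these constants detailed balance gives the equilibrium (1/7, 2/7, 4/7); total mass is conserved exactly by the update, and the trajectory relaxes monotonically toward the slow manifold, which is what QEM/SQEM/MIG constructions approximate.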

  14. Measurement and Statistics of Application Business in Complex Internet

    Science.gov (United States)

    Wang, Lei; Li, Yang; Li, Yipeng; Wu, Shuhang; Song, Shiji; Ren, Yong

    Owing to their independent topologies and autonomous routing mechanisms, the logical networks formed by Internet application business behavior significantly influence the physical networks. In this paper, the backbone traffic of TUNET (Tsinghua University Networks) is measured; furthermore, the two most important application businesses, HTTP and P2P, are analyzed at the IP-packet level. It is shown that uplink HTTP and P2P packet behavior presents spatio-temporal power-law characteristics with exponents 1.25 and 1.53, respectively. Downlink HTTP packet behavior also presents power-law characteristics, but with a smaller exponent, γ = 0.82, which differs from traditional results in complex networks research. Moreover, the downlink P2P packet distribution presents an approximate power law, which means that flow equilibrium actually profits little from the distributed peer-to-peer mechanism.
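Exponents like the γ values quoted above are typically estimated from log-log histograms. A rough sketch on synthetic heavy-tailed data follows; the generator and its exponent are illustrative, not the TUNET data, and for real traces maximum-likelihood estimators are preferred over this least-squares fit.

```python
import numpy as np

def fit_power_law_exponent(samples):
    """Rough estimate of gamma in P(k) ~ k^-gamma from sampled integer data,
    via least squares on the log-log histogram (well-populated bins only)."""
    counts = np.bincount(samples)
    k = np.arange(len(counts))
    mask = (k > 0) & (counts > 5)
    slope, _ = np.polyfit(np.log(k[mask]), np.log(counts[mask]), 1)
    return -slope

rng = np.random.default_rng(1)
# Synthetic data: floor of a Pareto variate, asymptotic exponent gamma = 2.5
samples = np.floor(rng.pareto(1.5, size=200_000) + 1).astype(int)
samples = np.minimum(samples, 10_000)   # cap the tail to keep the histogram small
gamma = fit_power_law_exponent(samples)
print(round(gamma, 2))
```

The estimate lands near, but somewhat below, the asymptotic 2.5 because of discretization curvature at small k; this bias is one reason graphical fits are treated with caution in traffic measurement studies.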

  15. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Validity and correctness testing of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular-patch model with accurately controlled precision was taken as the virtual workpiece, and a universal collision detection model was established. The whole process of workpiece measurement is simulated with the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking an involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency between tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the proposed VGMI can be applied to the validation of measuring software, providing a new, ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  16. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  17. Analyzing complex networks through correlations in centrality measurements

    International Nuclear Information System (INIS)

    Ricardo Furlan Ronqui, José; Travieso, Gonzalo

    2015-01-01

    Many real world systems can be expressed as complex networks of interconnected nodes. It is frequently important to be able to quantify the relative importance of the various nodes in the network, a task accomplished by defining some centrality measures, with different centrality definitions stressing different aspects of the network. It is interesting to know to what extent these different centrality definitions are related for different networks. In this work, we study the correlation between pairs of a set of centrality measures for different real world networks and two network models. We show that the centralities are in general correlated, but with stronger correlations for network models than for real networks. We also show that the strength of the correlation of each pair of centralities varies from network to network. Taking this fact into account, we propose the use of a centrality correlation profile, consisting of the values of the correlation coefficients between all pairs of centralities of interest, as a way to characterize networks. Using the yeast protein interaction network as an example we show also that the centrality correlation profile can be used to assess the adequacy of a network model as a representation of a given real network. (paper)
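The centrality correlation profile proposed above can be sketched with plain numpy/stdlib stand-ins for the usual graph-library routines (the graph here is a hypothetical "lollipop" example, not one of the paper's networks):

```python
import numpy as np
from collections import deque

def centralities(adj):
    """Degree, closeness and eigenvector centrality from an adjacency matrix."""
    n = len(adj)
    degree = adj.sum(axis=1) / (n - 1)
    closeness = np.zeros(n)
    for s in range(n):                      # BFS shortest paths from each node
        dist = np.full(n, -1); dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in np.nonzero(adj[u])[0]:
                if dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        closeness[s] = (n - 1) / dist.sum()
    x = np.ones(n)                          # eigenvector centrality, power iteration
    for _ in range(200):
        x = adj @ x
        x /= np.linalg.norm(x)
    return {"degree": degree, "closeness": closeness, "eigenvector": x}

def correlation_profile(adj):
    """Centrality correlation profile: Pearson r for every pair of centralities."""
    cents = centralities(adj)
    names = list(cents)
    return {(a, b): float(np.corrcoef(cents[a], cents[b])[0, 1])
            for i, a in enumerate(names) for b in names[i + 1:]}

# Hypothetical 'lollipop' graph: a 5-clique (nodes 0-4) with a 10-node tail
n = 15
adj = np.zeros((n, n), dtype=int)
for i in range(5):
    for j in range(i + 1, 5):
        adj[i, j] = adj[j, i] = 1
for i in range(4, n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1

for pair, r in correlation_profile(adj).items():
    print(pair, round(r, 3))
```

The resulting vector of pairwise correlation coefficients is the "profile"; comparing the profile of a model graph with that of a real network is the model-adequacy check the paper describes.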

  18. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from few exceptions, do not at all attempt to simplify the complexity. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, which it defines as the sum of entities of the individual UML models of the given system, which are selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.

  19. TDR measurements looking for complex dielectric permittivity and complex magnetic permeability in lossy materials

    Science.gov (United States)

    Persico, Raffaele

    2017-04-01

    TDR probes can be exploited to measure the electromagnetic characteristics of the soil, or of any penetrable material. They are commonly exploited as instruments to measure the propagation velocity of electromagnetic waves in the probed medium [1], in turn useful for the proper focusing of GPR data [2-5]. However, more refined hardware and processing can also allow these probes to discriminate between the dielectric and magnetic characteristics of the material under test, which can be relevant for a better interpretation of the buried scenario or in order to infer physical-chemical characteristics of the material at hand. This requires a TDR probe that can work in the frequency domain and, in particular, that allows retrieving the reflection coefficient at the air-soil interface. It has already been shown [6] that in lossless cases this can be promising. In the present contribution, it will be shown at the EGU conference that it is possible to look for both the relative complex permittivity and the relative magnetic permeability of the probed material, on condition that the datum has an acceptable SNR and that some diversity of information is guaranteed, either by multifrequency data or by a TDR that can prolong its arms in the soil. References [1] F. Soldovieri, G. Prisco, R. Persico, Application of Microwave Tomography in Hydrogeophysics: some examples, Vadose Zone Journal, vol. 7, n. 1, pp. 160-170, Feb. 2008. [2] I. Catapano, L. Crocco, R. Persico, M. Pieraccini, F. Soldovieri, "Linear and Nonlinear Microwave Tomography Approaches for Subsurface Prospecting: Validation on Real Data", IEEE Antennas and Wireless Propagation Letters, vol. 5, pp. 49-53, 2006. [3] G. Leucci, N. Masini, R. Persico, F. Soldovieri, "GPR and sonic tomography for structural restoration: the case of the Cathedral of Tricarico", Journal of Geophysics and Engineering, vol. 8, pp. S76-S92, Aug. 2011. [4] S. Piscitelli, E. Rizzo, F. Cristallo

  20. Dynamic portfolio management based on complex quantile risk measures

    Directory of Open Access Journals (Sweden)

    Ekaterina V. Tulupova

    2011-05-01

    The article focuses on the effectiveness evaluation of combined financial risk measures, which are convex combinations of the measures VaR, CVaR and their analogues for the right tail of the distribution function of portfolio returns.

  1. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become key to project management. This paper first analyzes the current state of research on software complexity systematically and points out problems in existing work. It then proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. Man is the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of personnel organization hierarchy and the complexity of personnel communication information, and analyzes and validates the soundness and rationality of this measurement method through a large number of cases.
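The entropy-based idea above can be sketched with plain Shannon entropy over observed communication events; the "teams" and their message logs below are hypothetical stand-ins, not the paper's metric or data.

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical distribution of observations;
    a stand-in for the paper's entropy-based complexity metric."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical communication logs: which pair of team members exchanged a message
flat_team = ["a-b", "a-c", "b-c", "a-b", "a-c", "b-c"]      # evenly spread
hierarchical = ["a-b", "a-b", "a-b", "a-b", "a-c", "b-c"]   # dominated by one channel
print(shannon_entropy(flat_team), shannon_entropy(hierarchical))
```

Evenly spread communication maximizes the entropy (log2 of the number of channels), while communication concentrated in a few channels scores lower, which is the intuition behind using entropy as an organizational complexity measure.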

  2. Escherichia coli pyruvate dehydrogenase complex: particle masses of the complex and component enzymes measured by scanning transmission electron microscopy

    International Nuclear Information System (INIS)

    CaJacob, C.A.; Frey, P.A.; Hainfeld, J.F.; Wall, J.S.; Yang, H.

    1985-01-01

    Particle masses of the Escherichia coli pyruvate dehydrogenase (PDH) complex and its component enzymes have been measured by scanning transmission electron microscopy (STEM). The particle mass of the PDH complex measured by STEM is 5.28 × 10^6 with a standard deviation of 0.40 × 10^6. The masses of the component enzymes are 2.06 × 10^5 for the dimeric pyruvate dehydrogenase (E1), 1.15 × 10^5 for dimeric dihydrolipoyl dehydrogenase (E3), and 2.20 × 10^6 for dihydrolipoyl transacetylase (E2), the 24-subunit core enzyme. STEM measurements on PDH complex incubated with excess E3 or E1 failed to detect any additional binding of E3 but showed that the complex would bind additional E1 under forcing conditions. The additional E1 subunits were bound too weakly to represent binding sites in an isolated or isolable complex. The mass measurements by STEM are consistent with the subunit composition 24:24:12 when interpreted in the light of the flavin content of the complex and assuming 24 subunits in the core enzyme (E2).
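A quick arithmetic check of the 24:24:12 chain stoichiometry against the reported STEM masses (assuming, as the abstract implies, 12 E1 dimers and 6 E3 dimers assembled on the 24-subunit E2 core):

```python
# Reported STEM particle masses (Da)
m_complex = 5.28e6
m_E1_dimer = 2.06e5
m_E2_core = 2.20e6   # 24-subunit core
m_E3_dimer = 1.15e5

# 24:24:12 chain composition = 12 E1 dimers + E2 core + 6 E3 dimers
predicted = 12 * m_E1_dimer + m_E2_core + 6 * m_E3_dimer
print(predicted)  # 5.362e6, within the 0.40e6 standard deviation of 5.28e6
assert abs(predicted - m_complex) < 0.40e6
```

The predicted assembly mass (5.36 × 10^6) agrees with the measured 5.28 × 10^6 well within the quoted standard deviation, which is the consistency the abstract asserts.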

  3. Complexity and Chaos - State-of-the-Art; Formulations and Measures of Complexity

    Science.gov (United States)

    2007-09-01


  4. GFT centrality: A new node importance measure for complex networks

    Science.gov (United States)

    Singh, Rahul; Chakraborty, Abhishek; Manoj, B. S.

    2017-12-01

    Identifying central nodes is crucial to designing efficient communication networks and to recognizing key individuals in a social network. In this paper, we introduce Graph Fourier Transform Centrality (GFT-C), a metric that incorporates local as well as global characteristics of a node to quantify its importance in a complex network. The GFT-C of a reference node is estimated from the GFT coefficients derived from the importance signal of that node. Our study reveals the superiority of GFT-C over traditional centralities such as degree centrality, betweenness centrality, closeness centrality, eigenvector centrality, and Google PageRank centrality, in the context of various arbitrary and real-world networks with different degree-degree correlations.
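
A minimal sketch of the idea, assuming the GFT is taken with respect to the combinatorial Laplacian and that coefficients are combined with linearly increasing frequency weights; the paper's exact importance-signal construction and weighting are not given in the abstract, so `importance` and `freq_weights` below are illustrative assumptions:

```python
import numpy as np

def gft_centrality(adj, importance, freq_weights=None):
    """Sketch of a Graph-Fourier-Transform-based node score.

    The GFT basis is the eigenvector set of the combinatorial Laplacian;
    the score combines the absolute GFT coefficients of the node's
    importance signal with frequency weights (both choices assumed).
    """
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    # Laplacian eigenvectors (columns) form the graph Fourier basis.
    _, eigvecs = np.linalg.eigh(lap)
    coeffs = eigvecs.T @ importance          # GFT of the importance signal
    if freq_weights is None:
        freq_weights = np.arange(1, len(importance) + 1)
    return float(np.sum(freq_weights * np.abs(coeffs)))

# Toy 4-node path graph; hypothetical importance signal decaying with
# hop distance from the reference node (node 0).
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
s = np.array([1.0, 0.5, 0.25, 0.125])
score = gft_centrality(A, s)
```
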

  5. Velocity-pressure correlation measurements in complex free shear flows

    International Nuclear Information System (INIS)

    Naka, Yoshitsugu; Obi, Shinnosuke

    2009-01-01

    Simultaneous measurements of fluctuating velocity and pressure were performed in various turbulent free shear flows including a turbulent mixing layer and the wing-tip vortex trailing from a NACA0012 half-wing. Two different methods for fluctuating static pressure measurement were considered: a direct method using a miniature Pitot tube and an indirect method where static pressure was calculated from total pressure. The pressure obtained by either of these methods was correlated with the velocity measured by an X-type hot-wire probe. The results from these two techniques agreed with each other in the turbulent mixing layer. In the wing-tip vortex case, however, some discrepancies were found, although overall characteristics of the pressure-related statistics were adequately captured by both methods.

  6. Electronic system for the complex measurement of a Wilberforce pendulum

    Science.gov (United States)

    Kos, B.; Grodzicki, M.; Wasielewski, R.

    2018-05-01

    The authors present a novel application of a micro-electro-mechanical measurement system to the description of basic physical phenomena in a model Wilberforce pendulum. The kit comprises a tripod with a mounted spring and freely hanging bob, plus a GY-521 module (based on the MPU-6050) coupled with an Arduino Uno, which in conjunction with a PC acts as the measuring set. The system allows one to observe the swing of the pendulum in real time. The obtained data stay in good agreement with both theoretical predictions and previous works. The aim of this article is to introduce the study of the Wilberforce pendulum into the canon of physics laboratory exercises, owing to its interesting properties and multifaceted method of measurement.

  7. Measurement and documentation of complex PTSD in treatment seeking traumatized refugees

    DEFF Research Database (Denmark)

    Palic, Sabina

    The aim of the thesis is to study complex traumatization and its measurement in treatment seeking traumatized refugees. Historically there have been repeated attempts to create a diagnosis for complex posttraumatic stress disorder (complex PTSD) to capture the more diverse, trauma related symptoms...... to measuring symptoms of PTSD, anxiety, and depression. This renders documentation, measurement, and treatment of possible complex traumatic adaptations in traumatized refugees very difficult. The thesis comprises two studies using different measures and different samples. The first study investigated complex...... in the traumatized refugees an important challenge. The second study in the thesis examined the proposed diversity of psychiatric morbidity in complex PTSD using a global psychiatric measure –the Health of Nation Outcome Scales (HoNOS). Article 3 showed that a group of consecutive refugees outpatients from a Danish...

  8. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo

    2014-01-01

    on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measures the pressure field for three imaging schemes: a fixed focus, single...

  9. Measuring Viscosity with a Levitating Magnet: Application to Complex Fluids

    Science.gov (United States)

    Even, C.; Bouquet, F.; Remond, J.; Deloche, B.

    2009-01-01

    As an experimental project proposed to students in the fourth year of university, a viscometer was developed, consisting of a small magnet levitating in a viscous fluid. The viscous force acting on the magnet is directly measured: viscosities in the range 10–10^6 mPa·s are obtained. This experiment is used as an introduction to complex…

  10. Complex permittivity measurements of ferroelectric employing composite dielectric resonator technique

    Czech Academy of Sciences Publication Activity Database

    Krupka, J.; Zychowicz, T.; Bovtun, Viktor; Veljko, Sergiy

    2006-01-01

    Roč. 53, č. 10 (2006), s. 1883-1888 ISSN 0885-3010 R&D Projects: GA AV ČR(CZ) IAA1010213; GA ČR(CZ) GA202/04/0993; GA ČR(CZ) GA202/06/0403 Institutional research plan: CEZ:AV0Z10100520 Keywords : dielectric resonator * ferroelectrics * microwave measurements Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.729, year: 2006

  11. Complex optimization of radiometric control and measurement systems

    International Nuclear Information System (INIS)

    Onishchenko, A.M.

    1995-01-01

    Fundamentals of a new approach to increasing the accuracy of radiometric control and measurement systems are presented. A block diagram of the new concept of radiometric system optimization is provided. The approach promises a radical increase in accuracy and envisages determining the controlled parameter from the combination of two closely correlated information signals. The new concept applies system analysis to the system as a unified whole, permitting heuristic synthesis of the system. 4 refs., 3 figs

  12. Urban sustainability : complex interactions and the measurement of risk

    Directory of Open Access Journals (Sweden)

    Lidia Diappi

    1999-05-01

    This paper focuses on the concept of a sustainable city and its theoretical implications for the urban system. Urban sustainability is based on positive interactions among three different urban sub-systems: social, economic and physical, where social well-being coexists with economic development and environmental quality. This utopian scenario, however, rarely materializes: an affluent economy is often associated with poverty and criminality, while labour variety and urban efficiency coexist with pollution and congestion. The research subject is the analysis of local risk and opportunity conditions, based on a special definition of risk made operative through a set of maps representing the multidimensional facets of spatial organisation in urban sustainability. The interactions among the economic, social and environmental systems are complex and unpredictable and present the opportunity for a new methodology of scientific investigation: the connectionist approach, processed by Self-Reflexive Neural Networks (SRNN). These networks are a useful instrument for investigating and analogically querying the database. Once the SRNN has learned the structure of the weights from the database, it is possible, by querying the network with the maximization or minimization of specific groups of attributes, to read the related properties and to rank the areas. The survey is purposefully aimed at the micro-scale and concerns the Municipality of Milan, which is spatially divided into 144 zones.

  13. Characterization of known protein complexes using k-connectivity and other topological measures

    Science.gov (United States)

    Gallagher, Suzanne R; Goldberg, Debra S

    2015-01-01

    Many protein complexes are densely packed, so proteins within complexes often interact with several other proteins in the complex. Steric constraints prevent most proteins from simultaneously binding more than a handful of other proteins, regardless of the number of proteins in the complex. Because of this, as complex size increases, several measures of the complex decrease within protein-protein interaction networks. However, k-connectivity, the number of vertices or edges that must be removed to disconnect a graph, may be consistently high for protein complexes. The property of k-connectivity has been little used previously in the investigation of protein-protein interactions. To understand the discriminative power of k-connectivity and other topological measures for identifying unknown protein complexes, we characterized these properties in known Saccharomyces cerevisiae protein complexes, both in networks generated from highly accurate X-ray crystallography experiments, which give an accurate model of each complex, and as the complexes appear in high-throughput yeast two-hybrid studies in which new complexes may be discovered. We also computed these properties for appropriate random subgraphs. We found that clustering coefficient, mutual clustering coefficient, and k-connectivity are better indicators of known protein complexes than edge density, degree, or betweenness. This suggests new directions for future protein complex-finding algorithms. PMID:26913183

  14. Centrality measures and thermodynamic formalism for complex networks.

    Science.gov (United States)

    Delvenne, Jean-Charles; Libert, Anne-Sophie

    2011-04-01

    In the study of small and large networks it is customary to perform a simple random walk where the random walker jumps from one node to one of its neighbors with uniform probability. The properties of this random walk are intimately related to the combinatorial properties of the network. In this paper we propose to use the Ruelle-Bowens random walk instead, whose probability transitions are chosen in order to maximize the entropy rate of the walk on an unweighted graph. If the graph is weighted, then a free energy is optimized instead of the entropy rate. Specifically, we introduce a centrality measure for large networks, which is the stationary distribution attained by the Ruelle-Bowens random walk; we name it entropy rank. We introduce a more general version, which is able to deal with disconnected networks, under the name of free-energy rank. We compare the properties of those centrality measures with the classic PageRank and hyperlink-induced topic search (HITS) on both toy and real-life examples, in particular their robustness to small modifications of the network. We show that our centrality measures are more discriminating than PageRank, since they are able to distinguish clearly pages that PageRank regards as almost equally interesting, and are more sensitive to the medium-scale details of the graph.
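
The Ruelle-Bowens walk on an unweighted graph has a closed form in terms of the dominant eigenpair of the adjacency matrix, so the entropy rank can be sketched directly. This is a minimal illustration for a connected undirected graph; the free-energy generalization for weighted or disconnected networks described in the abstract is not covered:

```python
import numpy as np

def entropy_rank(adj):
    """Stationary distribution of the Ruelle-Bowens (maximal-entropy)
    random walk on an unweighted, connected, undirected graph.

    With (lam, psi) the dominant eigenpair of the adjacency matrix A,
    the transition probabilities are P_ij = A_ij * psi_j / (lam * psi_i)
    and the stationary distribution is pi_i proportional to psi_i**2.
    """
    eigvals, eigvecs = np.linalg.eigh(adj)
    psi = np.abs(eigvecs[:, np.argmax(eigvals)])   # Perron vector
    pi = psi ** 2
    return pi / pi.sum()

# Star graph on 4 nodes (node 0 is the hub): the hub collects half of
# the total entropy rank, each leaf one sixth.
A = np.array([[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]], float)
pi = entropy_rank(A)
```
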

  15. Skill networks and measures of complex human capital.

    Science.gov (United States)

    Anderson, Katharine A

    2017-11-28

    We propose a network-based method for measuring worker skills. We illustrate the method using data from an online freelance website. Using the tools of network analysis, we divide skills into endogenous categories based on their relationship with other skills in the market. Workers who specialize in these different areas earn dramatically different wages. We then show that, in this market, network-based measures of human capital provide additional insight into wages beyond traditional measures. In particular, we show that workers with diverse skills earn higher wages than those with more specialized skills. Moreover, we can distinguish between two different types of workers benefiting from skill diversity: jacks-of-all-trades, whose skills can be applied independently on a wide range of jobs, and synergistic workers, whose skills are useful in combination and fill a hole in the labor market. On average, workers whose skills are synergistic earn more than jacks-of-all-trades. Copyright © 2017 the Author(s). Published by PNAS.

  16. Ibogaine: complex pharmacokinetics, concerns for safety, and preliminary efficacy measures.

    Science.gov (United States)

    Mash, D C; Kovera, C A; Pablo, J; Tyndale, R F; Ervin, F D; Williams, I C; Singleton, E G; Mayor, M

    2000-09-01

    Ibogaine is an indole alkaloid found in the roots of Tabernanthe Iboga (Apocynaceae family), a rain forest shrub that is native to western Africa. Ibogaine is used by indigenous peoples in low doses to combat fatigue, hunger and thirst, and in higher doses as a sacrament in religious rituals. Members of American and European addict self-help groups have claimed that ibogaine promotes long-term drug abstinence from addictive substances, including psychostimulants and opiates. Anecdotal reports attest that a single dose of ibogaine eliminates opiate withdrawal symptoms and reduces drug craving for extended periods of time. The purported efficacy of ibogaine for the treatment of drug dependence may be due in part to an active metabolite. The majority of ibogaine biotransformation proceeds via CYP2D6, including the O-demethylation of ibogaine to 12-hydroxyibogamine (noribogaine). Blood concentration-time effect profiles of ibogaine and noribogaine obtained for individual subjects after single oral dose administrations demonstrate complex pharmacokinetic profiles. Ibogaine has shown preliminary efficacy for opiate detoxification and for short-term stabilization of drug-dependent persons as they prepare to enter substance abuse treatment. We report here that ibogaine significantly decreased craving for cocaine and heroin during inpatient detoxification. Self-reports of depressive symptoms were also significantly lower after ibogaine treatment and at 30 days after program discharge. Because ibogaine is cleared rapidly from the blood, the beneficial aftereffects of the drug on craving and depressed mood may be related to the effects of noribogaine on the central nervous system.

  17. Measuring the complex field scattered by single submicron particles

    Energy Technology Data Exchange (ETDEWEB)

    Potenza, Marco A. C., E-mail: marco.potenza@unimi.it; Sanvito, Tiziano [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); CIMAINA, University of Milan, via Celoria, 16 – I-20133 Milan (Italy); EOS s.r.l., viale Ortles 22/4, I-20139 Milan (Italy); Pullia, Alberto [Department of Physics, University of Milan, via Celoria, 16 – I-20133 Milan (Italy)

    2015-11-15

    We describe a method for simultaneous measurements of the real and imaginary parts of the field scattered by single nanoparticles illuminated by a laser beam, exploiting a self-reference interferometric scheme relying on the fundamentals of the Optical Theorem. Results obtained with calibrated spheres of different materials are compared to the expected values obtained through a simplified analytical model without any free parameters, and the method is applied to a highly polydisperse water suspension of Poly(D,L-lactide-co-glycolide) nanoparticles. Advantages with respect to existing methods and possible applications are discussed.

  18. Representativeness of wind measurements in moderately complex terrain

    Science.gov (United States)

    van den Bossche, Michael; De Wekker, Stephan F. J.

    2018-02-01

    We investigated the representativeness of 10-m wind measurements in a 4 km × 2 km area of modest relief by comparing observations at a central site with those at four satellite sites located in the same area. Using a combination of established and new methods to quantify and visualize representativeness, we found significant differences in wind speed and direction between the four satellite sites and the central site. The representativeness of the central site wind measurements depended strongly on surface wind speed and direction, and atmospheric stability. Through closer inspection of the observations at one of the satellite sites, we concluded that terrain-forced flows combined with thermally driven downslope winds caused large biases in wind direction and speed. We used these biases to generate a basic model, showing that terrain-related differences in wind observations can to a large extent be predicted. Such a model is a cost-effective way to enhance an area's wind field determination and to improve the outcome of pollutant dispersion and weather forecasting models.

  19. Design of New Complex Detector Used for Gross Beta Measuring

    International Nuclear Information System (INIS)

    Zhang Junmin

    2010-01-01

    The level of gross β activity of radioactive aerosols in the containment of a nuclear plant indicates how serious the radioactive pollution inside the shell is, and it can provide evidence of leakage from the boundary of the primary coolant circuit equipment. During measurement, the gross β count is influenced by γ rays. To avoid this influence, a new method was introduced and a new detector was designed, using a plastic scintillator as the main detecting component and BGO as the secondary component. Based on the distinct difference in their light decay times, the signals induced in the two scintillators can be discriminated, and the γ background in the plastic scintillator is subtracted according to the γ count in the BGO. The absolute detection efficiency functions were obtained. Monte Carlo simulation shows that the influence of the γ background is decreased by about one order of magnitude. (authors)

  20. Critical Hydrologic and Atmospheric Measurements in Complex Alpine Regions

    Science.gov (United States)

    Parlange, M. B.; Bou-Zeid, E.; Barrenetxea, G.; Krichane, M.; Ingelrest, F.; Couach, O.; Luyet, V.; Vetterli, M.; Lehning, M.; Duffy, C.; Tobin, C.; Selker, J.; Kumar, M.

    2007-12-01

    The Alps are often referred to as the "Water Towers of Europe" and as such play an essential role in European water resources. The impact of climate change is expected to be particularly pronounced in the Alps, and the lack of detailed hydrologic field observations is problematic for hydrologic prediction and hazard assessment. Advances in information technology and communications make it possible to improve the situation with relatively few measurements. We will present SensorScope technology (arrays of wireless weather stations including soil moisture, pressure, and temperature sensors) that has now been deployed at Le Genepi and the Grand St. Bernard pass. In addition, a distributed temperature sensor array has been deployed on the stream beds and stream discharge monitored. The high-spatial-resolution data collected in these previously "ungaged" regions are used in conjunction with new-generation hydrologic models. The framework for what is possible today with sensor arrays and modeling in extreme mountain environments is discussed.

  1. Characterization of measurement errors using structure-from-motion and photogrammetry to measure marine habitat structural complexity.

    Science.gov (United States)

    Bryson, Mitch; Ferrari, Renata; Figueira, Will; Pizarro, Oscar; Madin, Josh; Williams, Stefan; Byrne, Maria

    2017-08-01

    Habitat structural complexity is one of the most important factors in determining the makeup of biological communities. Recent advances in structure-from-motion and photogrammetry have resulted in a proliferation of 3D digital representations of habitats from which structural complexity can be measured. Little attention has been paid to quantifying the measurement errors associated with these techniques, including the variability of results under different surveying and environmental conditions. Such errors have the potential to confound studies that compare habitat complexity over space and time. This study evaluated the accuracy, precision, and bias in measurements of marine habitat structural complexity derived from structure-from-motion and photogrammetric measurements using repeated surveys of artificial reefs (with known structure) as well as natural coral reefs. We quantified measurement errors as a function of survey image coverage, actual surface rugosity, and the morphological community composition of the habitat-forming organisms (reef corals). Our results indicated that measurements could be biased by up to 7.5% of the total observed ranges of structural complexity based on the environmental conditions present during any particular survey. Positive relationships were found between measurement errors and actual complexity, and the strength of these relationships was increased when coral morphology and abundance were also used as predictors. The numerous advantages of structure-from-motion and photogrammetry techniques for quantifying and investigating marine habitats will mean that they are likely to replace traditional measurement techniques (e.g., chain-and-tape). To this end, our results have important implications for data collection and the interpretation of measurements when examining changes in habitat complexity using structure-from-motion and photogrammetry.
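
For intuition, the rugosity that such surveys estimate can be illustrated with a one-dimensional "virtual chain-and-tape" computed on a height profile. This is a simplified 2D analogue, not the paper's 3D photogrammetric workflow:

```python
import math

def linear_rugosity(heights, dx=1.0):
    """Virtual chain-and-tape rugosity of a surface profile: contoured
    length divided by linear length. A flat profile gives 1.0; values
    grow with structural complexity.
    """
    contour = sum(math.hypot(dx, b - a) for a, b in zip(heights, heights[1:]))
    linear = dx * (len(heights) - 1)
    return contour / linear

flat = linear_rugosity([0.0, 0.0, 0.0, 0.0])    # -> 1.0
rough = linear_rugosity([0.0, 1.0, 0.0, 1.0])   # -> sqrt(2), every step diagonal
```

Errors of the kind quantified in the study appear when the reconstructed `heights` differ from the true surface, which biases this ratio.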

  2. A comparison of LMC and SDL complexity measures on binomial distributions

    Science.gov (United States)

    Piqueira, José Roberto C.

    2016-02-01

    The concept of complexity has been widely discussed over the last forty years, with contributions from all areas of human knowledge, including philosophy, linguistics, history, biology, physics and chemistry, and with mathematicians trying to give it a rigorous formulation. In this sense thermodynamics meets information theory: using the entropy definition, López-Ruiz, Mancini and Calbet proposed a definition of complexity now referred to as the LMC measure. Shiner, Davison and Landsberg, by slightly changing the LMC definition, proposed the SDL measure, and both LMC and SDL measure complexity satisfactorily for a wide range of problems. Here, the SDL and LMC measures are applied to the case of a binomial probability distribution, to clarify how the length of the data set affects complexity and how the success probability of the repeated trials determines how complex the whole set is.
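
A minimal sketch of the two measures applied to a binomial distribution, using the standard forms C_LMC = H·D and the simplest SDL family member Γ = Δ(1−Δ); normalization conventions vary across the literature, so the particular choices below are assumptions:

```python
import math

def binomial_pmf(n, p):
    """Probability mass function of Binomial(n, p) as a list."""
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def lmc(probs):
    """LMC complexity C = H * D: Shannon entropy times disequilibrium
    (squared distance of the distribution from equiprobability)."""
    n = len(probs)
    h = -sum(q * math.log(q) for q in probs if q > 0)
    d = sum((q - 1 / n) ** 2 for q in probs)
    return h * d

def sdl(probs):
    """Simplest SDL-type complexity Gamma = Delta * (1 - Delta), with
    Delta the entropy normalized by its maximum value log(n)."""
    n = len(probs)
    h = -sum(q * math.log(q) for q in probs if q > 0)
    delta = h / math.log(n)
    return delta * (1 - delta)

# Both measures vanish in the fully random limit (uniform distribution)
# and are strictly positive for an intermediate case:
c_uniform = lmc([0.25] * 4)               # 0.0: disequilibrium vanishes
c_binomial = lmc(binomial_pmf(10, 0.5))   # strictly positive
```
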

  3. The Complex Trauma Questionnaire (ComplexTQ): Development and preliminary psychometric properties of an instrument for measuring early relational trauma

    Directory of Open Access Journals (Sweden)

    Carola eMaggiora Vergano

    2015-09-01

    Research on the etiology of adult psychopathology and its relationship with childhood trauma has focused primarily on specific forms of maltreatment. This study developed an instrument for the assessment of childhood and adolescence trauma that would aid in identifying the role of co-occurring childhood stressors and chronic adverse conditions. The Complex Trauma Questionnaire (ComplexTQ), in both clinician and self-report versions, is a measure for the assessment of multi-type maltreatment: physical, psychological, and sexual abuse; physical and emotional neglect; and other traumatic experiences, such as rejection, role reversal, witnessing domestic violence, separations, and losses. The four-point Likert scale allows one to indicate specifically with which caregiver the traumatic experience occurred. A total of 229 participants, 79 nonclinical and 150 high-risk and clinical participants, were assessed with the ComplexTQ clinician version applied to Adult Attachment Interview (AAI) transcripts. Initial analyses indicate acceptable inter-rater reliability. A good fit was obtained to a 6-factor model for experiences with the mother and to a 5-factor model for experiences with the father; the internal consistency of the derived factors was good. Convergent validity was provided with the AAI scales. ComplexTQ factors discriminated the normative from the high-risk and clinical samples. The findings suggest a promising, reliable, and valid measurement of reported early relational trauma; furthermore, the instrument is easy to complete and is useful for both research and clinical practice.

  4. Measurements of complex impedance in microwave high power systems with a new Bluetooth integrated circuit.

    Science.gov (United States)

    Roussy, Georges; Dichtel, Bernard; Chaabane, Haykel

    2003-01-01

    By using a new integrated circuit marketed for Bluetooth applications, it is possible to simplify the measurement of the complex impedance, complex reflection coefficient and complex transmission coefficient in an industrial microwave setup. The Analog Devices AD 8302 circuit, which measures gain and phase up to 2.7 GHz, operates with variable-level input signals and is less sensitive to the amplitude and frequency fluctuations of industrial magnetrons than mixers and AM crystal detectors are. Accurate gain and phase measurements can therefore be performed with low-stability generators. A mechanical setup with an AD 8302 is described; the calibration procedure and its performance are presented.

  5. Measuring case-mix complexity of tertiary care hospitals using DRGs.

    Science.gov (United States)

    Park, Hayoung; Shin, Youngsoo

    2004-02-01

    The objectives of the study were to develop a model that measures and evaluates case-mix complexity of tertiary care hospitals, and to examine the characteristics of such a model. Physician panels defined three classes of case complexity and assigned disease categories represented by Adjacent Diagnosis Related Groups (ADRGs) to one of three case complexity classes. Three types of scores, indicating proportions of inpatients in each case complexity class standardized by the proportions at the national level, were defined to measure the case-mix complexity of a hospital. Discharge information for about 10% of inpatient episodes at 85 hospitals with bed size larger than 400 and their input structure and research and education activity were used to evaluate the case-mix complexity model. Results show its power to predict hospitals with the expected functions of tertiary care hospitals, i.e. resource intensive care, expensive input structure, and high levels of research and education activities.
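
The standardized-proportion scores can be sketched as follows; the complexity-class definitions come from the paper's physician panels, while the exact standardization formula below (hospital share divided by national share, per class) is an assumption for illustration:

```python
def case_mix_scores(hospital_counts, national_props):
    """Proportion of a hospital's inpatients in each case-complexity
    class, standardized by the national proportion for that class.

    hospital_counts: inpatient episodes per complexity class at one
    hospital; national_props: national share of episodes per class.
    """
    total = sum(hospital_counts)
    return [(c / total) / p for c, p in zip(hospital_counts, national_props)]

# Hypothetical hospital with 60% of cases in the most complex class,
# against a 20% national share: its first score is about 3.0, i.e.
# three times the national proportion of most-complex cases.
scores = case_mix_scores([60, 30, 10], [0.20, 0.50, 0.30])
```
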

  6. Measurement of the total solar energy transmittance (g-value) for complex glazings

    DEFF Research Database (Denmark)

    Duer, Karsten

    1999-01-01

    Four different complex glazings have been investigated in the Danish experimental setup METSET. The purpose of the measurements is to increase the confidence in the calorimetric measurements and to perform measurements and corrections according to a method developed in the ALTSET project...

  7. Weak convergence to isotropic complex SαS random measure

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2017-09-01

    In this paper, we prove that an isotropic complex symmetric α-stable (SαS) random measure (0 < α < 2) can be approximated by a complex process constructed from integrals based on a Poisson process with random intensity.

  8. Comparison of Measures of Morphosyntactic Complexity in French-Speaking School-Aged Children

    Science.gov (United States)

    Mimeau, Catherine; Plourde, Vickie; Ouellet, Andrée-Anne; Dionne, Ginette

    2015-01-01

    This study examined the validity and reliability of different measures of morphosyntactic complexity, including the Morphosyntactic Complexity Scale (MSCS), a novel adaptation of the Developmental Sentence Scoring, in French-speaking school-aged children. Seventy-three Quebec children from kindergarten to Grade 3 completed a definition task and a…

  9. Economic Complexity and Human Development: DEA performance measurement in Asia and Latin America

    OpenAIRE

    Ferraz, Diogo; Moralles, Hérick Fernando; Suarez Campoli, Jéssica; Ribeiro de Oliveira, Fabíola Cristina; do Nascimento Rebelatto, Daisy Aparecida

    2018-01-01

    Economic growth is not the only factor that explains human development, and for this reason many authors have prioritized studies measuring the Human Development Index. However, these indices do not analyze how Economic Complexity can increase Human Development. The aim of this paper is to determine the efficiency of a set of nations from Latin America and Asia in converting Economic Complexity into Human Development between 2010 and 2014. The method used was Data...

  10. Variances as order parameter and complexity measure for random Boolean networks

    International Nuclear Information System (INIS)

    Luque, Bartolo; Ballesteros, Fernando J; Fernandez, Manuel

    2005-01-01

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems

  11. Variances as order parameter and complexity measure for random Boolean networks

    Energy Technology Data Exchange (ETDEWEB)

    Luque, Bartolo [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Ballesteros, Fernando J [Observatori Astronomic, Universitat de Valencia, Ed. Instituts d' Investigacio, Pol. La Coma s/n, E-46980 Paterna, Valencia (Spain); Fernandez, Manuel [Departamento de Matematica Aplicada y EstadIstica, Escuela Superior de Ingenieros Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain)

    2005-02-04

    Several order parameters have been considered to predict and characterize the transition between ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, which have been successfully used. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems.
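
The temporal-variance order parameter can be illustrated with a small simulation. This is a sketch: the network construction and the averaging over nodes follow common random-Boolean-network conventions, not necessarily the authors' exact setup:

```python
import random

def rbn_temporal_variance(n=50, k=2, steps=200, transient=100, seed=1):
    """Average temporal variance of node states in a random Boolean
    network with connectivity k (each node gets k random inputs and a
    random Boolean update table; the transient is discarded).
    """
    rng = random.Random(seed)
    inputs = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    tables = [[rng.randrange(2) for _ in range(2 ** k)] for _ in range(n)]
    state = [rng.randrange(2) for _ in range(n)]
    history = []
    for t in range(transient + steps):
        # Each node reads its k inputs as a binary index into its table.
        idx = [sum(state[j] << b for b, j in enumerate(inputs[i]))
               for i in range(n)]
        state = [tables[i][idx[i]] for i in range(n)]
        if t >= transient:
            history.append(state)
    variances = []
    for i in range(n):
        traj = [s[i] for s in history]
        m = sum(traj) / len(traj)
        variances.append(m * (1 - m))   # variance of a 0/1 sequence
    return sum(variances) / n

v_ordered = rbn_temporal_variance(k=1)   # frozen phase tends toward 0
v_chaotic = rbn_temporal_variance(k=4)   # chaotic phase keeps nodes flickering
```

The ordered-to-chaotic transition at critical connectivity shows up as a change in this averaged variance, which is what makes it usable as an order parameter.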

  12. Geochemical processes in acidic water caused by the weathering of metal sulphides; Procesos geoquimicos en aguas acidas por meteorizacion de sulfuros

    Energy Technology Data Exchange (ETDEWEB)

    Asta Andres, M. P.; Acero Salazar, P.; Auque Sanz, L. F.; Gimeno Serrano, M. J.; Gomez Jimenez, J. B.

    2011-07-01

    Acid generated by the oxidative dissolution of metal sulphides is one of the main sources of pollution in runoff water, groundwater, soils and sediments throughout the world. These types of water are very acidic and contain high concentrations of sulphate and other potentially contaminating elements such as Fe, As, Cd, Sb, Zn and Cu. The acidity generated by sulphide oxidation processes is mainly controlled by the type, quantity and distribution of the sulphide-rich rocks, by the physical characteristics of the rocks (since they determine the accessibility of aqueous solutions and gases to the sulphides), by the presence of microorganisms able to catalyze the main chemical reactions involved in the formation of acid drainage, and by the existence of minerals capable of neutralizing acidity. As a result, the generation of acidic water is a very complex problem, the study of which must be undertaken via a multidisciplinary approach, taking into account geological, geochemical, mineralogical and microbiological aspects among others. The aim of our work is to provide a general overview of these processes and other factors that influence the generation and evolution of these systems, together with information concerning current scientific knowledge about each of these approaches. Thus we hope to provide a basic background to the understanding and study of acid-water systems associated with the weathering of metal sulphides and the processes involved in the generation, migration, evolution and natural attenuation of acidic waters in these environments. (Author) 65 refs.

  13. On Generalized Stam Inequalities and Fisher–Rényi Complexity Measures

    Directory of Open Access Journals (Sweden)

    Steeve Zozor

    2017-09-01

    Full Text Available Information-theoretic inequalities play a fundamental role in numerous scientific and technological areas (e.g., estimation and communication theories, signal and information processing, quantum physics, …) as they generally express the impossibility of having a complete description of a system via a finite number of information measures. In particular, they gave rise to the design of various quantifiers (statistical complexity measures) of the internal complexity of a (quantum) system. In this paper, we introduce a three-parametric Fisher–Rényi complexity, named (p, β, λ)-Fisher–Rényi complexity, based on both a two-parametric extension of the Fisher information and the Rényi entropies of a probability density function ρ characteristic of the system. This complexity measure quantifies the combined balance of the spreading and the gradient contents of ρ, and has the three main properties of a statistical complexity: invariance under translation and scaling transformations, and a universal bounding from below. The latter is proved by generalizing the Stam inequality, which lower-bounds the product of the Shannon entropy power and the Fisher information of a probability density function. An extension of this inequality, a particular case of the general one in which the three parameters are linked, was already proposed by Bercher and Lutwak, allowing the sharp lower bound and the associated probability density with minimal complexity to be determined. Using the notion of differential-escort deformation, we are able to determine the sharp bound of the complexity measure even when the three parameters are decoupled (in a certain range). We determine as well the distribution that saturates the inequality: the (p, β, λ)-Gaussian distribution, which involves an inverse incomplete beta function. Finally, the complexity measure is calculated for various quantum-mechanical states of the harmonic and hydrogenic systems, which are the two main
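For reference, the classical Stam inequality that the paper generalizes, together with the standard definitions it involves, can be written as follows (the three-parametric extension itself is given in the paper):

```latex
% Classical Stam inequality for an n-dimensional density \rho:
% entropy power N times Fisher information F is bounded below.
N[\rho]\, F[\rho] \;\ge\; n,
\qquad
N[\rho] = \frac{1}{2\pi e}\, e^{2 H[\rho]/n},
\qquad
F[\rho] = \int_{\mathbb{R}^n} \frac{\lvert \nabla \rho(x) \rvert^{2}}{\rho(x)}\, dx .

% Renyi entropy entering the generalized versions; the Shannon
% entropy H[\rho] is recovered in the limit \lambda \to 1:
R_{\lambda}[\rho] = \frac{1}{1-\lambda} \ln \int_{\mathbb{R}^n} \rho^{\lambda}(x)\, dx .
```
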

  14. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    International Nuclear Information System (INIS)

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J.

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it has been reported that about 70% of aviation accidents are due to human errors and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals and operating procedures are among the most important factors in the aviation and manufacturing industries. In the case of NPPs, the importance of procedures is even more salient, because not only were over 50% of human errors due to procedures, but about 18% of accidents were caused by failure to follow procedures. Thus, the provision of emergency operating procedures (EOPs) designed so that the possibility of human error is reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. To this end, the step complexity (SC) measure is developed on the basis of three sub-measures: the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (comparing SC scores with subjective evaluation results and with averaged step performance times) but also qualitative validations clarifying the physical meaning of the SC measure are performed.
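A minimal sketch of an entropy-based step-complexity score follows. The element classes and the equal weighting below are hypothetical illustrations; the actual SC measure combines the SIC, SLC and SSC sub-measures as defined in the report:

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical class distribution."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Hypothetical procedure step: actions and logic connectors tagged by type.
step_elements = ["check", "check", "adjust", "AND", "verify", "OR", "check"]

sic = shannon_entropy(step_elements)   # information-content proxy (SIC-like)
ssc = math.log2(len(step_elements))    # size proxy (SSC-like)
sc = 0.5 * sic + 0.5 * ssc             # equal weighting: illustrative only

print(round(sic, 3), round(ssc, 3), round(sc, 3))
```

A more varied mix of element types raises the entropy term, so longer and more heterogeneous steps score as more complex.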

  15. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it has been reported that about 70% of aviation accidents are due to human errors and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals and operating procedures are among the most important factors in the aviation and manufacturing industries. In the case of NPPs, the importance of procedures is even more salient, because not only were over 50% of human errors due to procedures, but about 18% of accidents were caused by failure to follow procedures. Thus, the provision of emergency operating procedures (EOPs) designed so that the possibility of human error is reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. To this end, the step complexity (SC) measure is developed on the basis of three sub-measures: the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (comparing SC scores with subjective evaluation results and with averaged step performance times) but also qualitative validations clarifying the physical meaning of the SC measure are performed.

  16. AUTOMATED MEASURING COMPLEX FOR ACCEPTANCE TESTING OF DC AND UNDULATED-CURRENT TRACTION MOTORS

    Directory of Open Access Journals (Sweden)

    A. Yu. Drubetskyi

    2016-12-01

    Full Text Available Purpose. In the paper it is necessary: (1) to familiarize the reader with the modern classification of measurement and diagnostics and with the problems of automating the measurement of basic parameters during qualification tests of traction motors; (2) to make recommendations to improve measurement accuracy, reduce the labour intensity of carrying out measurements, and reduce the requirements for staff qualifications; (3) to provide a practical implementation of a measurement system built on the basis of the practical recommendations contained in the article. Methodology. The work presents the classification of measurement and diagnostic tools. The author considered a list of equipment that can be used in measurement systems, as well as variants of a third-party measuring complex and of a measuring complex using the stand management system. Their functional schemes were proposed. The author compared the advantages and disadvantages of these schemes to make recommendations on their optimal areas of use. Findings. Having analyzed the functional schemes of the measuring systems, it was found that the use of the control system microcontroller as a measuring complex is expedient if the measurements largely serve a test process control function. The use of a third-party measuring complex is more appropriate when it is required to eliminate dependence on the stand management system, to provide high mobility, and to reduce the requirements for staff qualifications. Originality. The work presents a brief overview of the measurement means. The author developed the functional schemes of measuring systems using the stand management system and a third-party measuring complex, and proposed criteria for evaluating their optimal use. Practical value. Based on the proposed functional diagram, the measuring system was set up on National Instruments hardware and software. The sensors by LEM Company were used as primary

  17. Study of immobilization of waste from treatment of acid waters of a uranium mining facility

    International Nuclear Information System (INIS)

    Goda, R.T.; Oliveira, A.P. de; Silva, N.C. da; Villegas, R.A.S.; Ferreira, A.M.

    2017-01-01

    This study aimed to produce scientific and technical knowledge supporting the development of techniques to immobilize, using Portland cement, the waste generated in the treatment of acid waters at the UTM-INB Caldas uranium mining and processing facility. This residue (calcium diuranate - DUCA) contains uranium compounds and metal hydroxides in a matrix of calcium sulfate. In contact with the lake of acid water in the mine's own pit, this material undergoes resolubilization and therefore changes the quality of the acidic water contained therein, altering the treatment parameters. For the immobilization study, the mass of water contained both in the residue deposited in the mine pit and in the pulp resulting from the treatment of the acid waters was determined. In addition, different DUCA/CEMENT/WATER ratios were used for immobilization and subsequent mechanical strength and leaching tests. In the samples immobilized with 50% cement by mass, no uranium was detected in the leaching tests, and the compressive strength was 9.4 MPa. Although further studies are needed, these results indicate a good capacity to immobilize uranium in cement.

  18. Silicon Isotope Fractionation During Acid Water-Igneous Rock Interaction

    Science.gov (United States)

    van den Boorn, S. H.; van Bergen, M. J.; Vroon, P. Z.

    2007-12-01

    Silica enrichment by metasomatic/hydrothermal alteration is a widespread phenomenon in crustal environments where acid fluids interact with silicate rocks. High-sulfidation epithermal ore deposits and acid-leached residues at hot-spring settings are among the best known examples. Acid alteration acting on basalts has also been invoked to explain the relatively high silica contents of the surface of Mars. We have analyzed basaltic-andesitic lavas from the Kawah Ijen volcanic complex (East Java, Indonesia) that were altered by interaction with the highly acid (pH~1) sulfate-chloride water of its crater lake and seepage stream. Quantitative removal of major elements during this interaction has led to a relative increase in SiO2 contents. Our silicon isotope data, obtained by HR-MC-ICPMS and reported relative to the NIST RM8546 (=NBS28) standard, show a systematic increase in δ³⁰Si from -0.2‰ (±0.3, 2sd) for unaltered andesites and basalts to +1.5‰ (±0.3, 2sd) for the most altered/silicified rocks. These results demonstrate that silicification induced by pervasive acid alteration is accompanied by significant Si isotope fractionation, so that altered products become isotopically heavier than the precursor rocks. Despite the observed enrichment in SiO2, the rocks have experienced an overall net loss of silicon upon alteration, if Nb is considered perfectly immobile. The observed δ³⁰Si values of the alteration products correlate well with the inferred amounts of silicon loss. These findings suggest that ²⁸Si is preferentially leached during water-rock interaction, implying that dissolved silica in the ambient lake and stream water is isotopically light. However, layered opaline lake sediments, believed to represent precipitates from the silica-saturated water, show a conspicuous ³⁰Si enrichment (+1.2 ± 0.2‰). Because inorganic precipitation is known to discriminate against the heavy isotope (e.g. Basile-Doelsch et al., 2006
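The δ³⁰Si values quoted in the abstract use the standard delta notation, i.e. the per-mil deviation of the sample's ³⁰Si/²⁸Si ratio from that of the NBS28 reference material:

```latex
% Standard silicon isotope delta notation, reported in per mil:
\delta^{30}\mathrm{Si} \;=\;
\left(
\frac{\left({}^{30}\mathrm{Si}/{}^{28}\mathrm{Si}\right)_{\mathrm{sample}}}
     {\left({}^{30}\mathrm{Si}/{}^{28}\mathrm{Si}\right)_{\mathrm{NBS28}}}
\;-\; 1
\right) \times 10^{3}
```
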

  19. Vapor-liquid equilibria for nitric acid-water and plutonium nitrate-nitric acid-water solutions

    International Nuclear Information System (INIS)

    Maimoni, A.

    1980-01-01

    The liquid-vapor equilibrium data for nitric acid and nitric acid-plutonium nitrate-water solutions were examined to develop correlations covering the range of conditions encountered in nuclear fuel reprocessing. The scanty available data for plutonium nitrate solutions are of poor quality but allow an order-of-magnitude estimate to be made. A formal thermodynamic analysis was attempted initially but was not successful, due to the poor quality of the data as well as the complex chemical equilibria involved in the nitric acid and in the plutonium nitrate solutions. Thus, while there was no difficulty in correlating activity coefficients for nitric acid solutions over relatively narrow temperature ranges, attempts to extend the correlations over the range from 25 °C to the boiling point were not successful. The available data were then analyzed using empirical correlations from which normal boiling points and relative volatilities can be obtained over the concentration ranges 0 to 700 g/l Pu and 0 to 13 M nitric acid. Activity coefficients are required, however, if estimates of individual component vapor pressures are needed. The required ternary activity coefficients can be approximated from the correlations
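The relative volatilities such correlations provide follow the usual definition; when activity coefficients are available they enter through modified Raoult's law (assuming an ideal vapour phase):

```latex
% Relative volatility of components i and j; gamma are the liquid-phase
% activity coefficients and p^sat the pure-component vapour pressures.
\alpha_{ij} \;=\; \frac{y_i / x_i}{y_j / x_j}
\;=\; \frac{\gamma_i\, p_i^{\mathrm{sat}}}{\gamma_j\, p_j^{\mathrm{sat}}}
```
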

  20. Complex numbers in chemometrics: examples from multivariate impedance measurements on lipid monolayers.

    Science.gov (United States)

    Geladi, Paul; Nelson, Andrew; Lindholm-Sethson, Britta

    2007-07-09

    Electrical impedance gives multivariate complex number data as results. Two examples of multivariate electrical impedance data measured on lipid monolayers in different solutions give rise to matrices (16×50 and 38×50) of complex numbers. Multivariate data analysis by principal component analysis (PCA) or singular value decomposition (SVD) can be used for complex data, and the necessary equations are given. The scores and loadings obtained are vectors of complex numbers. It is shown that complex number PCA and SVD are better at concentrating information in a few components than the naïve juxtaposition method, and that Argand diagrams can replace score and loading plots. Different concentrations of Magainin and Gramicidin A give different responses, and the role of the electrolyte medium can also be studied. An interaction of Gramicidin A in the solution with the monolayer over time can be observed.
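In practice, complex-number PCA/SVD of the kind described reduces to an ordinary SVD of the column-centred complex data matrix, since NumPy's linear algebra routines handle complex input directly (using the conjugate transpose internally). A sketch with a synthetic matrix of the stated 16×50 shape (random data, not the monolayer measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for an impedance data set: 16 samples x 50 frequencies.
Z = rng.standard_normal((16, 50)) + 1j * rng.standard_normal((16, 50))

# Column-centre and decompose; np.linalg.svd accepts complex matrices.
Zc = Z - Z.mean(axis=0)
U, s, Vh = np.linalg.svd(Zc, full_matrices=False)

scores = U * s            # complex scores: one Argand diagram per component
loadings = Vh.conj().T    # complex loadings

# The centred data are recovered exactly from scores and loadings.
print(np.allclose(scores @ Vh, Zc))
```

The singular values `s` are real and non-negative even for complex data, so explained-variance plots work exactly as in the real case.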

  1. Changes in the Complexity of Heart Rate Variability with Exercise Training Measured by Multiscale Entropy-Based Measurements

    Directory of Open Access Journals (Sweden)

    Frederico Sassoli Fazan

    2018-01-01

    Full Text Available Quantifying complexity from heart rate variability (HRV) series is a challenging task, and multiscale entropy (MSE), along with its variants, has been demonstrated to be one of the most robust approaches to achieve this goal. Although physical training is known to be beneficial, there is little information about the long-term complexity changes induced by physical conditioning. The present study aimed to quantify the changes in physiological complexity elicited by physical training through multiscale entropy-based complexity measurements. Rats were subjected to a protocol of medium-intensity training (n = 13) or a sedentary protocol (n = 12). One-hour HRV series were obtained from all conscious rats five days after the experimental protocol. We estimated MSE, multiscale dispersion entropy (MDE) and multiscale SDiff_q from the HRV series. Multiscale SDiff_q is a recent approach that accounts for entropy differences between a given time series and its shuffled dynamics. From SDiff_q, three attributes (q-attributes) were derived, namely SDiff_qmax, q_max and q_zero. MSE, MDE and the multiscale q-attributes presented similar profiles, except for SDiff_qmax. q_max showed significant differences between trained and sedentary groups on time scales 6 to 20. The results suggest that physical training increases the system complexity and that multiscale q-attributes provide valuable information about physiological complexity.
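The coarse-graining plus sample-entropy pipeline underlying MSE can be sketched as follows. The tolerance, embedding dimension and white-noise test signal are conventional defaults, not the study's protocol:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r): -log of the conditional probability that sequences
    matching for m points (Chebyshev distance <= r) also match for m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def match_count(mm):
        t = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return ((d <= r).sum() - len(t)) / 2  # pairs, excluding self-matches

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6)):
    """Coarse-grain by non-overlapping averaging, then SampEn per scale."""
    x = np.asarray(x, dtype=float)
    return [sample_entropy(x[: len(x) // s * s].reshape(-1, s).mean(axis=1))
            for s in scales]

rng = np.random.default_rng(2)
mse = multiscale_entropy(rng.standard_normal(1000))
print([round(v, 2) for v in mse])
```

For white noise the entropy falls with scale, while genuinely complex signals sustain their entropy across scales, which is what MSE-style profiles exploit.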

  2. The effect of electrode contact resistance and capacitive coupling on Complex Resistivity measurements

    DEFF Research Database (Denmark)

    Ingeman-Nielsen, Thomas

    2006-01-01

    The effect of electrode contact resistance and capacitive coupling on complex resistivity (CR) measurements is studied in this paper. An equivalent circuit model for the receiver is developed to describe the effects. The model shows that CR measurements are severely affected even at relatively lo...... with the contact resistance artificially increased by resistors. The results emphasize the importance of keeping contact resistance low in CR measurements....

  3. Design and Functional Validation of a Complex Impedance Measurement Device for Characterization of Ultrasonic Transducers

    International Nuclear Information System (INIS)

    De-Cock, Wouter; Cools, Jan; Leroux, Paul

    2013-06-01

    This paper presents the design and practical implementation of a complex impedance measurement device capable of characterizing ultrasonic transducers. The device works in the frequency range used by industrial ultrasonic transducers, which is below the measurement range of modern high-end network analyzers. The device uses the Goertzel algorithm instead of the more common FFT algorithm to calculate the magnitude and phase components of the impedance under test. A theoretical overview is given, followed by a practical approach and measurement results. (authors)
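The Goertzel algorithm mentioned evaluates a single DFT bin with an O(n) recurrence, which is why it suits a fixed excitation frequency better than a full FFT. A sketch (the tone frequency and sample rate are illustrative, not the device's values):

```python
import math

def goertzel(samples, k, n):
    """Goertzel algorithm: DFT bin k of an n-point block in O(n),
    returning the complex bin value (magnitude/phase follow from it)."""
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples[:n]:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Finalization: y = s[n-1] - exp(-jw) * s[n-2]
    return complex(s_prev - s_prev2 * math.cos(w), s_prev2 * math.sin(w))

# 100 kHz tone sampled at 1 MHz, probed at its own DFT bin.
fs, f0, n = 1_000_000, 100_000, 1000
x = [math.sin(2 * math.pi * f0 * t / fs) for t in range(n)]
bin_value = goertzel(x, k=f0 * n // fs, n=n)

# A full-scale sine at an exact bin has DFT magnitude n/2.
print(round(abs(bin_value) / (n / 2), 6))
```

Dividing the bin value for the measured voltage by that for the measured current at the same excitation bin yields the complex impedance directly.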

  4. Raven’s Progressive Matrices, manipulations of complexity and measures of accuracy, speed and confidence

    OpenAIRE

    LAZAR STANKOV; KARL SCHWEIZER

    2007-01-01

    This paper examines the effects of complexity-enhancing manipulations of two cognitive tasks – Swaps and Triplet Numbers tests (Stankov, 2000) – on their relationship with Raven’s Progressive Matrices test representing aspects of fluid intelligence. The complexity manipulations involved four treatment levels, each requiring an increasing number of components and relationships among these components. The accuracy, speed of processing, and confidence measures were decomposed into experimental a...

  5. Using Response Times to Measure Strategic Complexity and the Value of Thinking in Games

    OpenAIRE

    Gill, David; Prowse, Victoria L.

    2017-01-01

    Response times are a simple low-cost indicator of the process of reasoning in strategic games (Rubinstein, 2007; Rubinstein, 2016). We leverage the dynamic nature of response-time data from repeated strategic interactions to measure the strategic complexity of a situation by how long people think on average when they face that situation (where we define situations according to the characteristics of play in the previous round). We find that strategic complexity varies significantly across sit...

  6. Lanthanide complexes as luminogenic probes to measure sulfide levels in industrial samples

    International Nuclear Information System (INIS)

    Thorson, Megan K.; Ung, Phuc; Leaver, Franklin M.; Corbin, Teresa S.; Tuck, Kellie L.; Graham, Bim; Barrios, Amy M.

    2015-01-01

    A series of lanthanide-based, azide-appended complexes were investigated as hydrogen sulfide-sensitive probes. Europium complex 1 and Tb complex 3 both displayed a sulfide-dependent increase in luminescence, while Tb complex 2 displayed a decrease in luminescence upon exposure to NaHS. The utility of the complexes for monitoring sulfide levels in industrial oil and water samples was investigated. Complex 3 provided a sensitive measure of sulfide levels in petrochemical water samples (detection limit ∼ 250 nM), while complex 1 was capable of monitoring μM levels of sulfide in partially refined crude oil. - Highlights: • Lanthanide–azide based sulfide sensors were synthesized and characterized. • The probes have excitation and emission profiles compatible with sulfide-contaminated samples from the petrochemical industry. • A terbium-based probe was used to measure the sulfide concentration in oil refinery wastewater. • A europium-based probe had compatibility with partially refined crude oil samples.

  7. Lanthanide complexes as luminogenic probes to measure sulfide levels in industrial samples

    Energy Technology Data Exchange (ETDEWEB)

    Thorson, Megan K. [Department of Medicinal Chemistry, University of Utah College of Pharmacy, Salt Lake City, UT 84108 (United States); Ung, Phuc [Monash Institute of Pharmaceutical Sciences, Monash University, Victoria 3052 (Australia); Leaver, Franklin M. [Water & Energy Systems Technology, Inc., Kaysville, UT 84037 (United States); Corbin, Teresa S. [Quality Services Laboratory, Tesoro Refining and Marketing, Salt Lake City, UT 84103 (United States); Tuck, Kellie L., E-mail: kellie.tuck@monash.edu [School of Chemistry, Monash University, Victoria 3800 (Australia); Graham, Bim, E-mail: bim.graham@monash.edu [Monash Institute of Pharmaceutical Sciences, Monash University, Victoria 3052 (Australia); Barrios, Amy M., E-mail: amy.barrios@utah.edu [Department of Medicinal Chemistry, University of Utah College of Pharmacy, Salt Lake City, UT 84108 (United States)

    2015-10-08

    A series of lanthanide-based, azide-appended complexes were investigated as hydrogen sulfide-sensitive probes. Europium complex 1 and Tb complex 3 both displayed a sulfide-dependent increase in luminescence, while Tb complex 2 displayed a decrease in luminescence upon exposure to NaHS. The utility of the complexes for monitoring sulfide levels in industrial oil and water samples was investigated. Complex 3 provided a sensitive measure of sulfide levels in petrochemical water samples (detection limit ∼ 250 nM), while complex 1 was capable of monitoring μM levels of sulfide in partially refined crude oil. - Highlights: • Lanthanide–azide based sulfide sensors were synthesized and characterized. • The probes have excitation and emission profiles compatible with sulfide-contaminated samples from the petrochemical industry. • A terbium-based probe was used to measure the sulfide concentration in oil refinery wastewater. • A europium-based probe had compatibility with partially refined crude oil samples.

  8. Measuring streetscape complexity based on the statistics of local contrast and spatial frequency.

    Directory of Open Access Journals (Sweden)

    André Cavalcante

    Full Text Available Streetscapes are basic urban elements which play a major role in the livability of a city. The visual complexity of streetscapes is known to influence how people behave in such built spaces. However, how and which characteristics of a visual scene influence our perception of complexity have yet to be fully understood. This study proposes a method to evaluate the complexity perceived in streetscapes based on the statistics of local contrast and spatial frequency. Here, 74 streetscape images from four cities, including daytime and nighttime scenes, were ranked for complexity by 40 participants. Image processing was then used to locally segment contrast and spatial frequency in the streetscapes. The statistics of these characteristics were extracted and later combined to form a single objective measure. The direct use of statistics revealed structural or morphological patterns in streetscapes related to the perception of complexity. Furthermore, in comparison to conventional measures of visual complexity, the proposed objective measure exhibits a higher correlation with the opinion of the participants. Also, the performance of this method is more robust regarding different time scenarios.
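Patch-wise local contrast and spatial-frequency statistics of the kind the method relies on can be sketched as follows. The patch size, the scalar combination and the synthetic images are assumptions for illustration, not the study's procedure:

```python
import numpy as np

def local_statistics(img, patch=16):
    """Per-patch RMS contrast and power-weighted mean spatial frequency."""
    h, w = (s // patch * patch for s in img.shape)
    blocks = (img[:h, :w]
              .reshape(h // patch, patch, w // patch, patch)
              .swapaxes(1, 2)
              .reshape(-1, patch, patch))
    contrast = blocks.std(axis=(1, 2))                      # RMS contrast
    fy = np.fft.fftfreq(patch)[:, None]
    fx = np.fft.fftfreq(patch)[None, :]
    radius = np.hypot(fy, fx)                               # cycles/pixel
    centred = blocks - blocks.mean(axis=(1, 2), keepdims=True)
    power = np.abs(np.fft.fft2(centred)) ** 2
    freq = (power * radius).sum(axis=(1, 2)) / (power.sum(axis=(1, 2)) + 1e-12)
    return contrast, freq

def complexity_score(img):
    """Illustrative scalar: means plus dispersions of both statistics."""
    c, f = local_statistics(img)
    return c.mean() + c.std() + f.mean() + f.std()

rng = np.random.default_rng(3)
smooth = rng.standard_normal((128, 128)).cumsum(axis=0).cumsum(axis=1)
smooth = (smooth - smooth.mean()) / smooth.std()
noisy = rng.standard_normal((128, 128))
print(complexity_score(smooth) < complexity_score(noisy))
```

A smooth, low-frequency image scores lower than pixel noise because both its local contrast and its mean spatial frequency are smaller, matching the intuition that busier scenes read as more complex.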

  9. The step complexity measure for emergency operating procedures - comparing with simulation data

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Kim, Jaewhan; Ha, Jaejoo; Shin, Yunghwa

    2001-01-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. Therefore, to prevent accidents and to ensure system safety, extensive effort has been made to identify the significant factors that cause human errors. According to related studies, written manuals and operating procedures are among the most important of these factors, and the complexity or understandability of a procedure is pointed out as one of the major causes of procedure-related human errors. Many qualitative checklists have been suggested for evaluating the emergency operating procedures (EOPs) of NPPs. However, since qualitative evaluations using checklists have some drawbacks, a quantitative measure that can quantify the complexity of EOPs is needed to compensate for them. To this end, Park et al. suggested the step complexity (SC) measure to quantify the complexity of a step included in EOPs. In this paper, to assess the appropriateness of the SC measure, SC scores are compared with averaged step performance time data obtained from emergency training records. The total number of available records is 36, and the training scenarios are a loss of coolant accident and an excess steam dump event, with 18 records for each scenario. From these emergency training records, step performance time data for 39 steps are retrieved and compared with their estimated SC scores. In addition, several questions that need to be answered to clarify the appropriateness of the SC measure are discussed. As a result, it was observed that estimated SC scores and step performance time data have a statistically meaningful correlation. Thus, it can be concluded that the SC measure can quantify the complexity of steps included in EOPs

  10. Random walk-based similarity measure method for patterns in complex object

    Directory of Open Access Journals (Sweden)

    Liu Shihu

    2017-04-01

    Full Text Available This paper discusses the similarity of patterns in complex objects. A complex object is composed both of attribute information about the patterns and of relational information between the patterns. Bearing in mind this specificity of complex objects, a random walk-based similarity measurement method for patterns is constructed. In this method, the reachability of any two patterns with respect to the relational information is fully studied, so that the similarity of patterns with respect to the relational information can be calculated. On this basis, an integrated similarity measurement method is proposed, and Algorithms 1 and 2 show the calculation procedure. One can see that this method makes full use of both the attribute information and the relational information. Finally, a synthetic example validates the proposed similarity measurement method.
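A random-walk-with-restart similarity over the relational (adjacency) information can be sketched as follows. The restart probability, step count and toy graph are illustrative; the paper's integrated measure additionally folds in the attribute information:

```python
import numpy as np

def rw_similarity(adj, steps=10, restart=0.3):
    """Random walk with restart: S[i, j] approximates the probability of
    finding a walker at pattern j when walks repeatedly restart at i.
    (Assumes every pattern has at least one relation.)"""
    adj = np.asarray(adj, dtype=float)
    P = adj / adj.sum(axis=1, keepdims=True)   # row-stochastic transitions
    n = len(adj)
    S = np.eye(n)
    for _ in range(steps):                     # fixed-point iteration
        S = restart * np.eye(n) + (1 - restart) * S @ P
    return S

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
S = rw_similarity(adj)

# Patterns 0 and 1 share neighbours, so they come out more similar
# than patterns 0 and 3.
print(S[0, 1] > S[0, 3])
```

Each row of `S` sums to one, so the rows can be compared directly as reachability profiles.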

  11. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    Science.gov (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such ensemble-based approach could be also effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  12. An Attractor-Based Complexity Measurement for Boolean Recurrent Neural Networks

    Science.gov (United States)

    Cabessa, Jérémie; Villa, Alessandro E. P.

    2014-01-01

    We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that represents an assessment of their computational power in terms of the significance of their attractor dynamics. This complexity measurement is achieved by first proving a computational equivalence between Boolean recurrent neural networks and a specific class of ω-automata, and then translating the most refined classification of ω-automata to the Boolean neural network context. As a result, a hierarchical classification of Boolean neural networks based on their attractive dynamics is obtained, thus providing a novel refined attractor-based complexity measurement for Boolean recurrent neural networks. These results provide new theoretical insights into the computational and dynamical capabilities of neural networks according to their attractive potentialities. An application of our findings is illustrated by the analysis of the dynamics of a simplified model of the basal ganglia-thalamocortical network simulated by a Boolean recurrent neural network. This example shows the significance of measuring network complexity, and how our results provide new founding elements for the understanding of the complexity of real brain circuits. PMID:24727866
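The attractor dynamics that such complexity measurements build on can be enumerated exhaustively for a small synchronous Boolean network (the 3-node update rule below is hypothetical, not taken from the paper):

```python
from itertools import product

def attractors(n, update):
    """Enumerate the attractors of an n-node synchronous Boolean network
    by following every initial state into its cycle (feasible for small n)."""
    found = set()
    for state in product((0, 1), repeat=n):
        seen = {}
        while state not in seen:
            seen[state] = len(seen)
            state = update(state)
        first = seen[state]                     # index where the cycle starts
        cycle = frozenset(s for s, i in seen.items() if i >= first)
        found.add(cycle)
    return found

# Hypothetical 3-node network: x0' = x1 AND x2, x1' = NOT x0, x2' = x0 OR x1.
def update(s):
    return (s[1] & s[2], 1 - s[0], s[0] | s[1])

for cycle in sorted(attractors(3, update), key=len):
    print(len(cycle), sorted(cycle))
```

The number, lengths and basin structure of the cycles found this way are the raw material that attractor-based complexity classifications refine.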

  13. Measuring acute rehabilitation needs in trauma: preliminary evaluation of the Rehabilitation Complexity Scale.

    Science.gov (United States)

    Hoffman, Karen; West, Anita; Nott, Philippa; Cole, Elaine; Playford, Diane; Liu, Clarence; Brohi, Karim

    2013-01-01

    Injury severity, disability and care dependency are frequently used as surrogate measures for rehabilitation requirements following trauma. The true rehabilitation needs of patients may be different, but there are no validated tools for the measurement of rehabilitation complexity in acute trauma care. The aim of the study was to evaluate the potential utility of the Rehabilitation Complexity Scale (RCS) version 2 in measuring acute rehabilitation needs in trauma patients. A prospective observational study of 103 patients with traumatic injuries was conducted in a Major Trauma Centre. Rehabilitation complexity was measured using the RCS and disability was measured using the Barthel Index. Demographic information and injury characteristics were obtained from the trauma database. The RCS was closely correlated with injury severity (r=0.69, p<0.001) and the Barthel Index (r=0.91, p<0.001). However, the Barthel Index was poor at discriminating between patients' rehabilitation needs, especially for patients with higher injury severities. Of 58 patients classified as 'very dependent' by the Barthel Index, 21 (36%) had low or moderate rehabilitation complexity. The RCS correlated with acute hospital length of stay (r=0.64, p<0.001), and patients with a low RCS were more likely to be discharged home. The Barthel Index had a flooring effect (56% of patients classified as very dependent were discharged home) and lacked discrimination despite its close statistical correlation. The RCS outperformed the ISS and the Barthel Index in its ability to identify rehabilitation requirements in relation to injury severity, rehabilitation complexity, length of stay and discharge destination. The RCS is potentially a feasible and useful tool for the assessment of rehabilitation complexity in acute trauma care, providing a specific measurement of patients' rehabilitation requirements. A larger longitudinal study is needed to evaluate the RCS in the assessment of patient need, service provision and trauma system performance

  14. Complex Permittivity Measurements of Textiles and Leather in a Free Space: An Angular-Invariant Approach

    OpenAIRE

    Kapilevich, B.; Litvak, B.; Anisimov, M.; Hardon, D.; Pinhasi, Y.

    2012-01-01

    The paper describes complex permittivity measurements of textiles and leathers in free space at 330 GHz. The destructive role of the Rayleigh scattering effect is considered, and the angular-invariant limit for the incidence angle has been determined experimentally to lie within 25–30 degrees. If the incidence angle exceeds this critical value, the uncertainty caused by Rayleigh scattering increases drastically, preventing accurate measurement of the real and imaginary parts of a bulky mat...

  15. Using measures of information content and complexity of time series as hydrologic metrics

    Science.gov (United States)

    Information theory has previously been used to develop metrics that characterize temporal patterns in soil moisture dynamics, and to evaluate and compare the performance of soil water flow models. The objective of this study was to apply information and complexity measures to characte...

  16. In vivo and in situ measurement and modelling of intra-body effective complex permittivity

    DEFF Research Database (Denmark)

    Nadimi, Esmaeil S; Blanes-Vidal, Victoria; Harslund, Jakob L F

    2015-01-01

    Radio frequency tracking of medical micro-robots in minimally invasive medicine is usually investigated under the assumption that the human body is a homogeneous propagation medium. In this Letter, the authors conducted various trial programs to measure and model the effective complex permittivity e...

  17. Quantification of spatial structure of human proximal tibial bone biopsies using 3D measures of complexity

    DEFF Research Database (Denmark)

    Saparin, Peter I.; Thomsen, Jesper Skovhus; Prohaska, Steffen

    2005-01-01

    3D data sets of human tibia bone biopsies acquired by a micro-CT scanner. In order to justify the newly proposed approach, the measures of complexity of the bone architecture were compared with the results of traditional 2D bone histomorphometry. The proposed technique is able to quantify...

  18. Experimental investigation of thermodynamic properties of binary mixtures of acetic acid + n-butanol and acetic acid + water at temperatures from 293.15 K to 343.15 K

    Science.gov (United States)

    Paul, M. Danish John; Shruthi, N.; Anantharaj, R.

    2018-04-01

    Derived thermodynamic properties, namely the excess molar volume, partial molar volume, excess partial molar volume and apparent volume, of the binary mixtures acetic acid + n-butanol and acetic acid + water have been investigated using measured mixture densities at temperatures from 293.15 K to 343.15 K.

  19. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical- and medicinal chemistry including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.

  20. Sequential Washing with Electrolyzed Alkaline and Acidic Water Effectively Removes Pathogens from Metal Surfaces.

    Directory of Open Access Journals (Sweden)

    Yuichiro Nakano

    Full Text Available Removal of pathogenic organisms from reprocessed surgical instruments is essential to prevent iatrogenic infections. Some bacteria can make persistent biofilms on medical devices. Contamination of non-disposable equipment with prions also represents a serious risk to surgical patients. Efficient disinfection of prions from endoscopes and other instruments such as high-resolution cameras remains problematic because these instruments do not tolerate aggressive chemical or heat treatments. Herein, we develop a new washing system that uses both the alkaline and acidic water produced by electrolysis. Electrolyzed acidic water, containing HCl and HOCl as active substances, has been reported to be an effective disinfectant. A 0.15% NaCl solution was electrolyzed and used immediately to wash bio-contaminated stainless steel model systems with alkaline water (pH 11.9 with sonication, and then with acidic water (pH 2.7 without sonication. Two bacterial species (Staphylococcus aureus and Pseudomonas aeruginosa and a fungus (Candida albicans were effectively removed or inactivated by the washing process. In addition, this process effectively removed or inactivated prions from the stainless steel surfaces. This washing system will be potentially useful for the disinfection of clinical devices such as neuroendoscopes because electrolyzed water is gentle to both patients and equipment and is environmentally sound.

  1. Characterization of U(VI)-carbonato ternary complexes on hematite: EXAFS and electrophoretic mobility measurements

    Science.gov (United States)

    Bargar, John R.; Reitmeyer, Rebecca; Lenhart, John J.; Davis, James A.

    2000-01-01

    We have measured U(VI) adsorption on hematite using EXAFS spectroscopy and electrophoresis under conditions relevant to surface waters and aquifers (0.01 to 10 μM dissolved uranium concentrations, in equilibrium with air, pH 4.5 to 8.5). Both techniques suggest the existence of anionic U(VI)-carbonato ternary complexes. Fits to EXAFS spectra indicate that U(VI) is simultaneously coordinated to surface FeO6 octahedra and carbonate (or bicarbonate) ligands in bidentate fashions, leading to the conclusion that the ternary complexes have an inner-sphere metal bridging (hematite-U(VI)-carbonato) structure. Greater than or equal to 50% of adsorbed U(VI) was comprised of monomeric hematite-U(VI)-carbonato ternary complexes, even at pH 4.5. Multimeric U(VI) species were observed at pH ≥ 6.5 and aqueous U(VI) concentrations approximately an order of magnitude more dilute than the solubility of crystalline β-UO2(OH)2. Based on structural constraints, these complexes were interpreted as dimeric hematite-U(VI)-carbonato ternary complexes. These results suggest that Fe-oxide-U(VI)-carbonato complexes are likely to be important transport-limiting species in oxic aquifers throughout a wide range of pH values.

  2. The method of measurement and synchronization control for large-scale complex loading system

    International Nuclear Information System (INIS)

    Liao Min; Li Pengyuan; Hou Binglin; Chi Chengfang; Zhang Bo

    2012-01-01

    With the development of modern industrial technology, measurement and control systems have become widely used in high-precision, complex industrial control equipment and large-tonnage loading devices. A measurement and control system is often used to analyze the distribution of stress and displacement in a complex bearing load or in the mechanical structure itself. In the ITER GS mock-up with 5 flexible plates, for each load combination it is necessary to detect and measure potential slippage between the central flexible plate and the neighboring spacers, as well as between each pre-stressing bar and its neighboring plate. The measurement and control system consists of seven sets of EDC controllers and boards, a computer system, a 16-channel quasi-dynamic strain gauge, 25 sets of displacement sensors, and 7 sets of load and displacement sensors in the cylinders. This paper demonstrates the principles and methods by which the EDC220 digital controller achieves synchronization control, and the R and D process of the multi-channel loading control and measurement software. (authors)

  3. Development of the step complexity measure for emergency operating procedures using entropy concepts

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Ha, Jaejoo

    2001-01-01

    For a nuclear power plant (NPP), symptom-based emergency operating procedures (EOPs) have been adopted to enhance the safety of NPPs through reduction of operators' workload under emergency conditions. Symptom-based EOPs, however, can still place a workload on operators because they have to not only identify related symptoms, but also understand the context of the steps that should be carried out. Therefore, many qualitative checklists have been suggested to ensure the appropriateness of steps included in EOPs. However, since these qualitative evaluations have some drawbacks, a quantitative measure that can roughly estimate the complexity of EOP steps is needed to compensate for them. In this paper, a method to evaluate the complexity of an EOP step is developed based on entropy measures that have been used in software engineering. On this basis, a step complexity (SC) measure was developed that evaluates SC from various viewpoints, such as the amount of information and operators' actions included in each EOP step, and the logic structure of each EOP step. To verify the suitability of the SC measure, estimated SC values were compared with subjective task load scores obtained from the NASA-TLX (task load index) method and with step performance times obtained from a full-scope simulator. From these comparisons, it was observed that the estimated SC values generally agree with the NASA-TLX scores and the step performance time data. Thus, it can be concluded that the developed SC measure is suitable for evaluating the SC of an EOP step.
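
    A quick way to see what an entropy-based complexity measure captures is plain Shannon entropy over the distribution of elements in a step. The sketch below is illustrative only; the SC measure itself combines several graph entropies, and the action labels here are hypothetical:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of the empirical distribution of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Hypothetical action labels from one EOP step: a step mixing many distinct
# actions carries more information than a repetitive one.
varied = shannon_entropy(["check", "open", "verify", "close"])  # 2.0 bits
repetitive = shannon_entropy(["check"] * 4)                     # 0.0 bits
```

    A step with many distinct actions and branches thus scores higher than one that repeats a single action, which is the intuition behind quantifying step workload with entropy.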

  4. Approach to determine measurement uncertainty in complex nanosystems with multiparametric dependencies and multivariate output quantities

    Science.gov (United States)

    Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.

    2018-03-01

    In many cases, the determination of the measurement uncertainty of complex nanosystems provides unexpected challenges. This is in particular true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems. This includes the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is exemplarily shown in detail for two experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach for uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of concentration of molecules using surface enhanced Raman scattering, is shortly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.

  5. Complex Hand Dexterity: A Review of Biomechanical Methods for Measuring Musical Performance

    Directory of Open Access Journals (Sweden)

    Cheryl Diane Metcalf

    2014-05-01

    Full Text Available Complex hand dexterity is fundamental to our interactions with the physical, social and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress in the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods offer insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks represent the ultimate achievement in complex hand dexterity. This paper reviews methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities.

  6. Auto-correlation based intelligent technique for complex waveform presentation and measurement

    International Nuclear Information System (INIS)

    Rana, K P S; Singh, R; Sayann, K S

    2009-01-01

    Waveform acquisition and presentation form the heart of many measurement systems. In particular, data acquisition and presentation of repeating complex signals like sine sweeps and frequency-modulated signals introduce the challenge of waveform time period estimation and live waveform presentation. This paper presents an intelligent technique for waveform period estimation of both complex and simple waveforms, based on the normalized auto-correlation method. The proposed technique is demonstrated using LabVIEW-based intensive simulations on several simple and complex waveforms. Implementation of the technique is successfully demonstrated using LabVIEW-based virtual instrumentation. Sine sweep vibration waveforms are successfully presented and measured for vibrations generated by an electrodynamic shaker system. The proposed method is also suitable for digital storage oscilloscope (DSO) triggering, for complex signal acquisition and presentation. This intelligence can be embodied into the DSO, making it an intelligent measurement system catering to a wide variety of waveforms. The proposed technique, simulation results, robustness study and implementation results are presented in this paper.
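
    The core of normalized auto-correlation period estimation can be sketched in a few lines of NumPy; this is an illustrative reconstruction of the technique, not the authors' LabVIEW implementation:

```python
import numpy as np

def estimate_period(signal, fs):
    """Estimate the period of a repeating waveform from its normalized
    auto-correlation (illustrative sketch)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[x.size - 1:]  # non-negative lags only
    ac /= ac[0]                                        # normalize: ac[0] == 1
    # Skip past the lag-0 peak: find the first sign change, then the next maximum
    drops = np.where(np.diff(np.sign(ac)) < 0)[0]
    start = drops[0] if drops.size else 1
    lag = start + np.argmax(ac[start:])
    return lag / fs  # period in seconds

# 50 Hz sine sampled at 1 kHz -> expected period 0.02 s
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
period = estimate_period(np.sin(2 * np.pi * 50 * t), fs)
```

    The same lag estimate can drive DSO triggering: once the period is known, the trigger point can be re-armed one period later to display a stable waveform.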

  7. Accurate and simple measurement method of complex decay schemes radionuclide activity

    International Nuclear Information System (INIS)

    Legrand, J.; Clement, C.; Bac, C.

    1975-01-01

    A simple method for the measurement of activity is described. It consists of using a well-type sodium iodide crystal whose efficiency with monoenergetic photon rays has been computed or measured. For each radionuclide with a complex decay scheme a total efficiency is computed; it is shown that the efficiency is very high, near 100%. The associated uncertainty is low, in spite of the important uncertainties in the different parameters used in the computation. The method has been applied to the measurement of the 152Eu primary reference [fr]

  8. Directed clustering coefficient as a measure of systemic risk in complex banking networks

    Science.gov (United States)

    Tabak, Benjamin M.; Takami, Marcelo; Rocha, Jadson M. C.; Cajueiro, Daniel O.; Souza, Sergio R. S.

    2014-01-01

    Recent literature has focused on the study of systemic risk in complex networks. It is clear now, after the crisis of 2008, that the aggregate behavior of the interaction among agents is not straightforward and it is very difficult to predict. Contributing to this debate, this paper shows that the directed clustering coefficient may be used as a measure of systemic risk in complex networks. Furthermore, using data from the Brazilian interbank network, we show that the directed clustering coefficient is negatively correlated with domestic interest rates.
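
    One widely used definition of the directed clustering coefficient (Fagiolo's) can be computed directly from the adjacency matrix. The sketch below is a minimal illustration; the toy graph is made up, not Brazilian interbank data:

```python
import numpy as np

def directed_clustering(adj):
    """Fagiolo-style clustering coefficient for a directed graph, from its
    binary adjacency matrix (no self-loops). One common definition."""
    A = np.asarray(adj, dtype=float)
    S = A + A.T
    triangles = np.diag(S @ S @ S) / 2.0      # closed directed triangles per node
    d_tot = A.sum(axis=0) + A.sum(axis=1)     # in-degree + out-degree
    d_bi = np.diag(A @ A)                     # reciprocated links
    possible = d_tot * (d_tot - 1) - 2 * d_bi # possible directed triangles
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(possible > 0, triangles / possible, 0.0)

# Toy directed 3-cycle 0 -> 1 -> 2 -> 0: each node closes one of its
# two possible directed triangles, so every coefficient is 0.5.
A = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
coeffs = directed_clustering(A)
```

    Averaging these node-level coefficients over the network gives a single number that can then be tracked against macro variables such as interest rates, as the paper does.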

  9. Validation of the complex of measures of medical rehabilitation of children victims of Chernobyl accident

    International Nuclear Information System (INIS)

    Paramonov, Z.M.

    1999-01-01

    A special complex program including social, medical, organizational and hygienic aspects of studying the state of health of children and the characteristics of the system of medical service has been worked out. The peculiarities of changes in the state of health of children and their correlation with the level of internal irradiation, as well as the ways the latter is formed in the chain 'soil-water complex - food stuffs - organism', have been determined. Special rehabilitation measures and their application in the network of therapeutic sanatorium centers for radiation protection are validated. The expediency and necessity of medico-hygienic protection of children were established.

  10. Hierarchical Model for the Similarity Measurement of a Complex Holed-Region Entity Scene

    Directory of Open Access Journals (Sweden)

    Zhanlong Chen

    2017-11-01

    Full Text Available Complex multi-holed-region entity scenes (i.e., sets of random regions with holes) are common in spatial database systems, spatial query languages, and Geographic Information Systems (GIS). A multi-holed region (a region with an arbitrary number of holes) is an abstraction of the real world that primarily represents geographic objects that have more than one interior boundary, such as areas that contain several lakes or lakes that contain islands. When the similarity of two complex holed-region entity scenes is measured, the number of regions in the scenes and the number of holes in the regions usually differ between the two scenes, which complicates the matching relationships of holed regions and holes. The aim of this research is to develop several holed-region similarity metrics and to propose a hierarchical model that comprehensively measures the similarity between two complex holed-region entity scenes. The procedure first divides a complex entity scene into three layers: a complex scene, a micro-spatial-scene, and a simple entity (hole). The relationships between adjacent layers are treated as sets of relationships, and each level of similarity measurement is nested with the adjacent one. Next, entity matching is performed from top to bottom, while the similarity results are calculated from local to global. In addition, we utilize position graphs to describe the distribution of the holed regions and subsequently describe the directions between the holes using a feature matrix. A case study that uses the Great Lakes in North America in 1986 and 2015 as experimental data illustrates the entire similarity measurement process between two complex holed-region entity scenes. The experimental results show that the hierarchical model accounts for the relationships of the different layers in the entire complex holed-region entity scene. The model can effectively calculate the similarity of complex holed-region entity scenes, even if the

  11. Low Complexity Track Initialization from a Small Set of Non-Invertible Measurements

    Directory of Open Access Journals (Sweden)

    Wolfgang Koch

    2008-02-01

    Full Text Available Target tracking from non-invertible measurement sets, for example, incomplete spherical coordinates measured by asynchronous sensors in a sensor network, is a data fusion task present in many applications. Difficulties in tracking with extended Kalman filters lead to unstable behavior, mainly caused by poor initialization. Instead of using high-complexity numerical batch estimators, we offer an analytical approach to initialize the filter from a minimum number of observations. This directly pertains to multi-hypothesis tracking (MHT), where in the presence of clutter and/or multiple targets (i) low-complexity algorithms are desirable and (ii) using a small set of measurements avoids combinatorial explosion. Our approach uses no numerical optimization, simply evaluating several equations to find the state estimates. This is possible because we avoid an over-determined setup by initializing only from the minimum necessary subset of measurements. Loss in accuracy is minimized by choosing the best subset using an optimality criterion and incorporating the leftover measurements afterwards. Additionally, we provide the possibility to estimate only subsets of parameters, and to reliably model the resulting added uncertainties by the covariance matrix. We compare two different implementations, differing in the approximation of the posterior: linearizing the measurement equation as in the extended Kalman filter (EKF), or employing the unscented transform (UT). The approach is studied in two practical examples: 3D track initialization using bearings-only measurements, or using slant-range and azimuth only.

  12. A Novel Method for Assessing Task Complexity in Outpatient Clinical-Performance Measures.

    Science.gov (United States)

    Hysong, Sylvia J; Amspoker, Amber B; Petersen, Laura A

    2016-04-01

    Clinical-performance measurement has helped improve the quality of health care; yet success in attaining high levels of quality across multiple domains simultaneously still varies considerably. Although many sources of variability in care quality have been studied, the difficulty required to complete the clinical work itself has received little attention. We present a task-based methodology for evaluating the difficulty of clinical-performance measures (CPMs) by assessing the complexity of their component requisite tasks. Using Functional Job Analysis (FJA), subject-matter experts (SMEs) generated task lists for 17 CPMs; task lists were rated on ten dimensions of complexity and then aggregated into difficulty composites. Eleven outpatient work SMEs; 133 VA Medical Centers nationwide. Clinical performance: 17 outpatient CPMs (2000-2008) at 133 VA Medical Centers nationwide. Measure difficulty: for each CPM, the number of component requisite tasks and the average rating across ten FJA complexity scales for the set of tasks comprising the measure. Measures varied considerably in the number of component tasks (M = 10.56, SD = 6.25, min = 5, max = 25). Measures of chronic care following acute myocardial infarction exhibited significantly higher measure-difficulty ratings than diabetes or screening measures, but not immunization measures ([Formula: see text] = 0.45, -0.04, -0.05, and -0.06 respectively; F (3, 186) = 3.57, p = 0.015). Measure-difficulty ratings were not significantly correlated with the number of component tasks (r = -0.30, p = 0.23). Evaluating the difficulty of achieving recommended CPM performance levels requires more than simply counting the tasks involved; using FJA to assess the complexity of CPMs' component tasks presents an alternate means of assessing the difficulty of primary-care CPMs and accounting for performance variation among measures and performers. This in turn could be used in designing

  13. Methods and instrumental techniques for the study of acidic water systems; Metodologias y tecnicas instrumentales para el estudio de sistemas de aguas acidas

    Energy Technology Data Exchange (ETDEWEB)

    Acero Salazar, P.; Asta Andres, M. P.; Torrento Aguerri, C.; Gimeno Serrano, M. J.; Auque Sanz, L. F.; Gomez Jimenez, J. B.

    2011-07-01

    From a geochemical point of view acidic waters are very complex systems in which many interaction processes take place between surface and ground waters, gases (particularly atmospheric oxygen), acid-generating minerals, solid phases responsible for the natural attenuation of elements in solution and also many types of biological activity. Owing to this high complexity, the quality and reliability of any geochemical study focusing on this type of system will depend largely upon the use of appropriate methods of sampling, preservation and analysis of waters, minerals, gases and biological samples. We describe here the main methods and techniques used in geochemical studies of acid waters associated with sulphide mineral environments, taking into account not only the various sample types but also the features of the main types of system (open pits, tailings ponds, acid streams etc.). We also explain the main applications and limitations of each method or technique and provide references to earlier technical and scientific studies in which further information can be obtained. (Author) 97 refs.

  14. Fractal based complexity measure and variation in force during sustained isometric muscle contraction: effect of aging.

    Science.gov (United States)

    Arjunan, Sridhar P; Kumar, Dinesh K; Bastos, Teodiano

    2012-01-01

    This study investigated the effect of age on a fractal-based complexity measure of muscle activity and on the variance in force of isometric muscle contraction. Surface electromyogram (sEMG) and force of muscle contraction were recorded from 40 healthy subjects categorized into Group 1 (young, age range 20-30; 10 males and 10 females) and Group 2 (old, age range 55-70; 10 males and 10 females) during isometric exercise at maximum voluntary contraction (MVC). The results show that there is a reduction in the complexity of the sEMG associated with aging. They also demonstrate an increase in the coefficient of variation (CoV) of the force of muscle contraction and a decrease in sEMG complexity for the old age group compared with the young age group.
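
    The force-variability side of such analyses reduces to the coefficient of variation; a minimal sketch with made-up force samples (not the study's data):

```python
import statistics

def coefficient_of_variation(force):
    """CoV (%): sample standard deviation of the force signal relative to its mean."""
    return 100.0 * statistics.stdev(force) / statistics.mean(force)

# Hypothetical force readings (N) during a sustained isometric contraction:
# larger fluctuations around the same mean force give a higher CoV.
steady = coefficient_of_variation([100.0, 101.0, 99.0, 100.0])
variable = coefficient_of_variation([100.0, 110.0, 90.0, 100.0])
```

    The fractal complexity of the sEMG itself requires a dedicated estimator (e.g. a fractal dimension algorithm) and is not shown here.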

  15. BETWEEN PARSIMONY AND COMPLEXITY: COMPARING PERFORMANCE MEASURES FOR ROMANIAN BANKING INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    ANCA MUNTEANU

    2012-01-01

    Full Text Available The main objective of this study is to establish the relationship between traditional measures of performance (ROE, ROA and NIM) and EVA, in order to gain some insight into the relevance of using more sophisticated performance measurement tools. Towards this end the study uses two acknowledged statistical measures: Kendall's tau and Spearman's rank correlation coefficient. Using data from 12 Romanian banking institutions that reported under IFRS for the period 2006-2010, the results suggest that EVA is generally highly correlated with residual income in years with positive operational profits, whereas for years with a negative outcome the correlation is low. ROA and ROE are the measures that best correlate with EVA over the entire period and thus (applying Occam's razor) could be used as substitutes for more complex shareholder earnings measures.
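
    Kendall's tau, one of the two rank statistics the study relies on, counts concordant versus discordant pairs. A minimal tau-a sketch with made-up bank figures (not the study's data; a statistics package would also handle ties and p-values):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant pairs) / total pairs."""
    pairs = list(combinations(range(len(x)), 2))
    score = 0
    for i, j in pairs:
        prod = (x[i] - x[j]) * (y[i] - y[j])
        score += 1 if prod > 0 else -1 if prod < 0 else 0
    return score / len(pairs)

# Illustrative (made-up) figures for five banks: EVA and ROA happen to rank
# the banks identically here, so tau = 1.0.
eva = [1.2, 0.8, -0.3, 2.1, 0.5]
roa = [0.011, 0.009, 0.002, 0.015, 0.007]
tau = kendall_tau(eva, roa)
```

    A high tau between EVA and a simple ratio such as ROA is exactly the kind of evidence the study uses to argue that the simpler measure can substitute for the more complex one.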

  16. Multi-attribute integrated measurement of node importance in complex networks.

    Science.gov (United States)

    Wang, Shibo; Zhao, Jinlou

    2015-11-01

    Measuring node importance in complex networks is very important for research on network stability and robustness, and it can also help ensure the security of the whole network. Most researchers have used a single indicator to measure node importance, so the results reflect only certain aspects of the network, with a loss of information. Meanwhile, because network topologies differ, node importance should be described in combination with the topological character of the network. Most existing evaluation algorithms cannot completely reflect the circumstances of complex networks, so this paper takes into account degree centrality, relative closeness centrality, the clustering coefficient, and topology potential, and proposes an integrated method for measuring node importance. This method reflects both the internal and external attributes of nodes and eliminates the influence of network structure on node importance. Experiments on the karate network and the dolphin network show that the integrated topological measure yields a narrower range of results than any single indicator and is more universal. Experiments also show that attacking the North American power grid and the Internet network using this method converges faster than other methods.
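
    The general idea of combining several indicators into one score can be sketched as an equal-weight sum of per-node metrics. This is a simplified illustration (topology potential is omitted, the weights and the star graph are assumptions, and the paper's exact normalization may differ):

```python
from collections import deque

def degree_centrality(adj):
    """Degree of each node divided by the maximum possible degree."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """Closeness via breadth-first search; assumes a connected graph."""
    result = {}
    for s in adj:
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total = sum(dist.values())
        result[s] = (len(dist) - 1) / total if total else 0.0
    return result

def clustering_coefficient(adj):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    result = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        result[v] = 2 * links / (k * (k - 1)) if k > 1 else 0.0
    return result

def integrated_importance(adj, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Equal-weight combination of three indicators (illustrative only)."""
    metrics = (degree_centrality(adj), closeness_centrality(adj),
               clustering_coefficient(adj))
    return {v: sum(w * m[v] for w, m in zip(weights, metrics)) for v in adj}

# Illustrative star graph: the hub (node 0) outranks every leaf.
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
scores = integrated_importance(star)
```

    Ranking nodes by such a combined score, rather than by any one indicator, is what lets the integrated measure capture several aspects of the topology at once.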

  17. An Assessment of Wind Plant Complex Flows Using Advanced Doppler Radar Measurements

    Science.gov (United States)

    Gunter, W. S.; Schroeder, J.; Hirth, B.; Duncan, J.; Guynes, J.

    2015-12-01

    As installed wind energy capacity continues to steadily increase, the need for comprehensive measurements of wind plant complex flows to further reduce the cost of wind energy has been well advertised by the industry as a whole. Such measurements serve diverse perspectives including resource assessment, turbine inflow and power curve validation, wake and wind plant layout model verification, operations and maintenance, and the development of future advanced wind plant control schemes. While various measurement devices have been matured for wind energy applications (e.g. meteorological towers, LIDAR, SODAR), this presentation will focus on the use of advanced Doppler radar systems to observe the complex wind flows within and surrounding wind plants. Advanced Doppler radars can provide the combined advantage of a large analysis footprint (tens of square kilometers) with rapid data analysis updates (a few seconds to one minute) using both single- and dual-Doppler data collection methods. This presentation demonstrates the utility of measurements collected by the Texas Tech University Ka-band (TTUKa) radars to identify complex wind flows occurring within and nearby operational wind plants, and provide reliable forecasts of wind speeds and directions at given locations (i.e. turbine or instrumented tower sites) 45+ seconds in advance. Radar-derived wind maps reveal commonly observed features such as turbine wakes and turbine-to-turbine interaction, high momentum wind speed channels between turbine wakes, turbine array edge effects, transient boundary layer flow structures (such as wind streaks, frontal boundaries, etc.), and the impact of local terrain. Operational turbine or instrumented tower data are merged with the radar analysis to link the observed complex flow features to turbine and wind plant performance.

  18. Randomness Representation of Turbulence in Canopy Flows Using Kolmogorov Complexity Measures

    Directory of Open Access Journals (Sweden)

    Dragutin Mihailović

    2017-09-01

    Full Text Available Turbulence is often described in terms of irregular or random fluid flows, without quantification. In this paper, a methodology to evaluate the randomness of turbulence using measures based on Kolmogorov complexity (KC) is proposed. The methodology is applied to experimental data from a turbulent flow developing in a laboratory channel with canopies of three different densities, and it is also compared with the traditional approach based on classical turbulence statistics.
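
    Kolmogorov complexity is uncomputable, so KC-based measures are typically approximated by a Lempel-Ziv phrase count of the binarized signal. The sketch below uses a simple LZ78-style parsing as a stand-in; the paper's exact estimator may differ:

```python
def lz_complexity(bits):
    """LZ78-style phrase count of a binary string: each phrase is the
    shortest prefix not seen before. A practical proxy for Kolmogorov
    complexity -- more phrases means a less compressible, more random signal."""
    phrases, current = set(), ""
    for ch in bits:
        current += ch
        if current not in phrases:
            phrases.add(current)
            current = ""
    return len(phrases) + (1 if current else 0)

def binarize(series):
    """Map a numeric series to a 0/1 string by thresholding at the median."""
    med = sorted(series)[len(series) // 2]
    return "".join("1" if v > med else "0" for v in series)

# A strictly periodic signal parses into few phrases; irregular data needs more.
periodic = lz_complexity("01" * 8)             # 7 phrases
irregular = lz_complexity("0110100110010110")  # 8 phrases
```

    Applied to binarized velocity time series, a normalized version of this count distinguishes flows of different canopy densities by how random their fluctuations are.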

  19. RNACompress: Grammar-based compression and informational complexity measurement of RNA secondary structure

    Directory of Open Access Journals (Sweden)

    Chen Chun

    2008-03-01

    Full Text Available Abstract Background With the rapid emergence of RNA databases and newly identified non-coding RNAs, an efficient compression algorithm for RNA sequence and structural information is needed for the storage and analysis of such data. Although several algorithms for compressing DNA sequences have been proposed, none of them is suitable for compressing RNA sequences together with their secondary structures. This kind of compression not only facilitates the maintenance of RNA data, but also supplies a novel way to measure the informational complexity of RNA structural data, raising the possibility of studying the relationship between the functional activities of RNA structures and their complexities, as well as various structural properties of RNA based on compression. Results RNACompress employs an efficient grammar-based model to compress RNA sequences and their secondary structures. The main goals of this algorithm are twofold: (1) to present a robust and effective way to compress RNA structural data; (2) to design a suitable model to represent RNA secondary structure and to derive the informational complexity of the structural data based on compression. Our extensive tests have shown that RNACompress achieves a universally better compression ratio than other sequence-specific or common text-specific compression algorithms, such as GenCompress, WinRAR and gzip. Moreover, a test of the activities of distinct GTP-binding RNAs (aptamers against their structural complexity shows that our defined informational complexity can be used to describe how complexity varies with activity. These results lead to an objective means of comparing the functional properties of heteropolymers from the information perspective. Conclusion A universal algorithm for the compression of RNA secondary structure as well as the evaluation of its informational complexity is discussed in this paper. We have developed RNACompress, as a useful tool
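
    The compression-based notion of informational complexity can be approximated with any general-purpose compressor; in the sketch below, zlib stands in for RNACompress's grammar-based model and the sequences are made up:

```python
import random
import zlib

def compression_complexity(text):
    """Compressed size / original size: a crude informational-complexity proxy.
    Lower values mean more regular, more compressible data."""
    raw = text.encode()
    return len(zlib.compress(raw, 9)) / len(raw)

# A repetitive (hypothetical) sequence compresses far better than a random one
# over the same alphabet, so its estimated complexity is much lower.
regular = compression_complexity("AUGC" * 250)
rng = random.Random(42)
irregular = compression_complexity("".join(rng.choice("AUGC") for _ in range(1000)))
```

    Comparing such ratios across molecules is the information-theoretic comparison the paper applies to aptamer activity, with the important difference that RNACompress compresses sequence and secondary structure jointly.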

  20. Three-dimensional quantification of structures in trabecular bone using measures of complexity

    DEFF Research Database (Denmark)

    Marwan, Norbert; Kurths, Jürgen; Thomsen, Jesper Skovhus

    2009-01-01

    The study of pathological changes of bone is an important task in diagnostic procedures of patients with metabolic bone diseases such as osteoporosis as well as in monitoring the health state of astronauts during long-term space flights. The recent availability of high-resolution three-dimensional (3D) imaging of bone challenges the development of data analysis techniques able to assess changes of the 3D microarchitecture of trabecular bone. We introduce an approach based on spatial geometrical properties and define structural measures of complexity for 3D image analysis. These measures evaluate different aspects of organization and complexity of 3D structures, such as complexity of its surface or shape variability. We apply these measures to 3D data acquired by high-resolution microcomputed tomography (µCT) from human proximal tibiae and lumbar vertebrae at different stages...

  1. Instrumentation measurement and testing complex for detection and identification of radioactive materials using the emitted radiation

    International Nuclear Information System (INIS)

    Samossadny, V.T.; Dmitrenko, V.V.; Kadlin, V.V.; Kolesnikov, S.V.; Ulin, S.E.; Grachev, V.M.; Vlasik, K.F.; Dedenko, G.L.; Novikov, D.V.; Uteshev, Z.M.

    2006-01-01

    Simultaneous measurement of neutron and gamma radiation is a very useful method for effective identification and control of nuclear materials. The gamma-ray/neutron complex described in the paper is based on two multi-layer ³He neutron detectors and two High Pressure Xenon gamma-ray spectrometers assembled in one unit. All these detectors were calibrated with neutron and gamma-ray sources. The main characteristics of the instrumentation, its testing results, and the measured gamma-ray and neutron radiation parameters are presented in the paper. Reliable detection and identification of gamma-neutron sources and fissile materials was demonstrated

  2. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    Science.gov (United States)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors necessitates efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods (isarithm, variogram, and triangular prism) along with the spatial autocorrelation measurement methods Moran's I and Geary's C, all of which have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of surfaces with higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Similar to the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
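    As an illustration of the spatial autocorrelation measures mentioned above, here is a minimal sketch of Moran's I for a gridded surface using rook (4-neighbour) contiguity with equal weights. The toy arrays are assumptions for demonstration, not ICAMS data:

    ```python
    import numpy as np

    def morans_i(grid: np.ndarray) -> float:
        """Moran's I for a 2-D array with rook (4-neighbour) contiguity,
        I = (N/W) * sum_ij w_ij (x_i - mean)(x_j - mean) / sum_i (x_i - mean)^2."""
        x = grid - grid.mean()
        # Each vertical and horizontal neighbour pair counted in both directions.
        num = 2.0 * ((x[:-1, :] * x[1:, :]).sum() + (x[:, :-1] * x[:, 1:]).sum())
        rows, cols = grid.shape
        w = 2.0 * ((rows - 1) * cols + rows * (cols - 1))  # number of directed links
        return (x.size / w) * num / (x ** 2).sum()

    checker = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)  # alternating pattern
    gradient = np.arange(64, dtype=float).reshape(8, 8)           # smooth ramp

    i_checker = morans_i(checker)
    i_gradient = morans_i(gradient)
    ```

    A checkerboard gives I = -1 (maximal negative autocorrelation), while a smooth ramp gives a strongly positive I, which matches the abstract's point that such measures distinguish complex from low-complexity surfaces.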

  3. On the role of complex phases in the quantum statistics of weak measurements

    International Nuclear Information System (INIS)

    Hofmann, Holger F

    2011-01-01

    Weak measurements carried out between quantum state preparation and post-selection result in complex values for self-adjoint operators, corresponding to complex conditional probabilities for the projections on specific eigenstates. In this paper it is shown that the complex phases of these weak conditional probabilities describe the dynamic response of the system to unitary transformations. Quantum mechanics thus unifies the statistical overlap of different states with the dynamical structure of transformations between these states. Specifically, it is possible to identify the phase of weak conditional probabilities directly with the action of a unitary transform that maximizes the overlap of initial and final states. This action provides a quantitative measure of how much quantum correlations can diverge from the deterministic relations between physical properties expected from classical physics or hidden variable theories. In terms of quantum information, the phases of weak conditional probabilities thus represent the logical tension between sets of three quantum states that is at the heart of quantum paradoxes. (paper)
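    The complex weak conditional probabilities discussed above can be made concrete for a single qubit. In the following sketch the states are chosen purely for illustration: preparing |0⟩, post-selecting (|0⟩+|1⟩)/√2, and asking for the weak probabilities of the two σy eigenstate projections yields complex values whose sum is still 1.

    ```python
    import numpy as np

    def weak_conditional_probability(i, a, f):
        """Complex weak conditional probability <f|a><a|i> / <f|i> for
        normalised state vectors: i (preparation), a (eigenstate of the
        weakly measured projector), f (post-selection)."""
        return (np.vdot(f, a) * np.vdot(a, i)) / np.vdot(f, i)

    ket0 = np.array([1, 0], dtype=complex)                  # preparation |0>
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)     # post-selection |+>
    y_plus = np.array([1, 1j], dtype=complex) / np.sqrt(2)  # sigma_y eigenstates
    y_minus = np.array([1, -1j], dtype=complex) / np.sqrt(2)

    p_plus = weak_conditional_probability(ket0, y_plus, plus)    # (1 + i)/2
    p_minus = weak_conditional_probability(ket0, y_minus, plus)  # (1 - i)/2
    ```

    The non-zero phases of p_plus and p_minus are exactly the kind of complex conditional probabilities the paper interprets as the dynamic response to unitary transformations.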

  4. Application of a Dual-Arm Robot in Complex Sample Preparation and Measurement Processes.

    Science.gov (United States)

    Fleischer, Heidi; Drews, Robert Ralf; Janson, Jessica; Chinna Patlolla, Bharath Reddy; Chu, Xianghua; Klos, Michael; Thurow, Kerstin

    2016-10-01

    Automation systems with applied robotics have already been established in industrial applications for many years. In the field of life sciences, a comparable high level of automation can be found in the areas of bioscreening and high-throughput screening. Strong deficits still exist in the development of flexible and universal fully automated systems in the field of analytical measurement. Reasons are the heterogeneous processes with complex structures, which include sample preparation and transport, analytical measurements using complex sensor systems, and suitable data analysis and evaluation. Furthermore, the use of nonstandard sample vessels with various shapes and volumes results in an increased complexity. The direct use of existing automation solutions from bioscreening applications is not possible. A flexible automation system for sample preparation, analysis, and data evaluation is presented in this article. It is applied for the determination of cholesterol in biliary endoprosthesis using gas chromatography-mass spectrometry (GC-MS). A dual-arm robot performs both transport and active manipulation tasks to ensure human-like operation. This general robotic concept also enables the use of manual laboratory devices and equipment and is thus suitable in areas with a high standardization grade. © 2016 Society for Laboratory Automation and Screening.

  5. Information and complexity measures in the interface of a metal and a superconductor

    Science.gov (United States)

    Moustakidis, Ch. C.; Panos, C. P.

    2018-06-01

    Fisher information, Shannon information entropy and statistical complexity are calculated for the interface of a normal metal and a superconductor, as a function of temperature for several materials. The order parameter Ψ(r) derived from the Ginzburg-Landau theory is used as an input, together with experimental values of the critical transition temperature Tc and the superconducting coherence length ξ0. Analytical expressions are obtained for the information and complexity measures. Thus Tc is directly related, in a simple way, to disorder and complexity. An analytical relation is found between the Fisher information and the energy profile of superconductivity, i.e. the ratio of the surface free energy to the bulk free energy. We verify that a simple relation holds between Shannon and Fisher information, i.e. a decomposition of a global information quantity (Shannon) in terms of two local ones (Fisher information), previously derived and verified for atoms and molecules by Liu et al. Finally, we find analytical expressions for generalized information measures like the Tsallis entropy and Fisher information. We conclude that the proper value of the non-extensivity parameter is q ≃ 1, in agreement with previous work using a different model, which found q ≃ 1.005.
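    The information measures used above can be sketched numerically for a one-dimensional Ginzburg-Landau order-parameter profile at a metal-superconductor wall, ψ(x) = tanh(x/√2ξ). This is an illustrative discretized calculation under assumed units and 1-D geometry, not the paper's analytical treatment: |ψ|², normalised over the interval, plays the role of the density entering the Shannon entropy S = −∫ρ ln ρ dx and the Fisher information I = ∫(ρ′)²/ρ dx.

    ```python
    import numpy as np

    xi = 1.0                                   # coherence length (arbitrary units)
    x = np.linspace(1e-4, 20.0, 200001)        # interval on the superconducting side
    dx = x[1] - x[0]
    psi = np.tanh(x / (np.sqrt(2.0) * xi))     # GL order-parameter profile at the wall
    rho = psi ** 2
    rho /= rho.sum() * dx                      # normalise to a probability density

    shannon = -(rho * np.log(rho)).sum() * dx          # global (Shannon) measure
    fisher = ((np.gradient(rho, dx) ** 2) / rho).sum() * dx  # local (Fisher) measure
    ```

    The Fisher integrand is concentrated near the wall where the density varies, which is the numerical counterpart of its analytical link to the surface free energy.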

  6. About my Child: measuring 'Complexity' in neurodisability. Evidence of reliability and validity.

    Science.gov (United States)

    Ritzema, A M; Lach, L M; Rosenbaum, P; Nicholas, D

    2016-05-01

    About my Child, 26-item version (AMC-26), was developed as a measure of child health 'complexity' and has been proposed as a tool for understanding the functional needs of children and the priorities of families. The current study investigated the reliability and validity of AMC-26 with a sample of caregivers of children with neurodevelopmental disorders (NDD; n = 258) who completed AMC-26 as part of a larger study on parenting children with NDD. A subsample of children from the larger study (n = 49) was assessed using standardized measures of cognitive and adaptive functioning. Factor analysis revealed that a four-component model explained 51.12% of the variance. Cronbach's alpha was calculated for each of the four factors and for the scale as a whole, and ranged from 0.75 to 0.85, suggesting a high level of internal consistency. Construct validity was tested through comparisons with the results of standardized measures of child functioning. Predicted relationships for factors one, two and three were statistically significant and in the expected directions. Predictions for factor four were partially supported. AMC-26 was also expected to serve as an indicator of caregiver distress. Drawing on a sample of caregivers from the larger study (n = 251), the model was found to be significant and explained 23% of the variance in caregiver depressive symptoms (R² = .053, F(1, 249) = 14.06, P < .001). AMC-26 shows promise as a measure of child function and child health complexity. Such a measure may help elucidate the relationships between child complexity and family well-being. This is an important avenue for further investigation. © 2016 John Wiley & Sons Ltd.

  7. Using Complexity Metrics With R-R Intervals and BPM Heart Rate Measures

    DEFF Research Database (Denmark)

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian

    2013-01-01

    Lately, growing attention in the health sciences has been paid to the dynamics of heart rate as indicator of impending failures and for prognoses. Likewise, in social and cognitive sciences, heart rate is increasingly employed as a measure of arousal, emotional engagement and as a marker of interpersonal coordination. However, there is no consensus about which measurements and analytical tools are most appropriate in mapping the temporal dynamics of heart rate, and quite different metrics are reported in the literature. As complexity metrics of heart rate variability depend critically...
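    One complexity metric commonly applied to R-R interval series is sample entropy (SampEn). A minimal sketch with synthetic series (not real heart-rate data): a periodic series should score lower than an irregular one.

    ```python
    import numpy as np

    def sample_entropy(series, m=2, r_frac=0.2):
        """Sample entropy -ln(A/B): B counts pairs of matching templates of
        length m, A of length m+1, within tolerance r = r_frac * std.
        Self-matches are excluded."""
        x = np.asarray(series, dtype=float)
        r = r_frac * x.std()
        n = len(x)

        def matches(length):
            t = np.array([x[i:i + length] for i in range(n - length + 1)])
            total = 0
            for i in range(len(t) - 1):
                # Chebyshev distance of template i to all later templates.
                d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
                total += int((d <= r).sum())
            return total

        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(0)
    regular = np.sin(np.linspace(0, 20 * np.pi, 400))  # periodic toy "RR" series
    noisy = rng.normal(0.8, 0.05, 400)                 # irregular RR intervals (s)

    sampen_regular = sample_entropy(regular)
    sampen_noisy = sample_entropy(noisy)
    ```

    Because the tolerance is tied to the series' own standard deviation, the metric is scale-invariant, which matters when comparing R-R series (in ms) with BPM series.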

  8. Comparison of Different Measurement Techniques and a CFD Simulation in Complex Terrain

    International Nuclear Information System (INIS)

    Schulz, Christoph; Lutz, Thorsten; Hofsäß, Martin; Anger, Jan; Wen Cheng, Po; Rautenberg, Alexander; Bange, Jens

    2016-01-01

    This paper deals with a comparison of data collected by measurements and a simulation for a complex terrain test site in southern Germany. Lidar, met mast, and unmanned aerial vehicle (UAV) measurements of wind speed and direction, as well as Computational Fluid Dynamics (CFD) data, are compared to each other. The site is characterised regarding its flow features and its suitability as a wind turbine test field. A Delayed Detached-Eddy Simulation (DES) was employed, using measurement data to generate generic turbulent inflow. Good agreement of the wind profiles between the different approaches was reached. The terrain slope leads to a speed-up, a change of turbulence intensity, as well as flow angle variations. (paper)

  9. Measuring working memory in aphasia: Comparing performance on complex span and N-back tasks

    Directory of Open Access Journals (Sweden)

    Maria Ivanova

    2014-04-01

    No significant correlations were observed between performance on the complex span tasks and the N-back tasks. Furthermore, performance on the modified listening span task was related to performance on the comprehension subtest of the QASA, while no relationship was found for the 2-back and 0-back tasks. Our results mirror studies in healthy controls that demonstrated no relationship between performance on the two tasks (Jaeggi et al., 2010; Kane et al., 2007). Thus, although N-back tasks seem similar to traditional complex span measures and may also index abilities related to cognitive processing, the evidence to date does not warrant their direct association with the construct of WM. Implications for future investigation of cognitive deficits in aphasia will be discussed.

  10. Positron life time and annihilation Doppler broadening measurements on transition metal complexes

    International Nuclear Information System (INIS)

    Levay, B.; Burger, K.

    1982-01-01

    Positron lifetime and annihilation Doppler broadening measurements have been carried out on 44 solid coordination compounds. Several correlations have been found between the annihilation lifetime (τ1) and line-shape parameters (L) and the chemical structure of the compounds. Halide ligands were the most active towards positrons. This fact supports the assumption of the possible formation of an [e⁺X⁻] positron-halide bound state. The lifetime decreased and the annihilation energy spectra broadened with the increasing negative character of the halides. The aromatic base ligands affected the positron-halide interaction according to their basicity and space requirements, and thus they indirectly affected the annihilation parameters, too. In the planar and tetrahedral complexes the electron density on the central metal ion directly affected the annihilation parameters, while in the octahedral mixed complexes it had only an indirect effect through the polarization of the halide ligands. (author)

  11. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate

    Science.gov (United States)

    Pradines, Joël R.; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-04-01

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty about mixture composition is reduced by combining several measurements remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify the remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture potentially contains over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimates of sulfate repeats and of the sulfate level at each position in the chains, but also bounds on these levels, thereby estimating the extent to which the set of measurements characterizes the sulfation pattern.
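    The idea of bounding position-wise levels from a few aggregate measurements can be reduced to a toy interval calculation. A hedged sketch with hypothetical numbers (a 4-position chain, one measured mean sulfation level, and one position pinned by a position-specific assay); the actual work solves constrained optimization problems over many measurements:

    ```python
    def position_bounds(chain_len, mean_level, known=None):
        """Interval bounds on the sulfation fraction at each chain position,
        given the measured mean sulfation level and any positions pinned by
        position-specific measurements. Each fraction lies in [0, 1]."""
        known = known or {}
        total = mean_level * chain_len            # sum of all position fractions
        rest = total - sum(known.values())        # budget left for free positions
        free = [i for i in range(chain_len) if i not in known]
        out = {}
        for i in range(chain_len):
            if i in known:
                out[i] = (known[i], known[i])
            else:
                others = len(free) - 1
                # Tightest interval consistent with the other free positions
                # each being somewhere in [0, 1].
                out[i] = (max(0.0, rest - others), min(1.0, rest))
        return out

    # Hypothetical inputs: mean sulfation 0.3 over 4 positions, position 0 = 0.8.
    b = position_bounds(4, 0.3, known={0: 0.8})
    ```

    Even two aggregate numbers narrow position 3 from [0, 1] to roughly [0, 0.4]; the width of such intervals is one way to quantify a "characterization extent".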

  12. Operational Complexity of Supplier-Customer Systems Measured by Entropy—Case Studies

    Directory of Open Access Journals (Sweden)

    Ladislav Lukáš

    2016-04-01

    This paper discusses a unified entropy-based approach for the quantitative measurement of the operational complexity of company supplier-customer relations. Classical Shannon entropy is utilized. Besides this quantification tool, we also explore the relations between Shannon entropy and (c,d)-entropy in more detail. An analytic description of so-called iso-quant curves is given, too. We present five case studies, albeit in an anonymized setting, describing various details of general procedures for measuring the operational complexity of supplier-customer systems. In general, we assume a problem-oriented database exists, which contains detailed records of all product forecasts, orders and deliveries, both in quantity and time, scheduled and realized. Data processing detects important flow variations in both volumes and times, e.g., order vs. forecast, delivery vs. order, and actual vs. scheduled production. The unifying quantity used for entropy computation is the time gap between actual delivery time and order issue time, which is nothing else but the lead time in inventory control models. After data consistency checks, histograms and empirical distribution functions are constructed. Finally, the entropy, an information-theoretic measure of supplier-customer operational complexity, is calculated. Basic steps of the algorithm are mentioned briefly, too. Results of supplier-customer system analysis from selected Czech small and medium-sized enterprises (SMEs) are presented in various computational and managerial decision-making details. An enterprise is classified as an SME if it has at most 250 employees and its turnover does not exceed 50 million USD per year, or, alternatively, its balance sheet total does not exceed 43 million USD per year.
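    The core computation, the Shannon entropy of the empirical lead-time distribution, can be sketched as follows. The order and delivery day numbers below are invented for illustration, not taken from the case studies:

    ```python
    import math
    from collections import Counter

    def lead_time_entropy(order_days, delivery_days):
        """Shannon entropy (bits) of the empirical lead-time distribution,
        where lead time = delivery day - order issue day."""
        gaps = [d - o for o, d in zip(order_days, delivery_days)]
        counts = Counter(gaps)
        n = len(gaps)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Illustrative day numbers for ten orders and their deliveries.
    orders = [1, 2, 5, 7, 8, 11, 13, 14, 17, 20]
    deliveries = [4, 5, 8, 12, 11, 14, 18, 17, 20, 25]
    h = lead_time_entropy(orders, deliveries)
    ```

    A supplier that always delivers after the same number of days yields zero entropy; the more the lead times scatter, the higher the operational complexity score.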

  13. Decoupling Hydrogen and Oxygen Production in Acidic Water Electrolysis Using a Polytriphenylamine-Based Battery Electrode.

    Science.gov (United States)

    Ma, Yuanyuan; Dong, Xiaoli; Wang, Yonggang; Xia, Yongyao

    2018-03-05

    Hydrogen production through water splitting is considered a promising approach for solar energy harvesting. However, the variable and intermittent nature of solar energy and the co-production of H₂ and O₂ significantly reduce the flexibility of this approach, increasing the costs of its use in practical applications. Herein, using the reversible n-type doping/de-doping reaction of the solid-state polytriphenylamine-based battery electrode, we decouple the H₂ and O₂ production in acid water electrolysis. In this architecture, the H₂ and O₂ production occur at different times, which eliminates the issue of gas mixing and adapts to the variable and intermittent nature of solar energy, facilitating the conversion of solar energy to hydrogen (STH). Furthermore, for the first time, we demonstrate membrane-free solar water splitting through commercial photovoltaics and the decoupled acid water electrolysis, which potentially paves the way for a new approach to solar water splitting. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Complex permittivity measurement at millimetre-wave frequencies during the fermentation process of Japanese sake

    International Nuclear Information System (INIS)

    Kouzai, Masaki; Nishikata, Atsuhiro; Fukunaga, Kaori; Miyaoka, Shunsuke

    2007-01-01

    Various chemical reactions occur simultaneously in barrels during the fermentation processes of alcoholic beverages. Chemical analyses are employed to monitor the change in chemical components, such as glucose and ethyl alcohol. The tests are carried out with extracted specimens, are costly and require time. We have developed a permittivity measurement system for liquid specimens in the frequency range from 2.6 to 50 GHz, and applied the system to fermentation monitoring. Experimental results proved that the observed change in complex permittivity suggests a decrease in the amount of glucose and an increase in alcohol content, which are the key chemical components during the fermentation process

  15. Low-cost sensor integrators for measuring the transmissivity of complex canopies to photosynthetically active radiation

    International Nuclear Information System (INIS)

    Newman, S.M.

    1985-01-01

    A system has been designed, tested and evaluated for measuring the transmissivities of complex canopies to photosynthetically active radiation (PAR). The system consists of filtered silicon photocells in cosine corrected mounts with outputs integrated by the use of chemical coulometers. The reading accumulated by the coulometers was taken electronically by the use of microcomputers. The low-cost sensor integrators, which do not require batteries, performed as expected and proved ideal for the study of agroforestry systems in remote areas. Information on the PAR transmissivity of a temperate agroforestry system in the form of an intercropped orchard is also presented. (author)

  16. Measures of Morphological Complexity of Gray Matter on Magnetic Resonance Imaging for Control Age Grouping

    Directory of Open Access Journals (Sweden)

    Tuan D. Pham

    2015-12-01

    Current brain-age prediction methods using magnetic resonance imaging (MRI) attempt to estimate the physiological brain age via machine learning on chronological brain-age data to perform the classification task. Such a predictive approach carries a greater risk of either over- or under-estimation, mainly due to limited training data. A new conceptual framework for more reliable MRI-based brain-age prediction is systematic brain-age grouping via phylogenetic tree reconstruction and measures of information complexity. Experimental results carried out on a public MRI database suggest the feasibility of the proposed concept.

  17. Direct measurement and modulation of single-molecule coordinative bonding forces in a transition metal complex

    DEFF Research Database (Denmark)

    Hao, Xian; Zhu, Nan; Gschneidtner, Tina

    2013-01-01

    ...remain a daunting challenge. Here we demonstrate an interdisciplinary and systematic approach that enables measurement and modulation of the coordinative bonding forces in a transition metal complex. Terpyridine is derived with a thiol linker, facilitating covalent attachment of this ligand on both gold substrate surfaces and gold-coated atomic force microscopy tips. The coordination and bond breaking between terpyridine and osmium are followed in situ by electrochemically controlled atomic force microscopy at the single-molecule level. The redox state of the central metal atom is found to have...

  18. Using continuous underway isotope measurements to map water residence time in hydrodynamically complex tidal environments

    Science.gov (United States)

    Downing, Bryan D.; Bergamaschi, Brian; Kendall, Carol; Kraus, Tamara; Dennis, Kate J.; Carter, Jeffery A.; von Dessonneck, Travis

    2016-01-01

    Stable isotopes present in water (δ²H, δ¹⁸O) have been used extensively to evaluate hydrological processes on the basis of parameters such as evaporation, precipitation, mixing, and residence time. In estuarine aquatic habitats, residence time (τ) is a major driver of biogeochemical processes, affecting trophic subsidies and conditions in fish-spawning habitats. But τ is highly variable in estuaries, owing to constant changes in river inflows, tides, wind, and water height, all of which combine to affect τ in unpredictable ways. It recently became feasible to measure δ²H and δ¹⁸O continuously, at a high sampling frequency (1 Hz), using diffusion sample introduction into a cavity ring-down spectrometer. To better understand the relationship of τ to biogeochemical processes in a dynamic estuarine system, we continuously measured δ²H and δ¹⁸O, nitrate, and water quality parameters on board a small, high-speed boat (5 to >10 m s⁻¹) fitted with a hull-mounted underwater intake. We then calculated τ as is classically done, using the isotopic signals of evaporation. The result was high-resolution (∼10 m) maps of residence time, nitrate, and other parameters that showed strong spatial gradients corresponding to geomorphic attributes of the different channels in the area. The mean measured value of τ was 30.5 d, with a range of 0–50 d. We used the measured spatial gradients in both τ and nitrate to calculate whole-ecosystem uptake rates, and the values ranged from 0.006 to 0.039 d⁻¹. The capability to measure residence time over single tidal cycles in estuaries will be useful for evaluating and further understanding drivers of phytoplankton abundance, resolving differences attributable to mixing and water sources, explicitly calculating biogeochemical rates, and exploring the complex linkages among time-dependent biogeochemical processes in hydrodynamically complex environments such as estuaries.
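    Combining gradients in τ and nitrate into a whole-ecosystem uptake rate can be sketched under a first-order decay assumption, C₂ = C₁·exp(−k·(τ₂ − τ₁)). The concentrations and residence times below are illustrative values chosen to land inside the reported 0.006-0.039 d⁻¹ range, not the study's data:

    ```python
    import math

    def uptake_rate(c_young, tau_young, c_old, tau_old):
        """First-order whole-ecosystem uptake rate (per day) from paired
        nitrate concentrations and water residence times, assuming
        exponential loss of nitrate with water age."""
        return math.log(c_young / c_old) / (tau_old - tau_young)

    # Hypothetical observations: nitrate in mg N/L, residence time in days.
    k = uptake_rate(0.40, 10.0, 0.30, 25.0)
    ```

    With these numbers k ≈ 0.019 d⁻¹; mapping τ and nitrate continuously lets this calculation be made between any two points along a channel.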

  19. A Thorax Simulator for Complex Dynamic Bioimpedance Measurements With Textile Electrodes.

    Science.gov (United States)

    Ulbrich, Mark; Muhlsteff, Jens; Teichmann, Daniel; Leonhardt, Steffen; Walter, Marian

    2015-06-01

    Bioimpedance measurements on the human thorax are suitable for assessment of body composition or hemodynamic parameters, such as stroke volume; they are non-invasive, easy in application and inexpensive. When targeting personal healthcare scenarios, the technology can be integrated into textiles to increase ease, comfort and coverage of measurements. Bioimpedance is generally measured using two electrodes injecting low alternating currents (0.5-10 mA) and two additional electrodes to measure the corresponding voltage drop. The impedance is measured either spectroscopically (bioimpedance spectroscopy, BIS) between 5 kHz and 1 MHz, or continuously at a fixed frequency around 100 kHz (impedance cardiography, ICG). A thorax simulator is being developed for testing and calibration of bioimpedance devices and other new developments. For the first time, it is possible to mimic the complete time-variant properties of the thorax during an impedance measurement. This includes the dynamic real part and dynamic imaginary part of the impedance, with a peak-to-peak value of 0.2 Ω and an adjustable base impedance (24.6 Ω ≤ Z₀ ≤ 51.6 Ω). Another novelty is adjustable complex electrode-skin contact impedances for up to 8 electrodes, to evaluate bioimpedance devices in combination with textile electrodes. In addition, an electrocardiographic signal is provided for cardiographic measurements, as used in ICG devices. This provides the possibility to generate physiologic impedance changes, and in combination with an ECG, all parameters of interest, such as stroke volume (SV), pre-ejection period (PEP) or extracellular resistance (Re), can be simulated. The speed of all dynamic signals can be altered. The simulator was successfully tested with commercially available BIS and ICG devices, and the preset signals are measured with high correlation (r = 0.996).
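    The frequency-dependent complex impedance that such a simulator must reproduce over the BIS band is commonly described by the Cole model. A hedged sketch: the α and τ values are illustrative assumptions, and only the base-impedance range is loosely borrowed from the abstract; the simulator's actual circuit is not specified here.

    ```python
    import numpy as np

    def cole_impedance(f, r0, rinf, tau, alpha):
        """Cole model of tissue impedance:
        Z(f) = Rinf + (R0 - Rinf) / (1 + (j*2*pi*f*tau)**alpha)."""
        jw = 1j * 2.0 * np.pi * f
        return rinf + (r0 - rinf) / (1.0 + (jw * tau) ** alpha)

    f = np.logspace(np.log10(5e3), 6, 200)  # BIS band: 5 kHz to 1 MHz
    # Illustrative parameters; R0/Rinf taken from the quoted base-impedance range.
    z = cole_impedance(f, r0=51.6, rinf=24.6, tau=1e-6, alpha=0.7)
    ```

    The magnitude falls from near R0 at low frequency towards Rinf at high frequency, with a capacitive (negative) imaginary part throughout, which is the behaviour a BIS device under test expects to see.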

  20. Measuring the complex admittance and tunneling rate of a germanium hut wire hole quantum dot

    Science.gov (United States)

    Li, Yan; Li, Shu-Xiao; Gao, Fei; Li, Hai-Ou; Xu, Gang; Wang, Ke; Liu, He; Cao, Gang; Xiao, Ming; Wang, Ting; Zhang, Jian-Jun; Guo, Guo-Ping

    2018-05-01

    We investigate the microwave reflectometry of an on-chip reflection line cavity coupled to a Ge hut wire hole quantum dot. The amplitude and phase responses of the cavity can be used to measure the complex admittance and evaluate the tunneling rate of the quantum dot, even in the region where transport signal through the quantum dot is too small to be measured by conventional direct transport means. The experimental observations are found to be in good agreement with a theoretical model of the hybrid system based on cavity frequency shift and linewidth shift. Our experimental results take the first step towards fast and sensitive readout of charge and spin states in Ge hut wire hole quantum dot.

  1. Rb-Sr measurements on metamorphic rocks from the Barro Alto Complex, Goias, Brazil

    International Nuclear Information System (INIS)

    Fuck, R.A.; Neves, B.B.B.; Cordani, U.G.; Kawashita, K.

    1988-01-01

    The Barro Alto Complex comprises a highly deformed and metamorphosed association of plutonic, volcanic, and sedimentary rocks exposed in a 150 × 25 km boomerang-like strip in Central Goias, Brazil. It is the southernmost tip of an extensive yet discontinuous belt of granulite and amphibolite facies metamorphic rocks which includes the Niquelandia and Cana Brava complexes to the north. Two rock associations are distinguished within the granulite belt. The first comprises a sequence of fine-grained mafic granulite, hypersthene-quartz-feldspar granulite, garnet quartzite, sillimanite-garnet-cordierite gneiss, calc-silicate rock, and magnetite-rich iron formation. The second association comprises medium- to coarse-grained mafic rocks. The medium-grade rocks of the western/northern portion (Barro Alto Complex) comprise both layered mafic rocks and a volcanic-sedimentary sequence, deformed and metamorphosed under amphibolite facies conditions. The fine-grained amphibolites form the basal part of the Juscelandia metavolcanic-sedimentary sequence. A geochronologic investigation by the Rb-Sr method has been carried out mainly on felsic rocks from the granulite belt and gneisses of the Juscelandia sequence. The analytical results for the Juscelandia sequence are presented. Isotope results for rocks from different outcrops along the gneiss layer near Juscelandia are also presented. In conclusion, Rb-Sr isotope measurements suggest that the Barro Alto rocks have undergone at least one important metamorphic event during Middle Proterozoic times, around 1300 Ma ago. During that event, volcanic and sedimentary rocks of the Juscelandia sequence, as well as the underlying gabbro-anorthosite layered complex, underwent deformation and recrystallization under amphibolite facies conditions. (author)

  2. A broadband variable-temperature test system for complex permittivity measurements of solid and powder materials

    Science.gov (United States)

    Zhang, Yunpeng; Li, En; Zhang, Jing; Yu, Chengyong; Zheng, Hu; Guo, Gaofeng

    2018-02-01

    A microwave test system to measure the complex permittivity of solid and powder materials as a function of temperature has been developed. The system is based on a TM0n0 multi-mode cylindrical cavity with a slotting structure, which provides purer test modes compared to a traditional cavity. To ensure safety, effectiveness, and longevity, heating and testing are carried out separately, and the sample can move between the two functional areas through an Alundum tube. Induction heating and a pneumatic platform are employed to, respectively, shorten the heating and cooling times of the sample. The single-trigger function of the vector network analyzer is added to the test software to suppress the drift of the resonance peak during testing. Complex permittivity is calculated by the rigorous field-theoretical solution considering multilayer media loading. The variation of the cavity's equivalent radius caused by the sample insertion holes is discussed in detail, and its influence on the test result is analyzed. The calibration method for the complex permittivity of the Alundum tube and quartz vial (for loading powder samples), which vary with temperature, is given. The feasibility of the system has been verified by measuring different samples over a wide range of relative permittivity and loss tangent, and variable-temperature test results of fused quartz and SiO2 powder up to 1500 °C are compared with published data. The results indicate that the presented system is reliable and accurate. The stability of the system is verified by repeated and long-term tests, and an error analysis is presented to estimate the error incurred due to the uncertainties in different error sources.

  3. The complex ion structure of warm dense carbon measured by spectrally resolved x-ray scattering

    Energy Technology Data Exchange (ETDEWEB)

    Kraus, D.; Barbrel, B.; Falcone, R. W. [Department of Physics, University of California, Berkeley, California 94720 (United States); Vorberger, J. [Max-Planck-Institut für Physik komplexer Systeme, Nöthnitzer Straße 38, 01187 Dresden (Germany); Helfrich, J.; Frydrych, S.; Ortner, A.; Otten, A.; Roth, F.; Schaumann, G.; Schumacher, D.; Siegenthaler, K.; Wagner, F.; Roth, M. [Institut für Kernphysik, Technische Universität Darmstadt, Schlossgartenstraße 9, 64289 Darmstadt (Germany); Gericke, D. O.; Wünsch, K. [Centre for Fusion, Space and Astrophysics, Department of Physics, University of Warwick, Coventry CV4 7AL (United Kingdom); Bachmann, B.; Döppner, T. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Bagnoud, V.; Blažević, A. [GSI Helmholtzzentrum für Schwerionenforschung GmbH, Planckstraße 1, 64291 Darmstadt (Germany); and others

    2015-05-15

    We present measurements of the complex ion structure of warm dense carbon close to the melting line at pressures around 100 GPa. High-pressure samples were created by laser-driven shock compression of graphite and probed by intense laser-generated x-ray sources with photon energies of 4.75 keV and 4.95 keV. High-efficiency crystal spectrometers allow for spectrally resolving the scattered radiation. Comparing the ratio of elastically and inelastically scattered radiation, we find evidence for a complex bonded liquid that is predicted by ab-initio quantum simulations showing the influence of chemical bonds under these conditions. Using graphite samples of different initial densities we demonstrate the capability of spectrally resolved x-ray scattering to monitor the carbon solid-liquid transition at relatively constant pressure of 150 GPa. Showing first single-pulse scattering spectra from cold graphite of unprecedented quality recorded at the Linac Coherent Light Source, we demonstrate the outstanding possibilities for future high-precision measurements at 4th Generation Light Sources.

  4. Measurement of unsteady convection in a complex fenestration using laser interferometry

    Energy Technology Data Exchange (ETDEWEB)

    Poulad, M.E.; Naylor, D. [Ryerson Univ., Toronto, ON (Canada). Dept. of Mechanical and Industrial Engineering; Oosthuizen, P.H. [Queen' s Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering

    2009-06-15

    Complex fenestration involving windows with between-panes louvered blinds is gaining interest as a means to control solar gains in buildings. However, the heat transfer performance of this type of shading system is not well understood, especially at high Rayleigh numbers. A Mach-Zehnder interferometer was used in this study to measure the unsteady convective heat transfer in a tall enclosure with a between-panes blind heated to simulate absorbed solar radiation. Digital cinematography was combined with laser interferometry to make time-averaged measurements of unsteady and turbulent free convective heat transfer. This paper described the procedures used to measure the time-averaged local heat flux. Under strongly turbulent conditions, the average Nusselt number for the enclosure was found to compare well with empirical correlations. A total sampling time of about ten seconds was needed in this experiment to obtain a stationary time-averaged heat flux, which was found to be relatively insensitive to the camera frame rate. The local heat flux was found to be unsteady and periodic. Heating of the blind made the flow more unstable, producing a higher-amplitude heat flux variation than for the unheated blind condition. This paper reported on only a small set of preliminary measurements; the study is being extended to other blind angles and glazing spacings. The next phase will focus on flow visualization studies to characterize the nature of the flow. 8 refs., 2 tabs., 7 figs.

  5. SOCIAL MEASUREMENT OF YOUTH’S HEALTH: DESIGNING OF INDICATORS OF COMPLEX SOCIOLOGICAL RESEARCH

    Directory of Open Access Journals (Sweden)

    Vitalii Valeriyevich Kulish

    2017-06-01

    Purpose. The article addresses the problem of the social measurement of modern youth's health. The subject of the analysis is the content of the concept, characteristics and indicators of the social health of young people, which make it possible to measure this status of the younger generation in contemporary Russian society by sociological methods. The purpose of this work is to define the theoretical and methodological foundations of the sociological analysis of young people's social health and to substantiate its main indicators in the toolkit of complex sociological research. Methodology. The research is based on the system approach, the complex approach, the logical-conceptual method and general scientific methods: comparative analysis, system analysis, construction of social indicators, and modeling. Results. The social health of young people is defined through the category of "status" and is considered an integrated indicator of the social quality of the younger generation. It is substantiated that the social health of youth is a status of a socio-demographic community in which it is able not only to adapt to the changing conditions of the social environment but is also ready to actively transform the surrounding reality, having the potential to resist destructive social phenomena and processes. The main indicators that allow the social health of young people to be measured by sociological methods are determined: adaptability in the social environment, social activity in all spheres of public life, social orientation and significance of activity, regulation of behavior by social norms and universal values, creativity of thinking and behavior, and readiness for social integration and self-development. A system of social indicators for conducting a sociological study of social health in the historical memory, value orientations and everyday practices of young people has been developed.

  6. Characterization of Activated Carbon from Coal and Its Application as Adsorbent on Mine Acid Water Treatment

    Directory of Open Access Journals (Sweden)

    Siti Hardianti

    2017-06-01

    Anthracite and sub-bituminous coal, as raw materials for activated carbon, have been utilized in the mining field as adsorbents of the hazardous heavy-metal compounds produced by mining activity. Carbon from coal was activated physically and chemically at various temperatures and particle sizes. Characterization was carried out to determine the specifications of the adsorbent produced, so that it can be applied accordingly. Proximate and ultimate analyses showed that the anthracite has a fixed carbon content of 88.91%, while the sub-bituminous coal has 49.05%. NaOH was used for chemical activation with heating at 400-500°C, whereas physical activation was conducted at 800-1000°C. The activated carbon shows high adsorption activity, indicated by the high iodine number obtained from the analysis. SEM-EDS results confirmed that the activated carbon made from coal meets the SNI quality standard and can be used as an adsorbent in acid water treatment.

  7. Proton exchange in systems: Glucose-water and uric acid-water

    International Nuclear Information System (INIS)

    Maarof, S.

    2007-01-01

    It is clear that the formation of glucose-water and uric acid-water solutions is related in principle to the acceptor-donor interaction between a hydrogen atom in water and an oxygen atom in glucose or uric acid. Proton exchange in a hydrogen-bond system is an integral process and proceeds by a tunnelling mechanism (transfer of the proton within the hydrogen bridge in these structures). The proton exchange process is very fast at low concentrations of glucose and uric acid, because these compounds are able to form more than one hydrogen bond, which assists proton transfer within the resulting structure. At high concentrations, however, the process becomes very slow owing to the higher viscosity of the solutions, which results in the breakdown of the structures and of additional hydrogen bonds. (author)

  8. Selective extraction of metals from products of mine acidic water treatment

    International Nuclear Information System (INIS)

    Andreeva, N.N.; Romanchuk, S.A.; Voronin, N.N.; Demidov, V.D.; Pasynkova, T.A.; Manuilova, O.A.; Ivanova, N.V.

    1989-01-01

    A study was made of the possibility of processing the foam products obtained during flotation purification of mine acidic waters, with the aim of selectively extracting non-ferrous metals (Co, Ni) and rare earth elements (REE) and separating them from iron, the main macrocomponent of the waters. The optimal conditions for selective metal extraction from the foam flotation products are: T=333 K, pH=3.0-3.5, solid-to-liquid ratio of 1:4-1:7, and a sulfuric acid leaching time of 30 min. Rare earth extraction under these conditions reaches 87.6-93.0%. The degree of concentration of the valuable components is ∼10. The rare earths are separated from iron by extraction methods

  9. Measuring spatial patterns in floodplains: A step towards understanding the complexity of floodplain ecosystems: Chapter 6

    Science.gov (United States)

    Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.; Gilvear, David J.; Greenwood, Malcolm T.; Thoms, Martin C.; Wood, Paul J.

    2016-01-01

    Floodplains can be viewed as complex adaptive systems (Levin, 1998) because they are composed of many different biophysical components, such as morphological features, soil groups and vegetation communities, as well as being sites of key biogeochemical processing (Stanford et al., 2005). Interactions and feedbacks among the biophysical components often result in additional phenomena occurring over a range of scales, often in the absence of any controlling factors (sensu Hallet, 1990). This emergence of new biophysical features and rates of processing can lead to alternative stable states which feed back into floodplain adaptive cycles (cf. Hughes, 1997; Stanford et al., 2005). Interactions between different biophysical components, feedbacks, self-emergence and scale are all key properties of complex adaptive systems (Levin, 1998; Phillips, 2003; Murray et al., 2014) and therefore influence the manner in which we study and view spatial patterns. Measuring the spatial patterns of floodplain biophysical components is a prerequisite to examining and understanding these ecosystems as complex adaptive systems. Elucidating relationships between pattern and process, which are intrinsically linked within floodplains (Ward et al., 2002), is dependent upon an understanding of spatial pattern. This knowledge can help river scientists determine the major drivers, controllers and responses of floodplain structure and function, as well as the consequences of altering those drivers and controllers (Hughes and Cass, 1997; Whited et al., 2007). Interactions and feedbacks between physical, chemical and biological components of floodplain ecosystems create and maintain a structurally diverse and dynamic template (Stanford et al., 2005). This template influences subsequent interactions between components that consequently affect system trajectories within floodplains (sensu Bak et al., 1988). Constructing and evaluating models used to predict floodplain ecosystem responses to

  10. A physicochemical study of Al(+3) interactions with edible seaweed biomass in acidic waters.

    Science.gov (United States)

    Lodeiro, Pablo; López-García, Marta; Herrero, Luz; Barriada, José L; Herrero, Roberto; Cremades, Javier; Bárbara, Ignacio; Sastre de Vicente, Manuel E

    2012-09-01

    In this article, a study of the interactions of Al(+3) in acidic waters with biomass of different edible seaweeds has been performed: brown (Fucus vesiculosus, Saccorhiza polyschides), red (Mastocarpus stellatus, Gelidium sesquipedale, Chondrus crispus), and green (Ulva rigida, Codium tomentosum). The influence of both the initial metal concentration and the solution pH on the Al-uptake capacity of the biomass has been analyzed. From preliminary tests, the species Fucus vesiculosus and Gelidium sesquipedale were selected for a more exhaustive analysis. Sorption kinetic studies demonstrated that 60 min is sufficient to reach equilibrium, and the intraparticle diffusion model has been used to describe the kinetic data. Equilibrium studies have been carried out at pH values of 1, 2.5, and 4. Langmuir isotherms showed that the best uptake values, obtained at pH 4, were 33 mg/g for F. vesiculosus and 9.2 mg/g for G. sesquipedale. These edible seaweeds have been found particularly effective in binding aluminum ions under most of the conditions tested. The physicochemical data reported at these low pH values could be of interest not only for modeling the pharmacokinetic processes of aluminum-containing antacids with food in the stomach (pH values 1 to 3) but also for remediation studies in acidic waters. Aluminum is thought to be linked to neurological disorders such as Alzheimer's disease. In this article, the adsorption ability of different types of edible seaweeds toward aluminum has been studied. The choice of low pH values reflects the fact that the stomach is acidic, with a pH between 1 and 3 as a consequence of hydrochloric acid secretion; the physicochemical data reported in this study could therefore be of interest in modeling drug-food interactions, in particular the pharmacokinetic processes of aluminum-containing antacids with food in the gastrointestinal tract. © 2012 Institute of Food Technologists®
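
    The Langmuir analysis mentioned above can be illustrated with a short sketch using the common linearized form C/q = C/qmax + 1/(b*qmax), fitted by ordinary least squares. The data below are synthetic, generated with the record's qmax = 33 mg/g for F. vesiculosus and a hypothetical affinity constant b:

```python
def fit_langmuir(c, q):
    """Fit q = qmax*b*c/(1+b*c) via the linearized form c/q = c/qmax + 1/(b*qmax)."""
    y = [ci / qi for ci, qi in zip(c, q)]   # transformed ordinate c/q
    n = len(c)
    sx, sy = sum(c), sum(y)
    sxx = sum(xi * xi for xi in c)
    sxy = sum(xi * yi for xi, yi in zip(c, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    qmax = 1.0 / slope          # mg/g
    b = slope / intercept       # L/mg
    return qmax, b

# Synthetic equilibrium data (C in mg/L) generated with qmax = 33, b = 0.1:
qmax_true, b_true = 33.0, 0.1
c = [5.0, 10.0, 25.0, 50.0, 100.0, 200.0]
q = [qmax_true * b_true * ci / (1.0 + b_true * ci) for ci in c]
qmax, b = fit_langmuir(c, q)
```

    With noisy experimental data, a nonlinear fit of the untransformed isotherm is usually preferable, since the linearization distorts the error structure.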

  11. Vertical profiles of urban aerosol complex refractive index in the frame of ESQUIF airborne measurements

    Science.gov (United States)

    Raut, J.-C.; Chazette, P.

    2008-02-01

    A synergy between lidar, sunphotometer and in situ measurements has been applied to airborne observations performed during the Etude et Simulation de la QUalité de l'air en Ile-de-France (ESQUIF) campaign, enabling the retrieval of vertical profiles of the aerosol complex refractive index (ACRI) and single-scattering albedo with a vertical resolution of 200 m over the Paris area. The value of the ACRI averaged over the entire planetary boundary layer (PBL) is close to 1.51(±0.02)-i0.017(±0.003) at 532 nm. The single-scattering albedo of the corresponding aerosols is found to be ~0.9 at the same wavelength. Good agreement is found with previous studies of urban aerosols. A comparison of vertical profiles of the ACRI with simulations combining in situ measurements and relative humidity (RH) profiles has highlighted a modification of aerosol optical properties linked to their history and the origin of the air mass. The determination of the ACRI in the atmospheric column made it possible to retrieve vertical profiles of the extinction coefficient in accordance with the lidar profile measurements.

  12. Surface Complexation Modeling of Calcite Zeta Potential Measurement in Mixed Brines for Carbonate Wettability Characterization

    Science.gov (United States)

    Song, J.; Zeng, Y.; Biswal, S. L.; Hirasaki, G. J.

    2017-12-01

    We present zeta potential measurements and surface complexation modeling (SCM) of synthetic calcite under various conditions. The systematic zeta potential measurements and the proposed SCM provide insight into the role of four potential determining ions (Mg2+, SO42-, Ca2+ and CO32-) and of the CO2 partial pressure in calcite surface charge formation, and help reveal the calcite wettability alteration induced by brines with designed ionic composition ("smart water"). Brines with varying potential determining ion (PDI) concentrations at two different CO2 partial pressures (PCO2) are investigated in experiments. A double-layer SCM is then developed to model the zeta potential measurements. Moreover, we propose a definition for the contribution of charged surface species and quantitatively analyze the variation of the charged species contributions when changing the brine composition. After showing that our model can accurately predict calcite zeta potential in brines containing mixed PDIs, we apply it to predict zeta potential in ultra-low and pressurized CO2 environments for potential applications in carbonate enhanced oil recovery, including miscible CO2 flooding and CO2 sequestration in carbonate reservoirs. The model prediction reveals that a pure calcite surface will be positively charged in all investigated brines in a pressurized CO2 environment (>1 atm). Moreover, the sensitivity of calcite zeta potential to CO2 partial pressure in the various brines is found to follow the sequence Na2CO3 > Na2SO4 > NaCl > MgCl2 > CaCl2 (ionic strength = 0.1 M).
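
    Zeta potentials such as those modeled above are commonly obtained from measured electrophoretic mobilities via the Helmholtz-Smoluchowski relation ζ = μη/(ε_r·ε_0), valid for thin double layers. A minimal sketch with illustrative water properties at 25 °C (the record does not state how its mobilities were converted, so this is a generic example rather than the authors' procedure):

```python
EPS0 = 8.854e-12      # vacuum permittivity, F/m
EPS_R_WATER = 78.5    # relative permittivity of water at 25 C
ETA_WATER = 0.89e-3   # dynamic viscosity of water at 25 C, Pa*s

def zeta_smoluchowski(mobility, eps_r=EPS_R_WATER, eta=ETA_WATER):
    """Zeta potential (V) from electrophoretic mobility (m^2 V^-1 s^-1),
    using the Helmholtz-Smoluchowski thin-double-layer approximation."""
    return mobility * eta / (eps_r * EPS0)

# An illustrative mobility of -2e-8 m^2/(V s) gives roughly -26 mV:
zeta_mv = zeta_smoluchowski(-2e-8) * 1e3
```

    At the 0.1 M ionic strength quoted in the record the double layer is indeed thin, so the Smoluchowski limit is the usual choice; at low ionic strength a Henry-type correction would be needed.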

  13. Measuring Early Communication in Spanish Speaking Children: The Communication Complexity Scale in Peru.

    Science.gov (United States)

    Atwood, Erin; Brady, Nancy C; Esplund, Amy

    There is a great need in the United States to develop presymbolic evaluation tools that are widely available and accurate for individuals who come from a bilingual and/or multicultural setting. The Communication Complexity Scale (CCS) is a measure that evaluates expressive presymbolic communication, including gestures, vocalizations and eye gaze. A study of the effectiveness of this tool in a Spanish-speaking environment was undertaken to determine the applicability of the CCS to Spanish-speaking children. Methods & Procedures: In 2011-2012, researchers from the University of Kansas and Centro Ann Sullivan del Perú (CASP) investigated communication in a cohort of 71 young Spanish-speaking children with developmental disabilities and a documented history of self-injurious, stereotyped and aggressive behaviors. Communication was assessed first by parental report with translated versions of the Communication and Symbolic Behavior Scales (CSBS), a well-known assessment of early communication, and then eleven months later with the CCS. We hypothesized that the CCS and CSBS measures would be significantly correlated in this population of Spanish-speaking children. The CSBS scores from time 1 (mean participant age 41 months) were found to have a strong positive relationship with the CCS scores obtained at time 2 (mean participant age 52 months). The CCS is thus strongly correlated with a widely accepted measure of early communication. These findings support the validity of the Spanish version of the CCS and demonstrate its usefulness for children from another culture and for children in a Spanish-speaking environment.

  14. Growing complex network of citations of scientific papers: Modeling and measurements.

    Science.gov (United States)

    Golosovsky, Michael; Solomon, Sorin

    2017-01-01

    We consider the network of citations of scientific papers and use a combination of theoretical and experimental tools to uncover microscopic details of this network's growth. Namely, we develop a stochastic model of citation dynamics based on the copying-redirection-triadic closure mechanism. In a complementary and coherent way, the model accounts both for the statistics of references of scientific papers and for their citation dynamics. Originating in empirical measurements, the model is cast in such a way that it can be verified quantitatively in every aspect. Such validation is performed by measuring the citation dynamics of physics papers. The measurements revealed nonlinear citation dynamics, the nonlinearity being intricately related to network topology. The nonlinearity has far-reaching consequences including nonstationary citation distributions, diverging citation trajectories of similar papers, runaways or "immortal papers" with infinite citation lifetime, etc. This nonlinearity in complex network growth is our most important finding. In a more specific context, our results can be a basis for quantitative probabilistic prediction of the citation dynamics of individual papers and of the journal impact factor.
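
    The copying/redirection mechanism invoked above can be illustrated with a toy simulation. This is a generic sketch, not the authors' calibrated model: the fixed number of references per paper, the redirection probability and the omission of triadic closure are all simplifying assumptions. Redirection (citing a reference of a randomly chosen paper) is what effectively produces preferential attachment:

```python
import random

def grow_citation_network(n_papers, m=3, p_direct=0.5, seed=1):
    """Toy copying/redirection growth: each new paper makes up to m citations;
    with probability p_direct it cites a uniformly random earlier paper,
    otherwise it copies a random reference of a randomly chosen earlier paper."""
    random.seed(seed)
    refs = [[] for _ in range(n_papers)]  # refs[i] = papers cited by paper i
    for new in range(1, n_papers):
        targets = set()
        while len(targets) < min(m, new):
            anchor = random.randrange(new)
            if random.random() < p_direct or not refs[anchor]:
                targets.add(anchor)              # direct citation
            else:
                targets.add(random.choice(refs[anchor]))  # redirection
        refs[new] = sorted(targets)
    return refs

refs = grow_citation_network(2000)
indeg = [0] * len(refs)                # citation counts
for cited in refs:
    for t in cited:
        indeg[t] += 1
total = sum(indeg)
```

    Inspecting `indeg` shows the familiar heavy-tailed citation distribution: a few early papers accumulate many citations while most receive few.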

  15. Prediction of the dynamic response of complex transmission line systems for unsteady pressure measurements

    International Nuclear Information System (INIS)

    Antonini, C; Persico, G; Rowe, A L

    2008-01-01

    Among the measurement and control systems of gas turbine engines, a recent issue is the possibility of performing unsteady pressure measurements to detect flow anomalies in an engine or to evaluate loads on aerodynamic surfaces. A possible answer to this demand could be to extend the use of the well-known and widely used transmission line systems, which have so far been applied to steady monitoring, to unsteady measurements by means of proper dynamic modeling and compensation. Despite the huge number of models existing in the literature, a novel method has been developed which is at the same time easy to handle, flexible and capable of reproducing the actual physics of the problem. Furthermore, the new model is able to deal with arbitrarily complex networks of lines and cavities, and thus its applicability is not limited to series-connected systems. The main objectives of this paper are to show the derivation of the model, its validation against experimental tests and an example of its applicability.

  16. A fluorescence anisotropy method for measuring protein concentration in complex cell culture media.

    Science.gov (United States)

    Groza, Radu Constantin; Calvet, Amandine; Ryder, Alan G

    2014-04-22

    The rapid, quantitative analysis of the complex cell culture media used in biopharmaceutical manufacturing is of critical importance. Requirements for cell culture media composition profiling, or changes in specific analyte concentrations (e.g. amino acids in the media or product protein in the bioprocess broth), often necessitate the use of complicated analytical methods and extensive sample handling. Rapid spectroscopic methods like multi-dimensional fluorescence (MDF) spectroscopy have been successfully applied for the routine determination of compositional changes in cell culture media and bioprocess broths. Quantifying macromolecules in cell culture media is a specific challenge, as there is a need to implement measurements rapidly on the prepared media. However, the use of standard fluorescence spectroscopy is complicated by the emission overlap from many media components. Here, we demonstrate how combining anisotropy measurements with standard total synchronous fluorescence spectroscopy (TSFS) provides a rapid, accurate quantitation method for cell culture media. Anisotropy provides emission resolution between large and small fluorophores while TSFS provides a robust measurement space. Model cell culture media was prepared using yeastolate (2.5 mg/mL) spiked with bovine serum albumin (BSA; 0 to 5 mg/mL). Using this method, protein emission is clearly discriminated from background yeastolate emission, allowing for accurate BSA quantification over a 0.1 to 4.0 mg/mL range with a limit of detection (LOD) of 13.8 μg/mL. Copyright © 2014. Published by Elsevier B.V.
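
    The anisotropy measurement underlying this method reduces to the standard steady-state formula r = (Ipar - G*Iperp)/(Ipar + 2*G*Iperp), where G corrects for the relative sensitivity of the two detection channels. A minimal sketch with illustrative intensities (not values from the study): a large, slowly tumbling protein such as BSA retains high anisotropy, while small media fluorophores depolarize almost completely.

```python
def anisotropy(i_par, i_perp, g=1.0):
    """Steady-state fluorescence anisotropy r = (Ipar - G*Iperp)/(Ipar + 2*G*Iperp).

    g is the instrument G-factor correcting for detection-channel sensitivity."""
    return (i_par - g * i_perp) / (i_par + 2.0 * g * i_perp)

# Illustrative intensities (arbitrary units):
r_protein = anisotropy(100.0, 40.0)  # slowly rotating macromolecule, r = 1/3
r_small = anisotropy(100.0, 95.0)    # fast-rotating small fluorophore, r near 0
```

    It is this separation in r between macromolecules and small fluorophores that lets the anisotropy axis resolve protein emission from the overlapping yeastolate background.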

  17. PAFit: A Statistical Method for Measuring Preferential Attachment in Temporal Complex Networks.

    Directory of Open Access Journals (Sweden)

    Thong Pham

    Preferential attachment is a stochastic process that has been proposed to explain certain topological features characteristic of complex networks from diverse domains. The systematic investigation of preferential attachment is an important area of research in network science, not only for the theoretical matter of verifying whether this hypothesized process is operative in real-world networks, but also for the practical insights that follow from knowledge of its functional form. Here we describe a maximum-likelihood-based estimation method for the measurement of preferential attachment in temporal complex networks. We call the method PAFit and implement it in an R package of the same name. PAFit constitutes an advance over previous methods primarily because we based it on a nonparametric statistical framework that enables attachment kernel estimation free of any assumptions about its functional form. We show that this results in PAFit outperforming the popular methods of Jeong and Newman in Monte Carlo simulations. What is more, we found that the application of PAFit to a publicly available Flickr social network dataset yielded clear evidence for a deviation of the attachment kernel from the popularly assumed log-linear form. Independent of our main work, we provide a correction to a consequential error in Newman's original method, which had evidently gone unnoticed since its publication over a decade ago.
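
    The nonparametric idea behind attachment-kernel estimation can be sketched in a few lines: grow a network with a known (here linear) kernel, then estimate A_k as the number of attachments received by degree-k nodes divided by their cumulative exposure (node-steps spent at degree k). This is a simplified ratio-style illustration of the general approach, not the PAFit maximum-likelihood estimator itself:

```python
import random

def simulate_and_estimate(n_nodes, seed=7):
    """Grow a network by linear preferential attachment (one edge per new node,
    kernel A_k proportional to k) and estimate the kernel nonparametrically:
    A_k ~ (attachments received at degree k) / (node-steps spent at degree k)."""
    random.seed(seed)
    deg = [1, 1]    # start from a single edge between nodes 0 and 1
    attach = {}     # attachments received, keyed by the target's degree at that moment
    exposure = {}   # node-steps spent at each degree
    for _ in range(2, n_nodes):
        for k in deg:
            exposure[k] = exposure.get(k, 0) + 1
        # pick an attachment target with probability proportional to its degree
        r = random.uniform(0, sum(deg))
        acc, target = 0.0, 0
        for i, k in enumerate(deg):
            acc += k
            if r <= acc:
                target = i
                break
        attach[deg[target]] = attach.get(deg[target], 0) + 1
        deg[target] += 1
        deg.append(1)   # the new node arrives with degree 1
    return {k: attach[k] / exposure[k] for k in attach if exposure[k] > 0}

kernel = simulate_and_estimate(2000)
```

    For a linear kernel the estimated ratio kernel[2]/kernel[1] should come out close to 2; PAFit replaces this simple ratio estimate with a regularized maximum-likelihood fit over all degrees and time steps.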

  18. A New Efficient Analytical Method for Picolinate Ion Measurements in Complex Aqueous Solutions

    Energy Technology Data Exchange (ETDEWEB)

    Parazols, M.; Dodi, A. [CEA Cadarache, Lab Anal Radiochim and Chim, DEN, F-13108 St Paul Les Durance (France)

    2010-07-01

    This study focuses on the development of a new, simple but sensitive, fast and quantitative liquid chromatography method for picolinate ion measurement in high-ionic-strength aqueous solutions. It involves cation separation on a chromatographic CS16 column using methanesulfonic acid as the mobile phase and detection by UV absorbance (254 nm). The CS16 column is a high-capacity stationary phase exhibiting both cation-exchange and reversed-phase (RP) properties. It allows interaction with picolinate ions, which are in their zwitterionic form at the pH of the mobile phase (1.3-1.7). Analysis is performed in 30 min with a detection limit of about 0.05 μM and a quantification limit of about 0.15 μM. Moreover, this analytical technique has been tested successfully on complex aqueous samples from an effluent treatment facility. (authors)
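
    Detection and quantification limits of the kind quoted above are commonly estimated from the calibration slope S and the blank noise σ as LOD = 3.3·σ/S and LOQ = 10·σ/S (IUPAC-style; the record does not state which convention was used). A minimal sketch with hypothetical σ and S values, chosen only to produce numbers of the same order as the record's 0.05/0.15 μM:

```python
def detection_limits(sigma_blank, slope, k_lod=3.3, k_loq=10.0):
    """IUPAC-style estimates from a linear calibration:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S (sigma: blank noise, S: slope)."""
    lod = k_lod * sigma_blank / slope
    loq = k_loq * sigma_blank / slope
    return lod, loq

# Hypothetical calibration: noise 0.003 AU, slope 0.2 AU per uM
lod, loq = detection_limits(sigma_blank=0.003, slope=0.2)
```

    Note that LOQ/LOD is fixed at about 3 by the convention itself, which matches the 0.15/0.05 ratio reported in the record.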

  19. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    International Nuclear Information System (INIS)

    Cuerva, A.

    1997-01-01

    Several groups are working on the improvement of the existing standard and recommendations on WECS power performance measurement and analysis. One of them, besides the group working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 (Ref. [9]), to current requirements on technical quality and trueness. Within this group and the MEASNET group, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones regard the inherent characteristics of complex terrain and, specifically, the issue of site calibration and the uncertainties due to it. (Author)

  20. Measuring complexity in a business cycle model of the Kaldor type

    International Nuclear Information System (INIS)

    Januario, Cristina; Gracio, Clara; Duarte, Jorge

    2009-01-01

    The purpose of this paper is to study the dynamical behavior of a family of two-dimensional nonlinear maps associated with an economic model. Our objective is to measure the complexity of the system using techniques of symbolic dynamics in order to compute the topological entropy. The analysis of the variation of this important topological invariant with the parameters of the system allows us to distinguish different chaotic scenarios. Finally, we use another topological invariant to distinguish isentropic dynamics, and we exhibit numerical results about maps with the same topological entropy. This work provides an illustration of how our understanding of higher-dimensional economic models can be enhanced by the theory of dynamical systems.
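
    For piecewise-monotone interval maps, topological entropy can be computed via symbolic dynamics as the growth rate of the number of monotone laps of the iterates, h = lim (1/n) log(laps of f^n). A hedged one-dimensional sketch (the paper treats two-dimensional Kaldor-type maps, which require more machinery; here the full tent map, whose entropy is exactly log 2, serves as a test case):

```python
import math

def tent(x, s=2.0):
    """Tent map on [0,1] with slope s; topological entropy is log(s) for s in (1,2]."""
    return s * x if x < 0.5 else s * (1.0 - x)

def lap_count(f, n, grid=20000):
    """Number of monotone laps of the n-th iterate of f on [0,1], counted as
    1 + (sign changes of the discrete slope on a fine uniform grid)."""
    ys = [i / grid for i in range(grid + 1)]
    for _ in range(n):
        ys = [f(y) for y in ys]
    laps, last_sign = 1, 0
    for a, b in zip(ys, ys[1:]):
        d = b - a
        sign = (d > 0) - (d < 0)
        if sign != 0:
            if last_sign != 0 and sign != last_sign:
                laps += 1
            last_sign = sign
    return laps

n = 10
h = math.log(lap_count(tent, n)) / n  # should approach log 2 = 0.693...
```

    The grid must be fine enough to resolve every lap of f^n (the full tent map has 2^n laps), which is why lap counting is practical only for modest n; symbolic-dynamics methods such as kneading theory give the same growth rate without brute-force iteration.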

  1. Simultaneous measurement of amyloid fibril formation by dynamic light scattering and fluorescence reveals complex aggregation kinetics.

    Directory of Open Access Journals (Sweden)

    Aaron M Streets

    An apparatus that combines dynamic light scattering and Thioflavin T fluorescence detection is used to simultaneously probe fibril formation in polyglutamine peptides, the aggregating subunit associated with Huntington's disease, in vitro. Huntington's disease is a neurodegenerative disorder in a class of human pathologies that includes Alzheimer's and Parkinson's disease. These pathologies are all related by the propensity of their associated protein or polypeptide to form insoluble, β-sheet rich, amyloid fibrils. Despite the wide range of amino acid sequence in the aggregation prone polypeptides associated with these diseases, the resulting amyloids display strikingly similar physical structure, an observation which suggests a physical basis for amyloid fibril formation. Thioflavin T fluorescence reports β-sheet fibril content while dynamic light scattering measures particle size distributions. The combined techniques allow elucidation of complex aggregation kinetics and are used to reveal multiple stages of amyloid fibril formation.

  2. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    Energy Technology Data Exchange (ETDEWEB)

    Cuerva, A.

    1997-10-01

    Several groups are working on the improvement of the existing standard and recommendations on WECS power performance measurement and analysis. One of them, besides the group working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 (Ref. [9]), to current requirements on technical quality and trueness. Within this group and the MEASNET group, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones regard the inherent characteristics of complex terrain and, specifically, the issue of site calibration and the uncertainties due to it. (Author)

  3. Estimating the operator's performance time of emergency procedural tasks based on a task complexity measure

    International Nuclear Information System (INIS)

    Jung, Won Dae; Park, Jink Yun

    2012-01-01

    It is important to understand the amount of time required to execute an emergency procedural task in a high-stress situation for managing human performance under emergencies in a nuclear power plant. However, the time to execute an emergency procedural task is highly dependent upon expert judgment due to the lack of actual data. This paper proposes an analytical method to estimate the operator's performance time (OPT) of a procedural task, which is based on a measure of the task complexity (TACOM). The proposed method for estimating an OPT is an equation that uses the TACOM as a variable, and the OPT of a procedural task can be calculated if its relevant TACOM score is available. The validity of the proposed equation is demonstrated by comparing the estimated OPTs with the observed OPTs for emergency procedural tasks in a steam generator tube rupture scenario.
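
    The record does not reproduce the estimation equation itself. As a hedged sketch of the general idea, assume (for illustration only) that ln(OPT) is linear in the TACOM score and fit the two coefficients by least squares on hypothetical (TACOM, observed time) pairs:

```python
import math

def fit_loglinear(tacom, opt):
    """Least-squares fit of ln(OPT) = a + b*TACOM; returns (a, b)."""
    y = [math.log(t) for t in opt]
    n = len(tacom)
    sx, sy = sum(tacom), sum(y)
    sxx = sum(xi * xi for xi in tacom)
    sxy = sum(xi * yi for xi, yi in zip(tacom, y))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def predict_opt(a, b, tacom_score):
    """Predicted operator performance time (seconds) for a given TACOM score."""
    return math.exp(a + b * tacom_score)

# Hypothetical (TACOM score, observed time in seconds) pairs:
data = [(1.5, 20.0), (2.0, 33.0), (2.5, 55.0), (3.0, 90.0), (3.5, 148.0)]
a, b = fit_loglinear([t for t, _ in data], [o for _, o in data])
```

    Once calibrated on observed task times, `predict_opt` plays the role the abstract describes: given only a procedural task's TACOM score, it returns an estimated performance time. The functional form and the data here are assumptions for illustration, not the authors' published equation.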

  4. Combination of optically measured coordinates and displacements for quantitative investigation of complex objects

    Science.gov (United States)

    Andrae, Peter; Beeck, Manfred-Andreas; Jueptner, Werner P. O.; Nadeborn, Werner; Osten, Wolfgang

    1996-09-01

    Holographic interferometry makes it possible to measure high-precision displacement data in the range of the wavelength of the laser light used. However, the determination of 3D displacement vectors of objects with complex surfaces requires the measurement of 3D object coordinates, not only to account for local sensitivities but also to distinguish between in-plane components, i.e. strains, and out-of-plane components, i.e. shears. For this purpose both the surface displacements and the coordinates have to be combined, and it is advantageous to make the data available to CAE systems. The object surface has to be approximated analytically from the measured point cloud to generate a surface mesh. The displacement vectors can be assigned to the nodes of this surface mesh for visualization of the deformation of the object under test. They can also be compared to the results of FEM calculations or used as boundary conditions for further numerical investigations. Here the 3D object coordinates are measured in a separate topometric set-up using a modified fringe projection technique to acquire absolute phase values and a sophisticated geometrical model to map these phase data onto coordinates precisely. The determination of 3D displacement vectors requires the measurement of several interference phase distributions for at least three independent sensitivity directions, depending on the observation and illumination directions as well as the 3D position of each measuring point. These geometric quantities have to be transformed into a reference coordinate system of the interferometric set-up in order to calculate the geometric matrix. The necessary transformation can be realized by means of a detection of object features in both data sets and a subsequent determination of the external camera orientation. This paper presents a consistent solution for the measurement and combination of shape and displacement data, including their transformation into simulation systems.

  5. Progressive evolution and a measure for its noise-dependent complexity

    Science.gov (United States)

    Fussy, Siegfried; Grössing, Gerhard; Schwabl, Herbert

    1999-03-01

    -Queen-effect." Additionally, for the memory-based model a parameter was found indicating a limited range of noise that allows for the most complex behavior of the model, whereas the entropy of the system provides only a monotonic measure with respect to the varying noise level.

  6. FLOODPLAIN-CHANNEL COMPLEX OF SMALL RIVER: ASSESSMENT OF CURRENT STATE, OPTIMIZATION MEASURES

    Directory of Open Access Journals (Sweden)

    Kovalchuk I.

    2016-05-01

    Full Text Available The article describes the main methodological principles of geoecological assessment of the riverbed-floodplain complex of one of the small rivers in the Ukrainian Carpathians. Based on our long-term field, cartographic, laboratory and remote sensing research, the riverbed was divided into homogeneous geoecological segments, which were classified according to the trends of unfavorable processes. The main reasons for the deterioration of the quality characteristics of the channel-floodplain river complex were outlined; the role of natural and anthropogenic factors in the deterioration of the geoecological condition of the river and its floodplain complex was analyzed. Based on the assessment results, the condition of the studied segments of the Berezhnytsya river floodplain-channel complex was rated as “excellent”, “good” or “satisfactory”. An “unsatisfactory” or “catastrophic” river and floodplain condition has not been detected yet, although within the Dashava urban settlement the condition of the river area is close to the “satisfactory” grade. The situation is best at the river head, as human impact is minimal there and natural vegetation is preserved. Downstream, we trace a tendency of worsening condition as the anthropogenic load on the basin system and the floodplain-channel complex increases. Its negative impact is balanced by large forests, so in the segments between the villages of Banya Lysovytska and Lotatnyky the river and floodplain condition is rated as “good”. Downstream of these villages the value of the forest as an important natural barrier diminishes and the anthropogenic load on the river increases significantly, manifesting itself in intensive agricultural reclamation and housing development of floodplains. Since degradation processes are rapidly developing over a considerable part of the Berezhnytsya river, negative changes are visible and only the study area

  7. Local difference measures between complex networks for dynamical system model evaluation.

    Science.gov (United States)

    Lange, Stefan; Donges, Jonathan F; Volkholz, Jan; Kurths, Jürgen

    2015-01-01

    A faithful modeling of real-world dynamical systems necessitates model evaluation. A recent promising methodological approach to this problem is based on complex networks, which in turn have proven useful for the characterization of dynamical systems. In this context, we introduce three local network difference measures and demonstrate their capabilities in the field of climate modeling, where these measures facilitate a spatially explicit model evaluation. Building on a recent study by Feldhoff et al. [8], we comparatively analyze statistical and dynamical regional climate simulations of the South American monsoon system. Several types of climate networks representing different aspects of rainfall dynamics are constructed from the modeled precipitation space-time series. Specifically, we define simple graphs based on positive as well as negative rank correlations between rainfall anomaly time series at different locations, and graphs based on spatial synchronizations of extreme rain events. An evaluation against respective networks built from daily satellite data provided by the Tropical Rainfall Measuring Mission 3B42 V7 reveals far greater differences in model performance between network types for a fixed but arbitrary climate model than between climate models for a fixed but arbitrary network type. We identify two sources of uncertainty in this respect. Firstly, climate variability limits fidelity, particularly in the case of the extreme event network; and secondly, larger geographical link lengths render link misplacements more likely, most notably in the case of the anticorrelation network; both contributions are quantified using suitable ensembles of surrogate networks. Our model evaluation approach is applicable to any multidimensional dynamical system, and our simple graph difference measures in particular are highly versatile, as the graphs to be compared may be constructed in whatever way required. Generalizations to directed as well as edge- and node
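
    The general idea of a spatially explicit, node-wise comparison can be sketched as follows. This is an illustrative local degree-difference measure, not the authors' exact definitions:

```python
import numpy as np

# Sketch of a *local* network difference measure: for two networks A and B
# on the same node set (adjacency matrices), compare each node's degree.
# This illustrates the idea of spatially explicit comparison; the paper's
# actual three measures are defined differently.
def local_degree_difference(A, B):
    kA = A.sum(axis=1)                  # degree of each node in network A
    kB = B.sum(axis=1)                  # degree of each node in network B
    denom = np.maximum(kA + kB, 1)      # avoid division by zero
    return np.abs(kA - kB) / denom      # one value per node, in [0, 1]

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]])
B = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
print(local_degree_difference(A, B))
```

    Because each node keeps its geographic location, such a per-node score can be mapped back onto the model grid for spatially explicit evaluation.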

  8. Characterization of a complex near-surface structure using well logging and passive seismic measurements

    Science.gov (United States)

    Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara

    2016-04-01

    We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area presents several near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectral ratio method (H/V). Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact. This technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness with a maximum of around 40 m, and shear-wave velocity profiles from the array technique. A comparison between the seismic velocities obtained from well logging and from array measurements defines the resolution limits of the passive seismic techniques and aids their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that can be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.
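
    The H/V computation itself is a short spectral calculation. A minimal sketch on a synthetic three-component record, using the common quadratic-mean definition of the horizontal spectrum (real processing adds windowing, spectral smoothing such as Konno-Ohmachi, and averaging over many time windows):

```python
import numpy as np

# Minimal horizontal-to-vertical spectral ratio (H/V) sketch on synthetic
# three-component ambient noise. Parameters below are illustrative.
rng = np.random.default_rng(0)
fs = 100.0                       # sampling rate in Hz (assumed)
n = 4096
z = rng.normal(size=n)           # vertical component
north = rng.normal(size=n)       # horizontal components
east = rng.normal(size=n)

freqs = np.fft.rfftfreq(n, d=1 / fs)
Z = np.abs(np.fft.rfft(z))
H = np.sqrt((np.abs(np.fft.rfft(north)) ** 2 +
             np.abs(np.fft.rfft(east)) ** 2) / 2)  # quadratic mean

hv = H / np.maximum(Z, 1e-12)    # H/V ratio as a function of frequency
f_peak = freqs[np.argmax(hv)]    # peak frequency relates to resonance,
print(f_peak)                    # and hence to sediment thickness
```

    On real data, the H/V peak frequency combined with a shear-wave velocity estimate yields the sediment thickness map mentioned above.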

  9. Radiometric characterization of six soils in the microwave X-range through complex permittivity measurements

    International Nuclear Information System (INIS)

    Palme, U.W.

    1987-10-01

    Estimating and monitoring up-to-date soil moisture conditions over extensive areas through passive (or active) microwave remote sensing techniques requires knowledge of the complex relative permittivity (ε_r*) as a function of soil moisture. X-band measurements of ε_r* for different moisture conditions were made in the laboratory for samples of six important soils (PV2, LV3, LRd, LE1, SAP and Sc). Using a theoretical model and purpose-developed computer programmes, these measurements allowed estimates of the emissive characteristics of the soils that would be expected with the X-band Microwave Radiometer built at INPE. The results, the first for soils from tropical regions, showed that the physical characteristics and properties of the soils alone are not sufficient to explain the behaviour of ε_r* as a function of soil moisture, indicating that the chemical and/or mineralogical properties of the soils make an important contribution. The results also showed that ε_r* as a function of soil moisture depends on soil class. (author)
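
    For context, the most widely used soil-independent calibration between real permittivity and volumetric water content is the empirical Topp et al. (1980) polynomial; the soil-class dependence reported above is precisely what such a universal curve cannot capture. A sketch (not from this paper):

```python
# Empirical Topp et al. (1980) relations between apparent relative
# permittivity (eps) and volumetric water content (theta, m3/m3).
# Shown for context only: the study above found a single universal
# curve to be insufficient because soil class matters.

def topp_eps_from_theta(theta):
    return 3.03 + 9.3 * theta + 146.0 * theta**2 - 76.7 * theta**3

def topp_theta_from_eps(eps):
    return -5.3e-2 + 2.92e-2 * eps - 5.5e-4 * eps**2 + 4.3e-6 * eps**3

for theta in (0.05, 0.20, 0.40):
    eps = topp_eps_from_theta(theta)
    print(theta, round(eps, 2), round(topp_theta_from_eps(eps), 3))
```

    The two polynomials are independent empirical fits, so the round trip is only approximate (to within about 0.01-0.02 m3/m3).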

  10. Synchronous Measurement of Ultrafast Anisotropy Decay of the B850 in Bacterial LH2 Complex

    International Nuclear Information System (INIS)

    Wang Yun-Peng; Du Lu-Chao; Zhu Gang-Bei; Wang Zhuan; Weng Yu-Xiang

    2015-01-01

    Ultrafast anisotropy decay is a prominent parameter revealing ultrafast energy and electron transfer; however, it is difficult to determine reliably because the parallel and perpendicular polarized decay kinetics must be available simultaneously. To date, any measurement of anisotropy decay is only an approximation to exact simultaneity. Here we report a novel method for synchronous ultrafast anisotropy decay measurement, which can determine the anisotropy well even at very early times, such as during the rising phase of the excitation laser pulse. The anisotropy decay of the B850 in the bacterial light harvesting antenna complex LH2 of Rhodobacter sphaeroides in solution at room temperature under coherent excitation is detected by this method, which shows a polarization response time of 30 fs, and the energy transfer from the initial excitation to the bacteriochlorophylls in the B850 ring takes about 70 fs. The anisotropy decay probed at the red side of the absorption spectrum, such as at 880 nm, has an initial value of 0.4, corresponding to stimulated emission, while the blue side, with an anisotropy of 0.1, corresponds to ground-state bleaching. Our results show that coherent excitation covering the whole ring might not be realized owing to the symmetry breaking of LH2: from C_9 symmetry in the membrane to C_2 symmetry in solution. (atomic and molecular physics)
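
    The anisotropy itself is computed from the two polarized kinetics in the standard way, r(t) = (I_par − I_perp)/(I_par + 2·I_perp); the point of the synchronous scheme is that both traces share exactly the same time axis. A sketch with synthetic decays roughly mimicking the numbers quoted above:

```python
import numpy as np

# Standard transient-absorption/fluorescence anisotropy from simultaneously
# acquired parallel and perpendicular kinetics:
#     r(t) = (I_par - I_perp) / (I_par + 2 * I_perp)
# Synthetic example: r decays from 0.4 with a 70 fs time constant, roughly
# mimicking the B850 depolarization reported above (illustrative numbers).
t = np.linspace(0.0, 500.0, 501)        # time in fs
pop = np.exp(-t / 1000.0)               # isotropic population decay
r_true = 0.4 * np.exp(-t / 70.0)        # assumed anisotropy decay
I_par = pop * (1 + 2 * r_true)
I_perp = pop * (1 - r_true)

r = (I_par - I_perp) / (I_par + 2 * I_perp)
print(r[0])   # initial anisotropy
```

    Note that the population decay cancels exactly in the ratio, which is why the anisotropy isolates orientational (depolarization) dynamics.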

  11. Synchronous Measurement of Ultrafast Anisotropy Decay of the B850 in Bacterial LH2 Complex

    Science.gov (United States)

    Wang, Yun-Peng; Du, Lu-Chao; Zhu, Gang-Bei; Wang, Zhuan; Weng, Yu-Xiang

    2015-02-01

    Ultrafast anisotropy decay is a prominent parameter revealing ultrafast energy and electron transfer; however, it is difficult to determine reliably because the parallel and perpendicular polarized decay kinetics must be available simultaneously. To date, any measurement of anisotropy decay is only an approximation to exact simultaneity. Here we report a novel method for synchronous ultrafast anisotropy decay measurement, which can determine the anisotropy well even at very early times, such as during the rising phase of the excitation laser pulse. The anisotropy decay of the B850 in the bacterial light harvesting antenna complex LH2 of Rhodobacter sphaeroides in solution at room temperature under coherent excitation is detected by this method, which shows a polarization response time of 30 fs, and the energy transfer from the initial excitation to the bacteriochlorophylls in the B850 ring takes about 70 fs. The anisotropy decay probed at the red side of the absorption spectrum, such as at 880 nm, has an initial value of 0.4, corresponding to stimulated emission, while the blue side, with an anisotropy of 0.1, corresponds to ground-state bleaching. Our results show that coherent excitation covering the whole ring might not be realized owing to the symmetry breaking of LH2: from C9 symmetry in the membrane to C2 symmetry in solution.

  12. Complexity of MRI induced heating on metallic leads: Experimental measurements of 374 configurations

    Directory of Open Access Journals (Sweden)

    Mendoza Gonzalo

    2008-03-01

    Full Text Available Abstract Background MRI induced heating of PM leads is a very complex issue. The widely varying results described in the literature suggest that there are many factors that influence the degree of heating and that these are not always adequately addressed by existing testing methods. Methods We present a wide database of experimental measurements of the heating of metallic wires and PM leads in a 1.5 T RF coil. The aim of these measurements is to systematically quantify the contribution of some potential factors involved in MRI induced heating: the length and the geometric structure of the lead; the implant location within the body and the lead path; the shape of the phantom used to simulate the human trunk and its relative position inside the RF coil. Results We found that several factors have a primary influence on heating at the tip. Lead locations closer to the edge of the phantom and to the edge of the coil produce maximum heating. The lead length is the other crucial factor, whereas the implant area does not seem to have a major role in the induced temperature increase. The lead structure and the geometry of the phantom also proved to be elements that can significantly modify the amount of heating. Conclusion Our findings highlight the factors that have significant effects on MRI induced heating of implanted wires and leads. These factors must be taken into account by those who plan to study or model MRI heating of implants. Our data should also help those who wish to develop guidelines for defining safe medical implants for MRI patients. In addition, our database of the entire set of measurements can help those who wish to validate their numerical models of implants that may be exposed to MRI systems.

  13. Mitochondrial Complex 1 Activity Measured by Spectrophotometry Is Reduced across All Brain Regions in Ageing and More Specifically in Neurodegeneration.

    Science.gov (United States)

    Pollard, Amelia Kate; Craig, Emma Louise; Chakrabarti, Lisa

    2016-01-01

    Mitochondrial function, in particular complex 1 of the electron transport chain (ETC), has been shown to decrease during normal ageing and in neurodegenerative disease. However, there is some debate concerning which area of the brain has the greatest complex 1 activity. It is important to identify the pattern of activity in order to be able to gauge the effect of age- or disease-related changes. We determined complex 1 activity spectrophotometrically in the cortex, brainstem and cerebellum of middle-aged mice (70-71 weeks), a cerebellar ataxic neurodegeneration model (pcd5J) and young wild type controls. We share our updated protocol for measuring complex 1 activity and find that robust activity can be measured in mitochondrial fractions isolated from frozen tissues. We show that complex 1 activity is clearly highest in the cortex when compared with the brainstem and cerebellum (p<0.003). Cerebellum and brainstem mitochondria exhibit similar levels of complex 1 activity in wild type brains. In the aged brain we see similar levels of complex 1 activity in all three brain regions. The specific activity of complex 1 measured in the aged cortex is significantly decreased when compared with controls (p<0.0001). Both the cerebellum and brainstem mitochondria also show significantly reduced activity with ageing (p<0.05). The mouse model of ataxia predictably has lower complex 1 activity in the cerebellum, and although reductions are measured in the cortex and brainstem, the remaining activity is higher than in the aged brains. We present clear evidence that complex 1 activity decreases across the brain with age, and much more specifically in the cerebellum of the pcd5J mouse. Mitochondrial impairment can be a region-specific phenomenon in disease, but in ageing it appears to affect the entire brain, abolishing the pattern of higher activity in cortical regions.

  14. Get a grip on chaos: Tailored measures for complex systems on surfaces

    Science.gov (United States)

    Firle, Sascha Oliver

    Complex systems are ubiquitous in physics, biology and mathematics. This thesis is concerned with describing and understanding complex systems. Some new concepts about how large systems can be viewed in a lower dimensional framework are proposed. The systems presented are examples from ecology and chemistry; in both cases we have a large number of interacting units. The predator-prey system investigated consists of ground beetles, Pterostichus cupreus L. (Coleoptera: Carabidae), feeding on bird cherry-oat aphids. The beetles' movement can consistently be described by a combined model of surface diffusion and biased random walk. This allows conclusions about how fast and in which fashion the beetle covers its habitat. Movement depends on aphid densities, and predation in turn modifies aphid distributions locally. The presented generalized functional response theory describes predation rates in the presence of spatial heterogeneity. A single measure for fragmentation captures all essential features of the prey aggregation and allows the estimation of outbreak densities and distributions. The chemical example is the catalytic oxidation of CO on a Pt(110) single crystal surface. Unstable periodic orbits reconstructed from experimental data are used to reveal the topology of the attractor underlying the time series dynamics. The braid found supports an orbit which implies that the time series is chaotic. The system is simulated numerically by a set of partial differential equations for surface coverage in one space dimension. The bifurcation diagram of the corresponding traveling wave ODE reveals the homoclinic and heteroclinic orbits that organize the phase space and mediate the transition to chaos. Studies in the PDE framework relate this to the stability and interaction of pulse-like solutions.

  15. Soil Temperature Variability in Complex Terrain measured using Fiber-Optic Distributed Temperature Sensing

    Science.gov (United States)

    Seyfried, M. S.; Link, T. E.

    2013-12-01

    Soil temperature (Ts) exerts critical environmental controls on hydrologic and biogeochemical processes. Rates of carbon cycling, mineral weathering, infiltration and snow melt are all influenced by Ts. Although broadly reflective of the climate, Ts is sensitive to local variations in cover (vegetation, litter, snow), topography (slope, aspect, position), and soil properties (texture, water content), resulting in a spatially and temporally complex distribution of Ts across the landscape. Understanding and quantifying the processes controlled by Ts requires an understanding of that distribution. Relatively few spatially distributed field Ts data exist, partly because traditional Ts data are point measurements. A relatively new technology, fiber-optic distributed temperature sensing (FO-DTS), has the potential to provide such data but has not been rigorously evaluated in the context of remote, long-term field research. We installed FO-DTS in a small experimental watershed in the Reynolds Creek Experimental Watershed (RCEW) in the Owyhee Mountains of SW Idaho. The watershed is characterized by complex terrain and a seasonal snow cover. Our objectives are: (i) to evaluate the applicability of fiber-optic DTS to remote field environments and (ii) to describe the spatial and temporal variability of soil temperature in complex terrain influenced by a variable snow cover. We installed fiber-optic cable at a depth of 10 cm in contrasting snow accumulation and topographic environments and monitored temperature along 750 m with DTS. We found that the DTS can provide accurate Ts data (±0.4°C) that resolves Ts changes of about 0.03°C at a spatial scale of 1 m with occasional calibration, under conditions with an ambient temperature range of 50°C. We note that there are site-specific limitations related to cable installation and destruction by local fauna. The FO-DTS provides unique insight into the spatial and temporal variability of Ts in a landscape. We found strong seasonal

  16. Complex Correlation Measure: a novel descriptor for Poincaré plot

    Directory of Open Access Journals (Sweden)

    Gubbi Jayavardhana

    2009-08-01

    Full Text Available Abstract Background The Poincaré plot is one of the important techniques used for visually representing heart rate variability. It is valuable due to its ability to display nonlinear aspects of the data sequence. However, the problem lies in capturing temporal information of the plot quantitatively. The standard descriptors used in quantifying the Poincaré plot (SD1, SD2) measure the gross variability of the time series data. Determining advanced methods for capturing temporal properties poses a significant challenge. In this paper, we propose a novel descriptor, the "Complex Correlation Measure (CCM)", to quantify the temporal aspect of the Poincaré plot. In contrast to SD1 and SD2, the CCM incorporates point-to-point variation of the signal. Methods First, we derived expressions for CCM. Then the sensitivity of the descriptors was shown by measuring all descriptors before and after surrogation of the signal. For each case study, lag-1 Poincaré plots were constructed for three groups of subjects (Arrhythmia, Congestive Heart Failure (CHF), and Normal Sinus Rhythm (NSR)), and the new measure CCM was computed along with SD1 and SD2. ANOVA was used to define the level of significance of the mean and variance of SD1, SD2 and CCM for the different groups of subjects. Results CCM is defined based on the autocorrelation at different lags of the time series, hence giving an in-depth measurement of the correlation structure of the Poincaré plot. A surrogate analysis was performed, and the sensitivity of the proposed descriptor was found to be higher than that of the standard descriptors. Two case studies were conducted for distinguishing arrhythmia and congestive heart failure (CHF) subjects from those with NSR, using the Physionet database, and demonstrated the usefulness of the proposed descriptors in biomedical applications. CCM was found to be a more significant (p = 6.28E-18) parameter than SD1 and SD2 in discriminating
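
    The standard descriptors can be written down directly: with x_n = RR_n and y_n = RR_{n+1}, SD1² = Var(x − y)/2 (short-term, perpendicular to the line of identity) and SD2² = Var(x + y)/2 (long-term, along it). A minimal sketch on synthetic RR intervals; CCM itself requires the paper's full definition and is not reproduced here:

```python
import numpy as np

# Standard Poincare-plot descriptors for an RR-interval series:
#   SD1: dispersion perpendicular to the line of identity (short-term)
#   SD2: dispersion along the line of identity (long-term)
# CCM (the paper's proposed measure) additionally uses point-to-point
# temporal structure and is not reproduced in this sketch.
def poincare_sd1_sd2(rr):
    x, y = rr[:-1], rr[1:]                  # lag-1 Poincare plot
    sd1 = np.sqrt(np.var(x - y) / 2.0)
    sd2 = np.sqrt(np.var(x + y) / 2.0)
    return sd1, sd2

rng = np.random.default_rng(1)
rr = 800 + rng.normal(0, 30, size=1000)     # synthetic RR intervals in ms
sd1, sd2 = poincare_sd1_sd2(rr)
print(sd1, sd2)
```

    For uncorrelated intervals, as in this synthetic series, SD1 and SD2 both approach the sample standard deviation, which is exactly why temporally sensitive descriptors such as CCM are needed.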

  17. Radiometric Measurements of the Thermal Conductivity of Complex Planetary-like Materials

    Science.gov (United States)

    Piqueux, S.; Christensen, P. R.

    2012-12-01

    Planetary surface temperatures and thermal inertias are controlled by the physical and compositional characteristics of the surface layer material, which result from current and past geological activity. For this reason, temperature measurements are often acquired because they provide fundamental constraints on geological history and habitability. Examples of regolith properties affecting surface temperatures and inertias are: grain sizes and mixture ratios, solid composition in the case of ices, presence of cement between grains, regolith porosity, grain roughness, material layering, etc. Other important factors include volatile phase changes and endogenic or exogenic heat sources (e.g. geothermal heat flow, impact-related heat, biological activity). In the case of Mars, the multitude of instruments observing the surface temperature at different spatial and temporal resolutions (e.g. IRTM, Thermoskan, TES, MiniTES, THEMIS, MCS, REMS) in conjunction with other instruments allows us to probe and characterize the thermal properties of the surface layer with unprecedented resolution. While the derivation of thermal inertia values from temperature measurements is routinely performed by well-established planetary regolith numerical models, constraining the physical properties of the surface layer from thermal inertia values requires the additional step of laboratory measurements. The density and specific heat are usually constant and sufficiently well known for common geological materials, but the bulk thermal conductivity is highly variable as a function of the physical characteristics of the regolith. Most laboratory designs do not allow an investigation of the thermal conductivity of complex regolith configurations similar to those observed on planetary surfaces (e.g. cemented material, large grains, layered material, and temperature effects) because the samples are too small and need to be soft to insert heating or measuring devices. For this

  18. Binary, ternary and quaternary liquid-liquid equilibria in 1-butanol, oleic acid, water and n-heptane mixtures

    NARCIS (Netherlands)

    Winkelman, J. G. M.; Kraai, G. N.; Heeres, H. J.

    2009-01-01

    This work reports on liquid-liquid equilibria in the system 1-butanol, oleic acid, water and n-heptane used for biphasic, lipase catalysed esterifications. The literature was studied on the mutual solubility in binary systems of water and each of the organic components. Experimental results were

  19. Complex Contact Angles Calculated from Capillary Rise Measurements on Rock Fracture Faces

    Science.gov (United States)

    Perfect, E.; Gates, C. H.; Brabazon, J. W.; Santodonato, L. J.; Dhiman, I.; Bilheux, H.; Bilheux, J. C.; Lokitz, B. S.

    2017-12-01

    Contact angles for fluids in unconventional reservoir rocks are needed for modeling hydraulic fracturing leakoff and subsequent oil and gas extraction. Contact angle measurements for wetting fluids on rocks are normally performed using polished flat surfaces. However, such prepared surfaces are not representative of natural rock fracture faces, which have been shown to be rough over multiple scales. We applied a variant of the Wilhelmy plate method for determining contact angle from the height of capillary rise on a vertical surface to the wetting of rock fracture faces by water in the presence of air. Cylindrical core samples (5.05 cm long × 2.54 cm diameter) of Mancos shale and 6 other rock types were investigated. Mode I fractures were created within the cores using the Brazilian method. Each fractured core was then separated into halves exposing the fracture faces. One fracture face from each rock type was oriented parallel to a collimated neutron beam in the CG-1D imaging instrument at ORNL's High Flux Isotope Reactor. Neutron radiography was performed using the multi-channel plate detector with a spatial resolution of 50 μm. Images were acquired every 60 s after a water reservoir contacted the base of the fracture face. The images were normalized to the initial dry condition so that the upward movement of water on the fracture face was clearly visible. The height of wetting at equilibrium was measured on the normalized images using ImageJ. Contact angles were also measured on polished flat surfaces using the conventional sessile drop method. Equilibrium capillary rise on the exposed fracture faces was up to 8.5 times greater than that predicted for polished flat surfaces from the sessile drop measurements. These results indicate that rock fracture faces are hyperhydrophilic (i.e., the height of capillary rise is greater than that predicted for a contact angle of zero degrees). The use of complex numbers permitted calculation of imaginary contact angles for
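
    One way such complex angles arise: for capillary rise on a vertical plate, h = sqrt(2γ(1 − sin θ)/(ρg)), so an observed rise exceeding the θ = 0 maximum forces sin θ outside [−1, 1] and the inverse sine becomes complex. A sketch under that assumption (the paper's exact formulation may differ):

```python
import cmath
import math

# Capillary rise on a vertical plate: h = sqrt(2*gamma*(1 - sin(theta))/(rho*g)).
# Inverting for theta when the observed rise exceeds the theta = 0 limit
# yields sin(theta) outside [-1, 1], i.e. a complex contact angle -- one
# plausible reading of "hyperhydrophilic" fracture faces (assumed model).
gamma = 0.0728   # surface tension of water, N/m (20 C)
rho = 998.0      # density of water, kg/m3
g = 9.81         # gravitational acceleration, m/s2

h_max = math.sqrt(2 * gamma / (rho * g))   # maximum rise for theta = 0

def contact_angle(h):
    s = 1 - rho * g * h**2 / (2 * gamma)   # sin(theta) implied by rise h
    return cmath.asin(s)                   # complex when |s| > 1

print(h_max)                       # about 3.9 mm for water
print(contact_angle(0.5 * h_max))  # real angle: rise below the limit
print(contact_angle(2.0 * h_max))  # complex angle: rise above the limit
```

    On this reading, the imaginary part quantifies how far the observed rise exceeds what any real contact angle could produce.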

  20. Measuring microscopic evolution processes of complex networks based on empirical data

    International Nuclear Information System (INIS)

    Chi, Liping

    2015-01-01

    Aiming at understanding the microscopic mechanisms of complex systems in the real world, we perform measurements that characterize the evolution properties of two empirical data sets. In the Autonomous Systems Internet data, the network size keeps growing although the system suffers a high rate of node deletion (r = 0.4) and link deletion (q = 0.81). However, the average degree remains almost unchanged over the whole time range. At each time step the external links attached to a new node number about c = 1.1 and the internal links added between existing nodes approximately m = 8. The Scientific Collaboration data are a cumulative record of all the authors from 1893 up to the considered year. There is no deletion of nodes and links, r = q = 0. The external and internal links at each time step are c = 1.04 and m = 0, correspondingly. The exponents γ_data of the degree distribution p(k) ∼ k^(-γ) of these two empirical datasets are in good agreement with the theoretically obtained values γ_theory. The results indicate that these evolution quantities may provide insight into capturing the microscopic dynamical processes that govern the network topology. (paper)
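
    Estimating such exponents can be illustrated with the standard maximum-likelihood estimator for a power-law tail, γ = 1 + n / Σ ln(k_i / k_min) (Clauset-style; a hypothetical stand-in, since the paper does not state its fitting procedure):

```python
import numpy as np

# Continuous maximum-likelihood estimate of a power-law exponent gamma for
# degrees k >= k_min, p(k) ~ k^(-gamma):  gamma = 1 + n / sum(ln(k_i/k_min)).
# Illustrative stand-in for whatever fitting the paper actually used.
def powerlaw_mle(k, k_min=1.0):
    k = np.asarray(k, dtype=float)
    k = k[k >= k_min]
    return 1.0 + k.size / np.log(k / k_min).sum()

# Generate synthetic power-law samples with gamma = 2.5 via inverse CDF.
rng = np.random.default_rng(2)
gamma_true = 2.5
u = rng.random(200_000)
k = (1 - u) ** (-1.0 / (gamma_true - 1.0))   # Pareto samples, k_min = 1

print(powerlaw_mle(k))
```

    The MLE avoids the well-known bias of fitting a straight line to a log-log histogram of p(k).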

  1. Impact of automation: Measurement of performance, workload and behaviour in a complex control environment.

    Science.gov (United States)

    Balfe, Nora; Sharples, Sarah; Wilson, John R

    2015-03-01

    This paper describes an experiment that was undertaken to compare three levels of automation in rail signalling: a high level in which an automated agent set routes for trains using timetable information, a medium level in which trains were routed along pre-defined paths, and a low level where the operator (signaller) was responsible for the movement of all trains. These levels are described in terms of a Rail Automation Model based on previous automation theory (Parasuraman et al., 2000). Performance, subjective workload, and signaller activity were measured for each level of automation running under both normal operating conditions and abnormal, or disrupted, conditions. The results indicate that perceived workload, during both normal and disrupted phases of the experiment, decreased as the level of automation increased, and performance was most consistent (i.e. showed the least variation between participants) with the highest level of automation. The results make a strong case in favour of automation, particularly in terms of demonstrating the potential for automation to reduce workload, but also suggest that much benefit can be achieved from a mid-level of automation, potentially at lower cost and complexity. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  2. Tailored complex degree of mutual coherence for plane-of-interest interferometry with reduced measurement uncertainty

    Science.gov (United States)

    Fütterer, G.

    2017-10-01

    A problem of interferometers is the elimination of parasitic reflections. Parasitic reflections and modulated intensity signals, which are not related to the reference surface (REF) or the surface under test (SUT) in a direct way, can increase the measurement uncertainty significantly. In some situations standard methods might be used in order to eliminate reflections from the backside of the optical element under test. For instance, matching the test object to an absorber, while taking the complex refractive index into account, can cancel out back reflections completely. This causes additional setup time and chemical contamination. In some situations an angular offset might be combined with an aperture stop. This reduces spatial resolution, and it does not work if the disturbing wave field propagates in the same direction as the wave field that propagates from the SUT. However, a stack of surfaces is a problem. An increased spectral bandwidth might be used in order to obtain a separation of the plane-of-interest from other planes. Depending on the interferometer used, this might require an optical path difference of zero, or it might cause a reduction of the visibility V. An embodiment of a modified interferometer will be discussed.
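
    The bandwidth approach works because the coherence length shrinks as L_c ≈ λ²/Δλ: only the plane-of-interest, where the optical path difference is near zero, still produces high-visibility fringes. A quick numeric sketch (illustrative wavelength and bandwidths):

```python
# Coherence length L_c ~ lambda^2 / (spectral bandwidth) controls which
# surface reflections still interfere: planes whose optical path difference
# exceeds L_c contribute negligible fringe visibility.
def coherence_length(wavelength_m, bandwidth_m):
    return wavelength_m ** 2 / bandwidth_m

lam = 633e-9                         # HeNe-like wavelength (illustrative)
for dlam in (1e-12, 1e-10, 1e-8):    # 1 pm, 0.1 nm, 10 nm bandwidth
    print(dlam, coherence_length(lam, dlam))
```

    Going from picometer to nanometer bandwidth shrinks the coherence length from tens of centimeters to tens of micrometers, which is what isolates a single plane in a stack of surfaces.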

  3. Method for Determining the Activation Energy Distribution Function of Complex Reactions by Sieving and Thermogravimetric Measurements.

    Science.gov (United States)

    Bufalo, Gennaro; Ambrosone, Luigi

    2016-01-14

    A method for studying the kinetics of thermal degradation of complex compounds is suggested. Although the method is applicable to any matrix whose grain size can be measured, herein we focus our investigation on thermogravimetric analysis, under a nitrogen atmosphere, of ground soft wheat and ground maize. The thermogravimetric curves reveal two well-distinct jumps of mass loss. They correspond to volatilization, in the temperature range 298-433 K, and decomposition, from 450 to 1073 K. Thermal degradation is schematized as a reaction in the solid state whose kinetics is analyzed separately in each of the two regions. By means of a sieving analysis, different size fractions of the material are separated and studied. A quasi-Newton fitting algorithm is used to obtain the grain size distribution as the best fit to the experimental data. The individual fractions are analyzed thermogravimetrically to derive the functional relationship between the activation energy of the degradation reactions and the particle size. This functional relationship turns out to be crucial for evaluating the moments of the activation energy distribution, which is otherwise unknown, in terms of the distribution calculated by sieve analysis. From the knowledge of the moments one can reconstruct the reaction conversion. The method is applied first to the volatilization region, then to the decomposition region. The comparison with the experimental data reveals that the method reproduces the experimental conversion with an accuracy of 5-10% in the volatilization region and of 3-5% in the decomposition region.
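
    The moment evaluation amounts to weighting each sieve fraction's activation energy by its mass fraction. A minimal sketch with hypothetical numbers (not from the paper):

```python
import numpy as np

# Sketch: moments of the activation-energy distribution obtained by weighting
# each sieve fraction's activation energy E_i by its mass fraction w_i.
# All numbers below are hypothetical, purely to show the computation.
w = np.array([0.15, 0.35, 0.30, 0.20])       # mass fractions (sum to 1)
E = np.array([110.0, 125.0, 140.0, 160.0])   # kJ/mol per size fraction

mean_E = np.sum(w * E)                        # first moment
var_E = np.sum(w * (E - mean_E) ** 2)         # second central moment
print(mean_E, var_E)
```

    Higher moments follow the same pattern, and from them the reaction conversion can be reconstructed as described above.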

  4. Continuity of care in mental health: understanding and measuring a complex phenomenon.

    Science.gov (United States)

    Burns, T; Catty, J; White, S; Clement, S; Ellis, G; Jones, I R; Lissouba, P; McLaren, S; Rose, D; Wykes, T

    2009-02-01

    Continuity of care is considered by patients and clinicians to be an essential feature of good quality care in long-term disorders, yet there is general agreement that it is a complex concept. Most policies emphasize it and encourage systems to promote it. Despite this, there is no accepted definition or measure against which to test policies or interventions designed to improve continuity. We aimed to operationalize a multi-axial model of continuity of care and to use factor analysis to determine its validity for severe mental illness. A multi-axial model of continuity of care comprising eight facets was operationalized for quantitative data collection from mental health service users using 32 variables. Of these variables, 22 were subsequently entered into a factor analysis as independent components, using data from a clinical population considered to require long-term consistent care. Factor analysis produced seven independent continuity factors accounting for 62.5% of the total variance. These factors (Experience and Relationship, Regularity, Meeting Needs, Consolidation, Managed Transitions, Care Coordination, and Supported Living) were close but not identical to the original theoretical model. We confirmed that continuity of care is multi-factorial. Our seven factors are intuitively meaningful and appear applicable in mental health. These factors should be used as a starting point in research into the determinants and outcomes of continuity of care in long-term disorders.
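The factor-analytic step described here (22 observed variables reduced to 7 latent factors) can be sketched with scikit-learn. The data below are randomly generated stand-ins, not the study's variables; only the dimensions (22 variables, 7 factors) follow the abstract.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Stand-in data: 180 service users x 22 continuity variables.
latent = rng.normal(size=(180, 7))            # 7 underlying factors
loadings = rng.normal(size=(7, 22)) * 0.8
X = latent @ loadings + rng.normal(size=(180, 22)) * 0.5

fa = FactorAnalysis(n_components=7, rotation="varimax")
scores = fa.fit_transform(X)                  # per-respondent factor scores
print(scores.shape, fa.components_.shape)     # (180, 7) (7, 22)
```

Inspecting `fa.components_` (the loadings of each variable on each factor) is how factors such as "Regularity" or "Meeting Needs" would be named and interpreted.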

  5. PIV measurement of the complex and transient cross-flow over a circular cylinder

    International Nuclear Information System (INIS)

    Kuwabara, Joji; Someya, Satoshi; Okamoto, Koji

    2007-01-01

    This paper describes measurements of the complex and transient cross-flow over a circular cylinder using dynamic (time-resolved) PIV (particle image velocimetry) techniques. The experiment was carried out in a water flow tunnel with a working section of 50x50 mm, at Reynolds numbers from 6.7 x 10^3 to 2.7 x 10^4. The circular cylinder was made of MEXFLON resin; one end is rigidly supported and the other is free. MEXFLON is a fluorine resin whose refractive index is almost the same as that of water, with high transparency. The very high speed water flow in the test section was clearly visualized and captured by a high-speed camera. The fluctuations of the flow structure are also clearly resolved with high spatial and temporal resolution, 512x512 pixels at 10,000 fps, which corresponds to installing thousands of LDV measurement points across the test section. Consequently, we found asynchronous vibrations in the directions parallel and perpendicular to the main flow. (author)
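The core of a PIV evaluation, not detailed in the abstract, is cross-correlating small interrogation windows between successive frames to find the particle displacement. A minimal FFT-based sketch with a synthetic shifted image:

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer-pixel displacement between two interrogation windows
    via FFT-based circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped indices to signed shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)  # (dy, dx)

# Synthetic particle image shifted by (3, -2) pixels.
rng = np.random.default_rng(1)
frame = rng.random((32, 32))
shifted = np.roll(frame, (3, -2), axis=(0, 1))
print(piv_displacement(frame, shifted))  # -> (3, -2)
```

A real time-resolved PIV pipeline repeats this over a grid of windows per frame pair and adds sub-pixel peak interpolation and outlier validation.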

  6. Investigating the Appropriateness of the TACOM Measure: Application to the Complexity of Proceduralized Tasks for High Speed Train Drivers

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea; Ko, Jong Hyun

    2010-01-01

    According to widespread experience in many industries, a procedure is one of the most effective countermeasures to reduce the possibility of human-related problems. Unfortunately, systematic frameworks to evaluate the complexity of procedural tasks are very scarce. For this reason, the TACOM measure, which can quantify the complexity of procedural tasks, has been developed. In this study, the appropriateness of the TACOM measure is investigated by comparing TACOM scores for the procedural tasks of high speed train drivers with the associated workload scores measured by the NASA-TLX technique. As a result, a meaningful correlation between the TACOM scores and the associated NASA-TLX scores is observed. It is therefore expected that the TACOM measure can properly quantify the complexity of procedural tasks.

  7. Investigating the Appropriateness of the TACOM Measure: Application to the Complexity of Proceduralized Tasks for High Speed Train Drivers

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ko, Jong Hyun [Nuclear Engineering and Technology Institute, Daejeon (Korea, Republic of)

    2010-02-15

    According to widespread experience in many industries, a procedure is one of the most effective countermeasures to reduce the possibility of human-related problems. Unfortunately, systematic frameworks to evaluate the complexity of procedural tasks are very scarce. For this reason, the TACOM measure, which can quantify the complexity of procedural tasks, has been developed. In this study, the appropriateness of the TACOM measure is investigated by comparing TACOM scores for the procedural tasks of high speed train drivers with the associated workload scores measured by the NASA-TLX technique. As a result, a meaningful correlation between the TACOM scores and the associated NASA-TLX scores is observed. It is therefore expected that the TACOM measure can properly quantify the complexity of procedural tasks.
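The validation step in this record is a correlation between two per-task score sets. A minimal sketch with hypothetical scores (the study's actual TACOM and NASA-TLX values are not reproduced here):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical scores for 8 procedural tasks, for illustration only.
tacom = np.array([2.1, 2.8, 3.0, 3.4, 3.9, 4.2, 4.8, 5.1])
nasa_tlx = np.array([25, 31, 38, 35, 47, 52, 58, 63])

r, p = pearsonr(tacom, nasa_tlx)
print(f"r = {r:.2f}, p = {p:.4f}")
```

A high r with a small p would support the claim that TACOM scores track subjectively reported workload.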

  8. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    Science.gov (United States)

    Patlak, J B

    1993-07-01

    The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low-variance regions. The total number of events in each low-variance region is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell-time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; and 2) to quantify the dwell-time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open channel noise and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean...
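The histogram construction described above (a sliding N-sample window producing mean-variance pairs) can be sketched directly. The trace below is a synthetic two-level record, not real patch-clamp data:

```python
import numpy as np

def mean_variance_pairs(trace, window):
    """Slide a window of N samples over a digitized current trace and
    return the (mean, variance) pair at every window position."""
    # sliding_window_view avoids recomputing each overlapping window.
    wins = np.lib.stride_tricks.sliding_window_view(trace, window)
    return wins.mean(axis=1), wins.var(axis=1)

# Toy trace: closed level (0 pA) and an open level (-2 pA) with noise.
rng = np.random.default_rng(2)
levels = np.repeat([0.0, -2.0, 0.0], 500)
trace = levels + rng.normal(scale=0.2, size=levels.size)

m, v = mean_variance_pairs(trace, window=10)
hist2d, m_edges, v_edges = np.histogram2d(m, v, bins=50)
print(hist2d.sum())  # one count per window position
```

In the resulting 2D histogram, the defined current levels show up as clusters at low variance near 0 and -2 pA, while windows spanning a transition land at high variance.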

  9. Implementing digital holograms to create and measure complex-plane optical fields

    CSIR Research Space (South Africa)

    Dudley, Angela L

    2016-02-01

    Full Text Available The coherent superposition of a Gaussian beam with an optical vortex can be mathematically described to occupy the complex plane. The authors provide a simple analogy between the mathematics, in the form of the complex plane, and the visual...

  10. Comparison of surface extraction techniques performance in computed tomography for 3D complex micro-geometry dimensional measurements

    DEFF Research Database (Denmark)

    Torralba, Marta; Jiménez, Roberto; Yagüe-Fabra, José A.

    2018-01-01

    The number of industrial applications of computed tomography (CT) for dimensional metrology in the 10^0-10^3 mm range has been continuously increasing, especially in the last years. Due to its specific characteristics, CT has the potential to be employed as a viable solution for measuring 3D complex micro-geometries as well (i.e., in the sub-mm dimensional range). However, there are different factors that may influence the CT process performance, one of them being the surface extraction technique used. In this paper, two different extraction techniques are applied to measure a complex miniaturized dental file by CT in order to analyze their contribution to the final measurement uncertainty in complex geometries at the mm to sub-mm scales. The first method is based on a similarity analysis (threshold determination), while the second one is based on a gradient or discontinuity analysis (the 3D...

  11. Water Accounting Plus (WA+) - a water accounting procedure for complex river basins based on satellite measurements

    Science.gov (United States)

    Karimi, P.; Bastiaanssen, W. G. M.; Molden, D.

    2012-11-01

    Coping with the issue of water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use, and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links hydrological flows to user groups of water and their benefits. The absence of a standard hydrological and water management summary causes confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for the lack of operational water accounting systems for river basins. In this paper we introduce Water Accounting Plus (WA+), a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use on the water cycle is described explicitly by defining land use groups with common characteristics. Analogous to financial accounting, WA+ presents four sheets: (i) a resource base sheet, (ii) a consumption sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarize the overall water resources situation. The impact of external influences (e.g. climate change) and internal influences (e.g. infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used for three out of the four sheets, but they are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  12. Complex bounds and microstructural recovery from measurements of sea ice permittivity

    International Nuclear Information System (INIS)

    Gully, A.; Backstrom, L.G.E.; Eicken, H.; Golden, K.M.

    2007-01-01

    Sea ice is a porous composite of pure ice with brine, air, and salt inclusions. The polar sea ice packs play a key role in the earth's ocean-climate system, and they host robust algal and bacterial communities that support the Arctic and Antarctic ecosystems. Monitoring the sea ice packs on global or regional scales is an increasingly important problem, typically involving the interaction of an electromagnetic wave with sea ice. In the quasistatic regime where the wavelength is much longer than the composite microstructural scale, the electromagnetic behavior is characterized by the effective complex permittivity tensor ε*. In assessing the impact of climate change on the polar sea ice covers, current satellites and algorithms can predict ice extent, but the thickness distribution remains an elusive, yet most important feature. In recent years, electromagnetic induction devices using low frequency waves have been deployed on ships, helicopters and planes to obtain thickness data. Here we compare two sets of theoretical bounds to extensive outdoor tank and in situ field data on ε* at 50MHz taken in the Arctic and Antarctic. The sea ice is assumed to be a two phase composite of ice and brine with known constituent permittivities. The first set of bounds assumes only knowledge of the brine volume fraction or porosity, and the second set further assumes statistical isotropy of the microstructure. We obtain excellent agreement between theory and experiment, and are able to observe the apparent violation of the isotropic bounds as the vertically oriented microstructure becomes increasingly connected for higher porosities. Moreover, these bounds are inverted to obtain estimates of the porosity from the measurements of ε*. We find that the temporal variations of the reconstructed porosity, which is directly related to temperature, closely follow the actual behavior
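The first set of bounds mentioned above, which assumes knowledge of the brine volume fraction only, corresponds to the classical arithmetic/harmonic-mean (Wiener) bounds for a two-phase composite. A numeric sketch with illustrative constituent permittivities (the actual 50 MHz values for ice and brine are not reproduced from the record):

```python
import numpy as np

def wiener_bounds(eps1, eps2, f1):
    """Arithmetic- and harmonic-mean bounds on the effective complex
    permittivity of a two-phase composite, given only the volume
    fraction f1 of phase 1."""
    f2 = 1.0 - f1
    upper = f1 * eps1 + f2 * eps2          # parallel slabs (arithmetic mean)
    lower = 1.0 / (f1 / eps1 + f2 / eps2)  # series slabs (harmonic mean)
    return lower, upper

# Illustrative values only, for demonstration.
eps_ice = 3.1 + 0.001j
eps_brine = 40.0 + 60.0j
lo_b, up_b = wiener_bounds(eps_brine, eps_ice, f1=0.05)  # 5% brine
print(lo_b, up_b)
```

In the complex permittivity plane the admissible region is actually a lens-shaped area bounded by circular arcs through these two endpoints; the tighter second set of bounds additionally assumes statistical isotropy of the microstructure.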

  13. Characterization of Nuclear Materials Using Complex of Non-Destructive and Mass-Spectroscopy Methods of Measurements

    International Nuclear Information System (INIS)

    Gorbunova, A.; Kramchaninov, A.

    2015-01-01

    The Information and Analytical Centre for nuclear materials investigations was established in the Russian Federation on February 2, 2009 by the ROSATOM State Atomic Energy Corporation (order #80). Its purpose is to prevent unauthorized access to nuclear materials and to exclude their illicit traffic. The Information and Analytical Centre includes an analytical laboratory that determines the composition and properties of nuclear materials of unknown origin for their identification. According to its Regulation, the Centre deals with: · identification of nuclear materials of unknown origin to provide information about their composition and properties; · arbitration analyses of nuclear materials; · comprehensive research on nuclear and radioactive materials for developing materials characterization techniques; · interlaboratory measurements; · measurements for control and accounting; · confirmatory measurements. A complex of non-destructive and mass-spectroscopy techniques was developed for the measurements. The complex consists of: · gamma-ray techniques based on the MGAU, MGA and FRAM codes for uranium and plutonium isotopic composition; · a gravimetric technique, supplemented by gamma-spectroscopy, for uranium content; · a calorimetric technique for plutonium mass; · a neutron multiplicity technique for plutonium mass; · a mass-spectroscopy technique for uranium isotopic composition; · a mass-spectroscopy technique for metallic impurities. The complex satisfies the state regulation requirements for ensuring the uniformity of measurements, including the Russian Federation Federal Law on Ensuring the Uniformity of Measurements #102-FZ, Interstate Standard GOST R ISO/IEC 17025-2006, National Standards of the Russian Federation GOST R 8.563-2009 and GOST R 8.703-2010, and Federal Regulations NRB-99/2009 and OSPORB 99/2010. The created complex is provided with reference materials, equipment and certified techniques. The complex is included in accredited...

  14. An approach to measuring adolescents' perception of complexity for pictures of fruit and vegetable mixes

    DEFF Research Database (Denmark)

    Mielby, Line Holler; Bennedbæk-Jensen, Sidsel; Edelenbos, Merete

    2013-01-01

    ...adolescents' perception of complexity of pictures of fruit and vegetable mixes. A sensory panel evaluated 10 descriptive attributes, including simplicity and complexity, for 24 pictures of fruit and vegetable mixes. The descriptive analysis found a strong inverse correlation between complexity and simplicity... An adolescent consumer group (n = 242) and an adult consumer group (n = 86) subsequently rated the pictures on simplicity and attractiveness. Pearson's correlation coefficients revealed strong correlations between the sensory panel's and both consumer groups' usage of simplicity. This suggests that simplicity can...

  15. A novel approach for rapidly and cost-effectively assessing toxicity of toxic metals in acidic water using an acidophilic iron-oxidizing biosensor.

    Science.gov (United States)

    Yang, Shih-Hung; Cheng, Kuo-Chih; Liao, Vivian Hsiu-Chuan

    2017-11-01

    Contamination by heavy metals and metalloids is a serious environmental and health concern. Acidic wastewaters are often associated with toxic metals which may enter and spread into agricultural soils. Several biological assays have been developed to detect toxic metals; however, most of them can only detect toxic metals at a neutral pH, not in an acidic environment. In this study, an acidophilic iron-oxidizing bacterium (IOB), Strain Y10, was isolated, characterized, and used to detect toxic metal toxicity in acidic water at pH 2.5. The colorimetric acidophilic IOB biosensor was based on the inhibition of the iron-oxidizing ability of Strain Y10 by metal toxicity. Our results showed that Strain Y10 is an acidophilic iron-oxidizing bacterium. Thiobacillus caldus medium (TCM) (pH 2.5) supplied with both S4O6^2- and glucose was the optimum growth medium for Strain Y10. The optimum temperature and pH for the growth of Strain Y10 were 45 °C and pH 2.5, respectively. Our study demonstrates that the color-based acidophilic IOB biosensor can be semi-quantitatively read by eye or quantitatively measured by spectrometer to detect toxicity from multiple toxic metals at pH 2.5 within 45 min. Our study shows that monitoring toxic metals in acidic water is possible using the acidophilic IOB biosensor. Our study thus provides a novel approach for rapid and cost-effective detection of toxic metals in acidic conditions that can otherwise compromise current methods of chemical analysis. This method also allows for increased efficiency when screening large numbers of environmental samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. [Design, use and introduction of the practice of measureable complexes for psychophysiological studies].

    Science.gov (United States)

    Bokser, O Ia; Gurtovoĭ, E S

    1997-01-01

    The paper outlines a background of chronoreaction measurement, an important trend of psychophysiological studies. It mainly deals with the chronoreaction measuring methods and tools introduced into production.

  17. Measurement of the speed of sound by observation of the Mach cones in a complex plasma under microgravity conditions

    Energy Technology Data Exchange (ETDEWEB)

    Zhukhovitskii, D. I., E-mail: dmr@ihed.ras.ru; Fortov, V. E.; Molotkov, V. I.; Lipaev, A. M.; Naumkin, V. N. [Joint Institute of High Temperatures, Russian Academy of Sciences, Izhorskaya 13, Bd. 2, 125412 Moscow (Russian Federation); Thomas, H. M. [Research Group Complex Plasma, DLR, Oberpfaffenhofen, 82234 Wessling (Germany); Ivlev, A. V.; Morfill, G. E. [Max-Planck-Institut für extraterrestrische Physik, Giessenbachstrasse, 85748 Garching (Germany); Schwabe, M. [Department of Chemical and Biomolecular Engineering, Graves Lab, D75 Tan Hall, University of California, Berkeley, CA 94720 (United States)

    2015-02-15

    We report the first observation of Mach cones excited by a larger microparticle (projectile) moving through a cloud of smaller microparticles (dust) in a complex plasma with neon as a buffer gas under microgravity conditions. A collective motion of the dust particles occurs as propagation of the contact discontinuity. The corresponding speed of sound was measured by a special method of Mach cone visualization. The measurement results are incompatible with the theory of ion acoustic waves. The estimate for the pressure in a strongly coupled Coulomb system and a scaling law for the complex plasma make it possible to derive a value for the speed of sound, which is in reasonable agreement with the experiments in complex plasmas.

  18. Measurement of the speed of sound by observation of the Mach cones in a complex plasma under microgravity conditions

    International Nuclear Information System (INIS)

    Zhukhovitskii, D. I.; Fortov, V. E.; Molotkov, V. I.; Lipaev, A. M.; Naumkin, V. N.; Thomas, H. M.; Ivlev, A. V.; Morfill, G. E.; Schwabe, M.

    2015-01-01

    We report the first observation of Mach cones excited by a larger microparticle (projectile) moving through a cloud of smaller microparticles (dust) in a complex plasma with neon as a buffer gas under microgravity conditions. A collective motion of the dust particles occurs as propagation of the contact discontinuity. The corresponding speed of sound was measured by a special method of Mach cone visualization. The measurement results are incompatible with the theory of ion acoustic waves. The estimate for the pressure in a strongly coupled Coulomb system and a scaling law for the complex plasma make it possible to derive a value for the speed of sound, which is in reasonable agreement with the experiments in complex plasmas.
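The geometry behind this technique is the Mach relation sin(mu) = c/v, which links the cone half-angle mu, the projectile speed v, and the sound speed c, so measuring the cone angle yields c directly. A one-line sketch with hypothetical numbers (not values from the record):

```python
import math

def sound_speed_from_mach_cone(v_projectile, half_angle_deg):
    """Speed of sound from the Mach relation sin(mu) = c / v."""
    return v_projectile * math.sin(math.radians(half_angle_deg))

# Hypothetical values: projectile at 40 mm/s, cone half-angle 30 degrees.
print(sound_speed_from_mach_cone(40.0, 30.0))  # -> approximately 20.0 mm/s
```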

  19. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    OpenAIRE

    Couach , O.; Balin , I.; Jiménez , R.; Ristori , P.; Perego , S.; Kirchner , F.; Simeonov , V.; Calpini , B.; Van Den Bergh , H.

    2003-01-01

    This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days, during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations ...

  20. Finalizing a measurement framework for the burden of treatment in complex patients with chronic conditions

    Directory of Open Access Journals (Sweden)

    Eton DT

    2015-03-01

    ...% were coping with multiple chronic conditions. A preliminary conceptual framework using data from the first 32 interviews was evaluated and modified using narrative data from 18 additional interviews with a racially and socioeconomically diverse sample of patients. The final framework features three overarching themes with associated subthemes: (1) the work patients must do to care for their health (eg, taking medications, keeping medical appointments, monitoring health); (2) challenges/stressors that exacerbate perceived burden (eg, financial, interpersonal, and provider obstacles); and (3) impacts of burden (eg, role limitations, mental exhaustion). All themes and subthemes were subsequently confirmed in focus groups. Conclusion: The final conceptual framework can be used as a foundation for building a patient self-report measure to systematically study treatment burden for research and analytical purposes, as well as to promote meaningful clinic-based dialogue between patients and providers about the challenges inherent in maintaining complex self-management of health. Keywords: treatment burden, conceptual framework, adherence, questionnaire, self-management, multi-morbidity

  1. Methodological possibilities for using the electron and ion energy balance in thermospheric complex measurements

    International Nuclear Information System (INIS)

    Serafimov, K.B.; Serafimova, M.K.

    1991-01-01

    A combination of ground-based measurements for the determination of basic thermospheric characteristics is proposed. An expression for the energy transport between components of space plasma is also derived and discussed within the framework of the presented methodology, which can be divided into the following major sections: 1) application of ionosonde, absorption measurements, and TEC measurements using Faraday rotation or the differential Doppler effect; 2) ground-based airglow measurements; 3) airglow and plasma satellite measurements. 9 refs

  2. Structure, Dynamics, and Kinetics of Weak Protein-Protein Complexes from NMR Spin Relaxation Measurements of Titrated Solutions

    International Nuclear Information System (INIS)

    Salmon, L.; Licinio, A.; Jensen, M.R.; Blackledge, M.; Ortega Roldan, J.L.; Van Nuland, N.; Lescop, E.

    2011-01-01

    We have recently presented a titration approach for the determination of residual dipolar couplings (RDCs) from experimentally inaccessible complexes. Here, we extend this approach to the measurement of 15N spin relaxation rates and demonstrate that this can provide long-range structural, dynamic, and kinetic information about these elusive systems. (authors)

  3. Life cycle costs measurement of complex systems manufactured by an engineer-to-order company

    NARCIS (Netherlands)

    Öner, K.B.; Franssen, R.; Kiesmüller, G.P.; Houtum, van G.J.J.A.N.; Qui, R.G.; Russell, D.W.; Sullivan, W.G.

    2007-01-01

    Complex technical systems, such as packaging lines, computer networks, and material handling systems, are crucial for the operations of the companies (or institutions) where they are installed. Companies require high availability because their primary processes may halt when these systems are down. High...

  4. Exploring the dynamic and complex integration of sustainability performance measurement into product development

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Morioka, S.; Pigosso, Daniela Cristina Antelmi

    2016-01-01

    In order to deal with the complex and dynamic nature of sustainability integration into the product development process, this research explore the use of a qualitative System Dynamics approach by using the causal loop diagram (CLD) tool. A literature analysis was followed by a case study, aiming ...

  5. Objective measures of renal mass anatomic complexity predict rates of major complications following partial nephrectomy.

    Science.gov (United States)

    Simhan, Jay; Smaldone, Marc C; Tsai, Kevin J; Canter, Daniel J; Li, Tianyu; Kutikov, Alexander; Viterbo, Rosalia; Chen, David Y T; Greenberg, Richard E; Uzzo, Robert G

    2011-10-01

    The association between tumor complexity and postoperative complications after partial nephrectomy (PN) has not been well characterized. We evaluated whether increasing renal tumor complexity, quantitated by nephrometry score (NS), is associated with increased complication rates following PN using the Clavien-Dindo classification system (CCS). We queried our prospectively maintained kidney cancer database for patients undergoing PN from 2007 to 2010 for whom NS was available. All patients underwent PN. Tumors were categorized into low- (NS: 4-6), moderate- (NS: 7-9), and high-complexity (NS: 10-12) lesions. Complication rates within 30 d were graded (CCS: 1-5), stratified as minor (CCS: 1 or 2) or major (CCS: 3-5), and compared between groups. A total of 390 patients (mean age: 58.0 ± 11.9 yr; 66.9% male) undergoing PN (44.6% open, 55.4% robotic) for low- (28%), moderate- (55.6%), and high-complexity (16.4%) tumors (mean tumor size: 3.74 ± 2.4 cm; median: 3.2 cm) from 2007 to 2010 were identified. Tumor size, estimated blood loss, and ischemia time all significantly differed (p ... renal tumors. Copyright © 2011 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  6. In vivo measurements of the triceps surae complex architecture in man: implications for muscle function

    NARCIS (Netherlands)

    Maganaris, C.N.; Baltzopoulos, V.; Sargeant, A.J.

    1998-01-01

    1. The objectives of this study were to (1) quantify experimentally in vivo changes in pennation angle, fibre length and muscle thickness in the triceps surae complex in man in response to changes in ankle position and isometric plantarflexion moment and (2) compare changes in the above muscle

  7. Force and complexity of tongue task training influences behavioral measures of motor learning

    DEFF Research Database (Denmark)

    Kothari, Mohit; Svensson, Peter; Huo, Xueliang

    2012-01-01

    Relearning of motor skills is important in neurorehabilitation. We investigated the improvement of training success during simple tongue protrusion (two force levels) and a more complex tongue-training paradigm using the Tongue Drive System (TDS). We also compared subject-based reports of fun, pain...... training influences behavioral aspects of tongue motor learning....

  8. Stark effect measurements on monomers and trimers of reconstituted light-harvesting complex II of plants

    NARCIS (Netherlands)

    Palacios, M.A.; Caffarri, S.; Bassi, R.; Grondelle, van R.; Amerongen, van H.

    2004-01-01

    The electric-field induced absorption changes (Stark effect) of reconstituted light-harvesting complex II (LHCII) in different oligomerisation states - monomers and trimers - with different xanthophyll content have been probed at 77 K. The Stark spectra of the reconstituted control samples,

  9. Using the Solution Space Diagram in Measuring the Effect of Sector Complexity During Merging Scenarios

    NARCIS (Netherlands)

    Abdul Rahman, S.M.B.; Van Paassen, M.M.; Mulder, M.

    2011-01-01

    When designing Air Traffic Control (ATC) sectors and procedures, traffic complexity and workload are important issues. For predicting ATC workload, metrics based on the Solution Space Diagram (SSD) have been proposed. This paper studies the effect of sector design on workload and SSD metrics. When

  10. Probing Nuclear Spin Effects on Electronic Spin Coherence via EPR Measurements of Vanadium(IV) Complexes.

    Science.gov (United States)

    Graham, Michael J; Krzyaniak, Matthew D; Wasielewski, Michael R; Freedman, Danna E

    2017-07-17

    Quantum information processing (QIP) has the potential to transform numerous fields from cryptography, to finance, to the simulation of quantum systems. A promising implementation of QIP employs unpaired electronic spins as qubits, the fundamental units of information. Though molecular electronic spins offer many advantages, including chemical tunability and facile addressability, the development of design principles for the synthesis of complexes that exhibit long qubit superposition lifetimes (also known as coherence times, or T2) remains a challenge. As nuclear spins in the local qubit environment are a primary cause of shortened superposition lifetimes, we recently conducted a study which employed a modular spin-free ligand scaffold to place a spin-laden propyl moiety at a series of fixed distances from an S = 1/2 vanadium(IV) ion in a series of vanadyl complexes. We found that, within a radius of 4.0(4)-6.6(6) Å from the metal center, nuclei did not contribute to decoherence. To assess the generality of this important design principle and test its efficacy in a different coordination geometry, we synthesized and investigated three vanadium tris(dithiolene) complexes with the same ligand set employed in our previous study: K2[V(C5H6S4)3] (1), K2[V(C7H6S6)3] (2), and K2[V(C9H6S8)3] (3). We specifically interrogated solutions of these complexes in DMF-d7/toluene-d8 with pulsed electron paramagnetic resonance spectroscopy and electron nuclear double resonance spectroscopy and found that the distance dependence present in the previously synthesized vanadyl complexes holds true in this series. We further examined the coherence properties of the series in a different solvent, MeCN-d3/toluene-d8, and found that an additional property, the charge density of the complex, also affects decoherence across the series. These results highlight a previously unknown design principle for augmenting T2 and open new pathways for the...

  11. Computation of complexity measures of morphologically significant zones decomposed from binary fractal sets via multiscale convexity analysis

    International Nuclear Information System (INIS)

    Lim, Sin Liang; Koo, Voon Chet; Daya Sagar, B.S.

    2009-01-01

    Multiscale convexity analysis of certain fractal binary objects (8-segment Koch quadric, Koch triadic, and random Koch quadric and triadic islands) is performed via (i) morphologic openings with respect to a template of recursively changing size, and (ii) construction of convex hulls through half-plane closings. Based on the scale vs convexity measure relationship, transition levels between the morphologic regimes are determined as crossover scales. These crossover scales are taken as the basis to segment binary fractal objects into various morphologically prominent zones. Each segmented zone is characterized through normalized morphologic complexity measures. Despite the fact that there is no notably significant relationship between the zone-wise complexity measures and the fractal dimensions computed by the conventional box counting method, fractal objects, whether they are generated deterministically or by introducing randomness, possess morphologically significant sub-zones with varied degrees of spatial complexity. Classification of realistic fractal sets and/or fields according to sub-zones possessing varied degrees of spatial complexity provides insight into links with the physical processes involved in the formation of fractal-like phenomena.
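
This abstract contrasts its convexity-based zone measures with the conventional box-counting fractal dimension. As background, a minimal box-counting estimator for a binary image can be sketched as follows (a generic illustration, not the authors' code; the function name and the dyadic square-grid assumption are ours):

```python
import numpy as np

def box_counting_dimension(img):
    """Estimate the box-counting fractal dimension of a binary 2-D array.

    Counts occupied boxes at dyadic scales and fits the slope of
    log N(s) against log(1/s). Assumes a square 2^k x 2^k input.
    """
    img = np.asarray(img, dtype=bool)
    n = img.shape[0]
    sizes, counts = [], []
    s = n
    while s >= 1:
        # partition into s x s boxes; a box is occupied if any pixel is set
        boxes = img.reshape(n // s, s, n // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
        sizes.append(s)
        s //= 2
    # slope of log N vs log(1/s) is the dimension estimate
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

For a filled square the estimate is 2; for a one-pixel-wide line it is 1, matching the Euclidean dimensions of these non-fractal reference sets.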

  12. Application of the modified Wheeler cap method for radiation efficiency measurement of balanced electrically small antennas in complex environment

    DEFF Research Database (Denmark)

    Zhang, Jiaying; Pivnenko, Sergey; Breinbjerg, Olav

    2010-01-01

    In this paper, the application of a modified Wheeler cap method for the radiation efficiency measurement of balanced electrically small antennas is presented. It is shown that the limitations on the cavity dimension can be overcome and thus measurement in a large cavity is possible. The cavity loss is investigated, and a modified radiation efficiency formula that includes the cavity loss is introduced. Moreover, a modification of the technique is proposed that involves the antenna working in a complex environment inside the Wheeler cap, thus making possible the measurement of an antenna close to a hand or head.

  13. Structural measurements and cell line studies of the copper-PEG-Rifampicin complex against Mycobacterium tuberculosis.

    Science.gov (United States)

    Manning, Thomas; Mikula, Rachel; Wylie, Greg; Phillips, Dennis; Jarvis, Jackie; Zhang, Fengli

    2015-02-01

    The bacterium responsible for tuberculosis is increasing its resistance to antibiotics, resulting in new multidrug-resistant Mycobacterium tuberculosis (MDR-TB) and extensively drug-resistant tuberculosis (XDR-TB). In this study, several analytical techniques including NMR, FT-ICR, MALDI-MS, LC-MS and UV/Vis are used to study the copper-Rifampicin-polyethylene glycol (PEG-3350) complex. The copper(II) cation is a carrier for the antibiotic Rifampicin as well as nutrients for the bacterium. The NIH-NIAID cell line containing several TB strains (including antibiotic-resistant strains) is tested against seven copper-PEG-RIF complex variations.

  14. On Tight Separation for Blum Measures Applied to Turing Machine Buffer Complexity

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Žák, Stanislav

    2017-01-01

    Vol. 152, No. 4 (2017), pp. 397-409. ISSN 0169-2968. R&D Projects: GA ČR GBP202/12/G061; GA ČR GAP202/10/1333. Institutional support: RVO:67985807. Keywords: Turing machine; hierarchy; buffer complexity; diagonalization. Subject RIV: IN - Informatics, Computer Science. OECD field: Computer sciences, information science, bioinformatics. Impact factor: 0.687, year: 2016

  15. Inferring a Drive-Response Network from Time Series of Topological Measures in Complex Networks with Transfer Entropy

    Directory of Open Access Journals (Sweden)

    Xinbo Ai

    2014-11-01

    Topological measures are crucial to describe, classify and understand complex networks. Many measures have been proposed to characterize specific features of specific networks, but the relationships among these measures remain unclear. Taking into account that pooling networks from different domains together for statistical analysis might yield incorrect conclusions, we conduct our investigation with data observed from the same network in the form of simultaneously measured time series. We synthesize a transfer entropy-based framework to quantify the relationships among topological measures, and then provide a holistic picture of these measures by inferring a drive-response network. Techniques from Symbolic Transfer Entropy, Effective Transfer Entropy, and Partial Transfer Entropy are synthesized to deal with challenges such as non-stationary time series, finite sample effects and indirect effects. We resort to kernel density estimation to assess the significance of the results based on surrogate data. The framework is applied to study 20 measures across 2779 records in the Technology Exchange Network, and the results are consistent with existing knowledge. With the drive-response network, we evaluate the influence of each measure by calculating its strength, and cluster the measures into three classes, i.e., driving measures, responding measures and standalone measures, according to the network communities.
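
Transfer entropy of the kind this framework builds on can be illustrated with a plain histogram (plug-in) estimator on discrete sequences. This is a toy sketch under our own naming; the paper's actual pipeline adds symbolic, effective and partial variants to handle non-stationarity and indirect effects:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Estimate TE(X -> Y) in bits for two discrete sequences:
    TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ],
    with y1 = y[t+1], y0 = y[t], x0 = x[t], using empirical counts."""
    triples, pairs_yx, pairs_yy, singles = Counter(), Counter(), Counter(), Counter()
    n = len(y) - 1
    for t in range(n):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs_yx[(y[t], x[t])] += 1
        pairs_yy[(y[t + 1], y[t])] += 1
        singles[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n                              # p(y1, y0, x0)
        p_full = c / pairs_yx[(y0, x0)]              # p(y1 | y0, x0)
        p_hist = pairs_yy[(y1, y0)] / singles[y0]    # p(y1 | y0)
        te += p_joint * log2(p_full / p_hist)
    return te
```

When y simply copies x with one step of lag, TE(X -> Y) approaches 1 bit for an i.i.d. binary x, while TE(Y -> X) stays near zero, which is the directed, asymmetric behavior the framework exploits.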

  16. Tensor Norms and the Classical Communication Complexity of Nonlocal Quantum Measurement

    OpenAIRE

    Shi, Yaoyun; Zhu, Yufan

    2005-01-01

    We initiate the study of quantifying nonlocalness of a bipartite measurement by the minimum amount of classical communication required to simulate the measurement. We derive general upper bounds, which are expressed in terms of certain tensor norms of the measurement operator. As applications, we show that (a) If the amount of communication is constant, quantum and classical communication protocols with unlimited amount of shared entanglement or shared randomness compute the same set of funct...

  17. Quantifying complexity of financial short-term time series by composite multiscale entropy measure

    Science.gov (United States)

    Niu, Hongli; Wang, Jun

    2015-05-01

    It is significant to study the complexity of financial time series since the financial market is a complex, evolving dynamic system. Multiscale entropy (MSE) is a prevailing method used to quantify the complexity of a time series. Because its entropy estimates are unreliable for short time series at large time scales, a modified method, the composite multiscale entropy (CMSE), is applied to the financial market. To verify its effectiveness, its applications to synthetic white noise and 1/f noise with different data lengths are reproduced first in the present paper. The method is then introduced, for the first time, in a reliability test with two Chinese stock indices. When applied to short-term return series, the CMSE method shows advantages in reducing deviations of the entropy estimation and demonstrates more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
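
The composite multiscale entropy evaluated here can be sketched as follows: at scale τ, sample entropy is computed for each of the τ possible coarse-grained series (one per starting offset) and the results are averaged, which is what reduces the estimation variance on short series. This is our own minimal implementation with the commonly used parameters m = 2 and r = 0.15·SD, not the authors' code:

```python
import numpy as np

def sample_entropy(u, m=2, r=0.2):
    """Plug-in sample entropy -ln(A/B): B counts template pairs of length m
    within Chebyshev tolerance r, A the same for length m + 1."""
    u = np.asarray(u, dtype=float)
    n = len(u)
    def count(mm):
        templates = np.array([u[i:i + mm] for i in range(n - mm)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.sum(d <= r))
        return c
    b, a = count(m), count(m + 1)
    return -np.log(a / b)

def composite_multiscale_entropy(u, scale, m=2, r_factor=0.15):
    """CMSE: average SampEn over the `scale` coarse-grained series obtained
    from each starting offset; r is fixed to r_factor * SD of the original."""
    u = np.asarray(u, dtype=float)
    r = r_factor * u.std()
    vals = []
    for k in range(scale):
        tail = u[k:]
        ncg = len(tail) // scale
        cg = tail[:ncg * scale].reshape(ncg, scale).mean(axis=1)
        vals.append(sample_entropy(cg, m, r))
    return float(np.mean(vals))
```

For white noise the CMSE curve decreases with scale (coarse-graining averages away the noise relative to the fixed tolerance r), which is the classic sanity check the paper reproduces.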

  18. Prediction and measurement of complexation of radionuclide mixtures by α-isosaccharinic, gluconic and picolinic acids

    International Nuclear Information System (INIS)

    Nick Evans; Peter Warwick; Monica Felipe-Sotelo

    2012-01-01

    The purpose of this study was to investigate the effects of competition between cobalt, europium and strontium for isosaccharinate, gluconate and picolinate. Systems where the results indicated that competitive effects were significant have been identified. Thermodynamic calculations were performed for each system for comparison with the experimental results. Some discrepancies may be due to precipitation of some species, the presence of species not in the databases, the formation of mixed-metal complexes, or sorption to the solid phase(s). In some of the experiments, the complexity of the systems studied made it difficult to identify consistent trends. By concentrating on the results for simpler systems (i.e. solubilities in the presence and absence of organic complexants and with just one competing metal ion), the evidence for competition effects has been investigated. Evidence for solubility enhancement due to organic ligands was apparent in the data for Co with gluconate and for Eu with isosaccharinate and gluconate. Among these cases, the systems in which the effects of the competing ion are consistent with competition were limited to Eu with isosaccharinate and Sr as the competing ion, and Eu with gluconate and either Co or Sr as the competing ion. (author)

  19. MEASURING ACCURACY AND COMPLEXITY OF AN L2 LEARNER’S ORAL PRODUCTION

    Directory of Open Access Journals (Sweden)

    Teguh Khaerudin

    2015-03-01

    This paper aims at examining the influence of different tasks on the degree of task performance in a second language learner's oral production. The underlying assumption is that among the three aspects of language performance in L2, i.e. fluency, accuracy, and complexity, learners may prioritize only one of them (Ellis & Barkhuizen, 2005, p. 150) and that their decision to prioritize one particular area of language performance may be determined by the characteristics of the task given to the learners (Skehan & Foster, 1997). Having a written record of an oral production, the writer focuses this study on determining the degree of complexity and accuracy, and on analyzing whether the different tasks change the level of the learner's oral performance. The results show that the learner's accuracy from both tasks remains at the same level. However, both task conditions, which do not allow speech planning, result in no improvement in the accuracy level and a minor improvement in the complexity level.

  20. Instrumentation Suite for Acoustic Propagation Measurements in Complex Shallow Water Environments

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Obtain at-sea measurements to test theoretical and modeling predictions of acoustic propagation in dynamic, inhomogeneous, and nonisotropic shallow water...

  1. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, the complexities are inevitably enhanced in terms of their degrees or intensities because of the usage of the communication networks. Therefore, it would be interesting to (1) examine how this kind of network-enhanced complexity affects the control or filtering performance; and (2) develop some suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances on performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, the developments of the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated by a thorough literature review and some possible future research directions are highlighted.

  2. Complexity attack resistant flow lookup schemes for IPv6: a measurement based comparison

    OpenAIRE

    Malone, David; Tobin, R. Joshua

    2008-01-01

    In this paper we look at the problem of choosing a good flow state lookup scheme for IPv6 firewalls. We want to choose a scheme which is fast when dealing with typical traffic, but whose performance will not degrade unnecessarily when subject to a complexity attack. We demonstrate the existing problem and, using captured traffic, assess a number of replacement schemes that are hash and tree based. Our aim is to improve FreeBSD’s ipfw firewall, and so finally we implement the most pro...

  3. Magnetic hysteresis and complex susceptibility as measures of ac losses in a multifilamentary NbTi superconductor

    International Nuclear Information System (INIS)

    Goldfarb, R.B.; Clark, A.F.

    1985-01-01

    Magnetization and ac susceptibility of a standard NbTi superconductor were measured as a function of longitudinal dc magnetic field. The ac-field-amplitude and frequency dependences of the complex susceptibility are examined. The magnetization is related to the susceptibility by means of a theoretical derivation based on the field dependence of the critical current density. Hysteresis losses, obtained directly from dc hysteresis loops and derived theoretically from ac susceptibility and critical current density, were in reasonable agreement
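
The link this abstract draws between ac susceptibility and hysteresis loss is, in its simplest textbook form, that the energy dissipated per cycle and per unit volume in a sinusoidal field of amplitude H0 is W = π·μ0·χ″·H0², with χ″ the imaginary part of the complex susceptibility. A one-line sketch of that relation (the helper name is ours; this is the generic linear-response formula, not the paper's full critical-state derivation):

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def hysteresis_loss_per_cycle(chi_imag, h0):
    """Energy dissipated per cycle per unit volume (J/m^3) for a linear
    magnetic response with imaginary susceptibility chi_imag in a
    sinusoidal applied field of amplitude h0 (A/m):
    W = pi * mu0 * chi'' * h0**2."""
    return math.pi * MU_0 * chi_imag * h0 ** 2
```

The loss is linear in χ″ and quadratic in field amplitude, which is why the field-amplitude dependence of χ″ reported in the abstract maps directly onto the measured ac losses.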

  4. A microfluidic device for simultaneous measurement of viscosity and flow rate of blood in a complex fluidic network

    OpenAIRE

    Jun Kang, Yang; Yeom, Eunseop; Lee, Sang-Joon

    2013-01-01

    Blood viscosity has been considered one of the important biophysical parameters for effectively monitoring variations in the physiological and pathological conditions of circulatory disorders. Previous standard methods make it difficult to evaluate variations of blood viscosity under cardiopulmonary bypass procedures or hemodialysis. In this study, we propose a unique microfluidic device for simultaneously measuring the viscosity and flow rate of whole blood circulating in a complex fluidic network i...

  5. Corrective Measures Study Modeling Results for the Southwest Plume - Burial Ground Complex/Mixed Waste Management Facility

    International Nuclear Information System (INIS)

    Harris, M.K.

    1999-01-01

    Groundwater modeling scenarios were performed to support the Corrective Measures Study and Interim Action Plan for the southwest plume of the Burial Ground Complex/Mixed Waste Management Facility. The modeling scenarios were designed to provide data for an economic analysis of alternatives, and subsequently to evaluate the effectiveness of the selected remedial technologies for tritium reduction to Fourmile Branch. The scenarios assessed include no action; vertical barriers; pump, treat, and reinject; and vertical recirculation wells.

  6. X-ray crystallography, electrochemistry, spectral and thermal analysis of some tetradentate schiff base complexes and formation constant measurements

    Czech Academy of Sciences Publication Activity Database

    Asadi, Z.; Savarypour, N.; Dušek, Michal; Eigner, Václav

    2017-01-01

    Vol. 47, No. 11 (2017), pp. 1501-1508. ISSN 2470-1556. R&D Projects: GA ČR(CZ) GA15-12653S. Institutional support: RVO:68378271. Keywords: X-ray crystallography; transition metal Schiff base complexes; thermogravimetry; electrochemistry; formation constant measurements. Subject RIV: BM - Solid Matter Physics; Magnetism. OECD field: Condensed matter physics (including formerly solid state physics, superconductivity).

  7. Mercury mitigative measures related to hydroelectric reservoirs. The La Grande Complex experience

    International Nuclear Information System (INIS)

    Sbeghen, J.; Schetagne, R.

    1995-01-01

    Hydro-Québec's plan for mitigation of mercury contamination in fish and wildlife in the La Grande river basin was presented. The hazard and environmental threat posed by mercury contamination through flooding was described. Implications of mercury contamination for the Cree natives were discussed, and provisions of the James Bay mercury agreement were described. Potential 'at source' remedial measures were described, including soil and vegetation removal, controlled burning of soils and vegetation, capping of flooded soils, lime or sulphite salt addition, sediment suspension, genetic manipulation of bacterial populations, selenium addition, nutrient addition, intensive fishing, and reservoir draining. Compensation measures were considered since no practical medium-term remedial measures could be found. A case study of the Eastmain-1 Reservoir's $213,000,000 deforestation program was cited as a possible model. It was concluded that, realistically, compensation offered the only feasible health risk reduction program, since none of the 'at source' remedial measures were technically or economically feasible. 24 refs

  8. Effect of Complex Working Conditions on Nurses Who Exert Coercive Measures in Forensic Psychiatric Care.

    Science.gov (United States)

    Gustafsson, Niclas; Salzmann-Erikson, Martin

    2016-09-01

    Nurses who exert coercive measures on patients within psychiatric care are emotionally affected. However, research on their working conditions and environment is limited. The purpose of the current study was to describe nurses' experiences and thoughts concerning the exertion of coercive measures in forensic psychiatric care. The investigation was a qualitative interview study using unstructured interviews; data were analyzed with inductive content analysis. Results described participants' thoughts and experiences of coercive measures in four main categories: (a) acting against the patients' will, (b) reasoning about ethical justifications, (c) feelings of compassion, and (d) the need for debriefing. The current study illuminates the working conditions of nurses who exert coercive measures in clinical practice with patients who have a long-term relationship with severe symptomatology. The findings are important for further discussion of how nurses and leaders can promote a healthier working environment. [Journal of Psychosocial Nursing and Mental Health Services, 54(9), 37-43.]

  9. Tritium/3He measurements in young groundwater: Progress in applications to complex hydrogeological systems

    Science.gov (United States)

    Schlosser, Peter; Shapiro, Stephanie D.; Stute, Martin; Plummer, Niel

    2000-01-01

    Tritium/3He dating has been applied to many problems in groundwater hydrology including, for example, determination of circulation patterns, mean residence times, recharge rates, or bank infiltration. Here, we discuss recent progress in the application of the tritium/3He dating method to sites with complex hydrogeological settings. Specifically, we report on tritium/3He dating at sites with (a) river infiltration into the basaltic fractured rock aquifer of the Eastern Snake River Plain, and (b) river infiltration through sinkholes into the karstic limestone Upper Floridian aquifer near Valdosta, Georgia.
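
The apparent age underlying tritium/3He dating follows directly from tritium decay: t = (t½ / ln 2) · ln(1 + [³He_trit]/[³H]), where ³He_trit is the tritiogenic helium-3 accumulated since recharge. A minimal sketch (the half-life of 12.32 years is the accepted value; the function name is ours):

```python
import math

T_HALF_TRITIUM = 12.32  # tritium half-life in years

def tritium_he3_age(he3_trit, h3):
    """Apparent groundwater age in years from the tritiogenic 3He and
    tritium concentrations (same units, e.g. TU):
    t = (t_half / ln 2) * ln(1 + 3He_trit / 3H)."""
    return T_HALF_TRITIUM / math.log(2) * math.log(1 + he3_trit / h3)
```

When the tritiogenic ³He equals the remaining tritium, exactly one half-life has elapsed since the water was isolated from the atmosphere.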

  10. Amines are likely to enhance neutral and ion-induced sulfuric acid-water nucleation in the atmosphere more effectively than ammonia

    Directory of Open Access Journals (Sweden)

    T. Kurtén

    2008-07-01

    We have studied the structure and formation thermodynamics of dimer clusters containing H2SO4 or HSO4- together with ammonia and seven different amines possibly present in the atmosphere, using the high-level ab initio methods RI-MP2 and RI-CC2. As expected from e.g. proton affinity data, the binding of all studied amine-H2SO4 complexes is significantly stronger than that of NH3•H2SO4, while most amine-HSO4- complexes are only somewhat more strongly bound than NH3•HSO4-. Further calculations on larger cluster structures containing dimethylamine or ammonia together with two H2SO4 molecules or one H2SO4 molecule and one HSO4- ion demonstrate that amines, unlike ammonia, significantly assist the growth of not only neutral but also ionic clusters along the H2SO4 co-ordinate. A sensitivity analysis indicates that the difference in complexation free energies for amine- and ammonia-containing clusters is large enough to overcome the mass-balance effect caused by the fact that the concentration of amines in the atmosphere is probably 2 or 3 orders of magnitude lower than that of ammonia. This implies that amines might be more important than ammonia in enhancing neutral and especially ion-induced sulfuric acid-water nucleation in the atmosphere.
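
The mass-balance argument here can be made concrete: a concentration handicap of k orders of magnitude is offset in the equilibrium cluster population when the amine binds more strongly by ΔΔG ≈ RT·ln(10^k). At 278 K, three orders of magnitude correspond to roughly 16 kJ/mol. A back-of-the-envelope sketch (our own helper, not a calculation from the paper):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def free_energy_to_offset(orders_of_magnitude, temperature_k):
    """Extra binding free energy (J/mol) needed so that the Boltzmann
    factor exp(ddG / RT) compensates a 10**k deficit in precursor
    concentration at the given temperature."""
    return R * temperature_k * math.log(10 ** orders_of_magnitude)
```

Since computed amine vs ammonia complexation free-energy differences are typically of this magnitude or larger, the Boltzmann factor can outweigh the lower ambient amine concentration, which is the abstract's central claim.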

  11. On the complexity of measuring forests microclimate and interpreting its relevance in habitat ecology: the example of Ixodes ricinus ticks

    Directory of Open Access Journals (Sweden)

    Denise Boehnke

    2017-11-01

    Background: Ecological field research on the influence of meteorological parameters on a forest inhabiting species is confronted with the complex relations between measured data and the real conditions the species is exposed to. This study highlights this complexity for the example of Ixodes ricinus. This species lives mainly in forest habitats near the ground, but field research on impacts of meteorological conditions on population dynamics is often based on data from nearby official weather stations or occasional in situ measurements. In addition, studies use very different data approaches to analyze comparable research questions. This study is an extensive examination of the methodology used to analyze the impact of meteorological parameters on Ixodes ricinus and proposes a methodological approach that tackles the underlying complexity. Methods: Our specifically developed measurement concept was implemented at 25 forest study sites across Baden-Württemberg, Germany. Meteorological weather stations recorded data in situ and continuously between summer 2012 and autumn 2015, including relative humidity measured in the litter layer and at different heights above it (50 cm, 2 m). Hourly averages of relative humidity were calculated and compared with data from the nearest official weather station. Results: Data measured directly in the forest can differ dramatically from conditions recorded at official weather stations. In general, the data indicate a remarkable decrease in relative humidity from inside to outside the forest and from ground to atmosphere. Relative humidity measured in the litter layer was, on average, 24% higher than the official data and much more balanced, especially in summer. Conclusions: The results illustrate the need for, and benefit of, continuous in situ measurements to grasp the complex relative humidity conditions in forests. Data from official weather stations do not accurately represent actual humidity conditions in

  12. On the complexity of measuring forests microclimate and interpreting its relevance in habitat ecology: the example of Ixodes ricinus ticks.

    Science.gov (United States)

    Boehnke, Denise; Gebhardt, Reiner; Petney, Trevor; Norra, Stefan

    2017-11-06

    Ecological field research on the influence of meteorological parameters on a forest inhabiting species is confronted with the complex relations between measured data and the real conditions the species is exposed to. This study highlights this complexity for the example of Ixodes ricinus. This species lives mainly in forest habitats near the ground, but field research on impacts of meteorological conditions on population dynamics is often based on data from nearby official weather stations or occasional in situ measurements. In addition, studies use very different data approaches to analyze comparable research questions. This study is an extensive examination of the methodology used to analyze the impact of meteorological parameters on Ixodes ricinus and proposes a methodological approach that tackles the underlying complexity. Our specifically developed measurement concept was implemented at 25 forest study sites across Baden-Württemberg, Germany. Meteorological weather stations recorded data in situ and continuously between summer 2012 and autumn 2015, including relative humidity measured in the litter layer and at different heights above it (50 cm, 2 m). Hourly averages of relative humidity were calculated and compared with data from the nearest official weather station. Data measured directly in the forest can differ dramatically from conditions recorded at official weather stations. In general, the data indicate a remarkable decrease in relative humidity from inside to outside the forest and from ground to atmosphere. Relative humidity measured in the litter layer was, on average, 24% higher than the official data and was much more balanced, especially in summer. The results illustrate the need for, and benefit of, continuous in situ measurements to grasp the complex relative humidity conditions in forests. Data from official weather stations do not accurately represent actual humidity conditions in forest stands and the explanatory power of short period and

  13. Interferometry with flexible point source array for measuring complex freeform surface and its design algorithm

    Science.gov (United States)

    Li, Jia; Shen, Hua; Zhu, Rihong; Gao, Jinming; Sun, Yue; Wang, Jinsong; Li, Bo

    2018-06-01

    The precision of measurements of aspheric and freeform surfaces remains the primary factor restricting their manufacture and application. One effective means of measuring such surfaces involves using reference or probe beams with angle modulation, such as the tilted-wave interferometer (TWI). It is necessary to improve measurement efficiency by obtaining the optimum point source array for each piece before TWI measurement. To form a point source array based on the gradients of the different surfaces under test, we established a mathematical model describing the relationship between the point source array and the test surface. However, the optimal point sources are irregularly distributed. In order to achieve a flexible point source array matched to the gradient of the test surface, a novel interference setup using a fiber array is proposed in which every point source can be independently switched on and off. Simulations and actual measurements of two different surfaces are given in this paper to verify the mathematical model. Finally, we performed an experiment testing an off-axis ellipsoidal surface that proved the validity of the proposed interference system.

  14. The CD control improvement by using CDSEM 2D measurement of complex OPC patterns

    Science.gov (United States)

    Chou, William; Cheng, Jeffrey; Lee, Adder; Cheng, James; Tzeng, Alex C.; Lu, Colbert; Yang, Ray; Lee, Hong Jen; Bandoh, Hideaki; Santo, Izumi; Zhang, Hao; Chen, Chien Kang

    2016-10-01

    As the process node becomes more advanced, greater accuracy and precision in OPC pattern CD are required in mask manufacturing. CD SEM is an essential tool for confirming mask quality metrics such as CD control, CD uniformity and CD mean to target (MTT). Unfortunately, in some cases of arbitrary enclosed patterns or aggressive OPC patterns, for instance lines with tiny jogs and curvilinear SRAF, CD variation depending on the region of interest (ROI) is a serious problem in mask CD control, and it can even decrease wafer yield. To overcome this situation, the two-dimensional (2D) measurement method by Holon is adopted. In this paper, we summarize comparisons of the error budget between conventional (1D) and 2D data using CD SEM, and of the CD performance between mask and wafer, for complex OPC patterns including ILT features.

  15. Measurement of complex RF susceptibility using a series Q-meter

    International Nuclear Information System (INIS)

    Kisselev, Yu.F.; Dulya, C.M.; Niinikoski, T.O.

    1995-01-01

    In this paper we have for the first time derived closed form expressions for the nuclear magnetic susceptibility in terms of the series Q-meter output voltage. We discuss the corrections involved in determining nuclear polarization from NMR signals by using the deuteron and proton spin systems as examples. Deuteron signals are shown to exhibit a false asymmetry, while proton signals have substantial shape distortions. Moreover, for the first time the importance of making a phase correction is demonstrated. We conclude that the series Q-meter with real part detection is not sufficient to produce an output voltage from which the nuclear susceptibility can be determined. An additional phase-sensitive detector is proposed for obtaining the imaginary part of the signal required for unambiguous extraction of the complex RF susceptibility.

  16. Measurement of net electric charge and dipole moment of dust aggregates in a complex plasma.

    Science.gov (United States)

    Yousefi, Razieh; Davis, Allen B; Carmona-Reyes, Jorge; Matthews, Lorin S; Hyde, Truell W

    2014-09-01

    Understanding the agglomeration of dust particles in complex plasmas requires knowledge of basic properties such as the net electrostatic charge and dipole moment of the dust. In this study, dust aggregates are formed from gold-coated mono-disperse spherical melamine-formaldehyde monomers in a radiofrequency (rf) argon discharge plasma. The behavior of observed dust aggregates is analyzed both by studying the particle trajectories and by employing computer models examining three-dimensional structures of aggregates and their interactions and rotations as induced by torques arising from their dipole moments. These allow the basic characteristics of the dust aggregates, such as the electrostatic charge and dipole moment, as well as the external electric field, to be determined. It is shown that the experimental results support the predicted values from computer models for aggregates in these environments.

  17. ELF field in the proximity of complex power line configuration measurement procedures

    International Nuclear Information System (INIS)

    Benes, M.; Comelli, M.; Villalta, R.

    2006-01-01

    The issue of how to measure magnetic induction fields generated by various power line configurations, when several power lines run across the same exposure area, has become a matter of interest and study within the Regional Environment Protection Agency of Friuli Venezia Giulia. In classifying the various power line typologies, the definition of a double-circuit line was given: in this instance the magnetic field is determined by knowing the electrical and geometric parameters of the line. In the case of independent lines, instead, the field is undetermined. It is therefore pointed out that, in the latter case, extracting projected information from a set of measurements of the magnetic field alone is impossible. Making measurements throughout the service territory has in several cases offered the opportunity to define standard operational procedures. (authors)

  18. Areal Measurements of Ozone, Water, and Heat Fluxes Over Land With Different Surface Complexity, Using Aircraft

    International Nuclear Information System (INIS)

    Hicks, Bruce B.

    2001-01-01

Contemporary models addressing issues of air quality and/or atmospheric deposition continue to exploit air-surface exchange formulations originating from single-tower studies. In reality, these expressions describe situations that are rare in the real world - nearly flat and spatially homogeneous. There have been several theoretical suggestions about how to extend from single-point understanding to areal descriptions, but so far the capability to address the problem experimentally has been limited. In recent years, however, developments in sensing technology have permitted adaptation of eddy-correlation methods to low-flying aircraft in a far more cost-effective manner than previously. A series of field experiments has been conducted, ranging from flat farmland to rolling countryside, employing a recently modified research aircraft operated by the US National Oceanic and Atmospheric Administration (NOAA). The results demonstrate the complexity of the spatial heterogeneity question, especially for pollutants (ozone in particular). In general, the uncertainty associated with the adoption of any single-point formulation when describing areal averages is likely to be in the range 10% to 40%. In the case of sensible and latent heat fluxes, the overall behavior is controlled by the amount of energy available. For pollutant deposition, there is no constraint equivalent to the net radiation limitation on convective heat exchange. Consequently, dry deposition rates and air-surface exchange of trace gases in general are especially vulnerable to errors in spatial extrapolation. The results indicate that the susceptibility of dry deposition formulations to terrain complexity depends on the deposition velocity itself. For readily transferred pollutants (such as HNO3), a factor of two error could be involved.
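The contrast the abstract draws between heat and pollutant fluxes can be made concrete with the textbook dry-deposition parameterisation (a standard convention, not an equation taken from the paper): the downward flux is the product of a deposition velocity and the local concentration, so a fractional error in extrapolating the deposition velocity across complex terrain maps one-to-one onto the deposited flux.

```latex
F = -\,v_d(z_r)\, C(z_r),
\qquad
\frac{\delta F}{F} = \frac{\delta v_d}{v_d}
\quad \text{at fixed } C(z_r)
```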

  19. Determination of solubility isotherms of barium and strontium nitrates in the system acetic acid-water at 25 °C

    International Nuclear Information System (INIS)

    Hubicki, W.; Piskorek, M.

    1976-01-01

Investigations of the solubility of barium and strontium nitrates were carried out in the system acetic acid - water at 25 °C. Comparing the solubility isotherms of barium and strontium nitrates, one can observe that it is possible to separate admixtures of barium from strontium nitrates by fractional crystallization of these nitrates from acetic acid solution at temperatures lower than 31.3 °C, i.e. below the temperature of the reversible transformation Sr(NO3)2·4H2O ⇌ Sr(NO3)2 + 4H2O for aqueous solution. (author)

  20. Measuring the pollutant transport capacity of dissolved organic matter in complex matrixes

    DEFF Research Database (Denmark)

    Persson, L.; Alsberg, T.; Odham, G.

    2003-01-01

    Dissolved organic matter (DOM) facilitated transport in contaminated groundwater was investigated through the measurement of the binding capacity of landfill leachate DOM (Vejen, Denmark) towards two model pollutants (pyrene and phenanthrene). Three different methods for measuring binding capacity....... It was further concluded that DOM facilitated transport should be taken into account for non-ionic PAHs with lg K OW above 5, at DOM concentrations above 250 mg C/L. The total DOM concentration was found to be more important for the potential of facilitated transport than differences in the DOM binding capacity....

  1. Identification of complex model thermal boundary conditions based on exterior temperature measurement

    International Nuclear Information System (INIS)

    Lu Jianming; Ouyang Guangyao; Zhang Ping; Rong Bojun

    2012-01-01

Combining the advantages of finite element software for temperature field analysis with multivariate function optimization algorithms, a feasible method based on exterior temperature measurements was proposed to obtain the thermal boundary conditions required for temperature field analysis. The thermal boundary conditions can be obtained from only a few temperature measurement values. Taking the identification of the convection heat transfer coefficient of a high power density diesel engine cylinder head as an example, the calculation results show that when the temperature measurement error was less than 0.5 °C, the maximum relative error was less than 2%. It is shown that the new method is feasible. (authors)

  2. Measuring the complex permittivity tensor of uniaxial biological materials with coplanar waveguide transmission line

    Science.gov (United States)

    A simple and accurate technique is described for measuring the uniaxial permittivity tensor of biological materials with a coplanar waveguide transmission-line configuration. Permittivity tensor results are presented for several chicken and beef fresh meat samples at 2.45 GHz....

  3. High level language for measurement complex control based on the computer E-100I

    Science.gov (United States)

    Zubkov, B. V.

    1980-01-01

A high level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure for including these programs in the suggested high level language is described.

  4. Measuring marine iron(III) complexes by CLE-AdSV

    NARCIS (Netherlands)

    Town, R.M.; Leeuwen, van H.P.

    2005-01-01

Iron(III) speciation data, as determined by competitive ligand exchange-adsorptive stripping voltammetry (CLE-AdSV), are reconsidered in the light of the kinetic features of the measurement. The very large stability constants reported for iron(III) in marine ecosystems are shown to be possibly due to

  5. Reply to Comments on Measuring marine iron(III) complexes by CLE-AdSV

    NARCIS (Netherlands)

    Town, R.M.; Leeuwen, van H.P.

    2005-01-01

The interpretation of CLE-AdSV based iron(III) speciation data for marine waters has been called into question in light of the kinetic features of the measurement. The implications of the re-think may have consequences for understanding iron biogeochemistry and its impact on ecosystem functioning.

  6. Coaxial Sensors For Broad-Band Complex Permittivity Measurements of Petroleum Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Folgeroe, K.

    1996-12-31

    This doctoral thesis verifies that dielectric spectroscopy and microwave permittivity measurements can be used to characterize petroleum liquids. It concentrates on developing sensors for three potential industrial applications: quality characterization of crude oil and petroleum fractions, monitoring of gas-hydrate formation in water-in-oil emulsions, and determination of water-content in thin liquid layers. The development of a permittivity measurement system for crude oil and petroleum fractions is described. As black oils have low dielectric constant and loss, the system must be very sensitive in order to measure the dielectric spectra and to distinguish oils of different permittivity. Such a system was achieved by combining impedance and scattering parameter measurements with appropriate permittivity calculation methods. The frequency range from 10 kHz to 6 GHz was found convenient for observing the main dispersion of the oils. All the oils had dielectric constants between 2.1 and 2.9 and dielectric loss below 0.01. The oils studied were samples of the feedstock for the cracker and coke processes at a petroleum refinery. This verifies that dielectric spectroscopy is a potential technique for on-line quality monitoring of the feedstock at petroleum refineries. Gas hydrates may cause major problems like clogging of pipelines. Dielectric spectroscopy is proposed as a means of monitoring the formation of gas hydrates in emulsions. It is found that open-ended coaxial probes fulfill the sensitivity requirements for such sensors. 312 refs., 87 figs., 20 tabs.

  7. A technique for accurate measurements of temperature variations in solution calorimetry and osmometry of actinide complexes

    International Nuclear Information System (INIS)

    Ponkshe, M.R.; Samuel, J.K.

    1982-01-01

Temperature variations of the order of 10⁻³ to 10⁻⁴ °C are measured by means of a matched pair of thermistors and constant current techniques. The factors deciding the sensitivity and accuracy are fully discussed, and the factors which put restrictions on the practical detection limits are also described. (author)
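The conversion from a differential bridge voltage to a temperature change can be sketched with the standard thermistor beta model (an illustrative reconstruction; the beta value, reference resistance and drive current below are hypothetical, not figures from the paper):

```python
import math

def delta_t_from_bridge(delta_v, current, r0, b_value, t_kelvin):
    """Temperature change inferred from the differential voltage of a
    matched thermistor pair driven by a constant current.

    Assumes the usual beta model R(T) = r0 * exp(B * (1/T - 1/T0)),
    which gives the sensitivity dR/dT = -B * R / T**2.
    """
    t0 = 298.15                                # reference temperature, K
    r_t = r0 * math.exp(b_value * (1.0 / t_kelvin - 1.0 / t0))
    dr_dt = -b_value * r_t / t_kelvin ** 2     # ohms per kelvin
    # A constant current converts the resistance change into a voltage:
    # delta_v = current * |dR/dT| * delta_t
    return delta_v / (current * abs(dr_dt))
```

With a 10 kΩ thermistor (B = 3950 K) at 25 °C driven by 100 µA, a millikelvin change produces a differential voltage of roughly 44 µV, which sets the resolution the measurement electronics must achieve.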

  8. Microwave generation and complex microwave responsivity measurements on small Dayem bridges

    DEFF Research Database (Denmark)

    Pedersen, Niels Falsig; Sørensen, O; Mygind, Jesper

    1977-01-01

    Measurements of the active properties of a Dayem micro-bridge at X-band frequencies is described. The bridge was mounted in a microwave cavity designed to match the bridge properly and the microwave output from the cavity was detected using a sensitive X-band spectrometer. Microwave power...

  9. Online measurement of mental representations of complex spatial decision problems : comparison of CNET and hard laddering

    NARCIS (Netherlands)

    Horeni, O.; Arentze, T.A.; Dellaert, B.G.C.; Timmermans, H.J.P.

    2014-01-01

    This paper introduces the online Causal Network Elicitation Technique (CNET), as a technique for measuring components of mental representations of choice tasks and compares it with the more common technique of online 'hard' laddering (HL). While CNET works in basically two phases, one in open

  10. Online measurement of mental representations of complex spatial decision problems : Comparison of CNET and hard laddering

    NARCIS (Netherlands)

    O. Horeni (Oliver); T.A. Arentze (Theo); B.G.C. Dellaert (Benedict); H.J.P. Timmermans (Harry)

    2013-01-01

This paper introduces the online Causal Network Elicitation Technique (CNET), as a technique for measuring components of mental representations of choice tasks and compares it with the more common technique of online ‘hard’ laddering (HL). While CNET works in basically two phases, one in

  11. Comparisons between field- and LiDAR-based measures of stand structural complexity

    Science.gov (United States)

    Van R. Kane; Robert J. McGaughey; Jonathan D. Bakker; Rolf F. Gersonde; James A. Lutz; Jerry F. Franklin

    2010-01-01

    Forest structure, as measured by the physical arrangement of trees and their crowns, is a fundamental attribute of forest ecosystems that changes as forests progress through successional stages. We examined whether LiDAR data could be used to directly assess the successional stage of forests by determining the degree to which the LiDAR data would show the same relative...

  12. Predictive value of symptom level measurements for complex regional pain syndrome type I

    NARCIS (Netherlands)

    Perez, R. S. G. M.; Keijzer, C.; Bezemer, P. D.; Zuurmond, W. W. A.; de Lange, J. J.

    2005-01-01

    The validity with respect to presence or absence of CRPS I according to Veldman's criteria was assessed for measured pain, temperature, volume differences and limitations in range of motion. Evaluated were 155 assessments of 66 outpatients, initially diagnosed with CRPS I, but many of them not so on

  13. Soil temperature variability in complex terrain measured using fiber-optic distributed temperature sensing

    Science.gov (United States)

    Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...

  14. Parameter optimization of measuring and control elements in the monitoring systems of complex technical objects

    Science.gov (United States)

    Nekrylov, Ivan; Korotaev, Valery; Blokhina, Anastasia; Kleshchenok, Maksim

    2017-06-01

Measuring equipment of a new generation is being widely adopted worldwide; it is characterized by small size, a high level of automation, multi-channel operation, digital filtering, satellite synchronization, wireless communication, digital recording in long-term memory with a large resource, long-lived power sources, etc. However, the modern equipment base of Russian institutions and the level of development of technical facilities and measuring technologies lag far behind developed countries. For this reason, the vacated niches are being actively developed by foreign companies. For example, more than 70% of the instrumentation performing work on the territory of Russia is equipped with imported equipment (products of Swedish and German companies); the amount of work performed with German equipment is more than 70% of the total volume of such work; and more than 80% of industrial measurements are performed using HEXAGON equipment (Sweden). These trends show that the Russian sector of measuring technology is gradually becoming import-dependent, which poses a threat to the economic security of the country and runs counter to national priorities. The results of this research will allow the development of a theory for designing high-accuracy displacement monitoring systems with ergonomic and weight characteristics unattainable by existing analogues, combined with a comparable or lower cost. These advantages will allow such systems to compete successfully and eventually supplant existing systems, which have seen no fundamental changes in the last 20 years and therefore retain all their drawbacks: large size and weight, high power consumption, and dependence on magnetic fields.

  15. Directed weighted network structure analysis of complex impedance measurements for characterizing oil-in-water bubbly flow.

    Science.gov (United States)

    Gao, Zhong-Ke; Dang, Wei-Dong; Xue, Le; Zhang, Shan-Shan

    2017-03-01

Characterizing the flow structure underlying the evolution of oil-in-water bubbly flow remains a contemporary challenge of great interest and complexity. In particular, the oil droplets dispersing in a water continuum with diverse sizes make the study of oil-in-water bubbly flow genuinely difficult. To study this issue, we first design a novel complex impedance sensor and systematically conduct vertical oil-water flow experiments. Based on the multivariate complex impedance measurements, we define modalities associated with the spatial transient flow structures and construct a modality transition-based network for each flow condition to study the evolution of flow structures. In order to reveal the unique flow structures underlying the oil-in-water bubbly flow, we filter the inferred modality transition-based network by removing the edges with small weight and the resulting isolated nodes. Then, the weighted clustering coefficient entropy and weighted average path length are employed for quantitatively assessing the original network and the filtered network. The differences in network measures enable efficient characterization of the evolution of the oil-in-water bubbly flow structures.
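The edge filtering and the two network measures named in the abstract can be sketched as follows (a minimal illustration using networkx; the Shannon-entropy construction over clustering coefficients and the threshold are plausible assumptions, not the authors' exact definitions):

```python
import math
import networkx as nx

def filtered_network_measures(edges, weight_threshold):
    """Build a directed weighted network from (u, v, weight) transition
    edges, remove weak edges and isolated nodes, then compute the
    weighted clustering coefficient entropy and weighted average path
    length of the filtered network."""
    G = nx.DiGraph()
    G.add_weighted_edges_from(edges)

    # Filter: drop edges with small weight, then the isolated nodes.
    weak = [(u, v) for u, v, w in G.edges(data="weight")
            if w < weight_threshold]
    G.remove_edges_from(weak)
    G.remove_nodes_from(list(nx.isolates(G)))

    # Weighted clustering coefficients (undirected generalisation,
    # for simplicity), then the Shannon entropy of their normalised
    # distribution over nodes.
    cc = nx.clustering(G.to_undirected(), weight="weight")
    total = sum(cc.values())
    probs = [c / total for c in cc.values() if c > 0]
    cc_entropy = -sum(p * math.log(p) for p in probs)

    # Weighted average shortest-path length, computed on the largest
    # weakly connected component so the quantity is defined (assumes
    # that component is strongly connected).
    comp = max(nx.weakly_connected_components(G), key=len)
    apl = nx.average_shortest_path_length(G.subgraph(comp),
                                          weight="weight")
    return cc_entropy, apl
```

For a three-node transition cycle with equal weights plus one weak edge, the weak edge and its isolated endpoint are filtered out, leaving a triangle whose clustering entropy is ln 3 and whose weighted average path length is 1.5.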

  16. Measuring The Impact Of Innovations On Efficiency In Complex Hospital Settings

    Directory of Open Access Journals (Sweden)

    Bonća Petra Došenović

    2015-12-01

In this paper the authors propose an approach for measuring the impact of innovations on hospital efficiency. The suggested methodology can be applied to any type of innovation, including technology-based innovations as well as consumer-focused and business model innovations. The authors apply the proposed approach to measure the impact of transcanalicular diode laser-assisted dacryocystorhinostomy (DCR), an innovation introduced in the surgical procedure for treating a tear duct blockage, on the efficiency of general hospitals in Slovenia. They demonstrate that the impact of an innovation on hospital efficiency depends not only on the features of the studied innovation but also on the characteristics of hospitals adopting the innovation and their external environment, represented by a set of comparable hospitals.

  17. Using a complex audit tool to measure workload, staffing and quality in district nursing.

    Science.gov (United States)

    Kirby, Esther; Hurst, Keith

    2014-05-01

This major community workload, staffing and quality study is thought to be the most comprehensive community staffing project in England. It involved over 400 staff from 46 teams in 6 localities and is unique because it ties community staffing activity to workload and quality. Scotland was used as a benchmark, since the same evidence-based Safer Nursing Care Tool methodology developed by the second-named author was used (apart from quality) and population and geographical similarities were taken into account. The data collection method tested quality standards, acuity, dependency and nursing interventions by looking at caseloads, staff activity and service quality, and at funded, actual, temporary and recommended staffing. Key findings showed that 4 out of 6 localities had a heavy workload index that stretched staffing numbers and time spent with patients. The acuity and dependency of patients leaned heavily towards the most dependent and acute categories requiring more face-to-face care. Some areas across the localities had high levels of temporary staff, which affected quality and increased cost. Skill and competency shortages meant that a small number of staff had to travel significantly across the county to deliver complex care to some patients.

  18. Predicting Falls in People with Multiple Sclerosis: Fall History Is as Accurate as More Complex Measures

    Directory of Open Access Journals (Sweden)

    Michelle H. Cameron

    2013-01-01

Background. Many people with MS fall, but the best method for identifying those at increased fall risk is not known. Objective. To compare how accurately fall history, questionnaires, and physical tests predict future falls and injurious falls in people with MS. Methods. 52 people with MS were asked if they had fallen in the past 2 months and the past year. Subjects were also assessed with the Activities-specific Balance Confidence, Falls Efficacy Scale-International, and Multiple Sclerosis Walking Scale-12 questionnaires, the Expanded Disability Status Scale, Timed 25-Foot Walk, and computerized dynamic posturography, and recorded their falls daily for the following 6 months with calendars. The ability of baseline assessments to predict future falls was compared using receiver operator curves and logistic regression. Results. All tests individually provided similar fall prediction (area under the curve (AUC) 0.60–0.75). A fall in the past year was the best predictor of falls (AUC 0.75, sensitivity 0.89, specificity 0.56) or injurious falls (AUC 0.69, sensitivity 0.96, specificity 0.41) in the following 6 months. Conclusion. Simply asking people with MS if they have fallen in the past year predicts future falls and injurious falls as well as more complex, expensive, or time-consuming approaches.
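A yes/no predictor such as "fell in the past year" has a single operating point on the ROC curve, so its area under the curve reduces to the mean of sensitivity and specificity. A minimal sketch of that computation (illustrative only, not the study's analysis code):

```python
def binary_test_performance(predicted, actual):
    """Sensitivity, specificity and AUC for a yes/no predictor.
    For a single binary test the ROC curve is two line segments
    through one operating point, so AUC = (sens + spec) / 2."""
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    tn = sum(1 for p, a in zip(predicted, actual) if not p and not a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, (sens + spec) / 2
```

Applying the identity to the reported operating point (sensitivity 0.89, specificity 0.56) gives (0.89 + 0.56) / 2 ≈ 0.73, consistent with the reported AUC of 0.75 within rounding of the underlying counts.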

  19. Presentation of a quality organisation pattern for the design, manufacturing and implementation of complex measuring chains

    International Nuclear Information System (INIS)

    Hervault, V.; Piguet, M.

    1995-01-01

The EFMT (Experience Feedback, Measures-Tests) branch of the EDF Research and Development Division designs and installs instrumentation systems for power generation sites. These systems include either testing instruments (thermal and mechanical operation surveys) or process control instruments. The context in which instrumentation is developed and used has changed considerably over the past few years, from both the technical and the organisational viewpoint. An instrumentation system consists of a set of measuring chains associated with communication supports and acquisition software; the fields of technical knowledge involved are highly diversified: measurement, field buses, computing, data processing... Customers now include quality requirements in their specifications and often make reference to standards of the EN 29000 series. The EFMT branch has defined a quality approach applicable to the instrumentation field which aims at ensuring technical success (namely, attaining the expected characteristics) and meeting customers' quality requirements. This approach, based upon project management techniques, defines the design, implementation and operating process phases. It emphasizes a global approach to instrumentation while promoting communication between the partners in a project. This paper presents the whole approach and underlines its critical phases: users' requirements, testing and acceptance procedures. (authors). 5 refs., 2 figs., 1 tab

  20. The Importance of and the Complexities Associated With Measuring Continuity of Care During Resident Training: Possible Solutions Do Exist.

    Science.gov (United States)

    Carney, Patricia A; Conry, Colleen M; Mitchell, Karen B; Ericson, Annie; Dickinson, W Perry; Martin, James C; Carek, Peter J; Douglass, Alan B; Eiff, M Patrice

    2016-04-01

    Evolutions in care delivery toward the patient-centered medical home have influenced important aspects of care continuity. Primary responsibility for a panel of continuity patients is a foundational requirement in family medicine residencies. In this paper we characterize challenges in measuring continuity of care in residency training in this new era of primary care. We synthesized the literature and analyzed information from key informant interviews and group discussions with residency faculty and staff to identify the challenges and possible solutions for measuring continuity of care during family medicine training. We specifically focused on measuring interpersonal continuity at the patient level, resident level, and health care team level. Challenges identified in accurately measuring interpersonal continuity of care during residency training include: (1) variability in empanelment approaches for all patients, (2) scheduling complexity in different types of visits, (3) variability in ability to attain continuity counts at the level of the resident, and (4) shifting make-up of health care teams, especially in residency training. Possible solutions for each challenge are presented. Philosophical issues related to continuity are discussed, including whether true continuity can be achieved during residency training and whether qualitative rather than quantitative measures of continuity are better suited to residencies. Measuring continuity of care in residency training is challenging but possible, though improvements in precision and assessment of the comprehensive nature of the relationships are needed. Definitions of continuity during training and the role continuity measurement plays in residency need further study.

  1. Validation of the Five-Phase Method for Simulating Complex Fenestration Systems with Radiance against Field Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Geisler-Moroder, David [Bartenbach GmbH, Aldrans (Austria); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ward, Gregory J. [Anyhere Software, Albany, NY (United States)

    2016-08-29

    The Five-Phase Method (5-pm) for simulating complex fenestration systems with Radiance is validated against field measurements. The capability of the method to predict workplane illuminances, vertical sensor illuminances, and glare indices derived from captured and rendered high dynamic range (HDR) images is investigated. To be able to accurately represent the direct sun part of the daylight not only in sensor point simulations, but also in renderings of interior scenes, the 5-pm calculation procedure was extended. The validation shows that the 5-pm is superior to the Three-Phase Method for predicting horizontal and vertical illuminance sensor values as well as glare indices derived from rendered images. Even with input data from global and diffuse horizontal irradiance measurements only, daylight glare probability (DGP) values can be predicted within 10% error of measured values for most situations.

  2. Pseudo-stokes vector from complex signal representation of a speckle pattern and its applications to micro-displacement measurement

    DEFF Research Database (Denmark)

    Wang, W.; Ishijima, R.; Matsuda, A.

    2010-01-01

As an improvement of the intensity correlation used widely in conventional electronic speckle photography, we propose a new technique for displacement measurement based on correlating Stokes-like parameter derivatives for transformed speckle patterns. The method is based on a Riesz transform of the intensity speckle pattern, which converts the original real-valued signal into a complex signal. In closest analogy to the polarisation of a vector wave, the Stokes-like vector constructed from the spatial derivative of the generated complex signal has been applied for correlation. Experimental results are presented that demonstrate the validity and advantage of the proposed pseudo-Stokes vector correlation technique over the conventional intensity correlation technique.
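The pipeline the abstract describes (Riesz transform → complex signal → Stokes-like vector → correlation) can be sketched roughly as follows; the particular Stokes-like construction and the correlation details below are plausible assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def riesz_complex_signal(img):
    """Riesz transform of a 2-D intensity pattern, computed in the
    frequency domain; returns the complex signal r1 + 1j*r2."""
    F = np.fft.fft2(img - img.mean())
    u = np.fft.fftfreq(img.shape[0])[:, None]
    v = np.fft.fftfreq(img.shape[1])[None, :]
    q = np.hypot(u, v)
    q[0, 0] = 1.0                        # avoid division by zero at DC
    kernel = (u + 1j * v) / q            # Riesz kernel in Fourier space
    kernel[0, 0] = 0.0
    return np.fft.ifft2(kernel * F)

def stokes_like(g):
    """Stokes-like parameters built from a complex signal and its
    (circular) spatial derivative."""
    gx = (np.roll(g, -1, axis=1) - np.roll(g, 1, axis=1)) / 2.0
    s0 = np.abs(g) ** 2 + np.abs(gx) ** 2
    s1 = np.abs(g) ** 2 - np.abs(gx) ** 2
    s2 = 2.0 * np.real(g * np.conj(gx))
    s3 = 2.0 * np.imag(g * np.conj(gx))
    return (s0, s1, s2, s3)

def estimate_shift(img1, img2):
    """Estimate the (dy, dx) displacement between two speckle patterns
    by summing FFT cross-correlations of their pseudo-Stokes channels
    and locating the correlation peak."""
    corr = np.zeros(img1.shape)
    for a, b in zip(stokes_like(riesz_complex_signal(img1)),
                    stokes_like(riesz_complex_signal(img2))):
        A = np.fft.fft2(a - a.mean())
        B = np.fft.fft2(b - b.mean())
        corr += np.fft.ifft2(np.conj(A) * B).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return dy, dx
```

Because every step is circularly shift-equivariant, a circularly shifted copy of a random pattern produces a correlation peak at exactly the applied displacement.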

  3. Food deserts in Winnipeg, Canada: a novel method for measuring a complex and contested construct

    Directory of Open Access Journals (Sweden)

    Joyce Slater

    2017-10-01

Introduction: "Food deserts" have emerged over the past 20 years as spaces of concern for communities, public health authorities and researchers because of their potential negative impact on dietary quality and subsequent health outcomes. Food deserts are residential geographic spaces, typically in urban settings, where low-income residents have limited or no access to retail food establishments with sufficient variety at affordable cost. Research on food deserts presents methodological challenges including retail food store identification and classification, identification of low-income populations, and transportation and proximity metrics. Furthermore, the complex methods often used in food desert research can be difficult to reproduce and communicate to key stakeholders. To address these challenges, this study sought to demonstrate the feasibility of implementing a simple and reproducible method of identifying food deserts using data easily available in the Canadian context. Methods: This study was conducted in Winnipeg, Canada in 2014. Food retail establishments were identified from Yellow Pages and verified by public health dietitians. We calculated two scenarios of food deserts based on location of the lowest-income quintile population: (a) living ≥ 500 m from a national chain grocery store, or (b) living ≥ 500 m from a national chain grocery store or a full-service grocery store. Results: The number of low-income residents living in a food desert ranged from 64 574 to 104 335, depending on the scenario used. Conclusion: This study shows that food deserts affect a significant proportion of the Winnipeg population, and while concentrated in the urban core, they exist in suburban neighbourhoods also. The methods utilized represent an accessible, transparent and reproducible process for identifying food deserts. These methods can be used for cost-effective, periodic surveillance and meaningful engagement with communities, retailers and policy
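At its core, each scenario above is a nearest-distance threshold test: a resident is in a food desert if no qualifying store lies within 500 m. A minimal sketch of that test (hypothetical coordinates and data layout; the study's actual GIS processing is not reproduced here):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r_earth = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r_earth * math.asin(math.sqrt(a))

def residents_in_food_desert(households, stores, threshold_m=500.0):
    """Count residents of low-income households whose nearest
    qualifying store is at or beyond the distance threshold.
    Scenario (a) vs (b) is selected by which stores are passed in.
    households: iterable of (lat, lon, n_residents);
    stores: iterable of (lat, lon)."""
    total = 0
    for lat, lon, n_residents in households:
        nearest = min(haversine_m(lat, lon, s_lat, s_lon)
                      for s_lat, s_lon in stores)
        if nearest >= threshold_m:
            total += n_residents
    return total
```

Swapping the store list between "national chain grocery stores only" and "national chain or full-service grocery stores" reproduces the study's two scenarios without changing the counting logic.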

  4. Measuring and Calculative Complex for Registration of Quasi-Static and Dynamic Processes of Electromagnetic Irradiation

    Directory of Open Access Journals (Sweden)

    V. I. Ovchinnikov

    2007-01-01

The paper is devoted to the development of a measuring device to register dynamic processes of electromagnetic irradiation during the treatment of materials with the energy of an explosion. Standard units registering the main parameters of the explosion do not allow the results of the process to be predicted and controlled. To overcome the disadvantages of former control units, a new one based on Hall sensors has been developed. The device allows effective registration of the inductive component of the electromagnetic irradiation over a wide temperature range for many short-time processes.

  5. Private health insurance: New measures of a complex and changing industry

    Science.gov (United States)

    Arnett, Ross H.; Trapnell, Gordon R.

    1984-01-01

    Private health insurance benefit payments are an integral component of estimates of national health expenditures. Recent analyses indicate that the insurance industry has undergone significant changes since the mid-1970's. As a result of these study findings and corresponding changes to estimating techniques, private health insurance estimates have been revised upward. This has had a major impact on national health expenditure estimates. This article describes the changes that have occurred in the industry, discusses some of the implications of those changes, presents a new methodology to measure private health insurance and the resulting estimate levels, and then examines concepts that underpin these estimates. PMID:10310950

  6. The Experimental Measurement of Aerodynamic Heating About Complex Shapes at Supersonic Mach Numbers

    Science.gov (United States)

    Neumann, Richard D.; Freeman, Delma C.

    2011-01-01

In 2008 a wind tunnel test program was implemented to update the experimental data available for predicting protuberance heating at supersonic Mach numbers. For this test the Langley Unitary Wind Tunnel was again used. The significant differences for this current test were the advances in the state of the art in model design, fabrication techniques, instrumentation and data acquisition capabilities. This paper provides a focused discussion of the results of an in-depth analysis of unique measurements of recovery temperature obtained during the test.

  7. Microprocessor measuring complex for study on electrodynamical characteristics of accelerating structures

    International Nuclear Information System (INIS)

    Val'dner, O.A.; Ponomarenko, A.G.; Rodionov, A.E.

    1990-01-01

The instruments designed for measuring the distributions of the electromagnetic field and determining the resonance frequencies, Q-values and shunt resistances of resonators, with relative errors of 1%, 10⁻⁶, 3% and 5% respectively, are described. The general block diagram of the instruments is presented. They consist of the following elements: a high-stability programmable generator; the resonator under investigation, connected in a two-port layout; a semiconducting SHF detector; a probe movement system; an SHF amplitude and phase meter; a signal analog processing unit; an analog-to-digital converter; a microcomputer; a graphic display; a printer; and a cassette-type magnetic tape storage.

  8. The United Kingdom Acid Waters Monitoring Network: a review of the first 15 years and introduction to the special issue

    International Nuclear Information System (INIS)

    Monteith, D.T.; Evans, C.D.

    2005-01-01

    The United Kingdom Acid Waters Monitoring Network (AWMN) was established in 1988 to determine the ecological impact of acidic emissions control policy on acid-sensitive lakes and streams. AWMN data have been used to explore a range of causal linkages necessary to connect changes in emissions to chemical and, ultimately, biological recovery. Regional scale reductions in sulphur (S) deposition have been found to have had an immediate influence on surface water chemistry, including increases in acid neutralising capacity, pH and alkalinity and declines in aluminium toxicity. These in turn can be linked to changes in the aquatic biota which are consistent with 'recovery' responses. A continuation of the current programme is essential in order to better understand apparent non-linearity between nitrogen (N) in deposition and runoff, the substantial rise in organic acid concentrations, and the likely impacts of forecast climate change and other potential constraints on further biological improvement. - After 15 years of the UK Acid Waters Monitoring Network, we can now draw clear conclusions regarding the impact of emission reductions on acidified UK fresh waters

  9. Component activities in the system thorium nitrate-nitric acid-water at 25 °C

    International Nuclear Information System (INIS)

    Lemire, R.J.; Brown, C.P.

    1982-01-01

    The equilibrium composition of the vapor above thorium nitrate-nitric acid-water mixtures has been studied as a function of the concentrations of thorium nitrate and nitric acid using a transpiration technique. At 25 °C, the thorium nitrate concentrations m_T ranged from 0.1 to 2.5 molal and the nitric acid concentrations m_N from 0.3 to 25 molal. The vapor pressure of the nitric acid was found to increase with increasing thorium nitrate concentration for a constant molality of nitric acid in aqueous solution. At constant m_T, the nitric acid vapor pressure was particularly enhanced at low nitric acid concentrations. The water vapor pressures decreased regularly with increasing concentrations of both nitric acid and thorium nitrate. The experimental data were fitted to Scatchard's ion-component model, and to empirical multiparameter functions. From the fitting parameters, and available literature data for the nitric acid-water and thorium nitrate-water systems at 25 °C, expressions were calculated for the variation of water and thorium nitrate activities, as functions of the nitric acid and thorium nitrate concentrations, using the Gibbs-Duhem equation. Calculated values for the thorium nitrate activities were strongly dependent on the form of the function originally used to fit the vapor pressure data. (author)
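    For reference, the thermodynamic identity behind the activity calculation described above is the isothermal, isobaric Gibbs-Duhem relation for the ternary water (W)-nitric acid (N)-thorium nitrate (T) system, where the n_i are the amounts of each component and the a_i their activities:

    ```latex
    % At constant temperature and pressure the component activities are linked by
    n_{\mathrm{W}}\,\mathrm{d}\ln a_{\mathrm{W}}
      + n_{\mathrm{N}}\,\mathrm{d}\ln a_{\mathrm{N}}
      + n_{\mathrm{T}}\,\mathrm{d}\ln a_{\mathrm{T}} = 0
    ```

    Measured water and nitric acid activities therefore constrain the unmeasured thorium nitrate activity only through an integration along a composition path, which is consistent with the reported sensitivity of the result to the form of the fitting function.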

  10. Sulfur dioxide concentration measurements in the vicinity of the Albert Funk mining and metallurgical plant complex

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, M

    1976-01-01

    This article discusses the ambient air concentration of sulfur dioxide in the area of Freiberg, GDR. The emission of sulfur dioxide results for the most part from brown coal combustion in heat and power plants and in metallurgical plants. Sulfur dioxide emission from neighboring industrial centers such as Dresden and North Bohemian towns affects the Freiberg area to some extent. The use of brown coal in household heating contributes an average of 50 kg of sulfur dioxide emission per coal-burning household annually. A total of 1260 measurements at 28 points in the vicinity of Freiberg were made in the year 1972 in evaluating the concentration of sulfur dioxide present in the air. In 75% of the measurements the concentrations were below 0.15 mg/m³, in 12% between 0.15 and 0.2 mg/m³, in 7% between 0.2 and 0.3 mg/m³ and in 6% between 0.3 and 0.5 mg/m³. The results are described as average industrial pollution. The influence of air temperature, wind velocity, fog, season and time of day is also discussed. (4 refs.) (In German)

  11. Circular dichroism measured on single chlorosomal light-harvesting complexes of green photosynthetic bacteria

    KAUST Repository

    Furumaki, Shu

    2012-12-06

    We report results on circular dichroism (CD) measured on single immobilized chlorosomes of a triple mutant of the green sulfur bacterium Chlorobaculum tepidum. The CD signal is measured by monitoring chlorosomal bacteriochlorophyll c fluorescence excited by alternate left and right circularly polarized laser light with a fixed wavelength of 733 nm. The excitation wavelength is close to a maximum of the negative CD signal of a bulk solution of the same chlorosomes. The average CD dissymmetry parameter obtained from an ensemble of individual chlorosomes was g_s = -0.025, with an intrinsic standard deviation (due to variations between individual chlorosomes) of 0.006. The dissymmetry value is about 2.5 times larger than that obtained at the same wavelength in the bulk solution. The difference can be satisfactorily explained by taking into account the orientation factor in the single-chlorosome experiments. The observed distribution of the dissymmetry parameter reflects the well-ordered nature of the mutant chlorosomes. © 2012 American Chemical Society.
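    The dissymmetry parameter quoted above is conventionally the Kuhn dissymmetry factor, computed from intensities recorded under left- and right-circularly polarized excitation. A minimal sketch (the intensity values below are illustrative, not measured data):

    ```python
    def dissymmetry(i_left, i_right):
        """Kuhn dissymmetry factor g = 2 (I_L - I_R) / (I_L + I_R)."""
        return 2.0 * (i_left - i_right) / (i_left + i_right)

    # Intensities whose imbalance yields a value near the ensemble
    # average reported above (g_s = -0.025); illustrative only.
    g = dissymmetry(0.9875, 1.0125)
    print(g)  # ~ -0.025
    ```
    
    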

  12. AN ADVANCED CALIBRATION PROCEDURE FOR COMPLEX IMPEDANCE SPECTRUM MEASUREMENTS OF ADVANCED ENERGY STORAGE DEVICES

    Energy Technology Data Exchange (ETDEWEB)

    William H. Morrison; Jon P. Christophersen; Patrick Bald; John L. Morrison

    2012-06-01

    With the increasing demand for electric and hybrid electric vehicles and the explosion in popularity of mobile and portable electronic devices such as laptops, cell phones, e-readers, tablet computers and the like, reliance on portable energy storage devices such as batteries has likewise increased. The concern for the availability of critical systems in turn drives requirements on the availability of battery systems, and thus the need for accurate battery health monitoring has become paramount. Over the past decade the Idaho National Laboratory (INL), Montana Tech of the University of Montana (Tech), and Qualtech Systems, Inc. (QSI) have been developing the Smart Battery Status Monitor (SBSM), an integrated battery management system designed to monitor battery health, performance and degradation and to use this knowledge for effective battery management and increased battery life. Key to the success of the SBSM is an in-situ impedance measurement system called the Impedance Measurement Box (IMB). One of the challenges encountered has been the development of an accurate, simple, robust calibration process. This paper discusses the successful realization of this process.

  13. Circular dichroism measured on single chlorosomal light-harvesting complexes of green photosynthetic bacteria

    KAUST Repository

    Furumaki, Shu; Yabiku, Yu; Habuchi, Satoshi; Tsukatani, Yusuke; Bryant, Donald A.; Vácha, Martin

    2012-01-01

    We report results on circular dichroism (CD) measured on single immobilized chlorosomes of a triple mutant of the green sulfur bacterium Chlorobaculum tepidum. The CD signal is measured by monitoring chlorosomal bacteriochlorophyll c fluorescence excited by alternate left and right circularly polarized laser light with a fixed wavelength of 733 nm. The excitation wavelength is close to a maximum of the negative CD signal of a bulk solution of the same chlorosomes. The average CD dissymmetry parameter obtained from an ensemble of individual chlorosomes was g_s = -0.025, with an intrinsic standard deviation (due to variations between individual chlorosomes) of 0.006. The dissymmetry value is about 2.5 times larger than that obtained at the same wavelength in the bulk solution. The difference can be satisfactorily explained by taking into account the orientation factor in the single-chlorosome experiments. The observed distribution of the dissymmetry parameter reflects the well-ordered nature of the mutant chlorosomes. © 2012 American Chemical Society.

  14. Kinetic measurements of the hydrolytic degradation of cefixime: effect of Captisol complexation and water-soluble polymers.

    Science.gov (United States)

    Mallick, Subrata; Mondal, Arijit; Sannigrahi, Santanu

    2008-07-01

    We have taken kinetic measurements of the hydrolytic degradation of cefixime, and have studied the effect of Captisol complexation and water-soluble polymers on that degradation. The phase solubility of cefixime in Captisol was determined. Kinetic measurements were carried out as a function of pH and temperature. High-performance liquid chromatography (HPLC) was performed to assay all the samples of phase-solubility analysis and kinetic measurements. Chromatographic separation of the degradation products was also performed by HPLC. FT-IR spectroscopy was used to investigate the presence of any interaction between cefixime and Captisol and soluble polymer. The phase-solubility study showed A(L)-type behaviour. The pH-rate profile of cefixime exhibited a U-shaped profile, whilst the degradation of cefixime alone was markedly accelerated at elevated temperature. A strong stabilizing influence of the cefixime-Captisol complexation and hypromellose was observed against aqueous-mediated degradation, as compared with povidone and macrogol. The unfavourable effect of povidone and macrogol may have been due to steric hindrance, which prevented the guest molecule from entering the cyclodextrin cavity, whereas hypromellose did not produce any steric hindrance.
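    A(L)-type phase-solubility behaviour is conventionally analysed with the Higuchi-Connors treatment, in which a 1:1 binding constant follows from the slope of the linear solubility diagram and the intrinsic drug solubility S0. A minimal sketch with illustrative numbers (not values from this study):

    ```python
    def binding_constant_1to1(slope, s0):
        """Higuchi-Connors 1:1 binding constant from an A_L-type
        phase-solubility diagram: K = slope / (S0 * (1 - slope)),
        where slope is the slope of [drug] vs [cyclodextrin] and S0
        is the intrinsic drug solubility (same concentration units)."""
        return slope / (s0 * (1.0 - slope))

    # Illustrative inputs only: slope = 0.2, S0 = 1 mM = 0.001 M
    K = binding_constant_1to1(slope=0.2, s0=0.001)  # units: M^-1
    print(round(K))  # 250
    ```
    
    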

  15. Building a measurement framework of burden of treatment in complex patients with chronic conditions: a qualitative study.

    Science.gov (United States)

    Eton, David T; Ramalho de Oliveira, Djenane; Egginton, Jason S; Ridgeway, Jennifer L; Odell, Laura; May, Carl R; Montori, Victor M

    2012-01-01

    Burden of treatment refers to the workload of health care as well as its impact on patient functioning and well-being. We set out to build a conceptual framework of issues descriptive of burden of treatment from the perspective of the complex patient, as a first step in the development of a new patient-reported measure. We conducted semistructured interviews with patients seeking medication therapy management services at a large, academic medical center. All patients had a complex regimen of self-care (including polypharmacy), and were coping with one or more chronic health conditions. We used framework analysis to identify and code themes and subthemes. A conceptual framework of burden of treatment was outlined from emergent themes and subthemes. Thirty-two patients (20 female, 12 male, age 26-85 years) were interviewed. Three broad themes of burden of treatment emerged including: the work patients must do to care for their health; problem-focused strategies and tools to facilitate the work of self-care; and factors that exacerbate the burden felt. The latter theme encompasses six subthemes including challenges with taking medication, emotional problems with others, role and activity limitations, financial challenges, confusion about medical information, and health care delivery obstacles. We identified several key domains and issues of burden of treatment amenable to future measurement and organized them into a conceptual framework. Further development work on this conceptual framework will inform the derivation of a patient-reported measure of burden of treatment.

  16. The challenging measurement of protein in complex biomass-derived samples

    DEFF Research Database (Denmark)

    Haven, M.O.; Jørgensen, H.

    2014-01-01

    and fast protein measurement on this type of samples was the ninhydrin assay. This method has also been used widely for this purpose, but with two different methods for protein hydrolysis prior to the assay - alkaline or acidic hydrolysis. In samples containing glucose or ethanol, there was significant...... that the presence of cellulose, lignin and glucose (above 50 g/kg) could significantly affect the results of the assay. Comparison of analyses performed with the ninhydrin assay and with a CN analyser revealed that there was good agreement between these two analytical methods, but care has to be taken when applying...... the ninhydrin assay. If used correctly, the ninhydrin assay can be used as a fast method to evaluate the adsorption of cellulases to lignin....

  17. The complex of measures on inclusion of small businesses in innovation clusters

    Directory of Open Access Journals (Sweden)

    A. V. Kupchinsky

    2016-01-01

    Full Text Available Modern management practice, and its reflection in scientific publications, demonstrates the major role and importance of the small business sector in the national economy. In the modern world, national economies are increasingly determined by the balanced and sustainable development of small business structures, now recognized as conductors and creators of new discoveries and technologies and, moreover, as a strategic instrument of structural transformation of a country's economic system, often directed at a qualitative increase in the efficiency of the reproduction process of the regional economy. At present, the level of development of innovative entrepreneurship in Russia is very low, and one can state the lack of a properly formed institutional environment for the development of small entrepreneurship in the innovation sphere. Clusterisation is the process of consolidating a number of organizations from various industries in order to increase competitiveness, implement innovations, develop effectively and obtain other benefits. Reflecting the division of the economy into real and virtual components, the possibility of creating both real and virtual clusters increases. The creation and development of regional clusters will help to generate the necessary level of activity of small business structures in innovation, which will favourably affect the competitiveness of both the regional and the national economy. A package of measures, including measures for involving small business structures in clusters, has been developed to advance the cluster initiative and increase the innovative development of the region. Application of this programme will make it possible to achieve a synergy effect through a high degree of concentration and cooperation of small business structures and an increase in the effectiveness of their activities.

  18. Recommendations for a first Core Outcome Measurement set for complex regional PAin syndrome Clinical sTudies (COMPACT)

    Science.gov (United States)

    Grieve, Sharon; Perez, Roberto SGM; Birklein, Frank; Brunner, Florian; Bruehl, Stephen; Harden R, Norman; Packham, Tara; Gobeil, Francois; Haigh, Richard; Holly, Janet; Terkelsen, Astrid; Davies, Lindsay; Lewis, Jennifer; Thomassen, Ilona; Connett, Robyn; Worth, Tina; Vatine, Jean-Jacques; McCabe, Candida S

    2017-01-01

    Complex Regional Pain Syndrome (CRPS) is a persistent pain condition that remains incompletely understood and challenging to treat. Historically, a wide range of different outcome measures have been used to capture the multidimensional nature of CRPS. This has been a significant limiting factor in the advancement of our understanding of the mechanisms and management of CRPS. In 2013, an international consortium of patients, clinicians, researchers and industry representatives was established to develop and agree on a minimum core set of standardised outcome measures for use in future CRPS clinical research, including but not limited to clinical trials within adult populations. The development of a core measurement set was informed through workshops and supplementary work, using an iterative consensus process. ‘What is the clinical presentation and course of CRPS, and what factors influence it?’ was agreed as the most pertinent research question that our standardised set of patient-reported outcome measures should be selected to answer. The domains encompassing the key concepts necessary to answer the research question were agreed as: pain, disease severity, participation and physical function, emotional and psychological function, self-efficacy, catastrophizing and the patient's global impression of change. The final core measurement set included the optimum generic or condition-specific patient-reported questionnaire outcome measures, which captured the essence of each domain, and one clinician-reported outcome measure to capture the degree of severity of CRPS. The next step is to test the feasibility and acceptability of collecting outcome measure data using the core measurement set in the CRPS population internationally. PMID:28178071

  19. Measurements in Vacuum of the Complex Permittivity of Planetary Regolith Analog Materials in Support of the OSIRIS-REx Mission

    Science.gov (United States)

    Boivin, A.; Hickson, D. C.; Cunje, A.; Tsai, C. A.; Ghent, R. R.; Daly, M. G.

    2017-12-01

    In preparation for the OSIRIS-REx sample return mission, ground-based radar data have been used to help characterize the carbonaceous asteroid (101955) Bennu as well as to produce a 3-D shape model. Radar data have also been used to derive the near-surface bulk density of the asteroid, a key engineering factor for sample acquisition and return. The relationship between radar albedo and bulk density of the near-surface depends on the relative permittivity of the material, in this case regolith. The relative permittivity is complex, such that ε_r = ε_r' + iε_r'', where ε_r' is the dielectric constant and ε_r'' is the loss factor. Laboratory permittivity measurements have been made in the past on a myriad of samples including Earth materials, lunar Apollo and analog samples, Mars soil analog samples, some meteorites, and cometary analog samples in support of the Rosetta mission. These measurements have been made in different frequency bands and in various conditions; however, no measurements to date have systematically explored the effect of changes in mineralogy on the complex permittivity, and particularly the loss tangent (tan δ, the ratio of ε_r'' to ε_r'). The loss tangent controls the absorption of the signal by the material. Continuing our investigation of the effects of mineralogy on these properties, we will present for the first time results of complex permittivity measurements of the UCF/DSI-CI-2 CI asteroid regolith simulant produced by Deep Space Industries Inc. The simulant is mineralogically similar to the CI meteorite Orgueil. CI meteorites are the most spectrally similar meteorites to (101955) Bennu. Since the simulant has been provided to us un-mixed, several sub-samples will be created containing different amounts of carbon, thus allowing us to systematically investigate the effects of carbon content on the permittivity. In order to remove moisture from our samples, powders are baked at 250 °C for 48 hrs prior to being loaded into a coaxial
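    The quantities defined above follow directly from the complex permittivity. A minimal sketch, with illustrative permittivity values rather than measured ones:

    ```python
    def loss_tangent(eps_r):
        """Loss tangent tan(delta) = eps_r'' / eps_r' for a complex
        relative permittivity eps_r = eps_r' + i * eps_r''."""
        return eps_r.imag / eps_r.real

    # Illustrative values only (not measurements of the simulant):
    # dielectric constant 3.0, loss factor 0.045
    eps = complex(3.0, 0.045)
    print(loss_tangent(eps))  # ~0.015
    ```
    
    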

  20. An Attentional Resources-effectiveness Measure in Complex Diagnostic Tasks in NPPs

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2007-01-01

    The main role of the human operators in main control rooms (MCRs) of nuclear power plants (NPPs) is generally to supervise and operate the system. The operator's tasks in NPPs are performed through a series of cognitive activities: monitoring the environment, detecting data or information, understanding and assessing the situation, diagnosing the symptoms, decision-making, planning responses, and implementing the responses. In NPPs, there are a lot of information sources that should be monitored, but the operators have only a limited capacity of attention and memory. Because it is impossible to monitor all information sources, the operators continuously decide where to allocate their attentional resources. This kind of cognitive skill is called selective attention. In order for operators to effectively monitor, detect, and thus understand the state of a system, they should allocate their attentional resources to valuable information sources. Hence, the effectiveness of selective attention is expected to reflect the effectiveness of monitoring, detection, and eventually understanding. In this study, an attentional-resources effectiveness measure is proposed which is based on the cost-benefit (or resource-effectiveness) principle

  1. An Attentional Resources-effectiveness Measure in Complex Diagnostic Tasks in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Jun Su; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2007-07-01

    The main role of the human operators in main control rooms (MCRs) of nuclear power plants (NPPs) is generally to supervise and operate the system. The operator's tasks in NPPs are performed through a series of cognitive activities: monitoring the environment, detecting data or information, understanding and assessing the situation, diagnosing the symptoms, decision-making, planning responses, and implementing the responses. In NPPs, there are a lot of information sources that should be monitored, but the operators have only a limited capacity of attention and memory. Because it is impossible to monitor all information sources, the operators continuously decide where to allocate their attentional resources. This kind of cognitive skill is called selective attention. In order for operators to effectively monitor, detect, and thus understand the state of a system, they should allocate their attentional resources to valuable information sources. Hence, the effectiveness of selective attention is expected to reflect the effectiveness of monitoring, detection, and eventually understanding. In this study, an attentional-resources effectiveness measure is proposed which is based on the cost-benefit (or resource-effectiveness) principle.

  2. Analysis of spontaneous MEG activity in mild cognitive impairment and Alzheimer's disease using spectral entropies and statistical complexity measures

    Science.gov (United States)

    Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto

    2012-06-01

    Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz-Mancini-Calbet (LMC) complexity) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters. To assess discriminative ability, a validation procedure was applied. The accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
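    The three entropy families named above have standard definitions for a normalized distribution p (here a normalized power spectral density). A minimal sketch, where q is the entropic index and the PSD values are illustrative, not MEG data:

    ```python
    import numpy as np

    def shannon(p):
        """Shannon entropy of a normalized distribution p (natural log)."""
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def renyi(p, q):
        """Renyi entropy of order q (q != 1); tends to Shannon as q -> 1."""
        return np.log(np.sum(p ** q)) / (1.0 - q)

    def tsallis(p, q):
        """Tsallis entropy with entropic index q (q != 1)."""
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    # Illustrative normalized power spectral density (not MEG data)
    psd = np.array([0.1, 0.2, 0.3, 0.4])
    print(shannon(psd))       # ~1.2799
    print(renyi(psd, 2.0))    # ~1.2040
    print(tsallis(psd, 2.0))  # ~0.7
    ```
    
    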

  3. Spectroscopic methods for aqueous cyclodextrin inclusion complex binding measurement for 1,4-dioxane, chlorinated co-contaminants, and ozone

    Science.gov (United States)

    Khan, Naima A.; Johnson, Michael D.; Carroll, Kenneth C.

    2018-03-01

    Recalcitrant organic contaminants, such as 1,4-dioxane, typically require advanced oxidation process (AOP) oxidants, such as ozone (O3), for their complete mineralization during water treatment. Unfortunately, the use of AOPs can be limited by these oxidants' relatively high reactivities and short half-lives. These drawbacks can be minimized by partial encapsulation of the oxidants within a cyclodextrin cavity to form inclusion complexes. We determined the inclusion complexes of O3 and three common co-contaminants (trichloroethene, 1,1,1-trichloroethane, and 1,4-dioxane) as guest compounds within hydroxypropyl-β-cyclodextrin. Both direct (ultraviolet or UV) and competitive (fluorescence changes with 6-p-toluidine-2-naphthalenesulfonic acid as the probe) methods were used, which gave comparable results for the inclusion constants of these species. Impacts of changing pH and NaCl concentrations were also assessed. Binding constants increased with pH and with ionic strength, which was attributed to variations in guest compound solubility. The results illustrate the versatility of cyclodextrins for inclusion complexation with various types of compounds; the binding measurement methods are applicable to a wide range of applications and have implications both for the extraction of contaminants and for the delivery of reagents for the treatment of contaminants in wastewater or contaminated groundwater.

  4. The general theory of the Quasi-reproducible experiments: How to describe the measured data of complex systems?

    Science.gov (United States)

    Nigmatullin, Raoul R.; Maione, Guido; Lino, Paolo; Saponaro, Fabrizio; Zhang, Wei

    2017-01-01

    In this paper, we suggest a general theory that enables one to describe experiments associated with reproducible or quasi-reproducible data reflecting the dynamical and self-similar properties of a wide class of complex systems. By a complex system we understand a system for which a model based on microscopic principles and suppositions about the nature of the matter is absent. Such a microscopic model is usually determined as the "best fit" model. The behavior of the complex system relative to a control variable (time, frequency, wavelength, etc.) can be described in terms of the so-called intermediate model (IM). One can prove that the fitting parameters of the IM are associated with the amplitude-frequency response of the segment of the Prony series. The segment of the Prony series, including the set of decomposition coefficients and the set of exponential functions (with k = 1,2,…,K), is limited by the final mode K. The exponential functions of this decomposition depend on time and are found by the original algorithm described in the paper. This approach serves as a logical continuation of the results obtained earlier in [Nigmatullin RR, W. Zhang and Striccoli D. General theory of experiment containing reproducible data: The reduction to an ideal experiment. Commun Nonlinear Sci Numer Simul, 27, (2015), pp 175-192] for reproducible experiments and includes the previous results as a partial case. In this paper, we consider a more complex case, when the available data can form short samplings or exhibit some instability during the process of measurement. We give justified evidence and conditions proving the validity of this theory for the description of a wide class of complex systems in terms of the reduced set of fitting parameters belonging to the segment of the Prony series. The elimination of uncontrollable factors expressed in the form of the apparatus function is discussed. 
To illustrate how to apply the theory and take advantage of its
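    A Prony-series segment of the kind described, y(t) ≈ Σ_{k=1..K} a_k exp(λ_k t), can be estimated with the classical Prony method (linear prediction, polynomial rooting, then linear least squares for the amplitudes). The sketch below is a generic textbook implementation, not the authors' original algorithm:

    ```python
    import numpy as np

    def prony(y, K, dt):
        """Classical Prony method: fit y[n] ~ sum_k a_k * exp(lam_k * n * dt)
        for n = 0..N-1. Returns (complex amplitudes a, exponents lam)."""
        N = len(y)
        # 1) Linear-prediction coefficients c from a least-squares system:
        #    y[n] = c[K-1]*y[n-1] + ... + c[0]*y[n-K]
        A = np.column_stack([y[i:N - K + i] for i in range(K)])
        b = y[K:]
        c = np.linalg.lstsq(A, b, rcond=None)[0]
        # 2) Roots of the characteristic polynomial give the discrete poles z_k
        z = np.roots(np.concatenate(([1.0], -c[::-1])))
        lam = np.log(z.astype(complex)) / dt
        # 3) Amplitudes by linear least squares on the Vandermonde system z_k^n
        V = np.vander(z, N, increasing=True).T
        a = np.linalg.lstsq(V, y.astype(complex), rcond=None)[0]
        return a, lam

    # Recover a two-mode decaying signal (illustrative data, K = 2 modes)
    t = np.arange(50) * 0.1
    y = 2.0 * np.exp(-0.5 * t) + np.exp(-2.0 * t)
    a, lam = prony(y, 2, 0.1)
    print(sorted(lam.real))  # exponents close to -2.0 and -0.5
    ```
    
    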

  5. An investigation of ozone and planetary boundary layer dynamics over the complex topography of Grenoble combining measurements and modeling

    Directory of Open Access Journals (Sweden)

    O. Couach

    2003-01-01

    Full Text Available This paper concerns an evaluation of ozone (O3) and planetary boundary layer (PBL) dynamics over the complex topography of the Grenoble region through a combination of measurements and mesoscale model (METPHOMOD) predictions for three days during July 1999. The measurements of O3 and PBL structure were obtained with a Differential Absorption Lidar (DIAL) system, situated 20 km south of Grenoble at Vif (310 m ASL). The combined lidar observations and model calculations are in good agreement with atmospheric measurements obtained with an instrumented aircraft (METAIR). Ozone fluxes were calculated using lidar measurements of vertical ozone concentration profiles and the horizontal wind speeds measured with a Radar Doppler wind profiler (DEGREANE). The ozone flux patterns indicate that the diurnal cycle of ozone production is controlled by local thermal winds. The convective PBL maximum height was some 2700 m above the land surface, while the nighttime residual ozone layer was generally found between 1200 and 2200 m. Finally, we evaluate the magnitude of the ozone processes at different altitudes in order to estimate the photochemical ozone production due to the primary pollutant emissions of Grenoble city and the regional network of automobile traffic.

  6. Unstable work histories and fertility in France: An adaptation of sequence complexity measures to employment trajectories

    Directory of Open Access Journals (Sweden)

    Daniel Ciganda

    2015-04-01

    Full Text Available Background: The emergence of new evidence suggesting a sign shift in the long-standing negative correlation between prosperity and fertility levels has sparked a renewed interest in understanding the relationship between economic conditions and fertility decisions. In this context, the notion of uncertainty has gained relevance in analyses of low fertility. So far, most studies have approached this notion using snapshot indicators such as type of contract or employment situation. However, these types of measures seem to be falling short in capturing what is intrinsically a dynamic process. Objective: Our first objective is to analyze to what extent employment trajectories have become less stable over time, and the second, to determine whether or not employment instability has an impact on the timing and quantum of fertility in France. Additionally, we present a new indicator of employment instability that takes into account both the frequency and duration of unemployment, with the objective of comparing its performance against other, more commonly used indicators of economic uncertainty. Methods: Our study combines exploratory (Sequence Analysis) with confirmatory (Event History, Logistic Regression) methods to understand the relationship between early life-course uncertainty and the timing and intensity of fertility. We use employment histories from the three available waves of the Etude des relations familiales et intergenerationnelles (ERFI), a panel survey carried out by INED and INSEE which constitutes the base of the Generations and Gender Survey (GGS) in France. Results: Although France is characterized by strong family policies and high and stable fertility levels, we find that employment instability not only has a strong and persistent negative effect on the final number of children for both men and women, but also contributes to fertility postponement in the case of men. Regarding the timing of the transition to motherhood, we show how
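    As an illustration of an instability indicator that combines the frequency and duration of unemployment, in the spirit of (but not identical to) the indicator proposed in the study, one might score a monthly employment sequence as follows. All names and weights here are illustrative assumptions:

    ```python
    def instability_index(states, unemployed="U"):
        """Illustrative instability score for a monthly employment sequence:
        an equal-weight combination of the normalized number of unemployment
        spells (frequency) and the fraction of time spent unemployed
        (duration). Not the exact indicator used in the study."""
        # Count spells: an unemployment month not preceded by one starts a spell
        spells = sum(1 for i, s in enumerate(states)
                     if s == unemployed and (i == 0 or states[i - 1] != unemployed))
        share = states.count(unemployed) / len(states)
        max_spells = (len(states) + 1) // 2  # upper bound on spell count
        return 0.5 * (spells / max_spells) + 0.5 * share

    traj = list("EEEUUEEUEE")  # E = employed, U = unemployed (toy trajectory)
    print(round(instability_index(traj), 3))  # 0.35
    ```
    
    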

  7. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  8. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Full Text Available Abstract Background Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation ‘processes’ were significantly related to staff members’ perceptions of whether or not e-health had become ‘routine’. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  9. Origin of middle rare earth element enrichments in acid waters of a Canadian high Arctic lake.

    Science.gov (United States)

    Johannesson, Kevin H.; Zhou, Xiaoping

    1999-01-01

    Middle rare earth element (MREE) enriched rock-normalized rare earth element (REE) patterns of a dilute acidic lake (Colour Lake) in the Canadian High Arctic were investigated by quantifying whole-rock REE concentrations of rock samples collected from the catchment basin, as well as by determining the acid-leachable REE fraction of these rocks. An aliquot of each rock sample was leached with 1 N HNO3 to examine the readily leachable REE fraction of each rock, and an additional aliquot was leached with a 0.04 M NH2OH·HCl in 25% (v/v) CH3COOH solution, designed specifically to reduce Fe-Mn oxides/oxyhydroxides. Rare earth elements in the leachates from clastic sedimentary rock samples containing petrographically identifiable Fe-Mn oxide/oxyhydroxide cements and/or minerals/amorphous phases exhibited whole-rock-normalized REE patterns similar to the lake waters, whereas whole-rock-normalized leachates from mafic igneous rocks and other clastic sedimentary rocks from the catchment basin differed substantially from the lake waters. The whole-rock, leachate, and lake water REE data support acid leaching or dissolution of MREE-enriched Fe-Mn oxides/oxyhydroxides contained and identified within some of the catchment basin sedimentary rocks as the likely source of the unique lake water REE patterns. Solution complexation modelling of the REEs in the inflow streams and lake waters indicates that free metal ions (e.g., Ln3+, where Ln = any REE) and sulfate complexes (LnSO4+) are the dominant forms of dissolved REEs. Consequently, solution complexation reactions involving the REEs during weathering, transport to the lake, or within the lake cannot be invoked to explain the MREE enrichments observed in the lake waters.
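
The dominance of free Ln3+ over LnSO4+ described above follows from a simple 1:1 complexation equilibrium. A minimal sketch of such a speciation calculation, assuming a purely illustrative stability constant (the `log_beta1` value and the sulfate level below are hypothetical, not the paper's data):

```python
def ree_speciation(so4_free_molar, log_beta1=3.6):
    """Mole fractions of Ln3+ and LnSO4+ for a 1:1 sulfate complex.

    log_beta1 is an assumed, illustrative stability constant for
    Ln3+ + SO4(2-) <=> LnSO4+; real values vary by REE, temperature,
    and ionic strength.
    """
    beta1 = 10.0 ** log_beta1
    ratio = beta1 * so4_free_molar          # [LnSO4+]/[Ln3+]
    x_free = 1.0 / (1.0 + ratio)
    x_sulfate = ratio / (1.0 + ratio)
    return x_free, x_sulfate

# Example: 1 mM free sulfate -> the complex becomes a major dissolved form
x_free, x_so4 = ree_speciation(1e-3)
```

The bound fraction rises with free sulfate, which is why sulfate-rich acidic waters carry a substantial LnSO4+ pool alongside the free ion.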

  10. Interaction of Cucurbit(5)uril with U(VI) in formic acid-water medium

    International Nuclear Information System (INIS)

    Rawat, Neetika; Kar, Aishwarya; Tomar, B.S.; Nayak, S.K.; Mohapatra, M.

    2015-01-01

    Cucurbit(n)urils (CBn) are a new class of macrocyclic cage compounds capable of binding organic and inorganic species, owing to their unique pumpkin-like structure comprising both a hydrophobic cavity and hydrophilic portals. Complexation of U(VI) with Cucurbit(5)uril (CB5) in 50 wt% formic acid medium has been studied by UV-Vis spectroscopy. In order to understand the species formed, the interaction of formic acid with CB5 was studied by monitoring the fluorescence of CB5. Formic acid was found to form a 1:1 species with an interaction constant (K) of 17.4 M⁻¹. (author)
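
The reported 1:1 interaction constant can be extracted from a fluorescence titration by fitting a binding isotherm. A hedged sketch under simplifying assumptions (signal linear in the bound fraction; the synthetic data, signal levels, and concentration range are illustrative, with K set to the abstract's 17.4 M⁻¹):

```python
import numpy as np

def binding_isotherm(conc, k, f0, f1):
    """1:1 host-guest model: observed signal shifts from F0 (free) to F1 (bound)."""
    frac = k * conc / (1.0 + k * conc)      # bound fraction of host
    return f0 + (f1 - f0) * frac

def fit_k(conc, signal, k_grid):
    """Grid-search K; for each trial K, F0 and F1 follow from linear least squares."""
    best_k, best_rss = None, np.inf
    for k in k_grid:
        frac = k * conc / (1.0 + k * conc)
        A = np.column_stack([1.0 - frac, frac])    # columns weight F0, F1
        coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
        rss = np.sum((A @ coef - signal) ** 2)
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

# Synthetic noiseless titration generated with K = 17.4 M^-1
conc = np.linspace(0.0, 0.5, 20)                   # guest concentration, M
signal = binding_isotherm(conc, 17.4, 100.0, 40.0)
k_est = fit_k(conc, signal, np.linspace(1.0, 40.0, 391))
```

With real, noisy titration data the same grid-plus-linear-solve structure applies; only the residual minimum broadens.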

  11. Ensemble Classification of Alzheimer's Disease and Mild Cognitive Impairment Based on Complex Graph Measures from Diffusion Tensor Images

    Science.gov (United States)

    Ebadi, Ashkan; Dalboni da Rocha, Josué L.; Nagaraju, Dushyanth B.; Tovar-Moll, Fernanda; Bramati, Ivanei; Coutinho, Gabriel; Sitaram, Ranganatha; Rashidi, Parisa

    2017-01-01

    The human brain is a complex network of interacting regions. The gray matter regions of the brain are interconnected by white matter tracts, together forming one integrative complex network. In this article, we report our investigation of the potential of applying brain connectivity patterns as an aid in diagnosing Alzheimer's disease and Mild Cognitive Impairment (MCI). We performed pattern analysis of graph theoretical measures derived from Diffusion Tensor Imaging (DTI) data representing structural brain networks of 45 subjects, consisting of 15 patients with Alzheimer's disease (AD), 15 patients with MCI, and 15 healthy subjects (CT). We considered pair-wise class combinations of subjects, defining three separate classification tasks, i.e., AD-CT, AD-MCI, and CT-MCI, and used an ensemble classification module to perform the classification tasks. Our ensemble framework with feature selection shows a promising performance, with classification accuracy of 83.3% for AD vs. MCI, 80% for AD vs. CT, and 70% for MCI vs. CT. Moreover, our findings suggest that AD can be related to graph-measure abnormalities at Brodmann areas in the sensorimotor cortex and piriform cortex. In this way, node redundancy coefficient and load centrality in the primary motor cortex were recognized as good indicators of AD in contrast to MCI. In general, load centrality, betweenness centrality, and closeness centrality were found to be the most relevant network measures, as they were the top identified features at different nodes. The present study can be regarded as a “proof of concept” for a procedure for the classification of MRI markers between AD dementia, MCI, and normal old individuals, given the small and not well-defined groups of AD and MCI patients. Future studies with larger samples of subjects and more sophisticated patient exclusion criteria are necessary for the development of a more precise technique for clinical diagnosis. PMID:28293162

  12. New Parameterizations for Neutral and Ion-Induced Sulfuric Acid-Water Particle Formation in Nucleation and Kinetic Regimes

    Science.gov (United States)

    Määttänen, Anni; Merikanto, Joonas; Henschel, Henning; Duplissy, Jonathan; Makkonen, Risto; Ortega, Ismael K.; Vehkamäki, Hanna

    2018-01-01

    We have developed new parameterizations of electrically neutral homogeneous and ion-induced sulfuric acid-water particle formation for large ranges of environmental conditions, based on an improved model that has been validated against a particle formation rate data set produced by Cosmics Leaving OUtdoor Droplets (CLOUD) experiments at the European Organization for Nuclear Research (CERN). The model uses a thermodynamically consistent version of the Classical Nucleation Theory normalized using quantum chemical data. Unlike the earlier parameterizations for H2SO4-H2O nucleation, the model is applicable to extremely dry conditions where the one-component sulfuric acid limit is approached. Parameterizations are presented for the critical cluster sulfuric acid mole fraction, the critical cluster radius, the total number of molecules in the critical cluster, and the particle formation rate. If the critical cluster contains only one sulfuric acid molecule, a simple formula for kinetic particle formation can be used; this threshold has also been parameterized. The parameterization for electrically neutral particle formation is valid for the following ranges: temperatures 165-400 K, sulfuric acid concentrations 10⁴-10¹³ cm⁻³, and relative humidities 0.001-100%. The ion-induced particle formation parameterization is valid for temperatures 195-400 K, sulfuric acid concentrations 10⁴-10¹⁶ cm⁻³, and relative humidities 10⁻⁵-100%. The new parameterizations are thus applicable for the full range of conditions in the Earth's atmosphere relevant for binary sulfuric acid-water particle formation, including both tropospheric and stratospheric conditions. They are also suitable for describing particle formation in the atmosphere of Venus.
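
A practical consumer of such parameterizations must guard against calling them outside the stated validity windows. A minimal sketch encoding the ranges quoted above (the function names are hypothetical, not part of the published parameterization code):

```python
def in_neutral_validity_range(temp_k, h2so4_cm3, rh_percent):
    """Validity window for the neutral-particle-formation parameterization
    as stated in the abstract: 165-400 K, 1e4-1e13 cm^-3, 0.001-100% RH."""
    return (165.0 <= temp_k <= 400.0
            and 1e4 <= h2so4_cm3 <= 1e13
            and 0.001 <= rh_percent <= 100.0)

def in_ion_induced_validity_range(temp_k, h2so4_cm3, rh_percent):
    """Ion-induced window: 195-400 K, 1e4-1e16 cm^-3, 1e-5-100% RH."""
    return (195.0 <= temp_k <= 400.0
            and 1e4 <= h2so4_cm3 <= 1e16
            and 1e-5 <= rh_percent <= 100.0)

# Example: a cold upper-tropospheric case sits inside the neutral window
ok_neutral = in_neutral_validity_range(208.0, 1e7, 30.0)
```

In a host model, such checks would typically clamp inputs or fall back to zero formation rates rather than extrapolate.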

  13. A microfluidic device for simultaneous measurement of viscosity and flow rate of blood in a complex fluidic network.

    Science.gov (United States)

    Jun Kang, Yang; Yeom, Eunseop; Lee, Sang-Joon

    2013-01-01

    Blood viscosity has been considered one of the important biophysical parameters for effectively monitoring variations in physiological and pathological conditions of circulatory disorders. Conventional methods make it difficult to evaluate variations of blood viscosity under cardiopulmonary bypass procedures or hemodialysis. In this study, we proposed a unique microfluidic device for simultaneously measuring the viscosity and flow rate of whole blood circulating in a complex fluidic network including a rat, a reservoir, a pinch valve, and a peristaltic pump. To demonstrate the proposed method, a twin-shaped microfluidic device, composed of two half-circular chambers, two side channels with multiple indicating channels, and one bridge channel, was carefully designed. Based on the microfluidic device, three sequential flow controls were applied to identify the viscosity and flow rate of blood, with label-free and sensorless detection. The half-circular chamber was employed to achieve mechanical membrane compliance for flow stabilization in the microfluidic device. To quantify the effect of flow stabilization on flow fluctuations, a formula for the pulsation index (PI) was analytically derived using a discrete fluidic circuit model. Using the PI formula, the time constant contributed by the half-circular chamber is estimated to be 8 s. Furthermore, flow fluctuations resulting from the peristaltic pumps are completely removed, especially under periodic flow conditions within short periods (T viscosity with respect to varying flow rate conditions [(a) known blood flow rate via a syringe pump, (b) unknown blood flow rate via a peristaltic pump]. As a result, the flow rate and viscosity of blood can be simultaneously measured with satisfactory accuracy. In addition, the proposed method was successfully applied to identify the viscosity of rat blood circulating in a complex fluidic network. These observations confirm that the proposed method can be used for
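
The flow-stabilizing effect of the compliant chamber can be pictured as a first-order (RC-like) low-pass filter with the quoted 8 s time constant. A hedged sketch, assuming the network reduces to a single such element (a deliberate simplification of the paper's discrete fluidic circuit model):

```python
import numpy as np

def attenuation(period_s, tau_s=8.0):
    """First-order attenuation of a periodic flow fluctuation by a compliant
    chamber with time constant tau (8 s per the abstract): |H| = 1/sqrt(1+(w*tau)^2)."""
    omega = 2.0 * np.pi / period_s
    return 1.0 / np.sqrt(1.0 + (omega * tau_s) ** 2)

# Short-period pump pulsations are strongly damped; slow drifts pass through
short = attenuation(1.0)     # 1 s peristaltic pulsation
slow = attenuation(600.0)    # 10 min drift
```

This is consistent with the abstract's observation that pulsations with periods well below the time constant are effectively removed.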

  14. Rotational study of the CH4–CO complex: Millimeter-wave measurements and ab initio calculations

    International Nuclear Information System (INIS)

    Surin, L. A.; Tarabukin, I. V.; Panfilov, V. A.; Schlemmer, S.; Kalugina, Y. N.; Faure, A.; Rist, C.; Avoird, A. van der

    2015-01-01

    The rotational spectrum of the van der Waals complex CH4–CO has been measured with the intracavity OROTRON jet spectrometer in the frequency range of 110–145 GHz. Newly observed and assigned transitions belong to the K = 2–1 subband correlating with the rotationless jCH4 = 0 ground state and the K = 2–1 and K = 0–1 subbands correlating with the jCH4 = 2 excited state of free methane. The (approximate) quantum number K is the projection of the total angular momentum J on the intermolecular axis. The new data were analyzed together with the known millimeter-wave and microwave transitions in order to determine the molecular parameters of the CH4–CO complex. Accompanying ab initio calculations of the intermolecular potential energy surface (PES) of CH4–CO have been carried out at the explicitly correlated coupled cluster level of theory with single, double, and perturbative triple excitations [CCSD(T)-F12a] and an augmented correlation-consistent triple zeta (aVTZ) basis set. The global minimum of the five-dimensional PES corresponds to an approximately T-shaped structure with the CH4 face closest to the CO subunit and binding energy De = 177.82 cm⁻¹. The bound rovibrational levels of the CH4–CO complex were calculated for total angular momentum J = 0–6 on this intermolecular potential surface and compared with the experimental results. The calculated dissociation energies D0 are 91.32, 94.46, and 104.21 cm⁻¹ for the A (jCH4 = 0), F (jCH4 = 1), and E (jCH4 = 2) nuclear spin modifications of CH4–CO, respectively.
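
The binding and dissociation energies above are quoted in spectroscopic units (cm⁻¹); converting them to molar energy units is a short constants exercise using E = h·c·ν̃·N_A:

```python
# CODATA constants
H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e10       # speed of light, cm/s (so wavenumbers stay in cm^-1)
N_A = 6.02214076e23     # Avogadro number, 1/mol

def wavenumber_to_kj_per_mol(nu_cm):
    """Convert an energy in cm^-1 to kJ/mol (1 cm^-1 ~ 0.01196 kJ/mol)."""
    return H * C * nu_cm * N_A / 1000.0

d_e = wavenumber_to_kj_per_mol(177.82)   # well depth De from the abstract
d_0 = wavenumber_to_kj_per_mol(91.32)    # D0 for the A nuclear spin species
```

Both values come out near 1-2 kJ/mol, i.e., a typically weak van der Waals bond.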

  15. Increasing the sensitivity of NMR diffusion measurements by paramagnetic longitudinal relaxation enhancement, with application to ribosome–nascent chain complexes

    International Nuclear Information System (INIS)

    Chan, Sammy H. S.; Waudby, Christopher A.; Cassaignau, Anaïs M. E.; Cabrita, Lisa D.; Christodoulou, John

    2015-01-01

    The translational diffusion of macromolecules can be examined non-invasively by stimulated echo (STE) NMR experiments to accurately determine their molecular sizes. These measurements can be important probes of intermolecular interactions and protein folding and unfolding, and are crucial in monitoring the integrity of large macromolecular assemblies such as ribosome–nascent chain complexes (RNCs). However, NMR studies of these complexes can be severely constrained by their slow tumbling, low solubility (with maximum concentrations of up to 10 μM), and short lifetimes resulting in weak signal, and therefore continuing improvements in experimental sensitivity are essential. Here we explore the effect of the paramagnetic longitudinal relaxation enhancement (PLRE) agent NiDO2A on the sensitivity of 15N XSTE and SORDID heteronuclear STE experiments, which can be used to monitor the integrity of these unstable complexes. We exploit the dependence of the PLRE effect on the gyromagnetic ratio and electronic relaxation time to accelerate recovery of 1H magnetization without adversely affecting storage on Nz during diffusion delays or introducing significant transverse relaxation line broadening. By applying the longitudinal relaxation-optimized SORDID pulse sequence together with NiDO2A to 70S Escherichia coli ribosomes and RNCs, NMR diffusion sensitivity enhancements of up to 4.5-fold relative to XSTE are achieved, alongside ∼1.9-fold improvements in two-dimensional NMR sensitivity, without compromising the sample integrity. We anticipate these results will significantly advance the use of NMR to probe dynamic regions of ribosomes and other large, unstable macromolecular assemblies.
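
The sensitivity gain from PLRE can be rationalized with a simple steady-state model: for repeated scans, SNR per unit time scales as (1 - exp(-Trec/T1))/sqrt(Trec), so at the optimal recycle delay it goes as 1/sqrt(T1). A sketch with hypothetical T1 values (ignoring acquisition overheads and any paramagnetic line broadening):

```python
import numpy as np

def sensitivity_per_unit_time(t_rec, t1):
    """Relative SNR per sqrt(time) for repeated scans: recovered longitudinal
    magnetization divided by the square root of the per-scan duration."""
    return (1.0 - np.exp(-t_rec / t1)) / np.sqrt(t_rec)

def best_sensitivity(t1, t_rec_grid):
    """Best achievable SNR per sqrt(time) over a grid of recycle delays."""
    return max(sensitivity_per_unit_time(t, t1) for t in t_rec_grid)

grid = np.linspace(0.05, 10.0, 2000)   # candidate recycle delays, s
# Hypothetical T1 shortening from 1.2 s to 0.4 s by a relaxation agent
gain = best_sensitivity(0.4, grid) / best_sensitivity(1.2, grid)
```

The predicted gain is sqrt(1.2/0.4) = sqrt(3) ≈ 1.73, illustrating why even a modest T1 reduction is worthwhile for signal-starved samples.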

  16. LIQUID-LIQUID EQUILIBRIA OF THE TERNARY SYSTEMS PROPIONIC ACID - WATER - SOLVENT (n-AMYL ALCOHOL AND n-AMYL ACETATE)

    Directory of Open Access Journals (Sweden)

    Dilek ÖZMEN

    2005-02-01

    Full Text Available The experimental liquid-liquid equilibrium (LLE) data have been obtained at 25 °C for the ternary systems propionic acid-water-n-amyl alcohol and propionic acid-water-n-amyl acetate. The reliability of the experimental tie-line data was checked using the methods of Othmer-Tobias and Hand. The distribution coefficients and separation factors were obtained from the experimental results and are also reported. The tie-line data predicted by the UNIFAC method are compared with the experimental data. It is concluded that n-amyl alcohol and n-amyl acetate are suitable separating agents for dilute aqueous propionic acid solutions.
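
The distribution coefficients and separation factors mentioned above are simple ratios computed from tie-line compositions. A sketch with illustrative numbers (not the paper's data):

```python
def distribution_coefficient(x_org, x_aq):
    """D = (fraction of a component in the organic phase) / (in the aqueous phase)."""
    return x_org / x_aq

def separation_factor(d_acid, d_water):
    """S = D_acid / D_water; S > 1 means the solvent extracts acid preferentially."""
    return d_acid / d_water

# Hypothetical tie-line point: acid partitions into the solvent, water does not
d_acid = distribution_coefficient(0.30, 0.10)
d_water = distribution_coefficient(0.08, 0.85)
s = separation_factor(d_acid, d_water)
```

A separation factor well above 1, as here, is what qualifies a solvent as a "suitable separating agent" in the abstract's sense.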

  17. Hot Air Balloon Experiments to Measure the Break-up of the Nocturnal Drainage Flow in Complex Terrain.

    Science.gov (United States)

    Berman, N. S.; Fernando, H. J. S.; Colomer, J.; Levy, M.; Zieren, L.

    1997-11-01

    In order to extend our understanding of thermally driven atmospheric winds and their influence on pollutant transport, a hot air balloon experiment was conducted over a four-day period in June 1997 near Nogales, Arizona. The focus was on the early morning break-up of the stable down-slope and down-valley flow and the establishment of a convective boundary layer near the surface in the absence of synoptic winds. Temperature, elevation, position, and particulate matter concentration were measured aloft, and temperature gradient and wind velocity were measured at ground level. The wind velocity within the stable layer was generally less than 1.5 m/s. Just above the stable layer (about 300 meters above the valley) the wind shifted, leading to an erosion of the stable layer from above. Surface heating after sunrise created a convective layer that rose from the ground until the stable layer was destroyed. Examples of temperature fluctuation measurements at various elevations during the establishment of the convective flow will be presented. Implications of the results for the turbulence parameterizations needed in numerical models of wind fields in complex terrain will be discussed.

  18. Considerations and Optimization of Time-Resolved PIV Measurements near Complex Wind-Generated Air-Water Wave Interface

    Science.gov (United States)

    Stegmeir, Matthew; Markfort, Corey

    2017-11-01

    Time-resolved PIV measurements are applied on both sides of the air-water interface in order to study the coupling between air and water motion. The multi-scale and three-dimensional nature of the wave structure poses several unique challenges to generating optimal-quality data very near the fluid interface. High resolution and dynamic range in space and time are required to resolve the relevant flow scales along a complex and ever-changing interface. Characterizing the two-way coupling across the air-water interface poses unique challenges for optical measurement techniques. Approaches to obtaining near-boundary measurements on both sides of the interface are discussed, including optimal flow-seeding procedures, illumination, data analysis, and interface tracking. The techniques are applied in the IIHR Boundary-Layer Wind-Wave Tunnel, and example results are presented for both sides of the interface. The facility combines a 30 m long recirculating water channel with an open-return boundary-layer wind tunnel, allowing the study of boundary-layer turbulence interacting with a wind-driven wave field.

  19. A novel technique for phase synchrony measurement from the complex motor imaginary potential of combined body and limb action

    Science.gov (United States)

    Zhou, Zhong-xing; Wan, Bai-kun; Ming, Dong; Qi, Hong-zhi

    2010-08-01

    In this study, we proposed and evaluated the use of the empirical mode decomposition (EMD) technique combined with phase synchronization analysis to investigate the human brain synchrony of the supplementary motor area (SMA) and primary motor area (M1) during complex motor imagination of combined body and limb action. We separated the EEG data of the SMA and M1 into intrinsic mode functions (IMFs) using the EMD method and determined the characteristic IMFs by power spectral density (PSD) analysis. Thereafter, the instantaneous phases of the characteristic IMFs were obtained by the Hilbert transformation, and the single-trial phase-locking value (PLV) features for brain synchrony measurement between the SMA and M1 were investigated separately. The classification performance suggests that the proposed approach is effective for phase synchronization analysis and is promising for the application of a brain-computer interface in motor nerve reconstruction of the lower limbs.
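
The single-trial PLV described above is the magnitude of the mean unit phasor of the instantaneous phase difference between two signals. A self-contained sketch using an FFT-based analytic signal (equivalent to the Hilbert-transform step; the test signals are synthetic sinusoids, not EEG):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT construction (as in scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV: magnitude of the mean unit phasor of the phase difference (0..1)."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

# Two noiseless sinusoids with a fixed phase lag are perfectly phase-locked
t = np.linspace(0.0, 2.0, 1000, endpoint=False)
plv = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                          np.sin(2 * np.pi * 10 * t + 0.7))
```

In the paper's pipeline this computation would be applied to the characteristic IMFs of the SMA and M1 channels rather than raw signals.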

  20. The Multilateral Convention to Implement Tax Treaty Related Measures to Prevent BEPS—Some Thoughts on Complexity and Uncertainty

    Directory of Open Access Journals (Sweden)

    Kleist David

    2018-04-01

    Full Text Available The Multilateral Convention to Implement Tax Treaty Related Measures to Prevent Base Erosion and Profit Shifting (MLI), which was signed in June 2017, raises a multitude of questions relating not only to the text of the treaty provisions but also to the way the MLI will interact with tax treaties, for instance, and to what it will mean for the future development of tax treaty law and international cooperation in tax matters. This article focuses on two aspects of the MLI. First, it deals with the substance of the MLI by providing an overview of its background and content, including the many options available to the contracting states under the MLI. Second, some thoughts are presented on the effects of the MLI in terms of complexity and uncertainty.

  1. Will hypertension performance measures used for pay-for-performance programs penalize those who care for medically complex patients?

    Science.gov (United States)

    Petersen, Laura A; Woodard, Lechauncy D; Henderson, Louise M; Urech, Tracy H; Pietz, Kenneth

    2009-06-16

    There is concern that performance measures, patient ratings of their care, and pay-for-performance programs may penalize healthcare providers of patients with multiple chronic coexisting conditions. We examined the impact of coexisting conditions on the quality of care for hypertension and on patient perception of the overall quality of their health care. We classified 141 609 veterans with hypertension into 4 condition groups: those with hypertension-concordant (diabetes mellitus, ischemic heart disease, dyslipidemia) and/or -discordant (arthritis, depression, chronic obstructive pulmonary disease) conditions, or neither. We measured blood pressure control at the index visit, overall good quality of care for hypertension, including a follow-up interval, and patient ratings of satisfaction with their care. Associations of condition type and number of coexisting conditions with receipt of overall good quality of care were assessed with logistic regression. The relationship between patient assessment and objective measures of quality was also assessed. Of the cohort, 49.5% had concordant-only comorbidities, 8.7% had discordant-only comorbidities, 25.9% had both, and 16.0% had none. Odds of receiving overall good quality of care after adjustment for age were higher for those with concordant comorbidities (odds ratio, 1.78; 95% confidence interval, 1.70 to 1.87), discordant comorbidities (odds ratio, 1.32; 95% confidence interval, 1.23 to 1.41), or both (odds ratio, 2.25; 95% confidence interval, 2.13 to 2.38) compared with neither. The findings did not change after adjustment for illness severity and/or number of primary care and specialty care visits. Patient assessment of quality did not vary by the presence of coexisting conditions and was not related to objective ratings of quality of care. Contrary to expectations, patients with greater complexity had higher odds of receiving high-quality care for hypertension. Subjective ratings of care did not vary with the presence or absence of
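
The odds ratios and confidence intervals quoted above are the standard exponentiated outputs of logistic regression. A sketch of the back-transformation (the standard error below is back-computed for illustration so the interval roughly matches the reported 1.70 to 1.87; it is not taken from the paper):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio with 95% CI from a logistic-regression coefficient:
    OR = exp(beta), CI = exp(beta -/+ z*se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# beta chosen so OR equals the concordant-comorbidity estimate of 1.78;
# se = 0.024 is a hypothetical value consistent with the reported interval
or_est, ci_lo, ci_hi = odds_ratio_ci(math.log(1.78), 0.024)
```

Because the interval is symmetric on the log-odds scale, it is slightly asymmetric around the odds ratio itself.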

  2. Study of the liquid vapor equilibrium in the bromine-hydrobromic acid-water system

    Science.gov (United States)

    Benizri, R.; Lessart, P.; Courvoisier, P.

    1984-01-01

    A glass ebullioscope was built, and liquid-vapor equilibria of the Br2-HBr-H2O system were studied at atmospheric pressure in the concentration range of interest for evaluation of the Mark 13 cycle. Measurements were performed for the bromine-azeotrope (HBr-H2O) pseudo-binary system and for the ternary system at temperatures below 125 C and at bromine concentrations up to 13 wt%.

  3. FURFURAL YIELD AND DECOMPOSITION IN SODIUM 2,4-DIMETHYLBENZENESULFONATE-SULFURIC ACID-WATER SOLUTIONS.

    Science.gov (United States)

    Batch-type microreactors (about 1/40 milliliter of reactants) were used to measure furfural yields from acidified xylose solutions containing sodium... It was found that the presence of the salt did not affect the quantity of furfural produced, but greatly increased the rate of formation. The regular... increase in the rate of furfural formation was directly related to the increase in the rate of xylose decomposition, and furfural yields for all salt and acid

  4. Acid Water Neutralization Using Microbial Fuel Cells: An Alternative for Acid Mine Drainage Treatment

    Directory of Open Access Journals (Sweden)

    Eduardo Leiva

    2016-11-01

    Full Text Available Acid mine drainage (AMD) is a complex environmental problem, which has adverse effects on surface and ground waters due to low pH, high toxic metals, and dissolved salts. A new bioremediation approach based on microbial fuel cells (MFCs) can be a novel and sustainable alternative for AMD treatment. We studied the potential of MFCs for acidic synthetic water treatment through pH neutralization in batch-mode and continuous-flow operation. We observed a marked pH increase, from ~3.7 to ~7.9 under batch conditions and to ~5.8 under continuous-flow operation. Likewise, batch reactors (non-MFC) inoculated with different MFC-enriched biofilms showed a very similar pH increase, suggesting that the neutralization observed for batch operation was due to a synergistic influence of these communities. These preliminary results support the idea of using MFC technologies for AMD remediation, which could help to reduce the costs associated with conventional technologies. Advances in this configuration could even be extrapolated to the recovery of heavy metals by precipitation or adsorption processes due to the acid neutralization.
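
Because pH is logarithmic, the reported batch-mode shift is easier to appreciate as a proton-concentration ratio: going from pH ~3.7 to ~7.9 is roughly a 16,000-fold drop in [H+]. A one-line check:

```python
def proton_concentration(ph):
    """[H+] in mol/L from pH (activity effects ignored)."""
    return 10.0 ** (-ph)

# Fold reduction in [H+] for the batch-mode neutralization reported above
fold_reduction = proton_concentration(3.7) / proton_concentration(7.9)
```

The continuous-flow endpoint of pH ~5.8 corresponds to a smaller, roughly 125-fold, reduction.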

  5. [Complex of psycho-hygienic correction measures of personality features of hiv-infected men and evaluation of their efficiency].

    Science.gov (United States)

    Serheta, Ihor V; Dudarenko, Oksana B; Mostova, Olha P; Lobastova, Tetiana V; Andriichuk, Vitalii M; Vakolyuk, Larysa M; Yakubovska, Olha M

    2018-01-01

    Introduction: In addition to adequate diagnosis and treatment of HIV-infected individuals, the development, scientific substantiation, and implementation of psycho-hygienic measures aimed at correcting the formation of personality traits and improving the psycho-emotional state of HIV-infected individuals are of particular importance. The aim: The purpose of the research was to determine the most significant changes in situational and personal anxiety indicators and in the severity of asthenia and depressive manifestations recorded in the context of the introduction of a number of measures for psycho-hygienic correction. Materials and methods: To determine the effects of the proposed measures of psycho-hygienic correction and to study the consequences of their implementation, two comparison groups were created: a control group and an intervention group. The intervention group included 30 HIV-infected men who used a complex of measures for psycho-hygienic correction of personality traits and improvement of psycho-emotional state in their daily activities; the control group included 30 HIV-infected men who did not use this complex. Diagnosis and assessment of the anxiety of HIV-infected persons were carried out on the basis of the State-Trait Anxiety Inventory (STAI). The absence or presence of manifestations of an asthenic personality disorder was determined by means of a test method created by L. Malkova for assessing asthenia. The degree of expression of a depressive state was assessed with the psychometric Zung Self-Rating Depression Scale.
    Results: The studies found a statistically significant decrease in the level of situational anxiety among the representatives of the intervention group, which reduced from

  6. The importance of chemical buffering for pelagic and benthic colonization in acidic waters

    International Nuclear Information System (INIS)

    Nixdorf, B.; Lessmann, D.; Steinberg, C. E. W.

    2003-01-01

    In poorly buffered areas, acidification may occur for two reasons: through atmospheric deposition of acidifying substances and, in mining districts, through pyrite weathering. These different sources of acidity lead to clearly distinct geochemistry in lakes and rivers. In general, the geochemistry is the major determinant of the planktonic composition of the acidified water bodies, whereas the nutrient status mainly determines the level of biomass. A number of acidic mining lakes in Eastern Germany have to be neutralized to meet the water quality goals of the European Union Directives and to overcome the ecological degradation. Limnologically, this neutralization process is a short-term maturation of lakes, in which biological succession must overcome two different geochemical buffer systems. First, the iron buffer system characterizes the initial state, when colonization starts: there is low organismic diversity and productivity, and clear net heterotrophy in most cases. Organic carbon that serves as fuel for the food web derives mainly from allochthonous sources. In the second, less acidic state, aluminum is the buffer. This state is found exceptionally among the hard-water mining lakes, often as a result of deposition of acidifying substances onto soft-water systems. Colonization in aluminum-buffered lakes is more complex and is controlled by the sensitivity of the organisms towards both protons and inorganic reactive aluminum species. In soft-water systems, calcium may act as an antidote against acid and aluminum; however, this function is lost in hard-water post-mining lakes of similar proton concentrations. Nutrient limitations may occur, but these do not usually control qualitative and quantitative plankton composition. In these lakes, total pelagic biomass is controlled by the bioavailability of nutrients, particularly phosphorus.

  7. DEVELOPMENT OF CONCEPT OF HARDWARE-SOFTWARE COMPLEX OF MODULAR DESIGN FOR DETERMINATION OF ANTENNA SYSTEMS' CHARACTERISTICS BASED ON MEASUREMENTS IN THE NEAR FIELD

    Directory of Open Access Journals (Sweden)

    A. G. Buday

    2017-01-01

    Full Text Available Measuring the amplitude-phase distribution of the radiation field of complex antenna systems on a surface close to the radiating aperture makes it possible to reconstruct the free-space far-field pattern and also helps in determining the influence of various structural elements and defects of radiating surfaces on the formation of the directional diagram. The purpose of this work was to develop a universal hardware-software complex of modular design for determining the characteristics of a wide range of antenna systems from measurements of the amplitude-phase distribution of the radiation field in the near zone. The equations that connect the structure of the radiation fields of the antenna system at various distances from it, in planar, cylindrical, and spherical coordinate systems, as well as structural diagrams of the hardware part of measuring complexes, have been analyzed. As a result, the concept of constructing a universal hardware-software complex for measuring the radiation field of various types of antenna systems, with any type of measurement surface, for solving a wide range of applied problems has been developed. A modular structure of hardware and software has been proposed; it allows the complex to be reconfigured rapidly to measure the characteristics of any particular antenna system at all stages of product development and testing, and also makes the complex economically accessible even for small enterprises and organizations.
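
For a planar measurement surface, the core of the near-field-to-far-field reconstruction mentioned above is a plane-wave-spectrum (2-D FFT) step. A minimal sketch that omits probe correction and propagation phase factors (the uniform square aperture is purely illustrative):

```python
import numpy as np

def near_to_far_planar(e_near, dx, wavelength):
    """Plane-wave-spectrum step of a planar near-field transformation:
    the 2-D FFT of the sampled aperture field gives the angular spectrum,
    whose magnitude over the visible (propagating) region approximates
    the far-field pattern. Probe correction is deliberately omitted."""
    n = e_near.shape[0]
    spectrum = np.fft.fftshift(np.fft.fft2(e_near))
    k = np.fft.fftshift(np.fft.fftfreq(n, d=dx)) * 2.0 * np.pi
    kx, ky = np.meshgrid(k, k)
    k0 = 2.0 * np.pi / wavelength
    visible = kx ** 2 + ky ** 2 <= k0 ** 2    # keep propagating waves only
    return np.abs(spectrum) * visible, kx, ky

# Uniform square aperture -> sinc-like pattern peaked at broadside
n, dx, lam = 64, 0.01, 0.03                   # samples, 1 cm spacing, 3 cm wavelength
aperture = np.zeros((n, n))
aperture[24:40, 24:40] = 1.0
pattern, kx, ky = near_to_far_planar(aperture, dx, lam)
```

A full implementation would add the probe's receiving pattern and back-propagation of the scan-plane phase, which is where the modular hardware-software split described in the abstract comes in.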

  8. Non-destructive failure analysis and measurement for molded devices and complex assemblies with X-ray CT and 3D image processing techniques

    International Nuclear Information System (INIS)

    Yin, Xiaoming; Liew, Seaw Jia; Jiang, Ting Ying; Xu, Jian; Kakarala, Ramakrishna

    2013-01-01

    In both the automotive and healthcare sectors, reliable failure analysis and accurate measurement of molded devices and complex assemblies are important. Current methods of failure analysis and measurement require these molded parts to be cross-sectioned so that internal features or dimensions become accessible. As a result, the parts are rendered unusable, and additional damage introduced by sectioning may cause misinterpretation of the results. X-ray CT and 3D image processing techniques provide a new nondestructive solution for failure analysis and measurement of molded devices and complex assemblies. These techniques simplify failure analysis and measurement of molded devices and assemblies, and significantly improve the productivity of molding manufacturing.

  9. Validating an Agency-based Tool for Measuring Women's Empowerment in a Complex Public Health Trial in Rural Nepal.

    Science.gov (United States)

    Gram, Lu; Morrison, Joanna; Sharma, Neha; Shrestha, Bhim; Manandhar, Dharma; Costello, Anthony; Saville, Naomi; Skordis-Worrall, Jolene

    2017-01-02

    Despite the rising popularity of indicators of women's empowerment in global development programmes, little work has been done on the validity of existing measures of such a complex concept. We present a mixed-methods validation of the use of the Relative Autonomy Index for measuring Amartya Sen's notion of agency freedom in rural Nepal. Analysis of think-aloud interviews (n = 7) indicated adequate respondent understanding of questionnaire items, but multiple problems of interpretation, including difficulties with the four-point Likert scale, questionnaire item ambiguity and difficulties with translation. Exploratory Factor Analysis of a calibration sample (n = 511) suggested two positively correlated factors (r = 0.64) loading on internally and externally motivated behaviour. Both factors increased with decreasing education and decision-making power on large expenditures and food preparation. Confirmatory Factor Analysis on a validation sample (n = 509) revealed good fit (Root Mean Square Error of Approximation 0.05-0.08, Comparative Fit Index 0.91-0.99). In conclusion, we caution against uncritical use of agency-based quantification of women's empowerment. While qualitative and quantitative analysis revealed overall satisfactory construct and content validity, the positive correlation between external and internal motivations suggests the existence of adaptive preferences. High scores on internally motivated behaviour may reflect internalized oppression rather than agency freedom.
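    The exploratory step above can be made concrete with a small simulation. The sketch below generates six Likert-style items driven by two correlated latent factors (the factor correlation of 0.6 loosely mirrors the reported r = 0.64; the loadings, item count and noise level are invented) and counts retained factors by the Kaiser criterion on the item correlation matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
cov_f = np.array([[1.0, 0.6], [0.6, 1.0]])       # assumed factor correlation
factors = rng.multivariate_normal([0.0, 0.0], cov_f, size=n)

# Six hypothetical items: three load on factor 1, three on factor 2.
loadings = np.array([[0.8, 0.0], [0.7, 0.0], [0.6, 0.0],
                     [0.0, 0.8], [0.0, 0.7], [0.0, 0.6]])
items = factors @ loadings.T + 0.5 * rng.standard_normal((n, 6))

R = np.corrcoef(items, rowvar=False)             # item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
n_factors = int((eigvals > 1.0).sum())           # Kaiser criterion
```

A full EFA would add rotation and fit statistics; this only illustrates why correlated factors still produce a dominant first eigenvalue.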

  10. Particle image velocimetry measurement of complex flow structures in the diffuser and spherical casing of a reactor coolant pump

    Directory of Open Access Journals (Sweden)

    Yongchao Zhang

    2018-04-01

    Full Text Available Understanding of the turbulent flow in the reactor coolant pump (RCP) is a premise of the optimal design of the RCP. Flow structures in the RCP, in view of the specially devised spherical casing, are more complicated than those associated with conventional pumps. Hitherto, knowledge of the flow characteristics of the RCP has been far from sufficient, and research into nonintrusive measurement of the internal flow of the RCP has rarely been reported. In the present study, flow measurement using particle image velocimetry is implemented to reveal the flow features of an RCP model. Velocity and vorticity distributions in the diffuser and spherical casing are obtained. The results illuminate the complexity of the flows in the RCP. Near the lower end of the discharge nozzle, three-dimensional swirling flows and flow separation are evident. In the diffuser, the disparity of the velocity profiles across different axial cross sections is verified, and the velocity increases gradually from the shroud to the hub. In the casing, the velocity distribution is nonuniform in the circumferential direction. Vortices shed consistently from the diffuser blade trailing edge. The experimental results lend sound support to the optimal design of the RCP and provide validation data for relevant numerical algorithms. Keywords: Diffuser, Flow Structures, Particle Image Velocimetry, Reactor Coolant Pump, Spherical Casing, Velocity Distribution

  11. Building a measurement framework of burden of treatment in complex patients with chronic conditions: a qualitative study

    Directory of Open Access Journals (Sweden)

    Eton DT

    2012-08-01

    Full Text Available David T Eton,1 Djenane Ramalho de Oliveira,2,3 Jason S Egginton,1 Jennifer L Ridgeway,1 Laura Odell,4 Carl R May,5 Victor M Montori1,6 1Division of Health Care Policy and Research, Department of Health Sciences Research, Mayo Clinic, Rochester, MN, USA; 2College of Pharmacy, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil; 3Medication Therapy Management Program, Fairview Pharmacy Services LLC, Minneapolis, MN, USA; 4Pharmacy Services, Mayo Clinic, Rochester, MN, USA; 5Faculty of Health Sciences, University of Southampton, Southampton, UK; 6Knowledge and Evaluation Research Unit, Mayo Clinic, Rochester, MN, USA. Background: Burden of treatment refers to the workload of health care as well as its impact on patient functioning and well-being. We set out to build a conceptual framework of issues descriptive of burden of treatment from the perspective of the complex patient, as a first step in the development of a new patient-reported measure. Methods: We conducted semistructured interviews with patients seeking medication therapy management services at a large academic medical center. All patients had a complex regimen of self-care (including polypharmacy) and were coping with one or more chronic health conditions. We used framework analysis to identify and code themes and subthemes. A conceptual framework of burden of treatment was outlined from emergent themes and subthemes. Results: Thirty-two patients (20 female, 12 male, age 26–85 years) were interviewed. Three broad themes of burden of treatment emerged: the work patients must do to care for their health; problem-focused strategies and tools to facilitate the work of self-care; and factors that exacerbate the burden felt. The latter theme encompasses six subthemes: challenges with taking medication, emotional problems with others, role and activity limitations, financial challenges, confusion about medical information, and health care delivery obstacles.

  12. Airborne measurements of turbulent trace gas fluxes and analysis of eddy structure in the convective boundary layer over complex terrain

    Science.gov (United States)

    Hasel, M.; Kottmeier, Ch.; Corsmeier, U.; Wieser, A.

    2005-03-01

    Using the new high-frequency measurement equipment of the research aircraft DO 128, which is described in detail, turbulent vertical fluxes of ozone and nitric oxide have been calculated from data sampled during the ESCOMPTE program in the south of France. Based on airborne turbulence measurements, radiosonde data and surface energy balance measurements, the convective boundary layer (CBL) is examined under two different aspects. The analysis covers boundary-layer convection with respect to (i) the control of CBL depth by surface heating and synoptic-scale influences, and (ii) the structure of convective plumes and their vertical transport of ozone and nitrogen oxides. The orographic structure of the terrain causes significant differences between planetary boundary layer (PBL) heights, which are found to exceed the terrain height variations on average. A comparison of boundary-layer flux profiles as well as mean quantities over flat and complex terrain, and also under different pollution situations and weather conditions, shows relationships between vertical gradients and the corresponding turbulent fluxes. Generally, NOx transports are directed upward independent of the terrain, since the primary emission sources are located near the ground. For ozone, negative fluxes are common in the lower CBL, in accordance with the deposition of O3 at the surface. The detailed structure of thermals, which carry out most of the vertical transport in the boundary layer, is examined with a conditional sampling technique. Updrafts mostly contain warm, moist and NOx-loaded air, while the ozone transport by thermals alternates with the background ozone gradient. Evidence for handover processes of trace gases to the free atmosphere can be found where gradients exist across the boundary-layer top. An analysis of eddy sizes suggests some influence of the heterogeneous terrain in the mountainous area on the length scales of the eddies.
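    At its core, the turbulent flux calculation above is an eddy-covariance estimate: the flux of a scalar is the mean product of the vertical-wind and concentration fluctuations over an averaging leg, and conditional sampling isolates the share carried by updrafts. A minimal sketch with synthetic data (the correlation strength, sample count and units are invented, not the campaign's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000                             # samples along a flight leg
w = rng.standard_normal(n)             # vertical wind (m/s)
c = 0.5 * w + rng.standard_normal(n)   # scalar positively tied to updrafts

# Reynolds decomposition: subtract leg means, then average the product.
w_p = w - w.mean()
c_p = c - c.mean()
flux = np.mean(w_p * c_p)              # kinematic flux w'c'; > 0 means upward

# Conditional sampling: contribution of updraft events (w' > 0) to the flux.
updraft = w_p > 0
flux_updraft = np.sum(w_p[updraft] * c_p[updraft]) / n
```

With the synthetic correlation of 0.5, the estimated flux is positive (upward), and roughly half of it is carried by updraft events, as symmetry would suggest.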

  13. Assessment of Biopsychosocial Complexity and Health Care Needs: Measurement Properties of the INTERMED Self-Assessment Version.

    Science.gov (United States)

    van Reedt Dortland, Arianne K B; Peters, Lilian L; Boenink, Annette D; Smit, Jan H; Slaets, Joris P J; Hoogendoorn, Adriaan W; Joos, Andreas; Latour, Corine H M; Stiefel, Friedrich; Burrus, Cyrille; Guitteny-Collas, Marie; Ferrari, Silvia

    2017-05-01

    The INTERMED Self-Assessment questionnaire (IMSA) was developed as an alternative to the observer-rated INTERMED (IM) to assess biopsychosocial complexity and health care needs. We studied feasibility, reliability, and validity of the IMSA within a large and heterogeneous international sample of adult hospital inpatients and outpatients as well as its predictive value for health care use (HCU) and quality of life (QoL). A total of 850 participants aged 17 to 90 years from five countries completed the IMSA and were evaluated with the IM. The following measurement properties were determined: feasibility by percentages of missing values; reliability by Cronbach α; interrater agreement by intraclass correlation coefficients; convergent validity of IMSA scores with mental health (Short Form 36 emotional well-being subscale and Hospital Anxiety and Depression Scale), medical health (Cumulative Illness Rating Scale) and QoL (Euroqol-5D) by Spearman rank correlations; and predictive validity of IMSA scores with HCU and QoL by (generalized) linear mixed models. Feasibility, face validity, and reliability (Cronbach α = 0.80) were satisfactory. Intraclass correlation coefficient between IMSA and IM total scores was .78 (95% CI = .75-.81). Correlations of the IMSA with the Short Form 36, Hospital Anxiety and Depression Scale, Cumulative Illness Rating Scale, and Euroqol-5D (convergent validity) were -.65, .15, .28, and -.59, respectively. The IMSA significantly predicted QoL and also HCU (emergency department visits, hospitalization, outpatient visits, and diagnostic examinations) after 3- and 6-month follow-up. Results were comparable between hospital sites, inpatients and outpatients, as well as age groups. The IMSA is a generic and time-efficient method to assess biopsychosocial complexity and to provide guidance for multidisciplinary care trajectories in adult patients, with good reliability and validity across different cultures.
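    Of the measurement properties listed, the internal-consistency step is easy to make concrete. Below is a hedged sketch of Cronbach's α on simulated item scores; the item count, sample size and noise level are invented (the reported α = 0.80 is for the real IMSA data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
true_score = rng.standard_normal((850, 1))                     # one latent trait
responses = true_score + 0.8 * rng.standard_normal((850, 6))   # 6 noisy items
alpha = cronbach_alpha(responses)
```

Higher shared variance across items pushes α toward 1; pure noise items pull it toward 0.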

  14. Biomass burning and its effects on fine aerosol acidity, water content and nitrogen partitioning

    Science.gov (United States)

    Bougiatioti, Aikaterini; Nenes, Athanasios; Paraskevopoulou, Despina; Fourtziou, Luciana; Stavroulas, Iasonas; Liakakou, Eleni; Myriokefalitakis, Stelios; Daskalakis, Nikos; Weber, Rodney; Kanakidou, Maria; Gerasopoulos, Evangelos; Mihalopoulos, Nikolaos

    2017-04-01

    Aerosol acidity is an important property that drives the partitioning of semi-volatile species, the formation of secondary particulate matter, and metal and nutrient solubility. Aerosol acidity varies considerably with aerosol type, RH, temperature and the degree of atmospheric chemical aging, and may also change during transport. Among the different aerosol sources, sea salt and dust have been well studied, and their impact on aerosol acidity and water uptake is more or less understood. Biomass burning (BB), on the other hand, despite its significance as a source on regional and global scales, is much less understood. Currently, there is no sufficiently practical and accurate method to directly measure the pH of aerosol in situ. The combination of thermodynamic models with targeted experimental observations can, however, provide reliable predictions of aerosol particle water and pH, using as input the concentrations of gas/aerosol species, temperature (T) and relative humidity (RH). As such an example, ISORROPIA-II (Fountoukis and Nenes, 2007) has been used for the thermodynamic analysis of measurements conducted in downtown Athens during winter 2013, in order to evaluate the effect of BB on aerosol water and acidity. Biomass burning, especially during night time, was found to contribute significantly to the increased organics concentrations, as well as to the BC component associated with wood burning, particulate nitrates, chloride, and potassium. These increased concentrations were found to affect fine aerosol water, with Winorg having an average concentration of 11±14 μg m-3 and Worg 12±19 μg m-3, the organic component constituting almost 38% of the total calculated submicron water. The fine aerosol was found to be generally acidic, with an average pH during strong BB influence of 2.8±0.5, similar to the pH observed for regional aerosol influenced by important biomass burning episodes at the remote background site of
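    The final step of such a thermodynamic analysis, converting a model-predicted aerosol H+ loading and liquid water content into pH, is a unit conversion. A hedged sketch (the input values are illustrative placeholders chosen to land near the reported acidity, not the study's data):

```python
import math

h_air = 4.0e-5   # ug m-3, aerosol H+ loading predicted by a thermodynamic model
w_total = 23.0   # ug m-3, predicted liquid water content (Winorg + Worg)

# (h_air ug m-3) / (1 g mol-1) gives umol of H+ per m3 of air; w_total ug of
# water occupies w_total * 1e-9 L per m3 of air, so the H+ molarity is
# 1000 * h_air / w_total in mol L-1 (activity coefficient taken as 1 here).
h_molar = 1000.0 * h_air / w_total
pH = -math.log10(h_molar)
```

Note how strongly pH depends on the predicted water: halving w_total lowers pH by about 0.3 units for the same H+ loading.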

  15. The appropriateness of TACOM as a task complexity measure for emergency operating procedures of nuclear power plants - A comparison with OPAS scores

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea

    2007-01-01

    It is well known that complicated procedures frequently cause human performance related problems that can result in serious consequences. Unfortunately, systematic frameworks for evaluating the complexity of procedures are very rare. For this reason, Park et al. suggested a measure called TACOM (Task Complexity), which is able to quantify the complexity of tasks stipulated in procedures. In addition, it was observed that there is a significant correlation between averaged task performance time data and estimated TACOM scores. In this study, as an additional verification activity, TACOM scores are compared with operators' performance data measured by the Operator Performance Assessment System (OPAS). As a result, TACOM scores appear to be meaningfully correlated with OPAS scores. It is therefore reasonable to regard the result of this study as supplementary evidence that the TACOM measure is applicable for quantifying the complexity of tasks to be done by operators.

  16. Densities, molar volumes, and isobaric expansivities of (d-xylose+hydrochloric acid+water) systems

    International Nuclear Information System (INIS)

    Zhang Qiufen; Yan Zhenning; Wang Jianji; Zhang Hucheng

    2006-01-01

    Densities of (d-xylose + HCl + water) have been measured at temperatures in the range (278.15 to 318.15) K as a function of the concentrations of both d-xylose and hydrochloric acid. The densities have been used to estimate the molar volumes and isobaric expansivities of the ternary solutions. The molar volumes of the ternary solutions vary linearly with the mole fraction of d-xylose. The standard partial molar volumes V̄2,φ for d-xylose in aqueous HCl solutions of molality (0.2, 0.4, 0.7, 1.1, 1.6, and 2.1) mol·kg⁻¹ have been determined. In the investigated temperature range, the relation V̄2,φ = c₁ + c₂{(T/K) − 273.15}^(1/2) can be used to describe the temperature dependence of the standard partial molar volumes. These results have, in conjunction with the results obtained in water, been used to deduce the standard volumes of transfer, ΔtV̄, of d-xylose from water to aqueous HCl solutions. The increase in the transfer volume of d-xylose with increasing HCl concentration has been explained by the stronger interactions of H⁺ with the hydrophilic groups of d-xylose.
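    The quoted temperature relation is linear in the regressor sqrt(T/K − 273.15), so its coefficients can be recovered by ordinary least squares. A sketch with invented coefficients (c1, c2 and the volumes below are made up for illustration, not the paper's values):

```python
import numpy as np

# Model from the abstract: V = c1 + c2 * sqrt(T/K - 273.15)
T = np.array([278.15, 288.15, 298.15, 308.15, 318.15])   # K
c1_true, c2_true = 95.0, 1.2                             # assumed, cm3/mol
V = c1_true + c2_true * np.sqrt(T - 273.15)              # synthetic "data"

# Linear least squares on the design matrix [1, sqrt(T - 273.15)].
X = np.column_stack([np.ones_like(T), np.sqrt(T - 273.15)])
(c1, c2), *_ = np.linalg.lstsq(X, V, rcond=None)
```

With noise-free synthetic data the fit recovers the assumed coefficients exactly; with real densities the residuals would give the uncertainty of c1 and c2.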

  17. Toward Design Guidelines for Stream Restoration Structures: Measuring and Modeling Unsteady Turbulent Flows in Natural Streams with Complex Hydraulic Structures

    Science.gov (United States)

    Lightbody, A.; Sotiropoulos, F.; Kang, S.; Diplas, P.

    2009-12-01

    Despite their widespread application to prevent lateral river migration, stabilize banks, and promote aquatic habitat, shallow transverse flow training structures such as rock vanes and stream barbs lack quantitative design guidelines. Due to the lack of fundamental knowledge about the interaction of the flow field with the sediment bed, existing engineering standards are typically based on various subjective criteria or on cross-sectionally-averaged shear stresses rather than local values. Here, we examine the performance and stability of in-stream structures within a field-scale single-threaded sand-bed meandering stream channel in the newly developed Outdoor StreamLab (OSL) at the St. Anthony Falls Laboratory (SAFL). Before and after the installation of a rock vane along the outer bank of the middle meander bend, high-resolution topography data were obtained for the entire 50-m-long reach at 1-cm spatial scale in the horizontal and sub-millimeter spatial scale in the vertical. In addition, detailed measurements of flow and turbulence were obtained using acoustic Doppler velocimetry at twelve cross-sections focused on the vicinity of the structure. Measurements were repeated at a range of extreme events, including in-bank flows with an approximate flow rate of 44 L/s (1.4 cfs) and bankfull floods with an approximate flow rate of 280 L/s (10 cfs). Under both flow rates, the structure reduced near-bank shear stresses and resulted in both a deeper thalweg and near-bank aggradation. The resulting comprehensive dataset has been used to validate a large eddy simulation carried out by SAFL’s computational fluid dynamics model, the Virtual StreamLab (VSL). This versatile computational framework is able to efficiently simulate 3D unsteady turbulent flows in natural streams with complex in-stream structures and as a result holds promise for the development of much-needed quantitative design guidelines.

  18. Measurement issues associated with quantitative molecular biology analysis of complex food matrices for the detection of food fraud.

    Science.gov (United States)

    Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy

    2016-01-07

    Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.
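    One concrete reason quantitative PCR is hard to make reliable is that quantification runs through a fitted standard curve, so any uncertainty in that fit propagates into the copy-number estimate. A hedged sketch of the standard-curve step (all Ct values and dilutions below are invented, not from a validated assay):

```python
import numpy as np

# Standard curve: Ct = m * log10(N0) + b over a serial dilution series.
log_n0 = np.array([2.0, 3.0, 4.0, 5.0, 6.0])     # log10 starting copy number
ct = np.array([31.2, 27.9, 24.6, 21.3, 18.0])    # measured threshold cycles

m, b = np.polyfit(log_n0, ct, 1)                 # slope and intercept
efficiency = 10 ** (-1.0 / m) - 1.0              # 1.0 corresponds to 100%

# Quantify an unknown sample from its measured Ct via the fitted curve.
ct_unknown = 23.0
log_copies = (ct_unknown - b) / m
```

A slope near -3.32 corresponds to ~100% amplification efficiency; deviations from that, or scatter about the line caused by matrix effects, inflate the measurement uncertainty discussed above.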

  19. Measuring and computing complex based on a multi-detector system for determining kinetics parameters by the pulsed neutron source method

    International Nuclear Information System (INIS)

    Veselova, G.P.; Grachev, A.V.; Ivanova, N.K.

    1985-01-01

    The hardware of a measuring and computing complex (MCC) designed for measuring the neutron-physical characteristics of a reactor by the pulsed neutron method, simultaneously from eight detectors, is described, together with the MERA-60 computer software used by the MCC for measuring the dependence of neutron generation time and lifetime on reactivity changes. The serviceability of the complex was tested at one of the PEI physical stands. A year of MCC operation has demonstrated its high reliability and its applicability to pulsed as well as other methods for investigating physical stands, without the introduction of supplementary equipment.

  20. Investigating the Impact of Road Condition Complexity on Driving Workload Based on Subjective Measurement using NASA TLX

    Directory of Open Access Journals (Sweden)

    Sugiono Sugiono

    2017-01-01

    Full Text Available Prior research indicates that mental load is one of the most important contributors to traffic accidents. The aim of this paper is to investigate the impact and correlation of road condition and driving experience on a driver's mental workload. The driving test consisted of 3 road complexity situations (urban road, highway, rural road) with 26 drivers of average age 21 years and differing experience levels (average 4.08 years of experience). The NASA TLX questionnaire is used as a subjective measurement of the driver's mental load, with three dimensions related to the demands imposed on the subject (Mental, Physical and Temporal Demands) and three related to the interaction of the subject with the task (Effort, Frustration, and Performance). Three cameras placed on the left side, right side and front of the car identify the road condition. According to the experiment, drivers felt that the frustration, busyness, and mental-demand factors dominated the high-level workload (96.15%). Highway conditions gave an average overall workload score (OWS) of 62, better than the city road (OWS = 69) and the rural road (OWS = 66). Given this dependence on street complexity, it is advisable to improve road conditions to resemble highway conditions by reducing potential hazards.
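    In standard NASA TLX practice, the overall workload score used above is the weighted mean of the six subscale ratings, with weights derived from 15 pairwise comparisons of the dimensions. A sketch with invented ratings and weights (not the study's data):

```python
# Six subscale ratings on the 0-100 scale (hypothetical one-driver responses).
ratings = {"mental": 80, "physical": 40, "temporal": 70,
           "performance": 50, "effort": 75, "frustration": 85}

# Pairwise-comparison weights: each dimension's win count over the other five;
# the 15 comparisons mean the weights always sum to 15.
weights = {"mental": 4, "physical": 1, "temporal": 3,
           "performance": 1, "effort": 3, "frustration": 3}

assert sum(weights.values()) == 15
ows = sum(ratings[d] * weights[d] for d in ratings) / 15.0   # overall workload
```

The unweighted variant ("raw TLX") simply averages the six ratings; the weighted form lets dominant factors such as frustration pull the overall score up, as reported in the study.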

  1. Retrieval of aerosol complex refractive index from a synergy between lidar, sun photometer and in situ measurements during LISAIR experiment

    International Nuclear Information System (INIS)

    Raut, J.C.; Chazette, P.

    2007-01-01

    Particulate pollutant exchanges between the streets and the Planetary Boundary Layer (PBL), and their daily evolution linked to human activity, were studied in the framework of the Lidar pour la Surveillance de l'AIR (LISAIR) experiment. This program lasted from 10 to 30 May 2005. A synergetic approach combining dedicated active (lidar) and passive (sun photometer) remote sensors as well as ground-based in situ instrumentation (nephelometer, aethalometer and particle sizers) was used to investigate urban aerosol optical properties within Paris. Aerosol complex refractive indices were assessed to be 1.56 − 0.034i at 355 nm and 1.59 − 0.040i at 532 nm, thus leading to single-scattering albedo values between 0.80 and 0.88. These retrievals are consistent with soot components in the aerosol arising from traffic exhausts, indicating that these pollutants have a radiative impact on climate. We also discussed the influence of relative humidity on aerosol properties. A good agreement was found between the vertical extinction profile derived from the lidar backscattering signal and that retrieved from the coupling between radiosounding and ground in situ measurements. (authors)

  2. Experimental approach for the uncertainty assessment of 3D complex geometry dimensional measurements using computed tomography at the mm and sub-mm scales

    DEFF Research Database (Denmark)

    Jiménez, Roberto; Torralba, Marta; Yagüe-Fabra, José A.

    2017-01-01

    The dimensional verification of miniaturized components with 3D complex geometries is particularly challenging. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems' traceability when measuring 3D complex geometries is still an open issue. In this work, an alternative method for the measurement uncertainty assessment of 3D complex geometries by using CT is presented. The method is based on the micro-CT system Maximum Permissible Error (MPE) estimation, determined experimentally by using several calibrated reference artefacts. The main advantage of the presented method is that a previous calibration of the component by a more accurate Coordinate Measuring System (CMS) is not needed. In fact, such a CMS would still hold all the typical limitations of optical and tactile techniques.

  3. Water Accounting Plus (WA+) – a water accounting procedure for complex river basins based on satellite measurements

    Directory of Open Access Journals (Sweden)

    P. Karimi

    2013-07-01

    Full Text Available Coping with water scarcity and growing competition for water among different sectors requires proper water management strategies and decision processes. A pre-requisite is a clear understanding of the basin hydrological processes, manageable and unmanageable water flows, the interaction with land use, and opportunities to mitigate the negative effects and increase the benefits of water depletion on society. Currently, water professionals do not have a common framework that links depletion to user groups of water and their benefits. The absence of a standard hydrological and water management summary is causing confusion and wrong decisions. The non-availability of water flow data is one of the underpinning reasons for not having operational water accounting systems for river basins in place. In this paper, we introduce Water Accounting Plus (WA+), which is a new framework designed to provide explicit spatial information on water depletion and net withdrawal processes in complex river basins. The influence of land use and landscape evapotranspiration on the water cycle is described explicitly by defining land use groups with common characteristics. WA+ presents four sheets: (i) a resource base sheet, (ii) an evapotranspiration sheet, (iii) a productivity sheet, and (iv) a withdrawal sheet. Every sheet encompasses a set of indicators that summarise the overall water resources situation. The impact of external influences (e.g., climate change) and internal influences (e.g., infrastructure building) can be estimated by studying the changes in these WA+ indicators. Satellite measurements can be used to acquire a vast amount of the required data but are not a precondition for implementing the WA+ framework. Data from hydrological models and water allocation models can also be used as inputs to WA+.

  4. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-01

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. Subsequently, as the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks under a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as the cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity), which can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs, has been developed. In this report, the technical background as well as the practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since these validation studies show that the TACOM measure seems to properly quantify the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators.

  5. The development of a quantitative measure for the complexity of emergency tasks stipulated in emergency operating procedures of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jung, Won Dea

    2006-11-15

    Previous studies have continuously pointed out that human performance is a decisive factor affecting the safety of complicated process systems. Subsequently, as the result of extensive efforts, it has been revealed that the provision of procedures is one of the most effective countermeasures, especially when human operators have to carry out their tasks under a very stressful environment: good procedures not only enhance the performance of human operators but also reduce the possibility of human error by stipulating the detailed tasks to be done. Ironically, it has also been emphasized that the performance of human operators can be impaired by complicated procedures, because procedures directly govern the physical as well as the cognitive behavior of human operators by institutionalizing detailed actions. Therefore, it is a prerequisite to develop a systematic framework that properly evaluates the complexity of tasks described in procedures. For this reason, a measure called TACOM (Task Complexity), which can quantify the complexity of emergency tasks described in the emergency operating procedures (EOPs) of NPPs, has been developed. In this report, the technical background as well as the practical steps to quantify the complexity of tasks are presented, together with a series of studies that were conducted to ensure the validity of the TACOM measure. Since these validation studies show that the TACOM measure seems to properly quantify the complexity of emergency tasks, it is expected that the TACOM measure can play an important role in improving the performance of human operators.

  6. Complexity Plots

    KAUST Repository

    Thiyagalingam, Jeyarajan

    2013-06-01

    In this paper, we present a novel visualization technique for assisting the observation and analysis of algorithmic complexity. In comparison with conventional line graphs, this new technique is not sensitive to the units of measurement, allowing multivariate data series of different physical quantities (e.g., time, space and energy) to be juxtaposed conveniently and consistently. It supports multivariate visualization as well as uncertainty visualization. It enables users to focus on algorithm categorization by complexity classes, while reducing the visual impact caused by constants and algorithmic components that are insignificant to complexity analysis. It provides an effective means for observing the algorithmic complexity of programs with a mixture of algorithms and black-box software through visualization. Through two case studies, we demonstrate the effectiveness of complexity plots in complexity analysis in research, education and application.

  7. The first experimental confirmation of the fractional kinetics containing the complex-power-law exponents: Dielectric measurements of polymerization reactions

    Science.gov (United States)

    Nigmatullin, R. R.; Arbuzov, A. A.; Salehli, F.; Giz, A.; Bayrak, I.; Catalgil-Giz, H.

    2007-01-01

    For the first time we have achieved incontestable evidence that the real process of dielectric relaxation during the polymerization reaction of polyvinylpyrrolidone (PVP) is described in terms of fractional kinetic equations containing complex-power-law exponents. The possibility of the existence of fractional kinetics containing non-integer complex-power-law exponents follows from the general theory of dielectric relaxation suggested recently by one of the authors (R.R.N.). Based on the physical/geometrical meaning of the fractional integral with complex exponents, it is possible to develop a general theory of dielectric relaxation based on the self-similar (fractal) character of the reduced (averaged) microprocesses that take place in the mesoscale region. This theory contains some essential predictions related to the existence of non-integer power-law kinetics, and the results of this paper can be considered the first confirmation of the existence of kinetic phenomena described by fractional derivatives with complex-power-law exponents. We stress that with the help of a new complex fitting function for the complex permittivity it becomes possible to describe the whole process, for real and imaginary parts simultaneously, throughout the admissible frequency range (30 Hz-13 MHz). The fitting parameters obtained for the complex permittivity function at three temperatures (70, 90 and 110 °C) confirm in general the picture of the reaction that was known qualitatively before. They also reveal some new features, which improve the interpretation of the whole polymerization process. We hope that these first results will serve as a good stimulus for other researchers to look for traces of the new fractional kinetics in other relaxation processes unrelated to dielectric relaxation.
These results should lead to the reconsideration and generalization of irreversibility and kinetic phenomena that

  8. Observations and Measurements of Wing Parameters of the Selected Beetle Species and the Design of a Mechanism Structure Implementing a Complex Wing Movement

    Directory of Open Access Journals (Sweden)

    Geisler T.

    2016-12-01

    Full Text Available Beetle wings perform a flapping movement consisting of rotation relative to two axes. This paper presents the results of observations and measurements of wing operating parameters in different planes for selected beetle species. High-speed photos and videos were used. A concept for a mechanism performing the complex wing movement was proposed and developed.

  9. Observations and Measurements of Wing Parameters of the Selected Beetle Species and the Design of a Mechanism Structure Implementing a Complex Wing Movement

    Science.gov (United States)

    Geisler, T.

    2016-12-01

    Beetle wings perform a flapping movement consisting of rotation relative to two axes. This paper presents the results of observations and measurements of wing operating parameters in different planes for selected beetle species. High-speed photos and videos were used. A concept for a mechanism performing the complex wing movement was proposed and developed.

  10. Thermally-activated vortex dynamics in bismuth calcium strontium copper oxide (Bi2CaSr2Cu2O8+δ) studied by complex susceptibility measurements

    NARCIS (Netherlands)

    Emmen, J.H.P.M.; Brabers, V.A.M.; Jonge, de W.J.M.

    1991-01-01

    Complex AC magnetic susceptibility has been measured on Bi2CaSr2Cu2O8+δ single crystals with H_ac, H_dc ∥ c-axis. It will be shown that the field, frequency and temperature dependence of both χ′ and χ″ in a constant but sufficiently large DC magnetic field can be quantitatively described by

  11. A sensitive dynamic viscometer for measuring the complex shear modulus in a steady shear flow using the method of orthogonal superposition

    NARCIS (Netherlands)

    Zeegers, J.C.H.; Zeegers, Jos; van den Ende, Henricus T.M.; Blom, C.; Altena, E.G.; Beukema, Gerrit J.; Beukema, G.J.; Mellema, J.

    1995-01-01

    A new instrument to carry out complex viscosity measurements in equilibrium and in a steady shear flow has been developed. A small amplitude harmonic excitation is superimposed orthogonally to the steady shear rate component. It is realized by a thin-walled cylinder, which oscillates in the axial

  12. Use of avidin-biotin-peroxidase complex for measurement of UV lesions in human DNA by microELISA

    Energy Technology Data Exchange (ETDEWEB)

    Leipold, B [Technischen Universitaet Muenchen (Germany, F.R.). Dermatologische Klinik; Remy, W [Max-Planck-Institut fuer Biochemie, Muenchen (Germany, F.R.)

    1984-02-10

    The avidin/biotin system was introduced into the standard enzyme-linked immunosorbent assay (ELISA) to increase its sensitivity for detecting UV lesions in human DNA. The goat anti-rabbit IgG-peroxidase used as the second antibody in the standard ELISA was replaced by biotinylated goat anti-rabbit IgG plus the avidin-biotin-peroxidase complex (ABC) reagent. The sensitivity of detection of plate-fixed UV-DNA-antibody complexes was increased about 8-fold, and photolesions could be detected in human DNA samples irradiated with a dose as low as 1 J/m² of UVC or a suberythemal dose of UVB light.

  13. Rare Earth Electrochemical Property Measurements and Phase Diagram Development in a Complex Molten Salt Mixture for Molten Salt Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jinsuo; Guo, Shaoqiang

    2018-03-30

    Pyroprocessing is a promising alternative for the reprocessing of used nuclear fuel (UNF) that uses electrochemical methods. Compared to the hydrometallurgical reprocessing method, pyroprocessing has many advantages such as a reduced volume of radioactive waste, simple waste processing, the ability to treat refractory material, and compatibility with fast reactor fuel recycle. The key step of the process is the electro-refining of the spent metallic fuel in the LiCl-KCl eutectic salt, which can be integrated with an electrolytic reduction step for the reprocessing of spent oxide fuels. During the electro-refining process, actinides and active fission products such as rare earth (RE) elements are dissolved into the molten salt from the spent fuel at an anode basket. U and Pu are then electro-deposited on the cathodes while the REs, with relatively negative reduction potentials, are left in the molten salt bath. However, with the accumulation of lanthanides in the salt, the reduction potentials of the REs approach the values for U and Pu, affecting the recovery efficiency of U and Pu. Hence, RE drawdown is necessary to reduce salt waste after uranium and minor actinide recovery, which can also be performed by electrochemical separations. To separate the various REs and optimize the drawdown process, the physical properties of REs in LiCl-KCl salt and their concentration dependence are essential. Thus, the primary goal of the present research is to provide fundamental data on REs and deduce phase diagrams of LiCl-KCl-RECl3 based complex molten salts. La, Nd and Gd are three representative REs of particular interest due to the high ratio of La and Nd in UNF, the highest standard potential of Gd among all REs, and the existing literature data in dilute solution. Electrochemical measurements are performed to study the thermodynamics and transport properties of LaCl3, GdCl3, NdCl3, and NdCl2 in LiCl-KCl eutectic in the temperature range 723-823 K. 
Tests are conducted in LiCl-KCl melt
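    The concentration dependence of the RE reduction potentials that drives the drawdown design follows the Nernst relation on the apparent-standard-potential (mole-fraction) scale. A minimal sketch — the numerical E0' value used below is purely illustrative, not a measured datum from this work:

```python
import numpy as np

# Gas constant (J mol^-1 K^-1) and Faraday constant (C mol^-1)
R, F = 8.314, 96485.0

def apparent_potential(E0_apparent, n, T, mole_fraction):
    """Nernst shift of a RECl3/RE couple with concentration in the melt:
        E = E0' + (R*T / (n*F)) * ln(X)
    where E0' is the apparent standard potential on the mole-fraction
    scale and X is the mole fraction of the RE chloride in LiCl-KCl."""
    return E0_apparent + (R * T / (n * F)) * np.log(mole_fraction)
```

As the RE chloride is depleted during drawdown (X decreases), the couple shifts to more negative potentials; as lanthanides accumulate, the shift works in the opposite direction, which is why the RE potentials approach those of U and Pu.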

  14. Theoretical and experimental study of a calorimetric technique for measuring energy deposition in materials caused by complex pile irradiation

    International Nuclear Information System (INIS)

    Mas, P.; Sciers, P.; Droulers, Y.

    1962-01-01

    Calorimetric methods may be used to measure gamma fluxes greater than 10⁶ r/h near the cores of swimming pool reactors. The theory, design, and properties of isothermal calorimeters are discussed, and experimental results obtained with two types are presented. Measurement of energy deposition in materials and the long term integration of energy depositions are other uses of these devices. Results of measurements on heat deposition in steel and water are given. Fluxes were also measured. (authors) [fr

  15. Resting and Task-Modulated High-Frequency Brain Rhythms Measured by Scalp Encephalography in Infants with Tuberous Sclerosis Complex

    Science.gov (United States)

    Stamoulis, Catherine; Vogel-Farley, Vanessa; Degregorio, Geneva; Jeste, Shafali S.; Nelson, Charles A.

    2015-01-01

    The electrophysiological correlates of cognitive deficits in tuberous sclerosis complex (TSC) are not well understood, and modulations of neural dynamics by neuroanatomical abnormalities that characterize the disorder remain elusive. Neural oscillations (rhythms) are a fundamental aspect of brain function, and have dominant frequencies in a wide…

  16. Development of a dual-tracer real-time particle dry-deposition measurement technique for simple and complex terrain

    International Nuclear Information System (INIS)

    Sehmel, G.A.; Hodgson, W.H.; Campbell, J.A.

    1979-01-01

    Detectors are being developed and tested for measuring the airborne concentrations of lithium particles and SF₆ gas in real time. The airborne lithium detector will be used for real-time measurements of both particle dry-deposition velocities and resuspension rates. Both the lithium and SF₆ detectors will be used for measuring dry deposition in field experiments

  17. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    Science.gov (United States)

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
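    The core idea — encoding each particle by the relative positions of its nearest neighbours, so that matching survives displacements much larger than the interparticle spacing — can be sketched as follows. This is a simplified greedy matcher illustrating the feature-vector concept, not the authors' full algorithm (which adds rigorous outlier removal and iterative deformation warping):

```python
import numpy as np

def neighbor_features(points, k=3):
    """For each particle, build a feature vector from the relative
    positions of its k nearest neighbours, sorted by distance.
    Relative positions are invariant to bulk translation."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    feats = []
    for i in range(len(points)):
        nn = np.argsort(d[i])[1:k + 1]          # skip self at index 0
        rel = points[nn] - points[i]            # neighbour offsets
        order = np.argsort(np.linalg.norm(rel, axis=1))
        feats.append(rel[order].ravel())
    return np.array(feats)

def track(frame_a, frame_b, k=3):
    """Match each particle in frame_a to the particle in frame_b whose
    neighbour-topology feature is most similar (greedy, one pass)."""
    fa = neighbor_features(frame_a, k)
    fb = neighbor_features(frame_b, k)
    cost = np.linalg.norm(fa[:, None, :] - fb[None, :, :], axis=-1)
    return cost.argmin(axis=1)                  # index into frame_b
```

Because the features depend only on local topology, a uniform displacement of any size leaves them unchanged, which is what lets this style of tracker cope with motions far exceeding the typical interparticle separation.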

  18. Experimental Approach for the Uncertainty Assessment of 3D Complex Geometry Dimensional Measurements Using Computed Tomography at the mm and Sub-mm Scales.

    Science.gov (United States)

    Jiménez, Roberto; Torralba, Marta; Yagüe-Fabra, José A; Ontiveros, Sinué; Tosello, Guido

    2017-05-16

    The dimensional verification of miniaturized components with 3D complex geometries is particularly challenging. Computed Tomography (CT) can represent a suitable alternative to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems' traceability when measuring 3D complex geometries is still an open issue. In this work, an alternative method for the measurement uncertainty assessment of 3D complex geometries by using CT is presented. The method is based on the micro-CT system Maximum Permissible Error (MPE) estimation, determined experimentally by using several calibrated reference artefacts. The main advantage of the presented method is that a previous calibration of the component by a more accurate Coordinate Measuring System (CMS) is not needed. In fact, such a CMS would still suffer from all the typical limitations of optical and tactile techniques, particularly when measuring miniaturized components with complex 3D geometries, including the inability to measure inner parts. To validate the presented method, the most accepted standard currently available for CT sensors, the Verein Deutscher Ingenieure/Verband Deutscher Elektrotechniker (VDI/VDE) guideline 2630-2.1, is applied. Considering the high number of influence factors in CT and their impact on the measuring result, two different techniques for surface extraction are also considered to obtain a realistic determination of the influence of data processing on uncertainty. The uncertainty assessment of a workpiece used for micro mechanical material testing is first used to confirm the method, owing to its feasible calibration by an optical CMS. Secondly, the measurement of a miniaturized dental file with 3D complex geometry is carried out. The estimated uncertainties are finally compared with the component's calibration and the micro manufacturing tolerances to demonstrate the suitability of the presented CT calibration procedure. 
The 2U/T ratios resulting from the

  19. Study of complex resistivity measurement using current and potential waveform data; Denryu to den`i hakei data wo riyoshita fukusohi teiko sokutei no kento

    Energy Technology Data Exchange (ETDEWEB)

    Shima, H; Sakurai, K; Yamashita, Y [OYO Corp., Tokyo (Japan)

    1997-10-22

    This paper proposes a measurement method for complex resistivity using both current and potential waveforms, and applies it to actual data. Chargeability, in particular, is discussed among the complex resistivities. The procedure for determining the complex resistivity is as follows. First, digital measurements of both the current and potential waveforms are made. For the potential waveform, the zero-order self-potential is cancelled. The FFT is then applied to both waveforms to determine the current and potential in the frequency domain, after which the complex resistivity is obtained through simple division. Since inductive coupling was observed at higher frequencies, it was difficult to apply the Cole-Cole model directly. However, the inductive coupling could be removed by using a proper sampling frequency, so a proper Cole-Cole dispersion curve could be obtained. Using this Cole-Cole dispersion curve, a new chargeability could be defined. A linear relation between this chargeability and the ordinary time-domain chargeability was established. 4 refs., 10 figs.
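    The measurement scheme — remove the self-potential offset, FFT both waveforms, and divide in the frequency domain — can be sketched in a few lines. This is a minimal reconstruction; the geometric factor and the small-amplitude masking threshold are illustrative assumptions:

```python
import numpy as np

def complex_resistivity(current, potential, fs, geometric_factor=1.0):
    """Estimate the complex resistivity spectrum from sampled current
    and potential waveforms: rho(f) = K * V(f) / I(f) per FFT bin."""
    n = len(current)
    # Subtracting the mean cancels the zero-order self-potential / DC offset
    I = np.fft.rfft(np.asarray(current, float) - np.mean(current))
    V = np.fft.rfft(np.asarray(potential, float) - np.mean(potential))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Only divide in bins where the injected current has real content
    mask = np.abs(I) > 1e-12 * np.abs(I).max()
    rho = np.full(len(freqs), np.nan, dtype=complex)
    rho[mask] = geometric_factor * V[mask] / I[mask]
    return freqs, rho
```

Feeding the resulting spectrum into a Cole-Cole dispersion fit (after discarding the high-frequency bins affected by inductive coupling) would follow the same flow described in the abstract.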

  20. Measuring unintended effects in peacebuilding: What the field of international cooperation can learn from innovative approaches shaped by complex contexts.

    Science.gov (United States)

    Lemon, Adrienne; Pinet, Mélanie

    2018-06-01

    Capturing unintended impacts has been a persistent struggle in all fields of international development, and the field of peacebuilding is no exception. However, because peacebuilding focuses on relationships in complex contexts, the field of peacebuilding has, by necessity, made efforts towards finding practical ways to reflect upon both the intended and unintended effects of this work. To explore what lessons can be learned from the peacebuilding field, this study examines the evaluations of Search for Common Ground, a peacebuilding organisation working in over 35 countries across the world. Analysis focuses on 96 evaluations completed between 2013 and 2016 in 24 countries across Africa, Asia, and the MENA regions that found unintended effects. Programmes focusing on women, youth, and radio were most effective at identifying and explaining unintended effects, likely because the project design guided broader lines of questioning from the beginning. The paper argues that OECD-DAC guidelines are not enough on their own to guide evaluators into exploration of unintended effects, and teams instead need to work together to decide where, when and how they will look for them. Different approaches were also used to capture positive and negative outcomes, suggesting that evaluators need to decide at what level they are evaluating and how to tie effects back to the project's contribution. This study explores evaluation techniques and approaches used to understand impact in complex contexts in the peacebuilding field, and draws on lessons learned for the benefit of other fields dealing with similar complexities in international development and cooperation among actors.

  1. A system for traceable measurement of the microwave complex permittivity of liquids at high pressures and temperatures

    International Nuclear Information System (INIS)

    Dimitrakis, G A; Robinson, J; Kingman, S; Lester, E; George, M; Poliakoff, M; Harrison, I; Gregory, A P; Lees, K

    2009-01-01

    A system has been developed for direct traceable dielectric measurements on liquids at high pressures and temperatures. The system consists of a coaxial reflectometric sensor terminated by a metallic cylindrical cell to contain the liquid. It has been designed for measurements on supercritical liquids, but as a first step measurements on dielectric reference liquids were performed. This paper reports on a full evaluation of the system up to 2.5 GHz using methanol, ethanol and n-propanol at pressures up to 9 MPa and temperatures up to 273 °C. A comprehensive approach to the evaluation of uncertainties using Monte Carlo modelling is used
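    A Monte Carlo evaluation of uncertainties of the kind mentioned above can be sketched generically (GUM Supplement 1 flavour). The ratio model used in the example is a hypothetical stand-in, not the paper's sensor model:

```python
import numpy as np

def monte_carlo_uncertainty(model, nominals, sigmas, n=50000, seed=0):
    """Propagate input uncertainties through an arbitrary measurement
    model by Monte Carlo sampling: draw each input from a normal
    distribution, push the samples through the model, and report the
    mean and standard uncertainty of the output."""
    rng = np.random.default_rng(seed)
    samples = [rng.normal(v, s, n) for v, s in zip(nominals, sigmas)]
    y = model(*samples)
    return y.mean(), y.std()

# Hypothetical example: a quantity formed as a ratio of two readings
mean, std = monte_carlo_uncertainty(lambda v, i: v / i,
                                    nominals=[5.0, 2.0],
                                    sigmas=[0.01, 0.01])
```

For a weakly nonlinear model like this ratio, the Monte Carlo result agrees with the first-order (root-sum-square of relative uncertainties) propagation, which is a useful sanity check on the sampling.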

  2. The combined effect of complex mixes of poisons on the organism of white rats in 30-day round-the-clock inhalation and measures of biological prevention

    OpenAIRE

    MIRZAKARIMOVA MALOKHAT ABDUVAKHIDOVNA

    2016-01-01

    The direction of "biological prevention" in environmental hygiene, understood as a complex of measures aimed at increasing the resistance of individuals and populations to exposure to harmful factors of the industrial and ambient environment, has been increasingly developed in recent years. Only agents that are harmless under long-term application at preventively effective dosages are used for biological prevention. In this context, in the industrial towns for res...

  3. Inversion of In Situ Light Absorption and Attenuation Measurements to Estimate Constituent Concentrations in Optically Complex Shelf Seas

    Science.gov (United States)

    Ramírez-Pérez, M.; Twardowski, M.; Trees, C.; Piera, J.; McKee, D.

    2018-01-01

    A deconvolution approach is presented to use spectral light absorption and attenuation data to estimate the concentration of the major nonwater compounds in complex shelf sea waters. The inversion procedure requires knowledge of local material-specific inherent optical properties (SIOPs) which are determined from natural samples using a bio-optical model that differentiates between Case I and Case II waters and uses least squares linear regression analysis to provide optimal SIOP values. A synthetic data set is used to demonstrate that the approach is fundamentally consistent and to test the sensitivity to injection of controlled levels of artificial noise into the input data. Self-consistency of the approach is further demonstrated by application to field data collected in the Ligurian Sea, with chlorophyll (Chl), the nonbiogenic component of total suspended solids (TSSnd), and colored dissolved organic material (CDOM) retrieved with RMSE of 0.61 mg m⁻³, 0.35 g m⁻³, and 0.02 m⁻¹, respectively. The utility of the approach is finally demonstrated by application to depth profiles of in situ absorption and attenuation data resulting in profiles of optically significant constituents with associated error bar estimates. The advantages of this procedure lie in the simple input requirements, the avoidance of error amplification, full exploitation of the available spectral information from both absorption and attenuation channels, and the reasonably successful retrieval of constituent concentrations in an optically complex shelf sea.
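    The linear SIOP model at the heart of such inversions can be sketched as an ordinary least-squares unmixing of a measured non-water absorption spectrum, a(λ) = Σᵢ Cᵢ aᵢ*(λ). This is a minimal illustration of the forward/inverse relationship only, not the paper's full Case I / Case II procedure, and the spectral shapes in the example are synthetic:

```python
import numpy as np

def invert_iops(a_meas, siops):
    """Retrieve constituent concentrations C from a measured spectrum,
    assuming a linear SIOP model a_meas(lambda) = siops @ C, solved by
    ordinary least squares over all wavelengths.

    a_meas : (n_wavelengths,) measured non-water absorption
    siops  : (n_wavelengths, n_constituents) specific IOP spectra
    """
    C, *_ = np.linalg.lstsq(siops, a_meas, rcond=None)
    return C
```

Because every wavelength contributes to the fit, the inversion exploits the full spectral information rather than a handful of bands, which is one of the advantages the abstract highlights.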

  4. The Cu2+-nitrilotriacetic acid complex improves loading of α-helical double histidine site for precise distance measurements by pulsed ESR

    Science.gov (United States)

    Ghosh, Shreya; Lawless, Matthew J.; Rule, Gordon S.; Saxena, Sunil

    2018-01-01

    Site-directed spin labeling using two strategically placed natural histidine residues allows for the rigid attachment of paramagnetic Cu2+. This double histidine (dHis) motif enables extremely precise, narrow distance distributions resolved by Cu2+-based pulsed ESR. Furthermore, the distance measurements are easily relatable to the protein backbone-structure. The Cu2+ ion has, till now, been introduced as a complex with the chelating agent iminodiacetic acid (IDA) to prevent unspecific binding. Recently, this method was found to have two limiting concerns that include poor selectivity towards α-helices and incomplete Cu2+-IDA complexation. Herein, we introduce an alternative method of dHis-Cu2+ loading using the nitrilotriacetic acid (NTA)-Cu2+ complex. We find that the Cu2+-NTA complex shows a four-fold increase in selectivity toward α-helical dHis sites. Furthermore, we show that 100% Cu2+-NTA complexation is achievable, enabling precise dHis loading and resulting in no free Cu2+ in solution. We analyze the optimum dHis loading conditions using both continuous wave and pulsed ESR. We implement these findings to show increased sensitivity of the Double Electron-Electron Resonance (DEER) experiment in two different protein systems. The DEER signal is increased within the immunoglobulin binding domain of protein G (called GB1). We measure distances between a dHis site on an α-helix and dHis site either on a mid-strand or a non-hydrogen bonded edge-strand β-sheet. Finally, the DEER signal is increased twofold within two α-helix dHis sites in the enzymatic dimer glutathione S-transferase exemplifying the enhanced α-helical selectivity of Cu2+-NTA.

  5. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies.

    Science.gov (United States)

    Wayne, Peter M; Gow, Brian J; Costa, Madalena D; Peng, C-K; Lipsitz, Lewis A; Hausdorff, Jeffrey M; Davis, Roger B; Walsh, Jacquelyn N; Lough, Matthew; Novak, Vera; Yeh, Gloria Y; Ahn, Andrew C; Macklin, Eric A; Manor, Brad

    2014-01-01

    Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Multiscale entropy offers a complementary

  6. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies.

    Directory of Open Access Journals (Sweden)

    Peter M Wayne

    Full Text Available Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Multiscale entropy offers a complementary
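    Multiscale entropy as used in these studies — coarse-grain the sway series at increasing scales, then compute sample entropy at each scale — can be sketched like this. It is a simplified implementation; the tolerance convention (r as a fraction of the coarse-grained series' standard deviation) is a common choice, not necessarily the study's exact settings:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.15):
    """Sample entropy: -ln(matches of length m+1 / matches of length m),
    with tolerance r = r_frac * std(x), Chebyshev distance, and
    self-matches excluded."""
    x = np.asarray(x, float)
    r = r_frac * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=-1)
        n = len(templ)
        return ((d <= r).sum() - n) / 2          # exclude self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2, r_frac=0.15):
    """Coarse-grain the series at each scale (non-overlapping means)
    and compute the sample entropy of each coarse-grained series."""
    out = []
    for s in scales:
        n = len(x) // s
        cg = np.asarray(x[:n * s], float).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(cg, m, r_frac))
    return out
```

A highly regular signal (e.g., a sine) yields low sample entropy, while an irregular one yields higher values; the complexity index reported in such studies is typically the entropy summed or averaged over scales.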

  7. Cutaneous noradrenaline measured by microdialysis in complex regional pain syndrome during whole-body cooling and heating

    DEFF Research Database (Denmark)

    Terkelsen, Astrid Juhl; Gierthmühlen, Janne; Petersen, Lars J.

    2013-01-01

    and in healthy volunteers. Seven patients and nine controls completed whole-body cooling (sympathetic activation) and heating (sympathetic inhibition) induced by a whole-body thermal suit with simultaneous measurement of the skin temperature, skin blood flow, and release of dermal noradrenaline. CRPS pain...

  8. [Motivation perception measurement of intermediate directors in three complex hospitals of the Region of the Maule, Chile].

    Science.gov (United States)

    Bustamante-Ubilla, Miguel Alejandro; del Río-Rivero, María Carolina; Lobos-Andrade, Germán Enrique; Villarreal-Navarrete, Patricia Isabel

    2009-01-01

    In this work, a questionnaire was designed and the perceptions of motivation and demotivation of middle managers in three hospitals in the Region del Maule, Chile were measured. The fieldwork was carried out between September and October, 2006. A questionnaire that included 57 statements to measure attitude was administered and qualified according to a five-point Likert-type scale. The population studied included 125 professionals that supervise roughly 3 800 employees. Ten variables were identified, 5 motivational and 5 demotivational. Notable among the motivational variables are vocation and service-oriented spirit; among the demotivational variables are lack of recognition and commitment. It is affirmed that both the motivational and the demotivational variables are essentially qualitative and that economic and salary variables are less relevant.

  9. Development of force-detected THz-ESR measurement system and its application to metal porphyrin complexes

    Science.gov (United States)

    Takahashi, Hideyuki; Okamoto, Tsubasa; Ohmichi, Eiji; Ohta, Hitoshi

    Electron spin resonance spectroscopy in the terahertz region (THz-ESR) is a promising technique for studying biological materials such as metalloproteins because it directly probes the metal ion sites that play an important role in the emergence of functionality. By combining THz-ESR with force detection, the required sample mass is reduced to the order of nanograms. This feature is a great advantage because the sample preparation process for biological materials is time-consuming. We developed a force-detected THz-ESR system utilizing optical interferometry for precise measurement of the cantilever displacement. In order to suppress sensitivity fluctuations and instability of the cantilever dynamics under high magnetic field, the tuning of the interferometer is feedback-controlled during a measurement. Using this system, we successfully observed the ESR signal of hemin, a model substance of hemoglobin and myoglobin, in the THz region.

  10. Importance of the method of leaf area measurement to the interpretation of gas exchange of complex shoots

    Science.gov (United States)

    W. K. Smith; A. W. Schoettle; M. Cui

    1991-01-01

    Net CO₂ uptake in full sunlight, total leaf area (TLA), projected leaf area of detached leaves (PLA), and the silhouette area of attached leaves in their natural orientation to the sun at midday on June 1 (SLA) were measured for sun shoots of six conifer species. Among species, TLA/SLA ranged between 5.2 and 10.0 (mean = 7.3), TLA/PLA ranged between 2.5 and 2.9 (x...

  11. Procedure for measuring simultaneously the solar and visible properties of glazing with complex internal or external structures.

    Science.gov (United States)

    Gentle, A R; Smith, G B

    2014-10-20

    Accurate solar and visual transmittances of materials whose surfaces or internal structures are complex are often not easily amenable to standard procedures with laboratory-based spectrophotometers and integrating spheres. Localized "hot spots" of intensity are common in such materials, so data on small samples are unreliable. A novel device and simple protocols have been developed and have undergone validation testing. Simultaneous solar and visible transmittance and reflectance data have been acquired for skylight components and multilayer polycarbonate roof panels. The pyranometer and lux sensor setups also directly yield "light coolness" in lumens/watt. Sample areas must be large and, although mainly in sheet form, some testing has been done on curved panels. The instrument, its operation, and the simple calculations used are described. Results on a subset of diffuse and partially diffuse materials with no hot spots have been cross-checked using 150 mm integrating spheres with a spectrophotometer and the Air Mass 1.5 spectrum. Indications are that results are as good as or better than with such spheres for transmittance, but reflectance techniques need refinement for some sample types.

  12. Electrochemical oxidation of chlorpheniramine at polytyramine film doped with ruthenium (II) complex: Measurement, kinetic and thermodynamic studies

    International Nuclear Information System (INIS)

    Khudaish, Emad A.; Al-Hinaai, Mohammed; Al-Harthy, Salim; Laxman, Karthik

    2014-01-01

    Highlights: • XPS data confirm doping of ruthenium onto the polytyramine moiety. • Doping of Ru decreases the Pty resistivity and increases the electron transfer kinetics. • The resulting sensor is stable over a large range of CPM concentrations. • Estimated values of thermodynamic and kinetic parameters were comparable. • Application to commercial dosage forms was excellent and satisfactory. - Abstract: A solid-state sensor based on a polytyramine film deposited at a glassy carbon electrode and doped with a tris(2,2′-bipyridyl)Ru(II) complex (Ru/Pty/GCE) was constructed electrochemically. A redox property represented by the [Ru(bpy)3]3+/2+ couple immobilized at the Pty moiety was characterized using typical voltammetric techniques. The XPS data and AFM images confirm the grafting of Ru species on top of the Pty, while the electrochemical impedance spectroscopy (EIS) data support the immobilization of both surface modifiers onto the GCE. The constructed sensor exhibits substantial reactivity, stability and high sensitivity to chlorpheniramine maleate (CPM) oxidation. The detection limit (S/N = 3) was brought down to 338 nM using the differential pulse voltammetry method. Thermodynamic and kinetic parameters were evaluated using the hydrodynamic method. The apparent diffusion coefficient and the heterogeneous electron transfer rate constant for CPM oxidation were 2.67 × 10−5 cm2 s−1 and 3.21 × 10−3 cm s−1, respectively. Interference studies and real sample analysis were conducted with excellent performance and satisfactory results.
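
    The S/N = 3 detection-limit criterion quoted in the abstract can be sketched as follows; the blank readings and calibration slope below are illustrative values, not the paper's data.

```python
def detection_limit(blank_signals, slope):
    """Limit of detection from the 3-sigma criterion (S/N = 3):
    LOD = 3 * (sample std of the blank signal) / calibration slope."""
    n = len(blank_signals)
    mean = sum(blank_signals) / n
    sd = (sum((x - mean) ** 2 for x in blank_signals) / (n - 1)) ** 0.5
    return 3.0 * sd / slope

# Hypothetical blank readings (nA) and calibration slope (nA per uM):
lod = detection_limit([1.0, 1.2, 0.8, 1.1, 0.9], 10.0)
```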

  13. Area, speed and power measurements of FPGA-based complex orthogonal space-time block code channel encoders

    Science.gov (United States)

    Passas, Georgios; Freear, Steven; Fawcett, Darren

    2010-01-01

    Space-time coding (STC) is an important milestone in modern wireless communications. In this technique, multiple copies of the same signal are transmitted through different antennas (space) and different symbol periods (time) to improve the robustness of a wireless system by increasing its diversity gain. STCs are channel coding algorithms that can be readily implemented on a field programmable gate array (FPGA) device. This work provides figures for the amount of required FPGA hardware resources, the speed at which the algorithms can operate, and the power consumption requirements of a space-time block code (STBC) encoder. Seven encoder very high-speed integrated circuit hardware description language (VHDL) designs have been coded, synthesised and tested. Each design realises a complex orthogonal space-time block code with a different transmission matrix. All VHDL designs are parameterisable in terms of sample precision. Precisions ranging from 4 bits to 32 bits have been synthesised. Alamouti's STBC encoder design [Alamouti, S.M. (1998), 'A Simple Transmit Diversity Technique for Wireless Communications', IEEE Journal on Selected Areas in Communications, 16:1451-1458.] proved to be the best trade-off, since it is on average 3.2 times smaller, 1.5 times faster and requires slightly less power than the next best trade-off in the comparison, which is a 3/4-rate full-diversity 3Tx-antenna STBC.
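
    Alamouti's transmission matrix is simple enough to sketch directly; this minimal Python model (the paper's implementations are in VHDL, so this is only an illustration of the scheme) shows the encoding step and the column orthogonality that yields full diversity.

```python
def alamouti_encode(s1, s2):
    """2x2 Alamouti transmission matrix.

    Rows are symbol periods, columns are transmit antennas:
        period 1:  s1      s2
        period 2: -s2*     s1*
    """
    return [[s1, s2],
            [-s2.conjugate(), s1.conjugate()]]

def columns_orthogonal(m, tol=1e-12):
    """Defining property of a complex orthogonal design:
    the two antenna columns have zero inner product."""
    inner = (m[0][0] * m[0][1].conjugate() +
             m[1][0] * m[1][1].conjugate())
    return abs(inner) < tol
```

A usage example: `alamouti_encode(1 + 2j, 3 - 1j)` returns the matrix `[[1+2j, 3-1j], [-3-1j, 1-2j]]`, whose columns are orthogonal for any pair of complex symbols.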

  14. Complex measurement system for long-term monitoring of prestressed railway bridges of the new Lehrter Bahnhof in Berlin

    Science.gov (United States)

    Habel, Wolfgang R.; Hofmann, Detlef; Kohlhoff, H.; Knapp, J.; Brandes, K.; Haenichen, H.; Inaudi, Daniele

    2002-07-01

    A new central railway station - Lehrter Bahnhof - is being built in Berlin. Because of construction activities in the immediate vicinity and because of difficult soil conditions, different vertical displacements have to be expected. In order to avoid damage to the bridges and to a widely spanned glass roof that will be supported by two concrete bridges, these two bridges have to be monitored with regard to their deformation performance from the beginning of construction until commissioning, as well as for several years afterwards. For this purpose, a monitoring concept has been developed and sensors with excellent long-term stability have been chosen. This paper describes the system for monitoring settlements and heaves by means of laser-based optics and hydrostatic leveling. Additionally, strain and inclination of the prestressed concrete bridges are redundantly monitored by embedded long-gage-length fiber-optic strain sensors as well as resistive strain gages and inclinometers. Measurements on-site are referenced against measurements on two test beams loaded under well-defined laboratory and field conditions. The paper also describes the measuring concept and the sensor techniques, as well as the installation of the sensor system and first results.

  15. Measurement of environmental impacts of telework adoption amidst change in complex organizations. AT&T survey methodology and results

    Energy Technology Data Exchange (ETDEWEB)

    Atkyns, Robert; Blazek, Michele; Roitz, Joseph [AT&T, 179 Bothin Road, 94930 Fairfax, CA (United States)

    2002-10-01

    Telecommuting practices and their environmental and organizational performance impacts have stimulated research across academic disciplines. Although telecommuting trends and impact projections are reported, few true longitudinal studies involving large organizations have been conducted. Published studies typically lack the research design elements to control a major confounding variable: rapid and widespread organizational change. Yet social science 'Best Practices' and market research industry quality control procedures exist that can help manage organizational change effects and other common sources of measurement error. In 1992, AT&T established a formal, corporate-wide telecommuting policy. A research and statistical modeling initiative was implemented to measure how flexible work arrangements reduce automotive emissions. Annual employee surveys were begun in 1994. As telecommuting benefits have been increasingly recognized within AT&T, the essential construct has been redefined as 'telework.' The survey's scope has expanded to address broader organizational issues and provide guidance to multiple internal constituencies. This paper focuses upon the procedures used to reliably measure the adoption of telework practices and model their environmental impact, and contrasts those procedures with other, less reliable methodologies.

  16. Comparison of modern and traditional methods of soil sorption complex measurement: the basis of long-term studies and modelling

    Directory of Open Access Journals (Sweden)

    Kučera Aleš

    2014-03-01

    Full Text Available This paper presents the correlations between two different analytical methods of assessing soil nutrient contents. Soil nutrient contents measured using flame atomic absorption spectrometry (FAAS), which uses barium chloride extraction, were compared with those of the now-unused Gedroiz method, which uses ammonium chloride extraction (calcium by titration; magnesium, potassium and sodium by weighing). Natural forest soils from the Ukrainian Carpathians at the localities of Javorník and Pop Ivan were used. Despite the risk of analytical errors during the complicated analytical procedure, the results showed a high level of correlation between the different nutrient content measurements across the whole soil profile. This allows concentration values given in older studies to be linearly recalculated onto the scale of the modern method. In this way, results can be used to study soil chemical changes over time from soil samples that were analysed in the past using labour-intensive and time-consuming methods with a higher risk of analytical error.
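
    The linear recalculation between the two methods amounts to an ordinary least-squares fit on paired measurements of the same samples; the paired values below are hypothetical, for illustration only.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b (pure Python)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical paired measurements of an exchangeable cation
# (Gedroiz extraction vs. FAAS) on the same soil samples:
gedroiz = [2.0, 4.1, 6.3, 8.0, 10.2]
faas    = [2.3, 4.5, 6.9, 8.6, 10.9]

a, b = linear_fit(gedroiz, faas)
# Old-method values expressed on the modern (FAAS) scale:
recalculated = [a * v + b for v in gedroiz]
```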

  17. The neutron total cross-section measurement of 56Fe and 57Fe by using Japan Proton Accelerator Research Complex facility

    International Nuclear Information System (INIS)

    Kim, Eun Ae; Shvetsov, Valery; Cho, Moo Hyun; Won, Nam Kung; Kim, Kwang Soo; Yang, Sung Chul; Lee, Man Woo; Kim, Guin Yun; Yi, Kyoung Rak; Choi, Hong Yub; Ro, Tae Ik; Mizumoto, Motoharu; Katabuchi, Tatsuya; Igashira, Masayuki

    2012-01-01

    The measurement of neutron cross sections using the Time-Of-Flight (TOF) method provides important information for nuclear data research. In the present work, the neutron total cross sections of 56Fe and 57Fe have been measured in the energy range between 10 eV and 100 keV using the neutron beam produced by the 3-GeV proton synchrotron accelerator. The 3-GeV proton synchrotron accelerator is located at the Japan Proton Accelerator Research Complex (J-PARC) facility in Tokai village. In this study, the neutron total cross-section data measured with a 6Li glass scintillator detector were compared with the evaluated values of ENDF/B-VII.0.
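
    The TOF method infers neutron energy from the flight time over a known path; a non-relativistic sketch (valid in the eV-keV range studied here; the 10 m path length in the example is illustrative, not J-PARC's actual geometry):

```python
NEUTRON_MASS_KG = 1.674927e-27   # CODATA neutron mass
JOULE_PER_EV = 1.602177e-19

def tof_to_energy_ev(path_m, time_s):
    """Non-relativistic neutron kinetic energy E = 1/2 m (L/t)^2, in eV."""
    v = path_m / time_s          # flight speed in m/s
    return 0.5 * NEUTRON_MASS_KG * v * v / JOULE_PER_EV
```

For example, a 1 eV neutron travels at roughly 13.8 km/s, so over a 10 m flight path it arrives after about 0.72 ms; shorter flight times map to higher energies.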

  19. MODIS GPP/NPP for complex land use area: a case study of comparison between MODIS GPP/NPP and ground-based measurements over Korea

    Science.gov (United States)

    Shim, C.

    2013-12-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) Gross Primary Productivity (GPP)/Net Primary Productivity (NPP) products have been widely used in studies of the global terrestrial ecosystem and carbon cycle. The current MODIS product, with ~1 km spatial resolution, however, has limitations in resolving local-scale environments. The fairly comparable MODIS values found here, however, cannot assure the quality of MOD17 over the complex vegetation area of Korea, since the ground measurements, with the exception of the eddy covariance tower flux measurements, are highly inconsistent. Therefore, comprehensive experiments that represent GPP/NPP over diverse vegetation types, at a scale comparable to MODIS and with a consistent measurement technique, are necessary in order to evaluate the MODIS vegetation productivity data over Korea, which contains a large portion of highly heterogeneous vegetation area.

  20. Structure and equilibria of Ca2+-complexes of glucose and sorbitol from multinuclear (1H, 13C and 43Ca) NMR measurements supplemented with molecular modelling calculations

    Science.gov (United States)

    Pallagi, A.; Dudás, Cs.; Csendes, Z.; Forgó, P.; Pálinkó, I.; Sipos, P.

    2011-05-01

    Ca2+-complexation of D-glucose and D-sorbitol has been investigated with the aid of multinuclear (1H, 13C and 43Ca) NMR spectroscopy and ab initio quantum chemical calculations. Formation constants of the 1:1 complexes formed have been estimated from one-dimensional 13C NMR spectra obtained at constant ionic strength (1 M NaCl). Binding sites were identified from 2D 1H-43Ca NMR spectra. The 2D NMR measurements and ab initio calculations indicated that Ca2+ ions were bound in a tridentate manner, via the glycosidic OH, the ethereal oxygen in the ring and the OH on the terminal carbon, for the α- and β-anomers of glucose; for sorbitol, simultaneous binding of four hydroxyl moieties (C1, C2, C4 and C6) was suggested.

  1. Approximate Entropy as a measure of complexity in sap flow temporal dynamics of two tropical tree species under water deficit

    Directory of Open Access Journals (Sweden)

    Gustavo M. Souza

    2004-09-01

    Full Text Available Approximate Entropy (ApEn), a model-independent statistic for quantifying serial irregularity, was used to evaluate changes in the sap flow temporal dynamics of two tropical tree species subjected to water deficit. Water deficit induced a marked decrease in the sap flow of G. ulmifolia, whereas C. legalis maintained stable sap flow levels. Slight increases in time-series complexity were observed in both species under drought conditions. This study showed that ApEn can be used as a helpful tool to detect slight changes in the temporal dynamics of physiological data and to uncover patterns of plant physiological responses to environmental stimuli.
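
    Approximate Entropy itself is a short, well-defined algorithm; a minimal sketch following Pincus's definition, with the common parameter choices m = 2 and tolerance r = 0.2 times the series' standard deviation:

```python
import math

def apen(series, m=2, r=None):
    """Approximate Entropy (Pincus): regularity of a time series.
    Lower values indicate more regular dynamics; higher, more irregular."""
    n = len(series)
    if r is None:
        # common choice: tolerance of 0.2 * standard deviation
        mean = sum(series) / n
        sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
        r = 0.2 * sd

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            # count templates within Chebyshev distance r (self-match included)
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)
```

A perfectly alternating series yields an ApEn near zero, while an irregular series of the same length yields a clearly larger value, which is the contrast the study exploits.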

  2. Measurement

    NARCIS (Netherlands)

    Boumans, M.; Durlauf, S.N.; Blume, L.E.

    2008-01-01

    Measurement theory takes measurement as the assignment of numbers to properties of an empirical system so that a homomorphism between the system and a numerical system is established. To avoid operationalism, two approaches can be distinguished. In the axiomatic approach it is asserted that if the

  3. Anthropometric measurements of lip-nose complex in 11-17 years old males of Mashhad using photographic analysis

    Directory of Open Access Journals (Sweden)

    Pourmomeni Abbas Ali

    2010-04-01

    Full Text Available Introduction: Although there are several methods to evaluate facial nerve palsy, most of them are not objective. In cases of symmetric facial movements, Photoshop software is useful for objective assessment of facial nerve injuries. Materials and Methods: In this descriptive-analytic study, the facial movements of sixty normal subjects (30 females and 30 males) were photographed. Displacement of facial movements at specific landmarks was measured with Photoshop software. The collected data were then analyzed with SPSS software. Results: The mean displacement of forehead wrinkles and landmarks on the cheeks on the right and left sides was 10.6 mm, 10.1 mm and 9.4 mm, 9.7 mm, respectively. The mean displacement of the oral commissure on the right and left sides during smiling was 11.8 mm and 11.5 mm. The comparison showed no significant difference between the two sides (P>0.05). The mean distances between the landmarks (lateral canthus, oral commissure and cheek) and the facial axis were compared too. The results showed that both sides were symmetric. Conclusion: Facial movements were measurable with Photoshop software, and this method is applicable to the assessment of facial nerve palsy and also synkinesis.

  4. Measurement of activity concentration of 222Rn in ground waters drawn from two wells drilled in the Amparo Complex metamorphic rocks, municipio de Amparo, SP

    International Nuclear Information System (INIS)

    Oliveira, Igor Jose Chaves de

    2008-01-01

    A sampling system was assembled for field 222Rn activity concentration measurements in ground waters. The system consists of a sampling flask that prevents contact between the water sample and the atmosphere and a closed line for radon extraction from the water. The system, its operation and calibration are described in full detail, as well as the conversion of the measured alpha counting rates into activity concentrations. The assembled system was used for 222Rn activity concentration measurements in ground waters drawn from two wells drilled in the Amparo Complex metamorphic rocks. The wells are located in the urban area of the city of Amparo and are exploited for public water supply. One well, named Vale Verde, is 56 meters deep and crosses 18 meters of soil, 26 meters of quartz-rich gneiss and 12 meters of biotite-gneiss. The other well, named Seabra, is 117 meters deep, crosses 28 meters of soil and weathered rocks and ends in granite-gneiss. The mean activity concentrations for the year-long observation were (377 ± 25) Bq/dm3 for the Seabra well and (1282 ± 57) Bq/dm3 for the Vale Verde well. The 222Rn activity concentrations fall in the activity concentration range reported in the literature for similar geology areas and are larger than the concentrations found in neighboring areas of the same metamorphic Complex. The seasonal activity concentration variations seem to correlate with rainfall variations in the study area. (author)
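
    Relating a delayed count to the activity at sampling time requires a decay correction with the 222Rn half-life (3.8235 d, a standard nuclear-data value); a minimal sketch with illustrative numbers:

```python
import math

RN222_HALF_LIFE_DAYS = 3.8235

def activity_at_sampling(measured_bq, delay_days):
    """Correct a measured 222Rn activity back to the sampling time.

    A0 = A_measured * exp(lambda * delay), with lambda = ln(2) / T_half.
    """
    lam = math.log(2) / RN222_HALF_LIFE_DAYS
    return measured_bq * math.exp(lam * delay_days)
```

For example, a sample counted one half-life (about 3.8 days) after collection has decayed to half its original activity, so the correction doubles the measured value.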

  5. Intentional cargo disruption by nefarious means: Examining threats, systemic vulnerabilities and securitisation measures in complex global supply chains.

    Science.gov (United States)

    McGreevy, Conor; Harrop, Wayne

    2015-01-01

    Global trade and commerce require products to be securely contained and transferred in a timely way across great distances and between national boundaries. Throughout the process, cargo and containers are stored, handled and checked by a range of authorities and authorised agents. Intermodal transportation involves the use of container ships, planes, railway systems, land bridges, road networks and barges. This paper examines the nature of intentional disruption and the nefarious risks associated with the movement of cargo and container freight. The paper explores the main threats, vulnerabilities and security measures relevant to significant intermodal transit risk issues such as theft, piracy, terrorism, contamination, counterfeiting and product tampering. Three risk and vulnerability models are examined, and basic standards and regulations relevant to the safe and secure transit of container goods across international supply networks are outlined.

  6. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies

    Science.gov (United States)

    Wayne, Peter M.; Gow, Brian J.; Costa, Madalena D.; Peng, C.-K.; Lipsitz, Lewis A.; Hausdorff, Jeffrey M.; Davis, Roger B.; Walsh, Jacquelyn N.; Lough, Matthew; Novak, Vera; Yeh, Gloria Y.; Ahn, Andrew C.; Macklin, Eric A.; Manor, Brad

    2014-01-01

    Background Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. Objectives To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control, and the relationships between sway measures and physical function in healthy older adults. Methods A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed Up and Go tests characterized physical function. Results At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023) and ML EO (P ...) ... adults. Trial Registration ClinicalTrials.gov NCT01340365 PMID:25494333
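
    Multiscale entropy, the complexity measure used here, coarse-grains the sway series at increasing scales and takes the sample entropy of each; a compact sketch (the parameter values m = 2 and a fixed r = 0.15 are illustrative choices, not the study's exact settings):

```python
import math

def sample_entropy(x, m=2, r=0.15):
    """Sample entropy: -log of the conditional probability that
    sequences similar for m points stay similar for m + 1 points."""
    n = len(x)
    a = b = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            # Chebyshev distance between the two length-m templates
            if max(abs(p - q) for p, q in zip(x[i:i+m], x[j:j+m])) <= r:
                b += 1
                if abs(x[i + m] - x[j + m]) <= r:
                    a += 1
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3)):
    """Coarse-grain the series at each scale (non-overlapping window
    averages), then take the sample entropy of each grained series."""
    out = []
    for s in scales:
        grained = [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]
        out.append(sample_entropy(grained))
    return out
```

A constant signal has zero sample entropy at every scale, whereas noisy, irregular fluctuations score higher; complexity indices are typically summaries (for example, the area) of this entropy-versus-scale curve.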

  7. Measurement of Functional Cognition and Complex Everyday Activities in Older Adults with Mild Cognitive Impairment and Mild Dementia: Validity of the Large Allen's Cognitive Level Screen.

    Science.gov (United States)

    Wesson, Jacqueline; Clemson, Lindy; Crawford, John D; Kochan, Nicole A; Brodaty, Henry; Reppermund, Simone

    2017-05-01

    To explore the validity of the Large Allen's Cognitive Level Screen-5 (LACLS-5) as a performance-based measure of functional cognition, representing an ability to perform complex everyday activities in older adults with mild cognitive impairment (MCI) and mild dementia living in the community. Using cross-sectional data from the Sydney Memory and Ageing Study, 160 community-dwelling older adults with normal cognition (CN; N = 87), MCI (N = 43), or dementia (N = 30) were studied. Functional cognition (LACLS-5), complex everyday activities (Disability Assessment for Dementia [DAD], Assessment of Motor and Process Skills [AMPS]), and neuropsychological measures were used. Participants with dementia performed worse than CN on all clinical measures, and MCI participants were intermediate. Correlational analyses showed that LACLS-5 was most strongly related to AMPS Process scores, the DAD instrumental activities of daily living subscale, the Mini-Mental State Exam, Block Design, Logical Memory, and Trail Making Test B. Multiple regression analysis indicated that both cognitive (Block Design) and functional measures (AMPS Process score) and sex predicted LACLS-5 performance. Finally, LACLS-5 was able to adequately discriminate between CN and dementia and between MCI and dementia but was unable to reliably distinguish between CN and MCI. Construct validity, including convergent and discriminative validity, was supported. LACLS-5 is a valid performance-based measure for evaluating functional cognition. Discriminative validity is acceptable for identifying mild dementia but requires further refinement for detecting MCI. Copyright © 2017 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  8. Predicting membrane flux decline from complex mixtures using flow-field flow fractionation measurements and semi-empirical theory.

    Science.gov (United States)

    Pellegrino, J; Wright, S; Ranvill, J; Amy, G

    2005-01-01

    Flow-Field Flow Fractionation (FI-FFF) is an idealization of the crossflow membrane filtration process in that (1) the filtration flux and crossflow velocity are constant from beginning to end of the device, (2) the process is a relatively well-defined laminar-flow hydrodynamic condition, and (3) the solutes are introduced as a pulse input that spreads due to interactions with each other and the membrane in the dilute-solution limit. We have investigated the potential for relating FI-FFF measurements to membrane fouling. An advection-dispersion transport model was used to provide 'ideal' (defined as spherical, non-interacting solutes) solute residence time distributions (RTDs) for comparison with 'real' RTDs obtained experimentally at different cross-field velocities and solution ionic strengths. An RTD moment analysis based on a particle diameter probability density function was used to extract 'effective' characteristic properties, rather than uniquely defined characteristics, of the standard solute mixture. A semi-empirical unsteady-state flux decline model was developed that uses solute property parameters. Three modes of flux decline are included: (1) concentration polarization, (2) cake buildup, and (3) adsorption on/in pores. We have used this model to test the hypothesis that an analysis of a residence time distribution using FI-FFF can describe 'effective' solute properties or indices that can be related to membrane flux decline in crossflow membrane filtration. Constant-flux filtration studies included changes of transport hydrodynamics (solvent flux to solute back-diffusion (J/k) ratios), solution ionic strength, and feed water composition for filtration using a regenerated cellulose ultrafiltration membrane. Tests of the modeling hypothesis were compared with experimental results from the filtration measurements using several correction parameters based on the mean and variance of the solute RTDs. The corrections used to modify the boundary layer
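
    The RTD moment analysis referred to above reduces to computing the first moment (mean residence time) and the second central moment (variance) of the normalized distribution; a sketch using trapezoidal integration, with a synthetic tracer pulse standing in for real detector data:

```python
def rtd_moments(times, signal):
    """Mean residence time and variance of a residence-time
    distribution, via trapezoidal numerical integration."""
    def trapz(y):
        return sum(0.5 * (y[i] + y[i + 1]) * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    area = trapz(signal)
    e = [s / area for s in signal]                 # normalized E(t)
    mean = trapz([t * ei for t, ei in zip(times, e)])
    var = trapz([(t - mean) ** 2 * ei for t, ei in zip(times, e)])
    return mean, var
```

For a symmetric triangular pulse on [0, 2] peaked at t = 1, the mean is 1 and the variance is 1/6, which makes a convenient check of the implementation.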

  9. Development of a Multi-Point Quantitation Method to Simultaneously Measure Enzymatic and Structural Components of the Clostridium thermocellum Cellulosome Protein Complex

    Energy Technology Data Exchange (ETDEWEB)

    Dykstra, Andrew B [ORNL; St. Brice, Lois [Dartmouth College; Rodriguez, Jr., Miguel [ORNL; Raman, Babu [ORNL; Izquierdo, Javier [ORNL; Cook, Kelsey [ORNL; Lynd, Lee R [ORNL; Hettich, Robert {Bob} L [ORNL

    2014-01-01

    Clostridium thermocellum has emerged as a leading bioenergy-relevant microbe due to its ability to solubilize cellulose into carbohydrates, mediated by multi-component membrane-attached complexes termed cellulosomes. To probe microbial cellulose utilization rates, it is desirable to be able to measure the concentrations of saccharolytic enzymes and estimate the total amount of cellulosome present on a mass basis. Current cellulase determination methodologies involve labor-intensive purification procedures and only allow for indirect determination of abundance. We have developed a method using multiple reaction monitoring mass spectrometry (MRM-MS) to simultaneously quantitate both enzymatic and structural components of the cellulosome protein complex in samples ranging in complexity from purified cellulosomes to whole cell lysates, as an alternative to a previously developed enzyme-linked immunosorbent assay (ELISA) method of cellulosome quantitation. The precision of the cellulosome mass concentration in technical replicates is better than 5% relative standard deviation for all samples, indicating high precision for determination of the mass concentration of cellulosome components.

  10. Engineering geological zonation of a complex landslide system through seismic ambient noise measurements at the Selmun Promontory (Malta)

    Science.gov (United States)

    Iannucci, Roberto; Martino, Salvatore; Paciello, Antonella; D'Amico, Sebastiano; Galea, Pauline

    2018-05-01

    The cliff slope of the Selmun Promontory, located in the northern part of the island of Malta (Central Mediterranean Sea) close to the coastline, is involved in a landslide process, as exhibited by the large block-size talus at its bottom. The landslide process is related to the geological succession outcropping in the Selmun area, characterized by the superposition of a grained limestone over a plastic clay, which induces a lateral spreading phenomenon associated with detachment and collapse of different-size rock blocks. The landslide process shapes a typical landscape with a stable plateau of stiff limestone bordered by an unstable cliff slope. The ruins of Għajn Ħadid Tower, the first of the 13 watchtowers built in 1658 by the Grand Master Martin de Redin, stand out on the Selmun Promontory. The conservation of this important heritage site, already damaged by an earthquake which struck the Maltese Archipelago on 12 October 1856, is currently threatened by a progressive retreat of the landslide process towards the inland plateau area. During 2015 and 2016, field surveys were carried out to derive an engineering geological model of the Selmun Promontory. After a high-resolution geomechanical survey, the spatial distribution of the joints affecting the limestone was obtained. At the same time, 116 single-station noise measurements were carried out to cover the inland part and edge of the limestone plateau as well as the slope where the clays outcrop. The obtained 1-hour time histories were analysed through the horizontal-to-vertical spectral ratio technique, as well as polarization and ellipticity analysis of particle motion, to define the local seismic response in zones having different stability conditions, that is, related to the presence of unstable rock blocks characterized by different vibrational modes. The results obtained demonstrate the suitability of passive seismic geophysical techniques for zoning landslide hazard in the case of rock slopes and prove the relevance of
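
    The horizontal-to-vertical spectral ratio (HVSR) technique divides the averaged horizontal-component amplitude spectrum by the vertical one; a minimal sketch using a naive DFT (real processing would add windowing, spectral smoothing, FFTs and averaging over many time windows):

```python
import cmath

def amplitude_spectrum(x):
    """Magnitude of the DFT up to the Nyquist bin (naive O(n^2);
    adequate for a sketch on short records)."""
    n = len(x)
    return [abs(sum(x[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                    for k in range(n)))
            for f in range(n // 2)]

def hvsr(ns, ew, vert):
    """Horizontal-to-vertical spectral ratio of a three-component
    record: quadratic mean of the two horizontal spectra over the
    vertical spectrum, bin by bin."""
    sn, se, sv = map(amplitude_spectrum, (ns, ew, vert))
    return [((h1 ** 2 + h2 ** 2) / 2) ** 0.5 / v if v > 0 else 0.0
            for h1, h2, v in zip(sn, se, sv)]
```

In practice the frequency of the HVSR peak is read off as the site's fundamental resonance frequency; in the toy test below, horizontals carrying a sinusoid at bin 10 over a flat vertical spectrum peak exactly there.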

  11. Characterization of the low-temperature triplet state of chlorophyll in photosystem II core complexes: Application of phosphorescence measurements and Fourier transform infrared spectroscopy.

    Science.gov (United States)

    Zabelin, Alexey A; Neverov, Konstantin V; Krasnovsky, Alexander A; Shkuropatova, Valentina A; Shuvalov, Vladimir A; Shkuropatov, Anatoly Ya

    2016-06-01

    Phosphorescence measurements at 77 K and light-induced FTIR difference spectroscopy at 95 K were applied to study the triplet state of chlorophyll a ((3)Chl) in photosystem II (PSII) core complexes isolated from spinach. Using both methods, (3)Chl was observed in core preparations with a doubly reduced primary quinone acceptor QA. The spectral parameters of the Chl phosphorescence resemble those in isolated PSII reaction centers (RCs). The main spectral maximum and the lifetime of the phosphorescence were 955±1 nm and 1.65±0.05 ms, respectively; in the excitation spectrum, the absorption maxima of all core complex pigments (Chl, pheophytin a (Pheo), and β-carotene) were observed. The differential signal at 1667(-)/1628(+) cm(-1), reflecting a downshift of the stretching frequency of the 13(1)-keto C=O group of Chl, was found to dominate the triplet-minus-singlet FTIR difference spectrum of the core complexes. Based on the FTIR results and literature data, it is proposed that (3)Chl is mostly localized on the accessory chlorophyll that is in triplet equilibrium with P680. Analysis of the data suggests that the Chl triplet state responsible for the phosphorescence and the FTIR difference spectrum is mainly generated by charge recombination in the reaction center radical pair P680(+)PheoD1(-), and that the energy and temporal parameters of this triplet state, as well as the molecular environment and interactions of the triplet-bearing Chl molecule, are similar in the PSII core complexes and isolated PSII RCs. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Determination of solubility isotherms of barium and strontium nitrates in the system acetic acid-water at 25 °C

    Energy Technology Data Exchange (ETDEWEB)

    Hubicki, W.; Piskorek, M. (Uniwersytet Marii Curie-Sklodowskiej, Lublin (Poland))

    1976-01-01

    Investigations of the solubility of barium and strontium nitrates were carried out in the system acetic acid - water at 25 °C. Comparing the solubility isotherms of barium and strontium nitrates, one can observe that it is possible to separate admixtures of barium from strontium nitrates by fractional crystallization of these nitrates from acetic acid solution at temperatures below 31.3 °C, i.e. below the temperature of the transformation Sr(NO3)2·4H2O ⇌ Sr(NO3)2 + 4H2O for aqueous solution.

  13. ReaxFF molecular dynamics simulation of intermolecular structure formation in acetic acid-water mixtures at elevated temperatures and pressures

    Science.gov (United States)

    Sengul, Mert Y.; Randall, Clive A.; van Duin, Adri C. T.

    2018-04-01

    The intermolecular structure formation in liquid and supercritical acetic acid-water mixtures was investigated using ReaxFF-based molecular dynamics simulations. The microscopic structures of acetic acid-water mixtures with different acetic acid mole fractions (1.0 ≥ xHAc ≥ 0.2) at ambient and critical conditions were examined. The potential energy surface associated with the dissociation of acetic acid molecules was calculated using a metadynamics procedure to optimize the dissociation energy of ReaxFF potential. At ambient conditions, depending on the acetic acid concentration, either acetic acid clusters or water clusters are dominant in the liquid mixture. When acetic acid is dominant (0.4 ≤ xHAc), cyclic dimers and chain structures between acetic acid molecules are present in the mixture. Both structures disappear at increased water content of the mixture. It was found by simulations that the acetic acid molecules released from these dimer and chain structures tend to stay in a dipole-dipole interaction. These structural changes are in agreement with the experimental results. When switched to critical conditions, the long-range interactions (e.g., second or fourth neighbor) disappear and the water-water and acetic acid-acetic acid structural formations become disordered. The simulated radial distribution function for water-water interactions is in agreement with experimental and computational studies. The first neighbor interactions between acetic acid and water molecules are preserved at relatively lower temperatures of the critical region. As higher temperatures are reached in the critical region, these interactions were observed to weaken. These simulations indicate that ReaxFF molecular dynamics simulations are an appropriate tool for studying supercritical water/organic acid mixtures.
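
    Pair structure of the kind discussed above (e.g. the water-water interactions) is conventionally quantified through the radial distribution function g(r). The sketch below is illustrative only, not the ReaxFF workflow: it computes g(r) from one frame of particle coordinates in a cubic periodic box, normalizing pair-distance histograms by the ideal-gas expectation; all names and parameters are hypothetical.

```python
import numpy as np

def radial_distribution(positions, box, r_max, n_bins=100):
    """Histogram pair distances under periodic boundary conditions and
    normalize by the ideal-gas expectation to obtain g(r).
    Requires r_max < box / 2 for the minimum-image convention."""
    n = len(positions)
    rho = n / box**3                              # number density
    # All displacement vectors, wrapped to the nearest periodic image.
    diff = positions[:, None, :] - positions[None, :, :]
    diff -= box * np.round(diff / box)
    dist = np.sqrt((diff**2).sum(-1))
    iu = np.triu_indices(n, k=1)                  # unique pairs i < j
    counts, edges = np.histogram(dist[iu], bins=n_bins, range=(0.0, r_max))
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = rho * shell_vol * n / 2.0             # expected pair counts, ideal gas
    r = 0.5 * (edges[1:] + edges[:-1])
    return r, counts / ideal
```

For an uncorrelated (ideal-gas) configuration, g(r) fluctuates around 1 at all distances; structure such as the first-neighbor acid-water peak shows up as excursions above 1.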

  14. Design of a Combined Beacon Receiver and Digital Radiometer for 40 GHz Propagation Measurements at the Madrid Deep Space Communications Complex

    Science.gov (United States)

    Zemba, Michael; Nessel, James; Morabito, David

    2017-01-01

    NASA Glenn Research Center (GRC) and the Jet Propulsion Laboratory (JPL) have jointly developed an atmospheric propagation terminal to measure and characterize propagation phenomena at 40 GHz at the Madrid Deep Space Communications Complex (MDSCC) in Robledo de Chavela, Spain. The hybrid Q-band system utilizes a novel design which combines a 40 GHz beacon receiver and digital radiometer into the same RF front-end and observes the 39.402 GHz beacon of the European Space Agency's Alphasat Aldo Paraboni TDP5 experiment. Atmospheric measurements include gaseous absorption, rain fade, and scintillation. The radiometric measurement is calibrated by means of an included noise diode as well as tipping calibration. The goals of these measurements are to assist MDSCC mission operations as the facility increasingly supports Ka-band missions, as well as to contribute to the development and improvement of International Telecommunications Union (ITU) models for prediction of communications systems performance within the Q-band through the Aldo Paraboni Experiment. Herein, we provide an overview of the system design, characterization, and plan of operations which commenced at the MDSCC beginning in March 2017.

  15. Measuring $\

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Jessica Sarah [Univ. of Cambridge (United Kingdom)]

    2011-01-01

    The MINOS Experiment consists of two steel-scintillator calorimeters, sampling the long baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the 'atmospheric' neutrino mixing parameters, Δm²_atm and sin²(2θ_atm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations on the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a ν_μ disappearance analysis, and presents a new sophisticated fitting software framework, which employs a maximum likelihood method to extract the best fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors, and is capable of fitting multiple event samples (defined by the selections applied) in parallel, together with any combination of energy dependent and independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset if the true mixing angle was non-maximal was also investigated, with an improved neutrino energy reconstruction for very low energy events. The best fit oscillation parameters, obtained by the fitting software and incorporating resolution information, were: |Δm²| = 2.32 (+0.12/−0.08) × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world best measurement of the atmospheric neutrino mass splitting.
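
    The maximum-likelihood disappearance fit described above can be illustrated with a toy sketch. This is not the MINOS fitting framework: the two-flavour survival probability P(ν_μ→ν_μ) = 1 − sin²(2θ)·sin²(1.267·Δm²·L/E) (Δm² in eV², L in km, E in GeV) is the standard textbook form, and the spectrum, binning, and grid are invented for illustration.

```python
import numpy as np

L = 735.0  # km, Far Detector baseline quoted in the abstract

def survival(E, dm2, s22):
    """Two-flavour nu_mu survival probability."""
    return 1.0 - s22 * np.sin(1.267 * dm2 * L / E)**2

E_bins = np.linspace(0.5, 10.0, 40)        # GeV bin centres (toy)
unosc = 1000.0 * np.exp(-0.3 * E_bins)     # toy unoscillated spectrum

def nll(observed, dm2, s22):
    """Poisson negative log-likelihood, constant terms dropped."""
    mu = unosc * survival(E_bins, dm2, s22)
    return np.sum(mu - observed * np.log(mu))

def grid_fit(observed, dm2_grid, s22_grid):
    """Brute-force scan for the minimum of the NLL surface."""
    best = min((nll(observed, d, s), d, s)
               for d in dm2_grid for s in s22_grid)
    return best[1], best[2]
```

Fitting an Asimov dataset (expected counts with no statistical fluctuations) recovers the injected parameters exactly, since the Poisson NLL is minimized where the prediction matches the observation bin by bin.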

  16. Measuring the 'complexity' of sound

    Indian Academy of Sciences (India)

    Specialized regions of the brain analyse different types of sounds [1]. The left panel of figure 1 shows examples of sound-pressure waveforms from natural sounds, which are shown in the right panels in the spectrographic representation using a 45 Hz resolution; SFM(t) is plotted against time for different environmental sounds.

  17. Simulation, measurement, and mitigation of beam instability caused by the kicker impedance in the 3-GeV rapid cycling synchrotron at the Japan Proton Accelerator Research Complex

    Science.gov (United States)

    Saha, P. K.; Shobuda, Y.; Hotchi, H.; Harada, H.; Hayashi, N.; Kinsho, M.; Tamura, F.; Tani, N.; Yamamoto, M.; Watanabe, Y.; Chin, Yong Ho; Holmes, J. A.

    2018-02-01

    The transverse impedance of eight extraction pulsed kicker magnets is a strong beam instability source in the 3-GeV rapid cycling synchrotron (RCS) at the Japan Proton Accelerator Research Complex. Significant beam instability occurs even at half of the designed 1 MW beam power when the chromaticity (ξ) is fully corrected for the entire acceleration cycle by using ac sextupole (SX) fields. However, if ξ is fully corrected only at the injection energy by using dc SX fields, the beam is stable. In order to study realistic beam instability scenarios, including the effect of space charge, and to determine practical measures to accomplish 1 MW beam power, we enhance the ORBIT particle tracking code to incorporate all realistic time-dependent machine parameters, including the time dependence of the impedance itself. The beam stability properties beyond 0.5 MW beam power are found to be very sensitive to a number of parameters in both simulations and measurements. In order to stabilize a beam at 1 MW beam power, two practical measures based on detailed and systematic simulation studies are determined, namely, (i) proper manipulation of the betatron tunes during acceleration and (ii) reduction of the dc SX field to reduce the ξ correction even at injection. The simulation results are well reproduced by measurements, and, as a consequence, an acceleration to 1 MW beam power is successfully demonstrated. In this paper, details of the ORBIT simulation and the corresponding experimental results up to 1 MW of beam power are presented. To further increase the RCS beam power, beam stability issues and possible measures beyond 1 MW beam power are also considered.

  18. A near-optimal low complexity sensor fusion technique for accurate indoor localization based on ultrasound time of arrival measurements from low-quality sensors

    Science.gov (United States)

    Mitilineos, Stelios A.; Argyreas, Nick D.; Thomopoulos, Stelios C. A.

    2009-05-01

    A fusion-based localization technique for location-based services in indoor environments is introduced herein, based on ultrasound time-of-arrival measurements from multiple off-the-shelf range estimating sensors which are used in a market-available localization system. In-situ field measurement results indicated that the respective off-the-shelf system was unable to estimate position in most of the cases, while the underlying sensors are of low quality and yield highly inaccurate range and position estimates. An extensive analysis is performed and a model of the sensor-performance characteristics is established. A low-complexity but accurate sensor fusion and localization technique is then developed, which consists of evaluating multiple sensor measurements and selecting the one that is considered most accurate based on the underlying sensor model. Optimality, in the sense of a genie selecting the optimum sensor, is subsequently evaluated and compared to the proposed technique. The experimental results indicate that the proposed fusion method exhibits near-optimal performance and, albeit being theoretically suboptimal, it largely overcomes most flaws of the underlying single-sensor system, resulting in a localization system of increased accuracy, robustness and availability.
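
    The select-the-most-accurate-sensor idea can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: each sensor is assigned a calibrated error model (bias plus range-dependent noise), and for each epoch the reading whose model predicts the smallest total error at the current range estimate is selected. All sensor parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-sensor error models, as would be established from
# field calibration: reading = truth + bias + noise(range).
SENSORS = [
    {"bias": 0.00, "std": lambda r: 0.10 * r},  # accurate close in, degrades with range
    {"bias": -0.10, "std": lambda r: 0.20},     # moderate, range-independent noise
    {"bias": 0.00, "std": lambda r: 0.60},      # low-quality sensor
]

def measure(truth, sensor):
    """Simulate one range reading from a sensor."""
    return truth + sensor["bias"] + rng.normal(0.0, sensor["std"](truth))

def fuse(readings, range_estimate):
    """Select the reading whose sensor model predicts the smallest total
    error (|bias| + noise std) at the current range estimate."""
    scores = [abs(s["bias"]) + s["std"](range_estimate) for s in SENSORS]
    return readings[int(np.argmin(scores))]
```

Because the selection rule switches between sensors as the predicted error changes with range, the fused output tracks whichever sensor the model deems most accurate, which is the sense in which the paper's scheme approaches the genie-aided optimum.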

  19. Long-distance mountain biking does not disturb the measurement of total, free or complexed prostate-specific antigen in healthy men.

    Science.gov (United States)

    Herrmann, Markus; Scharhag, Jürgen; Sand-Hill, Marga; Kindermann, Wilfried; Herrmann, Wolfgang

    2004-03-01

    Mechanical manipulation of the prostate is a generally accepted interfering factor for the measurement of prostate-specific antigen (PSA). However, only a few studies have focused on common daily mechanical manipulations, such as bicycle riding. Furthermore, physical exercise is also supposed to modulate PSA serum concentration. Long-distance mountain biking is an excellent model to study the combined effect of mechanical prostate manipulation by bicycle riding and strenuous endurance exercise on total, free and complexed PSA (tPSA, fPSA, cPSA). We investigated tPSA, fPSA and cPSA in 42 healthy male cyclists (mean age 35±6 years) before and after a 120 km off-road mountain bike race. Blood sampling was done before, 15 min and 3 h after the race. Mean race time was 342±65 min. All athletes had normal serum levels of tPSA, fPSA and cPSA. None of these parameters was modified by the race. In healthy men the measurement of tPSA, fPSA and cPSA is not disturbed by preceding long-distance mountain biking or endurance exercise. Based on the present data, there is no evidence for a recommendation to limit bicycle riding or physical activity before the measurement of tPSA, fPSA or cPSA.

  20. Quantification of differences between nailfold capillaroscopy images with a scleroderma pattern and normal pattern using measures of geometric and algorithmic complexity.

    Science.gov (United States)

    Urwin, Samuel George; Griffiths, Bridget; Allen, John

    2017-02-01

    This study aimed to quantify and investigate differences in the geometric and algorithmic complexity of the microvasculature in nailfold capillaroscopy (NFC) images displaying a scleroderma pattern and those displaying a 'normal' pattern. 11 NFC images were qualitatively classified by a capillary specialist as indicative of 'clear microangiopathy' (CM), i.e. a scleroderma pattern, and 11 as 'not clear microangiopathy' (NCM), i.e. a 'normal' pattern. Pre-processing was performed, and fractal dimension (FD) and Kolmogorov complexity (KC) were calculated following image binarisation. FD and KC were compared between groups, and a k-means cluster analysis (k = 2) was performed on all images, without prior knowledge of the group assigned to them (i.e. CM or NCM), using FD and KC as inputs. CM images had significantly reduced FD and KC compared to NCM images, and the cluster analysis gave promising evidence that quantitative classification of images into CM and NCM groups is possible using the mathematical measures of FD and KC. The analysis techniques used show promise for quantitative microvascular investigation in patients with systemic sclerosis.
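
    Both measures have standard practical estimators: fractal dimension via box counting, and Kolmogorov complexity via compressed file size (a common upper-bound surrogate, since true KC is uncomputable). The following is a minimal sketch for binary images, not the authors' exact pipeline.

```python
import zlib
import numpy as np

def box_counting_dimension(img):
    """Estimate the fractal dimension of a binary image by box counting:
    fit log(occupied boxes) against log(1 / box_size)."""
    n = img.shape[0]
    sizes = [s for s in (2, 4, 8, 16, 32) if s < n]
    counts = []
    for s in sizes:
        m = n - n % s                       # crop so the grid tiles evenly
        blocks = img[:m, :m].reshape(m // s, s, m // s, s)
        counts.append(int((blocks.sum(axis=(1, 3)) > 0).sum()))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def kolmogorov_proxy(img):
    """Approximate Kolmogorov complexity by the zlib-compressed size of
    the bit-packed binary image (a practical surrogate for KC)."""
    return len(zlib.compress(np.packbits(img.astype(np.uint8)).tobytes(), 9))
```

A solid filled region recovers dimension 2, while sparser, more irregular capillary skeletons fall below it; likewise, structureless noise compresses poorly and scores a higher KC proxy than a uniform image.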

  1. On complexity and homogeneity measures in predicting biological aggressiveness of prostate cancer; Implication of the cellular automata model of tumor growth.

    Science.gov (United States)

    Tanase, Mihai; Waliszewski, Przemyslaw

    2015-12-01

    We propose a novel approach for the quantitative evaluation of aggressiveness in prostate carcinomas. The spatial distribution of cancer cell nuclei was characterized by the global spatial fractal dimensions D0, D1, and D2. Two hundred eighteen prostate carcinomas were stratified into the classes of equivalence using results of ROC analysis. A simulation of the cellular automata mix defined a theoretical frame for a specific geometric representation of the cell nuclei distribution called a local structure correlation diagram (LSCD). The LSCD and dispersion Hd were computed for each carcinoma. Data mining generated some quantitative criteria describing tumor aggressiveness. Alterations in tumor architecture along progression were associated with some changes in both shape and the quantitative characteristics of the LSCD consistent with those in the automata mix model. Low-grade prostate carcinomas with low complexity and very low biological aggressiveness are defined by the condition D0 1.764 and Hd < 38. The novel homogeneity measure Hd identifies carcinomas with very low aggressiveness within the class of complexity C1 or carcinomas with very high aggressiveness in the class C7. © 2015 Wiley Periodicals, Inc.

  2. The Na+ transport in gram-positive bacteria defect in the Mrp antiporter complex measured with 23Na nuclear magnetic resonance.

    Science.gov (United States)

    Górecki, Kamil; Hägerhäll, Cecilia; Drakenberg, Torbjörn

    2014-01-15

    (23)Na nuclear magnetic resonance (NMR) has previously been used to monitor Na(+) translocation across membranes in gram-negative bacteria and in various other organelles and liposomes, using a membrane-impermeable shift reagent to resolve the signals resulting from internal and external Na(+). In this work, the (23)Na NMR method was adapted for measurements of internal Na(+) concentration in the gram-positive bacterium Bacillus subtilis, with the aim of assessing the Na(+) translocation activity of the Mrp (multiple resistance and pH) antiporter complex, a member of the cation proton antiporter-3 (CPA-3) family. The sodium-sensitive growth phenotype observed in a B. subtilis strain with the gene encoding MrpA deleted could indeed be correlated to the inability of this strain to maintain a lower internal Na(+) concentration than the external one. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Butterfly Deformation Modes in a Photoexcited Pyrazolate-Bridged Pt Complex Measured by Time-Resolved X-Ray Scattering in Solution

    DEFF Research Database (Denmark)

    Haldrup, Kristoffer; Dohn, Asmus Ougaard; Shelby, Megan L.

    2016-01-01

    … excited state has remained scarce. Using time-resolved wide-angle X-ray scattering (WAXS), the excited triplet state molecular structure of [Pt(ppy)(μ-t-Bu2pz)]2 (ppy = 2-phenylpyridine; t-Bu2pz = 3,5-di-tert-butylpyrazolate), complex 1, was obtained in a dilute (0.5 mM) toluene solution utilizing the monochromatic X-ray pulses at Beamline 11IDD of the Advanced Photon Source. The excited-state structural analysis of 1 was performed based on the results from both transient WAXS measurements and density functional theory calculations to shed light on the primary structural changes in its triplet metal-metal…

  4. Spatial variations of wet deposition rates in an extended region of complex topography deduced from measurements of 210Pb soil inventories

    International Nuclear Information System (INIS)

    Branford, D.; Mourne, R.W.; Fowler, D.

    1998-01-01

    The radionuclide 210Pb derived from gaseous 222Rn present in the atmosphere becomes attached to the same aerosols as the bulk of the main pollutants sulphur and nitrogen. When scavenged from the atmosphere by precipitation, the 210Pb is readily attached to organic matter in the surface horizons of the soil. Inventories of 210Pb in soil can thus be used to measure the spatial variations in wet (or cloud) deposition due to orography averaged over many precipitation events (the half-life of 210Pb is 22.3 years). Measurements of soil 210Pb inventories were made along a transect through complex terrain in the Scottish Highlands to quantify the orographic enhancement of wet deposition near the summits of the three mountains Ben Cruachan, Beinn Dorain and Ben Lawers, which, respectively, lie at distances of approximately 30, 55 and 80 km from the coast in the direction of the prevailing wind. The inventory of 210Pb on the wind-facing slopes of Ben Cruachan shows an increase with altitude that rises faster than the precipitation rate, which is indicative of seeder-feeder scavenging of orographic cloud occurring around the summit. Results for Beinn Dorain show a smaller rise with altitude whereas those for Ben Lawers give no indication of a rise. It is concluded that the seeder-feeder mechanism in regions of complex topography decreases in effectiveness as a function of distance inland along the direction of the prevailing wind. (Copyright (c) 1998 Elsevier Science B.V., Amsterdam. All rights reserved.)
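
    The reason a soil inventory integrates deposition over many events is the decay balance: under a constant deposition flux F, the inventory obeys dI/dt = F − λI and equilibrates at I = F/λ, with λ = ln 2 / 22.3 yr⁻¹ from the half-life quoted above, so the ratio of equilibrium inventories at two sites equals the ratio of their long-term deposition rates. A minimal sketch of this balance (units and values illustrative):

```python
import math

T_HALF = 22.3                     # years, 210Pb half-life (from the abstract)
LAM = math.log(2) / T_HALF        # decay constant, 1/yr

def equilibrium_inventory(flux):
    """Steady-state inventory (e.g. Bq m^-2) for a constant deposition
    flux (Bq m^-2 yr^-1): dI/dt = F - lambda * I = 0  =>  I = F / lambda."""
    return flux / LAM

def inventory_after(flux, years, i0=0.0):
    """Inventory after a finite accumulation time, starting from i0,
    from the solution of the linear balance equation."""
    return flux / LAM + (i0 - flux / LAM) * math.exp(-LAM * years)
```

After a few half-lives the inventory is effectively at its equilibrium value, which is why inventories reflect the long-term mean wet deposition rather than individual storms.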

  5. A comparison of the effects of 6 weeks of traditional resistance training, plyometric training, and complex training on measures of strength and anthropometrics.

    Science.gov (United States)

    MacDonald, Christopher J; Lamont, Hugh S; Garner, John C

    2012-02-01

    Complex training (CT; alternating between heavy and lighter load resistance exercises with similar movement patterns within an exercise session) is a form of training that may potentially bring about a state of postactivation potentiation, resulting in increased dynamic power (Pmax) and rate of force development during the lighter load exercise. Such a method may be more effective than either modality, independently for developing strength. The purpose of this research was to compare the effects of resistance training (RT), plyometric training (PT), and CT on lower body strength and anthropometrics. Thirty recreationally trained college-aged men were trained using 1 of 3 methods: resistance, plyometric, or complex twice weekly for 6 weeks. The participants were tested pre, mid, and post to assess back squat strength, Romanian dead lift (RDL) strength, standing calf raise (SCR) strength, quadriceps girth, triceps surae girth, body mass, and body fat percentage. Diet was not controlled during this study. Statistical measures revealed a significant increase for squat strength (p = 0.000), RDL strength (p = 0.000), and SCR strength (p = 0.000) for all groups pre to post, with no differences between groups. There was also a main effect for time for girth measures of the quadriceps muscle group (p = 0.001), the triceps surae muscle group (p = 0.001), and body mass (p = 0.001; post hoc revealed no significant difference). There were main effects for time and group × time interactions for fat-free mass % (RT: p = 0.031; PT: p = 0.000). The results suggest that CT mirrors benefits seen with traditional RT or PT. Moreover, CT revealed no decrement in strength and anthropometric values and appears to be a viable training modality.

  6. Heuristic Relative Entropy Principles with Complex Measures: Large-Degree Asymptotics of a Family of Multi-variate Normal Random Polynomials

    Science.gov (United States)

    Kiessling, Michael Karl-Heinz

    2017-10-01

    Let z\\in C, let σ ^2>0 be a variance, and for N\\in N define the integrals E_N^{}(z;σ ) := {1/σ } \\int _R\\ (x^2+z^2) e^{-{1/2σ^2 x^2}}{√{2π }}/dx \\quad if N=1, {1/σ } \\int _{R^N} \\prod \\prod \\limits _{1≤ k1. These are expected values of the polynomials P_N^{}(z)=\\prod _{1≤ n≤ N}(X_n^2+z^2) whose 2 N zeros ± i X_k^{}_{k=1,\\ldots ,N} are generated by N identically distributed multi-variate mean-zero normal random variables {X_k}N_{k=1} with co-variance {Cov}_N^{}(X_k,X_l)=(1+σ ^2-1/N)δ _{k,l}+σ ^2-1/N(1-δ _{k,l}). The E_N^{}(z;σ ) are polynomials in z^2, explicitly computable for arbitrary N, yet a list of the first three E_N^{}(z;σ ) shows that the expressions become unwieldy already for moderate N—unless σ = 1, in which case E_N^{}(z;1) = (1+z^2)^N for all z\\in C and N\\in N. (Incidentally, commonly available computer algebra evaluates the integrals E_N^{}(z;σ ) only for N up to a dozen, due to memory constraints). Asymptotic evaluations are needed for the large- N regime. For general complex z these have traditionally been limited to analytic expansion techniques; several rigorous results are proved for complex z near 0. Yet if z\\in R one can also compute this "infinite-degree" limit with the help of the familiar relative entropy principle for probability measures; a rigorous proof of this fact is supplied. Computer algebra-generated evidence is presented in support of a conjecture that a generalization of the relative entropy principle to signed or complex measures governs the N→ ∞ asymptotics of the regime iz\\in R. Potential generalizations, in particular to point vortex ensembles and the prescribed Gauss curvature problem, and to random matrix ensembles, are emphasized.

  7. Validating an Agency-based Tool for Measuring Women’s Empowerment in a Complex Public Health Trial in Rural Nepal

    Science.gov (United States)

    Gram, Lu; Morrison, Joanna; Sharma, Neha; Shrestha, Bhim; Manandhar, Dharma; Costello, Anthony; Saville, Naomi; Skordis-Worrall, Jolene

    2017-01-01

    Abstract Despite the rising popularity of indicators of women’s empowerment in global development programmes, little work has been done on the validity of existing measures of such a complex concept. We present a mixed methods validation of the use of the Relative Autonomy Index for measuring Amartya Sen’s notion of agency freedom in rural Nepal. Analysis of think-aloud interviews (n = 7) indicated adequate respondent understanding of questionnaire items, but multiple problems of interpretation including difficulties with the four-point Likert scale, questionnaire item ambiguity and difficulties with translation. Exploratory Factor Analysis of a calibration sample (n = 511) suggested two positively correlated factors (r = 0.64) loading on internally and externally motivated behaviour. Both factors increased with decreasing education and decision-making power on large expenditures and food preparation. Confirmatory Factor Analysis on a validation sample (n = 509) revealed good fit (Root Mean Square Error of Approximation 0.05–0.08, Comparative Fit Index 0.91–0.99). In conclusion, we caution against uncritical use of agency-based quantification of women’s empowerment. While qualitative and quantitative analysis revealed overall satisfactory construct and content validity, the positive correlation between external and internal motivations suggests the existence of adaptive preferences. High scores on internally motivated behaviour may reflect internalized oppression rather than agency freedom. PMID:28303173

  8. Plasma clearance of 99mTc-N-(2,4-dimethylacetanilido)iminodiacetate complex as a measure of parenchymal liver damage

    International Nuclear Information System (INIS)

    Studniarek, M.; Durski, K.; Liniecki, J.; Akademia Medyczna, Lodz

    1983-01-01

    Fifty-two patients were studied with various diseases affecting the liver parenchyma. Disorders of bile transport were excluded on the basis of dynamic liver scintigraphy using the intravenously injected N-(2,4-dimethylacetanilido)iminodiacetate 99mTc complex (HEPIDA). The activity concentration of 99mTc-HEPIDA in plasma was measured from 5 through 60 min post injection. Clearance of the substance (Cl_B) was calculated from the blood plasma disappearance curves and compared with the results of 13 laboratory tests used conventionally for assessment of damage of the liver and its functional capacity; age and body weight were also included in the analysis. Statistical relations were studied using linear regression analysis of two variables, multiple regression analysis, and multidimensional analysis of variance. It was demonstrated that 99mTc-HEPIDA clearance is a simple, accurate and repeatable measure of liver parenchyma damage. In males, values of Cl_B above 245 ml min⁻¹/1.73 m² exclude hepatic damage with high probability; values below 195 ml min⁻¹/1.73 m² indicate evident impairment of liver parenchyma function. (orig.)
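
    A standard way to obtain a plasma clearance from timed samples, sketched below, is to fit a mono-exponential to the disappearance curve and divide the injected dose by the area under the curve extrapolated to infinity. The abstract does not specify the authors' exact computation, so this is illustrative only; all names and units are ours.

```python
import numpy as np

def plasma_clearance(times_min, conc, dose):
    """Fit a mono-exponential C(t) = C0 * exp(-k * t) to plasma samples
    by log-linear least squares and return clearance = dose / AUC,
    where AUC over [0, inf) for a mono-exponential is C0 / k.
    Units: times in min, conc in activity/ml, dose in activity
    => clearance in ml/min."""
    slope, log_c0 = np.polyfit(times_min, np.log(conc), 1)
    k = -slope                       # elimination rate constant, 1/min
    c0 = np.exp(log_c0)              # back-extrapolated concentration at t = 0
    auc = c0 / k                     # area under C(t) from 0 to infinity
    return dose / auc
```

For perfectly mono-exponential data the fit recovers the simulated clearance exactly; real curves sampled from 5 to 60 min would add noise and possibly a distribution phase, which this sketch ignores.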

  9. Environmental Assessment and Finding of No Significant Impact: Interim Measures for the Mixed Waste Management Facility Groundwater at the Burial Ground Complex at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    N/A

    1999-12-08

    The U. S. Department of Energy (DOE) prepared this environmental assessment (EA) to analyze the potential environmental impacts associated with the proposed interim measures for the Mixed Waste Management Facility (MWMF) groundwater at the Burial Ground Complex (BGC) at the Savannah River Site (SRS), located near Aiken, South Carolina. DOE proposes to install a small metal sheet pile dam to impound water around and over the BGC groundwater seepline. In addition, a drip irrigation system would be installed. Interim measures will also address the reduction of volatile organic compounds (VOCs) from 'hot-spot' regions associated with the Southwest Plume Area (SWPA). This action is taken as an interim measure for the MWMF in cooperation with the South Carolina Department of Health and Environmental Control (SCDHEC) to reduce the amount of tritium seeping from the BGC southwest groundwater plume. The proposed action of this EA is being planned and would be implemented concurrent with a groundwater corrective action program under the Resource Conservation and Recovery Act (RCRA). On September 30, 1999, SCDHEC issued a modification to the SRS RCRA Part B permit that adds corrective action requirements for four plumes that are currently emanating from the BGC. One of those plumes is the southwest plume. The RCRA permit requires SRS to submit a corrective action plan (CAP) for the southwest plume by March 2000. The permit requires that the initial phase of the CAP prescribe a remedy that achieves a 70-percent reduction in the annual amount of tritium being released from the southwest plume area to Fourmile Branch, a nearby stream. Approval and actual implementation of the corrective measure in that CAP may take several years. As an interim measure, the actions described in this EA would manage the release of tritium from the southwest plume area until the final actions under the CAP can be implemented. This proposed action is expected to reduce the amount of tritium reaching Fourmile Branch.

  10. Complexity explained

    CERN Document Server

    Erdi, Peter

    2008-01-01

    This book explains why complex systems research is important in understanding the structure, function and dynamics of complex natural and social phenomena. Readers will learn the basic concepts and methods of complex system research.

  11. Characterization of dynamics in complex lyophilized formulations: I. Comparison of relaxation times measured by isothermal calorimetry with data estimated from the width of the glass transition temperature region.

    Science.gov (United States)

    Chieng, Norman; Mizuno, Masayasu; Pikal, Michael

    2013-10-01

    The purposes of this study are to characterize the relaxation dynamics in complex freeze dried formulations and to investigate the quantitative relationship between the structural relaxation time as measured by thermal activity monitor (TAM) and that estimated from the width of the glass transition temperature (ΔT(g)). The latter method has advantages over TAM because it is simple and quick. As part of this objective, we evaluate the accuracy in estimating relaxation time data at higher temperatures (50 °C and 60 °C) from TAM data at lower temperature (40 °C) and glass transition region width (ΔT(g)) data obtained by differential scanning calorimetry. Formulations studied here were hydroxyethyl starch (HES)-disaccharide, HES-polyol, and HES-disaccharide-polyol at various ratios. We also re-examine, using TAM derived relaxation times, the correlation between protein stability (human growth hormone, hGH) and relaxation times explored in a previous report, which employed relaxation time data obtained from ΔT(g). Results show that most of the freeze dried formulations exist in a single amorphous phase, and structural relaxation times were successfully measured for these systems. We find a reasonably good correlation between TAM measured relaxation times and corresponding data obtained from estimates based on ΔT(g), but the agreement is only qualitative. The comparison plot showed that TAM data are directly proportional to the 1/3 power of ΔT(g) data, after correcting for an offset. Nevertheless, the correlation between hGH stability and relaxation time remained qualitatively the same as found with using ΔT(g) derived relaxation data, and it was found that the modest extrapolation of TAM data to higher temperatures using the ΔT(g) method and TAM data at 40 °C resulted in quantitative agreement with TAM measurements made at 50 °C and 60 °C, provided the TAM experiment temperature is well below the Tg of the sample. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Considering sampling strategy and cross-section complexity for estimating the uncertainty of discharge measurements using the velocity-area method

    Science.gov (United States)

    Despax, Aurélien; Perret, Christian; Garçon, Rémy; Hauet, Alexandre; Belleville, Arnaud; Le Coz, Jérôme; Favre, Anne-Catherine

    2016-02-01

    uncertainty component for any routine gauging, the four most similar gaugings among the reference stream-gaugings dataset are selected using an analog approach, where analogy includes both riverbed shape and flow distribution complexity. This new method was applied to 3185 stream-gaugings with various flow conditions and compared with the other methods (ISO 748, IVE, and Q+ with a simple automated parametrization). Results show that FLAURE is overall consistent with the Q+ method but not with the ISO 748 and IVE methods, which produce clearly overestimated uncertainties for discharge measurements with fewer than 15 verticals. The FLAURE approach therefore appears to be a consistent method. An advantage is the explicit link made between the estimation of cross-sectional interpolation errors and the study of high-resolution reference gaugings.
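
    The velocity-area measurement whose uncertainty these methods quantify reduces to summing sub-section discharges over the sampled verticals, Q = Σ v_i · d_i · w_i. The sketch below implements the common mid-section variant, in which each vertical's width extends halfway to its neighbours; it is a generic textbook illustration, not code from any of the cited methods.

```python
import numpy as np

def discharge_midsection(stations, depths, velocities):
    """Mid-section velocity-area method: Q = sum_i v_i * d_i * w_i,
    where the width w_i assigned to vertical i spans halfway to each
    neighbouring vertical (and to the bank for the end verticals).
    stations: transverse positions (m); depths: m; velocities: m/s."""
    b = np.asarray(stations, dtype=float)
    q = 0.0
    for i in range(len(b)):
        left = b[0] if i == 0 else 0.5 * (b[i - 1] + b[i])
        right = b[-1] if i == len(b) - 1 else 0.5 * (b[i] + b[i + 1])
        q += velocities[i] * depths[i] * (right - left)
    return q
```

The cross-sectional interpolation error studied by FLAURE is precisely the error committed by representing the continuous velocity and depth fields with this small number of verticals.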

  13. Complex chemistry

    International Nuclear Information System (INIS)

    Kim, Bong Gon; Kim, Jae Sang; Kim, Jin Eun; Lee, Boo Yeon

    2006-06-01

    This book introduces complex chemistry in ten chapters, covering the historical development of complex chemistry and Werner's coordination theory; new developments in complex chemistry; nomenclature of complexes, with concepts and definitions; chemical formulas of coordination compounds; stereochemical notation, stereostructure and isomerism; electronic structure and bonding theory of complexes; structural characterization of complexes by methods such as NMR and XAFS; equilibria and reactions in solution; organometallic chemistry; bioinorganic chemistry; materials chemistry of complexes; and complex design and computational chemistry.

  14. Complex Role of Secondary Electron Emissions in Dust Grain Charging in Space Environments: Measurements on Apollo 11 and 17 Dust Grains

    Science.gov (United States)

    Abbas, M. M.; Tankosic, D.; Spann, J. F.; LeClair, A. C.

    2010-01-01

    Dust grains in various astrophysical environments are generally charged electrostatically by photoelectric emission induced by radiation from nearby sources, or by electron/ion collisions through sticking or secondary electron emission. Knowledge of dust grain charges and equilibrium potentials is important for understanding a variety of physical and dynamical processes in the interstellar medium (ISM), and in heliospheric, interplanetary, planetary, and lunar environments. The high-vacuum environment on the lunar surface leads to some unusual physical and dynamical phenomena involving dust grains with high adhesive characteristics, and their levitation and transport over long distances. It has been well recognized that the charging properties of individual micron/submicron-size dust grains are expected to be substantially different from the corresponding values for bulk materials and theoretical models. In this paper we present experimental results on the charging of individual dust grains selected from Apollo 11 and Apollo 17 dust samples by exposing them to mono-energetic electron beams in the 10-400 eV energy range. The charging rates of positively and negatively charged particles of approximately 0.2 to 13 microns in diameter are discussed in terms of the secondary electron emission (SEE) process, which is found to be a complex charging process at electron energies as low as 10-25 eV, with strong particle-size dependence. The measurements indicate substantial differences between the dust charging properties of individual small dust grains and those of bulk materials.

  15. Integration of ambient seismic noise monitoring, displacement and meteorological measurements to infer the temperature-controlled long-term evolution of a complex prone-to-fall cliff

    Science.gov (United States)

    Colombero, C.; Baillet, L.; Comina, C.; Jongmans, D.; Larose, E.; Valentin, J.; Vinciguerra, S.

    2018-06-01

    Monitoring the temporal evolution of resonance frequencies and velocity changes detected from ambient seismic noise recordings can help in recognizing reversible and irreversible modifications within unstable rock volumes. With this aim, the long-term ambient seismic noise data set acquired at the potentially unstable cliff of Madonna del Sasso (NW Italian Alps) was analysed in this study, using both spectral analysis and cross-correlation techniques. Noise results were integrated and compared with direct displacement measurements and meteorological data, to understand the long-term evolution of the cliff. No irreversible modifications in the stability of the site were detected over the monitored period. Conversely, daily and seasonal air temperature fluctuations were found to control resonance frequency values, amplitudes and directivities and to induce reversible velocity changes within the fractured rock mass. The immediate modification in the noise parameters due to temperature fluctuations was interpreted as the result of rock mass thermal expansion and contraction, inducing variations in the contact stiffness along the fractures isolating two unstable compartments. Differences with previous case studies were highlighted in the long-term evolution of noise spectral amplitudes and directivities, due to the complex 3-D fracture setting of the site and to the combined effects of the two unstable compartments.

  16. Complex segregation analysis of blood pressure and heart rate measured before and after a 20-week endurance exercise training program: the HERITAGE Family Study.

    Science.gov (United States)

    An, P; Rice, T; Pérusse, L; Borecki, I B; Gagnon, J; Leon, A S; Skinner, J S; Wilmore, J H; Bouchard, C; Rao, D C

    2000-05-01

    Complex segregation analysis of baseline resting blood pressure (BP) and heart rate (HR) and their responses to training (post-training minus baseline) were performed in a sample of 482 individuals from 99 white families who participated in the HERITAGE Family Study. Resting BP and HR were measured at baseline and after a 20-week training program. Baseline resting BP and HR were age-adjusted and age-BMI-adjusted, and the responses to training were age-adjusted and age-baseline-adjusted, within four gender-by-generation groups. This study also analyzed the responses to training in two subsets of families: (1) the so-called "high" subsample, 45 families (216 individuals) with at least one member whose baseline resting BP is in the high end of the normal BP range (the upper 95th percentile: systolic BP [SBP] ≥ 135 or diastolic BP [DBP] ≥ 80 mm Hg); and (2) the so-called "nonhigh" subsample, the 54 remaining families (266 individuals). Baseline resting SBP was influenced by a multifactorial component (23%), which was independent of body mass index (BMI). Baseline resting DBP was influenced by a putative recessive locus, which accounted for 31% of the variance. In addition to the major gene effect, which may impact BMI as well, baseline resting DBP was also influenced by a multifactorial component (29%). Baseline resting HR was influenced by a putative dominant locus independent of BMI, which accounted for 31% of the variance. For the responses to training, no familiality was found in the whole sample or in the nonhigh subsample. However, in the high subsample, resting SBP response to training was influenced by a putative recessive locus, which accounted for 44% of the variance. No familiality was found for resting DBP response to training. Resting HR response to training was influenced by a major effect (accounting for 35% of the variance), with an ambiguous transmission from parents to offspring.

  17. (II) complexes

    African Journals Online (AJOL)

    activities of Schiff base tin (II) complexes. Neelofar1 ... Conclusion: All synthesized Schiff bases and their Tin (II) complexes showed high antimicrobial and ...... Singh HL. Synthesis and characterization of tin (II) complexes of fluorinated Schiff bases derived from amino acids. Spectrochim Acta Part A: Molec Biomolec.

  18. Nature of unresolved complex mixture in size-distributed emissions from residential wood combustion as measured by thermal desorption-gas chromatography-mass spectrometry

    Science.gov (United States)

    Hays, Michael D.; Smith, N. Dean; Dong, Yuanji

    2004-08-01

    Unresolved complex mixture (UCM) is an analytical artifact in gas chromatograms of combustion-source-related fine aerosol extracts. In this study the UCM is examined in size-resolved fine aerosol emissions from residential wood combustion (RWC). The aerosols are sorted by size in an electrical low-pressure impactor (ELPI) and subsequently analyzed by thermal desorption/gas chromatography/mass spectrometry (TD/GC/MS). A semiquantitative system for predicting the branched alkane, cycloalkane, alkylbenzene, C3-, C4-, C5-alkylbenzene, methylnaphthalene, C3-, C4-, C5-alkylnaphthalene, methylphenanthrene, C2-, C3-alkylphenanthrene, and dibenzothiophene concentrations in the UCM is introduced. Analysis by TD/GC/MS detects UCM on each ELPI stage for all six combustion tests. The UCM baseline among the different fuel types is variable. In particular, the UCM of Pseudotsuga sp. is enriched in later-eluting compounds of lower volatility. A high level of reproducibility is achieved in determining UCM areas. UCM fractions (UCM ion area/total extracted ion chromatogram area) by individual ELPI stage return a mean relative standard deviation of 19.1% over the entire combustion test set, indicating a highly consistent UCM fraction across the ELPI size boundaries. Among the molecular ions investigated, branched alkane (m/z 57) and dibenzothiophene (m/z 212 and 226) constituents are most abundant in the UCM emissions from RWC, collectively accounting for 64-95% of the targeted chemical species. The total UCM emissions span 446-756 mg/kg of dry biomass burned and correspond to an upper limit of 7.1% of the PM2.5 mass. The UCM emissions are primarily accumulation mode (0.1 μm ≤ aerodynamic diameter (da) ≤ 1 μm), with a geometric mean diameter (dg) range of 120.3-518.4 nm. UCM in PM2.5 is chemically asymmetric (shifted to finer da), typically clustering at da ≤ 1 μm. Measurable shifts in dg and changes in distribution widths (σg) on an intratest basis suggest that the particle density
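
    The two summary statistics used above, the UCM fraction (UCM ion area divided by total extracted-ion-chromatogram area) and its relative standard deviation across ELPI stages, can be sketched as follows. The stage areas here are hypothetical, not measured values.

```python
# Minimal sketch (hypothetical areas) of the UCM fraction per ELPI stage and
# its relative standard deviation (RSD) across stages.
import statistics

# Hypothetical (UCM area, total extracted-ion-chromatogram area) pairs, one per stage
stages = [(1.2e6, 5.0e6), (2.3e6, 9.1e6), (0.8e6, 3.6e6), (1.9e6, 8.2e6)]

fractions = [ucm / total for ucm, total in stages]
mean_f = statistics.mean(fractions)
rsd_percent = 100 * statistics.stdev(fractions) / mean_f  # sample RSD, in %

print([round(f, 3) for f in fractions], round(rsd_percent, 1))
```

    A small RSD across stages is what the abstract means by a "highly consistent UCM fraction across the ELPI size boundaries".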

  19. Communication complexity and information complexity

    Science.gov (United States)

    Pankratov, Denis

    Information complexity enables the use of information-theoretic tools in communication complexity theory. Prior to the results presented in this thesis, information complexity was mainly used for proving lower bounds and direct-sum theorems in the setting of communication complexity. We present three results that demonstrate new connections between information complexity and communication complexity. In the first contribution we thoroughly study the information complexity of the smallest nontrivial two-party function: the AND function. While computing the communication complexity of AND is trivial, computing its exact information complexity presents a major technical challenge. In overcoming this challenge, we reveal that information complexity gives rise to rich geometrical structures. Our analysis of information complexity relies on new analytic techniques and new characterizations of communication protocols. We also uncover a connection of information complexity to the theory of elliptic partial differential equations. Once we compute the exact information complexity of AND, we can compute the exact communication complexity of several related functions on n-bit inputs with some additional technical work. Previous combinatorial and algebraic techniques could only prove bounds of the form Θ(n). Interestingly, this level of precision is typical in the area of information theory, so our result demonstrates that this meta-property of precise bounds carries over to information complexity and in certain cases even to communication complexity. Our result not only strengthens the lower bound on the communication complexity of disjointness by making it more exact, but also shows that information complexity provides the exact upper bound on communication complexity. In fact, this result is more general and applies to a whole class of communication problems. In the second contribution, we use self-reduction methods to prove strong lower bounds on the information

  20. Joint stability characteristics of the ankle complex in female athletes with histories of lateral ankle sprain, part II: clinical experience using arthrometric measurement.

    Science.gov (United States)

    Kovaleski, John E; Heitman, Robert J; Gurchiek, Larry R; Hollis, J M; Liu, Wei; Pearsall, Albert W

    2014-01-01

    This is part II of a 2-part series discussing stability characteristics of the ankle complex. In part I, we used a cadaver model to examine the effects of sectioning the lateral ankle ligaments on anterior and inversion motion and stiffness of the ankle complex. In part II, we wanted to build on and apply these findings to the clinical assessment of ankle-complex motion and stiffness in a group of athletes with a history of unilateral ankle sprain. To examine ankle-complex motion and stiffness in a group of athletes with reported history of lateral ankle sprain. Cross-sectional study. University research laboratory. Twenty-five female college athletes (age = 19.4 ± 1.4 years, height = 170.2 ± 7.4 cm, mass = 67.3 ± 10.0 kg) with histories of unilateral ankle sprain. All ankles underwent loading with an ankle arthrometer. Ankles were tested bilaterally. The dependent variables were anterior displacement, anterior end-range stiffness, inversion rotation, and inversion end-range stiffness. Anterior displacement of the ankle complex did not differ between the uninjured and sprained ankles (P = .37), whereas ankle-complex rotation was greater for the sprained ankles (P = .03). The sprained ankles had less anterior and inversion end-range stiffness than the uninjured ankles (P < .05). Differences in ankle-complex laxity and end-range stiffness were detected in ankles with histories of sprain. These results indicate the presence of altered mechanical characteristics in the soft tissues of the sprained ankles.

  1. Visual Complexity: A Review

    Science.gov (United States)

    Donderi, Don C.

    2006-01-01

    The idea of visual complexity, the history of its measurement, and its implications for behavior are reviewed, starting with structuralism and Gestalt psychology at the beginning of the 20th century and ending with visual complexity theory, perceptual learning theory, and neural circuit theory at the beginning of the 21st. Evidence is drawn from…

  2. Indicators: Physical Habitat Complexity

    Science.gov (United States)

    Physical habitat complexity measures the amount and variety of all types of cover at the water's edge in lakes. In general, dense and varied shoreline habitat is able to support more diverse communities of aquatic life.

  3. Conversation, coupling and complexity

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Abney, Drew; Bahrami, Bahador

    We investigate the linguistic co-construction of interpersonal synergies. By applying a measure of coupling between complex systems to an experimentally elicited corpus of joint decision dialogues, we show that interlocutors' linguistic behavior displays an increasing signature of multi-scale coupling, known as complexity matching, over the course of interaction. Furthermore, we show that stronger coupling corresponds with more effective interaction, as measured by collective task performance.

  4. In Defense of Simulating Complex and Tragic Historical Episodes: A Measured Response to the Outcry over a New England Slavery Simulation

    Science.gov (United States)

    Wright-Maley, Cory

    2014-01-01

    A slavery simulation that took place as part of a field trip for students of a Hartford junior high academy led a father to file a human rights suit against the school district, and one official to comment that simulations of complex and tragic human phenomena have "no place in an educational system." In light of these conclusions,…

  5. When physics is not "just physics": complexity science invites new measurement frames for exploring the physics of cognitive and biological development.

    Science.gov (United States)

    Kelty-Stephen, Damian; Dixon, James A

    2012-01-01

    The neurobiological sciences have struggled to resolve the physical foundations for biological and cognitive phenomena with a suspicion that biological and cognitive systems, capable of exhibiting and contributing to structure within themselves and through their contexts, are fundamentally distinct or autonomous from purely physical systems. Complexity science offers new physics-based approaches to explaining biological and cognitive phenomena. In response to controversy over whether complexity science might seek to "explain away" biology and cognition as "just physics," we propose that complexity science serves as an application of recent advances in physics to phenomena in biology and cognition without reducing or undermining the integrity of the phenomena to be explained. We highlight that physics is, like the neurobiological sciences, an evolving field and that the threat of reduction is overstated. We propose that distinctions between biological and cognitive systems from physical systems are pretheoretical and thus optional. We review our own work applying insights from post-classical physics regarding turbulence and fractal fluctuations to the problems of developing cognitive structure. Far from hoping to reduce biology and cognition to "nothing but" physics, we present our view that complexity science offers new explanatory frameworks for considering physical foundations of biological and cognitive phenomena.

  6. Complexity Theory

    Science.gov (United States)

    Lee, William H K.

    2016-01-01

    A complex system consists of many interacting parts, generates new collective behavior through self-organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self-organization, stochastic processes, turbulence, and genetic algorithms.

  7. Statistics of the derivatives of complex signal derived from Riesz transform and its application to pseudo-Stokes vector correlation for speckle displacement measurement

    DEFF Research Database (Denmark)

    Zhang, Shun; Yang, Yi; Hanson, Steen Grüner

    2015-01-01

    for the superiority of the proposed PSVC technique, we study the statistical properties of the spatial derivatives of the complex signal representation generated from the Riesz transform. Under the assumption of a Gaussian random process, a theoretical analysis of the pseudo-Stokes vector correlation is provided. Based on these results, we show mathematically that PSVC has a performance advantage over the conventional intensity-based correlation technique.
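
    A minimal sketch of the underlying idea, assuming a standard FFT-domain Riesz transform (frequency filters -i·u/|q| and -i·v/|q|) and a simple correlation-peak search: this illustrates Riesz-transform-based speckle correlation in general, not the authors' exact PSVC formulation, and all names and parameters below are illustrative.

```python
# Sketch: build the complex (Riesz) signal of a speckle image and recover an
# integer-pixel displacement from the cross-correlation peak.
import numpy as np

def riesz_complex(img):
    """Pack the two Riesz components of `img` as one complex field r1 + i*r2."""
    F = np.fft.fft2(img - img.mean())
    u = np.fft.fftfreq(img.shape[0])[:, None]
    v = np.fft.fftfreq(img.shape[1])[None, :]
    q = np.hypot(u, v)
    q[0, 0] = 1.0  # avoid 0/0 at DC (the numerator is already zero there)
    r1 = np.real(np.fft.ifft2(F * (-1j) * u / q))
    r2 = np.real(np.fft.ifft2(F * (-1j) * v / q))
    return r1 + 1j * r2

def displacement(img_a, img_b):
    """Integer-pixel shift of img_b relative to img_a from the correlation peak."""
    ca, cb = riesz_complex(img_a), riesz_complex(img_b)
    corr = np.fft.ifft2(np.fft.fft2(cb) * np.conj(np.fft.fft2(ca)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # map array indices to signed shifts
    return [int(p) if p <= s // 2 else int(p) - s for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(1)
speckle = rng.random((64, 64))
shifted = np.roll(speckle, (3, 5), axis=(0, 1))
print(displacement(speckle, shifted))  # → [3, 5]
```

    Because the Riesz transform is a (circular) linear shift-invariant filter, a shift of the input shifts the complex field identically, so the correlation peak lands at the displacement.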

  8. The measuring of real state of the residential complex Vlčince II in Žilina by using of TLS technology

    Directory of Open Access Journals (Sweden)

    Katarína Pukanská

    2011-12-01

    Full Text Available The construction of the Vlčince II blocks of flats in Žilina, realized by the building company Doprastav a.s., consists of two blocks, A and B. The as-built state of the construction was measured with a Leica ScanStation terrestrial laser scanner. The measured data were processed in the software Cyclone Scan, Register, and CloudWorx for MicroStation. Horizontal sections were created through the measured objects at several height levels. The deviations found are presented in the attached tables.

  9. Quantum chemical calculations and spectroscopic measurements of spectroscopic and thermodynamic properties of given uranyl complexes in aqueous solutions with possible environmental and industrial applications

    Directory of Open Access Journals (Sweden)

    Višňak Jakub

    2016-01-01

    Full Text Available A brief introduction to the computational methodology and preliminary results for spectroscopic (excitation energies, vibrational frequencies in ground and excited electronic states) and thermodynamic (stability constants, standard enthalpies and entropies of complexation reactions) properties of some 1:1, 1:2 and 1:3 uranyl sulphato- and selenato- complexes in aqueous solutions is given. The relativistic effects are included via an Effective Core Potential (ECP), electron correlation via (TD)DFT/B3LYP (dispersion-interaction corrected), and solvation is described via explicit inclusion of one hydration sphere beyond the coordinated water molecules. We acknowledge the limits of this approximate description: more accurate calculations (ranging from semi-phenomenological two-component spin-orbit coupling up to the four-component Dirac-Coulomb-Breit Hamiltonian) and Molecular Dynamics simulations are in preparation. The computational results are compared with experimental results from Time-Resolved Laser-Induced Fluorescence Spectroscopy (TRLFS) and UV-VIS spectroscopic studies (including our original experimental research on this topic). In the case of the TRLFS and UV-VIS speciation studies, the problem of decomposing complex solution spectra into individual components is ill-conditioned, and hints from theoretical chemistry can be very important. Qualitative agreement between our quantum chemical calculations of the spectroscopic properties and experimental data was achieved. Possible applications for geochemical modelling (e.g. safety studies of nuclear waste repositories, modelling of a future mining site) and analytical chemical studies (including natural samples) are discussed.

  10. Measurement of parameters of extracted beams of charged particles at the LVE accelerating complex

    Energy Technology Data Exchange (ETDEWEB)

    Balandikov, A N; Volkov, V I; Gorchenko, V M [and others]

    1996-12-31

    The paper describes equipment for measuring the intensity and spatial parameters of charged-particle beams extracted from the JINR synchrophasotron. Equipment for preliminary recording of signals from multiwire ionization chambers was developed to measure the spatial parameters of the beams. 6 refs.; 5 figs.

  11. Solution of complex measuring problems for automation of a scientific experiment and technological processes

    Energy Technology Data Exchange (ETDEWEB)

    Gribov, A A; Zhukov, V A; Sdobnov, S I; Yakovlev, G V [Rossijskij Nauchnyj Tsentr Kurchatovskij Inst., Moskva (Russian Federation)]

    1996-12-31

    The paper discusses problems of automating reactor measurements. It describes an automated system for neutron-physics experiments involving the measurement of slowly varying ionization-chamber currents. The system is based on a bus-module architecture with a specialized 16-bit bus. The total information capacity for one current channel is 5 bytes. 4 refs.; 1 fig.

  12. Managing Complexity

    DEFF Research Database (Denmark)

    Maylath, Bruce; Vandepitte, Sonia; Minacori, Patricia

    2013-01-01

    This article discusses the largest and most complex international learning-by-doing project to date: a project involving translation from Danish and Dutch into English and editing into American English, alongside a project involving writing, usability testing, and translation from English into Dutch and into French. The complexity of the undertaking proved to be a central element in the students' learning, as the collaboration closely resembles the complexity of international documentation workplaces of language service providers. © Association of Teachers of Technical Writing.

  13. Complex variables

    CERN Document Server

    Fisher, Stephen D

    1999-01-01

    The most important topics in the theory and application of complex variables receive a thorough, coherent treatment in this introductory text. Intended for undergraduates or graduate students in science, mathematics, and engineering, this volume features hundreds of solved examples, exercises, and applications designed to foster a complete understanding of complex variables as well as an appreciation of their mathematical beauty and elegance. Prerequisites are minimal; a three-semester course in calculus will suffice to prepare students for discussions of these topics: the complex plane, basic

  14. Some features of formation and dissolution of a series of Pu(IV) and Zr alkyl and butyl alkyl phosphates in the system TBP - n-dodecane - nitric acid - water

    International Nuclear Information System (INIS)

    Markov, G.S.; Moshkov, M.M.; Kokina, S.A.

    1990-01-01

    The formation and composition of salts produced on interaction of a series of alkyl- and butylalkylphosphoric acids having alkyl radical chain lengths from C4 to C10 with Pu(IV) and Zr in the organic and aqueous phases of the system TBP - n-dodecane - nitric acid - water were studied. The composition of the compounds was found to depend on the conditions of their formation, defined first of all by the HNO3 concentration in the aqueous and organic phases. (author) 12 refs.; 4 figs.; 1 tab

  15. The Acute Effect of Upper-Body Complex Training on Power Output of Martial Art Athletes as Measured by the Bench Press Throw Exercise

    Science.gov (United States)

    Liossis, Loudovikos Dimitrios; Forsyth, Jacky; Liossis, George; Tsolakis, Charilaos

    2013-01-01

    The purpose of this study was to examine the acute effect of upper body complex training on power output, as well as to determine the requisite preload intensity and intra-complex recovery interval needed to induce power output increases. Nine amateur-level combat/martial art athletes completed four distinct experimental protocols, which consisted of 5 bench press repetitions at either: 65% of one-repetition maximum (1RM) with a 4 min rest interval; 65% of 1RM with an 8 min rest; 85% of 1RM with a 4 min rest; or 85% of 1RM with an 8 min rest interval, performed on different days. Before (pre-conditioning) and after (post-conditioning) each experimental protocol, three bench press throws at 30% of 1RM were performed. Significant differences in power output pre-post conditioning were observed across all experimental protocols (F=26.489, partial eta2=0.768, p=0.001). Mean power output significantly increased when the preload stimulus of 65% 1RM was matched with 4 min of rest (p=0.001), and when the 85% 1RM preload stimulus was matched with 8 min of rest (p=0.001). Moreover, a statistically significant difference in power output was observed between the four conditioning protocols (F= 21.101, partial eta2=0.913, p=0.001). It was concluded that, in complex training, matching a heavy preload stimulus with a longer rest interval, and a lighter preload stimulus with a shorter rest interval is important for athletes wishing to increase their power production before training or competition. PMID:24511352

  16. Softball Complex

    Science.gov (United States)

    Ellis, Jim

    1977-01-01

    The Parks and Recreation Department of Montgomery, Alabama, has developed a five-field softball complex as part of a growing community park with facilities for camping, golf, aquatics, tennis, and picnicking. (MJB)

  17. Lecithin Complex

    African Journals Online (AJOL)

    1Department of Food Science and Engineering, Xinyang College of Agriculture and ... Results: The UV and IR spectra of the complex showed an additive effect of polydatin-lecithin, in which .... Monochromatic Cu Ka radiation (wavelength =.

  18. An elongated model of the Xenopus laevis transcription factor IIIA-5S ribosomal RNA complex derived from neutron scattering and hydrodynamic measurements

    International Nuclear Information System (INIS)

    Timmins, P.A.; Langowski, J.; Brown, R.S.

    1988-01-01

    The precise molecular composition of the Xenopus laevis TFIIIA-5S ribosomal RNA complex (7S particle) has been established from small-angle neutron and dynamic light scattering. The molecular weight of the particle was found to be 95,700±10,000 and 86,700±9,000 daltons from these two methods, respectively. The observed match point of 54.4% D2O obtained from contrast variation experiments indicates a 1:1 molar ratio. It is concluded that only a single molecule of TFIIIA, a zinc-finger protein, and of 5S RNA are present in this complex. A simple elongated cylindrical model with dimensions of 140 Å length and 59 Å diameter is compatible with the neutron results. A globular model can be excluded by the shallow nature of the neutron scattering curves. It is proposed that the observed difference of 15 Å in length between the 7S particle and isolated 5S RNA most likely indicates that part(s) of the protein protrudes from the end(s) of the RNA molecule. There is no biochemical evidence for any gross alteration in 5S RNA conformation upon binding to TFIIIA.

  19. Comparison of net CO2 fluxes measured with open- and closed-path infrared gas analyzers in an urban complex environment

    DEFF Research Database (Denmark)

    Järvi, L.; Mammarella, I.; Eugster, W.

    2009-01-01

    and their suitability to accurately measure CO2 exchange in such a non-ideal landscape. In addition, this study examined the effect of open-path sensor heating on measured fluxes in urban terrain, and these results were compared with similar measurements made above a temperate beech forest in Denmark. The correlation between the two fluxes was good (R² = 0.93) at the urban site, but during the measurement period the open-path net surface exchange (NSE) was 17% smaller than the closed-path NSE, indicating apparent additional uptake of CO2 by the open-path measurements. At both sites, sensor heating corrections evidently improved the performance of the open-path analyzer by reducing discrepancies in NSE at the urban site to 2% and decreasing the difference in NSE from 67% to 7% at the forest site. Overall, the site-specific approach gave the best results at both sites and, if possible, it should be preferred in the sensor…
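
    A toy version of the kind of comparison reported above: regressing open-path against closed-path CO2 fluxes to obtain R² and the mean relative bias before and after a heating correction. All series, and the 17%-scale bias built into them, are synthetic numbers chosen only to mirror the quoted figures, not measured fluxes.

```python
# Synthetic sketch: compare open- and closed-path CO2 flux series via R^2 and
# mean relative bias, before and after a (hypothetical) heating correction.
import numpy as np

rng = np.random.default_rng(2)
closed = rng.normal(10.0, 4.0, 200)                 # closed-path NSE (arbitrary units)
open_raw = 0.83 * closed + rng.normal(0, 1.0, 200)  # open-path reads ~17% low
heating = 0.17 * closed                             # hypothetical correction term
open_corr = open_raw + heating

def r_squared(x, y):
    return np.corrcoef(x, y)[0, 1] ** 2

for label, series in [("raw", open_raw), ("corrected", open_corr)]:
    bias = 100 * (series.mean() - closed.mean()) / closed.mean()
    print(f"{label}: R^2 = {r_squared(closed, series):.2f}, bias = {bias:+.1f}%")
```

    The correction shifts the mean bias toward zero while leaving the correlation essentially unchanged, which is the pattern the abstract describes.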

  20. In the search for the low-complexity sequences in prokaryotic and eukaryotic genomes: how to derive a coherent picture from global and local entropy measures

    Energy Technology Data Exchange (ETDEWEB)

    Acquisti, Claudia; Allegrini, Paolo E-mail: allegrip@ilc.cnr.it; Bogani, Patrizia; Buiatti, Marcello; Catanese, Elena; Fronzoni, Leone; Grigolini, Paolo; Mersi, Giuseppe; Palatella, Luigi

    2004-04-01

    We investigate a possible way to connect the presence of low-complexity sequences (LCS) in DNA genomes and the non-stationary properties of base correlations. Under the hypothesis that these variations signal a change in DNA function, we use a new technique, called the non-stationarity entropic index (NSEI) method, and we prove that this technique is an efficient way to detect functional changes with respect to a random baseline. The remarkable aspect is that NSEI does not imply any training data or fitting parameter, the only arbitrariness being the choice of a marker in the sequence. We make this choice on the basis of biological information about LCS distributions in genomes. We show that there exists a correlation between the change in the amount of LCS and the ratio of long- to short-range correlation.

  3. Sum frequency generation vibrational spectroscopy (SFG-VS) for complex molecular surfaces and interfaces: Spectral lineshape measurement and analysis plus some controversial issues

    Science.gov (United States)

    Wang, Hong-Fei

    2016-12-01

    Sum-frequency generation vibrational spectroscopy (SFG-VS), first developed in the 1980s, has proven to be a uniquely sensitive and surface/interface-selective spectroscopic probe for characterizing the structure, conformation, and dynamics of molecular surfaces and interfaces. In recent years, much progress in the methodology and instrumentation of the SFG-VS toolbox has significantly broadened its application to complex molecular surfaces and interfaces. In this review, after presenting a unified view of the theory and methodology, focusing on the SFG-VS spectral lineshape, as well as the new opportunities in SFG-VS applications opened by these developments, some of the controversial issues that have been puzzling the community are discussed. The aim of this review is to offer researchers and students interested in molecular surface and interfacial sciences up-to-date perspectives complementary to the existing textbooks and reviews on SFG-VS.

  4. Rate Measurements of the Hydrolysis of Complex Organic Macromolecules in Cold Aqueous Solutions: Implications for Prebiotic Chemistry on the Early Earth and Titan

    Science.gov (United States)

    Neish, C. D.; Somogyi, Á.; Imanaka, H.; Lunine, J. I.; Smith, M. A.

    2008-04-01

    Organic macromolecules (``complex tholins'') were synthesized from a 0.95 N2 / 0.05 CH4 atmosphere in a high-voltage AC flow discharge reactor. When placed in liquid water, specific water-soluble compounds in the macromolecules demonstrated Arrhenius-type first-order kinetics between 273 and 313 K, producing oxygenated organic species with activation energies in the range of ~60 +/- 10 kJ mol-1. These reactions displayed half-lives between 0.3 and 17 days at 273 K. Oxygen incorporation into such materials, a necessary step toward the formation of biological molecules, is therefore fast compared to processes that occur on geologic timescales, such as the freezing of impact melt pools and possible cryovolcanic sites on Saturn's organic-rich moon Titan.
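    The reported kinetics can be made concrete with a back-of-the-envelope Arrhenius calculation. The sketch below (Python, not from the paper; it assumes simple Arrhenius behavior k = A exp(-Ea/RT) with the quoted Ea of ~60 kJ/mol) estimates how much faster hydrolysis runs at 313 K than at 273 K:

```python
import math

R_GAS = 8.314      # gas constant, J mol^-1 K^-1
EA = 60e3          # activation energy quoted in the abstract, J mol^-1

def rate_ratio(t1_k, t2_k, ea=EA):
    """k(T2)/k(T1) for Arrhenius kinetics; the prefactor A cancels."""
    return math.exp(-ea / R_GAS * (1.0 / t2_k - 1.0 / t1_k))

# Half-life scales as t_half = ln(2) / k, so warming from 273 K to 313 K
# shortens the half-lives by the same factor the rate grows.
print(f"k(313 K)/k(273 K) ~ {rate_ratio(273.0, 313.0):.0f}")  # ~ 29
```

    Since the prefactor cancels in the ratio, only the activation energy enters, which is why hydrolysis over this 40 K span is roughly thirty times faster at the warm end.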

  5. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and, as a particular highlight, the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  6. DTI measures identify mild and moderate TBI cases among patients with complex health problems: A receiver operating characteristic analysis of U.S. veterans.

    Science.gov (United States)

    Main, Keith L; Soman, Salil; Pestilli, Franco; Furst, Ansgar; Noda, Art; Hernandez, Beatriz; Kong, Jennifer; Cheng, Jauhtai; Fairchild, Jennifer K; Taylor, Joy; Yesavage, Jerome; Wesson Ashford, J; Kraemer, Helena; Adamson, Maheen M

    2017-01-01

    Standard MRI methods are often inadequate for identifying mild traumatic brain injury (TBI). Advances in diffusion tensor imaging now provide potential biomarkers of TBI among white matter fascicles (tracts). However, it is still unclear which tracts are most pertinent to TBI diagnosis. This study ranked fiber tracts on their ability to discriminate patients with and without TBI. We acquired diffusion tensor imaging data from military veterans admitted to a polytrauma clinic (Overall n = 109; Age: M = 47.2, SD = 11.3; Male: 88%; TBI: 67%). TBI diagnosis was based on self-report and neurological examination. Fiber tractography analysis produced 20 fiber tracts per patient. Each tract yielded four clinically relevant measures (fractional anisotropy, mean diffusivity, radial diffusivity, and axial diffusivity). We applied receiver operating characteristic (ROC) analyses to identify the most diagnostic tract for each measure. The analyses produced an optimal cutpoint for each tract. We then used kappa coefficients to rate the agreement of each cutpoint with the neurologist's diagnosis. The tract with the highest kappa was most diagnostic. As a check on the ROC results, we performed a stepwise logistic regression on each measure using all 20 tracts as predictors. We also bootstrapped the ROC analyses to compute the 95% confidence intervals for sensitivity, specificity, and the highest kappa coefficients. The ROC analyses identified two fiber tracts as most diagnostic of TBI: the left cingulum (LCG) and the left inferior fronto-occipital fasciculus (LIF). Like ROC, logistic regression identified LCG as most predictive for the FA measure but identified the right anterior thalamic tract (RAT) for the MD, RD, and AD measures. These findings are potentially relevant to the development of TBI biomarkers. Our methods also demonstrate how ROC analysis may be used to identify clinically relevant variables in the TBI population.
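    The cutpoint-plus-kappa procedure described above can be illustrated with a self-contained sketch (Python; the scores and labels are hypothetical, not the study's data): sweep candidate cutpoints on a single measure and keep the one whose binary rule agrees best, by Cohen's kappa, with a reference diagnosis.

```python
def cohen_kappa(y_true, y_pred):
    """Cohen's kappa: agreement beyond chance between two binary ratings."""
    n = len(y_true)
    po = sum(t == p for t, p in zip(y_true, y_pred)) / n    # observed agreement
    p1_true = sum(y_true) / n                               # rate of positive labels
    p1_pred = sum(y_pred) / n                               # rate of positive calls
    pe = p1_true * p1_pred + (1 - p1_true) * (1 - p1_pred)  # chance agreement
    return (po - pe) / (1 - pe)

def best_cutpoint(scores, labels):
    """Sweep candidate cutpoints; return the cutpoint (and its kappa) whose
    binary rule 'score >= cut' agrees best with the reference diagnosis."""
    kappa_at = lambda c: cohen_kappa(labels, [int(s >= c) for s in scores])
    cut = max(sorted(set(scores)), key=kappa_at)
    return cut, kappa_at(cut)

# Hypothetical data (NOT the study's): one tract measure per patient,
# with 1 = TBI according to the neurologist.
scores = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
labels = [0, 0, 0, 1, 0, 1, 1, 1]
cut, kappa = best_cutpoint(scores, labels)
print(f"optimal cutpoint = {cut}, kappa = {kappa:.2f}")
```

    Ranking tracts then amounts to running this per tract and per measure and comparing the resulting kappas.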

  7. Unifying Complexity and Information

    Science.gov (United States)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a fundamental role like that of the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of the adaptability are not well understood. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary between the different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics, and real-world living organisms.
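    The Shannon entropy H that the abstract takes as its statistical-mechanics reference point is straightforward to compute; a minimal sketch (for a discrete probability distribution, in bits):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair four-way choice carries 2 bits; a certain outcome carries none.
print(shannon_entropy([0.25] * 4))   # 2.0
print(shannon_entropy([1.0]))        # 0.0
```

    The complexity measures the abstract surveys differ precisely in how they depart from this purely statistical quantity for deterministic or structured data.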

  8. Equivalent complex conductivities representing the effects of T-tubules and folded surface membranes on the electrical admittance and impedance of skeletal muscles measured by external-electrode method

    Science.gov (United States)

    Sekine, Katsuhisa

    2017-12-01

    In order to represent the effects of T-tubules and folded surface membranes on the electrical admittance and impedance of skeletal muscles measured by the external-electrode method, analytical relations for the equivalent complex conductivities of hypothetical smooth surface membranes were derived. In the relations, the effects of each tubule were represented by the admittance of a straight cable. The effects of the folding of a surface membrane were represented by the increased area of surface membranes. The equivalent complex conductivities were represented as the summation of these effects, and the effects of the T-tubules differed between the transversal and longitudinal directions. The validity of the equivalent complex conductivities was supported by the results of finite-difference method (FDM) calculations made using three-dimensional models in which T-tubules and folded surface membranes were represented explicitly. FDM calculations using the equivalent complex conductivities suggested that the electrically inhomogeneous structure due to the existence of muscle cells with T-tubules was sufficient to explain the experimental results previously obtained using the external-electrode method. Results of FDM calculations in which the structural changes caused by muscle contractions were taken into account were consistent with the reported experimental results.

  9. Subgroup complexes

    CERN Document Server

    Smith, Stephen D

    2011-01-01

    This book is intended as an overview of a research area that combines geometries for groups (such as Tits buildings and generalizations), topological aspects of simplicial complexes from p-subgroups of a group (in the spirit of Brown, Quillen, and Webb), and combinatorics of partially ordered sets. The material is intended to serve as an advanced graduate-level text and partly as a general reference on the research area. The treatment offers optional tracks for the reader interested in buildings, geometries for sporadic simple groups, and G-equivariant equivalences and homology for subgroup complexes.

  10. Complex manifolds

    CERN Document Server

    Morrow, James

    2006-01-01

    This book, a revision and organization of lectures given by Kodaira at Stanford University in 1965-66, is an excellent, well-written introduction to the study of abstract complex (analytic) manifolds, a subject that began in the late 1940s and early 1950s. It is largely self-contained, except for some standard results about elliptic partial differential equations, for which complete references are given. (D. C. Spencer, MathSciNet) The book under review is the faithful reprint of the original edition of one of the most influential textbooks in modern complex analysis and geometry. The classic

  11. Adrenal-kidney-gonad complex measurements may not predict gonad-specific changes in gene expression patterns during temperature-dependent sex determination in the red-eared slider turtle (Trachemys scripta elegans).

    Science.gov (United States)

    Ramsey, Mary; Crews, David

    2007-08-01

    Many turtles, including the red-eared slider turtle (Trachemys scripta elegans), have temperature-dependent sex determination, in which gonadal sex is determined by temperature during the middle third of incubation. The gonad develops as part of a heterogeneous tissue complex that comprises the developing adrenal, kidney, and gonad (the AKG complex). Owing to the difficulty of excising the gonad from the adjacent tissues, the AKG complex is often used as a tissue source in assays examining gene expression in the developing gonad. However, the gonad is a relatively small component of the AKG, and gene expression in the adrenal-kidney (AK) compartment may interfere with the detection of gonad-specific changes in gene expression, particularly during early key phases of gonadal development and sex determination. In this study, we examine transcript levels, as measured by quantitative real-time polymerase chain reaction, for five genes important in slider turtle sex determination and differentiation (AR, ERalpha, ERbeta, aromatase, and Sf1) in AKG, AK, and isolated gonad tissues. In all cases, gonad-specific gene expression patterns were attenuated in AKG versus gonad tissue. All five genes were expressed in the AK in addition to the gonad at all stages/temperatures. Inclusion of the AK compartment masked important changes in gonadal gene expression. In addition, AK and gonad expression patterns are not additive, and gonadal gene expression cannot be predicted from intact AKG measurements. (c) 2007 Wiley-Liss, Inc.

  12. Size optimization for complex permeability measurement of magnetic thin films using a short-circuited microstrip line up to 30 GHz

    Science.gov (United States)

    Takeda, Shigeru; Naoe, Masayuki

    2018-03-01

    High-frequency permeability spectra of magnetic films were measured over a wideband frequency range of 0.1-30 GHz using a shielded and short-circuited microstrip line jig. In this measurement, spurious resonances had to be suppressed up to the highest frequency. To suppress these resonances, characteristic impedance of the microstrip line should approach 50 Ω at the junction between connector and microstrip line. The main factors dominating these resonances were structures of the jig and the sample. The dimensions were optimized in various experiments, and results demonstrated that the frequency could be raised to at least 20 GHz. For the transverse electromagnetic mode to transmit stably along the microstrip line, the preferred sample was rectangular, with the shorter side parallel to the line and the longer side perpendicular to it, and characteristic impedance strongly depended on the signal line width of the jig. However, too small a jig and sample led to a lower S/N ratio.
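    The impedance-matching argument can be quantified with elementary transmission-line theory (a textbook sketch, not taken from the paper): the voltage reflection coefficient at the connector-microstrip junction is Gamma = (Z - Z0)/(Z + Z0), so the closer the line is to 50 Ω, the weaker the reflections that drive the spurious resonances.

```python
def reflection_coefficient(z_line, z_ref=50.0):
    """Voltage reflection coefficient at the junction between a z_ref feed
    (e.g. a 50-ohm connector) and a line of characteristic impedance z_line."""
    return (z_line - z_ref) / (z_line + z_ref)

# A well-matched microstrip reflects little; mismatch feeds the
# standing-wave resonances the authors work to suppress.
for z in (50.0, 75.0, 100.0):
    print(f"Z = {z:5.1f} ohm -> Gamma = {reflection_coefficient(z):+.3f}")
```

    This is why the jig dimensions, which set the line's characteristic impedance, dominate the usable upper frequency.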

  13. Complex Networks

    CERN Document Server

    Evsukoff, Alexandre; González, Marta

    2013-01-01

    In the last decade we have seen the emergence of a new interdisciplinary field focusing on the understanding of networks that are dynamic, large, open, and have a structure sometimes called random-biased. The field of Complex Networks is helping us better understand many complex phenomena such as the spread of diseases, protein interactions, and social relationships, to name but a few. Studies in Complex Networks are gaining attention due to some major scientific breakthroughs proposed by network scientists helping us understand and model interactions contained in large datasets. In fact, if we could point to one event leading to the widespread use of complex network analysis, it would be the availability of online databases. Theories of random graphs from Erdös and Rényi from the late 1950s led us to believe that most networks had random characteristics. The work on large online datasets told us otherwise. Starting with the work of Barabási and Albert as well as Watts and Strogatz in the late 1990s, we now know th...

  14. Cyclomatic Complexity: theme and variations

    Directory of Open Access Journals (Sweden)

    Brian Henderson-Sellers

    1993-11-01

    Focussing on the "McCabe family" of measures for the decision/logic structure of a program leads to an evaluation of extensions to modularization, nesting and, potentially, object-oriented program structures. A comparison of the rated, operating, and essential complexities of programs suggests two new metrics: "inessential complexity" as a measure of unstructuredness and "product complexity" as a potential objective measure of structural complexity. Finally, nesting and abstraction levels are considered, especially as to how metrics from the "McCabe family" might be applied in an object-oriented systems development environment.
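    McCabe's cyclomatic complexity, the base measure of the family discussed above, is M = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components; a minimal sketch (the graph counts below are illustrative, not drawn from the article):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity M = E - N + 2P of a control-flow graph."""
    return edges - nodes + 2 * components

# Straight-line code: 2 nodes joined by 1 edge -> M = 1 (a single path).
print(cyclomatic_complexity(1, 2))   # 1
# A routine whose graph has 9 edges and 8 nodes -> M = 3, i.e. two binary
# decision points (for structured code, M = decisions + 1).
print(cyclomatic_complexity(9, 8))   # 3
```

    The extensions the article surveys vary what counts as an edge, node, or component once modules, nesting, or objects enter the picture.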

  15. Measurements of thermodynamic constants of transuranic compounds to predict their geochemistry: Carbonate complexation of Np(V) and Am(III), hydrolysis of Pu(VI) and AM(III)

    International Nuclear Information System (INIS)

    Vitorge, P.

    1984-02-01

    This work is part of a project to build a waste disposal facility and to model the behaviour of the radioactive elements. The contribution consists of predicting the geochemistry of the actinides and measuring the thermodynamic equilibrium constants, solubility products, and redox potentials (E) used in the migration model. It is shown that the literature can provide values for some of these constants, and theoretical E(pH) diagrams and solubilities of U, Np, Pu, and Am are proposed. All transport will occur essentially in the aqueous phase leaching the repository site, where hydrolysis and carbonate complexation are important chemical reactions. The hydrolysis and carbonate complexation constants have already been reviewed. The experimental results are compared with the literature data, which are usually chosen to predict the solubilities and the proportions of the different forms of these actinides in natural groundwaters.

  16. Long-Term Hydrologic Impact Assessment of Non-point Source Pollution Measured Through Land Use/Land Cover (LULC) Changes in a Tropical Complex Catchment

    Science.gov (United States)

    Abdulkareem, Jabir Haruna; Sulaiman, Wan Nor Azmin; Pradhan, Biswajeet; Jamil, Nor Rohaizah

    2018-05-01

    The contribution of non-point source (NPS) pollution to the contamination of surface water is an issue of growing concern. NPS pollutants are of various types and are altered by several site-specific factors, making them difficult to control owing to the complex uncertainties involved in their behavior. The Kelantan River basin, Malaysia, is a tropical catchment receiving heavy monsoon rainfall coupled with intense land use/land cover (LULC) changes, making the area consistently flood-prone and thereby deteriorating surface water quality. This study was conducted to determine the spatio-temporal variation of NPS pollutant loads among different LULC changes and to establish relationships of NPS pollutant loads among LULC conditions and sub-basins in each catchment. Four pollutant parameters, total suspended solids (TSS), total phosphorus (TP), total nitrogen (TN), and ammonia nitrogen (AN), were chosen with their corresponding event mean concentration (EMC) values. A soil map and LULC change maps corresponding to 1984, 2002, and 2013 were used for the calculation of runoff and NPS pollutant loads using numeric integration in a GIS environment. Analysis of variance (ANOVA) was conducted to compare NPS pollutant loads among the three LULC conditions and the sub-basins in each catchment. The results showed that the spatio-temporal variation of pollutant loads in almost all the catchments increased with changes in LULC condition from 1984 to 2013, with the 2013 LULC condition dominant in almost all cases. NPS pollutant loads among different LULC changes also increased from 1984 to 2013, with urbanization the dominant LULC change yielding the highest pollutant load in all the catchments. ANOVA revealed that the statistically most significant (p < 0.05) pollutant loads were obtained under the 2013 LULC conditions. The findings of this study may be useful to water resource planners in controlling water pollution.
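    The EMC approach estimates an event load as concentration times runoff volume. A minimal sketch (hypothetical numbers; the paper's GIS-based numeric integration over soil and LULC maps is not reproduced here):

```python
def pollutant_load_kg(emc_mg_per_l, runoff_m3):
    """Event pollutant load in kg from an event mean concentration (mg/L)
    and a runoff volume (m^3); 1 m^3 = 1000 L and 1e6 mg = 1 kg."""
    return emc_mg_per_l * runoff_m3 * 1000.0 / 1e6

# Hypothetical event: TSS at an EMC of 150 mg/L over 2.0e4 m^3 of runoff.
print(pollutant_load_kg(150.0, 2.0e4), "kg of TSS")   # 3000.0 kg of TSS
```

    In the study, this product is evaluated per sub-basin and per LULC condition, which is what makes the 1984/2002/2013 loads comparable.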

  17. Distribution equilibria of Eu(III) in the system: bis(2-ethylhexyl)phosphoric acid organic diluent-NaCl, lactic acid, polyaminocarboxylic acid, water

    International Nuclear Information System (INIS)

    Danesi, P.R.; Cianetti, C.; Horwitz, E.P.

    1982-01-01

    The distribution equilibria of Eu3+ between aqueous phases containing lactic acid and N'-(2-hydroxyethyl)ethylenediamine-N,N,N'-triacetic acid (HEDTA) or diethylenetriamine-N,N,N',N',N''-pentaacetic acid (DTPA) at constant ionic strength (μ = 1.0), and n-dodecane solutions of HDEHP, have been studied. The formation constants of the simple Eu-lactate complexes and the Eu-lactate-HEDTA mixed complex were evaluated from the Kd data. The conclusion is reached that no lactic acid is coextracted into the organic phase at tracer metal concentrations. The separation factors between Eu3+, Pm3+, and Am3+ have been evaluated in the presence of HEDTA

  18. Complexity and Dynamical Depth

    Directory of Open Access Journals (Sweden)

    Terrence Deacon

    2014-07-01

    We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information-theoretic, or thermodynamic conceptions of structural complexity. What we call a system's dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.

  20. A simple measure with complex determinants: investigation of the correlates of self-rated health in older men and women from three continents

    Directory of Open Access Journals (Sweden)

    French Davina J

    2012-08-01

    consider earlier life experiences of cohorts as well as national and individual factors in later life. Further research is required to understand the complex societal influences on perceptions of health.

  1. Complex Networks IX

    CERN Document Server

    Coronges, Kate; Gonçalves, Bruno; Sinatra, Roberta; Vespignani, Alessandro; Proceedings of the 9th Conference on Complex Networks; CompleNet 2018

    2018-01-01

    This book aims to bring together researchers and practitioners working across domains and research disciplines to measure, model, and visualize complex networks. It collects the works presented at the 9th International Conference on Complex Networks (CompleNet 2018), held in Boston, MA, in March 2018. With roots in the physical, information, and social sciences, the study of complex networks provides a formal set of mathematical methods, computational tools, and theories to describe, prescribe, and predict the dynamics and behaviors of complex systems. Despite their diversity, whether the systems are made up of physical, technological, informational, or social networks, they share many common organizing principles and thus can be studied with similar approaches. This book provides a view of the state of the art in this dynamic field and covers topics such as group decision-making, brain and cellular connectivity, network controllability and resiliency, online activism, recommendation systems, and cyber security.

  2. Complexities of bloom dynamics in the toxic dinoflagellate Alexandrium fundyense revealed through DNA measurements by imaging flow cytometry coupled with species-specific rRNA probes

    Science.gov (United States)

    Brosnahan, Michael L.; Farzan, Shahla; Keafer, Bruce A.; Sosik, Heidi M.; Olson, Robert J.; Anderson, Donald M.

    2014-05-01

    Measurements of the DNA content of different protist populations can shed light on a variety of processes, including cell division, sex, prey ingestion, and parasite invasion. Here, we modified an Imaging FlowCytobot (IFCB), a custom-built flow cytometer that records images of microplankton, to measure the DNA content of large dinoflagellates and other high-DNA content species. The IFCB was also configured to measure fluorescence from Cy3-labeled rRNA probes, aiding the identification of Alexandrium fundyense (syn. A. tamarense Group I), a photosynthetic dinoflagellate that causes paralytic shellfish poisoning (PSP). The modified IFCB was used to analyze samples from the development, peak and termination phases of an inshore A. fundyense bloom (Salt Pond, Eastham, MA, USA), and from a rare A. fundyense ‘red tide’ that occurred in the western Gulf of Maine, offshore of Portsmouth, NH (USA). Diploid or G2 phase (‘2C’) A. fundyense cells were frequently enriched at the near-surface, suggesting an important role for aggregation at the air-sea interface during sexual events. Also, our analysis showed that large proportions of A. fundyense cells in both the Salt Pond and red tide blooms were planozygotes during bloom decline, highlighting the importance of sexual fusion to bloom termination. At Salt Pond, bloom decline also coincided with a dramatic rise in infections by the parasite genus Amoebophrya. The samples that were most heavily infected contained many large cells with higher DNA-associated fluorescence than 2C vegetative cells, but these cells' nuclei were also frequently consumed by Amoebophrya trophonts. Neither large cell size nor increased DNA-associated fluorescence could be replicated by infecting an A. fundyense culture of vegetative cells. Therefore, we attribute these characteristics of the large Salt Pond cells to planozygote maturation rather than Amoebophrya infection, though an interaction between infection and planozygote maturation may

  3. Elastic properties and seismic anisotropy of the Seve Nappe Complex - Laboratory core measurements from the International Continental Drilling Project COSC-1 well, Åre, Sweden

    Science.gov (United States)

    Wenning, Q. C.; Almqvist, B. S. G.; Zappone, A. S.

    2015-12-01

    The COSC-1 scientific borehole was drilled in the summer of 2014 to ~2.5 km depth to study the structure and composition of the Middle Allochthon of the Central Scandinavian Caledonides. It crosscuts the amphibolite-grade lower part of the Seve nappe and intersects a mylonite zone in the lower 800 m of the borehole. We selected six core samples representing the primary lithologies in the COSC-1 borehole for laboratory investigation of elastic properties. The cores consisted of two amphibolites with differing grain sizes, a calc-silicate gneiss, a felsic gneiss, a coarse-grained amphibole-bearing gneiss, and a garnet-bearing mylonitic schist from the basal shear zone. Both P- and S-waves were measured at ultrasonic frequency (1 MHz) and room temperature, under hydrostatic pressure up to 260 MPa. Measurements were made along three mutually perpendicular directions: one perpendicular to foliation and two parallel to the foliation, one of which was aligned with the mineral lineation. Vp, Vs, anisotropy, and elastic properties are reported as an extrapolation of the high-pressure portion of the ultrasonic measurements back to the intersection with the zero-pressure axis. The Vp and Vs in the direction perpendicular to foliation range from 5.51-6.67 km/s and 3.18-4.13 km/s, respectively. In the direction parallel to foliation, the Vp and Vs range from 6.31-7.25 km/s and 3.52-4.35 km/s, respectively. Vp anisotropy ranges from 3% in the calc-silicate gneiss to 18% in the mylonitic schist. Acoustic impedance estimates at the lithostatic pressure at the base of the borehole (70 MPa) show that the impedance contrast between the basal shear zone and the overlying units generates reflection coefficients large enough to cause seismic reflections. Above the mylonite/shear zone, the reflectivity within the lower Seve nappe is due to the impedance contrast between the felsic gneiss and the amphibolite. This result fits with 3D seismic reflection imaging in the area of
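    The reflectivity argument rests on the standard normal-incidence reflection coefficient R = (Z2 - Z1)/(Z2 + Z1), with acoustic impedance Z = rho * Vp. A minimal sketch (the densities below are assumed for illustration; only the velocity range comes from the abstract):

```python
def acoustic_impedance(density_kg_m3, vp_m_s):
    """Acoustic impedance Z = rho * Vp (kg m^-2 s^-1)."""
    return density_kg_m3 * vp_m_s

def reflection_coefficient(z1, z2):
    """Normal-incidence amplitude reflection coefficient at an interface."""
    return (z2 - z1) / (z2 + z1)

# Assumed densities (kg/m^3); the Vp values (m/s) lie within the ranges
# reported for the felsic gneiss and the amphibolite.
z_gneiss = acoustic_impedance(2700.0, 6300.0)
z_amphibolite = acoustic_impedance(3000.0, 7000.0)
print(f"R = {reflection_coefficient(z_gneiss, z_amphibolite):.3f}")  # R = 0.105
```

    A coefficient of this order is well above the few-percent level usually needed for a visible reflection, consistent with the authors' conclusion.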

  4. Managing Complexity

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Posse, Christian; Malard, Joel M.

    2004-08-01

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and for deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot otherwise be understood.

  5. Complex refractive index measurements for BaF2 and CaF2 via single-angle infrared reflectance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Kelly-Gorham, Molly Rose K.; DeVetter, Brent M.; Brauer, Carolyn S.; Cannon, Bret D.; Burton, Sarah D.; Bliss, Mary; Johnson, Timothy J.; Myers, Tanya L.

    2017-10-01

    We have re-investigated the optical constants n and k for the homologous series of inorganic salts barium fluoride (BaF2) and calcium fluoride (CaF2) using a single-angle near-normal-incidence reflectance device in combination with a calibrated Fourier transform infrared (FTIR) spectrometer. Our results are in good qualitative agreement with most previous works. However, certain features of the previously published data near the reststrahlen band exhibit distinct differences in spectral characteristics. Notably, our measurements of BaF2 do not reproduce a spectral feature in the ~250 cm-1 reststrahlen band that was previously published. Additionally, CaF2 exhibits a distinct wavelength shift relative to the model derived from previously published data. We confirmed our results against recently published works that use significantly more modern instrumentation and data reduction techniques.
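    At near-normal incidence, the measured reflectance and the optical constants are linked by the Fresnel relation R = |(n~ - 1)/(n~ + 1)|^2 with complex index n~ = n + ik (a textbook identity for an air interface, not specific to this paper); a minimal sketch:

```python
def normal_reflectance(n, k):
    """Near-normal-incidence reflectance at an air interface from the
    complex refractive index n~ = n + i*k (Fresnel):
    R = ((n - 1)**2 + k**2) / ((n + 1)**2 + k**2)."""
    return ((n - 1.0) ** 2 + k ** 2) / ((n + 1.0) ** 2 + k ** 2)

# Transparent region (k ~ 0): n ~ 1.4 gives a few-percent reflectance;
# inside a strong reststrahlen band (large k), R approaches 1.
print(f"{normal_reflectance(1.4, 0.0):.4f}")   # 0.0278
```

    Inverting this relation (with the phase recovered, e.g., by Kramers-Kronig analysis) is what turns a single-angle reflectance spectrum into n and k.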

  6. Management of complex fisheries

    DEFF Research Database (Denmark)

    Frost, Hans Staby; Andersen, Peder; Hoff, Ayoe

    2013-01-01

    , including taking into account the response of the fishermen to implemented management measures. To demonstrate the use of complex management models, this paper assesses a number of second-best management schemes against a first-rank optimum (FRO), an ideal individual transferable quotas (ITQ) system...

  7. Estimating functional cognition in older adults using observational assessments of task performance in complex everyday activities: A systematic review and evaluation of measurement properties.

    Science.gov (United States)

    Wesson, Jacqueline; Clemson, Lindy; Brodaty, Henry; Reppermund, Simone

    2016-09-01

    Functional cognition is a relatively new concept in assessment of older adults with mild cognitive impairment or dementia. Instruments need to be reliable and valid, hence we conducted a systematic review of observational assessments of task performance used to estimate functional cognition in this population. Two separate database searches were conducted: firstly to identify instruments; and secondly to identify studies reporting on the psychometric properties of the instruments. Studies were analysed using a published checklist and their quality reviewed according to specific published criteria. Clinical utility was reviewed and the information formulated into a best evidence synthesis. We found 21 instruments and included 58 studies reporting on measurement properties. The majority of studies were rated as being of fair methodological quality and the range of properties investigated was restricted. Most instruments had studies reporting on construct validity (hypothesis testing), none on content validity and there were few studies reporting on reliability. Overall the evidence on psychometric properties is lacking and there is an urgent need for further evaluation of instruments.

  8. On the Use of Molecular Weight Cutoff Cassettes to Measure Dynamic Relaxivity of Novel Gadolinium Contrast Agents: Example Using Hyaluronic Acid Polymer Complexes in Phosphate-Buffered Saline

    Directory of Open Access Journals (Sweden)

    Nima Kasraie

    2011-01-01

    The aims of this study were to determine whether standard extracellular contrast agents of Gd(III) ions in combination with a polymeric entity susceptible to hydrolytic degradation over a finite period of time, such as Hyaluronic Acid (HA), have sufficient vascular residence time to obtain vascular imaging comparable to current conventional compounds, and to obtain sufficient data to show proof of concept that HA with Gd-DTPA ligands could be useful as a vascular imaging agent. We assessed the dynamic relaxivity of the HA-bound DTPA compounds using a custom-made phantom, as well as relaxation rates at 10.72 MHz with concentrations ranging between 0.09 and 7.96 mM in phosphate-buffered saline. Linear dependences of the static longitudinal relaxation rate (R1) on concentration were found for most measured samples, and the HA samples continued to produce high signal strength 24 hours after injection into a dialysis cassette at 3 T, showing superior dynamic relaxivity values compared to conventional contrast media such as Gd-DTPA-BMA.
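    The linear dependence of R1 on concentration described above is how relaxivity is conventionally extracted: r1 is the slope of R1 = 1/T1 versus agent concentration. A hedged sketch with synthetic data (the rate values below are hypothetical, not the paper's measurements):

```python
import numpy as np

# Relaxivity model: R1 = r1 * [Gd] + R1_diamagnetic, so a linear fit
# of measured R1 against concentration yields r1 as the slope.
conc = np.array([0.1, 0.5, 1.0, 2.0, 4.0])   # mM (hypothetical)
R1 = 0.38 + 4.2 * conc                        # s^-1, synthetic data with r1 = 4.2

r1, R1_0 = np.polyfit(conc, R1, 1)            # slope, intercept
print(round(r1, 2), round(R1_0, 2))           # → 4.2 0.38
```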

  9. Hydrogen storage and evolution catalysed by metal hydride complexes.

    Science.gov (United States)

    Fukuzumi, Shunichi; Suenobu, Tomoyoshi

    2013-01-07

    The storage and evolution of hydrogen are catalysed by appropriate metal hydride complexes. Hydrogenation of carbon dioxide by hydrogen is catalysed by a [C,N] cyclometalated organoiridium complex, [Ir(III)(Cp*)(4-(1H-pyrazol-1-yl-κN(2))benzoic acid-κC(3))(OH(2))](2)SO(4) [Ir-OH(2)](2)SO(4), under atmospheric pressure of H(2) and CO(2) in weakly basic water (pH 7.5) at room temperature. The reverse reaction, i.e., hydrogen evolution from formate, is also catalysed by [Ir-OH(2)](+) in acidic water (pH 2.8) at room temperature. Thus, interconversion between hydrogen and formic acid in water at ambient temperature and pressure has been achieved by using [Ir-OH(2)](+) as an efficient catalyst in both directions depending on pH. The Ir complex [Ir-OH(2)](+) also catalyses regioselective hydrogenation of the oxidised form of β-nicotinamide adenine dinucleotide (NAD(+)) to produce the 1,4-reduced form (NADH) under atmospheric pressure of H(2) at room temperature in weakly basic water. In weakly acidic water, the complex [Ir-OH(2)](+) also catalyses the reverse reaction, i.e., hydrogen evolution from NADH to produce NAD(+) at room temperature. Thus, interconversion between NADH (and H(+)) and NAD(+) (and H(2)) has also been achieved by using [Ir-OH(2)](+) as an efficient catalyst and by changing pH. The iridium hydride complex formed by the reduction of [Ir-OH(2)](+) by H(2) and NADH is responsible for the hydrogen evolution. Photoirradiation (λ > 330 nm) of an aqueous solution of the Ir-hydride complex produced by the reduction of [Ir-OH(2)](+) with alcohols resulted in the quantitative conversion to a unique [C,C] cyclometalated Ir-hydride complex, which can catalyse hydrogen evolution from alcohols in a basic aqueous solution (pH 11.9). The catalytic mechanisms of the hydrogen storage and evolution are discussed by focusing on the reactivity of Ir-hydride complexes.
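    The pH-switched interconversions described in the abstract can be summarised by their overall stoichiometry (direction only; the iridium catalyst and mechanistic intermediates are omitted):

```latex
\begin{align}
\mathrm{CO_2} + \mathrm{H_2} &\rightleftharpoons \mathrm{HCOOH}
  && \text{(forward at pH 7.5, reverse at pH 2.8)} \\
\mathrm{NAD^+} + \mathrm{H_2} &\rightleftharpoons \mathrm{NADH} + \mathrm{H^+}
  && \text{(forward in weakly basic, reverse in weakly acidic water)}
\end{align}
```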

  10. Complex variables

    CERN Document Server

    Flanigan, Francis J

    2010-01-01

    A caution to mathematics professors: Complex Variables does not follow conventional outlines of course material. One reviewer noting its originality wrote: "A standard text is often preferred [to a superior text like this] because the professor knows the order of topics and the problems, and doesn't really have to pay attention to the text. He can go to class without preparation." Not so here; Dr. Flanigan treats this most important field of contemporary mathematics in a most unusual way. While all the material for an advanced undergraduate or first-year graduate course is covered, discussion

  11. Field measurements and modeling to resolve m2 to km2 CH4 emissions for a complex urban source: An Indiana landfill study

    Directory of Open Access Journals (Sweden)

    Maria Obiminda L. Cambaliza

    2017-07-01

    Large spatial and temporal uncertainties for landfill CH4 emissions remain unresolved by short-term field campaigns and historic greenhouse gas (GHG) inventory models. Using four field methods (aircraft-based mass balance, tracer correlation, vertical radial plume mapping, static chambers) and a new field-validated process-based model (California Landfill Methane Inventory Model, CALMIM 5.4), we investigated the total CH4 emissions from a central Indiana landfill as well as the partitioned emissions inclusive of methanotrophic oxidation for the various cover soils at the site. We observed close agreement between whole-site emissions derived from the tracer correlation (8 to 13 mol s–1) and the aircraft mass balance approaches (7 and 17 mol s–1), which were statistically indistinguishable from the modeling result (12 ± 2 mol s–1, inclusive of oxidation). Our model calculations indicated that approximately 90% of the annual average CH4 emissions (11 ± 1 mol s–1; 2200 ± 250 g m–2 d–1) derived from the small daily operational area. Characterized by a thin overnight soil cover directly overlying a thick sequence of older methanogenic waste without biogas recovery, this area constitutes only 2% of the 0.7 km2 total waste footprint area. Because this Indiana landfill is an upwind source for Indianapolis, USA, the resolution of m2 to km2 scale emissions at various temporal scales contributes to improved regional inventories relevant for addressing GHG mitigation strategies. Finally, our comparison of measured to reported CH4 emissions under the US EPA National GHG Reporting program suggests the need to revisit the current IPCC (2006) GHG inventory methodology based on CH4 generation modeling. The reasonable prediction of emissions at individual U.S. landfills requires incorporation of both cover-specific landfill climate modeling (e.g., soil temperature/moisture variability over a typical annual cycle driving CH4 transport and oxidation rates as
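    The abstract mixes whole-site molar rates (mol s–1) with areal mass fluxes (g m–2 d–1); the conversion between the two is a simple scaling by the CH4 molar mass and the emitting area. A sketch with illustrative numbers (not an attempt to reproduce the paper's values):

```python
# Convert a CH4 molar emission rate (mol s^-1) into an areal mass
# flux (g m^-2 d^-1) over a given emitting area.
M_CH4 = 16.04              # g mol^-1, molar mass of CH4
SECONDS_PER_DAY = 86400.0

def areal_flux_g_m2_d(molar_rate_mol_s: float, area_m2: float) -> float:
    return molar_rate_mol_s * M_CH4 * SECONDS_PER_DAY / area_m2

# A hypothetical 10 mol s^-1 source spread over 1 ha (10,000 m^2):
print(round(areal_flux_g_m2_d(10.0, 1.0e4), 1))  # → 1385.9
```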

  12. 31P and 1H NMR studies of the structure of enzyme-bound substrate complexes of lobster muscle arginine kinase: Relaxation measurements with Mn(II) and Co(II)

    International Nuclear Information System (INIS)

    Jarori, G.K.; Ray, B.D.; Rao, B.D.N.

    1989-01-01

    The paramagnetic effects of Mn(II) and Co(II) on the spin-lattice relaxation rates of 31P nuclei of ATP and ADP, and of Mn(II) on the spin-lattice relaxation rate of the δ protons of arginine, bound to arginine kinase from lobster tail muscle have been measured. Temperature variation of 31P relaxation rates in E-MnADP and E-MnATP yields activation energies (ΔE) in the range 6-10 kcal/mol. Thus, the 31P relaxation rates in these complexes are exchange limited and cannot provide structural information. However, the relaxation rates in E-CoADP and E-CoATP exhibit frequency dependence and ΔE values in the range 1-2 kcal/mol; i.e., these rates depend upon 31P-Co(II) distances. These distances were calculated to be in the range 3.2-4.5 angstrom, appropriate for direct coordination between Co(II) and the phosphoryl groups. The paramagnetic effect of Mn(II) on the 1H spin-lattice relaxation rate of the δ protons of arginine in the E-MnADP-Arg complex was also measured at three frequencies. From the frequency dependence of the relaxation rate an effective τC of 0.6 ns has also been calculated, which is most likely to be the electron spin relaxation rate (τS1) for Mn(II) in this complex. The distance estimated on the basis of the reciprocal sixth root of the average relaxation rate of the δ protons was 10.9 ± 0.3 angstrom
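    The "reciprocal sixth root" in this record reflects the Solomon-Bloembergen result that paramagnetic dipolar relaxation enhancement scales as r^-6, so a metal-nucleus distance follows from a ratio of relaxation rates against a calibrated reference pair. A hedged sketch with hypothetical values (not the paper's data):

```python
# Since 1/T1 ∝ r^-6 for dipolar paramagnetic relaxation, a distance
# can be estimated from a reference distance/rate pair:
#   r = r_ref * (R_ref / R) ** (1/6)
def distance_from_rate(r_ref_angstrom: float, rate_ref: float, rate: float) -> float:
    return r_ref_angstrom * (rate_ref / rate) ** (1.0 / 6.0)

# A 64x weaker paramagnetic rate implies a 2x larger distance:
print(round(distance_from_rate(3.5, 640.0, 10.0), 6))  # → 7.0
```

    The sixth-power dependence is why distance estimates from relaxation data are relatively insensitive to errors in the measured rates.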

  13. Complex dynamics

    CERN Document Server

    Carleson, Lennart

    1993-01-01

    Complex dynamics is today very much a focus of interest. Though several fine expository articles were available, by P. Blanchard and by M. Yu. Lyubich in particular, until recently there was no single source where students could find the material with proofs. For anyone in our position, gathering and organizing the material required a great deal of work going through preprints and papers and in some cases even finding a proof. We hope that the results of our efforts will be of help to others who plan to learn about complex dynamics and perhaps even lecture. Meanwhile books in the field are beginning to appear. The Stony Brook course notes of J. Milnor were particularly welcome and useful. Still we hope that our special emphasis on the analytic side will satisfy a need. This book is a revised and expanded version of notes based on lectures of the first author at UCLA over several Winter Quarters, particularly 1986 and 1990. We owe Chris Bishop a great deal of gratitude for supervising the production of cour...

  14. Cosmic Complexity

    Science.gov (United States)

    Mather, John C.

    2012-01-01

    What explains the extraordinary complexity of the observed universe, on all scales from quarks to the accelerating universe? My favorite explanation (which I certainly did not invent) is that the fundamental laws of physics produce natural instability, energy flows, and chaos. Some call the result the Life Force, some note that the Earth is a living system itself (Gaia, a "tough bitch" according to Margulis), and some conclude that the observed complexity requires a supernatural explanation (of which we have many). But my dad was a statistician (of dairy cows) and he told me about cells and genes and evolution and chance when I was very small. So a scientist must look for the explanation of how nature's laws and statistics brought us into conscious existence. And how is it that seemingly improbable events are actually happening all the time? Well, the physicists have countless examples of natural instability, in which energy is released to power change from simplicity to complexity. One of the most common to see is that cooling water vapor below the freezing point produces snowflakes, no two alike, and all complex and beautiful. We see it often so we are not amazed. But physicists have observed so many kinds of these changes from one structure to another (we call them phase transitions) that the Nobel Prize in 1992 could be awarded for understanding the mathematics of their common features. Now for a few examples of how the laws of nature produce the instabilities that lead to our own existence. First, the Big Bang (what an insufficient name!) apparently came from an instability, in which the "false vacuum" eventually decayed into the ordinary vacuum we have today, plus the most fundamental particles we know, the quarks and leptons. So the universe as a whole started with an instability.
Then, a great expansion and cooling happened, and the loose quarks, finding themselves unstable too, bound themselves together into today's less elementary particles like protons and

  15. Theories of computational complexity

    CERN Document Server

    Calude, C

    1988-01-01

    This volume presents four machine-independent theories of computational complexity, which have been chosen for their intrinsic importance and practical relevance. The book includes a wealth of results - classical, recent, and others which have not been published before. In developing the mathematics underlying the size, dynamic and structural complexity measures, various connections with mathematical logic, constructive topology, probability and programming theories are established. The facts are presented in detail. Extensive examples are provided, to help clarify notions and constructions. The lists of exercises and problems include routine exercises, interesting results, as well as some open problems.

  16. Increase of Organization in Complex Systems

    OpenAIRE

    Georgiev, Georgi Yordanov; Daly, Michael; Gombos, Erin; Vinod, Amrit; Hoonjan, Gajinder

    2013-01-01

    Measures of complexity and entropy have not converged to a single quantitative description of levels of organization of complex systems. Such a measure is increasingly needed in all disciplines studying complex systems. To address this problem, starting from the most fundamental principle in physics, a new measure for the quantity of organization and the rate of self-organization in complex systems, based on the principle of least (stationary) action, is applied here to a model system -...

  17. A Cryo Complex Control

    CERN Document Server

    Alferov, V; Fedorchenko, V; Ivanova, N; Kholkin, A; Klimov, S; Krendelev, V; Kuznetsov, S; Lukyantsev, A; Lutchev, A; Milutkin, V; Sytin, A N; Vasilev, D

    2004-01-01

    A cryogenic complex is being constructed to supply the RF separator of kaons with liquid helium and nitrogen. About 500 parameters have to be measured, including temperatures (1.8…300) K, liquid helium/nitrogen levels, vacuum, and 300 digital signals; 70 commands must be generated and 20 closed loops activated. The paper describes the control electronics, which include home-made i8051-compatible controllers connected by the CAN field bus to a bus controller, plus interface electronic modules for: - temperature measurements; - liquid N2 and He level measurements; - vacuum pump current measurements; - analog and digital signal measurement and generation. The modules are tested together with signal imitators within a vertical slice of the Control System based on EPICS tools.

  18. Obesogenic environments: complexities, perceptions, and objective measures

    National Research Council Canada - National Science Library

    Lake, Amelia A; Townshend, Tim G; Alvanides, Seraphim

    2010-01-01

    ... In a world where obesity has now reached epidemic proportions, a thorough understanding of the underlying causes of the problem is essential if public health initiatives and government policies are to successfully address the issue. Beginning with an overarching introduction to obesity and its implications for health and wellbeing, the book will move on to consider such crucial areas as eating behaviours and food environments, physical activity and food access. This groundbreaking book brings together for the first time the knowledge of dietitians, epidemiologists and town planners in order to offer a multidisciplinary approach to public health, suggesting new and exciting ways to shape our environment to better support healthful decisions"--Provided by publisher.