WorldWideScience

Sample records for quantifying history-dependent behavior

  1. Memory in microbes: quantifying history-dependent behavior in a bacterium.

    Directory of Open Access Journals (Sweden)

    Denise M Wolf

    2008-02-01

    Memory is usually associated with higher organisms rather than bacteria. However, evidence is mounting that many regulatory networks within bacteria are capable of complex dynamics and multi-stable behaviors that have been linked to memory in other systems. Moreover, it is recognized that bacteria that have experienced different environmental histories may respond differently to current conditions. These "memory" effects may be more than incidental to the regulatory mechanisms controlling acclimation or to the status of the metabolic stores. Rather, they may be regulated by the cell and confer fitness to the organism in the evolutionary game it participates in. Here, we propose that history-dependent behavior is a potentially important manifestation of memory, worth classifying and quantifying. To this end, we develop an information-theory based conceptual framework for measuring both the persistence of memory in microbes and the amount of information about the past encoded in history-dependent dynamics. This method produces a phenomenological measure of cellular memory without regard to the specific cellular mechanisms encoding it. We then apply this framework to a strain of Bacillus subtilis engineered to report on commitment to sporulation and degradative enzyme (AprE) synthesis and estimate the capacity of these systems and growth dynamics to 'remember' 10 distinct cell histories prior to application of a common stressor. The analysis suggests that B. subtilis remembers, both in short and long term, aspects of its cell history, and that this memory is distributed differently among the observables. While this study does not examine the mechanistic bases for memory, it presents a framework for quantifying memory in cellular behaviors and is thus a starting point for studying new questions about cellular regulation and evolutionary strategy.
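
    The measure described is phenomenological, so a compact numerical sketch can convey its shape. Below is a hedged illustration (synthetic data, a made-up decay model, and a plug-in entropy estimator; not the authors' pipeline) of treating the history as a discrete random variable H and an observable X_t measured after the common stressor, with memory at time t scored as the mutual information I(H; X_t):

    ```python
    # Sketch of an information-theoretic memory measure in the spirit of the
    # abstract: memory at time t is the mutual information I(H; X_t) between
    # the (discrete) cell history H and an observable X_t measured after a
    # common stressor. All data are synthetic; binning, decay model, and
    # estimator are illustrative assumptions.
    import numpy as np

    def mutual_information(h, x, n_bins=8):
        """Plug-in MI estimate (in bits) between discrete h and continuous x."""
        x_binned = np.digitize(x, np.histogram_bin_edges(x, bins=n_bins))
        joint, _, _ = np.histogram2d(h, x_binned, bins=(len(set(h)), n_bins + 2))
        p = joint / joint.sum()
        px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    histories = rng.integers(0, 10, size=2000)          # 10 distinct cell histories
    for t in [0.0, 1.0, 2.0, 4.0]:                      # time since common stressor
        signal = histories * np.exp(-t / 2.0)           # history signal fades with t
        x_t = signal + rng.normal(0, 2.0, size=2000)    # noisy reporter readout
        print(f"t={t:.1f}  I(H; X_t) ~ {mutual_information(histories, x_t):.2f} bits")
    ```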

  2. Memory in microbes: quantifying history-dependent behavior in a bacterium.

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, Denise M.; Fontaine-Bodin, Lisa; Bischofs, Ilka; Price, Gavin; Keasling, Jay; Arkin, Adam P.

    2007-11-15

    Memory is usually associated with higher organisms rather than bacteria. However, evidence is mounting that many regulatory networks within bacteria are capable of complex dynamics and multi-stable behaviors that have been linked to memory in other systems. Moreover, it is recognized that bacteria that have experienced different environmental histories may respond differently to current conditions. These "memory" effects may be more than incidental to the regulatory mechanisms controlling acclimation or to the status of the metabolic stores. Rather, they may be regulated by the cell and confer fitness to the organism in the evolutionary game it participates in. Here, we propose that history-dependent behavior is a potentially important manifestation of memory, worth classifying and quantifying. To this end, we develop an information-theory based conceptual framework for measuring both the persistence of memory in microbes and the amount of information about the past encoded in history-dependent dynamics. This method produces a phenomenological measure of cellular memory without regard to the specific cellular mechanisms encoding it. We then apply this framework to a strain of Bacillus subtilis engineered to report on commitment to sporulation and degradative enzyme (AprE) synthesis and estimate the capacity of these systems and growth dynamics to 'remember' 10 distinct cell histories prior to application of a common stressor. The analysis suggests that B. subtilis remembers, both in short and long term, aspects of its cell history, and that this memory is distributed differently among the observables. While this study does not examine the mechanistic bases for memory, it presents a framework for quantifying memory in cellular behaviors and is thus a starting point for studying new questions about cellular regulation and evolutionary strategy.

  3. Memory in Microbes: Quantifying History-Dependent Behavior in a Bacterium

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, Denise M.; Fontaine-Bodin, Lisa; Bischofs, Ilka; Price, Gavin; Keasling, Jay; Arkin, Adam P.

    2007-11-15

    Memory is usually associated with higher organisms rather than bacteria. However, evidence is mounting that many regulatory networks within bacteria are capable of complex dynamics and multi-stable behaviors that have been linked to memory in other systems. Moreover, it is recognized that bacteria that have experienced different environmental histories may respond differently to current conditions. These "memory" effects may be more than incidental to the regulatory mechanisms controlling acclimation or to the status of the metabolic stores. Rather, they may be regulated by the cell and confer fitness to the organism in the evolutionary game it participates in. Here, we propose that history-dependent behavior is a potentially important manifestation of memory, worth classifying and quantifying. To this end, we develop an information-theory based conceptual framework for measuring both the persistence of memory in microbes and the amount of information about the past encoded in history-dependent dynamics. This method produces a phenomenological measure of cellular memory without regard to the specific cellular mechanisms encoding it. We then apply this framework to a strain of Bacillus subtilis engineered to report on commitment to sporulation and degradative enzyme (AprE) synthesis and estimate the capacity of these systems and growth dynamics to "remember" 10 distinct cell histories prior to application of a common stressor. The analysis suggests that B. subtilis remembers, both in short and long term, aspects of its cell history, and that this memory is distributed differently among the observables. While this study does not examine the mechanistic bases for memory, it presents a framework for quantifying memory in cellular behaviors and is thus a starting point for studying new questions about cellular regulation and evolutionary strategy.

  4. An index for quantifying flocking behavior.

    Science.gov (United States)

    Quera, Vicenç; Herrando, Salvador; Beltran, Francesc S; Salas, Laura; Miñano, Meritxell

    2007-12-01

    One of the classic research topics in adaptive behavior is the collective displacement of groups of organisms such as flocks of birds, schools of fish, herds of mammals, and crowds of people. However, most agent-based simulations of group behavior do not provide a quantitative index for determining the point at which the flock emerges. An index of the aggregation of moving individuals in a flock was developed, and an example is provided of how it can be used to quantify the degree to which a group of moving individuals actually forms a flock.
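
    The paper defines its own index; as a hedged stand-in, the sketch below computes two standard ingredients such an index can combine, heading polarization (alignment) and mean nearest-neighbor distance (cohesion). Any threshold for declaring that a flock has emerged would be an additional assumption:

    ```python
    # Generic flocking measures, not the paper's specific index.
    import numpy as np

    def polarization(headings):
        """1.0 = perfectly aligned headings (radians), ~0 = disordered."""
        return float(np.abs(np.exp(1j * headings).mean()))

    def mean_nn_distance(positions):
        """Average distance from each individual to its nearest neighbor."""
        d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        return float(d.min(axis=1).mean())

    rng = np.random.default_rng(1)
    pos = rng.uniform(0, 50, size=(30, 2))        # 30 agents in a 50x50 arena
    theta = rng.normal(0.3, 0.2, size=30)         # mostly aligned group
    print("polarization:", round(polarization(theta), 3),
          " mean NN distance:", round(mean_nn_distance(pos), 2))
    ```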

  5. History-Dependent Risk Attitude, Second Version

    OpenAIRE

    David Dillenberger; Kareen Rozen

    2011-01-01

    We propose a model of history-dependent risk attitude, allowing a decision maker’s risk attitude to be affected by his history of disappointments and elations. The decision maker recursively evaluates compound risks, classifying realizations as disappointing or elating using a threshold rule. We establish equivalence between the model and two cognitive biases: risk attitudes are reinforced by experiences (one is more risk averse after disappointment than after elation) and there is a primacy ...
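
    A toy illustration of the two ingredients named in the abstract: a threshold rule classifying realizations as disappointing or elating, and reinforcement of risk attitudes by experience. The CRRA utility, the expected-value threshold, and the update step size are all assumptions; the paper's recursive evaluation of compound lotteries is considerably richer:

    ```python
    # Threshold rule + reinforcement, as a toy; not the paper's functional form.
    import numpy as np

    def crra_certainty_equivalent(outcomes, probs, gamma):
        """CRRA certainty equivalent for gamma in (0, 1)."""
        u = outcomes ** (1 - gamma) / (1 - gamma)
        return (np.dot(probs, u) * (1 - gamma)) ** (1 / (1 - gamma))

    outcomes, probs = np.array([50.0, 150.0]), np.array([0.5, 0.5])
    gamma, history = 0.5, []
    rng = np.random.default_rng(2)
    for _ in range(5):
        realization = rng.choice(outcomes, p=probs)
        threshold = np.dot(probs, outcomes)          # assumed cutoff: expected value
        if realization < threshold:                  # disappointing draw
            gamma, history = min(gamma + 0.1, 0.9), history + ["D"]
        else:                                        # elating draw
            gamma, history = max(gamma - 0.1, 0.1), history + ["E"]
        print(history[-1], f"gamma={gamma:.1f}",
              f"CE={crra_certainty_equivalent(outcomes, probs, gamma):.1f}")
    ```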

  6. History dependence in insect flight decisions during odor tracking.

    Science.gov (United States)

    Pang, Rich; van Breugel, Floris; Dickinson, Michael; Riffell, Jeffrey A; Fairhall, Adrienne

    2018-02-01

    Natural decision-making often involves extended decision sequences in response to variable stimuli with complex structure. As an example, many animals follow odor plumes to locate food sources or mates, but turbulence breaks up the advected odor signal into intermittent filaments and puffs. This scenario provides an opportunity to ask how animals use sparse, instantaneous, and stochastic signal encounters to generate goal-oriented behavioral sequences. Here we examined the trajectories of flying fruit flies (Drosophila melanogaster) and mosquitoes (Aedes aegypti) navigating in controlled plumes of attractive odorants. While it is known that mean odor-triggered flight responses are dominated by upwind turns, individual responses are highly variable. We asked whether deviations from mean responses depended on specific features of odor encounters, and found that odor-triggered turns were slightly but significantly modulated by two features of odor encounters. First, encounters with higher concentrations triggered stronger upwind turns. Second, encounters occurring later in a sequence triggered weaker upwind turns. To contextualize the latter history dependence theoretically, we examined trajectories simulated from three normative tracking strategies. We found that neither a purely reactive strategy nor a strategy in which the tracker learned the plume centerline over time captured the observed history dependence. In contrast, "infotaxis", in which flight decisions maximized expected information gain about source location, exhibited a history dependence aligned in sign with the data, though much larger in magnitude. These findings suggest that while true plume tracking is dominated by a reactive odor response it might also involve a history-dependent modulation of responses consistent with the accumulation of information about a source over multi-encounter timescales. This suggests that short-term memory processes modulating decision sequences may play a role in
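
    The reported analysis boils down to regressing turn strength on features of each odor encounter. The sketch below reproduces that logic on synthetic data (coefficients and noise levels invented), recovering a positive concentration effect and a negative encounter-order effect as described:

    ```python
    # Regress upwind-turn magnitude on encounter features; synthetic data only.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    conc = rng.lognormal(0.0, 0.5, n)          # odor concentration at encounter
    order = rng.integers(1, 20, n)             # encounter number within a plume
    turn = 30 + 4.0 * np.log(conc) - 0.6 * order + rng.normal(0, 8, n)  # deg upwind

    X = np.column_stack([np.ones(n), np.log(conc), order])
    beta, *_ = np.linalg.lstsq(X, turn, rcond=None)
    print("intercept, log-concentration, encounter-order coefficients:", beta)
    # positive concentration coefficient, negative order coefficient, as reported
    ```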

  7. Adaptive Smoothed Finite Elements (ASFEM) for history dependent material models

    International Nuclear Information System (INIS)

    Quak, W.; Boogaard, A. H. van den

    2011-01-01

    A successful simulation of a bulk forming process with finite elements can be difficult due to distortion of the finite elements. Nodal smoothed Finite Elements (NSFEM) are an interesting option for such a process since they show good distortion insensitivity and moreover have locking-free behavior and good computational efficiency. In this paper a method is proposed which takes advantage of the nodally smoothed field. This method, named adaptive smoothed finite elements (ASFEM), revises the mesh for every step of a simulation without mapping the history-dependent material parameters. In this paper an updated-Lagrangian implementation is presented. Several examples are given to illustrate the method and to show its properties.

  8. A particle method for history-dependent materials

    Energy Technology Data Exchange (ETDEWEB)

    Sulsky, D.; Chen, Z.; Schreyer, H.L. [New Mexico Univ., Albuquerque, NM (United States)]

    1993-06-01

    A broad class of engineering problems including penetration, impact and large rotations of solid bodies causes severe numerical problems. For these problems, the constitutive equations are history dependent so material points must be followed; this is difficult to implement in an Eulerian scheme. On the other hand, purely Lagrangian methods typically result in severe mesh distortion and the consequence is ill conditioning of the element stiffness matrix leading to mesh lockup or entanglement. Remeshing prevents the lockup and tangling but then interpolation must be performed for history dependent variables, a process which can introduce errors. Proposed here is an extension of the particle-in-cell method in which particles are interpreted to be material points that are followed through the complete loading process. A fixed Eulerian grid provides the means for determining a spatial gradient. Because the grid can also be interpreted as an updated Lagrangian frame, the usual convection term in the acceleration associated with Eulerian formulations does not appear. With the use of maps between material points and the grid, the advantages of both Eulerian and Lagrangian schemes are utilized so that mesh tangling is avoided while material variables are tracked through the complete deformation history. Example solutions in two dimensions are given to illustrate the robustness of the proposed convection algorithm and to show that typical elastic behavior can be reproduced. Also, it is shown that impact with no slip is handled without any special algorithm for bodies governed by elasticity and strain hardening plasticity.
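
    A minimal 1D sketch of the particle-in-cell idea described here (now known as the material point method): particles carry the history variable (stress), while a background grid supplies gradients and the momentum update. The linear shape functions, PIC velocity transfer, elastic constitutive law, and all parameters are simplifying assumptions:

    ```python
    # Minimal 1D material-point sketch: history lives on particles, the grid
    # provides spatial gradients. Elastic bar, fixed left end; all parameters
    # are illustrative.
    import numpy as np

    L, n_cells = 1.0, 20
    dx = L / n_cells
    nodes = np.arange(n_cells + 1) * dx

    def shape(xp):
        """Linear hat functions: supporting node indices, weights, gradients."""
        i = np.clip((xp // dx).astype(int), 0, n_cells - 1)
        xi = (xp - nodes[i]) / dx
        idx = np.stack([i, i + 1])
        w = np.stack([1 - xi, xi])                                  # N_i(x_p)
        dw = np.stack([-np.ones_like(xi), np.ones_like(xi)]) / dx   # dN_i/dx
        return idx, w, dw

    xp = np.linspace(0.05, 0.45, 40)        # material points in left half of bar
    vp = 0.1 * np.ones_like(xp)             # initial particle velocity
    mp = np.full_like(xp, 1.0 * dx / 2)     # lumped particle mass (density 1)
    Vp = np.full_like(xp, dx / 2)           # particle volume
    sig = np.zeros_like(xp)                 # stress: the history variable
    E, dt = 100.0, 1e-4

    for step in range(2000):
        idx, w, dw = shape(xp)
        mg = np.zeros(n_cells + 1); pg = np.zeros(n_cells + 1); fg = np.zeros(n_cells + 1)
        np.add.at(mg, idx, w * mp)              # scatter mass to grid
        np.add.at(pg, idx, w * mp * vp)         # scatter momentum to grid
        np.add.at(fg, idx, -dw * Vp * sig)      # internal force f_i = -sum V_p sig_p dN_i
        vg = np.divide(pg + dt * fg, mg, out=np.zeros_like(mg), where=mg > 1e-12)
        vg[0] = 0.0                             # fixed left end
        vp = (w * vg[idx]).sum(axis=0)          # gather velocity back (PIC)
        xp = xp + dt * vp
        deps = dt * (dw * vg[idx]).sum(axis=0)  # strain increment at each particle
        sig = sig + E * deps                    # elastic constitutive update
    ```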

  9. Spatially quantifying the leadership effectiveness in collective behavior

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Haitao, E-mail: zht@mail.hust.edu.cn [State Key Laboratory of Digital Manufacturing Equipment and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); Wang Ning [Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Michael Z Q [Department of Mechanical Engineering, University of Hong Kong, Pok Fu Lam Road, Hong Kong (Hong Kong); Su Riqi; Zhou Tao, E-mail: zhutou@ustc.edu [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China); Zhou Changsong, E-mail: cszhou@hkbu.edu.hk [Department of Physics, Centre for Nonlinear Studies, and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Hong Kong Baptist University, Kowloon Tong (Hong Kong)]

    2010-12-15

    Among natural biological flocks/swarms and mass social activities, once the collective behavior of the followers has been dominated by the direction or opinion of one leader group, it seems difficult for later-coming leaders to reverse the orientation of the mass followers, especially when they are in the quantitative minority. This paper, however, reports a counter-intuitive phenomenon, Following the Later-coming Minority, provided that the later-comers obey a favorable distribution pattern that enables them to spread their influence to as many followers as possible within a given time and to be dense enough to govern, from the beginning, the local followers they can influence directly. We introduce a discriminant index to quantify the whole group's orientation under competing leaderships, with which the eventual orientation of the mass followers can be predicted before launching the real dynamical procedure. From the application point of view, this leadership-effectiveness index also helps us to design an economical way for the minority of later-coming leaders to defeat the dominating majority leaders solely by optimizing their spatial distribution pattern, provided that the premeditated goal is available. Our investigation provides insights into effective leadership in biological systems, with meaningful implications for social and industrial applications.
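
    The paper's discriminant index is spatial and predictive; as a much simpler hedged stand-in, the sketch below scores followers' final headings against each leader group's target direction and uses the sign of the difference to say which leadership won:

    ```python
    # Toy orientation score under two competing leader directions; the actual
    # index in the paper is predictive and based on spatial distributions.
    import numpy as np

    def alignment(headings, target):
        """Mean cosine alignment of follower headings with a leader direction."""
        return float(np.cos(headings - target).mean())

    rng = np.random.default_rng(4)
    theta_A, theta_B = 0.0, np.pi               # competing leader directions
    followers = rng.normal(theta_B, 0.7, 200)   # followers drifting toward B
    index = alignment(followers, theta_A) - alignment(followers, theta_B)
    print("discriminant index:", round(index, 3),
          "-> group B wins" if index < 0 else "-> group A wins")
    ```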

  10. Spatially quantifying the leadership effectiveness in collective behavior

    International Nuclear Information System (INIS)

    Zhang Haitao; Wang Ning; Chen, Michael Z Q; Su Riqi; Zhou Tao; Zhou Changsong

    2010-01-01

    Among natural biological flocks/swarms and mass social activities, once the collective behavior of the followers has been dominated by the direction or opinion of one leader group, it seems difficult for later-coming leaders to reverse the orientation of the mass followers, especially when they are in the quantitative minority. This paper, however, reports a counter-intuitive phenomenon, Following the Later-coming Minority, provided that the later-comers obey a favorable distribution pattern that enables them to spread their influence to as many followers as possible within a given time and to be dense enough to govern, from the beginning, the local followers they can influence directly. We introduce a discriminant index to quantify the whole group's orientation under competing leaderships, with which the eventual orientation of the mass followers can be predicted before launching the real dynamical procedure. From the application point of view, this leadership-effectiveness index also helps us to design an economical way for the minority of later-coming leaders to defeat the dominating majority leaders solely by optimizing their spatial distribution pattern, provided that the premeditated goal is available. Our investigation provides insights into effective leadership in biological systems, with meaningful implications for social and industrial applications.

  11. Reliability analysis of Markov history-dependent repairable systems with neglected failures

    International Nuclear Information System (INIS)

    Du, Shijia; Zeng, Zhiguo; Cui, Lirong; Kang, Rui

    2017-01-01

    Markov history-dependent repairable systems refer to the Markov repairable systems in which some states are changeable and dependent on recent evolutional history of the system. In practice, many Markov history-dependent repairable systems are subjected to neglected failures, i.e., some failures do not affect system performance if they can be repaired promptly. In this paper, we develop a model based on the theory of aggregated stochastic processes to describe the history-dependent behavior and the effect of neglected failures on the Markov history-dependent repairable systems. Based on the developed model, instantaneous and steady-state availabilities are derived to characterize the reliability of the system. Four reliability-related time distributions, i.e., distribution for the kth working period, distribution for the kth failure period, distribution for the real working time in an effective working period, and distribution for the neglected failure time in an effective working period, are also derived to provide a more comprehensive description of the system's reliability. Thanks to the power of the theory of aggregated stochastic processes, closed-form expressions are obtained for all the reliability indexes and time distributions. Finally, the developed indexes and analysis methods are demonstrated by a numerical example. - Highlights: • Markovian history-dependent repairable systems with neglected failures are modeled. • Aggregated stochastic processes are used to derive reliability indexes and time distributions. • Closed-form expressions are derived for the considered indexes and distributions.
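
    As a hedged numerical companion (a three-state toy, not the paper's aggregated-process model), steady-state availability for a Markov repairable system is the stationary probability mass on the working states, obtained from the generator matrix:

    ```python
    # Three-state toy: working / neglected (short) failure / real failure.
    # States and rates are invented for illustration.
    import numpy as np

    Q = np.array([[-0.3,  0.2,  0.1],     # failure rates out of working
                  [ 5.0, -5.0,  0.0],     # quick repair of a "neglected" failure
                  [ 0.5,  0.0, -0.5]])    # ordinary repair
    A = np.vstack([Q.T, np.ones(3)])      # solve pi @ Q = 0 with sum(pi) = 1
    pi = np.linalg.lstsq(A, np.r_[np.zeros(3), 1.0], rcond=None)[0]
    print("steady-state availability (state 0):", round(pi[0], 4))
    # If state-1 sojourns are short enough to be neglected, one may also count
    # them as effective up-time: pi[0] + pi[1].
    ```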

  12. Quantifying trading behavior in financial markets using Google Trends.

    Science.gov (United States)

    Preis, Tobias; Moat, Helen Susannah; Stanley, H Eugene

    2013-01-01

    Crises in financial markets affect humans worldwide. Detailed market data on trading decisions reflect some of the complex human behavior that has led to these crises. We suggest that massive new data sources resulting from human interaction with the Internet may offer a new perspective on the behavior of market participants in periods of large market movements. By analyzing changes in Google query volumes for search terms related to finance, we find patterns that may be interpreted as "early warning signs" of stock market moves. Our results illustrate the potential that combining extensive behavioral data sets offers for a better understanding of collective human behavior.
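
    The strategy reported for this study is commonly summarized as follows: compare this week's query volume with its moving average over the previous Δt weeks; sell the index when search volume rises, buy when it falls, and close the position a week later. A sketch on synthetic series (real Google Trends and DJIA data would replace the stand-ins):

    ```python
    # Moving-average search-volume strategy on synthetic stand-in series.
    import numpy as np

    rng = np.random.default_rng(5)
    weeks = 300
    price = np.cumsum(rng.normal(0, 1, weeks)) + 100     # stand-in for the index
    volume = rng.normal(50, 5, weeks)                    # stand-in for query volume
    dt = 3                                               # moving-average window (weeks)

    ret = 0.0
    for t in range(dt, weeks - 1):
        delta_n = volume[t] - volume[t - dt:t].mean()    # change vs moving average
        position = -1.0 if delta_n > 0 else 1.0          # sell on rising volume
        ret += position * (np.log(price[t + 1]) - np.log(price[t]))
    print("cumulative log-return of the strategy:", round(ret, 3))
    ```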

  13. Quantifying and Disaggregating Consumer Purchasing Behavior for Energy Systems Modeling

    Science.gov (United States)

    Consumer behaviors such as energy conservation, adoption of more efficient technologies, and fuel switching represent significant potential for greenhouse gas mitigation. Current efforts to model future energy outcomes have tended to use simplified economic assumptions ...

  14. Nanoscale mechanical stimulation method for quantifying C. elegans mechanosensory behavior and memory

    OpenAIRE

    Kiso, Kaori; Sugi, Takuma; Okumura, Etsuko; Igarashi, Ryuji

    2016-01-01

    Here, we establish a novel economical system to quantify C. elegans mechanosensory behavior and memory by controllable nanoscale mechanical stimulation. Using a piezoelectric sheet speaker, we can flexibly change the vibration properties at a nanoscale displacement level and quantify behavioral responses and memory under the control of each vibration property. This system will facilitate understanding of physiological aspects of C. elegans mechanosensory behavior and memory.

  15. Quantifying cell behaviors in negative-pressure induced monolayer cell movement

    Directory of Open Access Journals (Sweden)

    Shu-Er Chow

    2016-02-01

    Conclusion: A quick membrane ruffling formation, an early cell–substratum separation, and an ensuing decrease in cellular interaction occur in cells under negative pressure (NP). These specific monolayer cell behaviors at NP have been quantified and possibly accelerate wound healing.

  16. Nanoscale Mechanical Stimulation Method for Quantifying C. elegans Mechanosensory Behavior and Memory.

    Science.gov (United States)

    Sugi, Takuma; Okumura, Etsuko; Kiso, Kaori; Igarashi, Ryuji

    2016-01-01

    Withdrawal escape response of C. elegans to nonlocalized vibration is a useful behavioral paradigm to examine mechanisms underlying mechanosensory behavior and its memory-dependent change. However, there are very few methods for investigating the degree of vibration frequency, amplitude and duration needed to induce behavior and memory. Here, we establish a new system to quantify C. elegans mechanosensory behavior and memory using a piezoelectric sheet speaker. In the system, we can flexibly change the vibration properties at a nanoscale displacement level and quantify behavioral responses under each vibration property. The system is economical and easily replicated in other laboratories. By using the system, we clearly detected withdrawal escape responses and confirmed habituation memory. This system will facilitate the understanding of physiological aspects of C. elegans mechanosensory behavior in the future.
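
    One way to quantify the habituation memory this assay targets is to score each animal's escape response per repeated stimulus and track response probability across trials; the synthetic decay below stands in for scored video data, and the three-trial averaging in the index is an arbitrary choice:

    ```python
    # Habituation quantification sketch: 0/1 escape scores per trial per worm.
    import numpy as np

    rng = np.random.default_rng(6)
    n_worms, n_trials = 40, 15
    p_response = 0.9 * np.exp(-np.arange(n_trials) / 6.0) + 0.05   # assumed decay
    responses = rng.random((n_worms, n_trials)) < p_response       # scored escapes

    per_trial = responses.mean(axis=0)
    print("trial 1 response prob:", per_trial[0], " trial 15:", per_trial[-1])
    # habituation index: drop in response probability from early to late trials
    print("habituation index:", round(per_trial[:3].mean() - per_trial[-3:].mean(), 2))
    ```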

  17. History dependent quantum random walks as quantum lattice gas automata

    Energy Technology Data Exchange (ETDEWEB)

    Shakeel, Asif, E-mail: asif.shakeel@gmail.com; Love, Peter J., E-mail: plove@haverford.edu [Department of Physics, Haverford College, Haverford, Pennsylvania 19041 (United States)]; Meyer, David A., E-mail: dmeyer@math.ucsd.edu [Department of Mathematics, University of California/San Diego, La Jolla, California 92093-0112 (United States)]

    2014-12-15

    Quantum Random Walks (QRW) were first defined as one-particle sectors of Quantum Lattice Gas Automata (QLGA). Recently, they have been generalized to include history dependence, either on previous coin (internal, i.e., spin or velocity) states or on previous position states. These models have the goal of studying the transition to classicality, or more generally, changes in the performance of quantum walks in algorithmic applications. We show that several history dependent QRW can be identified as one-particle sectors of QLGA. This provides a unifying conceptual framework for these models in which the extra degrees of freedom required to store the history information arise naturally as geometrical degrees of freedom on the lattice.
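
    A hedged toy example of one kind of history dependence: alongside position and coin registers, a memory register stores the previous coin value and determines which coin operation is applied next (the closing swap makes the update unitary). This is an illustrative construction, not the authors' QLGA identification:

    ```python
    # Coin-history-dependent quantum walk on a line; illustrative only.
    import numpy as np

    n = 61                                    # lattice sites
    psi = np.zeros((n, 2, 2), dtype=complex)  # [position, coin, memory]
    psi[n // 2, 0, 0] = 1.0

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # coin if memory = 0
    R = np.array([[0, 1], [1, 0]], dtype=complex)     # coin if memory = 1 (a flip)

    for _ in range(30):
        # coin step, conditioned on the memory register
        psi[:, :, 0] = psi[:, :, 0] @ H.T
        psi[:, :, 1] = psi[:, :, 1] @ R.T
        # conditional shift: coin 0 moves right, coin 1 moves left
        shifted = np.zeros_like(psi)
        shifted[1:, 0, :] = psi[:-1, 0, :]
        shifted[:-1, 1, :] = psi[1:, 1, :]
        # record the coin just used into the memory register (swap is unitary)
        psi = shifted.swapaxes(1, 2)

    prob = (np.abs(psi) ** 2).sum(axis=(1, 2))
    print("total probability:", round(prob.sum(), 6))  # stays 1 (unitary evolution)
    ```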

  18. Objectively Quantified Physical Activity and Sedentary Behavior in Predicting Visceral Adiposity and Liver Fat

    Directory of Open Access Journals (Sweden)

    Shelley E. Keating

    2016-01-01

    Objective. Epidemiologic studies suggest an inverse relationship between nonalcoholic fatty liver disease (NAFLD), visceral adipose tissue (VAT), and self-reported physical activity levels. However, subjective measurements can be inaccurate and prone to reporter bias. We investigated whether objectively quantified physical activity levels predicted liver fat and VAT in overweight/obese adults. Methods. Habitual physical activity was measured by triaxial accelerometry for four days (n=82). Time spent in sedentary behavior (MET < 1.6) and in light (MET 1.6 to <3), moderate (MET 3 to <6), and vigorous (MET 6 to <9) physical activity was quantified. Magnetic resonance imaging and spectroscopy were used to quantify visceral and liver fat. Bivariate correlations and hierarchical multiple regression analyses were performed. Results. There were no associations between physical activity or sedentary behavior and liver lipid. Sedentary behavior and moderate and vigorous physical activity accounted for just 3% of variance for VAT (p=0.14) and 0.003% for liver fat (p=0.96). Higher levels of VAT were associated with time spent in moderate activity (r=0.294, p=0.007), but there was no association with sedentary behavior. Known risk factors for obesity-related NAFLD accounted for 62% and 40% of variance in VAT and liver fat, respectively (p<0.01). Conclusion. Objectively measured levels of habitual physical activity and sedentary behavior did not influence VAT or liver fat.
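
    The accelerometry step reduces to mapping each epoch's MET value to an intensity category and totaling time per category. A sketch using the abstract's cut-points (the one-minute epoch and the synthetic data are assumptions):

    ```python
    # MET cut-point classification; values of 9 MET or more are not counted
    # (rare in this kind of data).
    import numpy as np

    rng = np.random.default_rng(7)
    mets = rng.gamma(2.0, 0.8, size=4 * 24 * 60)   # one MET value per minute, 4 days
    edges = [0, 1.6, 3, 6, 9]                      # cut-points from the abstract
    labels = ["sedentary", "light", "moderate", "vigorous"]
    counts = np.histogram(mets, bins=edges)[0]
    for name, minutes in zip(labels, counts):
        print(f"{name:9s} {minutes / 4:.0f} min/day")
    ```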

  19. Quantifying the impact of Wellington Zoo's persuasive communication campaign on post-visit behavior.

    Science.gov (United States)

    MacDonald, Edith

    2015-01-01

    Zoos' potential to facilitate visitor conservation behavior is commonly articulated. Few studies, however, have quantified whether zoos' conservation messages result in visitors implementing the behavior. To test whether zoo conservation messages are adopted at home, I implemented a persuasive communication campaign that advocated keeping cats indoors at night, a behavior that is a potential solution to cats depredating native wildlife. Furthermore, I tested whether a public commitment (signing a pledge card) strengthened the relationship between on-site intention to engage in the behavior and actual implementation of the behavior at home. The conservation behavior was included in the twice-daily animal presentations in the amphitheater. A sample of 691 visitors completed a survey as they exited the amphitheater that measured their recall of the conservation behavior and intention to engage in the behavior at home. The last 311 visitors to complete the survey were asked to sign a pledge card, which was publicly displayed in the amphitheater. Six weeks after their zoo trip, visitors were contacted and asked if they had implemented the behavior. Recall of the conservation behavior was high (91% for the control group, 100% for the pledge group), and the entire pledge group had implemented the behavior, whereas just half (51%) of the control group did. Furthermore, signing the pledge card strengthened the relationship between on-site intention and at-home behavior (r = 1.0 for the pledge group and r = 0.21 for the control group). Overall, the zoo's conservation message was recalled and the behavior implemented at home. © 2015 Wiley Periodicals, Inc.

  20. Surface treatment and history-dependent corrosion in lead alloys

    International Nuclear Information System (INIS)

    Li Ning; Zhang Jinsuo; Sencer, Bulent H.; Koury, Daniel

    2006-01-01

    In oxygen-controlled lead and lead-bismuth eutectic (LBE), steel corrosion may be strongly history dependent. This is due to the competition between liquid metal dissolution corrosion and oxidation as a 'self-healing' protection barrier. Such effects can be observed from corrosion testing of a variety of surface-treated materials, such as cold working, shot peening, pre-oxidation, etc. Shot peening of austenitic steels produces surface-layer microstructural damage and grain compression, which could contribute to increased Cr migration to the surface and enhance the protection through an impervious oxide. Pre-oxidation under conditions different from operating ones may form more protective oxides, reduce oxygen and metal ion migration through the oxides, and achieve better protection for longer durations. Corrosion and oxidation modeling and analysis reveal the potential for significantly reducing long-term corrosion rates by initial and early-stage conditioning of steels for Pb/LBE service.

  1. Surface treatment and history-dependent corrosion in lead alloys

    Energy Technology Data Exchange (ETDEWEB)

    Li Ning [Los Alamos National Laboratory, Los Alamos, NM (United States)]. E-mail: ningli@lanl.gov; Zhang Jinsuo [Los Alamos National Laboratory, Los Alamos, NM (United States); Sencer, Bulent H. [Los Alamos National Laboratory, Los Alamos, NM (United States); Koury, Daniel [University of Nevada, Las Vegas, NV (United States)

    2006-06-23

    In oxygen-controlled lead and lead-bismuth eutectic (LBE), steel corrosion may be strongly history dependent. This is due to the competition between liquid metal dissolution corrosion and oxidation as a 'self-healing' protection barrier. Such effects can be observed from corrosion testing of a variety of surface-treated materials, such as cold working, shot peening, pre-oxidation, etc. Shot peening of austenitic steels produces surface-layer microstructural damage and grain compression, which could contribute to increased Cr migration to the surface and enhance the protection through an impervious oxide. Pre-oxidation under conditions different from operating ones may form more protective oxides, reduce oxygen and metal ion migration through the oxides, and achieve better protection for longer durations. Corrosion and oxidation modeling and analysis reveal the potential for significantly reducing long-term corrosion rates by initial and early-stage conditioning of steels for Pb/LBE service.

  2. Mapping urban climate zones and quantifying climate behaviors - An application on Toulouse urban area (France)

    Energy Technology Data Exchange (ETDEWEB)

    Houet, Thomas, E-mail: thomas.houet@univ-tlse2.fr [GEODE UMR 5602 CNRS, Universite de Toulouse, 5 allee Antonio Machado, 31058 Toulouse Cedex (France); Pigeon, Gregoire [Centre National de Recherches Meteorologiques, Meteo-France/CNRM-GAME, 42 avenue Coriolis, 31057 Toulouse Cedex (France)

    2011-08-15

    Facing the population's concern for its environment and for climatic change, city planners are now considering the urban climate in their planning choices. The use of climatic maps, such as Urban Climate Zone (UCZ) maps, is adapted for this kind of application. The objective of this paper is to demonstrate that the UCZ classification, integrated in the World Meteorological Organization guidelines, first can be automatically determined for sample areas and second is meaningful according to climatic variables. The analysis presented is applied to the Toulouse urban area (France). Results show first that UCZs differentiate according to air and surface temperature. It has been possible to determine the membership of sample areas in a UCZ using landscape descriptors automatically computed with GIS and remote-sensed data. It also emphasizes that the climate behavior and magnitude of a UCZ may vary from winter to summer. Finally, we discuss the influence of climate data and scale of observation on UCZ mapping and climate characterization. - Highlights: We propose a method to map Urban Climate Zones and quantify their climate behaviors. UCZ is an expert-based classification integrated in the WMO guidelines. We classified 26 sample areas and quantified climate behaviors in winter/summer. Results highlight urban heat islands; outskirts are surprisingly the hottest in summer. The influence of scale and climate data on UCZ mapping and climate evaluation is discussed. - This paper presents an automated approach to classify sample areas into UCZs using landscape descriptors and demonstrates that the climate behaviors of UCZs differ.

  3. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (e.g., variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  4. Quantifying Equid Behavior - A Research Ethogram for Free-Roaming Feral Horses

    Science.gov (United States)

    Ransom, Jason I.; Cade, Brian S.

    2009-01-01

    Feral horses (Equus caballus) are globally distributed in free-roaming populations on all continents except Antarctica and occupy a wide range of habitats including forest, grassland, desert, and montane environments. The largest populations occur in Australia and North America and have been the subject of scientific study for decades, yet guidelines and ethograms for feral horse behavioral research are largely absent in the scientific literature. The U.S. Geological Survey (USGS) Fort Collins Science Center conducted research on the influences of the immunocontraceptive porcine zona pellucida (PZP) on feral horse behavior from 2003-2006 in three discrete populations in the American west. These populations were the Little Book Cliffs Wild Horse Range in Colorado, McCullough Peaks Herd Management Area in Wyoming, and Pryor Mountain Wild Horse Range in Montana; the research effort included over 1,800 hours of behavioral observations of 317 adult free-roaming feral horses. An ethogram was developed during the course of this study to facilitate accurate scientific data collection on feral horse behavior, which is often challenging to quantify. By developing this set of discrete behavioral definitions and a set of strict research protocols, scientists were better able to address both applied questions, such as behavioral changes related to fertility control, and theoretical questions, such as understanding networks and dominance hierarchies within social groups of equids.

  5. History-dependent dynamics in a generic model of ion channels - an analytic study

    Directory of Open Access Journals (Sweden)

    Daniel Soudry

    2010-04-01

    Recent experiments have demonstrated that the timescale of adaptation of single neurons and ion channel populations to stimuli slows down as the length of stimulation increases; in fact, no upper bound on temporal timescales seems to exist in such systems. Furthermore, patch-clamp experiments on single ion channels have hinted at the existence of large, mostly unobservable, inactivation state spaces within a single ion channel. This raises the question of the relation between this multitude of inactivation states and the observed behavior. In this work we propose a minimal model for ion channel dynamics which does not assume any specific structure of the inactivation state space. The model is simple enough to render an analytical study possible. This leads to a clear and concise explanation of the experimentally observed exponential history-dependent relaxation in sodium channels in a voltage clamp setting, and shows that their recovery rate from slow inactivation must be voltage dependent. Furthermore, we predict that history-dependent relaxation cannot be created by overly sparse spiking activity. While the model was created with ion channel populations in mind, its simplicity and generality render it a good starting point for modeling similar effects in other systems, and for scaling up to higher levels such as single neurons, which are also known to exhibit multiple timescales.

  6. Inferring Characteristics of Sensorimotor Behavior by Quantifying Dynamics of Animal Locomotion

    Science.gov (United States)

    Leung, KaWai

    Locomotion is one of the most well-studied topics in animal behavioral studies. Much fundamental and clinical research makes use of the locomotion of an animal model to explore various aspects of sensorimotor behavior. In the past, most of these studies focused on population averages of a specific trait due to limitations in data collection and processing power. With recent advances in computer vision and statistical modeling techniques, it is now possible to track and analyze large amounts of behavioral data. In this thesis, I present two projects that aim to infer the characteristics of sensorimotor behavior by quantifying the dynamics of locomotion of the nematode Caenorhabditis elegans and the fruit fly Drosophila melanogaster, shedding light on the statistical dependence between sensing and behavior. In the first project, I investigate the possibility of inferring noxious sensory information from the behavior of Caenorhabditis elegans. I develop a statistical model to infer the heat stimulus level perceived by individual animals from their stereotyped escape responses after stimulation by an IR laser. The model allows quantification of analgesic-like effects of chemical agents or genetic mutations in the worm. At the same time, the method is able to differentiate perturbations of locomotion behavior that go beyond affecting the sensory system. With this model I propose experimental designs that allow statistically significant identification of analgesic-like effects. In the second project, I investigate the relationship of energy budget and stability of locomotion in determining the walking speed distribution of Drosophila melanogaster during aging. The locomotion stability at different age groups is estimated from video recordings using Floquet theory. I calculate the power consumption at different locomotion speeds using a biomechanics model. In conclusion, power consumption, not stability, predicts the locomotion speed distribution at different ages.

  7. Quantifying the contributions of behavioral and biological risk factors to socioeconomic disparities in coronary heart disease incidence: the MORGEN study

    NARCIS (Netherlands)

    Kershaw, Kiarri N.; Droomers, Mariël; Robinson, Whitney R.; Carnethon, Mercedes R.; Daviglus, Martha L.; Verschuren, W. M. Monique

    2013-01-01

    Quantifying the impact of different modifiable behavioral and biological risk factors on socioeconomic disparities in coronary heart disease (CHD) may help inform targeted, population-specific strategies to reduce the unequal distribution of the disease. Previous studies have used analytic

  8. History-dependent excitability as a single-cell substrate of transient memory for information discrimination.

    Directory of Open Access Journals (Sweden)

    Fabiano Baroni

    Neurons react differently to incoming stimuli depending upon their previous history of stimulation. This property can be considered as a single-cell substrate for transient memory, or context-dependent information processing: depending upon the current context that the neuron "sees" through the subset of the network impinging on it in the immediate past, the same synaptic event can evoke a postsynaptic spike or just a subthreshold depolarization. We propose a formal definition of History-Dependent Excitability (HDE) as a measure of the propensity to firing in any moment in time, linking the subthreshold history-dependent dynamics with spike generation. This definition allows the quantitative assessment of the intrinsic memory for different single-neuron dynamics and input statistics. We illustrate the concept of HDE by considering two general dynamical mechanisms: the passive behavior of an Integrate and Fire (IF) neuron, and the inductive behavior of a Generalized Integrate and Fire (GIF) neuron with subthreshold damped oscillations. This framework allows us to characterize the sensitivity of different model neurons to the detailed temporal structure of incoming stimuli. While a neuron with intrinsic oscillations discriminates equally well between input trains with the same or different frequency, a passive neuron discriminates better between inputs with different frequencies. This suggests that passive neurons are better suited to rate-based computation, while neurons with subthreshold oscillations are advantageous in a temporal coding scheme. We also address the influence of intrinsic properties in single-cell processing as a function of input statistics, and show that intrinsic oscillations enhance discrimination sensitivity at high input rates. Finally, we discuss how the recognition of these cell-specific discrimination properties might further our understanding of neuronal network computations and their relationships to the distribution and
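
    A hedged single-neuron sketch of the idea: the same input train produces different subthreshold trajectories, and hence a different propensity to fire, in a passive membrane versus a resonant one with a slow coupled current. The parameters and the peak-voltage proxy for excitability are illustrative, not the paper's HDE definition:

    ```python
    # Passive vs resonant subthreshold dynamics driven by pulse trains.
    import numpy as np

    def drive(freq, t, amp=1.0, width=0.001):
        """Periodic EPSC-like pulse train (1 ms pulses)."""
        return amp * ((t % (1.0 / freq)) < width)

    def peak_voltage(freq, resonant, T=0.5, dt=1e-4):
        v, w, peak = 0.0, 0.0, 0.0
        for ti in np.arange(0, T, dt):
            i_in = drive(freq, ti)
            if resonant:                        # GIF-like: v coupled to slow current w
                dv = (-v - 30.0 * w + 80.0 * i_in) / 0.01
                dw = (v - w) / 0.05
                v, w = v + dt * dv, w + dt * dw
            else:                               # passive leaky integrator, tau = 50 ms
                v += dt * (-v + 80.0 * i_in) / 0.05
            peak = max(peak, v)
        return peak

    for f in (5.0, 20.0):
        print(f"{f:4.0f} Hz  passive peak {peak_voltage(f, False):6.2f}"
              f"   resonant peak {peak_voltage(f, True):6.2f}")
    ```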

  9. Quantifying cognition and behavior in normal aging, mild cognitive impairment, and Alzheimer's disease

    Science.gov (United States)

    Giraldo, Diana L.; Sijbers, Jan; Romero, Eduardo

    2017-11-01

    The diagnosis of Alzheimer's disease (AD) and mild cognitive impairment (MCI) is based on neuropsychological evaluation of the patient. Different cognitive and memory functions are assessed by a battery of tests composed of items devised to specifically evaluate such higher functions. This work aims to identify and quantify the factors that determine performance in neuropsychological evaluation by conducting an Exploratory Factor Analysis (EFA). For this purpose, using data from the Alzheimer's Disease Neuroimaging Initiative (ADNI), EFA was applied to 67 item scores taken from the baseline neuropsychological battery of the three phases of the ADNI study. The factors found are directly related to specific brain functions such as memory, behavior, orientation, or verbal fluency. The identification of factors is followed by the calculation of factor scores given by weighted linear combinations of the item scores.
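
    A sketch of the pipeline on synthetic stand-ins for the 67 ADNI item scores; scikit-learn's FactorAnalysis with varimax rotation is one concrete EFA choice (an assumption, not necessarily the authors' software), and factor scores come out as weighted linear combinations of items:

    ```python
    # EFA sketch on synthetic neuropsychological item scores.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(8)
    n_subjects, n_items, n_factors = 300, 67, 5
    latent = rng.normal(size=(n_subjects, n_factors))      # e.g. memory, fluency, ...
    loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
    items = latent @ loadings + rng.normal(scale=1.0, size=(n_subjects, n_items))

    fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
    scores = fa.fit_transform(items)            # per-subject factor scores
    print("loadings matrix:", fa.components_.shape, " scores:", scores.shape)
    ```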

  10. An experimental method to quantify the impact fatigue behavior of rocks

    International Nuclear Information System (INIS)

    Wu, Bangbiao; Xia, Kaiwen; Kanopoulos, Patrick; Luo, Xuedong

    2014-01-01

    Fatigue failure is an important failure mode of engineering materials. The fatigue behavior of both ductile and brittle materials has been under investigation for many years. While the fatigue failure of ductile materials is well established, only a few studies have been carried out on brittle materials. In addition, most fatigue studies on rocks are conducted under quasi-static loading conditions. To address engineering applications involving repeated blasting, this paper proposes a method to quantify the impact fatigue properties of rocks. In this method, a split Hopkinson pressure bar system is adopted to exert impact load on the sample, which is placed in a specially designed steel sleeve to limit the displacement of the sample and thus to enable the recovery of the rock after each impact. The method is then applied to Laurentian granite, which is fine-grained and isotropic material. The results demonstrate that this is a practicable means to conduct impact fatigue tests on rocks and other brittle solids. (paper)

  11. An experimental method to quantify the impact fatigue behavior of rocks

    Science.gov (United States)

    Wu, Bangbiao; Kanopoulos, Patrick; Luo, Xuedong; Xia, Kaiwen

    2014-07-01

    Fatigue failure is an important failure mode of engineering materials. The fatigue behavior of both ductile and brittle materials has been under investigation for many years. While the fatigue failure of ductile materials is well established, only a few studies have been carried out on brittle materials. In addition, most fatigue studies on rocks are conducted under quasi-static loading conditions. To address engineering applications involving repeated blasting, this paper proposes a method to quantify the impact fatigue properties of rocks. In this method, a split Hopkinson pressure bar system is adopted to exert impact load on the sample, which is placed in a specially designed steel sleeve to limit the displacement of the sample and thus to enable the recovery of the rock after each impact. The method is then applied to Laurentian granite, which is fine-grained and isotropic material. The results demonstrate that this is a practicable means to conduct impact fatigue tests on rocks and other brittle solids.

  12. Quantifying human behavior uncertainties in a coupled agent-based model for water resources management

    Science.gov (United States)

    Hyun, J. Y.; Yang, Y. C. E.; Tidwell, V. C.; Macknick, J.

    2017-12-01

    Modeling human behaviors and decisions in water resources management is a challenging issue due to its complexity and uncertain characteristics, which are affected by both internal factors (such as a stakeholder's beliefs about external information) and external factors (such as future policies and weather/climate forecasts). Stakeholders' decisions regarding how much water they need are usually not entirely rational in real-world cases, so it is not suitable to model their decisions with a centralized (top-down) approach that assumes everyone in a watershed follows the same order or pursues the same objective. Agent-based modeling (ABM) uses a decentralized (bottom-up) approach that allows each stakeholder to make his/her own decision based on his/her own objective and beliefs about the information acquired. In this study, we develop an ABM which incorporates the psychological human decision process through the theory of risk perception. The theory of risk perception quantifies uncertainties in human behaviors and decisions using two sequential methodologies: Bayesian inference and the cost-loss problem. The developed ABM is coupled with a regulation-based water system model, RiverWare (RW), to evaluate different human decision uncertainties in water resources management. The San Juan River Basin in New Mexico is chosen as the case study area, and we define 19 major irrigation districts as water use agents whose primary decision is the irrigated area on an annual basis. This decision is affected by three external factors: 1) upstream precipitation forecasts (potential amount of water availability), 2) violation of the downstream minimum flow (required to support ecosystems), and 3) enforcement of a shortage sharing plan (a policy currently undertaken in the region for drought years). Three beliefs (as internal factors) that correspond to these three external factors are also considered in the modeling framework. The objective of this study is
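
    A hedged sketch of the two-step behavioral rule named in the abstract: the agent updates its belief about shortage with Bayes' rule after seeing a forecast, then applies the classic cost-loss criterion (protect whenever the believed probability exceeds the cost-loss ratio C/L). All numbers and the binary shortage framing are invented, not the San Juan model's parameters:

    ```python
    # Bayesian belief update followed by a cost-loss decision; toy numbers.

    def bayes_update(prior, likelihood_if_true, likelihood_if_false):
        """Posterior P(shortage | forecast) from a prior and forecast likelihoods."""
        evidence = prior * likelihood_if_true + (1 - prior) * likelihood_if_false
        return prior * likelihood_if_true / evidence

    prior_shortage = 0.3                    # agent's belief before the forecast
    p_shortage = bayes_update(prior_shortage,
                              likelihood_if_true=0.8,   # forecast hit rate (assumed)
                              likelihood_if_false=0.2)  # false-alarm rate (assumed)

    cost, loss = 40.0, 100.0                # cost of protecting vs loss if caught short
    if p_shortage < cost / loss:            # classic cost-loss decision rule
        print(f"P(shortage)={p_shortage:.2f} < C/L={cost/loss:.2f}: plant full acreage")
    else:
        print(f"P(shortage)={p_shortage:.2f} >= C/L={cost/loss:.2f}: reduce irrigated area")
    ```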

  13. Cyclic creep, mechanical ratchetting and amplitude history dependence of modified 9Cr-1Mo steel and evaluation of unified constitutive models

    International Nuclear Information System (INIS)

    Tanaka, Eiichi; Yamada, Hiroshi

    1993-01-01

    The purpose of the present paper is to elucidate the inelastic behavior of modified 9Cr-1Mo steel as a candidate material for the next-generation fast breeder reactor and to provide information for the formulation of a unified constitutive model. For this purpose, cyclic creep, mechanical ratchetting and the amplitude history dependence of cyclic hardening were first examined at 550 °C. As a result, systematic cyclic creep and mechanical ratchetting behavior were observed under various loading conditions, and little amplitude history dependence was found. These results were then simulated by three unified constitutive models, i.e. the Chaboche, Bodner-Partom and modified Chaboche models. The simulated results show that these models cannot describe the cyclic creep and mechanical ratchetting behavior with high accuracy, but succeed in describing the inelastic behavior of the amplitude variation experiments. (author)

  14. Thermal history dependence of superconducting properties in La2CuO4+δ

    International Nuclear Information System (INIS)

    Hirayama, T.; Nakagawa, M.; Sumiyama, A.; Oda, Y.

    1998-01-01

    We studied the thermal history dependence of the superconducting properties below/above room temperature (RT) in ceramic La2CuO4+δ with excess oxygen. The phase separation (O-rich phase: superconducting; O-poor phase: antiferromagnetic) was concluded to occur above 373 K, in contrast with the usual report of phase separation around 320 K. As for the superconducting phases, the well-known Tc onset of 32 or 36 K, dependent on thermal history around 200 K, in the samples annealed in high-pressure oxygen gas, was not changed by thermal history between RT and 373 K. The samples electrochemically oxidized at RT included a phase with a high Tc of 45 K, which was not changed by thermal history below RT, and a phase with a low Tc of 32 or 36 K. The 45 K phase was changed into the low-Tc phase by annealing at 373 K. The samples electrochemically oxidized at 333 K, which was accompanied by the diffusion of excess oxygen, showed a gradual change of superconducting behavior: the single low-Tc (32 or 36 K) phase (oxidation time = 24 h), coexistence of the low-Tc phase and the high-Tc (45 K) phase (36 h), and the single high-Tc phase (48 and 72 h). Thus, a single superconducting phase with a high Tc of 45 K has been obtained, which showed metallic behavior in the normal-state resistivity and apparent changes of lattice constant in comparison with that of stoichiometric La2CuO4. (orig.)

  15. History-Dependent Problems with Applications to Contact Models for Elastic Beams

    International Nuclear Information System (INIS)

    Bartosz, Krzysztof; Kalita, Piotr; Migórski, Stanisław; Ochal, Anna; Sofonea, Mircea

    2016-01-01

    We prove an existence and uniqueness result for a class of subdifferential inclusions which involve a history-dependent operator. Then we specialize this result in the study of a class of history-dependent hemivariational inequalities. Problems of such kind arise in a large number of mathematical models which describe quasistatic processes of contact. To provide an example we consider an elastic beam in contact with a reactive obstacle. The contact is modeled with a new and nonstandard condition which involves both the subdifferential of a nonconvex and nonsmooth function and a Volterra-type integral term. We derive a variational formulation of the problem which is in the form of a history-dependent hemivariational inequality for the displacement field. Then, we use our abstract result to prove its unique weak solvability. Finally, we consider a numerical approximation of the model, solve effectively the approximate problems and provide numerical simulations
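
    For readers outside this literature, the defining inequality commonly used for history-dependent operators, together with the Volterra-type integral term mentioned in the abstract, can be written as follows (notation illustrative):

    ```latex
    % Standard defining property of a history-dependent operator S: its value
    % at time t is Lipschitz-controlled by the whole trajectory up to t.
    \[
      \|(\mathcal{S} u_1)(t) - (\mathcal{S} u_2)(t)\|_X
        \le L \int_0^t \|u_1(s) - u_2(s)\|_X \, ds
        \qquad \text{for all } u_1, u_2,\ t \in [0, T].
    \]
    % A typical example is a Volterra-type integral term, as in the contact
    % condition described in the abstract:
    \[
      (\mathcal{S} u)(t) = \int_0^t b(t - s)\, u(s)\, ds .
    \]
    ```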

  16. History-Dependent Problems with Applications to Contact Models for Elastic Beams

    Energy Technology Data Exchange (ETDEWEB)

    Bartosz, Krzysztof; Kalita, Piotr; Migórski, Stanisław; Ochal, Anna, E-mail: ochal@ii.uj.edu.pl [Jagiellonian University, Faculty of Mathematics and Computer Science (Poland); Sofonea, Mircea [Université de Perpignan Via Domitia, Laboratoire de Mathématiques et Physique (France)

    2016-02-15

    We prove an existence and uniqueness result for a class of subdifferential inclusions which involve a history-dependent operator. Then we specialize this result in the study of a class of history-dependent hemivariational inequalities. Problems of such kind arise in a large number of mathematical models which describe quasistatic processes of contact. To provide an example we consider an elastic beam in contact with a reactive obstacle. The contact is modeled with a new and nonstandard condition which involves both the subdifferential of a nonconvex and nonsmooth function and a Volterra-type integral term. We derive a variational formulation of the problem which is in the form of a history-dependent hemivariational inequality for the displacement field. Then, we use our abstract result to prove its unique weak solvability. Finally, we consider a numerical approximation of the model, solve effectively the approximate problems and provide numerical simulations.

  17. Response of single bacterial cells to stress gives rise to complex history dependence at the population level

    Science.gov (United States)

    Mathis, Roland; Ackermann, Martin

    2016-01-01

    Most bacteria live in ever-changing environments where periods of stress are common. One fundamental question is whether individual bacterial cells have an increased tolerance to stress if they recently have been exposed to lower levels of the same stressor. To address this question, we worked with the bacterium Caulobacter crescentus and asked whether exposure to a moderate concentration of sodium chloride would affect survival during later exposure to a higher concentration. We found that the effects measured at the population level depended in a surprising and complex way on the time interval between the two exposure events: The effect of the first exposure on survival of the second exposure was positive for some time intervals but negative for others. We hypothesized that the complex pattern of history dependence at the population level was a consequence of the responses of individual cells to sodium chloride that we observed: (i) exposure to moderate concentrations of sodium chloride caused delays in cell division and led to cell-cycle synchronization, and (ii) whether a bacterium would survive subsequent exposure to higher concentrations was dependent on the cell-cycle state. Using computational modeling, we demonstrated that indeed the combination of these two effects could explain the complex patterns of history dependence observed at the population level. Our insight into how the behavior of single cells scales up to processes at the population level provides a perspective on how organisms operate in dynamic environments with fluctuating stress exposure. PMID:26960998

  18. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired, to the extent that survival in the field would be jeopardized, as fish would be swept downstream or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures, to determine time to effect during early life stage development, and to understand the significance of these responses relative to survival of these vulnerable early life stage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  19. Timing of transients: quantifying reaching times and transient behavior in complex systems

    Science.gov (United States)

    Kittel, Tim; Heitzig, Jobst; Webster, Kevin; Kurths, Jürgen

    2017-08-01

    In dynamical systems, one may ask how long it takes for a trajectory to reach the attractor, i.e. how long it spends in the transient phase. Although for a single trajectory the mathematically precise answer may be infinity, it still makes sense to compare different trajectories and quantify which of them approaches the attractor earlier. In this article, we categorize several problems of quantifying such transient times. To treat them, we propose two metrics, area under distance curve and regularized reaching time, that capture two complementary aspects of transient dynamics. The first, area under distance curve, is the distance of the trajectory to the attractor integrated over time. It measures which trajectories are ‘reluctant’, i.e. stay distant from the attractor for long, or ‘eager’ to approach it right away. Regularized reaching time, on the other hand, quantifies the additional time (positive or negative) that a trajectory starting at a chosen initial condition needs to approach the attractor as compared to some reference trajectory. A positive or negative value means that it approaches the attractor by this much ‘earlier’ or ‘later’ than the reference, respectively. We demonstrate their substantial potential for application with multiple paradigmatic examples, uncovering new features.
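
    Both metrics are easy to compute numerically for a toy system. The sketch below, assuming the one-dimensional system dx/dt = -x with the attractor at x = 0, integrates the distance to the attractor over time (area under distance curve) and differences the times at which trajectories enter a small epsilon-ball around the attractor (regularized reaching time); the arbitrary epsilon cancels in the difference.

```python
import numpy as np

def trajectory(x0, dt=1e-3, t_max=40.0):
    # dx/dt = -x (attractor at x = 0), solved by explicit Euler.
    return x0 * (1.0 - dt) ** np.arange(int(t_max / dt)), dt

def area_under_distance_curve(x0):
    x, dt = trajectory(x0)
    return np.sum(np.abs(x)) * dt            # integral of distance over time

def reaching_time(x0, eps=1e-6):
    x, dt = trajectory(x0)
    return np.argmax(np.abs(x) < eps) * dt   # first entry into the eps-ball

def regularized_reaching_time(x0, x_ref=1.0, eps=1e-6):
    # Extra time needed relative to a reference trajectory; eps cancels in
    # the difference, which is the point of the regularization.
    return reaching_time(x0, eps) - reaching_time(x_ref, eps)

for x0 in (0.5, 1.0, 2.0, 4.0):
    print(f"x0 = {x0}: area = {area_under_distance_curve(x0):.3f}, "
          f"t_reg = {regularized_reaching_time(x0):+.3f} "
          f"(analytic: area = x0, t_reg = ln x0 = {np.log(x0):+.3f})")
```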

  20. History-dependent friction and slow slip from time-dependent microscopic junction laws studied in a statistical framework.

    Science.gov (United States)

    Thøgersen, Kjetil; Trømborg, Jørgen Kjoshagen; Sveinsson, Henrik Andersen; Malthe-Sørenssen, Anders; Scheibert, Julien

    2014-05-01

    To study how macroscopic friction phenomena originate from microscopic junction laws, we introduce a general statistical framework describing the collective behavior of a large number of individual microjunctions forming a macroscopic frictional interface. Each microjunction can switch in time between two states: a pinned state characterized by a displacement-dependent force and a slipping state characterized by a time-dependent force. Instead of tracking each microjunction individually, the state of the interface is described by two coupled distributions for (i) the stretching of pinned junctions and (ii) the time spent in the slipping state. This framework allows for a whole family of microjunction behavior laws, and we show how it represents an overarching structure for many existing models found in the friction literature. We then use this framework to pinpoint the effects of the time scale that controls the duration of the slipping state. First, we show that the model reproduces a series of friction phenomena already observed experimentally. The macroscopic steady-state friction force is velocity dependent, either monotonic (strengthening or weakening) or nonmonotonic (weakening-strengthening), depending on the microscopic behavior of individual junctions. In addition, slow slip, which has been reported in a wide variety of systems, spontaneously occurs in the model if the friction contribution from junctions in the slipping state is time weakening. Next, we show that the model predicts a nontrivial history dependence of the macroscopic static friction force. In particular, the static friction coefficient at the onset of sliding is shown to increase with increasing deceleration during the final phases of the preceding sliding event. We suggest that this form of history dependence of static friction should be investigated in experiments, and we provide the acceleration range in which this effect is expected to be experimentally observable.
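
    The two-state junction picture can be caricatured in a few lines of code. The Monte Carlo sketch below uses an invented pinned-state law (a linear spring up to a depinning force) and an invented slipping-state duration (exponential, with zero force while slipping), so it illustrates only the bookkeeping of the framework, not the paper's general family of laws. With these particular choices the mean friction force decreases with driving velocity (velocity weakening).

```python
import numpy as np

rng = np.random.default_rng(1)

def steady_friction(v, n=500, k=1.0, f_pin=1.0, t_slip=2.0, dt=0.02,
                    steps=8_000):
    # Each junction is pinned (force = k * stretch, growing as the slider
    # moves) until its force reaches f_pin; it then slips for a random time
    # ~ Exp(t_slip), contributing zero force, before repinning unstretched.
    stretch = rng.random(n) * f_pin / k    # desynchronized initial stretches
    slip_left = np.zeros(n)                # remaining slip time (0 = pinned)
    force = np.empty(steps)
    for s in range(steps):
        pinned = slip_left <= 0.0
        stretch[pinned] += v * dt                 # slider drags pinned junctions
        depin = pinned & (k * stretch >= f_pin)   # threshold reached
        slip_left[depin] = rng.exponential(t_slip, depin.sum())
        stretch[depin] = 0.0
        slip_left[~pinned] -= dt
        force[s] = k * stretch[pinned & ~depin].sum() / n
    return force[steps // 2:].mean()       # discard the initial transient

for v in (0.05, 0.2, 0.5, 2.0):
    print(f"v = {v}: mean friction force per junction = {steady_friction(v):.3f}")
```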

  1. Identifying and Quantifying Emergent Behavior Through System of Systems Modeling and Simulation

    Science.gov (United States)

    2015-09-01

    This dissertation examines the similarities and differences between Agent Based Modeling (ABM) and Equation Based Modeling (EBM). Both modeling approaches "simulate a system by ... entities." For the latter difference, EBM focuses on system-level observables, while ABM defines behaviors at the individual agent level and observes ... (Identifying and Quantifying Emergent Behavior Through System of Systems Modeling and Simulation, by Mary Ann Cummings, September 2015; dissertation supervisor: Man-Tak Shing.)

  2. A new differential equations-based model for nonlinear history-dependent magnetic behaviour

    International Nuclear Information System (INIS)

    Aktaa, J.; Weth, A. von der

    2000-01-01

    The paper presents a new kind of numerical model describing nonlinear magnetic behaviour. The model is formulated as a set of differential equations taking into account history-dependence phenomena, such as magnetisation hysteresis, as well as saturation effects. The capability of the model is demonstrated by carrying out comparisons between measurements and calculations.
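
    The abstract does not give the model's equations, so as a stand-in the sketch below implements a classical Duhem-type rate law with the same two ingredients: a differential equation whose response depends on the history of the field sweep (hysteresis) and a saturating anhysteretic curve (saturation). All functional forms and parameter values are illustrative, not the authors' formulation.

```python
import numpy as np

def m_anhysteretic(h, m_sat=1.0, a=0.5):
    # Saturating anhysteretic magnetization curve (tanh stands in for a
    # Langevin-type law).
    return m_sat * np.tanh(h / a)

def duhem_step(m, h, dh, k=0.15):
    # Duhem-type rate law: M relaxes toward the anhysteretic curve at a
    # rate proportional to |dH|, which makes the response depend on the
    # sweep history and produces a hysteresis loop.
    return m + (abs(dh) / k) * (m_anhysteretic(h) - m)

t = np.linspace(0, 6 * np.pi, 6000)
h = 1.5 * np.sin(t)                  # cyclic field drive
m = np.zeros_like(h)
for i in range(1, len(h)):
    m[i] = duhem_step(m[i - 1], h[i], h[i] - h[i - 1])

# Loop opening: M at H ~ 0 differs between the rising and falling branches.
last = slice(4000, None)             # last cycle, initial transient skipped
hc, mc, dh = h[last], m[last], np.diff(h[last], prepend=h[last][0])
near_zero = np.abs(hc) < 0.02
print(f"remanence, rising branch:  {mc[near_zero & (dh > 0)].mean():+.3f}")
print(f"remanence, falling branch: {mc[near_zero & (dh < 0)].mean():+.3f}")
```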

  3. Quantifying Spiral Ganglion Neurite and Schwann Behavior on Micropatterned Polymer Substrates.

    Science.gov (United States)

    Cheng, Elise L; Leigh, Braden; Guymon, C Allan; Hansen, Marlan R

    2016-01-01

    The first successful in vitro experiments on the cochlea were conducted in 1928 by Honor Fell (Fell, Arch Exp Zellforsch 7(1):69-81, 1928). Since then, techniques for culture of this tissue have been refined, and dissociated primary culture of the spiral ganglion has become a widely accepted in vitro model for studying nerve damage and regeneration in the cochlea. Additionally, patterned substrates have been developed that facilitate and direct neural outgrowth. A number of automated and semi-automated methods for quantifying this neurite outgrowth have been utilized in recent years (Zhang et al., J Neurosci Methods 160(1):149-162, 2007; Tapias et al., Neurobiol Dis 54:158-168, 2013). Here, we describe a method to study the effect of topographical cues on spiral ganglion neurite and Schwann cell alignment. We discuss our microfabrication process, characterization of pattern features, and cell culture techniques for both spiral ganglion neurons and spiral ganglion Schwann cells. In addition, we describe protocols for reducing the fibroblast count, immunocytochemistry, and methods for quantifying neurite and Schwann cell alignment.

  4. A Brief Report: Quantifying and Correlating Social Behaviors in Children with Autism Spectrum Disorders

    Science.gov (United States)

    Johnson, Ashley L.; Gillis, Jennifer M.; Romanczyk, Raymond G.

    2012-01-01

    The current study investigated social behaviors, including initiating joint attention (IJA), responding to joint attention (RJA), social orienting, and imitation in 14 children with an autism spectrum disorder (ASD) compared to 12 typically developing children (TD). Results indicated that IJA and RJA were positively correlated with social…

  5. Quantifying voids effecting delamination in carbon/epoxy composites: static and fatigue fracture behavior

    Science.gov (United States)

    Hakim, I.; May, D.; Abo Ras, M.; Meyendorf, N.; Donaldson, S.

    2016-04-01

    In the present work, samples of carbon fiber/epoxy composites with different void levels were fabricated using a hand layup vacuum bagging process by varying the pressure. Thermal nondestructive methods (thermal conductivity measurement, pulse thermography, pulse phase thermography, and lock-in thermography) and mechanical testing (mode I and mode II interlaminar fracture toughness) were conducted. Comparing the parameters resulting from the thermal nondestructive testing revealed that voids lead to reductions in thermal properties in all directions of the composites. The results of mode I and mode II interlaminar fracture toughness testing showed that voids lead to reductions in interlaminar fracture toughness. The parameters resulting from thermal nondestructive testing were correlated to the results of mode I and mode II interlaminar fracture toughness, and the voids were thereby quantified.

  6. Odontocete Cetaceans: Quantifying Behavioral Ecology and Response to Predators Using a Multi-Species Approach

    Science.gov (United States)

    2016-03-21

    Behaviour 144(11): 1315–1332. http://doi.org/10.1163/156853907782418213 Madsen, P.T., M. Wahlberg, J. Tougaard, and K. Lucke. 2006. Wind turbine ... individual in small group (Stanford 2002). Increased vigilance and diluted predation risk is often cited as a factor promoting sociality in birds ... also be used to recruit conspecifics to engage in mobbing behavior in both mammals and birds (Curio et al. 1978; Tamura 1989). Actual predation ...

  7. Quantifying Physiological, Behavioral and Ecological Consequences of Hypoxic Events in Kelp Forest

    Science.gov (United States)

    Litvin, S. Y.; Beers, J. M.; Woodson, C. B.; Leary, P.; Fringer, O. B.; Goldbogen, J. A.; Micheli, F.; Monismith, S. G.; Somero, G. N.

    2016-02-01

    Rocky reef kelp forests that extend along the coast of central California, like many habitats in upwelling systems, often experience inundations of low dissolved oxygen (DO), or hypoxic, waters. These events have the potential to influence the structure and function of coastal ecosystems. The ecological consequences of hypoxia for these systems will be mediated by the physiological thresholds and behavioral responses of resident organisms in the context of the spatial and temporal variability of DO and other potential stressors. Our research focuses on Sebastes (i.e., rockfish) because of their commercial, recreational, and ecological importance, their high abundance across nearshore habitats, and the potentially severe impacts of physiological stress due to hypoxia. In the lab, to investigate how hypoxic events physiologically affect rockfish, we exposed young of the year (YOY) of 5 species, and two life stages of blue rockfish, S. mystinus (YOY and 1+), to DO concentrations representative of upwelling conditions and measured a suite of whole-organism and tissue-level responses, including metabolic rate, ventilation, tissue-level metabolism, and blood biochemistry. Results demonstrate species- and life stage-specific differences in physiological stress under upwelling-driven hypoxic conditions and suggest YOY rockfishes may currently be living near their physiological limits. In the laboratory we further explored whether physiological impacts result in behavioral consequences by examining the startle response of YOY rockfish, a relative measure of predator avoidance ability, under a range of DO concentrations and exposure durations. To further explore behavioral responses of rockfish to low DO within the kelp forest, we are using two approaches: monitoring the vertical distribution of fish communities across the water column using an acoustic imaging camera (ARIS 3000, Soundmetrics Inc.), and acoustic tagging, with 3-D positioning ability (VPS, VEMCO Inc.), of larger blue rockfish.

  8. Properly quantized history-dependent Parrondo games, Markov processes, and multiplexing circuits

    Energy Technology Data Exchange (ETDEWEB)

    Bleiler, Steven A. [Fariborz Maseeh Department of Mathematics and Statistics, Portland State University, PO Box 751, Portland, OR 97207 (United States); Khan, Faisal Shah, E-mail: faisal.khan@kustar.ac.a [Khalifa University of Science, Technology and Research, PO Box 127788, Abu Dhabi (United Arab Emirates)

    2011-05-09

    Highlights: • History-dependent Parrondo games are viewed as Markov processes. • Quantum mechanical analogues of these Markov processes are constructed. • These quantum analogues restrict to the original process on measurement. • The relationship between these analogues and quantum circuits is exhibited. - Abstract: In the context of quantum information theory, 'quantization' of various mathematical and computational constructions is said to occur upon the replacement, at various points in the construction, of the classical randomization notion of probability distribution with higher-order randomization notions from quantum mechanics, such as quantum superposition with measurement. For this to be done 'properly', a faithful copy of the original construction is required to exist within the new quantum one, just as is required when a function is extended to a larger domain. Here, procedures for extending history-dependent Parrondo games, Markov processes and multiplexing circuits to their quantum versions are analyzed from a game-theoretic viewpoint, and from this viewpoint proper quantizations are developed.

  9. Mapping urban climate zones and quantifying climate behaviors--an application on Toulouse urban area (France).

    Science.gov (United States)

    Houet, Thomas; Pigeon, Grégoire

    2011-01-01

    Facing the concern of the population for its environment and for climate change, city planners are now considering the urban climate in their planning choices. The use of climatic maps, such as Urban Climate Zone (UCZ) maps, is well adapted to this kind of application. The objective of this paper is to demonstrate that the UCZ classification, integrated in the World Meteorological Organization guidelines, first, can be automatically determined for sample areas and, second, is meaningful with respect to climatic variables. The analysis presented is applied to the Toulouse urban area (France). Results show, first, that UCZs differentiate according to air and surface temperature. It has been possible to determine the membership of sample areas in a UCZ using landscape descriptors automatically computed with GIS and remotely sensed data. The results also emphasize that the climatic behavior and magnitude of UCZs may vary from winter to summer. Finally, we discuss the influence of climate data and scale of observation on UCZ mapping and climate characterization.

  10. Automatic MRI Quantifying Methods in Behavioral-Variant Frontotemporal Dementia Diagnosis

    Directory of Open Access Journals (Sweden)

    Antti Cajanus

    2018-02-01

    Aims: We assessed the value of automated MRI quantification methods in the differential diagnosis of behavioral-variant frontotemporal dementia (bvFTD) from Alzheimer disease (AD), Lewy body dementia (LBD), and subjective memory complaints (SMC). We also examined the role of C9ORF72-related genetic status in differentiation sensitivity. Methods: The MRI scans of 50 patients with bvFTD (17 C9ORF72 expansion carriers) were analyzed using 6 quantification methods as follows: voxel-based morphometry (VBM), tensor-based morphometry, volumetry (VOL), manifold learning, grading, and white-matter hyperintensities. Each patient was then individually compared to an independent reference group in order to attain diagnostic suggestions. Results: Only VBM and VOL showed utility in correctly identifying bvFTD in our data set. The overall classification sensitivity for bvFTD with VOL + VBM was 60%. Using VOL + VBM, 32% of patients were misclassified as having LBD. There was a trend toward higher classification sensitivity for C9ORF72 expansion carriers than for noncarriers. Conclusion: VOL, VBM, and their combination are effective in differential diagnostics between bvFTD and AD or SMC. However, MRI atrophy profiles for bvFTD and LBD are too similar for reliable differentiation with the quantification methods tested in this study.

  11. Quantifying Transmission.

    Science.gov (United States)

    Woolhouse, Mark

    2017-07-01

    Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
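
    For the simplest mathematical setting mentioned here, model fitting reduces to textbook arithmetic: in an SIR model the early epidemic grows as exp((beta - gamma) t), so a growth rate fitted from early case counts, together with an assumed infectious period, yields the per capita transmission rate beta and the basic reproduction number R0 = beta/gamma. The sketch below, with synthetic data and an assumed 5-day infectious period, is a generic illustration rather than a method from the paper.

```python
import numpy as np

def estimate_r0(case_counts, dt=1.0, infectious_period=5.0):
    """In an SIR model the early epidemic grows as exp((beta - gamma) t),
    so a log-linear fit of early case counts gives the growth rate r,
    beta = r + gamma, and R0 = beta / gamma. Textbook sketch; the
    infectious period is an assumed input."""
    gamma = 1.0 / infectious_period
    t = np.arange(len(case_counts)) * dt
    r = np.polyfit(t, np.log(case_counts), 1)[0]
    return (r + gamma) / gamma

# Synthetic early epidemic: 20% daily growth plus observation noise.
rng = np.random.default_rng(42)
days = np.arange(15)
cases = 10 * np.exp(0.2 * days) * rng.lognormal(0.0, 0.05, days.size)
print(f"estimated R0 = {estimate_r0(cases):.2f}")   # ~ (0.2 + 0.2) / 0.2 = 2
```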

  12. Using Data From the Microsoft Kinect 2 to Quantify Upper Limb Behavior: A Feasibility Study.

    Science.gov (United States)

    Dehbandi, Behdad; Barachant, Alexandre; Harary, David; Long, John Davis; Tsagaris, K Zoe; Bumanlag, Silverio Joseph; He, Victor; Putrino, David

    2017-09-01

    The objective of this study was to assess whether the novel application of a machine learning approach to data collected from the Microsoft Kinect 2 (MK2) could be used to classify differing levels of upper limb impairment. Twenty-four healthy subjects completed items of the Wolf Motor Function Test (WMFT), which is a clinically validated metric of upper limb function for stroke survivors. Subjects completed the WMFT three times: 1) as a healthy individual; 2) emulating mild impairment; and 3) emulating moderate impairment. A MK2 was positioned in front of participants, and collected kinematic data as they completed the WMFT. A classification framework, based on Riemannian geometry and the use of covariance matrices as feature representation of the MK2 data, was developed for these data, and its ability to successfully classify subjects as either "healthy," "mildly impaired," or "moderately impaired" was assessed. Mean accuracy for our classifier was 91.7%, with a specific accuracy breakdown of 100%, 83.3%, and 91.7% for the "healthy," "mildly impaired," and "moderately impaired" conditions, respectively. We conclude that data from the MK2 is of sufficient quality to perform objective motor behavior classification in individuals with upper limb impairment. The data collection and analysis framework that we have developed has the potential to disrupt the field of clinical assessment. Future studies will focus on validating this protocol on large populations of individuals with actual upper limb impairments in order to create a toolkit that is clinically validated and available to the clinical community.
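
    The feature pipeline described (covariance matrices of kinematic channels, compared with a Riemannian metric) can be sketched compactly. The code below uses the log-Euclidean tangent-space distance with a nearest-class-mean rule, which is one common Riemannian scheme for SPD features but not necessarily the authors' exact classifier; the data generator and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def logm_spd(c):
    # Matrix logarithm of a symmetric positive-definite matrix via eigh.
    w, v = np.linalg.eigh(c)
    return (v * np.log(w)) @ v.T

def cov_feature(trial, eps=1e-6):
    # trial: (n_samples, n_channels) kinematic series -> SPD covariance.
    return np.cov(trial, rowvar=False) + eps * np.eye(trial.shape[1])

def make_trial(label):
    # Invented generator: "impaired" movement channels share a strong
    # common component; "healthy" channels are nearly independent.
    base = rng.normal(size=(200, 4))
    if label == "impaired":
        base = 0.5 * base + 1.5 * rng.normal(size=(200, 1))
    return base

# Class means in the log-Euclidean tangent space; classification is
# nearest class mean under the Frobenius norm in that space.
means = {lab: np.mean([logm_spd(cov_feature(make_trial(lab)))
                       for _ in range(20)], axis=0)
         for lab in ("healthy", "impaired")}

def classify(trial):
    f = logm_spd(cov_feature(trial))
    return min(means, key=lambda lab: np.linalg.norm(f - means[lab]))

print(classify(make_trial("healthy")), classify(make_trial("impaired")))
```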

  13. Tracing Mantle Plumes: Quantifying their Morphology and Behavior from Seismic Tomography

    Science.gov (United States)

    O'Farrell, K. A.; Eakin, C. M.; Jones, T. D.; Garcia, E.; Robson, A.; Mittal, T.; Lithgow-Bertelloni, C. R.; Jackson, M. G.; Lekic, V.; Rudolph, M. L.

    2016-12-01

    Hotspot volcanism provides a direct link between the deep mantle and the surface, but the location, depth and source of the mantle plumes that feed hotspots are highly controversial. In order to address this issue it is important to understand the journey along which plumes have travelled through the mantle. The general behavior of plumes in the mantle also has the potential to tell us about the vigor of mantle convection, net rotation of the mantle, the role of thermal versus chemical anomalies, and important bulk physical properties of the mantle such as the viscosity profile. To address these questions we developed an algorithm to trace plume-like features in shear-wave (Vs) seismic tomographic models based on picking local minima in velocity and searching for continuous features with depth. We apply this method to several of the latest tomographic models and can recover 30 or more continuous plume conduits that are >750 km long. Around half of these can be associated with a known hotspot at the surface. We study the morphology of these plume chains and find that the largest lateral deflections occur near the base of the lower mantle and in the upper mantle. We analyze the preferred orientation of the plume deflections and their gradient to infer large scale mantle flow patterns and the depth of viscosity contrasts in the mantle respectively. We also retrieve Vs profiles for our traced plumes and compare with velocity profiles predicted for different mantle adiabat temperatures. We use this to constrain the thermal anomaly associated with these plumes. This thermal anomaly is then converted to a density anomaly and an upwelling velocity is derived. We compare this to buoyancy fluxes calculated at the surface and use this in conjunction with our measured plume tilts/deflections to estimate the strength of the "mantle wind".
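
    The picking-and-linking idea at the core of such an algorithm can be sketched as follows: find local velocity minima in each depth slice, then chain minima across depth when they are laterally close. The anomaly threshold, linking distance, and synthetic volume below are invented; the authors' algorithm will differ in detail.

```python
import numpy as np

def local_minima(slice2d, threshold):
    # Interior grid points lower than all 4 neighbours and below threshold.
    c = slice2d[1:-1, 1:-1]
    mask = ((c < slice2d[:-2, 1:-1]) & (c < slice2d[2:, 1:-1]) &
            (c < slice2d[1:-1, :-2]) & (c < slice2d[1:-1, 2:]) &
            (c < threshold))
    ii, jj = np.nonzero(mask)
    return list(zip(ii + 1, jj + 1))

def trace_conduits(vs, threshold=-0.5, max_jump=2.0, min_len=10):
    # Link slow-velocity minima across depth slices into conduits: a
    # minimum at depth k extends a conduit whose point at depth k-1 lies
    # within max_jump grid cells laterally.
    conduits = []
    for k in range(vs.shape[0]):
        for p in local_minima(vs[k], threshold):
            for c in conduits:
                last_k, last_p = c[-1]
                if last_k == k - 1 and np.hypot(p[0] - last_p[0],
                                                p[1] - last_p[1]) <= max_jump:
                    c.append((k, p))
                    break
            else:
                conduits.append([(k, p)])
    return [c for c in conduits if len(c) >= min_len]

# Synthetic anomaly volume: noise plus one slow conduit tilting with depth.
rng = np.random.default_rng(7)
nz = 20
vol = rng.normal(0.0, 0.05, (nz, 30, 30))
for k in range(nz):
    vol[k, 10 + k // 4, 15] -= 1.0
print([len(c) for c in trace_conduits(vol)])  # one conduit spanning all depths
```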

  14. Critical current characteristics and history dependence in superconducting SmFeAsOF bulk

    International Nuclear Information System (INIS)

    Ni, B; Ge, J; Kiuchi, M; Otabe, E S; Gao, Z; Wang, L; Qi, Y; Zhang, X; Ma, Y

    2010-01-01

    The superconducting SmFeAsO1-xFx (x = 0.2) polycrystalline bulks were prepared by the powder-in-tube (PIT) method. The magnetic field and temperature dependences of the critical current densities in the samples were investigated by resistive and ac inductive (Campbell's) methods. It was found that a fairly large shielding current density of over 10^9 A/m^2, which is considered to correspond to the local critical current density, flows locally with a perimeter size similar to the average grain size of the bulk samples, while an extremely low transport current density of about 10^5 A/m^2, corresponding to the global critical current density, flows through the whole sample. Furthermore, a unique history dependence of the global critical current density was observed: it shows a smaller value in the increasing-field process than in the decreasing-field process. The history dependence of the global critical current characteristic in our case can be ascribed to the existence of the weak-link property between the grains in the SmFeAsO1-xFx bulk.

  15. History-dependent nonlinear dissipation in superfluid 3He-A

    International Nuclear Information System (INIS)

    Gay, R.; Bagley, M.; Hook, J.R.; Sandiford, D.J.; Hall, H.E.

    1983-01-01

    We have studied nonlinear dissipation in oscillatory flow of 3He-A through 49-μm- and 17-μm-wide channels by means of torsion pendulum experiments at about 50 Hz. The observed effects are strongly history dependent; the dissipation at a given measuring amplitude is strongly increased if the sample is cooled through Tc while oscillating at large amplitude. Once a highly dissipative state has been created it does not noticeably decay below Tc, though a more dissipative state can be created below Tc by a period of sufficiently large-amplitude oscillation. The results are described semiquantitatively by a model based on the idea of superflow collapse by motion of the l vector, with consequent orbital dissipation. The history dependence is introduced into this model by postulating the existence of surface singularities in the l texture, the density of which is determined by the previous history of the helium.

  16. The game-theoretic national interstate economic model : an integrated framework to quantify the economic impacts of cyber-terrorist behavior.

    Science.gov (United States)

    2014-12-01

    This study suggests an integrated framework to quantify cyber attack impacts on the U.S. airport security system. A cyber attack by terrorists on the U.S. involves complex strategic behavior by the terrorists because they could plan to invade an ai...

  17. Thermal-history dependent magnetoelastic transition in (Mn,Fe)2(P,Si)

    Energy Technology Data Exchange (ETDEWEB)

    Miao, X. F., E-mail: x.f.miao@tudelft.nl; Dijk, N. H. van; Brück, E. [Fundamental Aspects of Materials and Energy, Faculty of Applied Sciences, Delft University of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Caron, L. [Fundamental Aspects of Materials and Energy, Faculty of Applied Sciences, Delft University of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Max Planck Institute for Chemical Physics of Solids, Nöthnitzer Straße 40, D-01187 Dresden (Germany); Gercsi, Z. [Blackett Laboratory, Department of Physics, Imperial College London, London SW7 2AZ (United Kingdom); CRANN and School of Physics, Trinity College Dublin, Dublin (Ireland); Daoud-Aladine, A. [ISIS Facility, Rutherford Appleton Laboratory, Chilton, Didcot, Oxfordshire OX11 0QX (United Kingdom)

    2015-07-27

    The thermal-history dependence of the magnetoelastic transition in (Mn,Fe)2(P,Si) compounds has been investigated using high-resolution neutron diffraction. As-prepared samples display a large difference in paramagnetic-ferromagnetic (PM-FM) transition temperature compared to cycled samples. The initial metastable state transforms into a lower-energy stable state when the as-prepared sample crosses the PM-FM transition for the first time. This additional transformation is irreversible around the transition temperature and increases the energy barrier which needs to be overcome through the PM-FM transition. Consequently, the transition temperature on first cooling is found to be lower than on subsequent cycles, characterizing the so-called “virgin effect.” High-temperature annealing can restore the cycled sample to the high-temperature metastable state, which leads to the recovery of the virgin effect. A model is proposed to interpret the formation and recovery of the virgin effect.

  18. Life history dependent morphometric variation in stream-dwelling Atlantic salmon

    Science.gov (United States)

    Letcher, B.H.

    2003-01-01

    The time course of morphometric variation among life histories for stream-dwelling Atlantic salmon (Salmo salar L.) parr (age-0+ to age-2+) was analyzed. Possible life histories were combinations of parr maturity status in the autumn (mature or immature) and age at outmigration (smolt at age-2+ or later age). Actual life histories expressed with enough fish for analysis in the 1997 cohort were immature/age-2+ smolt, mature/age-2+ smolt, and mature/age-2+ non-smolt. Tagged fish were assigned to one of the three life histories and digital pictures from the field were analyzed using landmark-based geometric morphometrics. Results indicated that successful grouping of fish according to life history varied with fish age, but that fish could be grouped before the actual expression of the life histories. By March (age-1+), fish were successfully grouped using a descriptive discriminant function and successful assignment ranged from 84 to 97% for the remainder of stream residence. A jackknife of the discriminant function revealed an average life history prediction success of 67% from age-1+ summer to smolting. Low sample numbers for one of the life histories may have limited prediction success. A MANOVA on the shape descriptors (relative warps) also indicated significant differences in shape among life histories from age-1+ summer through to smolting. Across all samples, shape varied significantly with size. Within samples, shape did not vary significantly with size for samples from December (age-0+) to May (age-1+). During the age-1+ summer however, shape varied significantly with size, but the relationship between shape and size was not different among life histories. In the autumn (age-1+) and winter (age-2+), life history differences explained a significant portion of the change in shape with size. Life history dependent morphometric variation may be useful to indicate the timing of early expressions of life history variation and as a tool to explore temporal and ...

  19. Quantifying Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Quantifying Matter explains how scientists learned to measure matter and quantify some of its most fascinating and useful properties. It presents many of the most important intellectual achievements and technical developments that led to the scientific interpretation of substance. Complete with full-color photographs, this exciting new volume describes the basic characteristics and properties of matter. Chapters include: Exploring the Nature of Matter; The Origin of Matter; The Search for Substance; Quantifying Matter During the Scientific Revolution; and Understanding Matter's Electromagnet ...

  20. Neutron diffraction study of history dependence in MnFeP0.6Si0.4

    International Nuclear Information System (INIS)

    Zhang, L.; Moze, O.; Prokes, K.; Tegus, O.; Brueck, E.

    2005-01-01

    In the MnFe(P,As) compounds, which are promising magnetorefrigerant materials, we have studied the effect of Si substitution and successfully replaced As by Si. Surprisingly, besides all the other changes, a peculiar history dependence of the magnetic phase transition was disclosed. The as-prepared sample shows a significantly lower transition temperature (namely, a virgin TC) than the sample that has experienced thermal cycling. The neutron diffraction patterns recorded during the first cooling manifest the first-order and magnetic-field-induced character of the virgin phase transition. However, refinement of the diffraction patterns does not provide evidence for atomic-position swapping, which might account for this history dependence.

  1. Technical note: Quantifying and characterizing behavior in dairy calves using the IceTag automatic recording device

    DEFF Research Database (Denmark)

    Trénel, P.; Jensen, Margit Bak; Decker, Erik Luc

    2009-01-01

    The objectives of the current study were 1) to validate the IceTag (http://www.icerobotics.com/) automatic recording device for measuring lying, standing, and moving behavior in dairy calves, and 2) to improve the information yield from this device by applying a filtering procedure allowing for the detection of lying versus upright posture. The IceTag device provides measures of intensity (I) of lying, standing, and activity measured as percent lying, percent standing, and percent active, but does not directly measure lying, standing, and moving behavior because of body movements occurring while lying (e...) ... (LPC) was established empirically, and IceTag data were filtered according to the LPC, providing information on the posture of the animal as lying versus being upright. Third, a new threshold of I was estimated for moving activity conditional on the animal being upright. IceTag recordings from 9 calves ...
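
    The filtering idea, turning raw per-minute intensity measures into posture and behavior labels, can be illustrated with a hypothetical reconstruction. The thresholds, smoothing window, and example data below are invented; the study's empirically established criterion is not reproduced here.

```python
import numpy as np

def filter_behavior(pct_lying, pct_active, lying_cut=50.0, active_cut=20.0,
                    window=3):
    """Turn per-minute IceTag intensities into behavior labels (a
    hypothetical reconstruction with invented thresholds):
    1) threshold percent-lying into a provisional lying/upright posture;
    2) smooth with a majority vote over `window` minutes, so brief leg
       movements while lying are not miscounted as standing;
    3) label 'moving' only when upright and percent-active is high."""
    lying = np.asarray(pct_lying, dtype=float) > lying_cut
    votes = np.convolve(lying.astype(float), np.ones(window) / window, "same")
    lying = votes > 0.5
    moving = ~lying & (np.asarray(pct_active, dtype=float) > active_cut)
    return np.where(lying, "lying", np.where(moving, "moving", "standing"))

pct_lying  = [95, 90, 40, 92, 88,  5, 10,  8, 12,  6]
pct_active = [ 2,  1,  5,  3,  2, 30, 40,  5, 35, 25]
print(filter_behavior(pct_lying, pct_active))
```

    On this toy input, the single low-lying minute at position 3 is absorbed into the surrounding lying bout, and the upright minutes are split into moving versus standing by the activity threshold.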

  2. Quantifying the Impacts of Timebased Rates, Enabling Technology, and Other Treatments in Consumer Behavior Studies: Protocols and Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Cappers, Peter [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Todd, Annika [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Perry, Michael [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Neenan, Bernie [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Boisvert, Richard [Ernest Orlando Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2013-06-27

    This report offers guidelines and protocols for measuring the effects of time-based rates, enabling technology, and various other treatments on customers’ levels and patterns of electricity usage. Although the focus is on evaluating consumer behavior studies (CBS) that involve field trials and pilots, the methods can be extended to assessing the large-scale programs that may follow. CBSs are undertaken to resolve uncertainties and ambiguities about how consumers respond to inducements to modify their electricity demand. Those inducements include price structures; feedback and information; and enabling technologies embedded in programs such as critical peak, time-of-use, and real-time pricing; peak-time rebate or critical-peak rebate; home energy reports and in-home displays; and all manner of device controls for appliances and plug loads. Although the focus of this report is on consumer studies—where the subjects are households—the behavioral sciences principles discussed and many of the methods recommended apply equally to studying commercial and industrial customer electricity demand.

  3. Bayesian deterministic decision making: A normative account of the operant matching law and heavy-tailed reward history dependency of choices

    Directory of Open Access Journals (Sweden)

    Hiroshi eSaito

    2014-03-01

    The decision-making behaviors of humans and animals adapt to, and then satisfy, an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn the choice probabilities of the respective alternatives and decide stochastically according to those probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy. To answer this question, we propose several deterministic Bayesian decision-making models that hold certain incorrect beliefs about the environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that a model holding the belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward-history dependency of choices and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
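
    For comparison with the deterministic models proposed here, the conventional stochastic account is easy to simulate: an agent on concurrent variable-interval schedules that estimates each alternative's income and allocates choices in proportion to it settles close to matching, with a slight undermatching of the kind the authors mention. The schedule rates, learning rate, and initial values below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def concurrent_vi(arm_rates=(0.10, 0.05), steps=50_000, lr=0.005):
    """Concurrent variable-interval schedule: each alternative arms with
    probability arm_rates[i] per step and stays armed until chosen. The
    agent tracks each alternative's income (reward per choice) and
    allocates choices in proportion to it."""
    rates = np.asarray(arm_rates)
    armed = np.array([False, False])
    income = np.array([0.5, 0.5])          # invented initial estimates
    n_choices, n_rewards = np.zeros(2), np.zeros(2)
    for _ in range(steps):
        armed |= rng.random(2) < rates     # schedules arm independently
        c = int(rng.random() < income[1] / income.sum())
        r = 1.0 if armed[c] else 0.0
        armed[c] = False
        income[c] += lr * (r - income[c])  # running income estimate
        n_choices[c] += 1
        n_rewards[c] += r
    return n_choices / steps, n_rewards / n_rewards.sum()

choice_frac, reward_frac = concurrent_vi()
print("choice fractions:", np.round(choice_frac, 3))
print("reward fractions:", np.round(reward_frac, 3))
# Matching predicts equal fractions; the simulated choice fractions are
# close to, but slightly less extreme than, the reward fractions
# (undermatching).
```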

  4. Educational differences in postmenopausal breast cancer--quantifying indirect effects through health behaviors, body mass index and reproductive patterns.

    Directory of Open Access Journals (Sweden)

    Ulla Arthur Hvidtfeldt

    Studying mechanisms underlying social inequality in postmenopausal breast cancer is important in order to develop prevention strategies. Standard methods for investigating indirect effects, by comparing crude models to adjusted ones, are often biased. We applied a new method enabling the decomposition of the effect of educational level on breast cancer incidence into indirect effects through reproductive patterns (parity and age at first birth), body mass index, and health behavior (alcohol consumption, physical inactivity, and hormone therapy use). The study was based on a pooled cohort of 6 studies from the Copenhagen area including 33,562 women (1,733 breast cancer cases) aged 50-70 years at baseline. The crude absolute rate of breast cancer was 399 cases per 100,000 person-years. A high educational level compared to low was associated with 74 (95% CI 22-125) extra breast cancer cases per 100,000 person-years at risk. Of these, 26% (95% CI 14%-69%) could be attributed to alcohol consumption. Similar effects were observed for age at first birth (32%; 95% CI 10%-257%), parity (19%; 95% CI 10%-45%), and hormone therapy use (10%; 95% CI 6%-18%). Educational level modified the effect of physical activity on breast cancer. In conclusion, this analysis suggests that a substantial number of the excess postmenopausal breast cancer events among women with a high educational level compared to a low one can be attributed to differences in alcohol consumption, use of hormone therapy, and reproductive patterns. Women of high educational level may be more vulnerable to physical inactivity compared to women of low educational level.

  5. The Risk of Repetition of Attempted Suicide Among Iranian Women with Psychiatric Disorders as Quantified by the Suicide Behaviors Questionnaire

    Directory of Open Access Journals (Sweden)

    Jalal Shakeri

    2015-05-01

    Objectives: The factors associated with repetition of attempted suicide are poorly categorized in the Iranian population. In this study, the prevalence of different psychiatric disorders among women who attempted suicide and the risk of repetition were assessed. Methods: Participants were women admitted to the Poisoning Emergency Hospital, Kermanshah University of Medical Sciences, following failed suicide attempts. Psychiatric disorders were diagnosed based on the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) symptom checklist. Risk of repetition was evaluated using the Suicide Behaviors Questionnaire-Revised (SBQ-R). Results: About 72% of individuals had an SBQ-R score >8 and were considered to be at high risk for repeated attempted suicide. Adjustment disorders were the most common psychiatric disorders (40.8%). However, the type of psychiatric disorder was not associated with the risk of repetition (p=0.320). Marital status, educational level, employment, substance use, history of suicide among family members, and motivation were not determinant factors for repetition of suicide attempts (p=0.220, 0.880, 0.220, 0.290, 0.350, and 0.270, respectively). Younger women tended toward violent methods of attempted suicide, such as self-cutting, whereas older individuals preferred consumption of poison (p<0.001). Drug overdose was more common among single and married women, whereas widows or divorcees preferred self-burning (p=0.004). Conclusion: About 72% of patients with failed suicide attempts were at high risk for repeated attempts. Age, marital status, and type of psychiatric disorder were the only determinants of suicide method. Adjustment disorders were the most common psychiatric disorders among Iranian women. However, this did not predict the risk of further attempts.

  6. Educational Differences in Postmenopausal Breast Cancer – Quantifying Indirect Effects through Health Behaviors, Body Mass Index and Reproductive Patterns

    Science.gov (United States)

    Hvidtfeldt, Ulla Arthur; Lange, Theis; Andersen, Ingelise; Diderichsen, Finn; Keiding, Niels; Prescott, Eva; Sørensen, Thorkild I. A.; Tjønneland, Anne; Rod, Naja Hulvej

    2013-01-01

    Studying mechanisms underlying social inequality in postmenopausal breast cancer is important in order to develop prevention strategies. Standard methods for investigating indirect effects, by comparing crude models to adjusted ones, are often biased. We applied a new method enabling the decomposition of the effect of educational level on breast cancer incidence into indirect effects through reproductive patterns (parity and age at first birth), body mass index, and health behavior (alcohol consumption, physical inactivity, and hormone therapy use). The study was based on a pooled cohort of 6 studies from the Copenhagen area including 33,562 women (1,733 breast cancer cases) aged 50–70 years at baseline. The crude absolute rate of breast cancer was 399 cases per 100,000 person-years. A high educational level compared to low was associated with 74 (95% CI 22–125) extra breast cancer cases per 100,000 person-years at risk. Of these, 26% (95% CI 14%–69%) could be attributed to alcohol consumption. Similar effects were observed for age at first birth (32%; 95% CI 10%–257%), parity (19%; 95% CI 10%–45%), and hormone therapy use (10%; 95% CI 6%–18%). Educational level modified the effect of physical activity on breast cancer. In conclusion, this analysis suggests that a substantial number of the excess postmenopausal breast cancer events among women with a high educational level compared to a low one can be attributed to differences in alcohol consumption, use of hormone therapy, and reproductive patterns. Women of high educational level may be more vulnerable to physical inactivity compared to women of low educational level. PMID:24205296

  7. Using a computational model to quantify the potential impact of changing the placement of healthy beverages in stores as an intervention to "Nudge" adolescent behavior choice.

    Science.gov (United States)

    Wong, Michelle S; Nau, Claudia; Kharmats, Anna Yevgenyevna; Vedovato, Gabriela Milhassi; Cheskin, Lawrence J; Gittelsohn, Joel; Lee, Bruce Y

    2015-12-23

    Product placement influences consumer choices in retail stores. While sugar sweetened beverage (SSB) manufacturers expend considerable effort and resources to determine how product placement may increase SSB purchases, the information is proprietary and not available to the public health and research community. This study aims to quantify the effect of non-SSB product placement in corner stores on adolescent beverage purchasing behavior. Corner stores are small privately owned retail stores that are important beverage providers in low-income neighborhoods--where adolescents have higher rates of obesity. Using data from a community-based survey in Baltimore and parameters from the marketing literature, we developed a decision-analytic model to simulate and quantify how placement of healthy beverages (placement in the beverage cooler closest to the entrance, distance from the back of the store, and vertical placement within each cooler) affects the probability of adolescents purchasing non-SSBs. In our simulation, non-SSB purchases were 2.8 times higher when placed in the "optimal location"--on the second or third shelves of the front cooler--compared to the worst location on the bottom shelf of the cooler farthest from the entrance. Based on our model results and survey data, we project that moving non-SSBs from the worst to the optimal location would result in approximately 5.2 million more non-SSBs purchased by Baltimore adolescents annually. Our study is the first to quantify the potential impact of changing placement of beverages in corner stores. Our findings suggest that this could be a low-cost, yet impactful strategy to nudge this population--highly susceptible to obesity--towards healthier beverage decisions.
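
    The structure of such a decision-analytic model can be sketched with invented numbers: a baseline purchase probability scaled by placement-visibility weights. The weights below are hypothetical (chosen so the best-to-worst ratio lands near the reported 2.8), as is the visit count; the study's calibrated parameters come from survey data and the marketing literature.

```python
# Invented visibility weights; the study's calibrated parameters are not
# reproduced here.
cooler_weight = {"front": 1.0, "middle": 0.7, "back": 0.5}  # distance from entrance
shelf_weight = {"eye": 1.0, "top": 0.8, "bottom": 0.7}      # vertical placement

def purchase_prob(cooler, shelf, base_prob=0.10):
    # Decision-analytic sketch: a baseline probability that a store visit
    # ends with a non-SSB purchase, scaled by placement visibility.
    return base_prob * cooler_weight[cooler] * shelf_weight[shelf]

best = purchase_prob("front", "eye")      # second/third shelf, front cooler
worst = purchase_prob("back", "bottom")   # bottom shelf, farthest cooler
print(f"best {best:.3f} vs worst {worst:.3f} -> ratio {best / worst:.1f}x")

visits = 2_000_000                        # hypothetical annual store visits
print(f"projected extra non-SSB purchases: {visits * (best - worst):,.0f}")
```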

  8. History dependence and consequences of the microchemical evolution of AISI 316

    International Nuclear Information System (INIS)

    Garner, F.A.; Porter, D.L.

    1982-04-01

    The onset of void swelling in AISI 316 can be correlated with the removal of the elements nickel and silicon into precipitate phases. This conclusion was reached earlier based on measurement of microscopic volumes and has now been confirmed by the use of bulk extraction and elemental analysis of precipitates formed during irradiation. It appears that the average level of nickel remaining in the alloy matrix can be used as an index to assess the degree of completion of the microchemical evolution. The dependence of nickel removal on neutron flux and fluence, irradiation temperature, applied stress and preirradiation thermal-mechanical treatment is consistent with the behavior of swelling in response to these variables. It appears that at 400 °C the microchemical evolution is very sluggish and still in progress at 14 × 10^22 n/cm^2 (E > 0.1 MeV). This suggests that the swelling rate at low temperatures will continue to increase with fluence and will approach that measured at higher temperatures. This higher swelling rate can be realized for some temperature histories which accelerate the kinetics of phase evolution.

  9. Quantified student

    NARCIS (Netherlands)

    Rens van der Vorst

    2017-01-01

    Learning is all about feedback. Runners, for example, use apps like RunKeeper. Research shows that such apps enhance engagement and results, and people find them fun. The essence is that the runner's behavior is tracked and communicated back to the runner in a dashboard. We ...

  10. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2010-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  11. Quantifiers and working memory

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.

    2009-01-01

    The paper presents a study examining the role of working memory in quantifier verification. We created situations similar to the span task to compare numerical quantifiers of low and high rank, parity quantifiers and proportional quantifiers. The results enrich and support the data obtained

  12. Calculating radiation exposures during use of (14)C-labeled nutrients, food components, and biopharmaceuticals to quantify metabolic behavior in humans.

    Science.gov (United States)

    Kim, Seung-Hyun; Kelly, Peter B; Clifford, Andrew J

    2010-04-28

    (14)C has long been used as a tracer for quantifying the in vivo human metabolism of food components, biopharmaceuticals, and nutrients. Minute amounts (...) allow the in vivo metabolism of food components, biopharmaceuticals, or nutrients to be organized into models suitable for quantitative hypothesis testing and determination of metabolic parameters. In vivo models are important for specification of intake levels for food components, biopharmaceuticals, and nutrients. Accurate estimation of the radiation exposure from ingested (14)C is an essential component of the experimental design. Therefore, this paper illustrates the calculation involved in determining the radiation exposure from a minute dose of orally administered (14)C-beta-carotene, (14)C-alpha-tocopherol, (14)C-lutein, and (14)C-folic acid in four prior experiments. The administered doses ranged from 36 to 100 nCi, and radiation exposure ranged from 0.12 to 5.2 microSv to the whole body and from 0.2 to 3.4 microSv to the liver, with consideration of the tissue weighting factor and fractional nutrient. In comparison, the radiation exposure experienced during a 4 h airline flight across the United States at 37,000 ft is 20 microSv.
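
    The core arithmetic is a one-liner: committed effective dose equals ingested activity times an ingestion dose coefficient. The sketch below uses the ICRP 72 adult coefficient for 14C (5.8e-10 Sv/Bq) and ignores the tissue-specific weighting and fractional-uptake corrections the paper applies, yet it lands inside the reported 0.12-5.2 microSv whole-body range.

```python
# Committed effective dose = ingested activity x ingestion dose coefficient.
# 5.8e-10 Sv/Bq is the ICRP 72 adult coefficient for ingested 14C; the
# paper's organ doses additionally apply tissue weighting factors and the
# fractional uptake of each nutrient, omitted here.
NCI_TO_BQ = 37.0          # 1 nCi = 37 Bq
DOSE_COEFF = 5.8e-10      # Sv per Bq ingested

for dose_nci in (36, 100):
    activity_bq = dose_nci * NCI_TO_BQ
    dose_usv = activity_bq * DOSE_COEFF * 1e6
    print(f"{dose_nci} nCi = {activity_bq:.0f} Bq -> ~{dose_usv:.2f} microSv")
```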

  13. Robotic set-up to quantify hand-eye behavior in motor execution and learning of children with autism spectrum disorder.

    Science.gov (United States)

    Casellato, Claudia; Gandolla, Marta; Crippa, Alessandro; Pedrocchi, Alessandra

    2017-07-01

    Autism spectrum disorder (ASD) is a multifaceted neurodevelopmental disorder characterized by persistent social and communication impairments and by restricted and repetitive behaviors. Motor disorders have also been described, but not objectively assessed. Most studies have shown inefficient eye-hand coordination and motor learning in children with ASD; in other experiments, mechanisms of acquisition of internal models in self-generated movements appeared to be normal in autism. In this framework, we have developed a robotic protocol, recording gaze and hand data during upper limb tasks, in which a haptic pen-like handle is moved along specific trajectories displayed on the screen. The protocol includes trials of reaching under a perturbing force field and catching moving targets, with or without visual availability of the whole path. We tested 16 typically developing school-age children and one child with ASD as a case study. Speed-accuracy tradeoff, motor performance, and gaze-hand spatial coordination were evaluated. Compared to typically developing peers, in the force field sequence the child with ASD showed intact but delayed learning and more variable gaze-hand patterns. In the catching trials, he showed less efficient movements but an intact capability of exploiting the available a priori plan. The proposed protocol represents a powerful, easily tunable tool for quantitative (longitudinal) assessment and for subject-tailored training in ASD.

  14. Quantifying the Behavioral Response of Spawning Chum Salmon to Elevated Discharges from Bonneville Dam, Columbia River : Annual Report 2005-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Tiffan, Kenneth F.; Haskell, Craig A.; Kock, Tobias J.

    2008-12-01

    In unimpounded rivers, Pacific salmon (Oncorhynchus spp.) typically spawn under relatively stable stream flows, with exceptions occurring during periodic precipitation events. In contrast, hydroelectric development has often resulted in an artificial hydrograph characterized by rapid changes in discharge and tailwater elevation that occur on a daily, or even an hourly basis, due to power generation (Cushman 1985; Moog 1993). Consequently, populations of Pacific salmon that are known to spawn in main-stem habitats below hydroelectric dams face the risks of changing habitat suitability, potential redd dewatering, and uncertain spawning success (Hamilton and Buell 1976; Chapman et al. 1986; Dauble et al. 1999; Garland et al. 2003; Connor and Pflug 2004; McMichael et al. 2005). Although the direct effects of a variable hydrograph, such as redd dewatering are apparent, specific effects on spawning behavior remain largely unexplored. Chum salmon (O. keta) that spawn below Bonneville Dam on the Columbia River are particularly vulnerable to the effects of water level fluctuations. Although chum salmon generally spawn in smaller tributaries (Johnson et al. 1997), many fish spawn in main-stem habitats below Bonneville Dam near Ives Island (Tomaro et al. 2007; Figure 1). The primary spawning area near Ives Island is shallow and sensitive to changes in water level caused by hydroelectric power generation at Bonneville Dam. In the past, fluctuating water levels have dewatered redds and changed the amount of available spawning habitat (Garland et al. 2003). To minimize these effects, fishery managers attempt to maintain a stable tailwater elevation at Bonneville Dam of 3.5 m (above mean sea level) during spawning, which ensures adequate water is provided to the primary chum salmon spawning area below the mouth of Hamilton Creek (Figure 1). Given the uncertainty of winter precipitation and water supply, this strategy has been effective at restricting spawning to a specific

  15. History Dependence of the Microstructure on Time-Dependent Deformation During In-Situ Cooling of a Nickel-Based Single-Crystal Superalloy

    Science.gov (United States)

    Panwisawas, Chinnapat; D'Souza, Neil; Collins, David M.; Bhowmik, Ayan; Roebuck, Bryan

    2018-05-01

    Time-dependent plastic deformation through stress relaxation and creep during in-situ cooling of the as-cast single-crystal superalloy CMSX-4® has been studied via neutron diffraction, transmission electron microscopy, electro-thermal miniature testing (ETMT), and analytical modeling across two temperature regimes. Between 1000 °C and 900 °C, stress relaxation prevails and gives rise to softening, as evidenced by a decreased dislocation density and the presence of long-segment stacking faults in the γ phase. Lattice strains decrease in both the γ matrix and γ' precipitate phases. A constitutive viscoplastic law derived from an in-situ isothermal relaxation test underestimates the equivalent plastic strain in the prediction of the stress and strain evolution during cooling in this case. It is thereby shown that the history dependence of the microstructure needs to be taken into account when deriving a constitutive law, and this becomes even more relevant at high temperatures approaching the solvus. Higher-temperature cooling experiments have also been carried out between 1300 °C and 1150 °C to measure the evolution of stress and plastic strain close to the γ' solvus temperature. In-situ cooling of samples using the ETMT shows that creep dominates high-temperature deformation between 1300 °C and 1220 °C, but below a threshold temperature, typically 1220 °C, work hardening begins to prevail owing to the increasing γ' fraction, resulting in a rapid increase in stress. The history dependence of prior accumulated deformation is also confirmed in flow stress measurements using a single sample while cooling. The saturation stresses in the flow stress experiments show very good agreement with the stresses measured in the cooling experiments when viscoplastic deformation is dominant. This study demonstrates that experimentation during high-temperature deformation, as well as the history dependence of the microstructure during cooling, plays a key role in deriving ...

  16. Quantifiers for quantum logic

    OpenAIRE

    Heunen, Chris

    2008-01-01

    We consider categorical logic on the category of Hilbert spaces. More generally, in fact, any pre-Hilbert category suffices. We characterise closed subobjects, and prove that they form orthomodular lattices. This shows that quantum logic is just an incarnation of categorical logic, enabling us to establish an existential quantifier for quantum logic, and conclude that there cannot be a universal quantifier.

  17. Connected Car: Quantified Self becomes Quantified Car

    Directory of Open Access Journals (Sweden)

    Melanie Swan

    2015-02-01

    The automotive industry could be facing a situation of profound change and opportunity in the coming decades. There are a number of influencing factors, such as increasing urban and aging populations, self-driving cars, 3D parts printing, energy innovation, and new models of transportation service delivery (Zipcar, Uber). The connected car means that vehicles are now part of the connected world, continuously Internet-connected, generating and transmitting data, which on the one hand can be helpfully integrated into applications, like real-time traffic alerts broadcast to smartwatches, but which also raises security and privacy concerns. This paper explores the automotive connected world and describes five killer QS (Quantified Self)-auto sensor applications that link quantified-self sensors (sensors that measure the personal biometrics of individuals, like heart rate) and automotive sensors (sensors that measure driver and passenger biometrics or quantitative automotive performance metrics, like speed and braking activity). The applications are fatigue detection, real-time assistance for parking and accidents, anger management and stress reduction, keyless authentication and digital identity verification, and DIY diagnostics. These kinds of applications help to demonstrate the benefit of connected-world data streams in the automotive industry and beyond where, more fundamentally for human progress, the automation of both physical and now cognitive tasks is underway.

  18. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound on the worst-case execution time. To compare different approaches we would like to quantify time predictability; that means we need to measure it. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  19. Thermosensory reversal effect quantified

    NARCIS (Netherlands)

    Bergmann Tiest, W.M.; Kappers, A.M.L.

    2008-01-01

    At room temperature, some materials feel colder than others due to differences in thermal conductivity, heat capacity and geometry. When the ambient temperature is well above skin temperature, the roles of 'cold' and 'warm' materials are reversed. In this paper, this effect is quantified by

  1. Quantifying requirements volatility effects

    NARCIS (Netherlands)

    Kulk, G.P.; Verhoef, C.

    2008-01-01

    In an organization operating in the bancassurance sector we identified a low-risk IT subportfolio of 84 IT projects, together comprising 16,500 function points, each project varying in size and duration, for which we were able to quantify its requirements volatility. This representative portfolio

  2. The quantified relationship

    NARCIS (Netherlands)

    Danaher, J.; Nyholm, S.R.; Earp, B.

    2018-01-01

    The growth of self-tracking and personal surveillance has given rise to the Quantified Self movement. Members of this movement seek to enhance their personal well-being, productivity, and self-actualization through the tracking and gamification of personal data. The technologies that make this

  3. Quantifying IT estimation risks

    NARCIS (Netherlands)

    Kulk, G.P.; Peters, R.J.; Verhoef, C.

    2009-01-01

    A statistical method is proposed for quantifying the impact of factors that influence the quality of the estimation of costs for IT-enabled business projects. We call these factors risk drivers as they influence the risk of the misestimation of project costs. The method can effortlessly be

  4. Quantifying light pollution

    International Nuclear Information System (INIS)

    Cinzano, P.; Falchi, F.

    2014-01-01

    In this paper we review new available indicators useful to quantify and monitor light pollution, defined as the alteration of the natural quantity of light in the night environment due to the introduction of man-made light. With the introduction of recent radiative transfer methods for the computation of light pollution propagation, several new indicators become available. These indicators represent a primary step in light pollution quantification, beyond the bare evaluation of the night sky brightness, which is an observational effect integrated along the line of sight and thus lacking the three-dimensional information. Highlights: • We review new available indicators useful to quantify and monitor light pollution. • These indicators are a primary step in light pollution quantification. • These indicators allow light pollution mapping to be improved from a 2D to a 3D grid. • These indicators allow carrying out a tomography of light pollution. • We show an application of this technique to an Italian region.

  5. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    … task (Bahrami et al 2010, Fusaroli et al. 2012) we extend to linguistic coordination dynamical measures of recurrence employed in the analysis of sensorimotor coordination (such as heart-rate (Konvalinka et al 2011), postural sway (Shockley 2005) and eye-movements (Dale, Richardson and Kirkham 2012)). We employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities …

  6. Behaviorism

    Science.gov (United States)

    Moore, J.

    2011-01-01

    Early forms of psychology assumed that mental life was the appropriate subject matter for psychology, and introspection was an appropriate method to engage that subject matter. In 1913, John B. Watson proposed an alternative: classical S-R behaviorism. According to Watson, behavior was a subject matter in its own right, to be studied by the…

  7. Quantifying global exergy resources

    International Nuclear Information System (INIS)

    Hermann, Weston A.

    2006-01-01

    Exergy is used as a common currency to assess and compare the reservoirs of theoretically extractable work we call energy resources. Resources consist of matter or energy with properties different from the predominant conditions in the environment. These differences can be classified as physical, chemical, or nuclear exergy. This paper identifies the primary exergy reservoirs that supply exergy to the biosphere and quantifies the intensive and extensive exergy of their derivative secondary reservoirs, or resources. The interconnecting accumulations and flows among these reservoirs are illustrated to show the path of exergy through the terrestrial system from input to its eventual natural or anthropogenic destruction. The results are intended to assist in evaluation of current resource utilization, help guide fundamental research to enable promising new energy technologies, and provide a basis for comparing the resource potential of future energy options that is independent of technology and cost

  8. Quantifying the Adaptive Cycle.

    Directory of Open Access Journals (Sweden)

    David G Angeler

    Full Text Available The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994-2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  9. Quantifying Anthropogenic Dust Emissions

    Science.gov (United States)

    Webb, Nicholas P.; Pierre, Caroline

    2018-02-01

    Anthropogenic land use and land cover change, including local environmental disturbances, moderate rates of wind-driven soil erosion and dust emission. These human-dust cycle interactions impact ecosystems and agricultural production, air quality, human health, biogeochemical cycles, and climate. While the impacts of land use activities and land management on aeolian processes can be profound, the interactions are often complex and assessments of anthropogenic dust loads at all scales remain highly uncertain. Here, we critically review the drivers of anthropogenic dust emission and current evaluation approaches. We then identify and describe opportunities to: (1) develop new conceptual frameworks and interdisciplinary approaches that draw on ecological state-and-transition models to improve the accuracy and relevance of assessments of anthropogenic dust emissions; (2) improve model fidelity and capacity for change detection to quantify anthropogenic impacts on aeolian processes; and (3) enhance field research and monitoring networks to support dust model applications to evaluate the impacts of disturbance processes on local to global-scale wind erosion and dust emissions.

  10. Quantifying loopy network architectures.

    Directory of Open Access Journals (Sweden)

    Eleni Katifori

    Full Text Available Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight), and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.

  11. Quantifying social asymmetric structures.

    Science.gov (United States)

    Solanas, Antonio; Salafranca, Lluís; Riba, Carles; Sierra, Vicenta; Leiva, David

    2006-08-01

    Many social phenomena involve a set of dyadic relations among agents whose actions may be dependent. Although individualistic approaches have frequently been applied to analyze social processes, these are not generally concerned with dyadic relations, nor do they deal with dependency. This article describes a mathematical procedure for analyzing dyadic interactions in a social system. The proposed method consists mainly of decomposing asymmetric data into their symmetric and skew-symmetric parts. A quantification of skew symmetry for a social system can be obtained by dividing the norm of the skew-symmetric matrix by the norm of the asymmetric matrix. This calculation makes available to researchers a quantity related to the amount of dyadic reciprocity. With regard to agents, the procedure enables researchers to identify those whose behavior is asymmetric with respect to all agents. It is also possible to derive symmetric measurements among agents and to use multivariate statistical techniques.
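
    A minimal numerical sketch of the decomposition just described, assuming a small sociomatrix X in which X[i, j] counts behavior directed from agent i to agent j; using squared Frobenius norms for the ratio is one natural reading of the abstract, not necessarily the authors' exact estimator:

    ```python
    import numpy as np

    # Hypothetical sociomatrix: X[i, j] = how often agent i directs behavior at agent j.
    X = np.array([[0., 5., 1.],
                  [2., 0., 4.],
                  [1., 3., 0.]])

    S = (X + X.T) / 2.0   # symmetric part: the reciprocal component of the dyads
    K = (X - X.T) / 2.0   # skew-symmetric part: the directional (asymmetric) component

    # S and K are orthogonal, so ||X||^2 = ||S||^2 + ||K||^2 (Frobenius norms),
    # and the index below lies in [0, 1]: 0 = fully reciprocal, 1 = fully one-sided.
    phi = np.linalg.norm(K)**2 / np.linalg.norm(X)**2
    print(f"skew-symmetry index: {phi:.3f}")
    ```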

  12. The Fallacy of Quantifying Risk

    Science.gov (United States)

    2012-09-01

    Defense AT&L, September–October 2012. The Fallacy of Quantifying Risk, David E. Frick, Ph.D. Frick is a 35-year veteran of the Department of … a key to risk analysis was "choosing the right technique" of quantifying risk. The weakness in this argument stems not from the assertion that one … of information about the enemy), yet achieving great outcomes. Attempts at quantifying risk are not, in and of themselves, objectionable. Prudence …

  13. Market behavior and performance of different strategy evaluation schemes.

    Science.gov (United States)

    Baek, Yongjoo; Lee, Sang Hoon; Jeong, Hawoong

    2010-08-01

    Strategy evaluation schemes are a crucial factor in any agent-based market model, as they determine the agents' strategy preferences and consequently their behavioral pattern. This study investigates how the strategy evaluation schemes adopted by agents affect their performance in conjunction with the market circumstances. We observe the performance of three strategy evaluation schemes, the history-dependent wealth game, the trend-opposing minority game, and the trend-following majority game, in a stock market where the price is exogenously determined. The price is either directly adopted from the real stock market indices or generated with a Markov chain of order ≤ 2. Each scheme's success is quantified by average wealth accumulated by the traders equipped with the scheme. The wealth game, as it learns from the history, shows relatively good performance unless the market is highly unpredictable. The majority game is successful in a trendy market dominated by long periods of sustained price increase or decrease. On the other hand, the minority game is suitable for a market with persistent zigzag price patterns. We also discuss the consequence of implementing finite memory in the scoring processes of strategies. Our findings suggest under which market circumstances each evaluation scheme is appropriate for modeling the behavior of real market traders.
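
    The contrast between the trend-following (majority) and trend-opposing (minority) schemes can be illustrated with a toy script; the Markov-chain price moves and the naive repeat-the-last-move strategy below are illustrative assumptions, not the paper's agent-based model, and the history-dependent wealth game is omitted for brevity:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def gen_moves(p_persist, n=10_000):
        """Markov-chain price moves: +1/-1, repeating the last move with prob p_persist."""
        m = [1]
        for _ in range(n - 1):
            m.append(m[-1] if rng.random() < p_persist else -m[-1])
        return np.array(m)

    def score(moves):
        pred = np.roll(moves, 1)   # trend-following guess: repeat the previous move
        pred[0] = 1
        hits = np.mean(pred == moves)
        return hits, 1.0 - hits    # (majority-game payoff, minority-game payoff)

    for p in (0.8, 0.2):           # trendy market vs zigzag market
        maj, mino = score(gen_moves(p))
        print(f"persistence={p}: majority={maj:.2f}, minority={mino:.2f}")
    ```

    As in the abstract, the majority scheme pays off in the trendy (persistent) market and the minority scheme in the zigzag (anti-persistent) one.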

  14. Multidominance, ellipsis, and quantifier scope

    NARCIS (Netherlands)

    Temmerman, Tanja Maria Hugo

    2012-01-01

    This dissertation provides a novel perspective on the interaction between quantifier scope and ellipsis. It presents a detailed investigation of the scopal interaction between English negative indefinites, modals, and quantified phrases in ellipsis. One of the crucial observations is that a negative

  15. Quantifying Stock Return Distributions in Financial Markets.

    Science.gov (United States)

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the distributions' tails exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales.
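
    A sketch of this kind of tail measurement, assuming a toy heavy-tailed price series; the Hill estimator is a standard tail-index tool and stands in here for whatever fitting procedure the authors actually used:

    ```python
    import numpy as np

    def hill_tail_index(returns, k=200):
        """Hill estimator of the power-law tail exponent from the k largest |log returns|."""
        x = np.sort(np.abs(returns))[::-1]          # descending order statistics
        return 1.0 / np.mean(np.log(x[:k] / x[k]))

    # Toy prices with Student-t(3) shocks, whose tail exponent is known to be ~3.
    rng = np.random.default_rng(0)
    prices = np.cumprod(1.0 + 0.001 * rng.standard_t(df=3, size=100_000))
    log_returns = np.diff(np.log(prices))           # returns at the chosen sampling timescale
    print(f"estimated tail exponent: {hill_tail_index(log_returns):.2f}")
    ```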

  16. Quantifiers in Russian Sign Language

    NARCIS (Netherlands)

    Kimmelman, V.; Paperno, D.; Keenan, E.L.

    2017-01-01

    After presenting some basic genetic, historical and typological information about Russian Sign Language, this chapter outlines the quantification patterns it expresses. It illustrates various semantic types of quantifiers, such as generalized existential, generalized universal, proportional,

  17. Quantified Self in de huisartsenpraktijk

    NARCIS (Netherlands)

    de Groot, Martijn; Timmers, Bart; Kooiman, Thea; van Ittersum, Miriam

    2015-01-01

    Quantified Self stands for the self-measuring human. The number of people entering the care process with self-generated health data will grow in the coming years. Various kinds of activity trackers and health applications for the smartphone make it relatively easy to collect personal

  18. Quantifying information leakage of randomized protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Legay, Axel; Malacaria, Pasquale

    2015-01-01

    The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the usage of Markovian processes to model deterministic and probabilistic systems. By using a methodology generalizing the lattice of information approach we model refined attackers capable of observing the internal behavior of the system, and quantify the information leakage of such systems. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and non-timed attacks.
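
    Leakage of this kind is commonly measured as the mutual information between the secret and the attacker's observations. A minimal sketch, assuming a simple prior-plus-channel-matrix system rather than the paper's Markovian models; channel capacity would then be the maximum of this quantity over priors:

    ```python
    import numpy as np

    def leakage_bits(prior, channel):
        """Mutual information I(secret; observation) in bits.
        prior[s] = P(secret = s); channel[s, o] = P(observation = o | secret = s)."""
        joint = prior[:, None] * channel
        p_obs = joint.sum(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = np.where(joint > 0, joint / (prior[:, None] * p_obs[None, :]), 1.0)
        return float(np.sum(joint * np.log2(ratio)))

    # Toy randomized protocol: reports the secret bit correctly 90% of the time.
    prior = np.array([0.5, 0.5])
    channel = np.array([[0.9, 0.1],
                        [0.1, 0.9]])
    print(f"leakage: {leakage_bits(prior, channel):.3f} bits")   # 1 - H(0.9) = 0.531
    ```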

  19. Animal biometrics: quantifying and detecting phenotypic appearance.

    Science.gov (United States)

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Quantifying Pilot Visual Attention in Low Visibility Terminal Operations

    Science.gov (United States)

    Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.

    2012-01-01

    Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also holds implications for specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, especially the data reduction algorithms and logic used to transform raw eye tracking data into quantified visual behavior metrics, and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation

  1. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.

  2. An adaptive method for history dependent materials

    NARCIS (Netherlands)

    Quak, W.; van den Boogaard, Antonius H.; Khalili, Nasser; Valliappan, Somasundaram; Li, Qing; Russel, Adrian

    2010-01-01

    Introduction: Finite element simulations of bulk forming processes, like extrusion or forging, can fail due to element distortion. Simulating a forming process requires either many re-meshing steps or an Eulerian formulation to avoid this problem. This re-meshing or usage of an Eulerian formulation,

  3. History-dependent stochastic Petri nets

    NARCIS (Netherlands)

    Schonenberg, H.; Sidorova, N.; Aalst, van der W.M.P.; Hee, van K.M.; Pnueli, A.; Virbitskaite, I.; Voronkov, A.

    2010-01-01

    Stochastic Petri Nets are a useful and well-known tool for performance analysis. However, an implicit assumption in the different types of Stochastic Petri Nets is the Markov property. It is assumed that a choice in the Petri net only depends on the current state and not on earlier choices. For many

  4. Quantifying and simulating human sensation

    DEFF Research Database (Denmark)

    Quantifying and simulating human sensation – relating science and technology of indoor climate research. Abstract: In his doctoral thesis from 1970, civil engineer Povl Ole Fanger proposed that the understanding of indoor climate should focus on the comfort of the individual rather than averaged … this understanding of human sensation was adjusted to technology. I will look into the construction of the equipment, what it measures and the relationship between theory, equipment and tradition.

  5. Quantifying emissions from spontaneous combustion

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-09-01

    Spontaneous combustion can be a significant problem in the coal industry, not only due to the obvious safety hazard and the potential loss of valuable assets, but also with respect to the release of gaseous pollutants, especially CO2, from uncontrolled coal fires. This report reviews methodologies for measuring emissions from spontaneous combustion and discusses methods for quantifying, estimating and accounting for the purpose of preparing emission inventories.

  6. Towards Quantifying a Wider Reality: Shannon Exonerata

    Directory of Open Access Journals (Sweden)

    Robert E. Ulanowicz

    2011-10-01

    Full Text Available In 1872 Ludwig von Boltzmann derived a statistical formula to represent the entropy (an apophasis) of a highly simplistic system. In 1948 Claude Shannon independently formulated the same expression to capture the positivist essence of information. Such contradictory thrusts engendered decades of ambiguity concerning exactly what is conveyed by the expression. Resolution of widespread confusion is possible by invoking the third law of thermodynamics, which requires that entropy be treated in a relativistic fashion. Doing so parses the Boltzmann expression into separate terms that segregate apophatic entropy from positivist information. Possibly more importantly, the decomposition itself portrays a dialectic-like agonism between constraint and disorder that may provide a more appropriate description of the behavior of living systems than is possible using conventional dynamics. By quantifying the apophatic side of evolution, the Shannon approach to information achieves what no other treatment of the subject affords: It opens the window on a more encompassing perception of reality.
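
    One way to make the parsing concrete is the textbook identity H(A,B) = I(A;B) + residual conditional entropy, reading the mutual-information term as the positivist (information) part and the residual as the apophatic (disorder) part; mapping this onto the author's exact decomposition is our assumption:

    ```python
    import numpy as np

    def entropy_decomposition(p):
        """Split the joint Shannon index H(A,B) into mutual information I(A;B)
        (the constraint/information term) and the residual conditional entropy
        (the disorder term), so that H = I + residual."""
        p = p / p.sum()
        pa, pb = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        H = -np.sum(p[nz] * np.log2(p[nz]))
        I = np.sum(p[nz] * np.log2(p[nz] / np.outer(pa, pb)[nz]))
        return H, I, H - I

    # Toy joint distribution of two coupled events.
    p = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
    H, I, residual = entropy_decomposition(p)
    print(f"H={H:.3f}  I={I:.3f}  residual={residual:.3f} bits")
    ```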

  7. Quantifying Quantum-Mechanical Processes.

    Science.gov (United States)

    Hsieh, Jen-Hsiang; Chen, Shih-Hsuan; Li, Che-Ming

    2017-10-19

    The act of describing how a physical process changes a system is the basis for understanding observed phenomena. For quantum-mechanical processes in particular, the effect of processes on quantum states profoundly advances our knowledge of the natural world, from understanding counter-intuitive concepts to the development of wholly quantum-mechanical technology. Here, we show that quantum-mechanical processes can be quantified using a generic classical-process model through which any classical strategies of mimicry can be ruled out. We demonstrate the success of this formalism using fundamental processes postulated in quantum mechanics, the dynamics of open quantum systems, quantum-information processing, the fusion of entangled photon pairs, and the energy transfer in a photosynthetic pigment-protein complex. Since our framework does not depend on any specifics of the states being processed, it reveals a new class of correlations in the hierarchy between entanglement and Einstein-Podolsky-Rosen steering and paves the way for the elaboration of a generic method for quantifying physical processes.

  8. Quantifying collective attention from tweet stream.

    Directory of Open Access Journals (Sweden)

    Kazutoshi Sasahara

    Full Text Available Online social media are increasingly facilitating our social interactions, thereby making available a massive "digital fossil" of human behavior. Discovering and quantifying distinct patterns using these data is important for studying social behavior, although the rapid time-variant nature and large volumes of these data make this task difficult and challenging. In this study, we focused on the emergence of "collective attention" on Twitter, a popular social networking service. We propose a simple method for detecting and measuring the collective attention evoked by various types of events. This method exploits the fact that tweeting activity exhibits a burst-like increase and an irregular oscillation when a particular real-world event occurs; otherwise, it follows regular circadian rhythms. The difference between regular and irregular states in the tweet stream was measured using the Jensen-Shannon divergence, which corresponds to the intensity of collective attention. We then associated irregular incidents with their corresponding events that attracted the attention and elicited responses from large numbers of people, based on the popularity and the enhancement of key terms in posted messages or "tweets." Next, we demonstrate the effectiveness of this method using a large dataset that contained approximately 490 million Japanese tweets by over 400,000 users, in which we identified 60 cases of collective attention, including one related to the Tohoku-oki earthquake. "Retweet" networks were also investigated to understand collective attention in terms of social interactions. This simple method provides a retrospective summary of collective attention, thereby contributing to the fundamental understanding of social behavior in the digital era.
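
    A minimal sketch of the irregularity measure, assuming hourly tweet-count histograms as the two distributions being compared; the binning and the numbers are made up for illustration:

    ```python
    import numpy as np

    def shannon(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def jensen_shannon(p, q):
        """JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q)) / 2, in bits; bounded by [0, 1]."""
        p, q = p / p.sum(), q / q.sum()
        m = (p + q) / 2.0
        return shannon(m) - (shannon(p) + shannon(q)) / 2.0

    # Hourly tweet-count histograms over a day (illustrative numbers).
    regular = np.array([30, 20, 10, 10, 20, 40, 80, 120, 150, 140, 130, 120,
                        110, 120, 130, 140, 150, 160, 170, 160, 130, 90, 60, 40], float)
    burst = regular.copy()
    burst[14] += 2000                  # a burst of collective attention at 14:00
    print(f"JSD = {jensen_shannon(regular, burst):.3f} bits")
    ```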

  9. Quantifying Evaporation in a Permeable Pavement System

    Science.gov (United States)

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. …

  10. Quantifying sound quality in loudspeaker reproduction

    NARCIS (Netherlands)

    Beerends, John G.; van Nieuwenhuizen, Kevin; van den Broek, E.L.

    2016-01-01

    We present PREQUEL: Perceptual Reproduction Quality Evaluation for Loudspeakers. Instead of quantifying the loudspeaker system itself, PREQUEL quantifies the overall loudspeakers' perceived sound quality by assessing their acoustic output using a set of music signals. This approach introduces a

  11. Quantifying the semantics of search behavior before stock market moves.

    Science.gov (United States)

    Curme, Chester; Preis, Tobias; Stanley, H Eugene; Moat, Helen Susannah

    2014-08-12

    Technology is becoming deeply interwoven into the fabric of society. The Internet has become a central source of information for many people when making day-to-day decisions. Here, we present a method to mine the vast data Internet users create when searching for information online, to identify topics of interest before stock market moves. In an analysis of historic data from 2004 until 2012, we draw on records from the search engine Google and online encyclopedia Wikipedia as well as judgments from the service Amazon Mechanical Turk. We find evidence of links between Internet searches relating to politics or business and subsequent stock market moves. In particular, we find that an increase in search volume for these topics tends to precede stock market falls. We suggest that extensions of these analyses could offer insight into large-scale information flow before a range of real-world events.

  12. Modeling and analysis to quantify MSE wall behavior and performance.

    Science.gov (United States)

    2009-08-01

    To better understand potential sources of adverse performance of mechanically stabilized earth (MSE) walls, a suite of analytical models was studied using the computer program FLAC, a numerical modeling computer program widely used in geotechnical engineering …

  13. Quantifier Scope in Categorical Compositional Distributional Semantics

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Sadrzadeh

    2016-08-01

    Full Text Available In previous work with J. Hedges, we formalised a generalised quantifiers theory of natural language in categorical compositional distributional semantics with the help of bialgebras. In this paper, we show how quantifier scope ambiguity can be represented in that setting and how this representation can be generalised to branching quantifiers.

  14. Quantifying the vitamin D economy.

    Science.gov (United States)

    Heaney, Robert P; Armas, Laura A G

    2015-01-01

    Vitamin D enters the body through multiple routes and in a variety of chemical forms. Utilization varies with input, demand, and genetics. Vitamin D and its metabolites are carried in the blood on a Gc protein that has three principal alleles with differing binding affinities and ethnic prevalences. Three major metabolites are produced, which act via two routes, endocrine and autocrine/paracrine, and in two compartments, extracellular and intracellular. Metabolic consumption is influenced by physiological controls, noxious stimuli, and tissue demand. When administered as a supplement, varying dosing schedules produce major differences in serum metabolite profiles. To understand vitamin D's role in human physiology, it is necessary both to identify the foregoing entities, mechanisms, and pathways and, specifically, to quantify them. This review was performed to delineate the principal entities and transitions involved in the vitamin D economy, summarize the status of present knowledge of the applicable rates and masses, draw inferences about functions that are implicit in these quantifications, and point out implications for the determination of adequacy. © The Author(s) 2014. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Quantify the complexity of turbulence

    Science.gov (United States)

    Tao, Xingtian; Wu, Huixuan

    2017-11-01

    Many researchers have used Reynolds stress, power spectrum and Shannon entropy to characterize a turbulent flow, but few of them have measured the complexity of turbulence. Yet as this study shows, conventional turbulence statistics and Shannon entropy have limits when quantifying the flow complexity. Thus, it is necessary to introduce new complexity measures, such as topology complexity and excess information, to describe turbulence. Our test flow is a classic turbulent cylinder wake at Reynolds number 8100. Along the stream-wise direction, the flow becomes more isotropic and the magnitudes of normal Reynolds stresses decrease monotonically. These seem to indicate the flow dynamics becomes simpler downstream. However, the Shannon entropy keeps increasing along the flow direction and the dynamics seems to be more complex, because the large-scale vortices cascade to small eddies, the flow is less correlated and more unpredictable. In fact, these two contradictory observations partially describe the complexity of a turbulent wake. Our measurements (up to 40 diameters downstream of the cylinder) show that the flow's degree of complexity actually increases at first and then becomes a constant (or drops slightly) along the stream-wise direction. Supported by the University of Kansas General Research Fund.

  16. Quantifying Cancer Risk from Radiation.

    Science.gov (United States)

    Keil, Alexander P; Richardson, David B

    2017-12-06

    Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.

  17. Quantifying China's regional economic complexity

    Science.gov (United States)

    Gao, Jian; Zhou, Tao

    2018-02-01

    China has experienced an outstanding economic expansion during the past decades; however, literature on non-monetary metrics that reveal the status of China's regional economic development is still lacking. In this paper, we fill this gap by quantifying the economic complexity of China's provinces through analyzing 25 years' firm data. First, we estimate the regional economic complexity index (ECI), and show that the overall time evolution of provinces' ECI is relatively stable and slow. Then, after linking ECI to economic development and income inequality, we find that the explanatory power of ECI is positive for the former but negative for the latter. Next, we compare different measures of economic diversity and explore their relationships with monetary macroeconomic indicators. Results show that the ECI index and the non-linear iteration based Fitness index are comparable, and they both have stronger explanatory power than other benchmark measures. Further multivariate regressions suggest the robustness of our results after controlling for other socioeconomic factors. Our work moves a step towards better understanding China's regional economic development and non-monetary macroeconomic indicators.
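
    A sketch of the standard eigenvector formulation of the ECI (the fixed point of the Hausmann-Hidalgo "method of reflections"), assuming a binary region-by-product specialization matrix M; whether the paper uses exactly this variant is an assumption:

    ```python
    import numpy as np

    def eci(M):
        """Economic complexity index from a binary region x product matrix M:
        the standardized eigenvector of the second-largest eigenvalue of the
        region-to-region matrix Mrr = D_r^-1 M D_p^-1 M^T."""
        kr = M.sum(axis=1)             # diversity of each region
        kp = M.sum(axis=0)             # ubiquity of each product
        Mrr = (M / kr[:, None]) @ (M / kp[None, :]).T
        vals, vecs = np.linalg.eig(Mrr)
        v = vecs[:, np.argsort(-vals.real)[1]].real
        # Note: the overall sign is conventional (usually fixed to correlate with diversity).
        return (v - v.mean()) / v.std()

    M = np.array([[1, 1, 1, 1],        # a diversified region
                  [1, 1, 0, 0],
                  [0, 0, 1, 0],        # specialized regions
                  [0, 0, 0, 1]], float)
    print(np.round(eci(M), 3))
    ```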

  18. Quantifying and Reducing Light Pollution

    Science.gov (United States)

    Gokhale, Vayujeet; Caples, David; Goins, Jordan; Herdman, Ashley; Pankey, Steven; Wren, Emily

    2018-06-01

    We describe the current level of light pollution in and around Kirksville, Missouri and around Anderson Mesa near Flagstaff, Arizona. We quantify the amount of light that is projected up towards the sky, instead of the ground, using Unihedron sky quality meters installed at various locations. We also present results from DSLR photometry of several standard stars, and compare the photometric quality of the data collected at locations with varying levels of light pollution. Presently, light fixture shields and ‘warm-colored’ lights are being installed on Truman State University’s campus in order to reduce light pollution. We discuss the experimental procedure we use to test the effectiveness of the different light fixture shields in a controlled setting inside the Del and Norma Robison Planetarium. Apart from negatively affecting the quality of the night sky for astronomers, light pollution adversely affects migratory patterns of some animals and sleep patterns in humans, increases our carbon footprint, and wastes resources and money. This problem threatens to get particularly acute with the increasing use of outdoor LED lamps. We conclude with a call to action to all professional and amateur astronomers to act against the growing nuisance of light pollution.

  19. Quantifying meniscal kinematics in dogs.

    Science.gov (United States)

    Park, Brian H; Banks, Scott A; Pozzi, Antonio

    2017-11-06

    The dog has been used extensively as an experimental model to study meniscal treatments such as meniscectomy, meniscal repair, transplantation, and regeneration. However, there is very little information on meniscal kinematics in the dog. This study used MR imaging to quantify in vitro meniscal kinematics in loaded dog knees in four distinct poses: extension, flexion, internal, and external rotation. A new method was used to track the meniscal poses along the convex and posteriorly tilted tibial plateau. Meniscal displacements were large, displacing 13.5 and 13.7 mm posteriorly on average for the lateral and medial menisci during flexion (p = 0.90). The medial anterior horn and lateral posterior horns were the most mobile structures, showing average translations of 15.9 and 15.1 mm, respectively. Canine menisci are highly mobile and exhibit movements that correlate closely with the relative tibiofemoral positions. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res.

  20. Quantifying the invasiveness of species

    Directory of Open Access Journals (Sweden)

    Robert Colautti

    2014-04-01

    Full Text Available The success of invasive species has been explained by two contrasting but non-exclusive views: (i) intrinsic factors make some species inherently good invaders; (ii) species become invasive as a result of extrinsic ecological and genetic influences such as release from natural enemies, hybridization or other novel ecological and evolutionary interactions. These viewpoints are rarely distinguished but hinge on distinct mechanisms leading to different management scenarios. To improve tests of these hypotheses of invasion success we introduce a simple mathematical framework to quantify the invasiveness of species along two axes: (i) interspecific differences in performance among native and introduced species within a region, and (ii) intraspecific differences between populations of a species in its native and introduced ranges. Applying these equations to a sample dataset of occurrences of 1,416 plant species across Europe, Argentina, and South Africa, we found that many species are common in their native range but become rare following introduction; only a few introduced species become more common. Biogeographical factors limiting spread (e.g. biotic resistance, time of invasion) therefore appear more common than those promoting invasion (e.g. enemy release). Invasiveness, as measured by occurrence data, is better explained by inter-specific variation in invasion potential than biogeographical changes in performance. We discuss how applying these comparisons to more detailed performance data would improve hypothesis testing in invasion biology and potentially lead to more efficient management strategies.

  1. Integrated cosmological probes: concordance quantified

    Energy Technology Data Exchange (ETDEWEB)

    Nicola, Andrina; Amara, Adam; Refregier, Alexandre, E-mail: andrina.nicola@phys.ethz.ch, E-mail: adam.amara@phys.ethz.ch, E-mail: alexandre.refregier@phys.ethz.ch [Department of Physics, ETH Zürich, Wolfgang-Pauli-Strasse 27, CH-8093 Zürich (Switzerland)

    2017-10-01

    Assessing the consistency of parameter constraints derived from different cosmological probes is an important way to test the validity of the underlying cosmological model. In an earlier work [1], we computed constraints on cosmological parameters for ΛCDM from an integrated analysis of CMB temperature anisotropies and CMB lensing from Planck, galaxy clustering and weak lensing from SDSS, weak lensing from DES SV as well as Type Ia supernovae and Hubble parameter measurements. In this work, we extend this analysis and quantify the concordance between the derived constraints and those derived by the Planck Collaboration as well as WMAP9, SPT and ACT. As a measure for consistency, we use the Surprise statistic [2], which is based on the relative entropy. In the framework of a flat ΛCDM cosmological model, we find all data sets to be consistent with one another at a level of less than 1σ. We highlight that the relative entropy is sensitive to inconsistencies in the models that are used in different parts of the analysis. In particular, inconsistent assumptions for the neutrino mass break its invariance on the parameter choice. When consistent model assumptions are used, the data sets considered in this work all agree with each other and ΛCDM, without evidence for tensions.
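
    For Gaussian-approximated posteriors, the relative entropy on which the Surprise statistic is built has a closed form; the sketch below evaluates it for two hypothetical two-parameter constraints and is not the paper's integrated pipeline:

    ```python
    import numpy as np

    def kl_gaussian(mu0, cov0, mu1, cov1):
        """Relative entropy D(N0 || N1) in nats for multivariate normal posteriors."""
        k = len(mu0)
        inv1 = np.linalg.inv(cov1)
        d = mu1 - mu0
        return 0.5 * (np.trace(inv1 @ cov0) + d @ inv1 @ d - k
                      + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

    # Toy two-parameter constraints (e.g. Omega_m, sigma_8) from two probes.
    mu_a, cov_a = np.array([0.30, 0.80]), np.diag([0.02**2, 0.03**2])
    mu_b, cov_b = np.array([0.31, 0.79]), np.diag([0.015**2, 0.025**2])
    print(f"D(A||B) = {kl_gaussian(mu_a, cov_a, mu_b, cov_b):.3f} nats")
    ```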

  2. Neural basis for generalized quantifier comprehension.

    Science.gov (United States)

    McMillan, Corey T; Clark, Robin; Moore, Peachie; Devita, Christian; Grossman, Murray

    2005-01-01

    Generalized quantifiers like "all cars" are semantically well understood, yet we know little about their neural representation. Our model of quantifier processing includes a numerosity device, operations that combine number elements and working memory. Semantic theory posits two types of quantifiers: first-order quantifiers identify a number state (e.g. "at least 3") and higher-order quantifiers additionally require maintaining a number state actively in working memory for comparison with another state (e.g. "less than half"). We used BOLD fMRI to test the hypothesis that all quantifiers recruit inferior parietal cortex associated with numerosity, while only higher-order quantifiers recruit prefrontal cortex associated with executive resources like working memory. Our findings showed that first-order and higher-order quantifiers both recruit right inferior parietal cortex, suggesting that a numerosity component contributes to quantifier comprehension. Moreover, only probes of higher-order quantifiers recruited right dorsolateral prefrontal cortex, suggesting involvement of executive resources like working memory. We also observed activation of thalamus and anterior cingulate that may be associated with selective attention. Our findings are consistent with a large-scale neural network centered in frontal and parietal cortex that supports comprehension of generalized quantifiers.

  3. Quantifying the web browser ecosystem.

    Science.gov (United States)

    Ferdman, Sela; Minkov, Einat; Bekkerman, Ron; Gefen, David

    2017-01-01

    Contrary to the assumption that web browsers are designed to support the user, an examination of 900,000 distinct PCs shows that web browsers comprise a complex ecosystem with millions of addons collaborating and competing with each other. It is possible for addons to "sneak in" through third party installations or to get "kicked out" by their competitors without user involvement. This study examines that ecosystem quantitatively by constructing a large-scale graph with nodes corresponding to users, addons, and words (terms) that describe addon functionality. Analyzing addon interactions at user level using the Personalized PageRank (PPR) random walk measure shows that the graph demonstrates ecological resilience. Adapting the PPR model to analyzing the browser ecosystem at the level of addon manufacturer, the study shows that some addon companies are in symbiosis and others clash with each other as shown by analyzing the behavior of 18 prominent addon manufacturers. Results may herald insight on how other evolving internet ecosystems may behave, and suggest a methodology for measuring this behavior. Specifically, applying such a methodology could transform the addon market.
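
    Personalized PageRank can be sketched as a random walk with restart on the user-addon graph; the tiny graph and damping value below are placeholders, not the study's data:

    ```python
    import numpy as np

    def personalized_pagerank(A, restart, alpha=0.15, iters=100):
        """Random walk with restart: p = alpha * restart + (1 - alpha) * W^T p,
        where W is the row-normalized adjacency matrix A."""
        W = A / A.sum(axis=1, keepdims=True)
        p = restart.copy()
        for _ in range(iters):
            p = alpha * restart + (1.0 - alpha) * (W.T @ p)
        return p

    # Tiny undirected user-addon graph: nodes 0-1 are users, 2-4 are addons.
    A = np.array([[0, 0, 1, 1, 0],
                  [0, 0, 0, 1, 1],
                  [1, 0, 0, 0, 0],
                  [1, 1, 0, 0, 0],
                  [0, 1, 0, 0, 0]], float)
    restart = np.zeros(5); restart[0] = 1.0     # personalize on user 0
    print(np.round(personalized_pagerank(A, restart), 3))
    ```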

  5. Quantifying over-activity in bipolar and schizophrenia patients in a human open field Paradigm

    OpenAIRE

    Perry, William; Minassian, Arpi; Henry, Brook; Kincaid, Meegin; Young, Jared W.; Geyer, Mark A.

    2010-01-01

    It has been suggested that a cardinal symptom of mania is over-activity and exaggerated goal-directed behavior. Nevertheless, few attempts have been made to quantify this behavior objectively in a laboratory environment. Having a methodology to assess over-activity reliably might be useful in distinguishing manic bipolar disorder (BD) from schizophrenia (SCZ) during highly activated states. In the current study, quantifiable measures of object-interaction were assessed using a multivariate ...

  6. Quantifying social development in autism.

    Science.gov (United States)

    Volkmar, F R; Carter, A; Sparrow, S S; Cicchetti, D V

    1993-05-01

    This study was concerned with the development of quantitative measures of social development in autism. Multiple regression equations predicting social, communicative, and daily living skills on the Vineland Adaptive Behavior Scales were derived from a large, normative sample and applied to groups of autistic and nonautistic, developmentally disordered children. Predictive models included either mental or chronological age and other relevant variables. Social skills in the autistic group were more than two standard deviations below those predicted by their mental age; an index derived from the ratio of actual to predicted social skills correctly classified 94% of the autistic and 92% of the nonautistic, developmentally disordered cases. The findings are consistent with the idea that social disturbance is central in the definition of autism. The approach used in this study has potential advantages for providing more precise measures of social development in autism.

  7. Quantifying Information Flow During Emergencies

    Science.gov (United States)

    Gao, Liang; Song, Chaoming; Gao, Ziyou; Barabási, Albert-László; Bagrow, James P.; Wang, Dashun

    2014-02-01

    Recent advances on human dynamics have focused on the normal patterns of human activities, with the quantitative understanding of human behavior under extreme events remaining a crucial missing chapter. This has a wide array of potential applications, ranging from emergency response and detection to traffic control and management. Previous studies have shown that human communications are both temporally and spatially localized following the onset of emergencies, indicating that social propagation is a primary means to propagate situational awareness. We study real anomalous events using country-wide mobile phone data, finding that information flow during emergencies is dominated by repeated communications. We further demonstrate that the observed communication patterns cannot be explained by inherent reciprocity in social networks, and are universal across different demographics.

  9. Quantifying Urban Groundwater in Environmental Field Observatories

    Science.gov (United States)

    Welty, C.; Miller, A. J.; Belt, K.; Smith, J. A.; Band, L. E.; Groffman, P.; Scanlon, T.; Warner, J.; Ryan, R. J.; Yeskis, D.; McGuire, M. P.

    2006-12-01

    Despite the growing footprint of urban landscapes and their impacts on hydrologic and biogeochemical cycles, comprehensive field studies of urban water budgets are few. The cumulative effects of urban infrastructure (buildings, roads, culverts, storm drains, detention ponds, leaking water supply and wastewater pipe networks) on temporal and spatial patterns of groundwater stores, fluxes, and flowpaths are poorly understood. The goal of this project is to develop expertise and analytical tools for urban groundwater systems that will inform future environmental observatory planning and that can be shared with research teams working in urban environments elsewhere. The work plan for this project draws on a robust set of information resources in Maryland provided by ongoing monitoring efforts of the Baltimore Ecosystem Study (BES), USGS, and the U.S. Forest Service working together with university scientists and engineers from multiple institutions. A key concern is to bridge the gap between small-scale intensive field studies and larger-scale and longer-term hydrologic patterns using synoptic field surveys, remote sensing, numerical modeling, data mining and visualization tools. Using the urban water budget as a unifying theme, we are working toward estimating the various elements of the budget in order to quantify the influence of urban infrastructure on groundwater. Efforts include: (1) comparison of base flow behavior from stream gauges in a nested set of watersheds at four different spatial scales from 0.8 to 171 km2, with diverse patterns of impervious cover and urban infrastructure; (2) synoptic survey of well water levels to characterize the regional water table; (3) use of airborne thermal infrared imagery to identify locations of groundwater seepage into streams across a range of urban development patterns; (4) use of seepage transects and tracer tests to quantify the spatial pattern of groundwater fluxes to the drainage network in selected subwatersheds; (5

  10. Quantifying forecast quality of IT business value

    NARCIS (Netherlands)

    Eveleens, J.L.; van der Pas, M.; Verhoef, C.

    2012-01-01

    This article discusses how to quantify the forecasting quality of IT business value. We address a common economic indicator often used to determine the business value of project proposals, the Net Present Value (NPV). To quantify the forecasting quality of IT business value, we develop a generalized

  11. Bare quantifier fronting as contrastive topicalization

    Directory of Open Access Journals (Sweden)

    Ion Giurgea

    2015-11-01

    Full Text Available I argue that indefinites (in particular bare quantifiers such as ‘something’, ‘somebody’, etc.), which are neither existentially presupposed nor in the restriction of a quantifier over situations, can undergo topicalization in a number of Romance languages (Catalan, Italian, Romanian, Spanish), but only if the sentence contains “verum” focus, i.e. focus on a high degree of certainty of the sentence. I analyze these indefinites as contrastive topics, using Büring’s (1999) theory (where the term ‘S-topic’ is used for what I call ‘contrastive topic’). I propose that the topic is evaluated in relation to a scalar set including generalized quantifiers such as {λP ∃x P(x), λP MANYx P(x), λP MOSTx P(x), λP ∀x P(x)} or {λP ∃x P(x), λP P(a), λP P(b), …}, and that the contrastive topic is the weakest generalized quantifier in this set. The verum focus, which is part of the “comment” that co-occurs with the “Topic”, introduces a set of alternatives including degrees of certainty of the assertion. The speaker asserts that his claim is certainly true or highly probable, contrasting it with stronger claims for which the degree of probability is unknown. This explains the observation that in downward entailing contexts, the fronted quantified DPs are headed by ‘all’ or ‘many’, whereas ‘some’, small numbers or ‘at least n’ appear in upward entailing contexts. Unlike other cases of non-specific topics, which are property topics, these are quantifier topics: the topic part is a generalized quantifier, the comment is a property of generalized quantifiers. This explains the narrow scope of the fronted quantified DP.

  12. Quantify Risk to Manage Cost and Schedule

    National Research Council Canada - National Science Library

    Raymond, Fred

    1999-01-01

    Too many projects suffer from unachievable budget and schedule goals, caused by unrealistic estimates and the failure to quantify and communicate the uncertainty of these estimates to managers and sponsoring executives...

  13. Quantifying drug-protein binding in vivo

    International Nuclear Information System (INIS)

    Buchholz, B; Bench, G; Keating III, G; Palmblad, M; Vogel, J; Grant, P G; Hillegonds, D

    2004-01-01

    Accelerator mass spectrometry (AMS) provides precise quantitation of isotope labeled compounds that are bound to biological macromolecules such as DNA or proteins. The sensitivity is high enough to allow for sub-pharmacological ("micro-") dosing to determine macromolecular targets without inducing toxicities or altering the system under study, whether it is healthy or diseased. We demonstrated an application of AMS in quantifying the physiologic effects of one dosed chemical compound upon the binding level of another compound in vivo at sub-toxic doses [4]. We are using tissues left from this study to develop protocols for quantifying specific binding to isolated and identified proteins. We also developed a new technique to quantify nanogram to milligram amounts of isolated protein at precisions that are comparable to those for quantifying the bound compound by AMS.

  14. New frontiers of quantified self 3

    DEFF Research Database (Denmark)

    Rapp, Amon; Cena, Federica; Kay, Judy

    2017-01-01

    The Quantified Self (QS) field needs to start thinking about how situated needs may affect the use of self-tracking technologies. In this workshop we will focus on the idiosyncrasies of specific categories of users…

  15. Quantifying space, understanding minds: A visual summary approach

    Directory of Open Access Journals (Sweden)

    Mark Simpson

    2017-06-01

    Full Text Available This paper presents an illustrated, validated taxonomy of research that compares spatial measures to human behavior. Spatial measures quantify the spatial characteristics of environments, such as the centrality of intersections in a street network or the accessibility of a room in a building from all the other rooms. While spatial measures have been of interest to spatial sciences, they are also of importance in the behavioral sciences for use in modeling human behavior. A high correlation between values for spatial measures and specific behaviors can provide insights into an environment's legibility, and contribute to a deeper understanding of human spatial cognition. Research in this area takes place in several domains, which makes a full understanding of existing literature difficult. To address this challenge, we adopt a visual summary approach. Literature is analyzed, and recurring topics are identified and validated with independent inter-rater agreement tasks in order to create a robust taxonomy for spatial measures and human behavior. The taxonomy is then illustrated with a visual representation that allows for at-a-glance visual access to the content of individual research papers in a corpus. A public web interface has been created that allows interested researchers to add to the database and create visual summaries for their research papers using our taxonomy.

  16. Quantifying graininess of glossy food products

    DEFF Research Database (Denmark)

    Møller, Flemming; Carstensen, Jens Michael

    The sensory quality of yoghurt can be altered when changing the milk composition or processing conditions. Part of the sensory quality may be assessed visually. It is described how a non-contact method for quantifying surface gloss and grains in yoghurt can be made. It was found that the standard...

  17. Quantifying antimicrobial resistance at veal calf farms

    NARCIS (Netherlands)

    Bosman, A.B.; Wagenaar, J.A.; Stegeman, A.; Vernooij, H.; Mevius, D.J.

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From

  18. QS Spiral: Visualizing Periodic Quantified Self Data

    DEFF Research Database (Denmark)

    Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann

    2013-01-01

    In this paper we propose an interactive visualization technique QS Spiral that aims to capture the periodic properties of quantified self data and let the user explore those recurring patterns. The approach is based on time-series data visualized as a spiral structure. The interactivity includes ...

  19. Quantifying recontamination through factory environments - a review

    NARCIS (Netherlands)

    Asselt-den Aantrekker, van E.D.; Boom, R.M.; Zwietering, M.H.; Schothorst, van M.

    2003-01-01

    Recontamination of food products can be the origin of foodborne illnesses and should therefore be included in quantitative microbial risk assessment (MRA) studies. In order to do this, recontamination should be quantified using predictive models. This paper gives an overview of the relevant

  20. Quantifying quantum coherence with quantum Fisher information.

    Science.gov (United States)

    Feng, X N; Wei, L F

    2017-11-14

    Quantum coherence is one of the old but always important concepts in quantum mechanics, and it is now regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has only recently received attention (see, e.g., Baumgratz et al., PRL 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under the typical incoherent operations and convexity under mixing of quantum states. Differing from most of the purely axiomatic methods, quantifying quantum coherence by QFI is experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparison with other previously proposed quantifying methods.
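
    As a rough illustration of the QFI-based quantifier described above, the following sketch evaluates the standard eigen-decomposition formula F_Q = 2 Σ_ij (λi − λj)²/(λi + λj) |⟨i|H|j⟩|² for a single qubit under phase damping. It is a minimal numpy rendering of that general formula, not the authors' code, and the damping parametrization is assumed for illustration.

        import numpy as np

        def qfi(rho, H):
            # Quantum Fisher information via the eigen-decomposition formula:
            # F_Q = 2 * sum_{i,j} (l_i - l_j)^2 / (l_i + l_j) * |<i|H|j>|^2
            vals, vecs = np.linalg.eigh(rho)
            Hij = vecs.conj().T @ H @ vecs  # generator in the eigenbasis of rho
            F = 0.0
            for i in range(len(vals)):
                for j in range(len(vals)):
                    s = vals[i] + vals[j]
                    if s > 1e-12:
                        F += 2 * (vals[i] - vals[j]) ** 2 / s * abs(Hij[i, j]) ** 2
            return F

        sz_half = np.diag([0.5, -0.5])             # phase-rotation generator
        plus = np.array([[0.5, 0.5], [0.5, 0.5]])  # |+><+|, maximally coherent

        for p in (0.0, 0.5, 0.9):                  # phase-damping strength
            rho = plus.copy()
            rho[0, 1] *= 1 - p
            rho[1, 0] *= 1 - p
            print(f"p={p:.1f}  F_Q={qfi(rho, sz_half):.3f}")  # decays with p

    For the undamped state |+⟩ this prints F_Q = 1, the pure-state value 4·Var(H); the QFI then decreases monotonically with the damping strength, mirroring the loss of coherence.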

  1. Interbank exposures: quantifying the risk of contagion

    OpenAIRE

    C. H. Furfine

    1999-01-01

    This paper examines the likelihood that failure of one bank would cause the subsequent collapse of a large number of other banks. Using unique data on interbank payment flows, the magnitude of bilateral federal funds exposures is quantified. These exposures are used to simulate the impact of various failure scenarios, and the risk of contagion is found to be economically small.
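
    A stylized version of the failure-scenario simulation can be written as a fixed-point iteration: a bank fails once its exposures to already-failed banks exceed its capital, and failures propagate until no new bank fails. The network, capital levels, and failure rule below are invented for illustration, not Furfine's calibrated data.

        import numpy as np

        def contagion(exposures, capital, initially_failed):
            # exposures[j, i]: amount bank j stands to lose if bank i fails.
            # Propagate failures until a fixed point is reached.
            failed = set(initially_failed)
            changed = True
            while changed:
                changed = False
                for j in range(len(capital)):
                    if j in failed:
                        continue
                    loss = sum(exposures[j, i] for i in failed)
                    if loss > capital[j]:  # losses exceed bank j's capital
                        failed.add(j)
                        changed = True
            return failed

        rng = np.random.default_rng(9)
        n = 10
        exposures = rng.uniform(0, 5, (n, n)) * (rng.random((n, n)) < 0.3)
        capital = rng.uniform(5, 15, n)
        print(sorted(contagion(exposures, capital, {0})))  # cascade from bank 0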

  2. Quantifying Productivity Gains from Foreign Investment

    NARCIS (Netherlands)

    C. Fons-Rosen (Christian); S. Kalemli-Ozcan (Sebnem); B.E. Sorensen (Bent); C. Villegas-Sanchez (Carolina)

    2013-01-01

    textabstractWe quantify the causal effect of foreign investment on total factor productivity (TFP) using a new global firm-level database. Our identification strategy relies on exploiting the difference in the amount of foreign investment by financial and industrial investors and simultaneously

  3. Power Curve Measurements, quantify the production increase

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    The purpose of this report is to quantify the production increase of a given turbine with respect to another given turbine. The methodology used is the “side by side” comparison method provided by the client. This method involves the use of two neighboring turbines and it is based...

  4. Quantifying capital goods for waste landfilling

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Stentsøe, Steen; Willumsen, Hans Christian

    2013-01-01

    Materials and energy used for construction of a hill-type landfill of 4 million m3 were quantified in detail. The landfill is engineered with a liner and leachate collections system, as well as a gas collection and control system. Gravel and clay were the most common materials used, amounting...

  5. Quantifying interspecific coagulation efficiency of phytoplankton

    DEFF Research Database (Denmark)

    Hansen, J.L.S.; Kiørboe, Thomas

    1997-01-01

    ...nordenskjoeldii. Mutual coagulation between Skeletonema costatum and the non-sticky cells of Ditylum brightwellii also proceeded with half the efficiency of S. costatum alone. The latex beads were suitable to be used as 'standard particles' to quantify the ability of phytoplankton to prime aggregation...

  6. New frontiers of quantified self 2

    DEFF Research Database (Denmark)

    Rapp, Amon; Cena, Federica; Kay, Judy

    2016-01-01

    While the Quantified Self (QS) community is described in terms of "self-knowledge through numbers," people are increasingly demanding value and meaning. In this workshop we aim to refocus the QS debate on the value of data for providing new services....

  7. Quantifying temporal ventriloquism in audiovisual synchrony perception

    NARCIS (Netherlands)

    Kuling, I.A.; Kohlrausch, A.G.; Juola, J.F.

    2013-01-01

    The integration of visual and auditory inputs in the human brain works properly only if the components are perceived in close temporal proximity. In the present study, we quantified cross-modal interactions in the human brain for audiovisual stimuli with temporal asynchronies, using a paradigm from

  8. Reliability-How to Quantify and Improve?

    Indian Academy of Sciences (India)

    Reliability - How to Quantify and Improve? - Improving the Health of Products. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 5, Issue 5, May 2000, pp 55-63.

  9. Bosonic behavior of entangled fermions

    DEFF Research Database (Denmark)

    C. Tichy, Malte; Alexander Bouvrie, Peter; Mølmer, Klaus

    2012-01-01

    Two bound, entangled fermions form a composite boson, which can be treated as an elementary boson as long as the Pauli principle does not affect the behavior of many such composite bosons. The departure from ideal bosonic behavior is quantified by the normalization ratio of multi-composite-boson states...

  10. A masking index for quantifying hidden glitches

    OpenAIRE

    Berti-Equille, Laure; Loh, J. M.; Dasu, T.

    2015-01-01

    Data glitches are errors in a dataset. They are complex entities that often span multiple attributes and records. When they co-occur in data, the presence of one type of glitch can hinder the detection of another type of glitch. This phenomenon is called masking. In this paper, we define two important types of masking and propose a novel, statistically rigorous indicator called masking index for quantifying the hidden glitches. We outline four cases of masking: outliers masked by missing valu...

  11. How are the catastrophical risks quantifiable

    International Nuclear Information System (INIS)

    Chakraborty, S.

    1985-01-01

    For the assessment and evaluation of industrial risks, the question must be asked: how are catastrophic risks quantifiable? Typical real catastrophic risks and risk assessments based on modelling assumptions have been placed against each other in order to put the risks into proper perspective. However, society is risk-averse when there is a catastrophic potential of severe accidents in a large-scale industrial facility, even though the probability of occurrence is extremely low. (orig.) [de]

  12. Quantifying Distributional Model Risk via Optimal Transport

    OpenAIRE

    Blanchet, Jose; Murthy, Karthyek R. A.

    2016-01-01

    This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality, in particular, we provide examples involving path dependent expectations of stochastic processes. Our approach consists in computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms ...

  13. Quantifying Anthropogenic Stress on Groundwater Resources

    OpenAIRE

    Ashraf, Batool; AghaKouchak, Amir; Alizadeh, Amin; Mousavi Baygi, Mohammad; R. Moftakhari, Hamed; Mirchi, Ali; Anjileli, Hassan; Madani, Kaveh

    2017-01-01

    This study explores a general framework for quantifying anthropogenic influences on groundwater budget based on normalized human outflow (hout) and inflow (hin). The framework is useful for sustainability assessment of groundwater systems and allows investigating the effects of different human water abstraction scenarios on the overall aquifer regime (e.g., depleted, natural flow-dominated, and human flow-dominated). We apply this approach to selected regions in the USA, Germany and Iran to e...

  14. Quantifying Diffuse Contamination: Method and Application to Pb in Soil.

    Science.gov (United States)

    Fabian, Karl; Reimann, Clemens; de Caritat, Patrice

    2017-06-20

    A new method for detecting and quantifying diffuse contamination at the continental to regional scale is based on the analysis of cumulative distribution functions (CDFs). It uses cumulative probability (CP) plots for spatially representative data sets, preferably containing >1000 determinations. Simulations demonstrate how different types of contamination influence elemental CDFs of different sample media. It is found that diffuse contamination is characterized by a distinctive shift of the low-concentration end of the distribution of the studied element in its CP plot. Diffuse contamination can be detected and quantified via either (1) comparing the distribution of the contaminating element to that of an element with a geochemically comparable behavior but no contamination source (e.g., Pb vs Rb), or (2) comparing the topsoil distribution of an element to the distribution of the same element in subsoil samples from the same area, taking soil-forming processes into consideration. Both procedures are demonstrated for geochemical soil data sets from Europe, Australia, and the U.S.A. Several different data sets from Europe deliver comparable results at different scales. Diffuse Pb contamination in surface soil is estimated with this approach, which is independent of specific contamination sources and can be used to efficiently monitor diffuse contamination at the continental to regional scale.
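
    One way to operationalize comparison (1) above is sketched below: contrast the lower quantiles of a possibly contaminated element (Pb) with those of a geochemically similar but uncontaminated reference element (Rb). The synthetic data, the chosen quantile, and the log-scale shift statistic are illustrative assumptions, not the authors' exact procedure.

        import numpy as np

        def low_end_shift(contaminant, reference, q=0.10):
            # Shift of the low-concentration end (q-th quantile, log10 scale)
            # of the contaminant relative to the uncontaminated reference.
            return (np.quantile(np.log10(contaminant), q)
                    - np.quantile(np.log10(reference), q))

        rng = np.random.default_rng(0)
        rb = 10 ** rng.normal(1.8, 0.3, 2000)          # lognormal background
        pb = 10 ** rng.normal(1.8, 0.3, 2000) + 15.0   # background + diffuse add-on
        print(f"low-end shift: {low_end_shift(pb, rb):.2f} log10 units")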

  15. Quantifying the relationship between financial news and the stock market.

    Science.gov (United States)

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-12-20

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock, both on the day before the news is released and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.
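
    The reported relationship is, at its core, a pair of Pearson correlations between a news-mentions series and a trading-volume series at two alignments. The sketch below reproduces that computation on synthetic data in which volume co-moves with same-day and next-day mentions; all series and coefficients are invented.

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(1)
        n = 250                                  # one hypothetical trading year
        news = rng.poisson(5, n).astype(float)   # daily mentions of one company
        next_day = np.r_[news[1:], news.mean()]  # tomorrow's mentions
        volume = 1e6 + 8e4 * news + 4e4 * next_day + rng.normal(0, 1e5, n)

        r_same, _ = pearsonr(news, volume)             # same-day correlation
        r_before, _ = pearsonr(news[1:], volume[:-1])  # volume day before news
        print(f"same-day r = {r_same:.2f}, day-before r = {r_before:.2f}")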

  16. National survey describing and quantifying students with communication needs.

    Science.gov (United States)

    Andzik, Natalie R; Schaefer, John M; Nichols, Robert T; Chung, Yun-Ching

    2018-01-01

    Research literature has yet to quantify and describe how students with complex communication needs are supported in the classroom and how special educators are being prepared to offer support. This study sought out special educators to complete a survey about their students with complex communication needs. Over 4,000 teachers representing 50 states reported on the communicative and behavioral characteristics of 15,643 students. Teachers described the training they have received and the instructional approaches they used. The majority of students were reported to use speech as their primary communication mode. Over half of the students utilizing augmentative and alternative communication (AAC) were reported to have non-proficient communication. Teacher training varied across respondents, as did the supports they used for these students in the classroom. Nationally, the majority of students with disabilities who use AAC are not communicating proficiently. Implications and recommendations are discussed.

  17. Quantifying entanglement in two-mode Gaussian states

    Science.gov (United States)

    Tserkis, Spyros; Ralph, Timothy C.

    2017-12-01

    Entangled two-mode Gaussian states are a key resource for quantum information technologies such as teleportation, quantum cryptography, and quantum computation, so quantification of Gaussian entanglement is an important problem. Entanglement of formation is unanimously considered a proper measure of quantum correlations, but for arbitrary two-mode Gaussian states no analytical form is currently known. In contrast, logarithmic negativity is a measure that is straightforward to calculate and so has been adopted by most researchers, even though it is a less faithful quantifier. In this work, we derive an analytical lower bound for entanglement of formation of generic two-mode Gaussian states, which becomes tight for symmetric states and for states with balanced correlations. We define simple expressions for entanglement of formation in physically relevant situations and use these to illustrate the problematic behavior of logarithmic negativity, which can lead to spurious conclusions.

  18. Certain Verbs Are Syntactically Explicit Quantifiers

    Directory of Open Access Journals (Sweden)

    Anna Szabolcsi

    2010-12-01

    Full Text Available Quantification over individuals, times, and worlds can in principle be made explicit in the syntax of the object language, or left to the semantics and spelled out in the meta-language. The traditional view is that quantification over individuals is syntactically explicit, whereas quantification over times and worlds is not. But a growing body of literature proposes a uniform treatment. This paper examines the scopal interaction of aspectual raising verbs (begin), modals (can), and intensional raising verbs (threaten) with quantificational subjects in Shupamem, Dutch, and English. It appears that aspectual raising verbs and at least modals may undergo the same kind of overt or covert scope-changing operations as nominal quantifiers; the case of intensional raising verbs is less clear. Scope interaction is thus shown to be a new potential diagnostic of object-linguistic quantification, and the similarity in the scope behavior of nominal and verbal quantifiers supports the grammatical plausibility of ontological symmetry, explored in Schlenker (2006).

    References: Ben-Shalom, D. 1996. Semantic Trees. Ph.D. thesis, UCLA. Bittner, M. 1993. Case, Scope, and Binding. Dordrecht: Reidel. Cresswell, M. 1990. Entities and Indices. Dordrecht: Kluwer. Cresti, D. 1995. 'Extraction and reconstruction'. Natural Language Semantics 3: 79–122. http://dx.doi.org/10.1007/BF01252885 Curry, H. B. & Feys, R. 1958. Combinatory Logic I. Dordrecht: North-Holland. Dowty, D. R. 1988. 'Type raising, functional composition, and non-constituent conjunction'. In Richard T. Oehrle, Emmon W. Bach & Deirdre Wheeler (eds.), Categorial Grammars and Natural Language Structures, 153–197. Dordrecht: Reidel. Fox, D. 2002. 'On Logical Form'. In Randall Hendrick (ed.), Minimalist Syntax, 82–124. Oxford: Blackwell. Gallin, D. 1975. Intensional and Higher-Order Modal Logic: With Applications to Montague Semantics. Amsterdam: North-Holland; New York: American Elsevier.

  19. Quantifier spreading: children misled by ostensive cues

    Directory of Open Access Journals (Sweden)

    Katalin É. Kiss

    2017-04-01

    Full Text Available This paper calls attention to a methodological problem of acquisition experiments. It shows that the economy of the stimulus employed in child language experiments may lend an increased ostensive effect to the message communicated to the child. Thus, when the visual stimulus in a sentence-picture matching task is a minimal model abstracting away from the details of the situation, children often regard all the elements of the stimulus as ostensive clues to be represented in the corresponding sentence. The use of such minimal stimuli is mistaken when the experiment aims to test whether or not a certain element of the stimulus is relevant for the linguistic representation or interpretation. The paper illustrates this point with an experiment involving quantifier spreading. It is claimed that children find a universally quantified sentence like 'Every girl is riding a bicycle' to be a false description of a picture showing three girls riding bicycles and a solo bicycle because they are misled to believe that all the elements in the visual stimulus are relevant, hence all of them are to be represented by the corresponding linguistic description. When the iconic drawings were replaced by photos taken in a natural environment rich in accidental details, the occurrence of quantifier spreading was radically reduced. It is shown that an extra object in the visual stimulus can lead to the rejection of the sentence also in the case of sentences involving no quantification, which gives further support to the claim that the source of the problem is not (or not only) the grammatical or cognitive difficulty of quantification but the unintended ostensive effect of the extra object. This article is part of the special collection: Acquisition of Quantification.

  20. Characterization of autoregressive processes using entropic quantifiers

    Science.gov (United States)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
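
    The Bandt–Pompe symbolization underlying such entropic quantifiers is compact enough to sketch directly. The following generic textbook implementation (not the authors' code) computes normalized permutation entropy and shows it responding to autocorrelation strength: near 1 for white noise, lower for a strongly autocorrelated AR(1) process.

        import numpy as np
        from math import factorial
        from itertools import permutations

        def permutation_entropy(x, m=3, tau=1):
            # Normalized Bandt-Pompe permutation entropy of order m:
            # histogram the ordinal patterns of m-point windows.
            counts = {p: 0 for p in permutations(range(m))}
            for i in range(len(x) - (m - 1) * tau):
                window = x[i:i + m * tau:tau]
                pattern = tuple(int(r) for r in np.argsort(window))
                counts[pattern] += 1
            p = np.array([c for c in counts.values() if c > 0], dtype=float)
            p /= p.sum()
            return float(-(p * np.log(p)).sum() / np.log(factorial(m)))

        rng = np.random.default_rng(6)
        white = rng.normal(size=5000)
        ar1 = np.zeros(5000)
        for i in range(1, len(ar1)):            # AR(1) with coefficient 0.9
            ar1[i] = 0.9 * ar1[i - 1] + rng.normal()
        print(permutation_entropy(white), permutation_entropy(ar1))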

  1. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    Science.gov (United States)

    Richie, Megan; Josephson, S Andrew

    2018-01-01

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained
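
    The core comparison reported above (mean probability judgments for Version A versus Version B) amounts to an independent-samples t test. The toy sketch below redoes that computation on simulated judgments drawn from normal distributions with the paper's reported means and standard deviations; the sample sizes are invented, so the resulting t and p values are only illustrative.

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(8)
        # Simulated probability judgments for the two instrument versions
        version_a = rng.normal(9.56, 3.75, 900)   # wording encouraging bias
        version_b = rng.normal(8.98, 3.76, 900)   # neutral wording

        t, p = ttest_ind(version_a, version_b)    # bias = mean(A) - mean(B)
        print(f"t = {t:.2f}, p = {p:.4f}")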

  2. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
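
    In the spirit of the survival-analysis methodology above, the sketch below fits Kaplan–Meier survival curves to session times under two hypothetical service conditions, using the lifelines package (an assumed dependency; the paper does not name a library). Shorter session survival under the degraded condition would be read as lower user satisfaction.

        import numpy as np
        from lifelines import KaplanMeierFitter  # assumed third-party package

        rng = np.random.default_rng(2)
        # Hypothetical session durations in minutes; all sessions observed to end
        good = rng.exponential(45, 500)       # good service quality
        degraded = rng.exponential(20, 500)   # degraded service quality

        kmf = KaplanMeierFitter()
        kmf.fit(good, label="good")
        print("median session (good):", round(kmf.median_survival_time_, 1))
        kmf.fit(degraded, label="degraded")
        print("median session (degraded):", round(kmf.median_survival_time_, 1))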

  3. Quantifying the efficiency of river regulation

    Directory of Open Access Journals (Sweden)

    R. Rödel

    2005-01-01

    Full Text Available Dam-affected hydrologic time series give rise to uncertainties when they are used for calibrating large-scale hydrologic models or for analysing runoff records. It is therefore necessary to identify and to quantify the impact of impoundments on runoff time series. Two different approaches were employed. The first, classic approach compares the volume of the dams that are located upstream from a station with the annual discharge. The catchment areas of the stations are calculated and then related to geo-referenced dam attributes. The paper introduces a data set of geo-referenced dams linked with 677 gauging stations in Europe. Second, the intensity of the impoundment impact on runoff time series can be quantified more exactly and directly when long-term runoff records are available. Dams cause a change in the variability of flow regimes. This effect can be measured using a linear single-storage model. The dam-caused storage change ΔS can be assessed through the volume of the emptying process between two flow regimes. As an example, the storage change ΔS is calculated for regulated long-term series of the Luleälven in northern Sweden.
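
    One plausible way to operationalize the linear single-storage idea is sketched below: the storage constant k of a linear reservoir (S = kQ, so recessions decay as Q0·e^(−t/k)) is fitted to recession limbs before and after regulation, and a storage change is approximated from the change in k. The recession data, mean flow, and the ΔS approximation are illustrative assumptions, not the paper's exact procedure.

        import numpy as np

        def recession_constant(q):
            # Fit ln Q(t) = ln Q0 - t/k on a recession limb (daily steps);
            # k is the storage constant of a linear single reservoir.
            t = np.arange(len(q))
            slope, _ = np.polyfit(t, np.log(q), 1)
            return -1.0 / slope

        # Hypothetical recession limbs (m^3/s), natural vs regulated regime
        q_nat = 100 * np.exp(-np.arange(30) / 12.0)   # k around 12 days
        q_reg = 100 * np.exp(-np.arange(30) / 40.0)   # dam smooths the recession

        k_nat = recession_constant(q_nat)
        k_reg = recession_constant(q_reg)
        q_mean = 50.0                                  # assumed mean flow, m^3/s
        dS = (k_reg - k_nat) * q_mean * 86400          # storage change, m^3
        print(f"k_nat={k_nat:.1f} d, k_reg={k_reg:.1f} d, dS={dS:.3e} m^3")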

  4. Quantifying meta-correlations in financial markets

    Science.gov (United States)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether the mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information and quantification of the index leverage effect, and have implications for risk management, portfolio optimization, and the increased stability of financial markets.
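
    A minimal rendering of the meta-correlation computation, on synthetic returns rather than the DJIA data: mean pairwise correlation is computed in a rolling window and then correlated with the contemporaneous index return. The one-factor return model and the window length are assumptions for illustration.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(3)
        market = rng.normal(0, 0.01, 1000)          # common market factor
        loadings = rng.uniform(0.5, 1.5, 30)        # 30 synthetic stocks
        returns = pd.DataFrame(market[:, None] * loadings
                               + rng.normal(0, 0.01, (1000, 30)))
        index_ret = returns.mean(axis=1)            # equal-weight "index"

        window = 22                                 # about one trading month
        mean_corr = []
        for end in range(window, len(returns)):
            c = returns.iloc[end - window:end].corr().to_numpy()
            mean_corr.append(c[np.triu_indices_from(c, k=1)].mean())
        mean_corr = pd.Series(mean_corr, index=returns.index[window:])

        # Meta-correlation: correlation of mean correlation with index return
        print(mean_corr.corr(index_ret.loc[mean_corr.index]))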

  5. Quantifying climate risk - the starting point

    International Nuclear Information System (INIS)

    Fairweather, Helen; Luo, Qunying; Liu, De Li; Wiles, Perry

    2007-01-01

    Full text: All natural systems have evolved to their current state as a result, inter alia, of the climate in which they developed. Similarly, man-made systems (such as agricultural production) have developed to suit the climate experienced over the last 100 or so years. The capacity of different systems to adapt to changes in climate that are outside those that have been experienced previously is largely unknown. This results in considerable uncertainty when predicting climate change impacts. However, it is possible to quantify the relative probabilities of a range of potential impacts of climate change. Quantifying current climate risks is an effective starting point for analysing the probable impacts of future climate change and guiding the selection of appropriate adaptation strategies. For a farming system to be viable within the current climate, its profitability must be sustained and, therefore, possible adaptation strategies need to be tested for continued viability in a changed climate. The methodology outlined in this paper examines historical patterns of key climate variables (rainfall and temperature) across the season and their influence on the productivity of wheat growing in NSW. This analysis is used to identify the time of year at which the system is most vulnerable to climate variation, within the constraints of the current climate. Wheat yield is used as a measure of productivity, which is also assumed to be a surrogate for profitability. A time series of wheat yields is sorted into ascending order and categorised into five percentile groupings (i.e. 20th, 40th, 60th and 80th percentiles) for each shire across NSW (~100 years). Five time series of climate data (which are aggregated daily data from the years in each percentile) are analysed to determine the period that provides the greatest climate risk to the production system. Once this period has been determined, this risk is quantified in terms of the degree of separation of the time series

  6. Quantifying the topology of porous structures

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, J. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    Computerized x-ray tomography, with microscopic resolution, has been used to volumetrically visualize the evolution of porosity in a ceramic matrix composite during processing. The topological variables describing the porosity have been measured. The evolution of the porosity exhibits critical scaling behavior near final consolidation, and appears to be independent of the structure (universality).

  7. Quantifying selectivity in spectrophotometric multicomponent analysis

    NARCIS (Netherlands)

    Faber, N.M.; Ferre, J.; Boque, R.; Kalivas, J.H.

    2003-01-01

    According to the latest recommendation of the International Union of Pure and Applied Chemistry, "selectivity refers to the extent to which the method can be used to determine particular analytes in mixtures or matrices without interferences from other components of similar behavior". Because of the

  8. How to quantify conduits in wood?

    Science.gov (United States)

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and a wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environment parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.

  9. Message passing for quantified Boolean formulas

    International Nuclear Information System (INIS)

    Zhang, Pan; Ramezanpour, Abolfazl; Zecchina, Riccardo; Zdeborová, Lenka

    2012-01-01

    We introduce two types of message passing algorithms for quantified Boolean formulas (QBF). The first type is a message passing based heuristics that can prove unsatisfiability of the QBF by assigning the universal variables in such a way that the remaining formula is unsatisfiable. In the second type, we use message passing to guide branching heuristics of a Davis–Putnam–Logemann–Loveland (DPLL) complete solver. Numerical experiments show that on random QBFs our branching heuristics give robust exponential efficiency gain with respect to state-of-the-art solvers. We also manage to solve some previously unsolved benchmarks from the QBFLIB library. Apart from this, our study sheds light on using message passing in small systems and as subroutines in complete solvers

  10. Quantifying decoherence in continuous variable systems

    Energy Technology Data Exchange (ETDEWEB)

    Serafini, A [Dipartimento di Fisica 'ER Caianiello', Universita di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)]; Paris, M G A [Dipartimento di Fisica and INFM, Universita di Milano, Milan (Italy)]; Illuminati, F [Dipartimento di Fisica 'ER Caianiello', Universita di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)]; De Siena, S [Dipartimento di Fisica 'ER Caianiello', Universita di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)]

    2005-04-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some non-classicality indicators in phase space, and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wavepackets. (review article)

  11. Quantifying decoherence in continuous variable systems

    International Nuclear Information System (INIS)

    Serafini, A; Paris, M G A; Illuminati, F; De Siena, S

    2005-01-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some non-classicality indicators in phase space, and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wavepackets. (review article)

  12. Crowdsourcing for quantifying transcripts: An exploratory study.

    Science.gov (United States)

    Azzam, Tarek; Harman, Elena

    2016-02-01

    This exploratory study attempts to demonstrate the potential utility of crowdsourcing as a supplemental technique for quantifying transcribed interviews. Crowdsourcing is the harnessing of the abilities of many people to complete a specific task or a set of tasks. In this study multiple samples of crowdsourced individuals were asked to rate and select supporting quotes from two different transcripts. The findings indicate that the different crowdsourced samples produced nearly identical ratings of the transcripts, and were able to consistently select the same supporting text from the transcripts. These findings suggest that crowdsourcing, with further development, can potentially be used as a mixed method tool to offer a supplemental perspective on transcribed interviews. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Quantifying capital goods for waste incineration

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Riber, C.; Christensen, Thomas Højlund

    2013-01-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO2 per tonne of waste combusted.

  14. Pendulum Underwater - An Approach for Quantifying Viscosity

    Science.gov (United States)

    Leme, José Costa; Oliveira, Agostinho

    2017-12-01

    The purpose of the experiment presented in this paper is to quantify the viscosity of a liquid. Viscous effects are important in the flow of fluids in pipes, in the bloodstream, in the lubrication of engine parts, and in many other situations. In the present paper, the authors explore the oscillations of a physical pendulum in the form of a long and lightweight wire that carries a ball at its lower end, which is totally immersed in water, so as to determine the water viscosity. The system used represents a viscous damped pendulum and we tried different theoretical models to describe it. The experimental part of the present paper is based on a very simple and low-cost image capturing apparatus that can easily be replicated in a physics classroom. Data on the pendulum's amplitude as a function of time were acquired using digital video analysis with the open source software Tracker.
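
    A sketch of one analysis route for such data: fit an exponentially damped cosine to the displacement digitized from video (e.g., with Tracker), then convert the damping rate to a viscosity under a linear, Stokes-type drag assumption. The synthetic data and the assumed ball mass and radius are invented, and, as the authors note for the real system, other drag models may be required.

        import numpy as np
        from scipy.optimize import curve_fit

        def damped(t, A, gamma, omega, phi):
            # Underdamped pendulum displacement with damping rate gamma
            return A * np.exp(-gamma * t) * np.cos(omega * t + phi)

        # Synthetic "video-tracked" displacement samples with noise
        t = np.linspace(0, 20, 400)
        rng = np.random.default_rng(4)
        x = damped(t, 0.10, 0.12, 3.1, 0.0) + rng.normal(0, 0.002, t.size)

        (A, gamma, omega, phi), _ = curve_fit(damped, t, x, p0=(0.1, 0.1, 3.0, 0.0))

        m, r = 0.05, 0.01          # assumed ball mass (kg) and radius (m)
        # Linear drag: b = 2*m*gamma; Stokes' law for a sphere: b = 6*pi*eta*r
        eta = 2 * m * gamma / (6 * np.pi * r)
        print(f"gamma = {gamma:.3f} 1/s, eta = {eta:.3f} Pa*s (linear-drag estimate)")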

  15. Quantifying gait patterns in Parkinson's disease

    Science.gov (United States)

    Romero, Mónica; Atehortúa, Angélica; Romero, Eduardo

    2017-11-01

    Parkinson's disease (PD) is constituted by a set of motor symptoms, namely tremor, rigidity, and bradykinesia, which are usually described but not quantified. This work proposes an objective characterization of PD gait patterns by approximating the single stance phase as a single grounded pendulum. This model estimates, from gait data, the force generated by the gait during single support. This force describes the motion pattern for different stages of the disease. The model was validated using recorded videos of 8 young control subjects, 10 old control subjects and 10 subjects with Parkinson's disease in different stages. The estimated force showed differences among stages of Parkinson's disease, with a decrease of the estimated force for the advanced stages of the illness.

  16. Quantifying brain microstructure with diffusion MRI

    DEFF Research Database (Denmark)

    Novikov, Dmitry S.; Jespersen, Sune N.; Kiselev, Valerij G.

    2016-01-01

    ...the potential to quantify the relevant length scales for neuronal tissue, such as the packing correlation length for neuronal fibers, the degree of neuronal beading, and compartment sizes. The second avenue corresponds to the long-time limit, when the observed signal can be approximated as a sum of multiple non-exchanging anisotropic Gaussian components. Here the challenge lies in parameter estimation and in resolving its hidden degeneracies. The third avenue employs multiple diffusion encoding techniques, able to access information not contained in the conventional diffusion propagator. We conclude with our outlook on future research directions which can open exciting possibilities for developing markers of pathology and development based on methods of studying mesoscopic transport in disordered systems.

  17. Quantifying Temporal Genomic Erosion in Endangered Species.

    Science.gov (United States)

    Díez-Del-Molino, David; Sánchez-Barreiro, Fatima; Barnes, Ian; Gilbert, M Thomas P; Dalén, Love

    2018-03-01

    Many species have undergone dramatic population size declines over the past centuries. Although stochastic genetic processes during and after such declines are thought to elevate the risk of extinction, comparative analyses of genomic data from several endangered species suggest little concordance between genome-wide diversity and current population sizes. This is likely because species-specific life-history traits and ancient bottlenecks overshadow the genetic effect of recent demographic declines. Therefore, we advocate that temporal sampling of genomic data provides a more accurate approach to quantify genetic threats in endangered species. Specifically, genomic data from predecline museum specimens will provide valuable baseline data that enable accurate estimation of recent decreases in genome-wide diversity, increases in inbreeding levels, and accumulation of deleterious genetic variation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  19. Quantifying the evolution of individual scientific impact.

    Science.gov (United States)

    Sinatra, Roberta; Wang, Dashun; Deville, Pierre; Song, Chaoming; Barabási, Albert-László

    2016-11-04

    Despite the frequent use of numerous quantitative indicators to gauge the professional impact of a scientist, little is known about how scientific impact emerges and evolves in time. Here, we quantify the changes in impact and productivity throughout a career in science, finding that impact, as measured by influential publications, is distributed randomly within a scientist's sequence of publications. This random-impact rule allows us to formulate a stochastic model that uncouples the effects of productivity, individual ability, and luck and unveils the existence of universal patterns governing the emergence of scientific success. The model assigns a unique individual parameter Q to each scientist, which is stable during a career, and it accurately predicts the evolution of a scientist's impact, from the h-index to cumulative citations, and independent recognitions, such as prizes. Copyright © 2016, American Association for the Advancement of Science.

  20. Quantifying creativity: can measures span the spectrum?

    Science.gov (United States)

    Simonton, Dean Keith

    2012-03-01

    Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they also differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum.

  1. A compact clinical instrument for quantifying suppression.

    Science.gov (United States)

    Black, Joanne M; Thompson, Benjamin; Maehara, Goro; Hess, Robert F

    2011-02-01

    We describe a compact and convenient clinical apparatus for the measurement of suppression, based on a previously reported laboratory-based approach. In addition, we report and validate a novel, rapid psychophysical method for measuring suppression using this apparatus, which makes the technique more applicable to clinical practice. By using a Z800 dual pro head-mounted display driven by a MAC laptop, we provide dichoptic stimulation. Global motion stimuli composed of arrays of moving dots are presented to each eye. One set of dots moves in a coherent direction (termed signal) whereas another set of dots moves in random directions (termed noise). To quantify performance, we measure the signal/noise ratio corresponding to a direction-discrimination threshold. Suppression is quantified by assessing the extent to which it matters which eye sees the signal and which eye sees the noise. A space-saving, head-mounted display using current video technology offers an ideal solution for clinical practice. In addition, our optimized psychophysical method provided results that were in agreement with those produced using the original technique. We made measures of suppression on a group of nine adult amblyopic participants using this apparatus with both the original and new psychophysical paradigms. All participants had measurable suppression, ranging from mild to severe. The two different psychophysical methods gave a strong correlation for the strength of suppression (rho = -0.83, p = 0.006). Combining the new apparatus and new psychophysical method creates a convenient and rapid technique for parametric measurement of interocular suppression. In addition, this apparatus constitutes an ideal platform for observers with suppression to combine information between their eyes in a way similar to binocularly normal people. This provides a convenient way for clinicians to implement the newly proposed binocular treatment of amblyopia that is based on antisuppression training.

  2. Quantifying capital goods for waste incineration

    International Nuclear Information System (INIS)

    Brogaard, L.K.; Riber, C.; Christensen, T.H.

    2013-01-01

    Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. - Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO2 per tonne of waste combusted

  3. Quantifying structural states of soft mudrocks

    Science.gov (United States)

    Li, B.; Wong, R. C. K.

    2016-05-01

    In this paper, a cm model is proposed to quantify structural states of soft mudrocks, which are dependent on clay fractions and porosities. Physical properties of natural and reconstituted soft mudrock samples are used to derive the two parameters in the cm model. With the cm model, a simplified homogenization approach is proposed to estimate geomechanical properties and fabric orientation distributions of soft mudrocks based on the mixture theory. Soft mudrocks are treated as a mixture of nonclay minerals and clay-water composites. Nonclay minerals have a high stiffness and serve as a structural framework of mudrocks when they have a high volume fraction. Clay-water composites occupy the void space among nonclay minerals and serve as an in-fill matrix. With the increase of the volume fraction of clay-water composites, there is a transition in the structural state from framework-supported to matrix-supported. The decreases in shear strength and pore size, as well as the increases in compressibility and fabric anisotropy, are quantitatively related to this transition. The new homogenization approach based on the proposed cm model yields better performance evaluation than common effective-medium modeling approaches because the interactions among nonclay minerals and clay-water composites are considered. With wireline logging data, the cm model is applied to quantify the structural states of Colorado shale formations at different depths in the Cold Lake area, Alberta, Canada. Key geomechanical parameters are estimated based on the proposed homogenization approach, and the critical intervals with low-strength shale formations are identified.

  4. Quantifying multiscale inefficiency in electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Uritskaya, Olga Y. [Department of Economics, University of Calgary, Calgary, Alberta T2N 1N4, and Department of Economics and Management, St. Petersburg Polytechnic University, St. Petersburg (Russian Federation)]; Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)]

    2008-11-15

    One of the basic features of efficient markets is the absence of correlations between price increments over any time scale leading to random walk-type behavior of prices. In this paper, we propose a new approach for measuring deviations from the efficient market state based on an analysis of scale-dependent fractal exponent characterizing correlations at different time scales. The approach is applied to two electricity markets, Alberta and Mid Columbia (Mid-C), as well as to the AECO Alberta natural gas market (for purposes of providing a comparison between storable and non-storable commodities). We show that price fluctuations in all studied markets are not efficient, with electricity prices exhibiting complex multiscale correlated behavior not captured by monofractal methods used in previous studies. (author)
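
    The scale-dependent exponent idea can be illustrated with a plain detrended fluctuation analysis (DFA-1), in which the local slope of log F(s) versus log s plays the role of the fractal exponent; uncorrelated (efficient-market-like) increments give a slope near 0.5 at every scale. This is a generic textbook implementation, not the authors' estimator.

        import numpy as np

        def dfa(x, scales):
            # DFA-1: integrate the series, split into windows of size s,
            # remove a linear trend per window, return fluctuation F(s).
            y = np.cumsum(x - np.mean(x))
            F = []
            for s in scales:
                n = len(y) // s
                segments = y[:n * s].reshape(n, s)
                t = np.arange(s)
                resid = []
                for seg in segments:
                    coef = np.polyfit(t, seg, 1)
                    resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                F.append(np.sqrt(np.mean(resid)))
            return np.array(F)

        rng = np.random.default_rng(5)
        increments = rng.normal(size=4096)     # surrogate price increments
        scales = np.array([8, 16, 32, 64, 128, 256])
        F = dfa(increments, scales)
        alpha = np.diff(np.log(F)) / np.diff(np.log(scales))
        print(np.round(alpha, 2))              # near 0.5 at all scales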

  5. Quantifying multiscale inefficiency in electricity markets

    International Nuclear Information System (INIS)

    Uritskaya, Olga Y.; Serletis, Apostolos

    2008-01-01

    One of the basic features of efficient markets is the absence of correlations between price increments over any time scale leading to random walk-type behavior of prices. In this paper, we propose a new approach for measuring deviations from the efficient market state based on an analysis of scale-dependent fractal exponent characterizing correlations at different time scales. The approach is applied to two electricity markets, Alberta and Mid Columbia (Mid-C), as well as to the AECO Alberta natural gas market (for purposes of providing a comparison between storable and non-storable commodities). We show that price fluctuations in all studied markets are not efficient, with electricity prices exhibiting complex multiscale correlated behavior not captured by monofractal methods used in previous studies. (author)

  6. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general publication, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from the actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  7. A recursive formulation for repeated agency with history dependence

    OpenAIRE

    Ana Fernandes; Christopher Phelan

    1999-01-01

    There is now an extensive literature regarding the efficient design of incentive mechanisms in dynamic environments. In this literature, there are no exogenous links across time periods because either privately observed shocks are assumed time independent or past private actions have no influence on the realizations of current variables. The absence of exogenous links across time periods ensures that preferences over continuation contracts are common knowledge, making the definition of incent...

  8. QUANTIFYING LIFE STYLE IMPACT ON LIFESPAN

    Directory of Open Access Journals (Sweden)

    Antonello Lorenzini

    2012-12-01

    Full Text Available A healthy diet, physical activity and avoiding dangerous habits such as smoking are effective ways of increasing health and lifespan. Although a significant portion of the world's population still suffers from malnutrition, especially children, the most common causes of death in the world today are non-communicable diseases. Overweight and obesity significantly increase the relative risk for the most relevant non-communicable diseases: cardiovascular disease, type II diabetes and some cancers. Childhood overweight also seems to increase the likelihood of disease in adulthood through epigenetic mechanisms. This worrisome trend, now termed "globesity", will deeply impact society unless preventive strategies are put into effect. Researchers of the basic biology of aging have clearly established that animals with short lifespans live longer when their diet is calorie restricted. Although similar experiments carried out on rhesus monkeys, a longer-lived species more closely related to humans, yielded mixed results, overall the available scientific data suggest that keeping the body mass index in the "normal" range increases the chances of living a longer, healthier life. This can be successfully achieved both by maintaining a healthy diet and by engaging in physical activity. In this review we will try to quantify the relative impact of life style choices on lifespan.

  9. Quantifying and Mapping Global Data Poverty.

    Science.gov (United States)

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinfomatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.
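
    The ingredients listed above suggest a simple composite-index construction, sketched below with hypothetical country rows and an equal-weight aggregation. The actual weighting, scaling, and input datasets of the published DPI may differ; this is only a schematic of the idea.

        import pandas as pd

        # Hypothetical indicators, each pre-scaled to [0, 1] (higher = better
        # access), mirroring the DPI ingredients named in the abstract
        df = pd.DataFrame({
            "internet_speed":   [0.90, 0.30, 0.10],
            "internet_users":   [0.95, 0.40, 0.15],
            "mobile_coverage":  [1.00, 0.70, 0.40],
            "higher_education": [0.85, 0.50, 0.20],
        }, index=["country_A", "country_B", "country_C"])

        # Equal-weight mean, inverted so that higher values = more data poverty
        dpi = 1.0 - df.mean(axis=1)
        print(dpi.round(2))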

  10. Stimfit: quantifying electrophysiological data with Python

    Directory of Open Access Journals (Sweden)

    Segundo Jose Guzman

    2014-02-01

    Full Text Available Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals.
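
    As a minimal sketch of the kind of event-detection routine the program automates, the following uses plain NumPy (not Stimfit's own API) to flag threshold crossings on a synthetic current trace; the sampling interval, noise level and threshold rule are all assumptions.

        # Threshold-crossing event detection on a synthetic trace (NumPy only).
        import numpy as np

        rng = np.random.default_rng(0)
        dt = 0.1                                 # sampling interval in ms (assumed)
        t = np.arange(0, 1000, dt)
        trace = rng.normal(0.0, 2.0, t.size)     # noisy baseline, in pA (assumed)
        trace[2000:2050] -= 40 * np.exp(-np.arange(50) * 0.1)  # one synthetic inward event

        threshold = trace.mean() - 4 * trace.std()   # simple amplitude criterion
        onsets = np.where((trace[1:] < threshold) & (trace[:-1] >= threshold))[0] + 1
        print(f"Detected {onsets.size} event onset(s) at t = {t[onsets]} ms")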

  11. Quantifying capital goods for waste incineration.

    Science.gov (United States)

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted. Copyright © 2013 Elsevier Ltd. All rights reserved.
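
    As a back-of-envelope check on the reported 7-14 kg CO2 per tonne range, the sketch below divides an assumed embodied-carbon total by lifetime throughput; all inputs are round placeholder numbers, not the plant data from the study.

        # Embodied CO2 per tonne of waste combusted (assumed round numbers).
        embodied_co2_t = 30_000        # t CO2 embodied in construction (assumed)
        capacity_t_per_yr = 150_000    # mid-range annual throughput (assumed)
        lifetime_yr = 20               # assumed service life

        kg_co2_per_t = embodied_co2_t * 1000 / (capacity_t_per_yr * lifetime_yr)
        print(f"{kg_co2_per_t:.1f} kg CO2 per tonne of waste combusted")  # -> 10.0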

  12. Fluorescence imaging to quantify crop residue cover

    Science.gov (United States)

    Daughtry, C. S. T.; Mcmurtrey, J. E., III; Chappelle, E. W.

    1994-01-01

    Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with long-wave ultraviolet (UV) radiation and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band-pass filter. A light colored soil and a dark colored soil were used as background for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies typically were within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.
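
    The pixel-threshold classification described above amounts to counting the fraction of pixels brighter than a cutoff. A sketch on a synthetic 8-bit image, with an assumed gray-level threshold:

        # Fraction of pixels above a fluorescence threshold = residue cover.
        import numpy as np

        rng = np.random.default_rng(0)
        image = rng.integers(0, 40, size=(240, 320))                    # dim soil background
        image[60:120, 80:240] = rng.integers(90, 255, size=(60, 160))   # bright residue patch

        threshold = 70                                  # assumed gray-level cutoff
        residue_cover = (image > threshold).mean()
        print(f"Estimated residue cover: {100 * residue_cover:.1f}%")   # -> 12.5%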

  13. Quantifying Potential Groundwater Recharge In South Texas

    Science.gov (United States)

    Basant, S.; Zhou, Y.; Leite, P. A.; Wilcox, B. P.

    2015-12-01

    Groundwater in South Texas is heavily relied on for human consumption and irrigation of food crops. As in most of the southwestern US, woody encroachment has altered the grassland ecosystems here as well. While brush removal has been widely implemented in Texas with the objective of increasing groundwater recharge, the linkage between vegetation and groundwater recharge in South Texas is still unclear. Studies have been conducted to understand plant-root-water dynamics at the scale of individual plants. However, little work has been done to quantify the changes in soil water and deep percolation at the landscape scale. Modeling water flow through soil profiles can provide an estimate of the total water flowing into deep percolation. These models are especially powerful when parameterized and calibrated with long-term soil water data. In this study we parameterize the HYDRUS soil water model using long-term soil water data collected in Jim Wells County in South Texas. Soil water was measured at 20 cm intervals to a depth of 200 cm. The parameterized model will be used to simulate soil water dynamics under a variety of precipitation regimes ranging from well above normal to severe drought conditions. The results from the model will be compared with the changes in the soil moisture profile observed in response to vegetation cover and treatments from a study in a similar setting. Comparative studies like this can be used to build new, and strengthen existing, hypotheses regarding deep percolation and the role of soil texture and vegetation in groundwater recharge.
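
    HYDRUS itself solves the Richards equation for unsaturated flow; as a much-simplified conceptual stand-in, the single-layer bucket model below shows how deep percolation emerges only once profile storage exceeds field capacity. All parameter values are assumptions.

        # Minimal bucket model: percolation = storage overflow above field capacity.
        field_capacity_mm = 300.0     # water the 200 cm profile can hold (assumed)
        storage_mm = 220.0            # initial profile storage (assumed)

        daily_rain_mm = [0, 25, 60, 0, 10, 80, 0]    # hypothetical wet spell
        daily_et_mm = 4.0                            # constant evapotranspiration (assumed)

        deep_percolation = 0.0
        for rain in daily_rain_mm:
            storage_mm += rain - daily_et_mm
            if storage_mm > field_capacity_mm:
                deep_percolation += storage_mm - field_capacity_mm
                storage_mm = field_capacity_mm
            storage_mm = max(storage_mm, 0.0)

        print(f"Deep percolation over the period: {deep_percolation:.1f} mm")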

  14. Quantifying Anthropogenic Stress on Groundwater Resources.

    Science.gov (United States)

    Ashraf, Batool; AghaKouchak, Amir; Alizadeh, Amin; Mousavi Baygi, Mohammad; R Moftakhari, Hamed; Mirchi, Ali; Anjileli, Hassan; Madani, Kaveh

    2017-10-10

    This study explores a general framework for quantifying anthropogenic influences on the groundwater budget based on normalized human outflow (h_out) and inflow (h_in). The framework is useful for sustainability assessment of groundwater systems and allows investigating the effects of different human water abstraction scenarios on the overall aquifer regime (e.g., depleted, natural flow-dominated, and human flow-dominated). We apply this approach to selected regions in the USA, Germany and Iran to evaluate the current aquifer regime. We subsequently present two scenarios of changes in human water withdrawals and return flow to the system (individually and combined). Results show that approximately one-third of the selected aquifers in the USA, and half of the selected aquifers in Iran, are dominated by human activities, while the selected aquifers in Germany are natural flow-dominated. The scenario analysis also shows that reduced human withdrawals could help change the regime of some aquifers. For instance, in two of the selected USA aquifers, a decrease in anthropogenic influences by ~20% may shift the aquifer from a depleted regime to a natural flow-dominated regime. We specifically highlight a growing threat to the sustainability of groundwater in northwest Iran and California, and the need for more careful assessment and monitoring practices as well as strict regulations to mitigate the negative impacts of groundwater overexploitation.
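
    A sketch of how the normalized inflow and outflow terms might be mapped onto the three regimes named above; the cutoffs and decision rules are illustrative assumptions, as the abstract does not state the framework's exact thresholds.

        # Illustrative regime classification from normalized human flows.
        def aquifer_regime(h_in: float, h_out: float, threshold: float = 0.5) -> str:
            """Classify an aquifer from normalized human inflow/outflow (assumed rules)."""
            net_human = h_out - h_in
            if net_human > threshold:
                return "depleted (human outflow dominates)"
            if max(h_in, h_out) > threshold:
                return "human flow-dominated"
            return "natural flow-dominated"

        for name, h_in, h_out in [("Aquifer 1", 0.1, 0.9),
                                  ("Aquifer 2", 0.6, 0.7),
                                  ("Aquifer 3", 0.1, 0.2)]:
            print(f"{name}: {aquifer_regime(h_in, h_out)}")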

  15. Quantifying Supply Risk at a Cellulosic Biorefinery

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Jason K [Idaho National Laboratory; Jacobson, Jacob Jordan [Idaho National Laboratory; Cafferty, Kara Grace [Idaho National Laboratory; Lamers, Patrick [Idaho National Laboratory; Roni, MD S [Idaho National Laboratory

    2015-03-01

    In order to increase the sustainability and security of the nation’s energy supply, the U.S. Department of Energy through its Bioenergy Technology Office has set a vision for one billion tons of biomass to be processed for renewable energy and bioproducts annually by the year 2030. The Renewable Fuels Standard limits the amount of corn grain that can be used in ethanol conversion sold in the U.S., a limit that has already been reached. Therefore, making the DOE’s vision a reality requires significant growth in the advanced biofuels industry, where currently three cellulosic biorefineries convert cellulosic biomass to ethanol. Risk mitigation is central to growing the industry beyond its infancy to a level necessary to achieve the DOE vision. This paper focuses on reducing the supply risk that faces a firm that owns a cellulosic biorefinery. It uses risk theory and simulation modeling to build a risk assessment model based on causal relationships among the underlying, uncertain, supply-driving variables. Using the model, the paper quantifies the supply risk reduction achieved by converting the supply chain from a conventional supply system (bales and trucks) to an advanced supply system (depots, pellets, and trains). Results imply that the advanced supply system reduces supply system risk, defined as the probability of a unit cost overrun, from 83% in the conventional system to 4% in the advanced system. Reducing cost risk in this nascent industry improves the odds of realizing the desired growth.
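
    A Monte Carlo sketch of supply risk as the probability of a unit cost overrun: two cost components are sampled from assumed distributions, and the advanced system is represented simply by lower variability. The distributions and the cost target are placeholders, not the calibrated inputs behind the paper's 83% and 4% figures.

        # P(unit cost > target) under assumed cost-component distributions.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        target = 80.0  # assumed delivered-cost target, $/dry tonne

        def unit_cost(yield_sd, logistics_sd):
            harvest = rng.normal(35, yield_sd, n)        # $/t, yield-driven (assumed)
            logistics = rng.normal(40, logistics_sd, n)  # $/t, transport/storage (assumed)
            return harvest + logistics

        conventional = unit_cost(yield_sd=12, logistics_sd=10)
        advanced = unit_cost(yield_sd=4, logistics_sd=3)   # depots/pellets damp variability

        for label, cost in [("conventional", conventional), ("advanced", advanced)]:
            print(f"{label}: P(cost > ${target}/t) = {np.mean(cost > target):.2f}")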

  16. Quantifying Supply Risk at a Cellulosic Biorefinery

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Jason K.; Jacobson, Jacob J.; Cafferty, Kara G.; Lamers, Patrick; Roni, Mohammad S.

    2015-07-01

    In order to increase the sustainability and security of the nation’s energy supply, the U.S. Department of Energy through its Bioenergy Technology Office has set a vision for one billion tons of biomass to be processed for renewable energy and bioproducts annually by the year 2030. The Renewable Fuels Standard limits the amount of corn grain that can be used in ethanol conversion sold in the U.S., a limit that has already been reached. Therefore, making the DOE’s vision a reality requires significant growth in the advanced biofuels industry, where currently three cellulosic biorefineries convert cellulosic biomass to ethanol. Risk mitigation is central to growing the industry beyond its infancy to a level necessary to achieve the DOE vision. This paper focuses on reducing the supply risk that faces a firm that owns a cellulosic biorefinery. It uses risk theory and simulation modeling to build a risk assessment model based on causal relationships among the underlying, uncertain, supply-driving variables. Using the model, the paper quantifies the supply risk reduction achieved by converting the supply chain from a conventional supply system (bales and trucks) to an advanced supply system (depots, pellets, and trains). Results imply that the advanced supply system reduces supply system risk, defined as the probability of a unit cost overrun, from 83% in the conventional system to 4% in the advanced system. Reducing cost risk in this nascent industry improves the odds of realizing the desired growth.

  17. Quantifying and Mapping Global Data Poverty.

    Directory of Open Access Journals (Sweden)

    Mathias Leidig

    Full Text Available Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  18. Data Used in Quantified Reliability Models

    Science.gov (United States)

    DeMott, Diana; Kleinhammer, Roger K.; Kahn, C. J.

    2014-01-01

    Data is the crux of developing quantitative risk and reliability models; without data there is no quantification. Finding reliability data or failure numbers to quantify fault tree models during the conceptual and design phases is often the quagmire that precludes early decision makers from considering the potential risk drivers that will influence a design. The analyst tasked with addressing system or product reliability depends on the availability of data. But where does that data come from, and what does it really apply to? Commercial industries, government agencies, and other international sources might have data similar to what you are looking for. In general, internal and external technical reports and data based on similar and dissimilar equipment are often the first and only places checked. A common philosophy is "I have a number - that is good enough." But is it? Have you ever considered the difference in reported data from various federal datasets and technical reports when compared to similar sources from national and/or international datasets? Just how well does your data compare? Understanding how the reported data were derived, and interpreting the information and details associated with the data, is as important as the data itself.

  19. Quantifying Information Leakage of Randomized Protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Wasowski, Andrzej; Legay, Axel

    2013-01-01

    The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the use of Markovian processes to model and analyze the information leakage of deterministic and probabilistic systems. We show that this method generalizes the lattice of information approach and is a natural framework for modeling refined attackers capable of observing the internal behavior of the system. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed...
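
    As a toy stand-in for the Markovian analysis, leakage can be computed as the mutual information I(S;O) between a secret S and an observable O given their joint distribution; the joint distribution below is an assumed example.

        # Leakage as mutual information I(S;O) from an assumed joint distribution.
        import numpy as np

        # Joint distribution p(s, o): rows = secret values, cols = observations.
        joint = np.array([[0.30, 0.10],
                          [0.05, 0.55]])

        p_s = joint.sum(axis=1, keepdims=True)   # marginal of the secret
        p_o = joint.sum(axis=0, keepdims=True)   # marginal of the observable

        nz = joint > 0
        leakage_bits = np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_o)[nz]))
        print(f"Information leakage: {leakage_bits:.3f} bits")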

  20. A stochastic approach for quantifying immigrant integration: the Spanish test case

    Science.gov (United States)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous-time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following a diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
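
    The diffusive-versus-ballistic distinction can be read off a scaling exponent: fitting dispersion against density on log-log axes, an exponent near 1 suggests diffusion and one near 2 suggests ballistic motion. A sketch on synthetic series (all data here are assumptions):

        # Fit the scaling exponent a in dispersion ~ density**a.
        import numpy as np

        density = np.linspace(0.01, 0.2, 40)           # immigrant density as "time"
        rng = np.random.default_rng(1)
        social = 0.5 * density ** 1.0 * rng.lognormal(0, 0.05, density.size)    # diffusive-like
        economic = 2.0 * density ** 2.0 * rng.lognormal(0, 0.05, density.size)  # ballistic-like

        for label, series in [("social", social), ("economic", economic)]:
            slope, _ = np.polyfit(np.log(density), np.log(series), 1)
            print(f"{label} quantifier: exponent ~ {slope:.2f}")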

  1. A stochastic approach for quantifying immigrant integration: the Spanish test case

    International Nuclear Information System (INIS)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-01-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999–2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous-time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following a diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build. (paper)

  2. Quantifying over-activity in bipolar and schizophrenia patients in a human open field paradigm.

    Science.gov (United States)

    Perry, William; Minassian, Arpi; Henry, Brook; Kincaid, Meegin; Young, Jared W; Geyer, Mark A

    2010-06-30

    It has been suggested that a cardinal symptom of mania is over-activity and exaggerated goal-directed behavior. Nevertheless, few attempts have been made to quantify this behavior objectively in a laboratory environment. A methodology for reliably assessing over-activity might be useful in distinguishing manic bipolar disorder (BD) from schizophrenia (SCZ) during highly activated states. In the current study, quantifiable measures of object interaction were assessed using a multivariate approach. Additionally, symptom correlates of over-activity were assessed. Patients admitted to an acute care psychiatric hospital for either BD with mania or SCZ (paranoid and non-paranoid subtypes), as well as non-patient comparison (NC) participants, were assessed in an open field setting referred to as the human Behavioral Pattern Monitor (hBPM). Activity and interactions with novel and engaging objects were recorded for 15 min via a concealed video camera and rated for exploratory behavior. Both BD and SCZ patients spent more time near the objects and exhibited more overall walking compared to NC participants. In contrast, BD patients exhibited greater physical contact with objects (number of object interactions and time spent with objects) relative to SCZ patients or NC participants, as well as more perseverative and socially disinhibited behaviors, indicating a unique pattern of over-activity and goal-directed behavior. Further analyses revealed a distinction between SCZ patients according to their subtype. The current study extends our methodology for quantifying exploration and over-activity in a controlled laboratory setting and aids in assessing the overlap and distinguishing characteristics of BD and SCZ.

  3. Quantifying Riverscape Connectivity with Graph Theory

    Science.gov (United States)

    Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.

    2013-12-01

    Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp but there have been uneven efforts across the environmental sciences to actually quantify connectivity. Currently there have only been a few studies reporting quantitative indices of connectivity in river sciences, notably, in the study of avulsion processes. However, the majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have returned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed and in many countries, dams are actually being removed. However, this is not the case in the developing world where hydropower is seen as a key element to low-emissions power-security. For example, several dam projects are envisaged in Himalayan catchments in the next 2 decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology. We then examine the

  4. Quantifying human vitamin kinetics using AMS

    Energy Technology Data Exchange (ETDEWEB)

    Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

    2004-02-19

    Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases by proper consumption throughout youth and adulthood. Neither the target dose nor the target population are available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase is similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable or radio-isotope quantitations consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the most simple models.

  5. Quantifying Sentiment and Influence in Blogspaces

    Energy Technology Data Exchange (ETDEWEB)

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. When considered in totality, a collection of blogs can thus be viewed as a sort of informal collection of mass sentiment and opinion. An obvious topic of interest might be to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses some serious challenges when any meaningful analysis is attempted. Namely, the fact that largely anyone with access to the Internet can author their own blog raises the serious issue of credibility: should some blogs be considered to be more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on almost a constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.
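
    A sketch of one way a per-topic, influence-weighted sentiment measure could be computed over a topic hierarchy; the weighting scheme and the toy data are assumptions, not the paper's exact formalization.

        # Influence-weighted mean sentiment per topic over a tiny hierarchy.
        topics = {"tech": ["phones", "laptops"]}        # assumed two-level hierarchy
        posts = [  # (topic, sentiment in [-1, 1], author influence weight) - all assumed
            ("phones", 0.8, 3.0),
            ("phones", -0.2, 1.0),
            ("laptops", 0.4, 2.0),
        ]

        def weighted_sentiment(topic):
            """Influence-weighted mean sentiment for a topic and its subtopics."""
            leaves = topics.get(topic, [topic])
            scored = [(s, w) for t, s, w in posts if t in leaves]
            total_w = sum(w for _, w in scored)
            return sum(s * w for s, w in scored) / total_w if total_w else 0.0

        for topic in ("phones", "tech"):
            print(f"{topic}: {weighted_sentiment(topic):+.2f}")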

  6. Quantifying antimicrobial resistance at veal calf farms.

    Directory of Open Access Journals (Sweden)

    Angela B Bosman

    Full Text Available This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of testing an isolate resistant between both test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of E. coli isolates tested resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p ≤ 0.05). Pooled samples showed in general lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that within each antimicrobial the various compositions of a pooled sample provided consistent estimates for the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy from the perspective of precision of the estimated levels of resistance and practicality consists of a pooled faecal sample from 20 individual animals, of which

  7. Quantifying the Clinical Significance of Cannabis Withdrawal

    Science.gov (United States)

    Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.

    2012-01-01

    Background and Aims Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Conclusions Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760

  8. Quantifying seasonal velocity at Khumbu Glacier, Nepal

    Science.gov (United States)

    Miles, E.; Quincey, D. J.; Miles, K.; Hubbard, B. P.; Rowan, A. V.

    2017-12-01

    While the low-gradient debris-covered tongues of many Himalayan glaciers exhibit low surface velocities, quantifying ice flow and its variation through time remains a key challenge for studies aimed at determining the long-term evolution of these glaciers. Recent work has suggested that glaciers in the Everest region of Nepal may show seasonal variability in surface velocity, with ice flow peaking during the summer as monsoon precipitation provides hydrological inputs and thus drives changes in subglacial drainage efficiency. However, satellite and aerial observations of glacier velocity during the monsoon are greatly limited due to cloud cover. Those that do exist do not span the period over which the most dynamic changes occur, and consequently short-term (i.e. daily) changes in flow, as well as the evolution of ice dynamics through the monsoon period, remain poorly understood. In this study, we combine field and remote (satellite image) observations to create a multi-temporal, 3D synthesis of ice deformation rates at Khumbu Glacier, Nepal, focused on the 2017 monsoon period. We first determine net annual and seasonal surface displacements for the whole glacier based on Landsat-8 (OLI) panchromatic data (15 m) processed with ImGRAFT. We integrate inclinometer observations from three boreholes drilled by the EverDrill project to determine cumulative deformation at depth, providing a 3D perspective and enabling us to assess the role of basal sliding at each site. We additionally analyze high-frequency on-glacier L1 GNSS data from three sites to characterize variability within surface deformation at sub-seasonal timescales. Finally, each dataset is validated against repeat-dGPS observations at gridded points in the vicinity of the boreholes and GNSS dataloggers. These datasets complement one another to infer thermal regime across the debris-covered ablation area of the glacier, and emphasize the seasonal and spatial variability of ice deformation for glaciers in High

  9. Quantifying the clinical significance of cannabis withdrawal.

    Directory of Open Access Journals (Sweden)

    David J Allsop

    Full Text Available Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes.

  10. Quantifying motion for pancreatic radiotherapy margin calculation

    International Nuclear Information System (INIS)

    Whitfield, Gillian; Jain, Pooja; Green, Melanie; Watkins, Gillian; Henry, Ann; Stratford, Julie; Amer, Ali; Marchant, Thomas; Moore, Christopher; Price, Patricia

    2012-01-01

    Background and purpose: Pancreatic radiotherapy (RT) is limited by uncertain target motion. We quantified 3D patient/organ motion during pancreatic RT and calculated required treatment margins. Materials and methods: Cone-beam computed tomography (CBCT) and orthogonal fluoroscopy images were acquired post-RT delivery from 13 patients with locally advanced pancreatic cancer. Bony setup errors were calculated from CBCT. Inter- and intra-fraction fiducial (clip/seed/stent) motion was determined from CBCT projections and orthogonal fluoroscopy. Results: Using an off-line CBCT correction protocol, systematic (random) setup errors were 2.4 (3.2), 2.0 (1.7) and 3.2 (3.6) mm laterally (left–right), vertically (anterior–posterior) and longitudinally (cranio-caudal), respectively. Fiducial motion varied substantially. Random inter-fractional changes in mean fiducial position were 2.0, 1.6 and 2.6 mm; 95% of intra-fractional peak-to-peak fiducial motion was up to 6.7, 10.1 and 20.6 mm, respectively. Calculated clinical to planning target volume (CTV–PTV) margins were 1.4 cm laterally, 1.4 cm vertically and 3.0 cm longitudinally for 3D conformal RT, reduced to 0.9, 1.0 and 1.8 cm, respectively, if using 4D planning and online setup correction. Conclusions: Commonly used CTV–PTV margins may inadequately account for target motion during pancreatic RT. Our results indicate better immobilisation, individualised allowance for respiratory motion, online setup error correction and 4D planning would improve targeting.
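
    The abstract does not state which margin recipe the authors used; as an assumption, the sketch below applies the widely cited van Herk formula, margin = 2.5*Sigma + 0.7*sigma, to the reported setup errors alone. Fiducial motion would be added on top, which is one reason the paper's margins are larger.

        # van Herk margin from the reported systematic (Sigma) and random
        # (sigma) setup errors; the choice of recipe is an assumption.
        systematic_mm = {"lateral": 2.4, "vertical": 2.0, "longitudinal": 3.2}
        random_mm = {"lateral": 3.2, "vertical": 1.7, "longitudinal": 3.6}

        for axis in systematic_mm:
            margin = 2.5 * systematic_mm[axis] + 0.7 * random_mm[axis]
            print(f"{axis}: setup-only CTV-PTV margin ~ {margin:.1f} mm")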

  11. Quantifying Discrete Fracture Network Connectivity in Hydraulic Fracturing Stimulation

    Science.gov (United States)

    Urbancic, T.; Ardakani, E. P.; Baig, A.

    2017-12-01

    Hydraulic fracture stimulations generally result in microseismicity that is associated with the activation or extension of pre-existing microfractures and discontinuities. Microseismic events acquired under 3D downhole sensor coverage provide accurate event locations outlining hydraulic fracture growth. Combined with source characteristics, these events provide a high quality input for seismic moment tensor inversion and, eventually, for constructing a representative discrete fracture network (DFN). In this study, we investigate the strain and stress state, identified fracture orientations, and DFN connectivity and performance for example stages in a multistage perf-and-plug completion in a North American shale play. We use topology, a concept familiar from many areas of structural geology, to further describe the relationships between the activated fractures and their effectiveness in enhancing permeability. We explore how local perturbations of the stress state lead to the activation of different fracture sets and how that affects DFN interaction and complexity. In particular, we observe that a more heterogeneous stress state shows a higher percentage of sub-horizontal fractures or bedding plane slips. Based on topology, the fractures are evenly distributed from the injection point, with decreasing numbers of connections with distance. The dimensionless measures of connections per branch and connections per line are used to quantify DFN connectivity. In order to connect the concept of connectivity back to productive volume and stimulation efficiency, the connectivity is compared with the character of deformation in the reservoir as deduced from the collective behavior of microseismicity using robustly determined source parameters.
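
    The connections-per-branch and connections-per-line measures can be computed by counting node types in a fracture trace map. The sketch below follows the I/Y/X node-counting convention of Sanderson and Nixon (2015), assuming those formulas apply here; the node counts are hypothetical.

        # Topological connectivity from counts of I (isolated tip), Y (abutment)
        # and X (crossing) nodes; counts are assumed for illustration.
        n_I, n_Y, n_X = 120, 40, 15

        n_lines = (n_I + n_Y) / 2                    # each trace has two end nodes
        n_branches = (n_I + 3 * n_Y + 4 * n_X) / 2   # branch ends counted by node degree

        c_b = (3 * n_Y + 4 * n_X) / n_branches       # connections per branch, in [0, 2]
        c_l = 2 * (n_Y + n_X) / n_lines              # connections per line
        print(f"connections per branch: {c_b:.2f}")
        print(f"connections per line:   {c_l:.2f}")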

  12. POST BEHAVIORAL FINANCE ADOLESCENCE

    Directory of Open Access Journals (Sweden)

    ADRIAN MITROI

    2016-12-01

    Full Text Available The study of behavioral finance combines investigation and expertise from research and practice to build smarter portfolios for individual investors. Understanding cognitive errors and misleading emotions helps drive investors toward their long-term goals of financial prosperity and capital preservation. Ten years ago, behavioral finance was still considered an incipient, adolescent science. The first Nobel Prize in Economics awarded for the study of behavioral economics, in 2002, established the field as a new, respected branch of economics. The 2013 Nobel Prize was awarded to three economists, one of whom is considered among the founders of behavioral finance. As such, we are now witnessing the coming of age of behavioral finance. It is recognized as a science of understanding investors' behaviors and their biased patterns. It applies quantitative finance and provides practical models grounded in a robust understanding of investors' behavior toward financial risk. Financial personality influences investment decisions. Behavioral portfolio construction methods combine classic finance with rigorously quantified psychological metrics and improve models for financial advice, enhancing investors' chances of reaching their lifetime financial goals. Behavioral finance helps in understanding how psychological profiles differ across individuals and how these differences manifest in the investment decision process. This new science has now become a core topic in modern finance.

  13. A Methodological Approach to Quantifying Plyometric Intensity.

    Science.gov (United States)

    Jarvis, Mark M; Graham-Smith, Phil; Comfort, Paul

    2016-09-01

    Jarvis, MM, Graham-Smith, P, and Comfort, P. A methodological approach to quantifying plyometric intensity. J Strength Cond Res 30(9): 2522-2532, 2016. In contrast to other methods of training, the quantification of plyometric exercise intensity is poorly defined. The purpose of this study was to evaluate the suitability of a range of neuromuscular and mechanical variables for describing the intensity of plyometric exercises. Seven male recreationally active subjects performed a series of 7 plyometric exercises. Neuromuscular activity was measured using surface electromyography (SEMG) at vastus lateralis (VL) and biceps femoris (BF). Surface electromyography data were divided into concentric (CON) and eccentric (ECC) phases of movement. Mechanical output was measured by ground reaction forces and processed to provide peak impact ground reaction force (PF), peak eccentric power (PEP), and impulse (IMP). Statistical analysis was conducted to assess the reliability (intraclass correlation coefficient) and sensitivity (smallest detectable difference) of all variables. Mean values of SEMG demonstrated high reliability (r ≥ 0.82), excluding ECC VL during a 40-cm drop jump (r = 0.74). PF, PEP, and IMP demonstrated high reliability (r ≥ 0.85). Statistical power for force variables was excellent (power = 1.0) and good for SEMG (power ≥ 0.86), excluding CON BF (power = 0.57). There was no significant difference (p > 0.05) in CON SEMG between exercises. Eccentric-phase SEMG only distinguished between exercises involving a landing and those that did not (percentage of maximal voluntary isometric contraction [%MVIC]: no landing 65 ± 5, landing 140 ± 8). Peak eccentric power, PF, and IMP all distinguished between exercises. In conclusion, CON neuromuscular activity does not appear to vary when intent is maximal, whereas ECC activity depends on the presence of a landing. Force characteristics provide a reliable and sensitive measure enabling precise description of intensity
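
    A sketch of the reliability and sensitivity statistics named above, computing a one-way intraclass correlation coefficient and the smallest detectable difference (SDD = 1.96*sqrt(2)*SEM) from assumed test-retest data:

        # ICC(1,1) via one-way ANOVA and SDD from assumed repeat trials.
        import numpy as np

        trials = np.array([[2.1, 2.3], [3.0, 2.8], [2.6, 2.7],       # peak force, kN:
                           [3.4, 3.3], [2.9, 3.1], [2.2, 2.1], [3.8, 3.6]])  # 7 subjects x 2 trials

        ms_between = trials.mean(axis=1).var(ddof=1) * trials.shape[1]
        ms_within = ((trials - trials.mean(axis=1, keepdims=True)) ** 2).sum() / trials.shape[0]
        icc = (ms_between - ms_within) / (ms_between + ms_within)    # ICC(1,1) for k = 2 trials

        sem = np.sqrt(ms_within)                 # standard error of measurement
        sdd = 1.96 * np.sqrt(2) * sem            # smallest detectable difference
        print(f"ICC = {icc:.2f}, SDD = {sdd:.2f} kN")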

  14. Quantifying Permafrost Characteristics with DCR-ERT

    Science.gov (United States)

    Schnabel, W.; Trochim, E.; Munk, J.; Kanevskiy, M. Z.; Shur, Y.; Fortier, R.

    2012-12-01

    Geophysical methods are an efficient means of quantifying permafrost characteristics for Arctic road design and engineering. In the Alaskan Arctic, construction and maintenance of roads require accounting for permafrost: ground that remains below 0 degrees C for two or more years. Features such as ice content and temperature are critical for understanding current and future ground conditions for the planning, design and evaluation of engineering applications. This study focused on the proposed Foothills West Transportation Access project corridor, where the purpose is to construct a new all-season road connecting the Dalton Highway to Umiat. Four major areas were chosen that represented a range of conditions, including gravel bars, alluvial plains, tussock tundra (both unburned and burned conditions), high- and low-centered ice-wedge polygons and an active thermokarst feature. Direct-current resistivity using galvanic contact (DCR-ERT) was applied over transects. In conjunction, complementary site data including boreholes, active layer depths, vegetation descriptions and site photographs were obtained. The boreholes provided information on soil morphology, ice texture and gravimetric moisture content. Horizontal and vertical resolutions in the DCR-ERT were varied to determine the presence or absence of ground ice; subsurface heterogeneity; and the depth to groundwater (if present). The four main DCR-ERT configurations used were: 84 electrodes with 2 m spacing; 42 electrodes with 0.5 m spacing; 42 electrodes with 2 m spacing; and 84 electrodes with 1 m spacing. In terms of identifying ground-ice characteristics, the higher-horizontal-resolution DCR-ERT transects, with either 42 or 84 electrodes and 0.5 or 1 m spacing, were best able to differentiate wedge ice. This evaluation is based on a combination of borehole stratigraphy and surface characteristics. Simulated apparent resistivity values for permafrost areas varied from a low of 4582 Ω m to a high of 10034 Ω m. Previous

  15. Quantifying geocode location error using GIS methods

    Directory of Open Access Journals (Sweden)

    Gardner Bennett R

    2007-04-01

    Full Text Available Abstract Background The Metropolitan Atlanta Congenital Defects Program (MACDP) collects maternal address information at the time of delivery for infants and fetuses with birth defects. These addresses have been geocoded by two independent agencies: (1) the Georgia Division of Public Health Office of Health Information and Policy (OHIP) and (2) a commercial vendor. Geographic information system (GIS) methods were used to quantify uncertainty in the two sets of geocodes using orthoimagery and tax parcel datasets. Methods We sampled 599 infants and fetuses with birth defects delivered during 1994–2002 with maternal residence in either Fulton or Gwinnett County. Tax parcel datasets were obtained from the tax assessor's offices of Fulton and Gwinnett County. High-resolution orthoimagery for these counties was acquired from the U.S. Geological Survey. For each of the 599 addresses we attempted to locate the tax parcel corresponding to the maternal address. If the tax parcel was identified, the distance and the angle between the geocode and the residence were calculated. We used simulated data to characterize the impact of geocode location error. In each county 5,000 geocodes were generated and assigned their corresponding Census 2000 tract. Each geocode was then displaced at a random angle by a random distance drawn from the distribution of observed geocode location errors. The census tract of the displaced geocode was determined. We repeated this process 5,000 times and report the percentage of geocodes that resolved into incorrect census tracts. Results Median location error was less than 100 meters for both OHIP and commercial vendor geocodes; the distribution of angles appeared uniform. Median location error was approximately 35% larger in Gwinnett (a suburban county) relative to Fulton (a county with urban and suburban areas). Location error occasionally caused the simulated geocodes to be displaced into incorrect census tracts; the median percentage
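
    The displacement simulation described in the Methods is straightforward to reproduce in outline: perturb each point by a uniform random angle and a distance drawn from an error distribution, then check whether it lands in a different tract. Here a square grid stands in for census tracts, and the lognormal error distribution is an assumption.

        # Displace points by random angle/distance; count tract changes.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 5000
        tract_size_m = 1000.0                        # assumed square-tract width

        x = rng.uniform(0, 20_000, n)                # simulated geocodes (metres)
        y = rng.uniform(0, 20_000, n)
        angle = rng.uniform(0, 2 * np.pi, n)         # uniform angles, as observed
        error_m = rng.lognormal(np.log(80), 0.8, n)  # assumed errors, median ~80 m

        x2 = x + error_m * np.cos(angle)
        y2 = y + error_m * np.sin(angle)

        same_tract = (np.floor(x / tract_size_m) == np.floor(x2 / tract_size_m)) & \
                     (np.floor(y / tract_size_m) == np.floor(y2 / tract_size_m))
        print(f"Displaced into a different tract: {100 * (1 - same_tract.mean()):.1f}%")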

  16. Entropy generation method to quantify thermal comfort

    Science.gov (United States)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    This paper presents a thermodynamic approach to assessing the quality of human-thermal environment interaction and quantifying thermal comfort. The approach involves development of an entropy generation term obtained by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing a mathematical relationship between thermal comfort and entropy generation. Through a logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The output from the simulations, which includes the human thermal responses, and the input data, consisting of the environmental conditions, are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values, where the PMV values are generated by inserting the corresponding air temperatures and vapor pressures used in the computer simulation into the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study
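
    As a toy illustration of the entropy-generation idea, heat Q leaving the skin and absorbed by a cooler environment generates entropy at the rate S_gen = Q*(1/T_env - 1/T_skin); larger imbalances generate more entropy, which the model associates with reduced comfort. Values below are assumed round numbers.

        # Entropy generation for steady heat rejection to the environment.
        Q = 100.0        # metabolic heat rejected, W (assumed)
        T_skin = 307.0   # ~34 degrees C skin temperature, in kelvin (assumed)
        for T_env in (285.0, 295.0, 305.0):
            s_gen = Q * (1.0 / T_env - 1.0 / T_skin)
            print(f"T_env = {T_env:.0f} K: S_gen = {s_gen * 1000:.1f} mW/K")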

  17. Quantifying emission reduction contributions by emerging economies

    Energy Technology Data Exchange (ETDEWEB)

    Moltmann, Sara; Hagemann, Markus; Eisbrenner, Katja; Hoehne, Niklas [Ecofys GmbH, Koeln (Germany); Sterk, Wolfgang; Mersmann, Florian; Ott, Hermann E.; Watanabe, Rie [Wuppertal Institut (Germany)

    2011-04-15

    Further action is needed that goes far beyond what has been agreed so far under the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol to 'prevent dangerous anthropogenic interference with the climate system', the ultimate objective of the UNFCCC. It is beyond question that developed countries (Annex I countries) will have to take a leading role. They will have to commit to substantial emission reductions and financing due to their historical responsibility and their financial capability. However, the stabilisation of the climate system will require global emissions to peak within the next decade and decline to well below half of current levels by the middle of the century. It is hence a global issue and thus depends on the participation of as many countries as possible. This report provides a comparative analysis of the greenhouse gas (GHG) emissions and national climate plans of the major emitting developing countries Brazil, China, India, Mexico, South Africa and South Korea. It includes an overview of emissions and economic development as well as existing national climate change strategies; uses a consistent methodology for estimating emission reduction potentials and the costs of mitigation options; provides an estimate of the reductions to be achieved through the national climate plans; and finally compares the results to the allocation of emission rights according to different global effort-sharing approaches. In addition, the report discusses possible nationally appropriate mitigation actions (NAMAs) the six countries could take based on the analysis of mitigation options. This report is an output of the project 'Proposals for quantifying emission reduction contributions by emerging economies' by Ecofys and the Wuppertal Institute for the Federal Environment Agency in Dessau. It builds upon the earlier joint work 'Proposals for contributions of emerging economies to the climate

  18. Quantifying the impacts of global disasters

    Science.gov (United States)

    Jones, L. M.; Ross, S.; Wilson, R. I.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Hansen, R. A.; Johnson, L. A.; Kirby, S. H.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Thio, H. K.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    The US Geological Survey, National Oceanic and Atmospheric Administration, California Geological Survey, and other entities are developing a Tsunami Scenario, depicting a realistic outcome of a hypothetical but plausible large tsunami originating in the eastern Aleutian Arc, affecting the west coast of the United States, including Alaska and Hawaii. The scenario includes earth-science effects, damage and restoration of the built environment, and social and economic impacts. Like the earlier ShakeOut and ARkStorm disaster scenarios, the purpose of the Tsunami Scenario is to apply science to quantify the impacts of natural disasters in a way that can be used by decision makers in the affected sectors to reduce the potential for loss. Most natural disasters are local. A major hurricane can destroy a city or damage a long swath of coastline while mostly sparing inland areas. The largest earthquake on record caused strong shaking along 1500 km of Chile, but left the capital relatively unscathed. Previous scenarios have used the local nature of disasters to focus interaction with the user community. However, the capacity for global disasters is growing with the interdependency of the global economy. Earthquakes have disrupted global computer chip manufacturing and caused stock market downturns. Tsunamis, however, can be global in their extent and direct impact. Moreover, the vulnerability of seaports to tsunami damage can increase the global consequences. The Tsunami Scenario is trying to capture the widespread effects while maintaining the close interaction with users that has been one of the most successful features of the previous scenarios. The scenario tsunami occurs in the eastern Aleutians with a source similar to the 2011 Tohoku event. Geologic similarities support the argument that a Tohoku-like source is plausible in Alaska. It creates a major nearfield tsunami in the Aleutian arc and peninsula, a moderate tsunami in the US Pacific Northwest, large but not the

  19. Quantifier spreading in child eye movements: A case of the Russian quantifier kazhdyj ‘every'

    Directory of Open Access Journals (Sweden)

    Irina A. Sekerina

    2017-07-01

    Full Text Available Extensive cross-linguistic work has documented that children up to the age of 9–10 make errors when performing a sentence-picture verification task that pairs spoken sentences with the universal quantifier 'every' and pictures with entities in partial one-to-one correspondence. These errors stem from children's difficulties in restricting the domain of a universal quantifier to the appropriate noun phrase and are referred to in the literature as 'quantifier-spreading' ('q'-spreading). We adapted the task to be performed in conjunction with eye-movement recordings using the Visual World Paradigm. Russian-speaking 5-to-6-year-old children (N = 31) listened to sentences like 'Kazhdyj alligator lezhit v vanne' (‘Every alligator is lying in a bathtub’) and viewed pictures with three alligators, each in a bathtub, and two extra empty bathtubs. Non-spreader children (N = 12) were adult-like in their accuracy, whereas 'q'-spreading ones (N = 19) were only 43% correct in interpreting such sentences compared to the control sentences. Eye movements of 'q'-spreading children revealed that more looks to the extra containers (two empty bathtubs) correlated with higher error rates, reflecting the processing pattern of 'q'-spreading. In contrast, more looks to the distractors in control sentences did not lead to errors in interpretation. We argue that 'q'-spreading errors are caused by interference from the extra entities in the visual context, and our results support the processing-difficulty account of the acquisition of quantification. Interference results in cognitive overload as children have to integrate multiple sources of information, i.e., the visual context with its salient extra entities and the spoken sentence in which these entities are mentioned, in real-time processing. This article is part of the special collection: Acquisition of Quantification

  20. Quantifying data worth toward reducing predictive uncertainty

    Science.gov (United States)

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable-density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.
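
    In a linear-Gaussian setting the idea reduces to comparing predictive variance with and without a candidate observation. A minimal sketch, with assumed sensitivities and variances rather than values from the study:

        # Data worth = fractional reduction in predictive variance from one
        # observation, in a linear-Gaussian model with assumed numbers.
        sigma2_p = 1.0                 # prior variance of an uncertain parameter
        g = 2.0                        # sensitivity of the prediction to the parameter
        candidates = {"concentration": 3.0, "temperature": 1.0}  # obs sensitivities (assumed)
        sigma2_obs = 0.5               # measurement-error variance (assumed)

        prior_pred_var = g**2 * sigma2_p
        for name, h in candidates.items():
            post_param_var = 1.0 / (1.0 / sigma2_p + h**2 / sigma2_obs)  # Bayesian update
            post_pred_var = g**2 * post_param_var
            worth = 1.0 - post_pred_var / prior_pred_var
            print(f"{name}: predictive variance reduced by {100 * worth:.0f}%")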

  1. On the contrast between Germanic and Romance negated quantifiers

    OpenAIRE

    Robert Cirillo

    2009-01-01

    Universal quantifiers can be stranded in the manner described by Sportiche (1988), Giusti (1990) and Shlonsky (1991) in both the Romance and Germanic languages, but a negated universal quantifier can only be stranded in the Germanic languages. The goal of this paper is to show that this contrast between the Romance and the Germanic languages can be explained if one adapts the theory of sentential negation in Zeijlstra (2004) to constituent (quantifier) negation. According to Zeijlstra’s theor...

  2. Quantifying commuter exposures to volatile organic compounds

    Science.gov (United States)

    Kayne, Ashleigh

    Motor vehicles can be a predominant source of air pollution in cities. Traffic-related air pollution is often unavoidable for people who live in populous areas. Commuters may have high exposures to traffic-related air pollution as they are close to vehicle tailpipes. Volatile organic compounds (VOCs) are one class of air pollutants of concern because exposure to VOCs carries risk for adverse health effects. Specific VOCs of interest for this work include benzene, toluene, ethylbenzene, and xylenes (BTEX), which are often found in gasoline and combustion products. Although methods exist to measure time-integrated personal exposures to BTEX, there are few practical methods to measure a commuter's time-resolved BTEX exposure, which could identify peak exposures that would be concealed in a time-integrated measurement. This study evaluated the ability of a photoionization detector (PID) to measure commuters' exposure to BTEX using Tenax TA samples as a reference and quantified the difference in BTEX exposure between cyclists and drivers with windows open and closed. To determine the suitability of two measurement methods (PID and Tenax TA) for use in this study, the precision, linearity, and limits of detection (LODs) for both the PID and Tenax TA measurement methods were determined in the laboratory with standard BTEX calibration gases. Volunteers commuted from their homes to their work places by cycling or driving while wearing a personal exposure backpack containing a collocated PID and Tenax TA sampler. Volunteers completed a survey and indicated if the windows in their vehicle were open or closed. Comparing pairs of exposure data from the Tenax TA and PID sampling methods determined the suitability of the PID to measure the BTEX exposures of commuters. The difference between BTEX exposures of cyclists and drivers with windows open and closed in Fort Collins was determined. Both the PID and Tenax TA measurement methods were precise and linear when evaluated in the
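
    The laboratory suitability checks named above (precision, linearity, limits of detection) follow standard analytical practice. Below is a hedged sketch of the linearity and 3-sigma LOD arithmetic; the calibration concentrations, responses, and blank noise are invented, not the study's data.

      # Linear calibration fit plus a 3-sigma limit of detection (LOD).
      import numpy as np

      conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])   # standards, ppb
      resp = np.array([0.4, 4.1, 8.3, 20.6, 41.8, 83.5])     # instrument response

      slope, intercept = np.polyfit(conc, resp, 1)            # least-squares line
      pred = slope * conc + intercept
      r2 = 1 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)

      blank_sd = 0.15                       # std. dev. of repeated blanks (assumed)
      lod = 3 * blank_sd / slope            # common 3-sigma detection limit

      print(f"slope={slope:.3f}, R^2={r2:.4f}, LOD={lod:.2f} ppb")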

  3. Quantifying the impact of relativity and of dispersion interactions on the activation of molecular oxygen promoted by noble metal nanoparticles

    KAUST Repository

    Kanoun, Mohammed; Cavallo, Luigi

    2014-01-01

    an energy barrier close to 20 kcal/mol on Ag38, which decreases to slightly more than 10 kcal/mol on Au38. This behavior is analyzed to quantify the impact of relativity and of dispersion interactions through a comparison of nonrelativistic, scalar

  4. Quantifying the Determinants of Evolutionary Dynamics Leading to Drug Resistance.

    Directory of Open Access Journals (Sweden)

    Guillaume Chevereau

    Full Text Available The emergence of drug resistant pathogens is a serious public health problem. It is a long-standing goal to predict rates of resistance evolution and design optimal treatment strategies accordingly. To this end, it is crucial to reveal the underlying causes of drug-specific differences in the evolutionary dynamics leading to resistance. However, it remains largely unknown why the rates of resistance evolution via spontaneous mutations and the diversity of mutational paths vary substantially between drugs. Here we comprehensively quantify the distribution of fitness effects (DFE) of mutations, a key determinant of evolutionary dynamics, in the presence of eight antibiotics representing the main modes of action. Using precise high-throughput fitness measurements for genome-wide Escherichia coli gene deletion strains, we find that the width of the DFE varies dramatically between antibiotics and, contrary to conventional wisdom, for some drugs the DFE width is lower than in the absence of stress. We show that this previously underappreciated divergence in DFE width among antibiotics is largely caused by their distinct drug-specific dose-response characteristics. Unlike the DFE, the magnitude of the changes in tolerated drug concentration resulting from genome-wide mutations is similar for most drugs but exceptionally small for the antibiotic nitrofurantoin, i.e., mutations generally have considerably smaller resistance effects for nitrofurantoin than for other drugs. A population genetics model predicts that resistance evolution for drugs with this property is severely limited and confined to reproducible mutational paths. We tested this prediction in laboratory evolution experiments using the "morbidostat", a device for evolving bacteria in well-controlled drug environments. Nitrofurantoin resistance indeed evolved extremely slowly via reproducible mutations-an almost paradoxical behavior since this drug causes DNA damage and increases the mutation

  5. Quantifying object and material surface areas in residences

    Energy Technology Data Exchange (ETDEWEB)

    Hodgson, Alfred T.; Ming, Katherine Y.; Singer, Brett C.

    2005-01-05

    The dynamic behavior of volatile organic compounds (VOCs) in indoor environments depends, in part, on sorptive interactions between VOCs in the gas phase and material surfaces. Since information on the types and quantities of interior material surfaces is not generally available, this pilot-scale study was conducted in occupied residences to develop and demonstrate a method for quantifying surface areas of objects and materials in rooms. Access to 33 rooms in nine residences consisting of bathrooms, bedroom/offices and common areas was solicited from among research group members living in the East San Francisco Bay Area. A systematic approach was implemented for measuring rooms and objects of 300 cm² and larger. The ventilated air volumes of the rooms were estimated and surface area-to-volume ratios were calculated for objects and materials, each segregated into 20 or more categories. Total surface area-to-volume ratios also were determined for each room. The bathrooms had the highest total surface area-to-volume ratios. Bedrooms generally had higher ratios than common areas consisting of kitchens, living/dining rooms and transitional rooms. Total surface area-to-volume ratios for the 12 bedrooms ranged between 2.3 and 4.7 m² m⁻³. The importance of individual objects and materials with respect to sorption will depend upon the sorption coefficients for the various VOC/material combinations. When combined, the highly permeable material categories, which may contribute to significant interactions, had a median ratio of about 0.5 m² m⁻³ for all three types of rooms.

  6. Quantifying uncertainty in observational rainfall datasets

    Science.gov (United States)

    Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen

    2015-04-01

    rainfall datasets available over Africa on monthly, daily and sub-daily time scales as appropriate to quantify spatial and temporal differences between the datasets. We find regional wet and dry biases between datasets (using the ensemble mean as a reference) with generally larger biases in reanalysis products. Rainfall intensity is poorly represented in some datasets, which demonstrates that some datasets should not be used for rainfall intensity analyses. Using 10 CORDEX models we show in East Africa that the spread between observed datasets is often similar to the spread between models. We recommend that specific observational rainfall datasets be used for specific investigations and also that, where many datasets are applicable to an investigation, a probabilistic view be adopted for rainfall studies over Africa. Endris, H. S., P. Omondi, S. Jain, C. Lennard, B. Hewitson, L. Chang'a, J. L. Awange, A. Dosio, P. Ketiem, G. Nikulin, H-J. Panitz, M. Büchner, F. Stordal, and L. Tazalika (2013) Assessment of the Performance of CORDEX Regional Climate Models in Simulating East African Rainfall. J. Climate, 26, 8453-8475. DOI: 10.1175/JCLI-D-12-00708.1 Gbobaniyi, E., A. Sarr, M. B. Sylla, I. Diallo, C. Lennard, A. Dosio, A. Diedhiou, A. Kamga, N. A. B. Klutse, B. Hewitson, and B. Lamptey (2013) Climatology, annual cycle and interannual variability of precipitation and temperature in CORDEX simulations over West Africa. Int. J. Climatol., DOI: 10.1002/joc.3834 Hernández-Díaz, L., R. Laprise, L. Sushama, A. Martynov, K. Winger, and B. Dugas (2013) Climate simulation over CORDEX Africa domain using the fifth-generation Canadian Regional Climate Model (CRCM5). Clim. Dyn. 40, 1415-1433. DOI: 10.1007/s00382-012-1387-z Kalognomou, E., C. Lennard, M. Shongwe, I. Pinto, A. Favre, M. Kent, B. Hewitson, A. Dosio, G. Nikulin, H. Panitz, and M. Büchner (2013) A diagnostic evaluation of precipitation in CORDEX models over southern Africa. Journal of Climate, 26, 9477-9506. DOI:10

  7. Quantifying biodiversity and asymptotics for a sequence of random strings.

    Science.gov (United States)

    Koyano, Hitoshi; Kishino, Hirohisa

    2010-06-01

    We present a methodology for quantifying biodiversity at the sequence level by developing the probability theory on a set of strings. Further, we apply our methodology to the problem of quantifying the population diversity of microorganisms in several extreme environments and digestive organs and reveal the relation between microbial diversity and various environmental parameters.

  8. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    Science.gov (United States)

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  9. Gender Differences in Knee Joint Congruity Quantified from MRI

    DEFF Research Database (Denmark)

    Tummala, Sudhakar; Schiphof, Dieuwke; Byrjalsen, Inger

    2018-01-01

    was located and quantified using Euclidean distance transform. Furthermore, the CI was quantified over the contact area by assessing agreement of the first- and second-order general surface features. Then, the gender differences between CA and CI values were evaluated at different stages of radiographic OA...

  10. Anatomy of Alternating Quantifier Satisfiability (Work in progress)

    DEFF Research Database (Denmark)

    Dung, Phan Anh; Bjørner, Nikolaj; Monniaux, David

    We report on work in progress to generalize an algorithm recently introduced in [10] for checking satisfiability of formulas with quantifier alternation. The algorithm uses two auxiliary procedures: a procedure for producing a candidate formula for quantifier elimination and a procedure for elimi...

  11. The Role of Quantifier Alternations in Cut Elimination

    DEFF Research Database (Denmark)

    Gerhardy, Philipp

    2005-01-01

    Extending previous results on the complexity of cut elimination for the sequent calculus LK, we discuss the role of quantifier alternations and develop a measure to describe the complexity of cut elimination in terms of quantifier alternations in cut formulas and contractions on such formulas...

  12. Quantified Effects of Late Pregnancy and Lactation on the Osmotic ...

    African Journals Online (AJOL)

    Quantified Effects of Late Pregnancy and Lactation on the Osmotic Stability of ... in the composition of erythrocyte membranes associated with the physiologic states. Keywords: Erythrocyte osmotic stability, osmotic fragility, late pregnancy, ...

  13. Study Quantifies Physical Demands of Yoga in Seniors

    Science.gov (United States)

    A recent NCCAM-funded study measured the ... performance of seven standing poses commonly taught in senior yoga classes: Chair, Wall Plank, Tree, Warrior II, ...

  14. Quantifying the economic water savings benefit of water hyacinth ...

    African Journals Online (AJOL)

    Quantifying the economic water savings benefit of water hyacinth ... Value Method was employed to estimate the average production value of irrigation water, ... invasions of this nature, as they present significant costs to the economy and ...

  15. Analyzing complex networks evolution through Information Theory quantifiers

    International Nuclear Information System (INIS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martin Gomez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
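
    As a pointer for readers, the first quantifier named above is easy to state in code. This is a minimal sketch of the square root of the Jensen-Shannon divergence between two probability distributions (toy degree distributions here); the MPR Statistical Complexity is not reproduced.

      # Square root of the Jensen-Shannon divergence between distributions p and q.
      import numpy as np

      def sqrt_jsd(p, q):
          """sqrt(JSD) with base-2 entropies, so the result lies in [0, 1]."""
          p, q = np.asarray(p, float), np.asarray(q, float)
          p, q = p / p.sum(), q / q.sum()
          m = 0.5 * (p + q)
          def H(d):                         # Shannon entropy, ignoring zero bins
              d = d[d > 0]
              return -np.sum(d * np.log2(d))
          return np.sqrt(H(m) - 0.5 * H(p) - 0.5 * H(q))

      p = [0.70, 0.20, 0.07, 0.03]   # degree distribution, network state A
      q = [0.40, 0.30, 0.20, 0.10]   # degree distribution, network state B
      print(f"sqrt-JSD = {sqrt_jsd(p, q):.4f}")   # 0 = identical, 1 = maximally distinct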

  16. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region used to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  17. On the contrast between Germanic and Romance negated quantifiers

    Directory of Open Access Journals (Sweden)

    Robert Cirillo

    2009-01-01

    Full Text Available Universal quantifiers can be stranded in the manner described by Sportiche (1988), Giusti (1990) and Shlonsky (1991) in both the Romance and Germanic languages, but a negated universal quantifier can only be stranded in the Germanic languages. The goal of this paper is to show that this contrast between the Romance and the Germanic languages can be explained if one adapts the theory of sentential negation in Zeijlstra (2004) to constituent (quantifier) negation. According to Zeijlstra’s theory, a negation marker in the Romance languages is the head of a NegP that dominates vP, whereas in the Germanic languages a negation marker is a maximal projection that occupies the specifier position of a verbal phrase. I will show that the non-occurrence of stranded negated quantifiers in the Romance languages follows from the fact that negation markers in the Romance languages are highly positioned syntactic heads.

  18. A kernel plus method for quantifying wind turbine performance upgrades

    KAUST Repository

    Lee, Giwhyun; Ding, Yu; Xie, Le; Genton, Marc G.

    2014-01-01

    Power curves are commonly estimated using the binning method recommended by the International Electrotechnical Commission, which primarily incorporates wind speed information. When such power curves are used to quantify a turbine's upgrade

  19. Quantifying Functional Reuse from Object Oriented Requirements Specifications

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Condori-Fernández, N.; Pastor, O; Daneva, Maia; Abran, A.; Castro, J.; Quer, C.; Carvallo, J. B.; Fernandes da Silva, L.

    2008-01-01

    Software reuse is essential in improving efficiency and productivity in the software development process. This paper analyses reuse within requirements engineering phase by taking and adapting a standard functional size measurement method, COSMIC FFP. Our proposal attempts to quantify reusability

  20. User guide : process for quantifying the benefits of research.

    Science.gov (United States)

    2017-07-01

    The Minnesota Department of Transportation Research Services has adopted a process for quantifying the monetary benefits of research projects, such as the dollar value of particular ideas when implemented across the state's transportation system. T...

  1. How to Quantify Deterrence and Reduce Critical Infrastructure Risk

    OpenAIRE

    Taquechel, Eric F.; Lewis, Ted G.

    2012-01-01

    This article appeared in Homeland Security Affairs (August 2012), v.8, article 12 "We propose a definition of critical infrastructure deterrence and develop a methodology to explicitly quantify the deterrent effects of critical infrastructure security strategies. We leverage historical work on analyzing deterrence, game theory and utility theory. Our methodology quantifies deterrence as the extent to which an attacker's expected utility from an infrastructure attack changes after a defende...

  2. Quantifying and containing the curse of high resolution coronal imaging

    Directory of Open Access Journals (Sweden)

    V. Delouille

    2008-10-01

    Full Text Available Future missions such as Solar Orbiter (SO), InterHelioprobe, or Solar Probe aim at approaching the Sun closer than ever before, with on board some high resolution imagers (HRI) having a subsecond cadence and a pixel area of about (80 km)² at the Sun during perihelion. In order to guarantee their scientific success, it is necessary to evaluate if the photon counts available at this resolution and cadence will provide a sufficient signal-to-noise ratio (SNR). For example, if the inhomogeneities in the Quiet Sun emission prevail at higher resolution, one may hope to locally have more photon counts than in the case of a uniform source. It is relevant to quantify how inhomogeneous the quiet corona will be for a pixel pitch that is about 20 times smaller than in the case of SoHO/EIT, and 5 times smaller than TRACE. We perform a first step in this direction by analyzing and characterizing the spatial intermittency of Quiet Sun images thanks to a multifractal analysis. We identify the parameters that specify the scale-invariance behavior. This identification then allows us to select a family of multifractal processes, namely the Compound Poisson Cascades, that can synthesize artificial images having some of the scale-invariance properties observed on the recorded images. The prevalence of self-similarity in Quiet Sun coronal images makes it relevant to study the ratio between the SNR present in SoHO/EIT images and in coarsened images. SoHO/EIT images thus play the role of "high resolution" images, whereas the "low-resolution" coarsened images are rebinned so as to simulate a smaller angular resolution and/or a larger distance to the Sun. For a fixed difference in angular resolution and in Spacecraft-Sun distance, we determine the proportion of pixels having a SNR preserved at high resolution given a particular increase in effective area. If scale-invariance continues to prevail at smaller scales, the conclusion reached with SoHO/EIT images can be transposed
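
    The rebinning comparison in this record reduces, for a uniform source with photon (Poisson) noise, to SNR growing as the square root of the summed counts. A toy sketch on a synthetic photon map, not on solar data:

      # Photon-noise SNR under n x n rebinning of a synthetic image.
      import numpy as np

      rng = np.random.default_rng(1)
      counts = rng.poisson(lam=9.0, size=(512, 512))   # "high-resolution" photon map

      def rebin(img, n):
          """Sum n x n pixel blocks, simulating coarser angular resolution."""
          h, w = img.shape
          return img[: h // n * n, : w // n * n].reshape(h // n, n, w // n, n).sum(axis=(1, 3))

      for n in (1, 2, 4):
          im = rebin(counts, n)
          snr = np.sqrt(im.mean())     # Poisson: SNR of a typical pixel ~ sqrt(counts)
          print(f"{n}x{n} binning: mean counts = {im.mean():.1f}, SNR ~ {snr:.1f}")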

  3. The Applicability of Behavioral Finance in the Foreign Exchange Market

    OpenAIRE

    Heidorn, Thomas; Siragusano, Tindaro

    2004-01-01

    Behavioral finance theory is applied to the foreign exchange market to show that the profit of a typical trader is mainly due to the higher number of correct positions. Under a behavioral finance approach, the share of losing trades is larger than 60%; however, the individual gains are larger than the losses, leading to an overall profit. Using this approach we show that behavioral finance rules can be quantified and that a trading outperformance is possible using just 24-hour spot rates and 3-day volatilities.

  4. Quantifying Cutting and Wearing Behaviors of TiN- and CrN-Coated AISI 1070 Steel

    Directory of Open Access Journals (Sweden)

    Ahmet Cakan

    2008-11-01

    Full Text Available Hard coatings such as titanium nitride (TiN) and chromium nitride (CrN) are widely used in cutting and forming tools against wear and corrosion. In the present study, hard coating films were deposited onto AISI 1070 steels by a cathodic arc evaporation plating (CAVP) technique. These samples were subjected to wear in a conventional lathe for investigating the tribological behaviour of the coating structure, and the prenitrided subsurface composition was characterized using scanning electron microscopy (SEM), line scan analyses and X-ray diffraction (XRD). The wear properties of TiN- and CrN-coated samples were determined using an on-line monitoring system. The results show that TiN-coated samples demonstrate higher wear resistance than CrN-coated samples.

  5. Diagnosis and characterization of mania: Quantifying increased energy and activity in the human behavioral pattern monitor

    OpenAIRE

    Perry, William; McIlwain, Meghan; Kloezeman, Karen; Henry, Brook L.; Minassian, Arpi

    2016-01-01

    Increased energy or activity is now an essential feature of the mania of Bipolar Disorder (BD) according to DSM-5. This study examined whether objective measures of increased energy can differentiate manic BD individuals and provide greater diagnostic accuracy compared to rating scales, extending the work of previous studies with smaller samples. We also tested the relationship between objective measures of energy and rating scales. 50 hospitalized manic BD patients were compared to healthy s...

  6. Quantifying the effect of fuel reduction treatments on fire behavior in boreal forests

    Science.gov (United States)

    B.W. Butler; R.D. Ottmar; T.S. Rupp; R. Jandt; E. Miller; K. Howard; R. Schmoll; S. Theisen; R.E. Vihnanek; D. Jimenez

    2013-01-01

    Mechanical (e.g., shearblading) and manual (e.g., thinning) fuel treatments have become the preferred strategy of many fire managers and agencies for reducing fire hazard in boreal forests. This study attempts to characterize the effectiveness of four fuel treatments through direct measurement of fire intensity and forest floor consumption during a single prescribed...

  7. Quantifying the behavior of price dynamics at opening time in stock market

    Science.gov (United States)

    Ochiai, Tomoshiro; Takada, Hideyuki; Nacher, Jose C.

    2014-11-01

    The availability of huge volume of financial data has offered the possibility for understanding the markets as a complex system characterized by several stylized facts. Here we first show that the time evolution of the Japan’s Nikkei stock average index (Nikkei 225) futures follows the resistance and breaking-acceleration effects when the complete time series data is analyzed. However, in stock markets there are periods where no regular trades occur between the close of the market on one day and the next day’s open. To examine these time gaps we decompose the time series data into opening time and intermediate time. Our analysis indicates that for the intermediate time, both the resistance and the breaking-acceleration effects are still observed. However, for the opening time there are almost no resistance and breaking-acceleration effects, and volatility is always constantly high. These findings highlight unique dynamic differences between stock and forex markets and suggest that current risk management strategies may need to be revised to address the absence of these dynamic effects at the opening time.

  8. Automatic MRI Quantifying Methods in Behavioral-Variant Frontotemporal Dementia Diagnosis

    DEFF Research Database (Denmark)

    Cajanus, Antti; Hall, Anette; Koikkalainen, Juha

    2018-01-01

    genetic status in the differentiation sensitivity. Methods: The MRI scans of 50 patients with bvFTD (17 C9ORF72 expansion carriers) were analyzed using 6 quantification methods as follows: voxel-based morphometry (VBM), tensor-based morphometry, volumetry (VOL), manifold learning, grading, and white...

  9. Acoustic monitoring system to quantify ingestive behavior of free-grazing cattle

    Science.gov (United States)

    Methods to estimate intake in grazing livestock include using markers, visual observation, mechanical sensors that respond to jaw movement and acoustic recording. In most of the acoustic monitoring studies, the microphone is inverted on the forehead of the grazing livestock and the skull is utilize...

  10. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  11. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    Science.gov (United States)

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  12. Quantifying NMR relaxation correlation and exchange in articular cartilage with time domain analysis

    Science.gov (United States)

    Mailhiot, Sarah E.; Zong, Fangrong; Maneval, James E.; June, Ronald K.; Galvosas, Petrik; Seymour, Joseph D.

    2018-02-01

    Measured nuclear magnetic resonance (NMR) transverse relaxation data in articular cartilage has been shown to be multi-exponential and correlated to the health of the tissue. The observed relaxation rates are dependent on experimental parameters such as solvent, data acquisition methods, data analysis methods, and alignment to the magnetic field. In this study, we show that diffusive exchange occurs in porcine articular cartilage and impacts the observed relaxation rates in T1-T2 correlation experiments. By using time domain analysis of T2-T2 exchange spectroscopy, the diffusive exchange time can be quantified by measurements that use a single mixing time. Measured characteristic times for exchange are commensurate with T1 in this material and so impacts the observed T1 behavior. The approach used here allows for reliable quantification of NMR relaxation behavior in cartilage in the presence of diffusive fluid exchange between two environments.
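
    As background to the multi-exponential relaxation mentioned above, here is a hedged sketch of a time-domain fit of a two-component T2 decay with scipy; the full T2-T2 exchange analysis over a mixing time is beyond this illustration, and the decay curve is synthetic.

      # Two-component (biexponential) T2 decay fitted in the time domain.
      import numpy as np
      from scipy.optimize import curve_fit

      def biexp(t, a1, T2a, a2, T2b):
          return a1 * np.exp(-t / T2a) + a2 * np.exp(-t / T2b)

      t = np.linspace(0.001, 0.4, 200)          # seconds
      truth = (0.6, 0.025, 0.4, 0.15)           # two relaxation pools (assumed)
      rng = np.random.default_rng(2)
      signal = biexp(t, *truth) + rng.normal(0, 0.005, t.size)

      popt, _ = curve_fit(biexp, t, signal, p0=(0.5, 0.01, 0.5, 0.1))
      print("fitted amplitudes/T2s:", np.round(popt, 4))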

  13. Quantifying Ion Transport in Polymers Using Electrochemical Quartz Crystal Microbalance with Dissipation

    Science.gov (United States)

    Lutkenhaus, Jodie; Wang, Shaoyang

    For polymers in energy systems, one of the most common means of quantifying ion transport is that of electrochemical impedance spectroscopy, in which an alternating electric field is applied and the resultant impedance response is recorded. While useful, this approach misses subtle details in transient film swelling, effects of hydration or solvent shells around the transporting ion, and changes in mechanical properties of the polymer. Here we present electrochemical quartz crystal microbalance with dissipation (EQCMD) monitoring as a means to quantify ion transport, dynamic swelling, and mechanical properties of polymers during electrochemical interrogation. We focus upon EQCMD characterization of the redox-active nitroxide radical polymer, poly(2,2,6,6-tetramethylpiperidinyloxy methacrylate) (PTMA). Upon oxidation, PTMA becomes positively charged, which requires the transport of a complementary anion into the polymer for electroneutrality. By EQCMD, we quantify anion transport and resultant swelling upon oxidation, as well as decoupling of contributions attributed to the ion and the solvent. We explore the effect of different lithium electrolyte salts in which each salt gives different charge storage and mass transport behavior. This is attributed to varied polymer-dopant and dopant-solvent interactions. The work was supported by the Grant DE-SC0014006 funded by the U.S. Department of Energy, Office of Science.

  14. Behavioral economics

    OpenAIRE

    Camerer, Colin F.

    2014-01-01

    Economics, like behavioral psychology, is a science of behavior, albeit highly organized human behavior. The value of economic concepts for behavioral psychology rests on (1) their empirical validity when tested in the laboratory with individual subjects and (2) their uniqueness when compared to established behavioral concepts. Several fundamental concepts are introduced and illustrated by reference to experimental data: open and closed economies, elastic and inelastic demand, and substitutio...

  15. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback

    International Nuclear Information System (INIS)

    Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.; Garcia-Ojalvo, J.; Rosso, O. A.

    2010-01-01

    Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
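
    The ordinal-pattern machinery named above is compact enough to sketch. The code below computes a normalized permutation (Shannon) entropy over ordinal patterns of an interval series; the Martin-Plastino-Rosso complexity is omitted, and both series are simulated rather than measured laser data.

      # Normalized permutation entropy of a series, via ordinal patterns.
      from collections import Counter
      from math import factorial, log

      import numpy as np

      def permutation_entropy(x, d=3):
          """Shannon entropy of ordinal patterns (embedding dimension d), in [0, 1]."""
          patterns = Counter(tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1))
          n = sum(patterns.values())
          H = -sum((c / n) * log(c / n) for c in patterns.values())
          return H / log(factorial(d))          # 1 = fully random ordering

      rng = np.random.default_rng(3)
      iid = rng.exponential(1.0, 5000)          # memoryless interval series
      regular = np.sin(np.linspace(0, 40 * np.pi, 5000)) + rng.normal(0, 0.01, 5000)
      print(f"iid: {permutation_entropy(iid):.3f}, regular: {permutation_entropy(regular):.3f}")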

  16. Studying Behavioral Ecology on High School & College Campuses: A Practical Guide to Measuring Foraging Behavior Using Urban Wildlife

    Science.gov (United States)

    Baker, Mohammad A. Abu; Emerson, Sara E.; Brown, Joel S.

    2015-01-01

    We present a practical field exercise for ecology and animal behavior classes that can be carried out on campus, using urban wildlife. Students document an animal's feeding behavior to study its interactions with the surrounding environment. In this approach, an animal's feeding behavior is quantified at experimental food patches placed within its…

  17. Quantifying the perceived risks associated with nuclear energy issues

    International Nuclear Information System (INIS)

    Sandquist, G.M.

    2004-01-01

    A mathematical model is presented for quantifying and assessing perceived risks in an empirical manner. The analytical model provides for the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. The risk perception factors used to demonstrate the model are those that social and behavioural scientists have identified as the principal factors influencing people's perception of risks associated with major technical issues. These same risk factors are commonly associated with nuclear energy issues. A rational means is proposed for determining and quantifying these risk factors for a given application. The model should contribute to improved understanding of the basis and logic of public risk perception and provide practical and effective means for addressing perceived risks when they arise over important technical issues and projects. (author)

  18. Quantifying the value of E and P technology

    International Nuclear Information System (INIS)

    Heinemann, R.F.; Donlon, W.P.; Hoefner, M.L.

    1996-01-01

    A quantitative value-to-cost analysis was performed for the upstream technology portfolio of Mobil Oil for the period 1993 to 1998, by quantifying the cost of developing and delivering various technologies, including the net present value from technologies applied to thirty major assets. The value captured was classified into four general categories: (1) reduced capital costs, (2) reduced operating costs, (3) increased hydrocarbon production, and (4) increased proven reserves. The methodology used in quantifying the value-to-cost of upstream technologies and the results of asset analysis were described, with examples of the value of technology to specific assets. A method to incorporate strategic considerations and business alignment to set overall program priorities was also discussed. Identifying and quantifying specific cases of technology application on an asset by asset basis was considered to be the principal advantage of using this method.

  19. Ventilation in Sewers Quantified by Measurements of CO2

    DEFF Research Database (Denmark)

    Fuglsang, Emil Dietz; Vollertsen, Jes; Nielsen, Asbjørn Haaning

    2012-01-01

    Understanding and quantifying ventilation in sewer systems is a prerequisite to predict transport of odorous and corrosive gases within the system as well as their interaction with the urban atmosphere. This paper studies ventilation in sewer systems quantified by measurements of the naturally occurring compound CO2. Most often Danish wastewater is supersaturated with CO2 and hence a potential for stripping is present. A novel model was built based on the kinetics behind the stripping process. It was applied to simulate ventilation rates from field measurements of wastewater temperature, p...
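
    A back-of-the-envelope sketch of the mass-balance logic, with every constant assumed for illustration rather than taken from the paper: CO2 stripped from supersaturated wastewater accumulates in the headspace, so a steady headspace concentration implies an air-exchange (ventilation) rate.

      # Steady-state CO2 balance on a sewer reach: stripping flux = ventilation removal.
      kla   = 2.0       # 1/h, overall stripping rate coefficient (assumed)
      c_ww  = 30.0      # mg/L dissolved CO2 in wastewater (assumed, supersaturated)
      c_eq  = 5.0       # mg/L equilibrium with the headspace (assumed)
      v_ww  = 50.0      # m3 of wastewater in the reach (assumed)

      flux = kla * (c_ww - c_eq) * v_ww / 1000.0   # kg CO2/h stripped to headspace

      c_head = 2.5e-3   # kg/m3 CO2 measured in the headspace (assumed)
      c_amb  = 0.75e-3  # kg/m3 CO2 in ambient air (~410 ppm)

      q_air = flux / (c_head - c_amb)              # m3/h of sewer ventilation
      print(f"stripping flux = {flux:.2f} kg/h -> ventilation ~ {q_air:.0f} m3/h")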

  20. Assessing Orchestrated Simulation Through Modeling to Quantify the Benefits of Unmanned-Teaming in a Tactical ASW Scenario

    Science.gov (United States)

    2018-03-01

    be used by the model to create pseudo-random behavior for agents. This process is a critical aspect of the simulation because it allows for the...

  1. Aggressive Behavior

    Science.gov (United States)

    ... It is always more effective to positively reinforce desired behaviors and to teach children alternative behaviors rather ... he is angry, but instead to express his feelings through words. It’s important for him to learn ...

  2. Behaviorally inadequate

    DEFF Research Database (Denmark)

    Kasperbauer, Tyler Joshua

    2014-01-01

    According to situationism in psychology, behavior is primarily influenced by external situational factors rather than internal traits or motivations such as virtues. Environmental ethicists wish to promote pro-environmental behaviors capable of providing adequate protection for the environment...

  3. Verbal behavior

    OpenAIRE

    Michael, Jack

    1984-01-01

    The recent history and current status of the area of verbal behavior are considered in terms of three major thematic lines: the operant conditioning of adult verbal behavior, learning to be an effective speaker and listener, and developments directly related to Skinner's Verbal Behavior. Other topics not directly related to the main themes are also considered: the work of Kurt Salzinger, ape-language research, and human operant research related to rule-governed behavior.

  4. A user-oriented and quantifiable approach to irrigation design.

    NARCIS (Netherlands)

    Baars, E.; Bastiaansen, A.P.M.; Menenti, M.

    1995-01-01

    A new user-oriented approach is presented to apply marketing research techniques to quantify perceptions, preferences and utility values of farmers. This approach was applied to design an improved water distribution method for an irrigation scheme in Mendoza, Argentina. The approach comprises two

  5. Quantifying the CO2 permit price sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Gruell, Georg; Kiesel, Ruediger [Duisburg-Essen Univ., Essen (Germany). Inst. of Energy Trading and Financial Services

    2012-06-15

    Equilibrium models have been widely used in the literature with the aim of showing theoretical properties of emissions trading schemes. This paper applies equilibrium models to empirically study permit prices and to quantify the permit price sensitivity. In particular, we demonstrate that emission trading schemes both with and without banking are inherently prone to price jumps. (orig.)

  6. Quantifying Creative Destruction Entrepreneurship and Productivity in New Zealand

    OpenAIRE

    John McMillan

    2005-01-01

    This paper (a) provides a framework for quantifying any economy’s flexibility, and (b) reviews the evidence on New Zealand firms’ birth, growth and death. The data indicate that, by and large, the labour market and the financial market are doing their job.

  7. Comparing methods to quantify experimental transmission of infectious agents

    NARCIS (Netherlands)

    Velthuis, A.G.J.; Jong, de M.C.M.; Bree, de J.

    2007-01-01

    Transmission of an infectious agent can be quantified from experimental data using the transient-state (TS) algorithm. The TS algorithm is based on the stochastic SIR model and provides a time-dependent probability distribution over the number of infected individuals during an epidemic, with no need
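
    For background on the stochastic SIR model underlying such algorithms, the sketch below runs a Gillespie-style simulation of a small transmission experiment and tabulates the final-size distribution; group sizes and rates are invented, and this is not the TS algorithm itself.

      # Stochastic SIR final sizes for a small transmission experiment.
      import numpy as np

      rng = np.random.default_rng(4)

      def sir_final_size(beta, gamma, S0=5, I0=5):
          """One stochastic SIR run; returns how many susceptibles became infected."""
          S, I, N = S0, I0, S0 + I0
          while I > 0 and S > 0:
              inf_rate, rec_rate = beta * S * I / N, gamma * I
              if rng.random() < inf_rate / (inf_rate + rec_rate):
                  S, I = S - 1, I + 1       # transmission event
              else:
                  I -= 1                    # removal (recovery) event
          return S0 - S

      runs = [sir_final_size(beta=1.2, gamma=0.5) for _ in range(10_000)]
      print("P(k contact infections):", np.bincount(runs, minlength=6) / 10_000)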

  8. Quantifying Solar Cell Cracks in Photovoltaic Modules by Electroluminescence Imaging

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Hacke, Peter; Sera, Dezso

    2015-01-01

    This article proposes a method for quantifying the percentage of partially and totally disconnected solar cell cracks by analyzing electroluminescence images of the photovoltaic module taken under high- and low-current forward bias. The method is based on the analysis of the module’s electrolumin...
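
    A toy sketch of the image-analysis idea, using a synthetic array in place of real electroluminescence data: dark (electrically disconnected) area is measured by thresholding, and comparing the dark fraction at high and low bias would separate partially from totally disconnected regions.

      # Dark-area fraction of a synthetic EL image via a simple threshold.
      import numpy as np

      rng = np.random.default_rng(5)
      cell = rng.normal(0.8, 0.05, size=(100, 100))   # healthy, bright EL signal
      cell[60:80, 20:70] = 0.1                        # dark band: disconnected region

      dark = cell < 0.4                               # threshold (assumed)
      print(f"disconnected area: {100 * dark.mean():.1f}% of the cell")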

  9. Quantifying levels of animal activity using camera trap data

    NARCIS (Netherlands)

    Rowcliffe, J.M.; Kays, R.; Kranstauber, B.; Carbone, C.; Jansen, P.A.

    2014-01-01

    Activity level (the proportion of time that animals spend active) is a behavioural and ecological metric that can provide an indicator of energetics, foraging effort and exposure to risk. However, activity level is poorly known for free-living animals because it is difficult to quantify activity
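
    One common way such activity levels are estimated, sketched here under stated assumptions rather than as the authors' exact procedure: fit a circular kernel density to detection times and compare its peak against the flat density of an animal active around the clock.

      # Activity level from camera-trap detection times via circular kernel density.
      import numpy as np
      from scipy.stats import vonmises

      rng = np.random.default_rng(6)
      # simulate detections concentrated around dawn (06:00) and dusk (18:00)
      times_h = np.concatenate([rng.normal(6, 1.5, 150), rng.normal(18, 1.5, 150)]) % 24
      theta = times_h / 24 * 2 * np.pi                # clock time -> radians

      grid = np.linspace(0, 2 * np.pi, 288, endpoint=False)
      kappa = 20.0                                    # kernel concentration (bandwidth)
      dens = np.mean([vonmises.pdf(grid, kappa, loc=t) for t in theta], axis=0)

      activity = 1.0 / (2 * np.pi * dens.max())       # 1.0 = active all 24 hours
      print(f"estimated activity level: {activity:.2f}")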

  10. Information on Quantifiers and Argument Structure in English Learner's Dictionaries.

    Science.gov (United States)

    Lee, Thomas Hun-tak

    1993-01-01

    Lexicographers have been arguing for the inclusion of abstract and complex grammatical information in dictionaries. This paper examines the extent to which information about quantifiers and the argument structure of verbs is encoded in English learner's dictionaries. The Oxford Advanced Learner's Dictionary (1989), the Longman Dictionary of…

  11. Quantifying trail erosion and stream sedimentation with sediment tracers

    Science.gov (United States)

    Mark S. Riedel

    2006-01-01

    The impacts of forest disturbance and roads on stream sedimentation have been rigorously investigated and documented. While historical research on turbidity and suspended sediments has been thorough, studies of stream bed sedimentation have typically relied on semi-quantitative measures such as embeddedness or marginal pool depth. To directly quantify the...

  12. Coupling and quantifying resilience and sustainability in facilities management

    DEFF Research Database (Denmark)

    Cox, Rimante Andrasiunaite; Nielsen, Susanne Balslev; Rode, Carsten

    2015-01-01

    Purpose – The purpose of this paper is to consider how to couple and quantify resilience and sustainability, where sustainability refers not only to environmental impact, but also to economic and social impacts. The way a particular function of a building is provisioned may have significant repercussions beyond just resilience. The goal is to develop a decision support tool for facilities managers. Design/methodology/approach – A risk framework is used to quantify both resilience and sustainability in monetary terms. The risk framework makes it possible to couple resilience and sustainability, so that the provisioning of a particular building can be investigated with consideration of functional, environmental, economic and, possibly, social dimensions. Findings – The method of coupling and quantifying resilience and sustainability (CQRS) is illustrated with a simple example that highlights how very different...
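
    In the spirit of that monetary risk framing, a minimal sketch with invented figures: each provisioning option is scored as annualized capital cost, plus expected disruption losses (the resilience term), plus annualized external costs (the sustainability term).

      # Expected-annual-cost comparison of two provisioning options (all figures assumed).
      options = {
          # name: (capital cost, P(disruption)/yr, loss if disrupted, external cost/yr)
          "grid only":        (10_000, 0.10, 50_000, 1_500),
          "grid + PV backup": (35_000, 0.02, 50_000,   400),
      }

      for name, (capex, p_fail, loss, externality) in options.items():
          annual_risk = p_fail * loss                      # resilience, in money terms
          total = capex / 20 + annual_risk + externality   # 20-year straight-line capex
          print(f"{name:18s} expected annual cost: {total:,.0f}")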

  13. Quantifying Time Dependent Moisture Storage and Transport Properties

    DEFF Research Database (Denmark)

    Peuhkuri, Ruut H

    2003-01-01

    This paper describes an experimental and numerical approach to quantify the time dependence of sorption mechanisms for some hygroscopic building - mostly insulation - materials. Some investigations of retarded sorption and non-Fickian phenomena, mostly on wood, have given inspiration to the present...

  14. A framework for quantifying net benefits of alternative prognostic models

    NARCIS (Netherlands)

    Rapsomaniki, E.; White, I.R.; Wood, A.M.; Thompson, S.G.; Feskens, E.J.M.; Kromhout, D.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit)

  15. Using multiple linear regression techniques to quantify carbon ...

    African Journals Online (AJOL)

    Fallow ecosystems provide a significant carbon stock that can be quantified for inclusion in the accounts of global carbon budgets. Process and statistical models of productivity, though useful, are often technically rigid as the conditions for their application are not easy to satisfy. Multiple regression techniques have been ...
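
    A hedged sketch of the regression approach described above, with fabricated covariates and carbon stocks: ordinary least squares of fallow carbon on stand age and basal area.

      # Multiple linear regression of carbon stock on two covariates.
      import numpy as np

      rng = np.random.default_rng(7)
      n = 40
      age = rng.uniform(2, 15, n)            # fallow age, years
      basal = rng.uniform(1, 20, n)          # basal area, m2/ha
      carbon = 2.0 + 1.1 * age + 0.8 * basal + rng.normal(0, 1.5, n)   # t C/ha

      X = np.column_stack([np.ones(n), age, basal])
      coef, *_ = np.linalg.lstsq(X, carbon, rcond=None)
      print("intercept, age, basal-area coefficients:", np.round(coef, 2))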

  16. Quantifying Stakeholder Values of VET Provision in the Netherlands

    Science.gov (United States)

    van der Sluis, Margriet E.; Reezigt, Gerry J.; Borghans, Lex

    2014-01-01

    It is well-known that the quality of vocational education and training (VET) depends on how well a given programme aligns with the values and interests of its stakeholders, but it is less well-known what these values and interests are and to what extent they are shared across different groups of stakeholders. We use vignettes to quantify the…

  17. Cross-linguistic patterns in the acquisition of quantifiers

    Science.gov (United States)

    Cummins, Chris; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K.; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-01-01

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier’s specific meaning. We investigate competence with the expressions for “all,” “none,” “some,” “some…not,” and “most” in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation. PMID:27482119

  18. FRAGSTATS: spatial pattern analysis program for quantifying landscape structure.

    Science.gov (United States)

    Kevin McGarigal; Barbara J. Marks

    1995-01-01

    This report describes a program, FRAGSTATS, developed to quantify landscape structure. FRAGSTATS offers a comprehensive choice of landscape metrics and was designed to be as versatile as possible. The program is almost completely automated and thus requires little technical training. Two separate versions of FRAGSTATS exist: one for vector images and one for raster...

  19. Quantifying Spin Hall Angles from Spin Pumping : Experiments and Theory

    NARCIS (Netherlands)

    Mosendz, O.; Pearson, J.E.; Fradin, F.Y.; Bauer, G.E.W.; Bader, S.D.; Hoffmann, A.

    2010-01-01

    Spin Hall effects intermix spin and charge currents even in nonmagnetic materials and, therefore, ultimately may allow the use of spin transport without the need for ferromagnets. We show how spin Hall effects can be quantified by integrating Ni80Fe20|normal metal (N) bilayers into a coplanar

  20. Quantifying Effectiveness of Streambank Stabilization Practices on Cedar River, Nebraska

    Directory of Open Access Journals (Sweden)

    Naisargi Dave

    2017-11-01

    Full Text Available Excessive sediment is a major pollutant to surface waters worldwide. In some watersheds, streambanks are a significant source of this sediment, leading to the expenditure of billions of dollars in stabilization projects. Although costly streambank stabilization projects have been implemented worldwide, long-term monitoring to quantify their success is lacking. There is a critical need to document the long-term success of streambank restoration projects. The objectives of this research were to (1) quantify streambank retreat before and after the stabilization of 18 streambanks on the Cedar River in North Central Nebraska, USA; (2) assess the impact of a large flood event; and (3) determine the most cost-efficient stabilization practice. The stabilized streambanks included jetties (10), rock-toe protection (1), slope reduction/gravel bank (1), a retaining wall (1), rock vanes (2), and tree revetments (3). Streambank retreat and accumulation were quantified using aerial images from 1993 to 2016. Though streambank retreat has been significant throughout the study period, a breached dam in 2010 caused major flooding and streambank erosion on the Cedar River. This large-scale flood enabled us to quantify the effect of one extreme event and evaluate the effectiveness of the stabilized streambanks. With a 70% success rate, jetties were the most cost-efficient practice and yielded the most deposition. If minimal risk is unacceptable, a more costly yet immobile practice such as a gravel bank or retaining wall is recommended.

  1. Quantifying carbon stores and decomposition in dead wood: A review

    Science.gov (United States)

    Matthew B. Russell; Shawn Fraver; Tuomas Aakala; Jeffrey H. Gove; Christopher W. Woodall; Anthony W. D’Amato; Mark J. Ducey

    2015-01-01

    The amount and dynamics of forest dead wood (both standing and downed) has been quantified by a variety of approaches throughout the forest science and ecology literature. Differences in the sampling and quantification of dead wood can lead to differences in our understanding of forests and their role in the sequestration and emissions of CO2, as...

  2. Quantifying soil respiration at landscape scales. Chapter 11

    Science.gov (United States)

    John B. Bradford; Michael G. Ryan

    2008-01-01

    Soil CO2 efflux, or soil respiration, represents a substantial component of carbon cycling in terrestrial ecosystems. Consequently, quantifying soil respiration over large areas and long time periods is an increasingly important goal. However, soil respiration rates vary dramatically in space and time in response to both environmental conditions...

  3. Lecture Note on Discrete Mathematics: Predicates and Quantifiers

    DEFF Research Database (Denmark)

    Nordbjerg, Finn Ebertsen

    2016-01-01

    This lecture note supplements the treatment of predicates and quantifiers given in standard textbooks on Discrete Mathematics (e.g.: [1]) and introduces the notation used in this course. We will present central concepts that are important, when predicate logic is used for specification...

  4. Quantifying the FIR interaction enhancement in paired galaxies

    International Nuclear Information System (INIS)

    Xu Cong; Sulentic, J.W.

    1990-01-01

    We studied the "Catalogue of Isolated Pairs of Galaxies in the Northern Hemisphere" by Karachentsev (1972) and a well-matched comparison sample taken from the "Catalogue of Isolated Galaxies" by Karachentseva (1973) in order to quantify the enhanced FIR emission properties of interacting galaxies. 8 refs, 6 figs

  5. A Sustainability Initiative to Quantify Carbon Sequestration by Campus Trees

    Science.gov (United States)

    Cox, Helen M.

    2012-01-01

    Over 3,900 trees on a university campus were inventoried by an instructor-led team of geography undergraduates in order to quantify the carbon sequestration associated with biomass growth. The setting of the project is described, together with its logistics, methodology, outcomes, and benefits. This hands-on project provided a team of students…
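
    The usual inventory arithmetic behind such projects, sketched with placeholder coefficients (real inventories use species-specific allometric equations): biomass from stem diameter, carbon as roughly half of dry biomass, and stored CO2 via the 44/12 molecular-weight ratio.

      # Per-tree stored CO2 from diameter at breast height (DBH), generic allometry.
      for dbh in (12.0, 35.5, 48.0):         # DBH in cm
          biomass_kg = 0.25 * dbh ** 2.4     # allometric a*DBH^b (coefficients assumed)
          carbon_kg = 0.5 * biomass_kg       # carbon fraction of dry biomass
          co2_kg = carbon_kg * 44.0 / 12.0   # convert C to CO2 equivalent
          print(f"DBH {dbh:5.1f} cm -> ~{co2_kg:8.1f} kg CO2 stored")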

  6. Designing a systematic landscape monitoring approach for quantifying ecosystem services

    Science.gov (United States)

    A key problem encountered early on by governments striving to incorporate the ecosystem services concept into decision making is quantifying ecosystem services across large landscapes. Basically, they are faced with determining what to measure, how to measure it and how to aggre...

  7. Challenges in quantifying biosphere-atmosphere exchange of nitrogen species

    DEFF Research Database (Denmark)

    Sutton, M.A.; Nemitz, E.; Erisman, J.W.

    2007-01-01

    Recent research in nitrogen exchange with the atmosphere has separated research communities according to N form. The integrated perspective needed to quantify the net effect of N on greenhouse-gas balance is being addressed by the NitroEurope Integrated Project (NEU). Recent advances have depende...

  8. Quantifying and mapping spatial variability in simulated forest plots

    Science.gov (United States)

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map spatial variability of forest stands. Simulated stands of regularly spaced loblolly pine (Pinus taeda L.) plantations were developed. We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  9. Quantifying Ladder Fuels: A New Approach Using LiDAR

    Science.gov (United States)

    Heather Kramer; Brandon Collins; Maggi Kelly; Scott Stephens

    2014-01-01

    We investigated the relationship between LiDAR and ladder fuels in the northern Sierra Nevada, California USA. Ladder fuels are often targeted in hazardous fuel reduction treatments due to their role in propagating fire from the forest floor to tree crowns. Despite their importance, ladder fuels are difficult to quantify. One common approach is to calculate canopy base...

  10. Quantifying a Negative: How Homeland Security Adds Value

    Science.gov (United States)

    2015-12-01

    access to future victims. The law enforcement agency could then identify and quantify the value of future crimes. For example, if a serial killer is captured with evidence of the next victim or an established pattern of victimization, network theory could be used to identify the next

  11. Behaviorally Speaking.

    Science.gov (United States)

    Porter, Elias H.; Dutton, Darell W. J.

    1987-01-01

    Consists of two articles focusing on (1) a modern behavioral model that takes cues from Hippocrates' Four Temperaments and (2) use of a behavioral approach to improve the effectiveness of meetings. Lists positive and negative behaviors within the meeting context. (CH)

  12. The benefits and risks of quantified relationship technologies : response to open peer commentaries on "the quantified relationship"

    NARCIS (Netherlands)

    Danaher, J.; Nyholm, S.R.; Earp, B.D.

    2018-01-01

    Our critics argue that quantified relationships (QR) will threaten privacy, undermine autonomy, reinforce problematic business models, and promote epistemic injustice. We do not deny these risks. But to determine the appropriate policy response, it will be necessary to assess their likelihood,

  13. Benefits Innovations in Employee Behavioral Health.

    Science.gov (United States)

    Sherman, Bruce; Block, Lori

    2017-01-01

    More and more employers recognize the business impact of behavioral health concerns in the workplace. This article provides insights into some of the current innovations in behavioral health benefits, along with their rationale for development. Areas of innovation include conceptual and delivery models, technological advancements, tools for engaging employees and ways of quantifying the business value of behavioral health benefits. The rapid growth of innovative behavioral health services should provide employers with confidence that they can tailor a program best suited to their priorities, organizational culture and cost limitations.

  14. The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery.

    Science.gov (United States)

    Swan, Melanie

    2013-06-01

    A key contemporary trend emerging in big data science is the quantified self (QS): individuals engaged in the self-tracking of any kind of biological, physical, behavioral, or environmental information as n=1 individuals or in groups. There are opportunities for big data scientists to develop new models to support QS data collection, integration, and analysis, and also to lead in defining open-access database resources and privacy standards for how personal data is used. Next-generation QS applications could include tools for rendering QS data meaningful in behavior change, establishing baselines and variability in objective metrics, applying new kinds of pattern recognition techniques, and aggregating multiple self-tracking data streams from wearable electronics, biosensors, mobile phones, genomic data, and cloud-based services. The long-term vision of QS activity is that of a systemic monitoring approach where an individual's continuous personal information climate provides real-time performance optimization suggestions. There are some potential limitations related to QS activity (barriers to widespread adoption and a critique regarding scientific soundness), but these may be overcome. One interesting aspect of QS activity is that it is fundamentally a quantitative and qualitative phenomenon, since it includes both the collection of objective metrics data and the subjective experience of the impact of these data. Some of this dynamic is being explored as the quantified self is becoming the qualified self in two new ways: by applying QS methods to the tracking of qualitative phenomena such as mood, and by understanding that QS data collection is just the first step in creating qualitative feedback loops for behavior change. In the long-term future, the quantified self may become additionally transformed into the extended exoself as data quantification and self-tracking enable the development of new sense capabilities that are not possible with ordinary senses.

  15. Psychological behaviorism and behaviorizing psychology

    Science.gov (United States)

    Staats, Arthur W.

    1994-01-01

    Paradigmatic or psychological behaviorism (PB), in a four-decade history of development, has been shaped by its goal, the establishment of a behaviorism that can also serve as the approach in psychology (Watson's original goal). In the process, PB has become a new generation of behaviorism with abundant heuristic avenues for development in theory, philosophy, methodology, and research. Psychology has resources, purview and problem areas, and nascent developments of many kinds, gathered in chaotic diversity, needing unification (and other things) that cognitivism cannot provide. Behaviorism can, within PB's multilevel framework for connecting and advancing both psychology and behaviorism. PMID:22478175

  16. Using hardware models to quantify sensory data acquisition across the rat vibrissal array.

    Science.gov (United States)

    Gopal, Venkatesh; Hartmann, Mitra J Z

    2007-12-01

    Our laboratory investigates how animals acquire sensory data to understand the neural computations that permit complex sensorimotor behaviors. We use the rat whisker system as a model to study active tactile sensing; our aim is to quantitatively describe the spatiotemporal structure of incoming sensory information to place constraints on subsequent neural encoding and processing. In the first part of this paper we describe the steps in the development of a hardware model (a 'sensobot') of the rat whisker array that can perform object feature extraction. We show how this model provides insights into the neurophysiology and behavior of the real animal. In the second part of this paper, we suggest that sensory data acquisition across the whisker array can be quantified using the complete derivative. We use the example of wall-following behavior to illustrate that computing the appropriate spatial gradients across a sensor array would enable an animal or mobile robot to predict the sensory data that will be acquired at the next time step.

  17. A new paradigm of quantifying ecosystem stress through chemical signatures

    Energy Technology Data Exchange (ETDEWEB)

    Kravitz, Ben [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, P.O. Box 999, MSIN K9-30 Richland Washington 99352 USA; Guenther, Alex B. [Department of Earth System Science, University of California Irvine, 3200 Croul Hall Street Irvine California 92697 USA; Gu, Lianhong [Environmental Sciences Division, Oak Ridge National Laboratory, Oak Ridge Tennessee 37831 USA; Karl, Thomas [Institute of Atmospheric and Cryospheric Sciences, University of Innsbruck, Innrain 52f A-6020 Innsbruck Austria; Kaser, Lisa [National Center for Atmospheric Research, P.O. Box 3000 Boulder Colorado 80307 USA; Pallardy, Stephen G. [Department of Forestry, University of Missouri, 203 Anheuser-Busch Natural Resources Building Columbia Missouri 65211 USA; Peñuelas, Josep [CREAF, Cerdanyola del Vallès 08193 Catalonia Spain; Global Ecology Unit CREAF-CSIC-UAB, CSIC, Cerdanyola del Vallès 08193 Catalonia Spain; Potosnak, Mark J. [Department of Environmental Science and Studies, DePaul University, McGowan South, Suite 203 Chicago Illinois 60604 USA; Seco, Roger [Department of Earth System Science, University of California Irvine, 3200 Croul Hall Street Irvine California 92697 USA

    2016-11-01

    Stress-induced emissions of biogenic volatile organic compounds (VOCs) from terrestrial ecosystems may be one of the dominant sources of VOC emissions world-wide. Understanding the ecosystem stress response could reveal how ecosystems will respond and adapt to climate change and, in turn, quantify changes in the atmospheric burden of VOC oxidants and secondary organic aerosols. Here we argue, based on preliminary evidence from several opportunistic measurement sources, that chemical signatures of stress can be identified and quantified at the ecosystem scale. We also outline future endeavors that we see as next steps toward uncovering quantitative signatures of stress, including new advances in both VOC data collection and analysis of "big data."

  18. A framework for quantifying net benefits of alternative prognostic models

    DEFF Research Database (Denmark)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple...

  19. Quantifying the value of SHM for wind turbine blades

    DEFF Research Database (Denmark)

    Nielsen, Jannie Sønderkær; Tcherniak, Dmitri; Ulriksen, Martin Dalgaard

    2018-01-01

    In this paper, the value of information (VoI) from structural health monitoring (SHM) is quantified in a case study for offshore wind turbines (OWTs). This is done by combining data from an operating turbine equipped with a blade SHM system with cost information from a service provider for OWTs. A model is developed to quantify the value of SHM for an 8 MW OWT using a decision framework based on Bayesian pre-posterior decision analysis. Deterioration is modelled as a Markov chain developed based on data, and the costs are obtained from a service provider for OWTs. Discrete Bayesian networks are used... [The monitoring data] is compared to a statistical model from the healthy state using a metric that yields a damage index representing the structural integrity. As the damage was introduced artificially, it is possible to statistically estimate the confusion matrix corresponding to different threshold values, and here we opt...

  20. Quantifiers for randomness of chaotic pseudo-random number generators.

    Science.gov (United States)

    De Micco, L; Larrondo, H A; Plastino, A; Rosso, O A

    2009-08-28

    We deal with randomness quantifiers and concentrate on their ability to discern the hallmark of chaos in time series used in connection with pseudo-random number generators (PRNGs). Workers in the field are motivated to use chaotic maps for generating PRNGs because of the simplicity of their implementation. Although there exist very efficient general-purpose benchmarks for testing PRNGs, we feel that the analysis provided here sheds additional didactic light on the importance of the main statistical characteristics of a chaotic map, namely (i) its invariant measure and (ii) the mixing constant. This is of help in answering two questions that arise in applications: (i) which is the best PRNG among the available ones? and (ii) if a given PRNG turns out not to be good enough and a randomization procedure must still be applied to it, which is the best applicable randomization procedure? Our answer provides a comparative analysis of several quantifiers advanced in the extant literature.
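
    The two statistical characteristics named above lend themselves to a quick numerical check. The sketch below, a minimal illustration rather than the quantifiers used in the paper, estimates the invariant measure of the fully chaotic logistic map x_{n+1} = 4·x_n·(1 − x_n) by histogram and compares it with the known analytic density ρ(x) = 1/(π·sqrt(x(1 − x))); the seed, orbit length, and bin count are arbitrary choices:

```python
import numpy as np

# Sketch only: approximate the invariant measure of the logistic map at
# r = 4 and compare with the exact density 1 / (pi * sqrt(x * (1 - x))).
def logistic_orbit(x0: float, n: int, burn_in: int = 1_000) -> np.ndarray:
    x = x0
    for _ in range(burn_in):              # discard transients
        x = 4.0 * x * (1.0 - x)
    orbit = np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        orbit[i] = x
    return orbit

orbit = logistic_orbit(0.123456, 200_000)
hist, edges = np.histogram(orbit, bins=50, range=(0.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
exact = 1.0 / (np.pi * np.sqrt(centers * (1.0 - centers)))
print("max relative deviation:", np.max(np.abs(hist - exact) / exact))
```

    A map whose sampled histogram deviates strongly from its invariant measure, or whose correlations decay slowly (a small mixing constant), makes a correspondingly poor PRNG, which is the intuition such quantifiers formalize.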

  1. Resolving and quantifying overlapped chromatographic bands by transmutation

    Science.gov (United States)

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  2. Pitfalls in quantifying species turnover: the residency effect

    Directory of Open Access Journals (Sweden)

    Kevin Chase Burns

    2014-03-01

    The composition of ecological communities changes continuously through time and space. Understanding this turnover in species composition is a central goal in biogeography, but quantifying species turnover can be problematic. Here, I describe an underappreciated source of bias in quantifying species turnover, namely ‘the residency effect’, which occurs when the contiguous distributions of species across sampling domains are small relative to census intervals. I present the results of a simulation model that illustrates the problem theoretically and then I demonstrate the problem empirically using a long-term dataset of plant species turnover on islands. Results from both exercises indicate that empirical estimates of species turnover may be susceptible to significant observer bias, which may potentially cloud a better understanding of how the composition of ecological communities changes through time.

  3. Quantifying resilience for resilience engineering of socio-technical systems

    OpenAIRE

    Häring, Ivo; Ebenhöch, Stefan; Stolz, Alexander

    2016-01-01

    Resilience engineering can be defined as comprising technical, engineering, and natural science approaches to improving the resilience and sustainability of socio-technical cyber-physical systems of various complexities with respect to disruptive events. It is argued how this emerging interdisciplinary technical and societal science approach may contribute to civil and societal security research. In this context, the article lists the expected benefits of quantifying resilience. Along the r...

  4. Quantifying the Lateral Bracing Provided by Standing Seam Roof Systems

    OpenAIRE

    Sorensen, Taylor J.

    2016-01-01

    One of the major challenges of engineering is finding the proper balance between economical and safe. Currently engineers at Nucor Corporation have ignored the additional lateral bracing provided by standing seam roofing systems to joists because of the lack of methods available to quantify the amount of bracing provided. Based on the results of testing performed herein, this bracing is significant, potentially resulting in excessively conservative designs and unnecessary costs. This proje...

  5. A framework for quantifying net benefits of alternative prognostic models

    OpenAIRE

    Rapsomaniki, E.; White, I.R.; Wood, A.M.; Thompson, S.G.; Ford, I.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measure...

  6. PREDICTION OF SURGICAL TREATMENT OF DIFFUSE PERITONITIS BY QUANTIFYING RISK FACTORS

    Directory of Open Access Journals (Sweden)

    І. К. Churpiy

    2012-11-01

    We explored the possibility of quantitative assessment of risk factors for complications in the treatment of diffuse peritonitis and highlighted 53 groups of features that are important in predicting the course of diffuse peritonitis. The proposed scheme for defining the risk of the clinical course of diffuse peritonitis can quantify the initial severity of patients and, in most cases, correctly predict the results of treatment.

  7. Simulating non-prenex cuts in quantified propositional calculus

    Czech Academy of Sciences Publication Activity Database

    Jeřábek, Emil; Nguyen, P.

    2011-01-01

    Roč. 57, č. 5 (2011), s. 524-532. ISSN 0942-5616. R&D Projects: GA AV ČR IAA100190902; GA MŠk(CZ) 1M0545. Institutional research plan: CEZ:AV0Z10190503. Keywords: proof complexity * prenex cuts * quantified propositional calculus. Subject RIV: BA - General Mathematics. Impact factor: 0.496, year: 2011. http://onlinelibrary.wiley.com/doi/10.1002/malq.201020093/abstract

  8. Quantifying high dimensional entanglement with two mutually unbiased bases

    Directory of Open Access Journals (Sweden)

    Paul Erker

    2017-07-01

    We derive a framework for quantifying entanglement in multipartite and high dimensional systems using only correlations in two unbiased bases. We furthermore develop such bounds in cases where the second basis is not characterized beyond being unbiased, thus enabling entanglement quantification with minimal assumptions. We also show that it is feasible to experimentally implement our method with readily available equipment and even conservative estimates of physical parameters.

  9. Parkinson's Law Quantified: Three Investigations on Bureaucratic Inefficiency

    OpenAIRE

    Klimek, Peter; Hanel, Rudolf; Thurner, Stefan

    2008-01-01

    We formulate three famous, descriptive essays of C.N. Parkinson on bureaucratic inefficiency in a quantifiable and dynamical socio-physical framework. In the first model we show how the use of recent opinion formation models for small groups can be used to understand Parkinson's observation that decision making bodies such as cabinets or boards become highly inefficient once their size exceeds a critical 'Coefficient of Inefficiency', typically around 20. A second observation of Parkinson - w...

  10. The quantified self a sociology of self-tracking

    CERN Document Server

    Lupton, Deborah

    2016-01-01

    With the advent of digital devices and software, self-tracking practices have gained new adherents and have spread into a wide array of social domains. The Quantified Self movement has emerged to promote 'self knowledge through numbers'. In this ground-breaking book, Deborah Lupton critically analyses the social, cultural and political dimensions of contemporary self-tracking and identifies the concepts of selfhood, human embodiment and the value of data that underpin them.

  11. Quantifying the ice-albedo feedback through decoupling

    Science.gov (United States)

    Kravitz, B.; Rasch, P. J.

    2017-12-01

    The ice-albedo feedback involves numerous individual components, whereby warming induces sea ice melt, inducing reduced surface albedo, inducing increased surface shortwave absorption, causing further warming. Here we attempt to quantify the sea ice albedo feedback using an analogue of the "partial radiative perturbation" method, but where the governing mechanisms are directly decoupled in a climate model. As an example, we can isolate the insulating effects of sea ice on surface energy and moisture fluxes by allowing sea ice thickness to change but fixing Arctic surface albedo, or vice versa. Here we present results from such idealized simulations using the Community Earth System Model in which individual components are successively fixed, effectively decoupling the ice-albedo feedback loop. We isolate the different components of this feedback, including temperature change, sea ice extent/thickness, and air-sea exchange of heat and moisture. We explore the interactions between these different components, as well as the strengths of the total feedback in the decoupled feedback loop, to quantify contributions from individual pieces. We also quantify the non-additivity of the effects of the components as a means of investigating the dominant sources of nonlinearity in the ice-albedo feedback.

  12. A novel approach to quantify cybersecurity for electric power systems

    Science.gov (United States)

    Kaster, Paul R., Jr.

    Electric Power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail at one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  13. Clinical relevance of quantified fundus autofluorescence in diabetic macular oedema.

    Science.gov (United States)

    Yoshitake, S; Murakami, T; Uji, A; Unoki, N; Dodo, Y; Horii, T; Yoshimura, N

    2015-05-01

    To quantify the signal intensity of fundus autofluorescence (FAF) and evaluate its association with visual function and optical coherence tomography (OCT) findings in diabetic macular oedema (DMO). We reviewed 103 eyes of 78 patients with DMO and 30 eyes of 22 patients without DMO. FAF images were acquired using Heidelberg Retina Angiograph 2, and the signal levels of FAF in the individual subfields of the Early Treatment Diabetic Retinopathy Study grid were measured. We evaluated the association between quantified FAF and the logMAR VA and OCT findings. One hundred and three eyes with DMO had lower FAF signal intensity levels in the parafoveal subfields compared with 30 eyes without DMO. The autofluorescence intensity in the parafoveal subfields was associated negatively with logMAR VA and the retinal thickness in the corresponding subfields. The autofluorescence levels in the parafoveal subfield, except the nasal subfield, were lower in eyes with autofluorescent cystoid spaces in the corresponding subfield than in those without autofluorescent cystoid spaces. The autofluorescence level in the central subfield was related to foveal cystoid spaces but not logMAR VA or retinal thickness in the corresponding area. Quantified FAF in the parafovea has diagnostic significance and is clinically relevant in DMO.

  14. Information criteria for quantifying loss of reversibility in parallelized KMC

    Energy Technology Data Exchange (ETDEWEB)

    Gourgoulias, Konstantinos, E-mail: gourgoul@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Rey-Bellet, Luc, E-mail: luc@math.umass.edu

    2017-01-01

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.
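
    For orientation, the entropy production rate mentioned above has a standard closed form for a continuous-time Markov jump process with transition rates q and stationary distribution π (the Schnakenberg form); the paper's contribution is a posteriori estimators for settings like Parallel KMC where this expression cannot be evaluated directly:

```latex
\sigma \;=\; \frac{1}{2}\sum_{x \neq y}
\bigl[\pi(x)\,q(x,y) - \pi(y)\,q(y,x)\bigr]
\ln\frac{\pi(x)\,q(x,y)}{\pi(y)\,q(y,x)} \;\geq\; 0 .
```

    σ vanishes exactly when detailed balance holds, so a strictly positive estimate quantifies how far a parallelization scheme has driven the simulated dynamics from reversibility.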

  15. Quantifying the topography of the intrinsic energy landscape of flexible biomolecular recognition

    Science.gov (United States)

    Chu, Xiakun; Gan, Linfeng; Wang, Erkang; Wang, Jin

    2013-01-01

    Biomolecular functions are determined by their interactions with other molecules. Biomolecular recognition is often flexible and associated with large conformational changes involving both binding and folding. However, the global and physical understanding for the process is still challenging. Here, we quantified the intrinsic energy landscapes of flexible biomolecular recognition in terms of binding–folding dynamics for 15 homodimers by exploring the underlying density of states, using a structure-based model both with and without considering energetic roughness. By quantifying three individual effective intrinsic energy landscapes (one for interfacial binding, two for monomeric folding), the association mechanisms for flexible recognition of 15 homodimers can be classified into two-state cooperative “coupled binding–folding” and three-state noncooperative “folding prior to binding” scenarios. We found that the association mechanism of flexible biomolecular recognition relies on the interplay between the underlying effective intrinsic binding and folding energy landscapes. By quantifying the whole global intrinsic binding–folding energy landscapes, we found strong correlations between the landscape topography measure Λ (dimensionless ratio of energy gap versus roughness modulated by the configurational entropy) and the ratio of the thermodynamic stable temperature versus trapping temperature, as well as between Λ and binding kinetics. Therefore, the global energy landscape topography determines the binding–folding thermodynamics and kinetics, crucial for the feasibility and efficiency of realizing biomolecular function. We also found “U-shape” temperature-dependent kinetic behavior and a dynamical cross-over temperature for dividing exponential and nonexponential kinetics for two-state homodimers. Our study provides a unique way to bridge the gap between theory and experiments. PMID:23754431

  16. Plug Load Behavioral Change Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Metzger, I.; Kandt, A.; VanGeet, O.

    2011-08-01

    This report documents the methods and results of a plug load study of the Environmental Protection Agency's Region 8 Headquarters in Denver, Colorado, conducted by the National Renewable Energy Laboratory. The study quantified the effect of mechanical and behavioral change approaches on plug load energy reduction and identified effective ways to reduce plug load energy. Load reduction approaches included automated energy management systems and behavioral change strategies.

  17. Quantifying food intake in socially housed monkeys: social status effects on caloric consumption

    Science.gov (United States)

    Wilson, Mark E.; Fisher, Jeff; Fischer, Andrew; Lee, Vanessa; Harris, Ruth B.; Bartness, Timothy J.

    2008-01-01

    Obesity results from a number of factors including socio-environmental influences, and rodent models show that several different stressors increase the preference for calorically dense foods, leading to an obese phenotype. We present here a non-human primate model using socially housed adult female macaques living in long-term stable groups given access to diets of different caloric density. Consumption of a low fat diet (LFD; 15% of calories from fat) and a high fat diet (HFD; 45% of calories from fat) was quantified by means of a custom-built, automated feeder that dispensed a pellet of food when activated by a radiofrequency chip implanted subcutaneously in the animal’s wrist. Socially subordinate females showed indices of chronic psychological stress, having reduced glucocorticoid negative feedback and higher frequencies of anxiety-like behavior. Twenty-four hour intakes of both the LFD and HFD were significantly greater in subordinates than dominants, an effect that persisted whether standard monkey chow (13% of calories from fat) was present or absent. Furthermore, although dominants restricted their food intake to daylight, subordinates continued to feed at night. Total caloric intake was significantly correlated with body weight change. Collectively, these results show that food intake can be reliably quantified in non-human primates living in complex social environments and suggest that socially subordinate females consume more calories, indicating that this ethologically relevant model may help us understand how psychosocial stress changes food preferences and consumption, leading to obesity. PMID:18486158

  18. A comparative analysis of alternative approaches for quantifying nonlinear dynamics in cardiovascular system.

    Science.gov (United States)

    Chen, Yun; Yang, Hui

    2013-01-01

    Heart rate variability (HRV) analysis has emerged as an important research topic for evaluating autonomic cardiac function. However, traditional time- and frequency-domain analyses characterize and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. Note that these extracted nonlinear features provide information about nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate the performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. The three nonlinear methods are evaluated not only individually but also in combination using three classification algorithms, i.e., linear discriminant analysis, quadratic discriminant analysis and k-nearest neighbors. Experimental results show that the three nonlinear methods capture nonlinear dynamics from different perspectives and that the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features show promise for identifying disorders in autonomic cardiovascular function.
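
    Of the three approaches above, multiscale entropy is the most direct to sketch: it applies sample entropy to successively coarse-grained copies of the series. The following is a minimal illustration of the sample entropy core, not the authors' implementation; the tolerance convention (r as a fraction of the standard deviation) and the toy RR series are assumptions.

```python
import numpy as np

# Minimal sketch of sample entropy, the building block of multiscale
# entropy; parameter conventions and the toy series are assumptions.
def sample_entropy(x, m: int = 2, r: float = 0.2) -> float:
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)
    # n - m overlapping templates of length m and of length m + 1.
    tm = np.array([x[i:i + m] for i in range(n - m)])
    tm1 = np.array([x[i:i + m + 1] for i in range(n - m)])

    def matches(templates) -> int:
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to all later templates (no self-matches).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b, a = matches(tm), matches(tm1)
    return -np.log(a / b) if a > 0 else float("inf")

rr = np.random.default_rng(0).normal(0.8, 0.05, 1_000)  # toy RR intervals (s)
print(sample_entropy(rr))
```

    Multiscale entropy then repeats this computation on coarse-grained copies of the series (non-overlapping window averages at scales 1, 2, 3, ...).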

  19. Handwriting Movement Kinematics for Quantifying EPS in Patients Treated with Atypical Antipsychotics

    Science.gov (United States)

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Niculescu, Alexander B.; Lohr, James B.

    2009-01-01

    Ongoing monitoring of neuroleptic-induced extrapyramidal side effects (EPS) is important to maximize treatment outcome, improve medication adherence and reduce re-hospitalization. Traditional approaches for assessing EPS such as parkinsonism, tardive akathisia, or dyskinesia rely upon clinical ratings. However, these observer-based EPS severity ratings can be unreliable and are subject to examiner bias. In contrast, quantitative instrumental methods are less subject to bias. Most instrumental methods have only limited clinical utility because of their complexity and costs. This paper describes an easy-to-use instrumental approach based on handwriting movements for quantifying EPS. Here, we present findings from psychiatric patients treated with atypical (second generation) antipsychotics. The handwriting task consisted of a sentence written several times within a 2 cm vertical boundary at a comfortable speed using an inkless pen and digitizing tablet. Kinematic variables including movement duration, peak vertical velocity and the number of acceleration peaks, and average normalized jerk (a measure of smoothness) for each up or down stroke and their submovements were analyzed. Results from 59 psychosis patients and 46 healthy comparison subjects revealed significant slowing and dysfluency in patients compared to controls. We observed differences across medications and daily dose. These findings support the ecological validity of handwriting movement analysis as an objective behavioral biomarker for quantifying the effects of antipsychotic medication and dose on the motor system. PMID:20381875

  20. Quantifying carbon footprint reduction opportunities for U.S. households and communities.

    Science.gov (United States)

    Jones, Christopher M; Kammen, Daniel M

    2011-05-01

    Carbon management is of increasing interest to individuals, households, and communities. In order to effectively assess and manage their climate impacts, individuals need information on the financial and greenhouse gas benefits of effective mitigation opportunities. We use consumption-based life cycle accounting techniques to quantify the carbon footprints of typical U.S. households in 28 cities for 6 household sizes and 12 income brackets. The model includes emissions embodied in transportation, energy, water, waste, food, goods, and services. We further quantify greenhouse gas and financial savings from 13 potential mitigation actions across all household types. The model suggests that the size and composition of carbon footprints vary dramatically between geographic regions and within regions based on basic demographic characteristics. Despite these differences, large cash-positive carbon footprint reductions are evident across all household types and locations; however, realizing this potential may require tailoring policies and programs to different population segments with very different carbon footprint profiles. The results of this model have been incorporated into an open access online carbon footprint management tool designed to enable behavior change at the household level through personalized feedback.

  1. Quantifying sleep architecture dynamics and individual differences using big data and Bayesian networks.

    Science.gov (United States)

    Yetton, Benjamin D; McDevitt, Elizabeth A; Cellini, Nicola; Shelton, Christian; Mednick, Sara C

    2018-01-01

    The pattern of sleep stages across a night (sleep architecture) is influenced by biological, behavioral, and clinical variables. However, traditional measures of sleep architecture, such as stage proportions, fail to capture sleep dynamics. Here we quantify the impact of individual differences on the dynamics of sleep architecture and determine which factors, or set of factors, best predict the next sleep stage from current stage information. We investigated the influence of age, sex, body mass index, time of day, and sleep time on static (e.g. minutes in stage, sleep efficiency) and dynamic measures of sleep architecture (e.g. transition probabilities and stage duration distributions) using a large dataset of 3202 nights from a non-clinical population. Multi-level regressions show that sex affects the duration of all non-rapid eye movement (NREM) stages, and that age has a curvilinear relationship with wake after sleep onset (WASO) and slow wave sleep (SWS) minutes. Bayesian network modeling reveals that sleep architecture depends on time of day, total sleep time, age and sex, but not BMI. Older adults, particularly males, have shorter bouts (more fragmentation) of Stage 2 and SWS, and they transition less frequently to these stages. Additionally, we showed that the next sleep stage and its duration can be optimally predicted by the prior two stages and age. Our results demonstrate the potential benefit of big data and Bayesian network approaches for quantifying the static and dynamic architecture of normal sleep.
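
    The finding that the prior two stages carry most of the predictive signal can be illustrated with a plain second-order transition table; the stage labels and toy hypnogram below are invented for illustration, and the paper's Bayesian network additionally conditions on age:

```python
from collections import Counter, defaultdict

# Hypothetical sketch: predict the next sleep stage from the prior two
# stages by tabulating second-order transition counts.
def fit_transitions(hypnograms):
    """Count (stage[t-2], stage[t-1]) -> stage[t] transitions."""
    table = defaultdict(Counter)
    for night in hypnograms:
        for a, b, c in zip(night, night[1:], night[2:]):
            table[(a, b)][c] += 1
    return table

def predict_next(table, prev2, prev1):
    counts = table.get((prev2, prev1))
    return counts.most_common(1)[0][0] if counts else None

toy = [["Wake", "N1", "N2", "N2", "SWS", "N2", "REM", "N2"]] * 10  # toy data
table = fit_transitions(toy)
print(predict_next(table, "N2", "SWS"))  # -> "N2" for this toy hypnogram
```

    In practice one would fit such a table per demographic group and compare its predictions against the full Bayesian network.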

  2. Metrics to quantify the importance of mixing state for CCN activity

    Directory of Open Access Journals (Sweden)

    J. Ching

    2017-06-01

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify the error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75 %. For more externally mixed populations (χ below 20 %) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than −40 % to about 150 %, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
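
    For reference, the mixing state index of Riemer and West (2013) is built from Shannon diversities of per-species mass fractions; writing p_i^a for the mass fraction of species a in particle i, p_i for the particle's share of total mass, and p^a for the bulk mass fraction of species a, the index takes the following form (summarized here from the standard definition, not quoted from the paper):

```latex
D_i = \exp\Bigl(-\sum_a p_i^a \ln p_i^a\Bigr), \qquad
D_\alpha = \exp\Bigl(\sum_i p_i \ln D_i\Bigr), \qquad
D_\gamma = \exp\Bigl(-\sum_a p^a \ln p^a\Bigr), \qquad
\chi = \frac{D_\alpha - 1}{D_\gamma - 1}.
```

    χ runs from 0 for a fully external mixture (each particle a single species) to 1 (100 %) for a fully internal one (every particle with identical composition), matching the percentage thresholds quoted in the abstract.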

  3. Quantifying the external cost of oil consumption within the context of sustainable development

    International Nuclear Information System (INIS)

    Abdel Sabour, S.A.

    2005-01-01

    The concept of sustainability implies that the flow of services derived from the use of natural capital must be constant over time and should be obtained at a constant price. For a depletable resource such as oil, future generations are strongly affected by the consumption behavior of the current generation. Since the ultimate oil stock within the Earth declines with cumulative consumption, excessive consumption of oil now reduces the availability of oil for future needs. Moreover, since oil reserves are normally extracted in order of ascending cost and descending quality, excessive consumption of relatively high-quality, cheap oil reserves by the current generation raises the cost at which future generations can meet their needs for oil and hence imposes an external cost on those generations. This study aims to quantify the external cost of consuming a barrel of oil within the context of sustainable development. An option-pricing model is developed to quantify this external cost, assuming that the external cost of consuming a barrel of oil now equals the value of the option to get a barrel of oil in the future at the same current cost. The total cost of consuming a barrel of oil now, which should be used in lifecycle costing to design more sustainable products, is then the sum of the oil price and this external cost.
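
    The abstract does not give the model's functional form, but the idea of valuing "the option to get a barrel of oil in the future at the same current cost" can be illustrated with a plain Black-Scholes call struck at today's cost; this is a stand-in, not the study's actual model, and all numbers below are placeholders:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(spot: float, strike: float, vol: float, rate: float, t: float) -> float:
    """Black-Scholes price of a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    N = NormalDist().cdf
    return spot * N(d1) - strike * exp(-rate * t) * N(d2)

# Illustrative numbers only: today's cost of a barrel as both spot and strike.
print(bs_call(spot=60.0, strike=60.0, vol=0.35, rate=0.03, t=10.0))
```

    Under this reading, higher expected price volatility or a longer horizon raises the option value and hence the external cost charged to present consumption.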

  4. Quantifying the impact of scholarly papers based on higher-order weighted citations.

    Science.gov (United States)

    Bai, Xiaomei; Zhang, Fuli; Hou, Jie; Lee, Ivan; Kong, Xiangjie; Tolba, Amr; Xia, Feng

    2018-01-01

    Quantifying the impact of a scholarly paper is of great significance, yet the effect of the geographical distance between citing and cited papers has not been explored. In this paper, we examine 30,596 papers published in Physical Review C and identify the relationship between citations and the geographical distances between author affiliations. Subsequently, a relative citation weight is applied to assess the impact of a scholarly paper. A higher-order weighted quantum PageRank algorithm is also developed to address the behavior of multiple-step citation flow. Capturing the citation dynamics with higher-order dependencies reveals the actual impact of papers, including necessary self-citations that are sometimes excluded in prior studies. Quantum PageRank is utilized in this paper to help differentiate nodes whose PageRank values are identical.
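
    As a rough sketch of the ingredients involved, the snippet below runs ordinary weighted PageRank by power iteration, with edge weights standing in for the paper's geography-based citation weights; the higher-order quantum variant developed in the paper is substantially more involved:

```python
import numpy as np

# Sketch only: weighted PageRank on a small citation graph.
def weighted_pagerank(W: np.ndarray, d: float = 0.85, tol: float = 1e-10):
    """W[i, j] >= 0 is the weight of the citation edge i -> j."""
    n = W.shape[0]
    out = W.sum(axis=1, keepdims=True)
    # Row-stochastic transition matrix; dangling nodes spread uniformly.
    P = np.where(out > 0, W / np.where(out == 0, 1, out), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (r @ P)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

W = np.array([[0, 1, 2],   # paper 0 cites 1 (weight 1) and 2 (weight 2)
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
print(weighted_pagerank(W))
```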

  5. Agent-Based Model to Study and Quantify the Evolution Dynamics of Android Malware Infection

    Directory of Open Access Journals (Sweden)

    Juan Alegre-Sanahuja

    2014-01-01

    In recent years the number of malware Apps that users download to their devices has risen. In this paper, we propose an agent-based model to quantify the evolution of Android malware infection, modeling the behavior of the users and the different markets where the users may download Apps. The model predicts the number of infected smartphones depending on the type of malware. Additionally, we estimate the cost that users must bear when the malware is on their devices. We are thus able to analyze which part is more critical: the users, who give indiscriminate permissions to the Apps or do not protect their devices with antivirus software, or the Android platform, due to the vulnerabilities of Android devices that permit their rooting. We focus on the community of Valencia, Spain, although the obtained results can be extrapolated to other places where the number of Android smartphones remains fairly stable.
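
    A toy version of such an agent-based model fits in a few lines; every rate below is invented for illustration and is not calibrated to the Valencia data used in the paper:

```python
import random

# Toy agent-based sketch: users download Apps from a market with some
# share of malicious Apps; antivirus lowers the infection chance.
random.seed(1)

N_USERS, DAYS = 10_000, 365
P_DOWNLOAD = 0.1    # chance a user downloads an App on a given day (assumed)
P_MALWARE = 0.02    # share of malicious Apps in the market (assumed)
P_ANTIVIRUS = 0.3   # share of users running antivirus (assumed)
AV_BLOCK = 0.8      # chance antivirus blocks a malicious App (assumed)

users = [{"antivirus": random.random() < P_ANTIVIRUS, "infected": False}
         for _ in range(N_USERS)]

for _ in range(DAYS):
    for u in users:
        if u["infected"] or random.random() >= P_DOWNLOAD:
            continue
        if random.random() < P_MALWARE:
            blocked = u["antivirus"] and random.random() < AV_BLOCK
            u["infected"] = not blocked

print(sum(u["infected"] for u in users), "infected of", N_USERS)
```

    Sweeping P_ANTIVIRUS against P_MALWARE in such a sketch is one way to ask the paper's question of whether user behavior or platform vulnerabilities dominate infection counts.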

  6. Quantifying the role of noise on droplet decisions in bifurcating microchannels

    Science.gov (United States)

    Norouzi Darabad, Masoud; Vaughn, Mark; Vanapalli, Siva

    2017-11-01

    While many aspects of the path selection of droplets flowing through a bifurcating microchannel have been studied, there are still unaddressed issues in predicting and controlling droplet traffic. One of the more important is understanding the origin of aperiodic patterns. As a new tool to investigate this phenomenon, we propose monitoring the continuous-time response of pressure fluctuations at different locations. We then use time-series analysis to investigate the dynamics of the system. We suggest that natural system noise is the cause of irregularity in the traffic patterns. Using a mathematical model, we investigate the effect of noise on droplet decisions at the junction. Noise can derive from different sources, including droplet size variation, droplet spacing, and pump-induced velocity fluctuation. By analyzing different situations we explain system behavior. We also investigate the "memory" of a microfluidic system in terms of the resistance to perturbations that quantify the allowable deviation in operating condition before the system changes state.

  7. Quantifying the impact of scholarly papers based on higher-order weighted citations

    Science.gov (United States)

    Bai, Xiaomei; Zhang, Fuli; Hou, Jie; Kong, Xiangjie; Tolba, Amr; Xia, Feng

    2018-01-01

    Quantifying the impact of a scholarly paper is of great significance, yet the effect of the geographical distance between citing and cited papers has not been explored. In this paper, we examine 30,596 papers published in Physical Review C and identify the relationship between citations and the geographical distances between author affiliations. Subsequently, a relative citation weight is applied to assess the impact of a scholarly paper. A higher-order weighted quantum PageRank algorithm is also developed to address the behavior of multiple-step citation flow. Capturing the citation dynamics with higher-order dependencies reveals the actual impact of papers, including necessary self-citations that are sometimes excluded in prior studies. Quantum PageRank is utilized in this paper to help differentiate nodes whose PageRank values are identical. PMID:29596426

  8. A high-throughput assay for quantifying appetite and digestive dynamics

    Science.gov (United States)

    Guggiana-Nilo, Drago; Soucy, Edward; Song, Erin Yue; Lei Wee, Caroline; Engert, Florian

    2015-01-01

    Food intake and digestion are vital functions, and their dysregulation is fundamental for many human diseases. Current methods do not support their dynamic quantification on large scales in unrestrained vertebrates. Here, we combine an infrared macroscope with fluorescently labeled food to quantify feeding behavior and intestinal nutrient metabolism with high temporal resolution, sensitivity, and throughput in naturally behaving zebrafish larvae. Using this method and rate-based modeling, we demonstrate that zebrafish larvae match nutrient intake to their bodily demand and that larvae adjust their digestion rate, according to the ingested meal size. Such adaptive feedback mechanisms make this model system amenable to identify potential chemical modulators. As proof of concept, we demonstrate that nicotine, l-lysine, ghrelin, and insulin have analogous impact on food intake as in mammals. Consequently, the method presented here will promote large-scale translational research of food intake and digestive function in a naturally behaving vertebrate. PMID:26108871

  9. Tuning and Quantifying Steric and Electronic Effects of N-Heterocyclic Carbenes

    KAUST Repository

    Falivene, Laura

    2014-07-12

    This chapter states that the main handles for tuning steric and electronic effects are the substituents on the N atoms, the nature of the C4-C5 bridge (either saturated or unsaturated), and the substituents on the C4 and C5 atoms. The initial intuition that the steric properties of N-heterocyclic carbenes (NHCs) could be modulated and could impact catalytic behavior stimulated the development of steric descriptors to quantify the steric requirement of different NHCs and, possibly, to compare them with tertiary phosphines. NHCs can be classified as typically strong σ-basic/π-acid ligands, although they have also been shown to exhibit reasonable π-basic properties. This electronic modularity allows NHC ligands to adapt flexibly to different chemical environments represented by a transition metal and the other ligands. © 2014 Wiley-VCH Verlag GmbH & Co. KGaA. All rights reserved.

  10. Foundations for a time reliability correlation system to quantify human reliability

    International Nuclear Information System (INIS)

    Dougherty, E.M. Jr.; Fragola, J.R.

    1988-01-01

    Time reliability correlations (TRCs) have been used in human reliability analysis (HRA) in conjunction with probabilistic risk assessment (PRA) to quantify post-initiator human failure events. The first TRCs were judgmental, but recent data taken from simulators have provided evidence for the development of a system of TRCs. This system has the equational form t = τ_R × τ_U, where the first factor is the lognormally distributed random variable of successful response time, derived from the simulator data, and the second factor is a unitary lognormal random variable to account for uncertainty in the model. The first random variable is further factored into a median response time, a factor to account for the dominant type of behavior assumed to be involved in the response, and a second factor to account for other influences on the reliability of the response.
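
    A hedged numerical sketch of such a TRC: the non-response probability at a given time window follows from the product of the two lognormal factors (the product of independent lognormals is itself lognormal). All parameter values below are placeholders, not values from the simulator data:

```python
import numpy as np

# Sketch of t = tau_R * tau_U with illustrative (assumed) parameters.
rng = np.random.default_rng(42)
median_response = 120.0   # median crew response time, seconds (assumed)
sigma_r = 0.5             # log-space spread of response time (assumed)
sigma_u = 0.3             # log-space spread of the uncertainty factor (assumed)

n = 1_000_000
tau_r = rng.lognormal(mean=np.log(median_response), sigma=sigma_r, size=n)
tau_u = rng.lognormal(mean=0.0, sigma=sigma_u, size=n)  # median 1.0
t = tau_r * tau_u  # product of lognormals is again lognormal

available = 600.0  # time window before the action is too late (assumed)
print("P(non-response):", np.mean(t > available))
```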

  11. Feedback and efficient behavior.

    Directory of Open Access Journals (Sweden)

    Sandro Casal

    Feedback is an effective tool for promoting efficient behavior: it enhances individuals' awareness of choice consequences in complex settings. Our study aims to isolate the mechanisms underlying the effects of feedback on achieving efficient behavior in a controlled environment. We design a laboratory experiment in which individuals are not aware of the consequences of different alternatives and, thus, cannot easily identify the efficient ones. We introduce feedback as a mechanism to enhance the awareness of consequences and to stimulate exploration and search for efficient alternatives. We assess the efficacy of three different types of intervention: provision of social information, manipulation of the frequency, and framing of feedback. We find that feedback is most effective when it is framed in terms of losses, that it reduces efficiency when it includes information about inefficient peers' behavior, and that a lower frequency of feedback does not disrupt efficiency. By quantifying the effect of different types of feedback, our study suggests useful insights for policymakers.

  12. Aggressive behavior

    NARCIS (Netherlands)

    Didden, H.C.M.; Lindsay, W.R.; Lang, R.B.; Sigafoos, J.; Deb, S.; Wiersma, J.; Peters-Scheffer, N.C.; Marschik, P.B.; O'Reilly, M.F.; Lancioni, G.E.; Singh, N.N.

    2016-01-01

    Aggressive behavior is common in individuals with intellectual and developmental disabilities (IDDs), and it is most often targeted for intervention. Psychological, contextual, and biological risk factors may contribute to the risk of aggressive behavior. Risk factors are gender (males), level of

  13. Quantifying the levitation picture of extended states in lattice models

    OpenAIRE

    Pereira, Ana. L. C.; Schulz, P. A.

    2002-01-01

    The behavior of extended states is quantitatively analyzed for two-dimensional lattice models. A levitation picture is established for both white-noise and correlated disorder potentials. In a continuum limit window of the lattice models we find simple quantitative expressions for the extended states levitation, suggesting an underlying universal behavior. On the other hand, these results point out that the quantum Hall phase diagrams may be disorder dependent.

  14. Quantifying the tibiofemoral joint space using x-ray tomosynthesis.

    Science.gov (United States)

    Kalinosky, Benjamin; Sabol, John M; Piacsek, Kelly; Heckel, Beth; Gilat Schmidt, Taly

    2011-12-01

    Digital x-ray tomosynthesis (DTS) has the potential to provide 3D information about the knee joint in a load-bearing posture, which may improve diagnosis and monitoring of knee osteoarthritis compared with projection radiography, the current standard of care. Manually quantifying and visualizing the joint space width (JSW) from 3D tomosynthesis datasets may be challenging. This work developed a semiautomated algorithm for quantifying the 3D tibiofemoral JSW from reconstructed DTS images. The algorithm was validated through anthropomorphic phantom experiments and applied to three clinical datasets. A user-selected volume of interest within the reconstructed DTS volume was enhanced with 1D multiscale gradient kernels. The edge-enhanced volumes were divided by polarity into tibial and femoral edge maps and combined across kernel scales. A 2D connected-components algorithm was performed to determine candidate tibial and femoral edges. A 2D JSW map was constructed to represent the 3D tibiofemoral joint space. To quantify the algorithm's accuracy, an adjustable knee phantom was constructed, and eleven posterior-anterior (PA) and lateral DTS scans were acquired with the medial minimum JSW of the phantom set to 0-5 mm in 0.5 mm increments (VolumeRad™, GE Healthcare, Chalfont St. Giles, United Kingdom). The accuracy of the algorithm was quantified by comparing the minimum JSW in a region of interest in the medial compartment of the JSW map to the measured phantom setting for each trial. In addition, the algorithm was applied to DTS scans of a static knee phantom and the JSW map was compared to values estimated from a manually segmented computed tomography (CT) dataset. The algorithm was also applied to three clinical DTS datasets of osteoarthritic patients. The algorithm segmented the joint space and generated a JSW map for all phantom and clinical datasets. For the adjustable phantom, the estimated minimum JSW values were plotted against the measured values for all
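
    The edge-map construction described above can be sketched compactly; the toy volume, the single Sobel kernel standing in for the multiscale gradient kernels, and the thresholds are all stand-ins for the paper's actual pipeline:

```python
import numpy as np
from scipy import ndimage

# Toy (z, y, x) "reconstructed volume": bright femur above a dark joint
# space, bright tibia below it. Dimensions and intensities are invented.
vol = np.zeros((32, 64, 64))
vol[:, :20, :] = 1.0   # femur
vol[:, 26:, :] = 1.0   # tibia

grad = ndimage.sobel(vol, axis=1)   # gradient along the superior-inferior axis
femur_edges = grad < -0.5           # bright-to-dark: inferior femoral edge
tibia_edges = grad > 0.5            # dark-to-bright: superior tibial edge

# JSW map: per (z, x) column, distance from the last femoral edge to the
# first tibial edge, in voxels.
y = np.arange(vol.shape[1])[None, :, None]
last_femur = np.max(np.where(femur_edges, y, -1), axis=1)
first_tibia = np.min(np.where(tibia_edges, y, vol.shape[1]), axis=1)
jsw_map = (first_tibia - last_femur).astype(float)
print("minimum JSW (voxels):", jsw_map.min())
```

    A real implementation would add the connected-components step to reject spurious edges and calibrate voxel spacing to millimetres.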

  15. Radiology reports: a quantifiable and objective textual approach

    International Nuclear Information System (INIS)

    Scott, J.A.; Palmer, E.L.

    2015-01-01

    Aim: To examine the feasibility of using automated lexical analysis in conjunction with machine learning to create a means of objectively characterising radiology reports for quality improvement. Materials and methods: Twelve lexical parameters were quantified from the collected reports of four radiologists. These included the number of different words used, number of sentences, reading grade, readability, usage of the passive voice, and lexical metrics of concreteness, ambivalence, complexity, passivity, embellishment, communication and cognition. Each radiologist was statistically compared to the mean of the group for each parameter to determine outlying report characteristics. The reproducibility of these parameters in a given radiologist's reporting style was tested by using only these 12 parameters as input to a neural network designed to establish the authorship of 60 unknown reports. Results: Significant differences in report characteristics were observed between radiologists, quantifying and characterising deviations of individuals from the group reporting style. The 12 metrics employed in a neural network correctly identified the author in each of 60 unknown reports tested, indicating a robust parametric signature. Conclusion: Automated and quantifiable methods can be used to analyse reporting style and provide impartial and objective feedback as well as to detect and characterise significant differences from the group. The parameters examined are sufficiently specific to identify the authors of reports and can potentially be useful in quality improvement and residency training. Highlights: • Radiology reports can be objectively studied based upon their lexical characteristics. • This analysis can help establish norms for reporting, resident training and authorship attribution. • This analysis can complement higher-level subjective analysis in quality improvement efforts.

  16. Using nitrate to quantify quick flow in a karst aquifer

    Science.gov (United States)

    Mahler, B.J.; Garner, B.D.

    2009-01-01

    In karst aquifers, contaminated recharge can degrade spring water quality, but quantifying the rapid recharge (quick flow) component of spring flow is challenging because of its temporal variability. Here, we investigate the use of nitrate in a two-endmember mixing model to quantify quick flow in Barton Springs, Austin, Texas. Historical nitrate data from recharging creeks and Barton Springs were evaluated to determine a representative nitrate concentration for the aquifer water endmember (1.5 mg/L) and the quick flow endmember (0.17 mg/L for nonstormflow conditions and 0.25 mg/L for stormflow conditions). Under nonstormflow conditions for 1990 to 2005, model results indicated that quick flow contributed from 0% to 55% of spring flow. The nitrate-based two-endmember model was applied to the response of Barton Springs to a storm and results compared to those produced using the same model with δ18O and specific conductance (SC) as tracers. Additionally, the mixing model was modified to allow endmember quick flow values to vary over time. Of the three tracers, nitrate appears to be the most advantageous because it is conservative and because the difference between the concentrations in the two endmembers is large relative to their variance. The δ18O-based model was very sensitive to variability within the quick flow endmember, and SC was not conservative over the timescale of the storm response. We conclude that a nitrate-based two-endmember mixing model might provide a useful approach for quantifying the temporally variable quick flow component of spring flow in some karst systems. © 2008 National Ground Water Association.
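
    The two-endmember model itself is a one-line mass balance; the endmember concentrations below are the ones quoted in the abstract, and the example spring value is hypothetical:

```python
# Two-endmember mixing: spring nitrate is a mass balance between aquifer
# water and quick flow, so the quick flow fraction f follows directly
# from the measured concentrations (all in mg/L).
def quick_flow_fraction(c_spring: float,
                        c_aquifer: float = 1.5,
                        c_quick: float = 0.17) -> float:
    """Solve c_spring = f * c_quick + (1 - f) * c_aquifer for f."""
    return (c_aquifer - c_spring) / (c_aquifer - c_quick)

print(quick_flow_fraction(1.1))  # ~0.30: about 30% quick flow
```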

  17. A simple method for quantifying jump loads in volleyball athletes.

    Science.gov (United States)

    Charlton, Paula C; Kenneally-Dabrowski, Claire; Sheppard, Jeremy; Spratford, Wayne

    2017-03-01

    To evaluate the validity of a commercially available wearable device, the Vert, for measuring vertical displacement and jump count in volleyball athletes, and to propose a potential method of quantifying external load during training and match play within this population. Validation study. The ability of the Vert device to measure vertical displacement in male junior elite volleyball athletes was assessed against reference-standard laboratory motion analysis. The ability of the Vert device to count jumps during training and match play was assessed via comparison with retrospective video analysis to determine precision and recall. A method of quantifying external load, known as the load index (LdIx) algorithm, was proposed using the product of the jump count and average kinetic energy. Correlations between two separate Vert devices and three-dimensional trajectory data were good to excellent for all jump types performed (r=0.83-0.97), with a mean bias of 3.57-4.28 cm. When matched against jumps identified through video analysis, the Vert demonstrated excellent precision (0.995-1.000), evidenced by a low number of false positives. The number of false negatives identified with the Vert was higher, resulting in lower recall values (0.814-0.930). The Vert is a commercially available tool that has potential for measuring vertical displacement and jump count in elite junior volleyball athletes without the need for time-consuming analysis and bespoke software, allowing the collected data to better quantify load using the proposed algorithm (LdIx). Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
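
    The abstract defines the load index only as the product of jump count and average kinetic energy; one plausible reading, using the fact that the takeoff kinetic energy of a jump to height h is m·g·h, is sketched below with invented session data:

```python
# Hypothetical sketch of the proposed load index (LdIx): jump count times
# the average kinetic energy implied by each jump's vertical displacement.
G = 9.81  # gravitational acceleration, m/s^2

def load_index(jump_heights_m, body_mass_kg):
    kinetic = [body_mass_kg * G * h for h in jump_heights_m]  # joules
    return len(kinetic) * (sum(kinetic) / len(kinetic))

session = [0.45, 0.52, 0.38, 0.61, 0.49]  # Vert-style displacements (m), invented
print(load_index(session, body_mass_kg=85.0))
```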

  18. A simplified score to quantify comorbidity in COPD.

    Directory of Open Access Journals (Sweden)

    Nirupama Putcha

    Full Text Available Comorbidities are common in COPD, but quantifying their burden is difficult. Currently there is a COPD-specific comorbidity index to predict mortality and another to predict general quality of life. We sought to develop and validate a COPD-specific comorbidity score that reflects comorbidity burden on patient-centered outcomes. Using the COPDGene study (GOLD II-IV COPD), we developed comorbidity scores to describe patient-centered outcomes employing three techniques: (1) simple count, (2) weighted score, and (3) weighted score based upon a statistical selection procedure. We tested associations, area under the curve (AUC) and calibration statistics to validate scores internally with outcomes of respiratory disease-specific quality of life (St. George's Respiratory Questionnaire, SGRQ), six-minute walk distance (6MWD), modified Medical Research Council (mMRC) dyspnea score and exacerbation risk, ultimately choosing one score for external validation in SPIROMICS. Associations between comorbidities and all outcomes were comparable across the three scores. All scores added predictive ability to models including age, gender, race, current smoking status, pack-years smoked and FEV1 (p<0.001 for all comparisons). AUC was similar between all three scores across outcomes: SGRQ (range 0.7624-0.7676), mMRC (0.7590-0.7644), 6MWD (0.7531-0.7560) and exacerbation risk (0.6831-0.6919). Because of similar performance, the comorbidity count was used for external validation. In the SPIROMICS cohort, the comorbidity count performed well to predict SGRQ (AUC 0.7891), mMRC (AUC 0.7611), 6MWD (AUC 0.7086), and exacerbation risk (AUC 0.7341). Quantifying comorbidity provides a more thorough understanding of the risk for patient-centered outcomes in COPD. A comorbidity count performs well to quantify comorbidity in a diverse population with COPD.

  19. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography.

    Science.gov (United States)

    Terrill, Philip I; Edwards, Bradley A; Nemati, Shamim; Butler, James P; Owens, Robert L; Eckert, Danny J; White, David P; Malhotra, Atul; Wellman, Andrew; Sands, Scott A

    2015-02-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63, p<0.001), detected the known reduction in loop gain with oxygen (n=11; mean±sem change in loop gain (ΔLG) -0.23±0.08, p=0.02) and acetazolamide (n=11; ΔLG -0.20±0.06, p=0.005), and predicted the OSA response to loop gain-lowering therapy. We validated a means to quantify the ventilatory control contribution to OSA pathogenesis using clinical polysomnography, enabling identification of likely responders to therapies targeting ventilatory control. Copyright ©ERS 2015.
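
    The published method fits a full ventilatory control model with chemical and arousal contributions to ventilatory drive; as a conceptual sketch only, the core idea that loop gain is the ratio of ventilatory response to disturbance can be illustrated with a least-squares fit (toy data, not the authors' algorithm):

```python
import numpy as np

def loop_gain(disturbance, response):
    """Least-squares estimate of loop gain LG from paired, mean-subtracted
    ventilation signals, using the relation response = -LG * disturbance."""
    d = np.asarray(disturbance, float)
    r = np.asarray(response, float)
    d -= d.mean()
    r -= r.mean()
    return -np.dot(d, r) / np.dot(d, d)

# A hypopnoea reduces ventilation (disturbance) and provokes an opposing
# rise in ventilatory drive (response); units are L/min about baseline:
dist = [-3.0, -2.5, -1.0, 0.5, 1.0]
resp = [1.5, 1.3, 0.5, -0.2, -0.6]
print(loop_gain(dist, resp))  # ~0.5 for this toy series
```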

  20. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    Science.gov (United States)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades, and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  1. Quantifying uncertainties in the structural response of SSME blades

    Science.gov (United States)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response was evaluated in terms of three variables: natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response. Uncertainties in material properties have insignificant effects.

  2. Quantifying DNA melting transitions using single-molecule force spectroscopy

    International Nuclear Information System (INIS)

    Calderon, Christopher P; Chen, W-H; Harris, Nolan C; Kiang, C-H; Lin, K-J

    2009-01-01

    We stretched a DNA molecule using an atomic force microscope (AFM) and quantified the mechanical properties associated with B and S forms of double-stranded DNA (dsDNA), molten DNA, and single-stranded DNA. We also fit overdamped diffusion models to the AFM time series and used these models to extract additional kinetic information about the system. Our analysis provides additional evidence supporting the view that S-DNA is a stable intermediate encountered during dsDNA melting by mechanical force. In addition, we demonstrated that the estimated diffusion models can detect dynamical signatures of conformational degrees of freedom not directly observed in experiments.

  3. Quantifying DNA melting transitions using single-molecule force spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Calderon, Christopher P [Department of Computational and Applied Mathematics, Rice University, Houston, TX (United States); Chen, W-H; Harris, Nolan C; Kiang, C-H [Department of Physics and Astronomy, Rice University, Houston, TX (United States); Lin, K-J [Department of Chemistry, National Chung Hsing University, Taichung, Taiwan (China)], E-mail: chkiang@rice.edu

    2009-01-21

    We stretched a DNA molecule using an atomic force microscope (AFM) and quantified the mechanical properties associated with B and S forms of double-stranded DNA (dsDNA), molten DNA, and single-stranded DNA. We also fit overdamped diffusion models to the AFM time series and used these models to extract additional kinetic information about the system. Our analysis provides additional evidence supporting the view that S-DNA is a stable intermediate encountered during dsDNA melting by mechanical force. In addition, we demonstrated that the estimated diffusion models can detect dynamical signatures of conformational degrees of freedom not directly observed in experiments.

  4. Quantifying unsteadiness and dynamics of pulsatory volcanic activity

    Science.gov (United States)

    Dominguez, L.; Pioli, L.; Bonadonna, C.; Connor, C. B.; Andronico, D.; Harris, A. J. L.; Ripepe, M.

    2016-06-01

    Pulsatory eruptions are marked by a sequence of explosions which can be separated by time intervals ranging from a few seconds to several hours. The quantification of the periodicities associated with these eruptions is essential not only for the comprehension of the mechanisms controlling explosivity, but also for classification purposes. We focus on the dynamics of pulsatory activity and quantify unsteadiness based on the distribution of the repose time intervals between single explosive events in relation to magma properties and eruptive styles. A broad range of pulsatory eruption styles are considered, including Strombolian, violent Strombolian and Vulcanian explosions. We find a general relationship between the median of the observed repose times in eruptive sequences and the viscosity of magma, given by η ≈ 100·t_median. This relationship applies to the complete range of magma viscosities considered in our study (10^2 to 10^9 Pa s) regardless of the eruption length, eruptive style and associated plume heights, suggesting that viscosity is the main magma property controlling eruption periodicity. Furthermore, the analysis of the explosive sequences in terms of failure time through statistical survival analysis provides further information: the dynamics of pulsatory activity can be successfully described in terms of frequency and regularity of the explosions, quantified based on the log-logistic distribution. A linear relationship is identified between the log-logistic parameters, μ and s. This relationship is useful for quantifying differences among eruptive styles, from very frequent and regular mafic events (Strombolian activity) to more sporadic and irregular Vulcanian explosions in silicic systems. The time scale controlled by the parameter μ, as a function of the median of the distribution, can therefore be correlated with the viscosity of magmas; while the complexity of the erupting system, including magma rise rate, degassing and fragmentation efficiency
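
    A minimal sketch of the reported empirical relation η ≈ 100·t_median, assuming repose times expressed in seconds and viscosity in Pa s (the unit convention is our assumption, chosen to match the quoted viscosity range):

```python
import statistics

def estimate_viscosity(repose_times_s):
    """Magma viscosity (Pa s) from the median repose time between
    explosions (s), using the empirical relation eta ~ 100 * t_median."""
    return 100.0 * statistics.median(repose_times_s)

# Strombolian-style sequence: repose times of a few seconds imply
# a mafic, low-viscosity magma.
print(estimate_viscosity([2.1, 3.5, 4.0, 2.8, 5.2]))  # 350 Pa s
```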

  5. Quantifying environmental performance using an environmental footprint calculator

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.B.; Loney, A.C.; Chan, V. [Conestoga-Rovers & Associates, Waterloo, Ontario (Canada)

    2009-07-01

    This paper provides a case study using relevant key performance indicators (KPIs) to evaluate the environmental performance of a business. Using recognized calculation and reporting frameworks, Conestoga-Rovers & Associates (CRA) designed the Environmental Footprint Calculator to quantify the environmental performance of a Canadian construction materials company. CRA designed the Environmental Footprint Calculator for our client to track and report their environmental performance in accordance with their targets, based on requirements of relevant guidance documents. The objective was to design a tool that effectively manages, calculates, and reports environmental performance to various stakeholders in a user-friendly format. (author)

  6. Quantifying capital goods for biological treatment of organic waste

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Petersen, Per H.; Nielsen, Peter D.

    2015-01-01

    for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To frame the quantification, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest...... on the different sizes for the three different types of waste (garden and park waste, food waste and sludge from wastewater treatment) in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials...

  7. Quantifying capital goods for collection and transport of waste

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Christensen, Thomas Højlund

    2012-01-01

    The capital goods for collection and transport of waste were quantified for different types of containers (plastic containers, cubes and steel containers) and an 18-tonnes compacting collection truck. The data were collected from producers and vendors of the bins and the truck. The service lifetime...... tonne of waste handled. The impact of producing the capital goods for waste collection and transport cannot be neglected as the capital goods dominate (>85%) the categories human-toxicity (non-cancer and cancer), ecotoxicity, resource depletion and aquatic eutrophication, but also play a role (>13

  8. Quantifying camouflage: how to predict detectability from appearance.

    Science.gov (United States)

    Troscianko, Jolyon; Skelhorn, John; Stevens, Martin

    2017-01-06

    Quantifying the conspicuousness of objects against particular backgrounds is key to understanding the evolution and adaptive value of animal coloration, and in designing effective camouflage. Quantifying detectability can reveal how colour patterns affect survival, how animals' appearances influence habitat preferences, and how receiver visual systems work. Advances in calibrated digital imaging are enabling the capture of objective visual information, but it remains unclear which methods are best for measuring detectability. Numerous descriptions and models of appearance have been used to infer the detectability of animals, but these models are rarely empirically validated or directly compared to one another. We compared the performance of human 'predators' to a bank of contemporary methods for quantifying the appearance of camouflaged prey. Background matching was assessed using several established methods, including sophisticated feature-based pattern analysis, granularity approaches and a range of luminance and contrast difference measures. Disruptive coloration is a further camouflage strategy where high contrast patterns disrupt the prey's tell-tale outline, making it more difficult to detect. Disruptive camouflage has been studied intensely over the past decade, yet defining and measuring it have proven far more problematic. We assessed how well existing disruptive coloration measures predicted capture times. Additionally, we developed a new method for measuring edge disruption based on an understanding of sensory processing and the way in which false edges are thought to interfere with animal outlines. Our novel measure of disruptive coloration was the best predictor of capture times overall, highlighting the importance of false edges in concealment over and above pattern or luminance matching. The efficacy of our new method for measuring disruptive camouflage together with its biological plausibility and computational efficiency represents a substantial

  9. Quantifying the energetics of cooperativity in a ternary protein complex

    DEFF Research Database (Denmark)

    Andersen, Peter S; Schuck, Peter; Sundberg, Eric J

    2002-01-01

    and mathematical modeling to describe the energetics of cooperativity in a trimolecular protein complex. As a model system for quantifying cooperativity, we studied the ternary complex formed by the simultaneous interaction of a superantigen with major histocompatibility complex and T cell receptor, for which...... a structural model is available. This system exhibits positive and negative cooperativity, as well as augmentation of the temperature dependence of binding kinetics upon the cooperative interaction of individual protein components in the complex. Our experimental and theoretical analysis may be applicable...... to other systems involving cooperativity....

  10. The quantified patient of the future: Opportunities and challenges.

    Science.gov (United States)

    Majmudar, Maulik D; Colucci, Lina Avancini; Landman, Adam B

    2015-09-01

    The healthcare system is undergoing rapid transformation as national policies increase patient access, reward positive health outcomes, and push for an end to the current era of episodic care. Advances in health sensors are rapidly moving diagnostic and monitoring capabilities into consumer products, enabling new care models. Although hospitals and health care providers have been slow to embrace novel health technologies, such innovations may help meet mounting pressure to provide timely, high quality, and low-cost care to large populations. This leading edge perspective focuses on the quantified-self movement and highlights the opportunities and challenges for patients, providers, and researchers. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Quantifying environmental performance using an environmental footprint calculator

    International Nuclear Information System (INIS)

    Smith, D.B.; Loney, A.C.; Chan, V.

    2009-01-01

    This paper provides a case study using relevant key performance indicators (KPIs) to evaluate the environmental performance of a business. Using recognized calculation and reporting frameworks, Conestoga-Rovers & Associates (CRA) designed the Environmental Footprint Calculator to quantify the environmental performance of a Canadian construction materials company. CRA designed the Environmental Footprint Calculator for our client to track and report their environmental performance in accordance with their targets, based on requirements of relevant guidance documents. The objective was to design a tool that effectively manages, calculates, and reports environmental performance to various stakeholders in a user-friendly format. (author)

  12. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  13. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
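
    The straight-line fit near Isc described in the two records above can be made concrete with a minimal frequentist analogue: ordinary least squares with a standard error on the intercept. This omits the Bayesian treatment, the evidence-based window selection and the model-discrepancy analysis; the data are illustrative.

```python
import numpy as np

def fit_isc(v, i):
    """Straight-line fit I = a + b*V to I-V points near short circuit;
    the intercept a estimates Isc, with a regression standard error."""
    v, i = np.asarray(v, float), np.asarray(i, float)
    X = np.column_stack([np.ones_like(v), v])
    coef, res, *_ = np.linalg.lstsq(X, i, rcond=None)
    n, p = len(v), 2
    s2 = res[0] / (n - p)              # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)  # coefficient covariance matrix
    return coef[0], np.sqrt(cov[0, 0])  # Isc and its standard error

# I-V points close to V = 0 (illustrative values, volts and amps):
v = [0.00, 0.02, 0.04, 0.06, 0.08]
i = [8.21, 8.20, 8.18, 8.17, 8.15]
isc, u_isc = fit_isc(v, i)
print(f"Isc = {isc:.3f} A +/- {u_isc:.3f} A")
```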

  14. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  15. Statistical Measures to Quantify Similarity between Molecular Dynamics Simulation Trajectories

    Directory of Open Access Journals (Sweden)

    Jenny Farmer

    2017-11-01

    Full Text Available Molecular dynamics simulation is commonly employed to explore protein dynamics. Despite the disparate timescales between functional mechanisms and molecular dynamics (MD) trajectories, functional differences are often inferred from differences in conformational ensembles between two proteins in structure-function studies that investigate the effect of mutations. A common measure to quantify differences in dynamics is the root mean square fluctuation (RMSF) about the average position of residues defined by Cα-atoms. Using six MD trajectories describing three native/mutant pairs of beta-lactamase, we make comparisons with additional measures that include Jensen-Shannon, modifications of Kullback-Leibler divergence, and local p-values from 1-sample Kolmogorov-Smirnov tests. These additional measures require knowing a probability density function, which we estimate by using a nonparametric maximum entropy method that quantifies rare events well. The same measures are applied to distance fluctuations between Cα-atom pairs. Results from several implementations for quantitative comparison of a pair of MD trajectories are reported based on fluctuations for per-residue and residue-residue local dynamics. We conclude that there is almost always a statistically significant difference between pairs of 100 ns all-atom simulations on moderate-sized proteins, as evident from extraordinarily low p-values.
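
    Two of the measures named above are easy to state concretely. The sketch below computes per-residue RMSF from Cα coordinates and a Jensen-Shannon divergence between two discretized fluctuation densities; the paper estimates densities with a nonparametric maximum entropy method, whereas smoothed histograms are used here purely for illustration.

```python
import numpy as np

def rmsf(positions):
    """Root mean square fluctuation about each residue's mean position.
    positions: array of shape (n_frames, n_residues, 3), Ca coordinates."""
    mean_pos = positions.mean(axis=0)
    return np.sqrt(((positions - mean_pos) ** 2).sum(axis=2).mean(axis=0))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two discretized densities
    (assumes strictly positive bins, e.g. after add-one smoothing)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    return 0.5 * np.sum(p * np.log(p / m)) + 0.5 * np.sum(q * np.log(q / m))

# Toy trajectory: 100 frames, 5 residues.
rng = np.random.default_rng(0)
traj = rng.normal(scale=0.5, size=(100, 5, 3))
print(rmsf(traj))

# Compare fluctuation histograms for a native/mutant residue pair:
native, _ = np.histogram(rng.normal(0.0, 1.0, 5000), bins=50, range=(-5, 5))
mutant, _ = np.histogram(rng.normal(0.3, 1.2, 5000), bins=50, range=(-5, 5))
print(jensen_shannon(native + 1, mutant + 1))  # +1 avoids empty bins
```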

  16. Quantifying construction and demolition waste: An analytical review

    International Nuclear Information System (INIS)

    Wu, Zezhou; Yu, Ann T.W.; Shen, Liyin; Liu, Guiwen

    2014-01-01

    Highlights: • Prevailing C&D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.

  17. Methods to Quantify Nickel in Soils and Plant Tissues

    Directory of Open Access Journals (Sweden)

    Bruna Wurr Rodak

    2015-06-01

    Full Text Available In comparison with other micronutrients, the levels of nickel (Ni) available in soils and plant tissues are very low, making quantification very difficult. The objective of this paper is to present optimized methods for determining Ni availability in soils by extractants and total Ni content in plant tissues for routine commercial laboratory analyses. Samples of natural and agricultural soils were processed and analyzed by Mehlich-1 extraction and by DTPA. To quantify Ni in the plant tissues, samples were digested with nitric acid in a closed system in a microwave oven. The measurement was performed by inductively coupled plasma/optical emission spectrometry (ICP-OES). There was a positive and significant correlation between the levels of available Ni in the soils subjected to Mehlich-1 and DTPA extraction, while for plant tissue samples the Ni levels recovered were high and similar to the reference materials. The availability of Ni in some of the natural soil and plant tissue samples was lower than the limits of quantification. Concentrations of this micronutrient were higher in the soil samples in which Ni had been applied. Nickel concentrations differed among the plant parts analyzed, with the highest levels in soybean grains. Grain concentrations, in comparison with shoot and leaf concentrations, were better correlated with soil-available levels for both extractants. The methods described in this article were efficient in quantifying Ni and can be used for routine laboratory analysis of soils and plant tissues.

  18. Development of an algorithm for quantifying extremity biological tissue

    International Nuclear Information System (INIS)

    Pavan, Ana L.M.; Miranda, Jose R.A.; Pina, Diana R. de

    2013-01-01

    Computed radiography (CR) has become the most widely used system for image acquisition and production since its introduction in the 1980s. Detection and early diagnosis via CR are important for the successful treatment of diseases such as arthritis, metabolic bone diseases, tumors, infections and fractures. However, the standards used for optimizing these images are based on international protocols. It is therefore necessary to compose radiographic techniques for the CR system that provide a reliable medical diagnosis with doses as low as reasonably achievable. To this end, the aim of this work is to develop a tissue-quantifying algorithm, allowing the construction of a homogeneous phantom used to compose such techniques. A database of computed tomography images of the hand and wrist of adult patients was developed. Using Matlab® software, a computational algorithm was developed to quantify the average thickness of the soft tissue and bone present in the anatomical region under study, as well as the corresponding thickness of simulator materials (aluminum and Lucite). This was achieved by applying masks and a Gaussian histogram-removal technique. As a result, average thicknesses of 18.97 mm for soft tissue and 6.15 mm for bone were obtained, with equivalents in simulator materials of 23.87 mm of acrylic and 1.07 mm of aluminum. The results agreed with the mean thickness of the biological tissues of a standard patient hand, enabling the construction of a homogeneous phantom.

  19. A framework for quantifying net benefits of alternative prognostic models.

    Science.gov (United States)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.

  20. A framework for quantifying net benefits of alternative prognostic models‡

    Science.gov (United States)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066

  1. Quantifying potential recharge in mantled sinkholes using ERT.

    Science.gov (United States)

    Schwartz, Benjamin F; Schreiber, Madeline E

    2009-01-01

    Potential recharge through thick soils in mantled sinkholes was quantified using differential electrical resistivity tomography (ERT). Conversion of time series two-dimensional (2D) ERT profiles into 2D volumetric water content profiles using a numerically optimized form of Archie's law allowed us to monitor temporal changes in water content in soil profiles up to 9 m in depth. Combining Penman-Monteith daily potential evapotranspiration (PET) and daily precipitation data with potential recharge calculations for three sinkhole transects indicates that potential recharge occurred only during brief intervals over the study period and ranged from 19% to 31% of cumulative precipitation. Spatial analysis of ERT-derived water content showed that infiltration occurred both on sinkhole flanks and in sinkhole bottoms. Results also demonstrate that mantled sinkholes can act as regions of both rapid and slow recharge. Rapid recharge is likely the result of flow through macropores (such as root casts and thin gravel layers), while slow recharge is the result of unsaturated flow through fine-grained sediments. In addition to developing a new method for quantifying potential recharge at the field scale in unsaturated conditions, we show that mantled sinkholes are an important component of storage in a karst system.
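
    A minimal sketch of the resistivity-to-water-content step, assuming a generic form of Archie's law with illustrative parameter values (the study used a numerically optimized form calibrated to its site):

```python
def water_content(resistivity, porosity, rho_w=20.0, a=1.0, m=2.0, n=2.0):
    """Volumetric water content from bulk resistivity via Archie's law.

    Archie's law: rho = a * rho_w * porosity**(-m) * S**(-n), where S is
    water saturation; volumetric water content is theta = porosity * S.
    All parameter values here are illustrative, not the study's calibration.
    """
    saturation = (a * rho_w / (resistivity * porosity ** m)) ** (1.0 / n)
    return porosity * min(saturation, 1.0)

# A drop in resistivity from 400 to 150 ohm-m after rainfall maps to a
# rise in water content; the difference approximates potential recharge.
theta_dry = water_content(400.0, porosity=0.45)
theta_wet = water_content(150.0, porosity=0.45)
print(theta_dry, theta_wet, theta_wet - theta_dry)
```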

  2. Quantifying facial paralysis using the Kinect v2.

    Science.gov (United States)

    Gaber, Amira; Taher, Mona F; Wahed, Manal Abdel

    2015-01-01

    Assessment of facial paralysis (FP) and quantitative grading of facial asymmetry are essential in order to quantify the extent of the condition as well as to follow its improvement or progression. As such, there is a need for an accurate quantitative grading system that is easy to use, inexpensive and has minimal inter-observer variability. A comprehensive automated system to quantify and grade FP is the main objective of this work. An initial prototype has been presented by the authors. The present research aims to enhance the accuracy and robustness of one of this system's modules: the resting symmetry module. This is achieved by including several modifications to the computation method of the symmetry index (SI) for the eyebrows, eyes and mouth. These modifications are the gamma correction technique, the area of the eyes, and the slope of the mouth. The system was tested on normal subjects and showed promising results. The mean SI of the eyebrows decreased slightly from 98.42% to 98.04% with the modified method, while the mean SI for the eyes and mouth increased from 96.93% to 99.63% and from 95.6% to 98.11%, respectively. The system is easy to use, inexpensive, automated and fast, has no inter-observer variability and is thus well suited for clinical use.
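
    The abstract does not give the symmetry index formula; one common form, which reproduces the quoted near-100% values for nearly symmetric features, is sketched below (the measurements are invented for illustration and this is not necessarily the authors' exact definition):

```python
def symmetry_index(left, right):
    """Resting symmetry index (percent) for a paired facial feature
    measurement, e.g. eye area or eyebrow height in pixels."""
    if left == right:
        return 100.0
    return (1.0 - abs(left - right) / max(left, right)) * 100.0

# Eye areas segmented from a Kinect v2 colour frame (pixels):
print(symmetry_index(1520.0, 1490.0))  # ~98% -> nearly symmetric
```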

  3. Digitally quantifying cerebral hemorrhage using Photoshop and Image J.

    Science.gov (United States)

    Tang, Xian Nan; Berman, Ari Ethan; Swanson, Raymond Alan; Yenari, Midori Anne

    2010-07-15

    A spectrophotometric hemoglobin assay is widely used to estimate the extent of brain hemorrhage by measuring the amount of hemoglobin in the brain. However, this method requires using the entire brain sample, leaving none for histology or other assays. Other widely used measures of gross brain hemorrhage are generally semi-quantitative and can miss subtle differences. Semi-quantitative brain hemorrhage scales may also be subject to bias. Here, we present a method to digitally quantify brain hemorrhage using Photoshop and Image J, and compare this method to the spectrophotometric hemoglobin assay. Male Sprague-Dawley rats received varying amounts of autologous blood injected into the cerebral hemispheres in order to generate different-sized hematomas. 24 h later, the brains were harvested, sectioned, photographed, then prepared for the hemoglobin assay. From the brain section photographs, pixels containing hemorrhage were identified by Photoshop and the optical intensity was measured by Image J. Identification of hemorrhage size using optical intensities strongly correlated with the hemoglobin assay (R=0.94). We conclude that our method can accurately quantify the extent of hemorrhage. An advantage of this technique is that brain tissue can be used for additional studies. Published by Elsevier B.V.
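
    A rough sketch of the pixel-based quantification, substituting a simple colour-threshold rule for the interactive Photoshop selection the authors used; the thresholds and the optical-intensity definition below are illustrative assumptions:

```python
import numpy as np

def hemorrhage_burden(image_rgb, red_min=120, dark_max=90):
    """Estimate hemorrhage extent from a photographed brain section.

    Pixels are flagged as hemorrhagic when the red channel is high and
    the green/blue channels are low (illustrative thresholds). Returns
    the flagged pixel count and their summed optical intensity.
    """
    r = image_rgb[..., 0].astype(float)
    g = image_rgb[..., 1].astype(float)
    b = image_rgb[..., 2].astype(float)
    mask = (r >= red_min) & (g <= dark_max) & (b <= dark_max)
    intensity = 255.0 - (r + g + b) / 3.0  # darker pixels -> more blood
    return int(mask.sum()), float(intensity[mask].sum())

# Toy 4x4 "section" containing a 2x2 blood-coloured patch:
img = np.full((4, 4, 3), 200, dtype=np.uint8)
img[1:3, 1:3] = (160, 40, 40)
print(hemorrhage_burden(img))  # (4, 700.0)
```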

  4. DIGITALLY QUANTIFYING CEREBRAL HEMORRHAGE USING PHOTOSHOP® AND IMAGE J

    Science.gov (United States)

    Tang, Xian Nan; Berman, Ari Ethan; Swanson, Raymond Alan; Yenari, Midori Anne

    2010-01-01

    A spectrophotometric hemoglobin assay is widely used to estimate the extent of brain hemorrhage by measuring the amount of hemoglobin in the brain. However, this method requires using the entire brain sample, leaving none for histology or other assays. Other widely used measures of gross brain hemorrhage are generally semi-quantitative and can miss subtle differences. Semi-quantitative brain hemorrhage scales may also be subject to bias. Here, we present a method to digitally quantify brain hemorrhage using Photoshop and Image J, and compared this method to the spectrophotometric hemoglobin assay. Male Sprague-Dawley rats received varying amounts of autologous blood injected into the cerebral hemispheres in order to generate different sized hematomas. 24 hours later, the brains were harvested, sectioned, photographed then prepared for the hemoglobin assay. From the brain section photographs, pixels containing hemorrhage were identified by Photoshop® and the optical intensity was measured by Image J. Identification of hemorrhage size using optical intensities strongly correlated to the hemoglobin assay (R=0.94). We conclude that our method can accurately quantify the extent of hemorrhage. An advantage of this technique is that brain tissue can be used for additional studies. PMID:20452374

  5. Quantifying construction and demolition waste: an analytical review.

    Science.gov (United States)

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Quantifying Urban Fragmentation under Economic Transition in Shanghai City, China

    Directory of Open Access Journals (Sweden)

    Heyuan You

    2015-12-01

    Full Text Available Urban fragmentation affects sustainability through multiple impacts on economic, social, and environmental costs. Characterizing the dynamics of urban fragmentation in relation to economic transition should provide implications for sustainability. However, rather few efforts have been made on this issue. Using the case of Shanghai (China), this paper quantifies urban fragmentation in relation to economic transition. In particular, urban fragmentation is quantified by a time-series of remotely sensed images and a set of landscape metrics, and economic transition is described by a set of indicators covering three aspects (globalization, decentralization, and marketization). Results show that urban fragmentation presents an increasing linear trend. Multivariate regression identifies a positive linear correlation between urban fragmentation and economic transition. More specifically, the relative influence is different for the three components of economic transition. The relative influence of decentralization is stronger than that of globalization and marketization. The joint influences of decentralization and globalization are the strongest for urban fragmentation. The demonstrated methodology can be applicable to other places after making suitable adjustments to the economic transition indicators and fragmentation metrics.

  7. Design Life Level: Quantifying risk in a changing climate

    Science.gov (United States)

    Rootzén, Holger; Katz, Richard W.

    2013-09-01

    In the past, the concepts of return levels and return periods have been standard and important tools for engineering design. However, these concepts are based on the assumption of a stationary climate and do not apply to a changing climate, whether local or global. In this paper, we propose a refined concept, Design Life Level, which quantifies risk in a nonstationary climate and can serve as the basis for communication. In current practice, typical hydrologic risk management focuses on a standard (e.g., in terms of a high quantile corresponding to the specified probability of failure for a single year). Nevertheless, the basic information needed for engineering design should consist of (i) the design life period (e.g., the next 50 years, say 2015-2064); and (ii) the probability (e.g., 5% chance) of a hazardous event (typically, in the form of the hydrologic variable exceeding a high level) occurring during the design life period. Capturing both of these design characteristics, the Design Life Level is defined as an upper quantile (e.g., 5%) of the distribution of the maximum value of the hydrologic variable (e.g., water level) over the design life period. We relate this concept and variants of it to existing literature and illustrate how they, and some useful complementary plots, may be computed and used. One practically important consideration concerns quantifying the statistical uncertainty in estimating a high quantile under nonstationarity.
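
    A Monte Carlo sketch of the definition: the Design Life Level is an upper quantile of the distribution of the maximum of the hydrologic variable over the design life. The nonstationary Gumbel model for annual maxima below is purely illustrative.

```python
import numpy as np

def design_life_level(annual_max_draws, p_fail=0.05):
    """Design Life Level: the (1 - p_fail) quantile of the distribution
    of the maximum of a hydrologic variable over the design life.

    annual_max_draws: array of shape (n_sims, design_years) holding
    simulated annual maxima, year by year, under a nonstationary model.
    """
    life_maxima = annual_max_draws.max(axis=1)
    return np.quantile(life_maxima, 1.0 - p_fail)

# Illustrative nonstationary model: Gumbel annual maxima whose location
# parameter drifts upward by 0.5 cm/year over a 50-year design life
# (say 2015-2064), mimicking a trend in flood levels.
rng = np.random.default_rng(0)
years = np.arange(50)
draws = rng.gumbel(loc=100.0 + 0.5 * years, scale=15.0, size=(20000, 50))
print(design_life_level(draws))  # level with a 5% chance of exceedance
```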

  8. Quantifying complexity in translational research: an integrated approach.

    Science.gov (United States)

    Munoz, David A; Nembhard, Harriet Black; Kraschnewski, Jennifer L

    2014-01-01

    The purpose of this paper is to quantify complexity in translational research. The impact of major operational steps and technical requirements is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. A three-phase integrated quality function deployment (QFD) and analytic hierarchy process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Generally, the evidence generated was valuable for understanding various components in translational research. In particular, the authors found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research.
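
    The consistency ratio mentioned above is a standard AHP quantity, CR = CI/RI with CI = (lambda_max - n)/(n - 1). A short sketch using Saaty's random-index values and an invented 3x3 judgment matrix:

```python
import numpy as np

def consistency_ratio(pairwise):
    """AHP consistency ratio CR = CI / RI for a pairwise comparison
    matrix; CR <= 0.1 is the conventional threshold for acceptably
    consistent expert judgments."""
    A = np.asarray(pairwise, float)
    n = A.shape[0]
    lam_max = np.max(np.linalg.eigvals(A).real)   # principal eigenvalue
    ci = (lam_max - n) / (n - 1)                  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # Saaty's RI
    return ci / ri

# Hypothetical expert judgments comparing three research criteria:
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
print(consistency_ratio(A))  # well below 0.1 -> consistent judgments
```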

  9. Quantifying the strength of quorum sensing crosstalk within microbial communities.

    Directory of Open Access Journals (Sweden)

    Kalinga Pavan T Silva

    2017-10-01

    Full Text Available In multispecies microbial communities, the exchange of signals such as acyl-homoserine lactones (AHL) enables communication within and between species of Gram-negative bacteria. This process, commonly known as quorum sensing, aids in the regulation of genes crucial for the survival of species within heterogeneous populations of microbes. Although signal exchange was studied extensively in well-mixed environments, less is known about the consequences of crosstalk in spatially distributed mixtures of species. Here, signaling dynamics were measured in a spatially distributed system containing multiple strains utilizing homologous signaling systems. Crosstalk between strains containing the lux, las and rhl AHL-receptor circuits was quantified. In a distributed population of microbes, the impact of community composition on spatio-temporal dynamics was characterized and compared to simulation results using a modified reaction-diffusion model. After introducing a single term to account for crosstalk between each pair of signals, the model was able to reproduce the activation patterns observed in experiments. We quantified the robustness of signal propagation in the presence of interacting signals, finding that signaling dynamics are largely robust to interference. The ability of several wild isolates to participate in AHL-mediated signaling was investigated, revealing distinct signatures of crosstalk for each species. Our results present a route to characterize crosstalk between species and predict systems-level signaling dynamics in multispecies communities.

  10. Using multiscale norms to quantify mixing and transport

    International Nuclear Information System (INIS)

    Thiffeault, Jean-Luc

    2012-01-01

    Mixing is relevant to many areas of science and engineering, including the pharmaceutical and food industries, oceanography, atmospheric sciences and civil engineering. In all these situations one goal is to quantify and often then to improve the degree of homogenization of a substance being stirred, referred to as a passive scalar or tracer. A classical measure of mixing is the variance of the concentration of the scalar, which is the L2 norm of a mean-zero concentration field. Recently, other norms have been used to quantify mixing, in particular the mix-norm as well as negative Sobolev norms. These norms have the advantage that, unlike variance, they decay even in the absence of diffusion, and their decay corresponds to the flow being mixing in the sense of ergodic theory. General Sobolev norms weigh scalar gradients differently, and are known as multiscale norms for mixing. We review the applications of such norms to mixing and transport, and show how they can be used to optimize the stirring and mixing of a decaying passive scalar. We then review recent work on the less-studied case of a continuously replenished scalar field—the source–sink problem. In that case the flows that optimally reduce the norms are associated with transport rather than mixing: they push sources onto sinks, and vice versa. (invited article)
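
    A small sketch of the multiscale-norm idea for a periodic two-dimensional scalar: the variance (L2 norm) is blind to scale, while a negative Sobolev norm such as the mix-norm (s = -1/2) decays as the field is transferred to finer scales. The grid size, normalization and test field are our choices for illustration.

```python
import numpy as np

def sobolev_norm(theta, s=-0.5, L=2 * np.pi):
    """H^s norm of a mean-zero periodic scalar on a square [0, L)^2 grid.
    s = 0 recovers the (root) variance; s = -1/2 is the mix-norm."""
    n = theta.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                   # avoid 0**s; the mean mode is zeroed
    that = np.fft.fft2(theta) / n**2
    that[0, 0] = 0.0                 # enforce mean zero
    return np.sqrt(np.sum(k2**s * np.abs(that) ** 2))

# A single Fourier mode: refining it (higher wavenumber) leaves the
# variance unchanged but shrinks the mix-norm.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
for kmode in (1, 4, 16):
    theta = np.sin(kmode * X)
    print(kmode, sobolev_norm(theta, s=0.0), sobolev_norm(theta, s=-0.5))
```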

  11. Constructing carbon offsets: The obstacles to quantifying emission reductions

    International Nuclear Information System (INIS)

    Millard-Ball, Adam; Ortolano, Leonard

    2010-01-01

    The existing literature generally ascribes the virtual absence of the transport sector from the Clean Development Mechanism (CDM) to the inherent complexity of quantifying emission reductions from mobile sources. We use archival analysis and interviews with CDM decision-makers and experts to identify two additional groups of explanations. First, we show the significance of aspects of the CDM's historical evolution, such as the order in which methodologies were considered and the assignment of expert desk reviewers. Second, we highlight inconsistencies in the treatment of uncertainty across sectors. In contrast to transport methodologies, other sectors are characterized by a narrow focus on sources of measurement uncertainty and a neglect of economic effects ('market leakages'). We do not argue that the rejection of transport methodologies was unjustified, but rather that many of the same problems are inherent in other sectors. Thus, the case of transport sheds light on fundamental problems in quantifying emission reductions under the CDM. We argue that a key theoretical attraction of the CDM (equalization of marginal abatement costs across all sectors) has been difficult to achieve in practice.

  12. A kernel plus method for quantifying wind turbine performance upgrades

    KAUST Repository

    Lee, Giwhyun

    2014-04-21

    Power curves are commonly estimated using the binning method recommended by the International Electrotechnical Commission, which primarily incorporates wind speed information. When such power curves are used to quantify a turbine's upgrade, the results may not be accurate because many other environmental factors in addition to wind speed, such as temperature, air pressure, turbulence intensity, wind shear and humidity, all potentially affect the turbine's power output. Wind industry practitioners are aware of the need to filter out effects from environmental conditions. Toward that objective, we developed a kernel plus method that allows incorporation of multivariate environmental factors in a power curve model, thereby controlling the effects from environmental factors while comparing power outputs. We demonstrate that the kernel plus method can serve as a useful tool for quantifying a turbine's upgrade because it is sensitive to small and moderate changes caused by certain turbine upgrades. Although we demonstrate the utility of the kernel plus method in this specific application, the resulting method is a general, multivariate model that can connect other physical factors, as long as their measurements are available, with a turbine's power output, which may allow us to explore new physical properties associated with wind turbine performance. © 2014 John Wiley & Sons, Ltd.
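
    The kernel plus estimator itself is not reproduced in the abstract; the sketch below shows the underlying idea of a multivariate kernel power curve (plain Nadaraya-Watson here) used to compare pre- and post-upgrade output under matched environmental conditions. The data, bandwidths and the simulated 3% upgrade are invented for illustration.

```python
import numpy as np

def kernel_power_curve(X_train, y_train, X_query, bandwidths):
    """Nadaraya-Watson kernel regression of power output on multiple
    environmental inputs (wind speed, air density, turbulence, ...).
    This sketches the multivariate-kernel idea only; the published
    "kernel plus" estimator has additional structure."""
    Xt = np.asarray(X_train, float) / bandwidths
    Xq = np.asarray(X_query, float) / bandwidths
    preds = np.empty(len(Xq))
    for j, x in enumerate(Xq):
        w = np.exp(-0.5 * np.sum((Xt - x) ** 2, axis=1))  # Gaussian kernel
        preds[j] = np.dot(w, y_train) / w.sum()
    return preds

# Compare pre- and post-upgrade output under matched conditions:
rng = np.random.default_rng(1)
X_pre = rng.uniform([3, 1.15, 0.05], [15, 1.30, 0.25], size=(500, 3))
y_pre = 100 * np.minimum(X_pre[:, 0], 12) ** 3 / 12**3 + rng.normal(0, 2, 500)
X_post, y_post = X_pre, y_pre * 1.03          # a 3% upgrade, same weather
grid = np.array([[8.0, 1.22, 0.12], [11.0, 1.22, 0.12]])
h = np.array([1.0, 0.02, 0.03])
gain = kernel_power_curve(X_post, y_post, grid, h) - \
       kernel_power_curve(X_pre, y_pre, grid, h)
print(gain)  # per-condition power gain attributable to the upgrade
```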

  13. Disordered crystals from first principles I: Quantifying the configuration space

    Science.gov (United States)

    Kühne, Thomas D.; Prodan, Emil

    2018-04-01

    This work represents the first chapter of a project on the foundations of first-principle calculations of the electron transport in crystals at finite temperatures. We are interested in the range of temperatures where most electronic components operate, that is, room temperature and above. The aim is a predictive first-principle formalism that combines ab-initio molecular dynamics and a finite-temperature Kubo-formula for homogeneous thermodynamic phases. The input for this formula is the ergodic dynamical system (Ω, G, dP) defining the thermodynamic crystalline phase, where Ω is the configuration space for the atomic degrees of freedom, G is the space group acting on Ω and dP is the ergodic Gibbs measure relative to the G-action. The present work develops an algorithmic method for quantifying (Ω, G, dP) from first principles. Using the silicon crystal as a working example, we find the Gibbs measure to be extremely well characterized by a multivariate normal distribution, which can be quantified using a small number of parameters. The latter are computed at various temperatures and communicated in the form of a table. Using this table, one can generate large and accurate thermally-disordered atomic configurations to serve, for example, as input for subsequent simulations of the electronic degrees of freedom.

  14. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Full Text Available Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding the past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they make it possible to take into account the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods that are used to quantify them. Here, we assess how the true length of the fire cycle, the short-term temporal variations in fire activity, and the sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. Then, we apply those results in a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regressions appear to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162–407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
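
    A minimal sketch of the simplest of the three survival models, the exponential (constant-hazard) estimate of the fire cycle, which accommodates right-censored intervals by dividing total exposure by the number of dated fire events (the data are illustrative):

```python
def fire_cycle_exponential(intervals, observed):
    """Exponential (constant-hazard) fire cycle estimate from
    dendroecological interval data with right-censoring.

    intervals -- time since the last fire (or since record start), years
    observed  -- True if the interval ends in a dated fire, False if it
                 is censored (no fire recorded up to the sampling date)
    The MLE of the mean interval is total exposure / number of events.
    """
    exposure = sum(intervals)
    events = sum(observed)
    return exposure / events

# Three dated fire intervals plus two censored records:
intervals = [180, 240, 95, 300, 310]
observed = [True, True, True, False, False]
print(fire_cycle_exponential(intervals, observed))  # 375 years
```

    Dropping the two censored records instead would give (180+240+95)/3 ≈ 172 years, which illustrates why retaining censored information matters when fire return intervals are long relative to the record.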

  15. Quantifying Short-Chain Chlorinated Paraffin Congener Groups.

    Science.gov (United States)

    Yuan, Bo; Bogdal, Christian; Berger, Urs; MacLeod, Matthew; Gebbink, Wouter A; Alsberg, Tomas; de Wit, Cynthia A

    2017-09-19

    Accurate quantification of short-chain chlorinated paraffins (SCCPs) poses an exceptional challenge to analytical chemists. SCCPs are complex mixtures of chlorinated alkanes with variable chain length and chlorination level; congeners with a fixed chain length (n) and number of chlorines (m) are referred to as a "congener group" CnClm. Recently, we resolved individual CnClm by mathematically deconvolving soft ionization high-resolution mass spectra of SCCP mixtures. Here we extend the method to quantifying CnClm by introducing CnClm-specific response factors (RFs) that are calculated from 17 SCCP chain-length standards with a single carbon chain length and variable chlorination level. The signal pattern of each standard is measured on APCI-QTOF-MS. RFs of each CnClm are obtained by pairwise optimization of the normal distribution's fit to the signal patterns of the 17 chain-length standards. The method was verified by quantifying SCCP technical mixtures and spiked environmental samples with accuracies of 82-123% and 76-109%, respectively. The absolute differences between calculated and manufacturer-reported chlorination degrees were -0.9 to 1.0%Cl for SCCP mixtures of 49-71%Cl. The quantification method has been replicated with ECNI magnetic sector MS and ECNI-Q-Orbitrap-MS. CnClm concentrations determined with the three instruments were highly correlated (R2 > 0.90) with each other.

  16. Behavioral finance

    Directory of Open Access Journals (Sweden)

    Kapor Predrag

    2014-01-01

    Full Text Available This paper discusses some general principles of behavioral finance. Behavioral finance is a dynamic and promising field of research that merges concepts from financial economics and cognitive psychology in an attempt to better understand systematic biases in the decision-making process of financial agents. While standard academic finance emphasizes theories such as modern portfolio theory and the efficient market hypothesis, behavioral finance investigates the psychological and sociological issues that impact the decision-making process of individuals, groups and organizations. Most of the research behind behavioral finance has been empirical in nature, concentrating on what people do and why. The research has shown that people do not always act rationally, nor do they fully utilise all information available to them.

  17. Behavior change

    Science.gov (United States)

    This brief entry presents the mediating-moderating variable model as a conceptual framework for understanding behavior change in regard to physical activity/exercise and adiposity. The ideas are applied to real-world situations....

  18. What the laboratory rat has taught us about social play behavior: role in behavioral development and neural mechanisms

    NARCIS (Netherlands)

    Vanderschuren, L.J.M.J.|info:eu-repo/dai/nl/126514917; Trezza, V.

    2014-01-01

    Social play behavior is the most vigorous and characteristic form of social interaction displayed by developing mammals. The laboratory rat is an ideal species to study this behavior, since it shows ample social play that can be easily recognized and quantified. In this chapter, we will first

  19. A structural systems biology approach for quantifying the systemic consequences of missense mutations in proteins.

    Directory of Open Access Journals (Sweden)

    Tammy M K Cheng

    Full Text Available Gauging the systemic effects of non-synonymous single nucleotide polymorphisms (nsSNPs) is an important topic in the pursuit of personalized medicine. However, it is a non-trivial task to understand how a change at the protein structure level eventually affects a cell's behavior, because complex information at both the protein and pathway level has to be integrated. Given that the idea of integrating both protein and pathway dynamics to estimate the systemic impact of missense mutations in proteins remains predominantly unexplored, we investigate the practicality of such an approach by formulating mathematical models and comparing them with experimental data to study missense mutations. We present two case studies: (1) interpreting systemic perturbation for mutations within the cell cycle control mechanisms (G2 to mitosis transition) in yeast; (2) phenotypic classification of neuron-related human diseases associated with mutations within the mitogen-activated protein kinase (MAPK) pathway. We show that the application of simplified mathematical models is feasible for understanding the effects of small sequence changes on cellular behavior. Furthermore, we show that the systemic impact of missense mutations can be effectively quantified as a combination of protein stability change and pathway perturbation.

  20. Quantifying the Traction Force of a Single Cell by Aligned Silicon Nanowire Array

    KAUST Repository

    Li, Zhou

    2009-10-14

    The physical behaviors of stationary cells, such as morphology, motility, adhesion, anchorage, invasion and metastasis, are likely to be important for governing their biological characteristics. A change in the physical properties of mammalian cells could be an indication of disease. In this paper, we present a silicon-nanowire-array based technique for quantifying the mechanical behavior of single cells representing three distinct groups: normal mammalian cells, benign cells (L929), and malignant cells (HeLa). By culturing the cells on top of NW arrays, the maximum traction forces of two different tumor cells (HeLa, L929) have been measured by quantitatively analyzing the bending of the nanowires. The cancer cell exhibits a larger traction force than the normal cell, by ∼20% for a HeLa cell and ∼50% for an L929 cell. The traction forces have been measured for the L929 cells and mechanocytes as a function of culture time. The relationship between a cell's spreading area and its traction force has also been investigated. Our study is likely important for studying the mechanical properties of single cells and their migration characteristics, possibly providing a new cellular-level diagnostic technique. © 2009 American Chemical Society.
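    A hedged sketch of how a traction force can be recovered from nanowire bending, using the standard Euler-Bernoulli cantilever relation; the wire dimensions, modulus, and deflection below are assumed for illustration and are not the paper's calibration values:

```python
import math

def traction_force(deflection_m, length_m, diameter_m, youngs_modulus_pa):
    """Point force at the free end of a cantilevered nanowire that would
    produce the observed tip deflection: F = 3*E*I*delta / L^3, with
    I = pi*d^4/64 for a circular cross-section."""
    second_moment = math.pi * diameter_m**4 / 64.0
    return 3.0 * youngs_modulus_pa * second_moment * deflection_m / length_m**3

# Hypothetical silicon nanowire: 100 nm diameter, 5 um long, E ~ 150 GPa,
# with an observed tip deflection of 200 nm.
force_n = traction_force(200e-9, 5e-6, 100e-9, 150e9)
print(f"estimated traction force: {force_n * 1e9:.1f} nN")  # a few nN
```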

  1. Quantifying multi-dimensional attributes of human activities at various geographic scales based on smartphone tracking.

    Science.gov (United States)

    Zhou, Xiaolu; Li, Dongying

    2018-05-09

    Advancements in location-aware technologies and in information and communication technology over the past decades have furthered our knowledge of the interaction between human activities and the built environment. An increasing number of studies have collected data on individual activities to better understand how the environment shapes human behavior. Despite this growing interest, challenges remain in collecting and processing individuals' activity data, e.g., capturing people's precise environmental contexts and analyzing data at multiple spatial scales. In this study, we propose and implement a system that integrates smartphone-based step tracking through an app with sequential tile-scan techniques to collect and process activity data. We apply the OpenStreetMap tile system to aggregate positioning points at various scales. We also propose duration, step, and probability surfaces to quantify the multi-dimensional attributes of activities. Results show that, by running the app in the background, smartphones can measure multi-dimensional attributes of human activities, including space, duration, step count, and location uncertainty at various spatial scales. By coordinating the Global Positioning System (GPS) sensor with the accelerometer, the app conserves battery power that would otherwise be drained quickly by the GPS sensor alone. Based on a test dataset, we were able to detect the recreational center and sports center as the spaces where the user was most active, among other places visited. The methods provide techniques for addressing key issues in analyzing human activity data. The system can support future studies on behavioral and health consequences related to individuals' environmental exposure.
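    The OpenStreetMap tile system mentioned above maps coordinates to discrete tiles whose footprint depends on the zoom level, which is what allows aggregation "at various scales". A minimal sketch of the standard slippy-map conversion (the test coordinates are hypothetical):

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 coordinates to OpenStreetMap (slippy map) tile indices
    at a given zoom level, using the standard Web Mercator tiling scheme."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

# Aggregating GPS fixes at coarser or finer scales is just a matter of zoom:
point = (40.4259, -86.9081)  # hypothetical study location
for zoom in (12, 15, 18):
    print(zoom, latlon_to_tile(*point, zoom))
```

    Binning every positioning fix to its tile index at several zoom levels yields the multi-scale duration, step, and probability surfaces the abstract describes.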

  2. Superlative Quantifiers as Modifiers of Meta-Speech Acts

    Directory of Open Access Journals (Sweden)

    Ariel Cohen

    2010-12-01

    Full Text Available The superlative quantifiers, at least and at most, are commonly assumed to have the same truth-conditions as the comparative quantifiers more than and fewer than. However, as Geurts & Nouwen (2007) have demonstrated, this is wrong, and several theories have been proposed to account for them. In this paper we propose that superlative quantifiers are illocutionary operators; specifically, they modify meta-speech acts. Meta-speech acts are operators that do not express a speech act, but a willingness to make or refrain from making a certain speech act. The classic example is speech act denegation, e.g. I don't promise to come, where the speaker is explicitly refraining from performing the speech act of promising. What denegations do is delimit the future development of the conversation, that is, they delimit future admissible speech acts. Hence we call them meta-speech acts. They are not moves in a game, but rather commitments to behave in certain ways in the future. We formalize the notion of meta-speech acts as commitment development spaces, which are rooted graphs: the root of the graph describes the commitment development up to the current point in the conversation; the continuations from the root describe the admissible future directions. We define and formalize the meta-speech act GRANT, which indicates that the speaker, while not necessarily subscribing to a proposition, refrains from asserting its negation. We propose that superlative quantifiers are quantifiers over GRANTs. Thus, Mary petted at least three rabbits means that the minimal number n such that the speaker GRANTs that Mary petted n rabbits is n = 3. In other words, the speaker denies that Mary petted two, one, or no rabbits, but GRANTs that she petted more. We formalize this interpretation of superlative quantifiers in terms of commitment development spaces, and show how the truth conditions derived from it are partly entailed and partly conversationally implicated.

  3. Fit by Bits: An Explorative Study of Sports Physiotherapists' Perception of Quantified Self Technologies.

    Science.gov (United States)

    Allouch, Somaya Ben; van Velsen, Lex

    2018-01-01

    Our aim was to determine sports physiotherapists' attitudes towards Quantified Self technology usage and adoption and to analyze factors that may influence these attitudes. A survey was used to study a sample in the Netherlands. We assessed therapists' attitudes towards Quantified Self technology usage by their clients and by themselves, as well as their intention to adopt Quantified Self technology. Results show that the current uptake of Quantified Self technology by sports physiotherapists is rather low, but that their intention to adopt it is quite high. These results can provide a foundation for building an infrastructure that allows sports physiotherapists to fulfill their wishes with regard to Quantified Self technology.

  4. Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity

    Science.gov (United States)

    Zhang, Jihui; Xu, Junqin

    A supply chain is a special kind of complex network. Its complexity and uncertainty make it very difficult to control and manage. Supply chains are faced with a rising complexity of products, structures, and processes. Because of the strong link between a supply chain's complexity and its efficiency, supply chain complexity management has become a major challenge of today's business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a 'Supply Chain Network Analysis' (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and subsequently investigated by network analysis. The results of this study show that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.
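    The paper's fuzzy entropy formulation is not reproduced here, but as a hedged illustration of the underlying idea, a normalized Shannon entropy over the distribution of inter-sector flows gives a simple complexity index (evenly spread flows score near 1; the flow matrix below is invented):

```python
import numpy as np

def flow_entropy(flow_matrix):
    """Shannon entropy (in bits) of the distribution of flows between
    sectors, normalized by the maximum attainable entropy. Values near 1
    indicate flows spread evenly across many links (high complexity)."""
    flows = np.asarray(flow_matrix, dtype=float).ravel()
    flows = flows[flows > 0]
    p = flows / flows.sum()
    entropy = -(p * np.log2(p)).sum()
    return entropy / np.log2(p.size) if p.size > 1 else 0.0

# Hypothetical flows of goods between four sectors of a supply network.
flows = [[0, 12, 3, 0], [5, 0, 9, 1], [0, 7, 0, 14], [2, 0, 6, 0]]
print(f"normalized flow entropy: {flow_entropy(flows):.2f}")
```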

  5. Quantifying human response capabilities towards tsunami threats at community level

    Science.gov (United States)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge of potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential is crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to, e.g., community-level evacuation planning. A major component in the quantification of human response capabilities towards tsunami impacts is the factor time. The human response capabilities depend on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), the evacuation time (ET, the time people need to reach a safe area) and the actual available response time (RsT = ETA − ToNW − RT). If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas possess RsT values equal to or even smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying the factor time is challenging, and an attempt at this is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. RT is difficult, as here human intrinsic factors such as educational level, belief, tsunami knowledge and experience
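    The time-budget logic of the abstract reduces to simple arithmetic, sketched below with hypothetical values for one coastal cell:

```python
def available_response_time(eta_min, tonw_min, rt_min):
    """Actual available response time: RsT = ETA - ToNW - RT (minutes)."""
    return eta_min - tonw_min - rt_min

def area_is_evacuable(eta_min, tonw_min, rt_min, et_min):
    """An area is evacuable when RsT exceeds the evacuation time ET."""
    return available_response_time(eta_min, tonw_min, rt_min) > et_min

# Hypothetical coastal cell: tsunami arrives in 35 min, warning issued
# within 5 min (per the presidential decree), 10 min reaction time, and
# people need 15 min to reach the nearest safe area.
print(area_is_evacuable(eta_min=35, tonw_min=5, rt_min=10, et_min=15))  # True
```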

  6. Quantifying cardiovascular disease risk factors in patients with psoriasis

    DEFF Research Database (Denmark)

    Miller, I M; Skaaby, T; Ellervik, C

    2013-01-01

    BACKGROUND: In a previous meta-analysis on categorical data we found an association between psoriasis and cardiovascular disease and associated risk factors. OBJECTIVES: To quantify the level of cardiovascular disease risk factors in order to provide additional data for the clinical management of the increased risk. METHODS: This was a meta-analysis of observational studies with continuous outcome using random-effects statistics. A systematic search of studies published before 25 October 2012 was conducted using the databases Medline, EMBASE, International Pharmaceutical Abstracts, PASCAL and BIOSIS. ... 0·65 mmol L(-1))] and a higher HbA1c [1·09 mmol mol(-1), 95% CI 0·87-1·31, P ... controls are significant, and therefore relevant to the clinical management of patients with psoriasis....

  7. Quantifying the Impact of Unavailability in Cyber-Physical Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [Université de Tunis El Manar, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Federick T. [University of Memphis; Mili, Ali [New Jersey Insitute of Technology

    2014-01-01

    The Supervisory Control and Data Acquisition (SCADA) system discussed in this work manages a distributed control network for the Tunisian Electric & Gas Utility. The network, dispersed over a large geographic area, monitors and controls the flow of electricity/gas from both remote and centralized locations. The availability of the SCADA system in this context is critical to ensuring the uninterrupted delivery of energy, including safety, security, continuity of operations and revenue. Such SCADA systems are the backbone of national critical cyber-physical infrastructures. Herein, we propose adapting the Mean Failure Cost (MFC) metric for quantifying the cost of unavailability. This new metric combines the classic availability formulation with the MFC. The resulting metric, the so-called Econometric Availability (EA), offers a computational basis for evaluating a system in terms of the gain/loss ($/hour of operation) that unavailability imposes on each stakeholder.

  8. Quantifying chemical uncertainties in simulations of the ISM

    Science.gov (United States)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  9. Quantified safety objectives in high technology: Meaning and demonstration

    International Nuclear Information System (INIS)

    Vinck, W.F.; Gilby, E.; Chicken, J.

    1986-01-01

    An overview and trend analysis is given of the types of quantified criteria and objectives that are presently applied, or envisaged and discussed, in Europe in the nuclear field, more specifically nuclear power plants (NPPs), and in non-nuclear applications, more specifically the chemical and petrochemical process industry. Some comparative deductions are made. Attention is paid to the similarities and discrepancies between such criteria and objectives and to the problems associated with demonstrating that they are implemented. The role of the cost-effectiveness of risk reduction is briefly discussed, and mention is made of an investigation into combining the technical, economic and socio-political factors that play a role in risk acceptance.

  10. Quantifying ground impact fatality rate for small unmanned aircraft

    DEFF Research Database (Denmark)

    La Cour-Harbo, Anders

    2018-01-01

    One of the major challenges of conducting operations with unmanned aircraft, especially operations beyond visual line-of-sight (BVLOS), is to make a realistic and sufficiently detailed risk assessment. An important part of such an assessment is to identify the risk of fatalities, preferably in a quantitative way, since this allows for comparison with manned aviation to determine whether an equivalent level of safety is achievable. This work presents a method for quantifying the probability of fatalities resulting from an uncontrolled descent of an unmanned aircraft conducting a BVLOS flight. The method is based on a standard stochastic model, and employs a parameterized high-fidelity ground impact distribution model that accounts for aircraft specifications, parameter uncertainties, and wind. The method also samples the flight path to create an almost continuous quantification of the risk...
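    The paper's stochastic impact model is not reproduced here; as a hedged sketch of the general Monte Carlo approach, one can sample uncertain descent parameters and wind, propagate a deliberately simple ballistic model, and summarize the impact-point spread (all numbers below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
g = 9.81  # m/s^2

# Hypothetical aircraft state at loss of control, with parameter uncertainty.
n = 100_000
altitude = rng.normal(120.0, 5.0, n)   # m above ground
airspeed = rng.normal(18.0, 2.0, n)    # m/s, horizontal
wind = rng.normal(0.0, 3.0, n)         # m/s, along-track wind

# Simple ballistic descent (drag neglected): time to fall, then drift.
fall_time = np.sqrt(2.0 * np.clip(altitude, 0.0, None) / g)
impact_distance = (airspeed + wind) * fall_time

print(f"mean impact distance: {impact_distance.mean():.0f} m")
print(f"95th percentile:      {np.percentile(impact_distance, 95):.0f} m")
```

    Overlaying such an impact distribution on population density along the sampled flight path is what turns the spread into a fatality probability.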

  11. Quantifying tidally driven benthic oxygen exchange across permeable sediments

    DEFF Research Database (Denmark)

    McGinnis, Daniel F.; Sommer, Stefan; Lorke, Andreas

    2014-01-01

    Continental shelves are predominately (approximately 70%) covered with permeable, sandy sediments. While identified as critical sites for intense oxygen, carbon, and nutrient turnover, constituent exchange across permeable sediments remains poorly quantified. The central North Sea largely consists of permeable sediments and has been identified as increasingly at risk for developing hypoxia. Therefore, we investigate the benthic O2 exchange across the permeable North Sea sediments using a combination of in situ microprofiles, a benthic chamber, and aquatic eddy correlation. Tidal bottom currents drive the variable sediment O2 penetration depth (from approximately 3 to 8 mm) and the concurrent turbulence-driven 25-fold variation in the benthic sediment O2 uptake. The O2 flux and variability were reproduced using a simple 1-D model linking the benthic turbulence to the sediment pore water exchange...

  12. Planck and the local Universe: quantifying the tension

    CERN Document Server

    Verde, Licia; Protopapas, Pavlos

    2013-01-01

    We use the latest Planck constraints, in particular the constraints on derived parameters (the Hubble constant and the age of the Universe) for the local universe, and compare them with local measurements of the same quantities. We propose a way to quantify whether cosmological parameter constraints from two different experiments are in tension. Our statistic, T, is an evidence ratio and can therefore be interpreted with the widely used Jeffreys scale. We find that in the framework of the LCDM model, the Planck-inferred two-dimensional joint posterior distribution for the Hubble constant and age of the Universe is in "strong" tension with the local measurements, the odds being ~ 1:50. We explore several possibilities for explaining this tension and examine the consequences both in terms of unknown errors and deviations from the LCDM model. In some one-parameter LCDM model extensions the tension is reduced, whereas in other extensions it is instead increased. In particular, small total neutrino masses ...

  13. Gradient approach to quantify the gradation smoothness for output media

    Science.gov (United States)

    Kim, Youn Jin; Bang, Yousun; Choh, Heui-Keun

    2010-01-01

    We aim to quantify the perception of color gradation smoothness using objectively measurable properties. We propose a model to compute the smoothness of hardcopy color-to-color gradations. It is a gradient-based method in which smoothness is determined as a function of the 95th percentile of the second derivative, for the tone-jump estimator, and the 5th percentile of the first derivative, for the tone-clipping estimator. The performance of the model and of a previously suggested method was evaluated psychophysically, and their prediction accuracies were compared. Our model showed the stronger Pearson correlation to the corresponding visual data, with the magnitude of the correlation reaching up to 0.87. Its statistical significance was verified through analysis of variance. Color variations of representative memory colors - blue sky, green grass, and Caucasian skin - were rendered as gradational scales and used as the test stimuli.
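    A minimal numpy rendering of the two estimators as the abstract describes them (taking absolute values of the derivatives is an assumption here, as is the synthetic ramp):

```python
import numpy as np

def smoothness_estimators(lightness):
    """Gradient-based smoothness estimators for a 1-D gradation ramp:
    the 95th percentile of the second derivative flags tone jumps; the
    5th percentile of the first derivative flags tone clipping (flat,
    clipped regions)."""
    first = np.gradient(lightness)
    second = np.gradient(first)
    tone_jump = np.percentile(np.abs(second), 95)
    tone_clip = np.percentile(np.abs(first), 5)
    return tone_jump, tone_clip

# Hypothetical ramp with a small tone jump halfway through.
ramp = np.linspace(20, 80, 256)
ramp[128:] += 2.0  # abrupt jump -> large second derivative
print(smoothness_estimators(ramp))
```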

  14. Quantifying intrinsic quality of commercial varieties of Basmati rice

    International Nuclear Information System (INIS)

    Sagar, M.A.; Salim, M.; Siddiqui, H.

    2003-01-01

    Twelve quality traits of five commercial varieties of Basmati rice, viz. Basmati 370, Basmati 385, Basmati 198, Basmati 6129 and Super Basmati, were determined according to standard methods. Under existing conditions it is very difficult to assess the overall quality of Basmati rice varieties, because a variety that is superior to another in one quality trait may be inferior in a different trait, which creates confusion. In order to determine the overall quality status, we have made an effort to quantify the various quality traits. Each quality trait has been allotted a score according to its importance, and the overall status of each variety is computed. Based on our estimation, Super Basmati is of highest quality (98.21%), followed by Basmati 6129 (96.46%), Basmati 370 (95.59%), Basmati 385 (95.26%) and Basmati 198 (92.95%). (author)

  15. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    Full Text Available In this work, the performance measurement process is studied in order to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation; data collection and recording; and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index with which to evaluate the level of uncertainty of a given PM or key performance indicator (KPI). An application example is presented. The quantification of PM uncertainty could contribute to better representing the risk associated with a given decision, and also to improving the PM so as to increase its precision and reliability.

  16. Quantifying the limits of transition state theory in enzymatic catalysis.

    Science.gov (United States)

    Zinovjev, Kirill; Tuñón, Iñaki

    2017-11-21

    While it is one of the most popular reaction rate theories, the applicability of transition state theory to the study of enzymatic reactions has often been challenged. The complex dynamic nature of the protein environment has raised questions about the validity of the non-recrossing hypothesis, a cornerstone of this theory. We present a computational strategy to quantify the error associated with transition state theory from the number of recrossings observed at the equicommittor, which is the best possible dividing surface. Application of a direct multidimensional transition state optimization to the hydride transfer step in human dihydrofolate reductase shows that both the participation of the protein degrees of freedom in the reaction coordinate and the error associated with the non-recrossing hypothesis are small. Thus, the use of transition state theory, even with simplified reaction coordinates, provides a good theoretical framework for the study of enzymatic catalysis. Copyright © 2017 the Author(s). Published by PNAS.

  17. Quantifying phenomenological importance in best-estimate plus uncertainty analyses

    International Nuclear Information System (INIS)

    Martin, Robert P.

    2009-01-01

    This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)

  19. Quantified risk assessment for hazardous industry: the Australian approach

    International Nuclear Information System (INIS)

    Haddad, S.

    1994-01-01

    The paper presents the key conceptual and methodological aspects of Quantified Risk Assessment (QRA) and Hazard Analysis techniques as applied in the process industry, mostly in New South Wales, Australia. Variations in the range of applications of the techniques between the nuclear and non-nuclear industries are highlighted. The opportunity is taken to discuss current and future issues and trends concerning QRA, including: uncertainties and limitations; acceptability of risk criteria; toxicity and chronic health effects; new technology; modelling topics; and environmental risk. The paper concludes by indicating that the next generation QRA, as applicable to Australian conditions in particular, will benefit from a rethink in two areas: a multi-level approach to QRA, and a range of not fully explored applications. 8 refs., 2 tabs

  20. Quantifying the BICEP2-Planck tension over gravitational waves.

    Science.gov (United States)

    Smith, Kendrick M; Dvorkin, Cora; Boyle, Latham; Turok, Neil; Halpern, Mark; Hinshaw, Gary; Gold, Ben

    2014-07-18

    The recent BICEP2 measurement of B-mode polarization in the cosmic microwave background (r = 0.2 +0.07/−0.05), a possible indication of primordial gravity waves, appears to be in tension with the upper limit from WMAP (r < 0.13 at 95% C.L.) and Planck (r < 0.11 at 95% C.L.). We carefully quantify the level of tension and show that it is very significant (around 0.1% unlikely) when the observed deficit of large-scale temperature power is taken into account. We show that measurements of TE and EE power spectra in the near future will discriminate between the hypotheses that this tension is either a statistical fluke or a sign of new physics. We also discuss extensions of the standard cosmological model that relieve the tension and some novel ways to constrain them.

  1. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective

    Directory of Open Access Journals (Sweden)

    Arthur M. Jacobs

    2017-12-01

    Full Text Available In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.

  2. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective.

    Science.gov (United States)

    Jacobs, Arthur M

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.
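    As a hedged illustration of the classification step (the features, data, and labels below are invented placeholders, not the paper's eight QNA features), a decision-tree classifier can be sketched with scikit-learn:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Invented word features (e.g., length, vowel ratio, valence, imageability)
# and beauty labels (1 = rated beautiful, 0 = rated ugly).
X = np.array([
    [6, 0.50, 0.9, 0.8], [9, 0.33, 0.2, 0.3], [5, 0.60, 0.8, 0.9],
    [11, 0.27, 0.1, 0.2], [7, 0.43, 0.7, 0.7], [10, 0.30, 0.3, 0.4],
    [4, 0.50, 0.9, 0.6], [8, 0.25, 0.2, 0.5],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
print(cross_val_score(clf, X, y, cv=4).mean())  # mean classification accuracy
```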

  3. Quantifying and analysing food waste generated by Indonesian undergraduate students

    Science.gov (United States)

    Mandasari, P.

    2018-03-01

    Despite the fact that the environmental consequences of food waste are widely known, studies on the amount of food waste and its influencing factors have received relatively little attention. Addressing this shortage, this paper aimed to quantify the monthly avoidable food waste generated by Indonesian undergraduate students and to analyse factors influencing the occurrence of avoidable food waste. Based on data from 106 undergraduate students, descriptive statistics and logistic regression were applied. The results indicated that 4,987.5 g of food waste was generated in a month (equal to 59,850 g yearly), or 47.05 g per person monthly (equal to 564.62 g per person per year). Meanwhile, eating-out frequency and gender were found to be significant predictors of food waste occurrence.

  4. Quantifying greenhouse gas emissions from waste treatment facilities

    DEFF Research Database (Denmark)

    Mønster, Jacob

    The PhD study reviewed and evaluated previously used methane measurement methods and found the tracer dispersion method promising. The method uses the release of tracer gas and mobile equipment with high analytical sensitivity to measure the downwind plumes of methane and tracer, allowing the equipment to be installed in any vehicle and thereby enabling measurements wherever there are roads. The validation of the measurement method was done by releasing a controlled amount of methane and quantifying the emission using the release of tracer gas. The validation test showed that even in areas with large ... Emissions measured at ... treatment plants ranged from 10 to 92 kg per hour and were found to change on timescales as short as a few hours. The periods with large emissions correlated with a drop in methane utilization, indicating that emissions came from the digester tanks or gas storage/use. The measurements indicated that the main emissions...

  5. Quantifying population genetic differentiation from next-generation sequencing data

    DEFF Research Database (Denmark)

    Fumagalli, Matteo; Garrett Vieira, Filipe Jorge; Korneliussen, Thorfinn Sand

    2013-01-01

    Over the last few years, new high-throughput DNA sequencing technologies have dramatically increased speed and reduced sequencing costs. However, the use of these sequencing technologies is often challenged by errors and biases associated with the bioinformatical methods used for analyzing the data ... a method for quantifying population genetic differentiation from next-generation sequencing data. In addition, we present a strategy to investigate population structure via Principal Components Analysis. Through extensive simulations, we compare the new method proposed herein to approaches based on genotype calling and demonstrate a marked improvement in estimation accuracy for a wide range of conditions. We apply the method to a large-scale genomic data set of domesticated and wild silkworms sequenced at low coverage. We find that we can infer the fine-scale genetic structure of the sampled...

  6. Word embeddings quantify 100 years of gender and ethnic stereotypes.

    Science.gov (United States)

    Garg, Nikhil; Schiebinger, Londa; Jurafsky, Dan; Zou, James

    2018-04-17

    Word embeddings are a powerful machine-learning framework that represents each English word by a vector. The geometric relationship between these vectors captures meaningful semantic relationships between the corresponding words. In this paper, we develop a framework to demonstrate how the temporal dynamics of the embedding help to quantify changes in stereotypes and attitudes toward women and ethnic minorities in the 20th and 21st centuries in the United States. We integrate word embeddings trained on 100 years of text data with the US Census to show that changes in the embedding track closely with demographic and occupation shifts over time. The embedding captures societal shifts - e.g., the women's movement in the 1960s and Asian immigration into the United States - and also illuminates how specific adjectives and occupations became more closely associated with certain populations over time. Our framework for temporal analysis of word embeddings opens up a fruitful intersection between machine learning and quantitative social science.
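    A minimal sketch of the kind of association measurement such frameworks rely on: the relative cosine similarity of an occupation vector to two groups of attribute vectors, computed per decade-specific embedding. The toy vectors below merely stand in for trained embeddings:

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def association_bias(occupation_vec, group_a_vecs, group_b_vecs):
    """Relative association of an occupation word with group A vs. group B:
    mean cosine similarity to group-A words minus mean similarity to
    group-B words. Tracking this difference across decade-specific
    embeddings is the kind of temporal analysis the paper describes."""
    sim_a = np.mean([cosine(occupation_vec, g) for g in group_a_vecs])
    sim_b = np.mean([cosine(occupation_vec, g) for g in group_b_vecs])
    return sim_a - sim_b

# Toy 4-d vectors standing in for trained embeddings.
rng = np.random.default_rng(1)
occupation = rng.normal(size=4)            # e.g., "engineer"
group_a = rng.normal(size=(3, 4))          # e.g., female pronouns/names
group_b = rng.normal(size=(3, 4))          # e.g., male pronouns/names
print(f"bias score: {association_bias(occupation, group_a, group_b):+.3f}")
```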

  7. Quantifying light exposure patterns in young adult students

    Science.gov (United States)

    Alvarez, Amanda A.; Wildsoet, Christine F.

    2013-08-01

    Exposure to bright light appears to be protective against myopia in both animals (chicks, monkeys) and children, but quantitative data on human light exposure are limited. In this study, we report on a technique for quantifying light exposure using wearable sensors. Twenty-seven young adult subjects wore a light sensor continuously for two weeks during one of three seasons, and also completed questionnaires about their visual activities. Light data were analyzed with respect to refractive error and season, and the objective sensor data were compared with subjects' estimates of time spent indoors and outdoors. Subjects' estimates of time spent indoors and outdoors were in poor agreement with durations reported by the sensor data. The results of questionnaire-based studies of light exposure should thus be interpreted with caution. The role of light in refractive error development should be investigated using multiple methods such as sensors to complement questionnaires.

  8. Quantifying the global cellular thiol-disulfide status

    DEFF Research Database (Denmark)

    Hansen, Rosa E; Roth, Doris; Winther, Jakob R

    2009-01-01

    It is widely accepted that the redox status of protein thiols is of central importance to protein structure and folding and that glutathione is an important low-molecular-mass redox regulator. However, the total cellular pools of thiols and disulfides and their relative abundance have never been determined. In this study, we have assembled a global picture of the cellular thiol-disulfide status in cultured mammalian cells. We have quantified the absolute levels of protein thiols, protein disulfides, and glutathionylated protein (PSSG) in all cellular protein, including membrane proteins. These data ... cell types. However, when cells are exposed to a sublethal dose of the thiol-specific oxidant diamide, PSSG levels increase to >15% of all protein cysteine. Glutathione is typically characterized as the "cellular redox buffer"; nevertheless, our data show that protein thiols represent a larger active...

  9. Quantifying chaotic dynamics from integrate-and-fire processes

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, A. N. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Saratov State Technical University, Politehnicheskaya Str. 77, 410054 Saratov (Russian Federation); Pavlova, O. N. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Mohammad, Y. K. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Tikrit University Salahudin, Tikrit Qadisiyah, University Str. P.O. Box 42, Tikrit (Iraq); Kurths, J. [Potsdam Institute for Climate Impact Research, Telegraphenberg A 31, 14473 Potsdam (Germany); Institute of Physics, Humboldt University Berlin, 12489 Berlin (Germany)

    2015-01-15

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easy at high firing rates. When the firing rate is low, correctly estimating the Lyapunov exponents (LEs) that describe the dynamical features of the complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss the peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimation of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.

  10. Quantifying mast cells in bladder pain syndrome by immunohistochemical analysis

    DEFF Research Database (Denmark)

    Larsen, M.S.; Mortensen, S.; Nordling, J.

    2008-01-01

    OBJECTIVES: To evaluate a simple method for counting mast cells, which are thought to have a role in the pathophysiology of bladder pain syndrome (BPS, formerly interstitial cystitis; a syndrome of pelvic pain perceived to be related to the urinary bladder and accompanied by other urinary symptoms, e.g. frequency and nocturia), as >28 mast cells/mm² is defined as mastocytosis and correlated with clinical outcome. PATIENTS AND METHODS: The current enzymatic staining method (naphthol esterase) on 10 µm sections for quantifying mast cells is complicated. In the present study, 61 patients had detrusor ... sections between, respectively. Mast cells were counted according to a well-defined procedure. RESULTS: The old and the new methods, on 10 and 3 µm sections, showed a good correlation between mast cell counts. When using tryptase staining and 3 µm sections, the mast cell number correlated well...

  11. Methods for quantifying T cell receptor binding affinities and thermodynamics

    Science.gov (United States)

    Piepenbrink, Kurt H.; Gloor, Brian E.; Armstrong, Kathryn M.; Baker, Brian M.

    2013-01-01

    αβ T cell receptors (TCRs) recognize peptide antigens bound and presented by class I or class II major histocompatibility complex (MHC) proteins. Recognition of a peptide/MHC complex is required for initiation and propagation of a cellular immune response, as well as the development and maintenance of the T cell repertoire. Here we discuss methods to quantify the affinities and thermodynamics of interactions between soluble ectodomains of TCRs and their peptide/MHC ligands, focusing on titration calorimetry, surface plasmon resonance, and fluorescence anisotropy. As TCRs typically bind ligand with weak-to-moderate affinities, we focus the discussion on means to enhance the accuracy and precision of low affinity measurements. In addition to further elucidating the biology of the T cell mediated immune response, more reliable low affinity measurements will aid with more probing studies with mutants or altered peptides that can help illuminate the physical underpinnings of how TCRs achieve their remarkable recognition properties. PMID:21609868
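    As an illustration of the affinity measurements discussed (not code from the paper), a 1:1 binding isotherm can be fit to titration data, e.g. anisotropy signal versus titrant concentration, to recover the dissociation constant; the data below are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def binding_isotherm(conc, kd, signal_max):
    """Signal for simple 1:1 binding: S = Smax * [L] / (Kd + [L])."""
    return signal_max * conc / (kd + conc)

# Synthetic titration: signal (e.g., anisotropy change) vs. TCR
# concentration over a labeled peptide/MHC complex.
conc_uM = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300])
signal = np.array([0.01, 0.03, 0.08, 0.19, 0.38, 0.61, 0.80, 0.92])

(kd, smax), _ = curve_fit(binding_isotherm, conc_uM, signal, p0=[20.0, 1.0])
print(f"Kd ~ {kd:.0f} uM")  # weak-to-moderate affinity, typical for TCRs
```

    For such weak interactions, spanning concentrations well above and below the expected Kd, as in this synthetic series, is what keeps the fit well conditioned.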

  12. A method to quantify movement activity of groups of animals using automated image analysis

    Science.gov (United States)

    Xu, Jianyu; Yu, Haizhen; Liu, Ying

    2009-07-01

    Most physiological and environmental changes are capable of inducing variations in animal behavior. Behavioral parameters can be measured continuously in situ by a non-invasive, non-contact approach, and have the potential to be used in actual production settings to predict stress conditions. Most vertebrates tend to live in groups, herds, flocks, shoals, bands, or packs of conspecific individuals. Under culture conditions, livestock and fish live in groups and interact with each other, so the aggregate behavior of the group should be studied rather than that of individuals. This paper presents a method, based on computer vision, to calculate the movement speed of a group of animals in an enclosure or a tank, expressed as body-length speed, which corresponds to group activity. Frame sequences captured at a fixed time interval were subtracted in pairs after image segmentation and identification. By labeling the components caused by object movement in each difference frame, the projected area caused by the movement of every object in the capture interval was calculated; this projected area was divided by the projected area of each object in the later frame to obtain the body-length moving distance of each object, and thereby the relative body-length speed. The average speed of all objects reflects the activity of the group well. The group activity of a tilapia (Oreochromis niloticus) school exposed to a high (2.65 mg/L) level of unionized ammonia (UIA) was quantified with this method. The high UIA condition elicited a marked increase in school activity in the first hour (P<0.05), exhibiting an avoidance reaction (trying to flee from the high-UIA condition), which then decreased gradually.
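    A hedged OpenCV sketch of the frame-differencing idea, not the authors' implementation; the thresholds and the normalization are illustrative assumptions:

```python
import cv2
import numpy as np

def group_activity(frame_prev, frame_curr, min_blob_px=50):
    """Crude frame-differencing activity measure: total area of moving
    blobs in the difference image, normalized by the segmented object
    area in the current frame (a stand-in for per-body-length distance)."""
    gray_prev = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray_curr, gray_prev)
    _, moving = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # Keep only blobs large enough to be animals, not sensor noise.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(moving)
    moved_area = sum(int(stats[i, cv2.CC_STAT_AREA])
                     for i in range(1, n)
                     if stats[i, cv2.CC_STAT_AREA] >= min_blob_px)
    _, objects = cv2.threshold(gray_curr, 0, 255,
                               cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    object_area = max(int(np.count_nonzero(objects)), 1)
    return moved_area / object_area

# Usage: read consecutive frames from video at a fixed interval and average
# group_activity() over all pairs to track school activity over time.
```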

  13. Current challenges in quantifying preferential flow through the vadose zone

    Science.gov (United States)

    Koestel, John; Larsbo, Mats; Jarvis, Nick

    2017-04-01

    In this presentation, we give an overview of current challenges in quantifying preferential flow through the vadose zone. A review of the literature suggests that current generation models do not fully reflect the present state of process understanding and empirical knowledge of preferential flow. We believe that the development of improved models will be stimulated by the increasingly widespread application of novel imaging technologies as well as future advances in computational power and numerical techniques. One of the main challenges in this respect is to bridge the large gap between the scales at which preferential flow occurs (pore to Darcy scales) and the scale of interest for management (fields, catchments, regions). Studies at the pore scale are being supported by the development of 3-D non-invasive imaging and numerical simulation techniques. These studies are leading to a better understanding of how macropore network topology and initial/boundary conditions control key state variables like matric potential and thus the strength of preferential flow. Extrapolation of this knowledge to larger scales would require support from theoretical frameworks such as key concepts from percolation and network theory, since we lack measurement technologies to quantify macropore networks at these large scales. Linked hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data enable investigation of the larger-scale heterogeneities that can generate preferential flow patterns at pedon, hillslope and field scales. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help in parameterizing models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).

  14. Quantifiably secure power grid operation, management, and evolution :

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne.; Watson, Jean-Paul; Silva Monroy, Cesar Augusto; Gramacy, Robert B.

    2013-09-01

    This report summarizes the findings and results of the Quantifiably Secure Power Grid Operation, Management, and Evolution LDRD. The focus of the LDRD was to develop decision-support technologies to enable rational and quantifiable risk management for two key grid operational timescales: scheduling (day-ahead) and planning (month-to-year-ahead). Risk and resiliency metrics are foundational in this effort. The 2003 Northeast Blackout investigative report stressed the criticality of enforceable metrics for system resiliency - the grid's ability to satisfy demands subject to perturbation. However, we neither have well-defined risk metrics for addressing the pervasive uncertainties of a renewable energy era, nor decision-support tools for their enforcement, which severely impacts efforts to rationally improve grid security. For day-ahead unit commitment, decision-support tools must account for topological security constraints, loss-of-load (economic) costs, and supply and demand variability, especially given high renewables penetration. For long-term planning, transmission and generation expansion must ensure that realized demand is satisfied for various projected technological, climate, and growth scenarios. The decision-support tools investigated in this project paid particular attention to tail-oriented risk metrics for explicitly addressing high-consequence events. Historically, decision-support tools for the grid consider expected cost minimization, largely ignoring risk and instead penalizing loss-of-load through artificial parameters. The technical focus of this work was the development of scalable solvers for enforcing risk metrics. Advanced stochastic programming solvers were developed to address generation and transmission expansion and unit commitment, minimizing cost subject to pre-specified risk thresholds. Particular attention was paid to renewables, where security critically depends on production and demand prediction accuracy. To address this
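    The report emphasizes tail-oriented risk metrics; a common example (assumed here for illustration, not necessarily the metric the LDRD adopted) is conditional value-at-risk, the mean cost over the worst fraction of scenarios:

```python
import numpy as np

def cvar(scenario_costs, alpha=0.05):
    """Conditional value-at-risk: mean cost of the worst alpha-fraction of
    scenarios - a tail-oriented alternative to expected-cost minimization."""
    costs = np.sort(np.asarray(scenario_costs, dtype=float))
    tail = max(int(np.ceil(alpha * costs.size)), 1)
    return costs[-tail:].mean()

# Hypothetical day-ahead dispatch costs over 1000 sampled renewable/demand
# scenarios: the expected cost hides rare, very expensive loss-of-load events.
rng = np.random.default_rng(42)
costs = rng.lognormal(mean=10.0, sigma=0.4, size=1000)
print(f"expected cost: {costs.mean():,.0f}")
print(f"CVaR(5%):      {cvar(costs, 0.05):,.0f}")
```

    Minimizing cost subject to a CVaR-style threshold is one standard way stochastic programming solvers of the kind described can enforce a risk metric rather than merely penalize loss-of-load.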

  15. THE SEGUE K GIANT SURVEY. III. QUANTIFYING GALACTIC HALO SUBSTRUCTURE

    Energy Technology Data Exchange (ETDEWEB)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo; Harding, Paul [Department of Astronomy, Case Western Reserve University, Cleveland, OH 44106 (United States); Rockosi, Constance [UCO/Lick Observatory, University of California, Santa Cruz, 1156 High Street, Santa Cruz, CA 95064 (United States); Starkenburg, Else [Department of Physics and Astronomy, University of Victoria, P.O. Box 1700, STN CSC, Victoria BC V8W 3P6 (Canada); Xue, Xiang Xiang; Rix, Hans-Walter [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Beers, Timothy C. [Department of Physics and JINA Center for the Evolution of the Elements, University of Notre Dame, Notre Dame, IN 46556 (United States); Johnson, Jennifer [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Lee, Young Sun [Department of Astronomy and Space Science, Chungnam National University, Daejeon 34134 (Korea, Republic of); Schneider, Donald P. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2016-01-10

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5–125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position–velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (∼33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.

  16. Quantifying center of pressure variability in chondrodystrophoid dogs.

    Science.gov (United States)

    Blau, S R; Davis, L M; Gorney, A M; Dohse, C S; Williams, K D; Lim, J-H; Pfitzner, W G; Laber, E; Sawicki, G S; Olby, N J

    2017-08-01

    The center of pressure (COP) position reflects a combination of proprioceptive, motor and mechanical function. As such, it can be used to quantify and characterize neurologic dysfunction. The aim of this study was to describe and quantify the movement of the COP and its variability in healthy chondrodystrophoid dogs while walking, to provide a baseline for comparison with dogs with spinal cord injury due to acute intervertebral disc herniation. Fifteen healthy adult chondrodystrophoid dogs were walked on an instrumented treadmill that recorded the location of each dog's COP as it walked. The COP was referenced to an anatomical marker on each dog's back. The root mean square (RMS) values of changes in COP location in the sagittal (y) and horizontal (x) directions were calculated to determine the range of COP variability. Three dogs would not walk on the treadmill, and one dog was too small to collect interpretable data. From the remaining 11 dogs, 206 trials were analyzed. The mean RMS per trial was 0.0138 (standard deviation, SD 0.0047) for changes in COPx and 0.0185 (SD 0.0071) for COPy. Walking speed, but not limb length, had a significant effect on COP RMS. Repeat measurements in six dogs showed high test-retest consistency in the x direction and fair consistency in the y direction. In conclusion, COP variability can be measured consistently in dogs, and a range of COP variability for normal chondrodystrophoid dogs has been determined to provide a baseline for future studies of dogs with spinal cord injury. Copyright © 2017 Elsevier Ltd. All rights reserved.
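    One plausible reading of the RMS measure, sketched with synthetic data (assuming "changes in COP location" means frame-to-frame differences of the marker-referenced position):

```python
import numpy as np

def cop_rms(cop_xy):
    """RMS of frame-to-frame changes in center-of-pressure location.
    `cop_xy` is an (n, 2) array of COP positions (x = horizontal,
    y = sagittal), already referenced to an anatomical marker."""
    deltas = np.diff(cop_xy, axis=0)
    return np.sqrt((deltas ** 2).mean(axis=0))  # (RMS_x, RMS_y)

# Hypothetical trial: small random COP excursions around the marker.
rng = np.random.default_rng(7)
trial = np.cumsum(rng.normal(0.0, 0.005, size=(200, 2)), axis=0)
rms_x, rms_y = cop_rms(trial)
print(f"RMS x: {rms_x:.4f}  RMS y: {rms_y:.4f}")
```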

  17. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    Science.gov (United States)

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and at the northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations - Jarvis, Howland, Baker, Palmyra and Kingman - are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help

  18. Quantifying polypeptide conformational space: sensitivity to conformation and ensemble definition.

    Science.gov (United States)

    Sullivan, David C; Lim, Carmay

    2006-08-24

    Quantifying the density of conformations over phase space (the conformational distribution) is needed to model important macromolecular processes such as protein folding. In this work, we quantify the conformational distribution for a simple polypeptide (N-mer polyalanine) using the cumulative distribution function (CDF), which gives the probability that two randomly selected conformations are separated by less than a "conformational" distance, and whose inverse gives conformation counts as a function of conformational radius. An important finding is that the conformation counts obtained by the CDF inverse depend critically on the assignment of a conformation's distance span and on the ensemble (e.g., unfolded-state model): varying the ensemble and the conformation definition (1 → 2 Å) varies the CDF-based conformation counts for Ala(50) from 10^11 to 10^69. In particular, relatively short molecular dynamics (MD) relaxation of Ala(50)'s random-walk ensemble reduces the number of conformers from 10^55 to 10^14 (using a 1 Å root-mean-square-deviation radius conformation definition), pointing to potential disconnections in comparing the results from simplified models of unfolded proteins with those from all-atom MD simulations. Explicit waters are found to roughen the landscape considerably. Under some common conformation definitions, the results herein provide (i) an upper limit to the number of accessible conformations that compose unfolded states of proteins, (ii) the optimal clustering radius/conformation radius for counting conformations for a given energy and solvent model, (iii) a means of comparing various studies, and (iv) an assessment of the applicability of random search in protein folding.

  19. A robust nonparametric method for quantifying undetected extinctions.

    Science.gov (United States)

    Chisholm, Ryan A; Giam, Xingli; Sadanandan, Keren R; Fung, Tak; Rheindt, Frank E

    2016-06-01

    How many species have gone extinct in modern times before being described by science? To answer this question, and thereby get a full assessment of humanity's impact on biodiversity, statistical methods that quantify undetected extinctions are required. Such methods have been developed recently, but they are limited by their reliance on parametric assumptions; specifically, they assume the pools of extant and undetected species decay exponentially, whereas real detection rates vary temporally with survey effort and real extinction rates vary with the waxing and waning of threatening processes. We devised a new, nonparametric method for estimating undetected extinctions. As inputs, the method requires only the first and last date at which each species in an ensemble was recorded. As outputs, the method provides estimates of the proportion of species that have gone extinct, detected, or undetected and, in the special case where the number of undetected extant species in the present day is assumed close to zero, of the absolute number of undetected extinct species. The main assumption of the method is that the per-species extinction rate is independent of whether a species has been detected or not. We applied the method to the resident native bird fauna of Singapore. Of 195 recorded species, 58 (29.7%) have gone extinct in the last 200 years. Our method projected that an additional 9.6 species (95% CI 3.4, 19.8) have gone extinct without first being recorded, implying a true extinction rate of 33.0% (95% CI 31.0%, 36.2%). We provide R code for implementing our method. Because our method does not depend on strong assumptions, we expect it to be broadly useful for quantifying undetected extinctions. © 2016 Society for Conservation Biology.

  20. Quantifying Selective Pressures Driving Bacterial Evolution Using Lineage Analysis

    Science.gov (United States)

    Lambert, Guillaume; Kussell, Edo

    2015-01-01

    Organisms use a variety of strategies to adapt to their environments and maximize long-term growth potential, but quantitative characterization of the benefits conferred by the use of such strategies, as well as their impact on the whole population's rate of growth, remains challenging. Here, we use a path-integral framework that describes how selection acts on lineages—i.e., the life histories of individuals and their ancestors—to demonstrate that lineage-based measurements can be used to quantify the selective pressures acting on a population. We apply this analysis to Escherichia coli bacteria exposed to cyclical treatments of carbenicillin, an antibiotic that interferes with cell-wall synthesis and affects cells in an age-dependent manner. While the extensive characterization of the life history of thousands of cells is necessary to accurately extract the age-dependent selective pressures caused by carbenicillin, the same measurement can be recapitulated using lineage-based statistics of a single surviving cell. Population-wide evolutionary pressures can be extracted from the properties of the surviving lineages within a population, providing an alternative and efficient procedure to quantify the evolutionary forces acting on a population. Importantly, this approach is not limited to age-dependent selection, and the framework can be generalized to detect signatures of other trait-specific selection using lineage-based measurements. Our results establish a powerful way to study the evolutionary dynamics of life under selection and may be broadly useful in elucidating selective pressures driving the emergence of antibiotic resistance and the evolution of survival strategies in biological systems.

  1. Quantifying changes and influences on mottled duck density in Texas

    Science.gov (United States)

    Ross, Beth; Haukos, David A.; Walther, Patrick

    2018-01-01

    Understanding the relative influence of environmental and intrinsic effects on populations is important for managing and conserving harvested species, especially those species inhabiting changing environments. Additionally, climate change can increase the uncertainty associated with management of species in these changing environments, making understanding factors affecting their populations even more important. Coastal ecosystems are particularly threatened by climate change; the combined effects of increasing severe weather events, sea level rise, and drought will likely have non-linear effects on coastal marsh wildlife species and their associated habitats. A species of conservation concern that persists in these coastal areas is the mottled duck (Anas fulvigula). Mottled ducks in the western Gulf Coast are approximately 50% below target abundance numbers established by the Gulf Coast Joint Venture for Texas and Louisiana, USA. Although evidence for declines in mottled duck abundance is apparent, specific causes of the decrease remain unknown. Our goals were to determine where the largest declines in mottled duck population were occurring along the system of Texas Gulf Coast National Wildlife Refuges and quantify the relative contribution of environmental and intrinsic effects on changes to relative population density. We modeled aerial survey data of mottled duck density along the Texas Gulf Coast from 1986–2015 to quantify effects of extreme weather events on an index to mottled duck density using the United States Climate Extremes Index and Palmer Drought Severity Index. Our results indicate that decreases in abundance are best described by an increase in days with extreme 1-day precipitation from June to November (hurricane season) and an increase in drought severity. Better understanding those portions of the life cycle affected by environmental conditions, and how to manage mottled duck habitat in conjunction with these events will likely be key to

  2. Quantifying the emissions reduction effectiveness and costs of oxygenated gasoline

    International Nuclear Information System (INIS)

    Lyons, C.E.

    1993-01-01

    During the fall, winter, and spring of 1991-1992, a measurement program was conducted in Denver, Colorado to quantify the technical and economic effectiveness of oxygenated gasoline in reducing automobile carbon monoxide (CO) emissions. Emissions from 80,000 vehicles under a variety of operating conditions were measured before, during, and after the seasonal introduction of oxygenated gasoline into the region. Gasoline samples were taken from several hundred vehicles to confirm the actual oxygen content of the fuel in use. Vehicle operating conditions, such as cold starts and warm operations, and ambient conditions were characterized. The variations in emissions attributable to fuel type and to operating conditions were then quantified. This paper describes the measurement program and its results. The 1991-1992 Colorado oxygenated gasoline program contributed to a reduction in carbon monoxide (CO) emissions from gasoline-powered vehicles. The measurement program demonstrated that most of the reduction is concentrated in a small percentage of the vehicles that use oxygenated gasoline. The remainder experience little or no reduction in emissions. The oxygenated gasoline program outlays are approximately $25 to $30 million per year in Colorado. These are directly measurable costs, incurred through increased government expenditures, higher costs to private industry, and losses in fuel economy. The measurement program determined the total costs of oxygenated gasoline as an air pollution control strategy for the region. Costs measured included government administration and enforcement, industry production and distribution, and consumer and other user costs. This paper describes the ability of the oxygenated gasoline program to reduce pollution; the overall cost of the program to government, industry, and consumers; and the effectiveness of the program in reducing pollution compared to its costs.

  3. Quantifying Climatological Ranges and Anomalies for Pacific Coral Reef Ecosystems

    Science.gov (United States)

    Gove, Jamison M.; Williams, Gareth J.; McManus, Margaret A.; Heron, Scott F.; Sandin, Stuart A.; Vetter, Oliver J.; Foley, David G.

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic–biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will
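
    The climatological metrics described here follow a standard pattern: build a long-term monthly climatology per island or atoll, take its extremes as the climatological range limits, and measure anomalies as departures from that climatology. A minimal single-site sketch (function and variable names are ours, not the study's):

```python
# Climatological range limits and anomalies for one site's time series.
import numpy as np
import pandas as pd

def climatology_metrics(dates, values):
    s = pd.Series(values, index=pd.DatetimeIndex(dates))
    monthly = s.resample("MS").mean()                    # monthly means
    clim = monthly.groupby(monthly.index.month).mean()   # long-term annual cycle
    anomalies = monthly - clim.reindex(monthly.index.month).to_numpy()
    return clim.min(), clim.max(), anomalies             # range limits + anomalies

# Toy usage: five years of daily "chlorophyll-a" with a seasonal cycle.
dates = pd.date_range("2003-01-01", "2007-12-31", freq="D")
vals = (0.1 + 0.05 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
        + np.random.default_rng(1).normal(0, 0.01, len(dates)))
lo, hi, anom = climatology_metrics(dates, vals)
```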

  4. Quantifying uncertainties of seismic Bayesian inversion of Northern Great Plains

    Science.gov (United States)

    Gao, C.; Lekic, V.

    2017-12-01

    Elastic waves excited by earthquakes are the fundamental observations of seismological studies. Seismologists measure information such as travel time, amplitude, and polarization to infer the properties of earthquake source, seismic wave propagation, and subsurface structure. Across numerous applications, seismic imaging has been able to take advantage of complementary seismic observables to constrain profiles and lateral variations of Earth's elastic properties. Moreover, seismic imaging plays a unique role in multidisciplinary studies of geoscience by providing direct constraints on the unreachable interior of the Earth. Accurate quantification of uncertainties of inferences made from seismic observations is of paramount importance for interpreting seismic images and testing geological hypotheses. However, such quantification remains challenging and subjective due to the non-linearity and non-uniqueness of the geophysical inverse problem. In this project, we apply a reversible-jump Markov chain Monte Carlo (rjMcMC) algorithm for a transdimensional Bayesian inversion of continental lithosphere structure. Such inversion allows us to quantify the uncertainties of inversion results by inverting for an ensemble solution. It also yields an adaptive parameterization that enables simultaneous inversion of different elastic properties without imposing strong prior information on the relationship between them. We present retrieved profiles of shear velocity (Vs) and radial anisotropy in the Northern Great Plains using measurements from USArray stations. We use both seismic surface wave dispersion and receiver function data due to their complementary constraints on lithosphere structure. Furthermore, we analyze the uncertainties of both individual and joint inversion of those two data types to quantify the benefit of doing joint inversion. As an application, we infer the variation of Moho depths and crustal layering across the northern Great Plains.

  5. Quantified social and aesthetic values in environmental decision making

    International Nuclear Information System (INIS)

    Burnham, J.B.; Maynard, W.S.; Jones, G.R.

    1975-01-01

    A method has been devised for quantifying the social criteria to be considered when selecting a nuclear design and/or site option. Community judgement of social values is measured directly and indirectly on eight siting factors. These same criteria are independently analysed by experts using techno-economic methods. The combination of societal and technical indices yields a weighted score for each alternative. The aesthetic impact was selected as the first to be quantified. A visual quality index was developed to measure the change in the visual quality of a viewscape caused by construction of a facility. Visual quality was measured by reducing it to its component parts - intactness, vividness and unity - and rating each part with and without the facility. Urban planners and landscape architects used the technique to analyse three viewscapes, testing three different methods on each viewscape. The three methods used the same aesthetic elements but varied in detail and depth. As expected, the technique with the greatest analytical detail (and least subjective judgement) was the most reliable method. Social value judgements were measured by social psychologists applying a questionnaire technique, using a number of design and site options to illustrate the range of criteria. Three groups of predictably different respondents - environmentalists, high-school students and businessmen - were selected. The three groups' response patterns were remarkably similar, though businessmen were consistently more biased towards nuclear power than were environmentalists. Correlational and multiple regression analyses provided indirect estimates of the relative importance of each impact category. Only the environmentalists showed a high correlation between the two methods. This is partially explained by their interest and knowledge. Also, the regression analysis encounters problems when small samples are used, and the environmental sample was considerably larger than the other two

  6. Quantifying induced effects of subsurface renewable energy storage

    Science.gov (United States)

    Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas

    2015-04-01

    New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of hosting large amounts of energy or substance. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex and interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. For achieving this aim, in this work synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used. The latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model application. Then the process implementation and numerical methods required and applied for simulating the induced effects of subsurface storage are detailed and explained. Application examples show the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry

  7. Global tropospheric ozone modeling: Quantifying errors due to grid resolution

    Science.gov (United States)

    Wild, Oliver; Prather, Michael J.

    2006-06-01

    Ozone production in global chemical models is dependent on model resolution because ozone chemistry is inherently nonlinear, the timescales for chemical production are short, and precursors are artificially distributed over the spatial scale of the model grid. In this study we examine the sensitivity of ozone, its precursors, and its production to resolution by running a global chemical transport model at four different resolutions between T21 (5.6° × 5.6°) and T106 (1.1° × 1.1°) and by quantifying the errors in regional and global budgets. The sensitivity to vertical mixing through the parameterization of boundary layer turbulence is also examined. We find less ozone production in the boundary layer at higher resolution, consistent with slower chemical production in polluted emission regions and greater export of precursors. Agreement with ozonesonde and aircraft measurements made during the NASA TRACE-P campaign over the western Pacific in spring 2001 is consistently better at higher resolution. We demonstrate that the numerical errors in transport processes on a given resolution converge geometrically for a tracer at successively higher resolutions. The convergence in ozone production on progressing from T21 to T42, T63, and T106 resolution is likewise monotonic but indicates that there are still large errors at 120 km scales, suggesting that T106 resolution is too coarse to resolve regional ozone production. Diagnosing the ozone production and precursor transport that follow a short pulse of emissions over east Asia in springtime allows us to quantify the impacts of resolution on both regional and global ozone. Production close to continental emission regions is overestimated by 27% at T21 resolution, by 13% at T42 resolution, and by 5% at T106 resolution. However, subsequent ozone production in the free troposphere is not greatly affected. We find that the export of short-lived precursors such as NOx by convection is overestimated at coarse resolution.
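
    The geometric convergence described here can be made concrete with a standard Richardson-style check (our illustration, not the authors' diagnostic, and assuming exact grid doubling, which the T21/T42/T63/T106 truncations only approximate):

```python
import math

# Diagnostic (e.g., a regional ozone production budget) on grids of spacing
# h, h/2, h/4; the values below are hypothetical.
f_h, f_h2, f_h4 = 1.27, 1.13, 1.08
p = math.log((f_h - f_h2) / (f_h2 - f_h4)) / math.log(2.0)   # convergence order
f_star = f_h4 + (f_h4 - f_h2) / (2.0**p - 1.0)               # extrapolated limit
print(f"order ~ {p:.2f}, extrapolated value ~ {f_star:.3f}")
```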

  8. Electromyographic permutation entropy quantifies diaphragmatic denervation and reinnervation.

    Directory of Open Access Journals (Sweden)

    Christopher Kramer

    Spontaneous reinnervation after diaphragmatic paralysis due to trauma, surgery, tumors and spinal cord injuries is frequently observed. A possible explanation could be collateral reinnervation, since the diaphragm is commonly double-innervated by the (accessory) phrenic nerve. Permutation entropy (PeEn), a complexity measure for time series, may reflect a functional state of neuromuscular transmission by quantifying the complexity of interactions across neural and muscular networks. In an established rat model, electromyographic signals of the diaphragm after phrenicotomy were analyzed using PeEn, quantifying denervation and reinnervation. Thirty-three anesthetized rats were unilaterally phrenicotomized. After 1, 3, 9, 27 and 81 days, diaphragmatic electromyographic PeEn was analyzed in vivo from sternal, mid-costal and crural areas of both hemidiaphragms. After euthanasia of the animals, both hemidiaphragms were dissected for fiber type evaluation. The electromyographic incidence of an accessory phrenic nerve was 76%. At day 1 after phrenicotomy, PeEn (normalized values) was significantly diminished in the sternal (median: 0.69; interquartile range: 0.66-0.75) and mid-costal area (0.68; 0.66-0.72) compared to the non-denervated side (0.84; 0.78-0.90) at threshold p<0.05. In the crural area, innervated by the accessory phrenic nerve, PeEn remained unchanged (0.79; 0.72-0.86). During reinnervation over 81 days, PeEn normalized in the mid-costal area (0.84; 0.77-0.86), whereas it remained reduced in the sternal area (0.77; 0.70-0.81). Fiber type grouping, a histological sign for reinnervation, was found in the mid-costal area in 20% after 27 days and in 80% after 81 days. Collateral reinnervation can restore diaphragm activity after phrenicotomy. Electromyographic PeEn represents a new, distinctive assessment characterizing intramuscular function following denervation and reinnervation.
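
    Permutation entropy itself is straightforward to compute: slide a window over the signal, record each window's rank-order pattern, and take the Shannon entropy of the pattern frequencies. A minimal sketch, normalized to [0, 1] by log(m!) as in the abstract's normalized values:

```python
# Permutation entropy of a 1-D signal (order m, delay tau).
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, m=3, tau=1):
    x = np.asarray(x)
    n = len(x) - (m - 1) * tau
    patterns = Counter(tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n))
    p = np.array(list(patterns.values()), dtype=float) / n
    return -np.sum(p * np.log(p)) / math.log(math.factorial(m))

# A regular signal scores low; an irregular one scores near 1.
rng = np.random.default_rng(0)
print(permutation_entropy(np.sin(np.linspace(0, 20, 1000))))  # low
print(permutation_entropy(rng.normal(size=1000)))             # close to 1
```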

  9. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    Directory of Open Access Journals (Sweden)

    Jamison M Gove

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations (Jarvis, Howland, Baker, Palmyra and Kingman) are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations

  10. Does Speech Emerge From Earlier Appearing Oral Motor Behaviors?

    OpenAIRE

    Moore, Christopher A.; Ruark, Jacki L.

    1996-01-01

    This investigation was designed to quantify the coordinative organization of mandibular muscles in toddlers during speech and nonspeech behaviors. Seven 15-month-olds were observed during spontaneous production of chewing, sucking, babbling, and speech. Comparison of mandibular coordination across these behaviors revealed that, even for children in the earliest stages of true word production, coordination was quite different from that observed for other behaviors. Production of true words was...

  11. Behavioral addictions.

    Science.gov (United States)

    Robbins, T W; Clark, L

    2015-02-01

    Behavioral addictions are slowly becoming recognized as a valid category of psychiatric disorder as shown by the recent allocation of pathological gambling to this category in DSM-5. However, several other types of psychiatric disorder proposed to be examples of behavioral addictions have yet to be accorded this formal acknowledgment and are dispersed across other sections of the DSM-5. This brief review marks this important point in the evolution of this concept and looks to future investigation of behavioral addictions with the theoretical frameworks currently being used successfully to investigate substance addiction and obsessive-compulsive disorder, in a potentially new spectrum of impulsive-compulsive disorders. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Cost Behavior

    DEFF Research Database (Denmark)

    Hoffmann, Kira

    The objective of this dissertation is to investigate determinants and consequences of asymmetric cost behavior. Asymmetric cost behavior arises if the change in costs is different for increases in activity compared to equivalent decreases in activity. In this case, costs are termed “sticky” if the change is less when activity falls than when activity rises, whereas costs are termed “anti-sticky” if the change is more when activity falls than when activity rises. Understanding such cost behavior is especially relevant for decision-makers and financial analysts that rely on accurate cost information to facilitate resource planning and earnings forecasting. As such, this dissertation relates to the topic of firm profitability and the interpretation of cost variability. The dissertation consists of three parts that are written in the form of separate academic papers. The following section briefly summarizes...

  13. Quantifying risk of transfusion in children undergoing spine surgery.

    Science.gov (United States)

    Vitale, Michael G; Levy, Douglas E; Park, Maxwell C; Choi, Hyunok; Choe, Julie C; Roye, David P

    2002-01-01

    The risks and costs of transfusion are a great concern in the area of pediatric spine surgery, because it is a blood-intensive procedure with a high risk for transfusion. Therefore, determining the predictors of transfusion in this patient population is an important first step and has the potential to improve upon the current approaches to reducing transfusion rates. In this study, we reveal several predictors of transfusion in a pediatric patient population undergoing spine surgery. In turn, we present a general rule of thumb ("rule of two's") for gauging transfusion risk, thus enhancing the surgeon's approach to avoiding transfusion in certain clinical scenarios. This study was conducted to determine the main factors of transfusion in a population of pediatric patients undergoing scoliosis surgery. The goal was to present an algorithm for quantifying the true risk of transfusion for various patient groups that would highlight patients "at high risk" for transfusion. This is especially important in light of the various risks associated with undergoing a transfusion, as well as the costs involved in maintaining and disposing of exogenous blood materials. This is a retrospective review of a group of children who underwent scoliosis surgery between 1988 and 1995 at an academic institution. A total of 290 patients were analyzed in this study, of which 63 were transfused and 227 were not. No outcome measures were used in this study. A retrospective review of 290 patients presenting to our institution for scoliosis surgery was conducted, with a focus on socioclinical data related to transfusion risk. Univariate analysis and logistic regression were used to quantify the determinants of transfusion risk. Univariate analysis identified many factors that were associated with the risk of transfusion. However, it is clear that several of these factors are dependent on each other, obscuring the true issues driving transfusion need. We used multivariate analysis to control for
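
    A hedged sketch of the kind of multivariate analysis described (logistic regression of transfusion status on candidate predictors); the cohort and column names below are invented for illustration, not the study's variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort loosely echoing the abstract's setting (n = 290).
rng = np.random.default_rng(0)
n = 290
df = pd.DataFrame({
    "levels_fused": rng.integers(4, 14, n),   # vertebral levels fused
    "preop_hct": rng.normal(38, 4, n),        # preoperative hematocrit (%)
    "weight_kg": rng.normal(45, 12, n),
})
logit_p = -4 + 0.5 * df.levels_fused - 0.05 * df.preop_hct + 0.01 * df.weight_kg
df["transfused"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("transfused ~ levels_fused + preop_hct + weight_kg", data=df).fit()
print(np.exp(fit.params))   # odds ratios per unit of each predictor
```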

  14. Quantifying the multiple, environmental benefits of reintroducing the Eurasian Beaver

    Science.gov (United States)

    Brazier, Richard; Puttock, Alan; Graham, Hugh; Anderson, Karen; Cunliffe, Andrew; Elliott, Mark

    2016-04-01

    Beavers are ecological engineers with an ability to modify the structure and flow of fluvial systems and create complex wetland environments with dams, ponds and canals. Consequently, beaver activity has potential for river restoration, management and the provision of multiple environmental ecosystem services including biodiversity, flood risk mitigation, water quality and sustainable drinking water provision. With the current debate surrounding the reintroduction of beavers into the United Kingdom, it is critical to monitor the impact of beavers upon the environment. We have developed and implemented a monitoring strategy to quantify the impact of reintroducing the Eurasian Beaver on multiple environmental ecosystem services and river systems at a range of scales. First, the experimental design and preliminary results will be presented from the Mid-Devon Beaver Trial, where a family of beavers has been introduced to a 3 ha enclosure situated upon a first order tributary of the River Tamar. The site was instrumented to monitor the flow rate and quality of water entering and leaving the site. Additionally, the impacts of beavers upon riparian vegetation structure and water/carbon storage were investigated. Preliminary results indicate that beaver activity, particularly the building of ponds and dams, increases water storage within the landscape and moderates the river response to rainfall. Baseflow is enhanced during dry periods and storm flow is attenuated, potentially reducing the risk of flooding downstream. Initial analysis of water quality indicates that water entering the site (running off intensively managed grasslands upslope), has higher suspended sediment loads and nitrate levels, than that leaving the site, after moving through the series of beaver ponds. These results suggest beaver activity may also act as a means by which the negative impact of diffuse water pollution from agriculture can be mitigated thus providing cleaner water in rivers downstream

  15. Discounting Behavior

    DEFF Research Database (Denmark)

    Andersen, Steffen; Harrison, Glenn W.; Lau, Morten

    2014-01-01

    We re-evaluate the theory, experimental design and econometrics behind claims that individuals exhibit non-constant discounting behavior. Theory points to the importance of controlling for the non-linearity of the utility function of individuals, since the discount rate is defined over time-dated...

  16. Consumer Behavior

    NARCIS (Netherlands)

    Hoyer, W.D.; MacInnis, D.J.; Pieters, R.

    2013-01-01

    CONSUMER BEHAVIOR combines a foundation in key concepts from marketing, psychology, sociology, and anthropology with a highly practical focus on real-world applications for today's business environment. The new edition of this popular, pioneering text incorporates the latest cutting-edge research

  17. Behavior Modification

    Science.gov (United States)

    Boardman, Randolph M.

    2010-01-01

    In a perfect world, students would never talk back to school staff and never argue or fight with each other. They would complete all their assigned tasks, and disciplinary actions never would be needed. Unfortunately, people don't live in a perfect world. Student behavior is a daily concern. Teachers continue to refer students to the office as a…

  18. Quantifying uncertainties of permafrost carbon–climate feedbacks

    Directory of Open Access Journals (Sweden)

    E. J. Burke

    2017-06-01

    The land surface models JULES (Joint UK Land Environment Simulator, two versions) and ORCHIDEE-MICT (Organizing Carbon and Hydrology in Dynamic Ecosystems), each with a revised representation of permafrost carbon, were coupled to the Integrated Model Of Global Effects of climatic aNomalies (IMOGEN) intermediate-complexity climate and ocean carbon uptake model. IMOGEN calculates atmospheric carbon dioxide (CO2) and local monthly surface climate for a given emission scenario with the land–atmosphere CO2 flux exchange from either JULES or ORCHIDEE-MICT. These simulations include feedbacks associated with permafrost carbon changes in a warming world. Both IMOGEN–JULES and IMOGEN–ORCHIDEE-MICT were forced by historical and three alternative future-CO2-emission scenarios. Those simulations were performed for different climate sensitivities and regional climate change patterns based on 22 different Earth system models (ESMs) used for CMIP3 (phase 3 of the Coupled Model Intercomparison Project), allowing us to explore climate uncertainties in the context of permafrost carbon–climate feedbacks. Three future emission scenarios consistent with three representative concentration pathways were used: RCP2.6, RCP4.5 and RCP8.5. Paired simulations with and without frozen carbon processes were required to quantify the impact of the permafrost carbon feedback on climate change. The additional warming from the permafrost carbon feedback is between 0.2 and 12 % of the change in the global mean temperature (ΔT) by the year 2100 and 0.5 and 17 % of ΔT by 2300, with these ranges reflecting differences in land surface models, climate models and emissions pathway. As a percentage of ΔT, the permafrost carbon feedback has a greater impact on the low-emissions scenario (RCP2.6) than on the higher-emissions scenarios, suggesting that permafrost carbon should be taken into account when evaluating scenarios of heavy mitigation and stabilization

  19. Quantifying Livestock Heat Stress Impacts in the Sahel

    Science.gov (United States)

    Broman, D.; Rajagopalan, B.; Hopson, T. M.

    2014-12-01

    Livestock heat stress, especially in regions of the developing world with limited adaptive capacity, has a largely unquantified impact on food supply. Though dominated by ambient air temperature, heat stress is also affected by relative humidity, wind speed, and solar radiation; it can decrease livestock growth, milk production, and reproduction rates, and increase mortality. Indices like the thermal-humidity index (THI) are used to quantify the heat stress experienced from climate variables. Livestock experience differing impacts at different index critical thresholds that are empirically determined and specific to species and breed. This lack of understanding has been highlighted in several studies, with limited knowledge of the critical thresholds of heat stress in native livestock breeds, as well as of the current and future impact of heat stress. As adaptation and mitigation strategies to climate change depend on a solid quantitative foundation, this knowledge gap has limited such efforts. To address this lack of study, we have investigated heat stress impacts in the pastoral system of Sub-Saharan West Africa. We used a stochastic weather generator to quantify both the historic and future variability of heat stress. This approach models temperature, relative humidity, and precipitation, the climate variables controlling heat stress. Incorporating large-scale climate as covariates into this framework provides a better historical fit and allows us to include future CMIP5 GCM projections to examine the climate change impacts on heat stress. Health and production data allow us to examine the influence of this variability on livestock directly, and are considered in conjunction with the confounding impacts of fodder and water access. This understanding provides useful information to decision makers looking to mitigate the impacts of climate change and can provide useful seasonal forecasts of heat stress risk. A comparison of the current and future heat stress conditions based on
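
    For reference, one widely used THI variant for cattle combines dry-bulb temperature (deg C) and relative humidity (%); the critical thresholds remain species- and breed-specific, which is exactly the knowledge gap the abstract highlights:

```python
def thi(temp_c, rh_pct):
    """Temperature-humidity index (one common cattle variant)."""
    return 0.8 * temp_c + (rh_pct / 100.0) * (temp_c - 14.4) + 46.4

print(thi(35, 60))   # ~86.8: above commonly cited stress thresholds (~72-79)
```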

  20. Quantifying forest mortality with the remote sensing of snow

    Science.gov (United States)

    Baker, Emily Hewitt

    Greenhouse gas emissions have altered global climate significantly, increasing the frequency of drought, fire, and pest-related mortality in forests across the western United States, with increasing area affected each year. Associated changes in forests are of great concern for the public, land managers, and the broader scientific community. These increased stresses have resulted in a widespread, spatially heterogeneous decline of forest canopies, which in turn exerts strong controls on the accumulation and melt of the snowpack, and changes forest-atmosphere exchanges of carbon, water, and energy. Most satellite-based retrievals of summer-season forest data are insufficient to quantify canopy, as opposed to the combination of canopy and undergrowth, since the signals of the two types of vegetation greenness have proven persistently difficult to distinguish. To overcome this issue, this research develops a method to quantify forest canopy cover using winter-season fractional snow covered area (FSCA) data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) snow covered area and grain size (MODSCAG) algorithm. In areas where the ground surface and undergrowth are completely snow-covered, a pixel comprises only forest canopy and snow. Following a snowfall event, FSCA initially rises, as snow is intercepted in the canopy, and then falls, as snow unloads. A select set of local minima in a winter FSCA time series forms a threshold where canopy is snow-free, but forest understory is snow-covered. This serves as a spatially-explicit measurement of forest canopy, and viewable gap fraction (VGF) on a yearly basis. Using this method, we determine that MODIS-observed VGF is significantly correlated with an independent product of yearly crown mortality derived from spectral analysis of Landsat imagery at 25 high-mortality sites in northern Colorado (r = 0.96 ± 0.03, p = 0.03). Additionally, we determine the lag timing between green-stage tree mortality and

  1. Quantifying the Climate-Scale Accuracy of Satellite Cloud Retrievals

    Science.gov (United States)

    Roberts, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Liang, L.; Di Girolamo, L.

    2014-12-01

    Instrument calibration and cloud retrieval algorithms have been developed to minimize retrieval errors on small scales. However, measurement uncertainties and assumptions within retrieval algorithms at the pixel level may alias into decadal-scale trends of cloud properties. We first, therefore, quantify how instrument calibration changes could alias into cloud property trends. For a perfect observing system the climate trend accuracy is limited only by the natural variability of the climate variable. Alternatively, for an actual observing system, the climate trend accuracy is additionally limited by the measurement uncertainty. Drifts in calibration over time may therefore be disguised as a true climate trend. We impose absolute calibration changes to MODIS spectral reflectance used as input to the CERES Cloud Property Retrieval System (CPRS) and run the modified MODIS reflectance through the CPRS to determine the sensitivity of cloud properties to calibration changes. We then use these changes to determine the impact of instrument calibration changes on trend uncertainty in reflected solar cloud properties. Secondly, we quantify how much cloud retrieval algorithm assumptions alias into cloud optical retrieval trends by starting with the largest of these biases: the plane-parallel assumption in cloud optical thickness (τC) retrievals. First, we collect liquid water cloud fields obtained from Multi-angle Imaging Spectroradiometer (MISR) measurements to construct realistic probability distribution functions (PDFs) of 3D cloud anisotropy (a measure of the degree to which clouds depart from plane-parallel) for different ISCCP cloud types. Next, we will conduct a theoretical study with dynamically simulated cloud fields and a 3D radiative transfer model to determine the relationship between 3D cloud anisotropy and 3D τC bias for each cloud type. Combining these results provides distributions of 3D τC bias by cloud type. Finally, we will estimate the change in

  2. Quantifying Pollutant Emissions from Office Equipment Phase IReport

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.; McKone, T.E.; Perino, C.

    2006-12-01

    Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment

  3. Quantifying the Influence of Urbanization on a Coastal Floodplain

    Science.gov (United States)

    Sebastian, A.; Juan, A.; Bedient, P. B.

    2016-12-01

    The U.S. Gulf Coast is the fastest growing region in the United States; between 1960 and 2010, the number of housing units along the Gulf of Mexico increased by 246%, vastly outpacing growth in other parts of the country (NOAA 2013). Numerous studies have shown that increases in impervious surface associated with urbanization reduce infiltration and increase surface runoff. While empirical evidence suggests that changes in land use are leading to increased flood damage in overland areas, earlier studies have largely focused on the impacts of urbanization on surface runoff and watershed hydrology, rather than quantifying its influence on the spatial extent of flooding. In this study, we conduct a longitudinal assessment of the evolution of flood risk since 1970 in an urbanizing coastal watershed. Utilizing the distributed hydrologic model, Vflo®, in combination with the hydraulic model, HEC-RAS, we quantify the impact of localized land use/land cover (LULC) change on the spatial extent of flooding in the watershed and the underlying flood hazard structure. The results demonstrate that increases in impervious cover between 1970 and 2010 (34%) and 2010 and 2040 (18%) increase the size of the floodplain by 26 and 17%, respectively. Furthermore, the results indicate that the depth and frequency of flooding in neighborhoods within the 1% floodplain have increased substantially (see attached figure). Finally, this analysis provides evidence that outdated FEMA floodplain maps could be underestimating the extent of the floodplain by upwards of 25%, depending on the rate of urbanization in the watershed; and, that by incorporating physics-based distributed hydrologic models into floodplain studies, floodplain maps can be easily updated to reflect the most recent LULC information available. The methods presented in this study have important implications for the development of mitigation strategies in coastal areas, such as deterring future development in flood prone areas

  4. A method to quantify the "cone of economy".

    Science.gov (United States)

    Haddas, Ram; Lieberman, Isador H

    2018-05-01

    A non-randomized, prospective, concurrent control cohort study. The purpose of this study is to develop and evaluate a method to quantify the dimensions of the cone of economy (COE), and the energy expenditure associated with maintaining a balanced posture within the COE, in a group of adult degenerative scoliosis patients, and to compare them to matched non-scoliotic controls. Balance is defined as the ability of the human body to maintain its center of mass (COM) within the base of support with minimal postural sway. The cone of economy refers to the stable region of upright standing posture. The underlying assumption is that deviating outside one's individual cone challenges the balance mechanisms. Adult degenerative scoliosis (ADS) patients exhibit a variety of postural changes within their COE, involving the spine, pelvis and lower extremities, in their effort to compensate for the altered posture. Ten ADS patients and ten non-scoliotic volunteers performed a series of functional balance tests. The dimensions of the COE and the energy expenditure related to maintaining balance within the COE were measured using a human motion video capture system and dynamic surface electromyography. ADS patients presented more COM sway in the sagittal (ADS: 1.59 cm vs. H: 0.61 cm; p = 0.049) and coronal (ADS: 2.84 cm vs. H: 1.72 cm; p = 0.046) directions in comparison to the non-scoliotic controls. ADS patients presented with more COM (ADS: 33.30 cm vs. H: 19.13 cm; p = 0.039) and head (ADS: 31.06 cm vs. H: 19.13 cm; p = 0.013) displacements in comparison to the non-scoliotic controls. Scoliosis patients expended more muscle activity to maintain static standing, as manifest by increased muscle activity in their erector spinae (ADS: 37.16 mV vs. H: 20.31 mV; p = 0.050), and gluteus maximus (ADS: 33.12 mV vs. H: 12.09 mV; p = 0.001) muscles. We were able to develop and evaluate a method that quantifies the COE boundaries, COM displacement, and amount of sway within the COE.

  5. Quantifying morphological changes of cape-related shoals

    Science.gov (United States)

    Paniagua-Arroyave, J. F.; Adams, P. N.; Parra, S. M.; Valle-Levinson, A.

    2017-12-01

    The rising demand for marine resources has motivated the study of inner shelf transport processes, especially in locations with highly-developed coastlines, endangered-species habitats, and valuable economic resources. These characteristics are found at Cape Canaveral shoals, on the Florida Atlantic coast, where transport dynamics and morphological evolution are not well understood. To study morphological changes at these shoals, two sets of paired upward- and downward-pointing acoustic Doppler current profilers (ADCPs) were deployed in winter 2015-2016. One set was deployed at the inner swale of Shoal E, 20 km southeast of the cape tip at 13 m depth, while the other set was located at the edge of Southeast shoal at 5 m depth. Upward-pointing velocity profiles and suspended particle concentrations were implemented in the Exner equation to quantify instantaneous rates of change in bed elevation. This computation includes changes in sediment concentration and the advection of suspended particles, but does not account for spatial gradients in bed-load fluxes and water velocities. The results of the computation were then compared to bed change rates measured directly by the downward-pointing ADCPs. At the easternmost ridge, quantified bed elevation change rates ranged from -7×10^-7 to 4×10^-7 m/s, and those at the inner swale ranged from -4×10^-7 to 8×10^-7 m/s. These values were two orders of magnitude smaller than rates measured by downward-pointing ADCPs. Moreover, the cumulative changes were two orders of magnitude larger at the ridge (-0.33 m, downward, and -0.13 m, upward) than at the inner swale (cf. -6×10^-3 m, downward, and 3×10^-3 m, upward). These values suggest that bedform migration may be occurring at the ridge, that suspended sediments account for up to 30% of total bed changes, and that gradients in bed-load fluxes exert control on morphological change over the shoals. Despite uncertainties related to the ADCP-derived sediment concentrations, these

  6. Identifying Selected Behavioral Determinants of Risk and Uncertainty on the Real Estate Market

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2014-07-01

    Various market behaviors can be characterized as risky or uncertain, thus their observation is important to the real estate market system. The extensive use of behavioral factors facilitates their implementation and studies in relation to the real estate market system. The behavioral approach has established its own instrumentation which enables elements of risk and uncertainty to be quantified.

  7. Effect of downed woody debris on small mammal anti-predator behavior

    Science.gov (United States)

    Travis M. Hinkelman; John L. Orrock; Susan C Loeb

    2011-01-01

    Anti-predator behavior can affect prey growth, reproduction, survival, and generate emergent effects in food webs. Small mammals often lower the cost of predation by altering their behavior in response to shrubs, but the importance of other microhabitat features, such as downed woody debris, for anti-predator behavior is unknown. We used giving-up densities to quantify...

  8. Quantifying soil burn severity for hydrologic modeling to assess post-fire effects on sediment delivery

    Science.gov (United States)

    Dobre, Mariana; Brooks, Erin; Lew, Roger; Kolden, Crystal; Quinn, Dylan; Elliot, William; Robichaud, Pete

    2017-04-01

    Soil erosion is a secondary fire effect with great implications for many ecosystem resources. Depending on the burn severity, topography, and the weather immediately after the fire, soil erosion can impact municipal water supplies, degrade water quality, and reduce reservoirs' storage capacity. Scientists and managers use field and remotely sensed data to quickly assess post-fire burn severity in ecologically-sensitive areas. From these assessments, mitigation activities are implemented to minimize post-fire flood and soil erosion and to facilitate post-fire vegetation recovery. Alternatively, land managers can use fire behavior and spread models (e.g. FlamMap, FARSITE, FOFEM, or CONSUME) to identify sensitive areas a priori, and apply strategies such as fuel reduction treatments to proactively minimize the risk of wildfire spread and increased burn severity. There is a growing interest in linking fire behavior and spread models with hydrology-based soil erosion models to provide site-specific assessment of mitigation treatments on post-fire runoff and erosion. The challenge remains, however, that many burn severity mapping and modeling products quantify vegetation loss rather than measuring soil burn severity. Wildfire burn severity is spatially heterogeneous and depends on the pre-fire vegetation cover, fuel load, topography, and weather. Severities also differ depending on the variable of interest (e.g. soil, vegetation). In the United States, Burned Area Reflectance Classification (BARC) maps, derived from Landsat satellite images, are used as an initial burn severity assessment. BARC maps are classified from either a Normalized Burn Ratio (NBR) or differenced Normalized Burn Ratio (dNBR) scene into four classes (Unburned, Low, Moderate, and High severity). The development of soil burn severity maps requires further manual field validation efforts to transform the BARC maps into a product more applicable for post-fire soil rehabilitation activities
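
    The NBR/dNBR computation behind BARC-style maps is compact; the sketch below uses commonly cited illustrative class breaks, whereas operational BARC thresholds are calibrated per scene:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance arrays."""
    return (nir - swir) / (nir + swir + 1e-12)

def classify_dnbr(nir_pre, swir_pre, nir_post, swir_post,
                  breaks=(0.10, 0.27, 0.66)):       # illustrative class breaks
    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    classes = np.digitize(dnbr, breaks)             # 0=unburned .. 3=high severity
    return dnbr, classes
```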

  9. Comparing strategies to assess multiple behavior change in behavioral intervention studies.

    Science.gov (United States)

    Drake, Bettina F; Quintiliani, Lisa M; Sapp, Amy L; Li, Yi; Harley, Amy E; Emmons, Karen M; Sorensen, Glorian

    2013-03-01

    Alternatives to individual behavior change methods have been proposed; however, little has been done to investigate how these methods compare. To explore four methods that quantify change in multiple risk behaviors targeting four common behaviors. We utilized data from two cluster-randomized, multiple behavior change trials conducted in two settings: small businesses and health centers. Methods used were: (1) summative; (2) z-score; (3) optimal linear combination; and (4) impact score. In the Small Business study, methods 2 and 3 revealed similar outcomes. However, physical activity did not contribute to method 3. In the Health Centers study, similar results were found with each of the methods. Multivitamin intake contributed significantly more to each of the summary measures than other behaviors. Selection of methods to assess multiple behavior change in intervention trials must consider study design and the targeted population when determining the appropriate methods to use.
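
    Two of the four composites (summative and z-score) are simple to state in code; the behavior names, goals, and data below are invented for illustration:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
behaviors = ["fruit_veg", "red_meat", "activity", "multivitamin"]
df = pd.DataFrame(rng.normal(size=(100, 4)), columns=behaviors)   # standardized intakes
goals = {"fruit_veg": 0.5, "red_meat": -0.5, "activity": 0.0, "multivitamin": 0.2}

# (1) Summative: count of targeted behaviors at or above goal per person.
summative = sum((df[b] >= g).astype(int) for b, g in goals.items())

# (2) z-score composite: standardize each behavior, then average across behaviors.
zcomposite = ((df - df.mean()) / df.std(ddof=0)).mean(axis=1)
```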

  10. Quantifiers and the Foundations of Quasi-Set Theory

    Directory of Open Access Journals (Sweden)

    Jonas R. Becker Arenhart

    2009-12-01

    In this paper we discuss some questions proposed by Prof. Newton da Costa on the foundations of quasi-set theory. His main doubts concern the possibility of a reasonable semantical understanding of the theory, mainly due to the fact that identity and difference do not apply to some entities of the theory’s intended domain of discourse. According to him, the quantifiers employed in the theory, when understood in the usual way, rely on the assumption that identity applies to all entities in the domain of discourse. Inspired by his provocation, we suggest that, using some ideas presented by da Costa himself in his seminars at UFSC (the Federal University of Santa Catarina) and by one of us (DK) in some papers, these difficulties can be overcome both on a formal level and on an informal level, showing how quantification over items for which identity does not make sense can be understood without presupposing a semantics based on a ‘classical’ set theory.

  11. Quantifying food losses and the potential for reduction in Switzerland.

    Science.gov (United States)

    Beretta, Claudio; Stoessel, Franziska; Baier, Urs; Hellweg, Stefanie

    2013-03-01

    A key element in making our food systems more efficient is the reduction of food losses across the entire food value chain. Nevertheless, food losses are often neglected. This paper quantifies food losses in Switzerland at the various stages of the food value chain (agricultural production, postharvest handling and trade, processing, food service industry, retail, and households), identifies hotspots and analyses the reasons for losses. Twenty-two food categories are modelled separately in a mass and energy flow analysis, based on data from 31 companies within the food value chain, and from public institutions, associations, and from the literature. The energy balance shows that 48% of the total calories produced (edible crop yields at harvest time and animal products, including slaughter waste) is lost across the whole food value chain. Half of these losses would be avoidable given appropriate mitigation measures. Most avoidable food losses occur at the household, processing, and agricultural production stage of the food value chain. Households are responsible for almost half of the total avoidable losses (in terms of calorific content). Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Quantifying stretching and rearrangement in epithelial sheet migration

    International Nuclear Information System (INIS)

    Lee, Rachel M; Nordstrom, Kerstin N; Losert, Wolfgang; Kelley, Douglas H; Ouellette, Nicholas T

    2013-01-01

    Although understanding the collective migration of cells, such as that seen in epithelial sheets, is essential for understanding diseases such as metastatic cancer, this motion is not yet as well characterized as individual cell migration. Here we adapt quantitative metrics used to characterize the flow and deformation of soft matter to contrast different types of motion within a migrating sheet of cells. Using a finite-time Lyapunov exponent (FTLE) analysis, we find that—in spite of large fluctuations—the flow field of an epithelial cell sheet is not chaotic. Stretching of a sheet of cells (i.e. positive FTLE) is localized at the leading edge of migration and increases when the cells are more highly stimulated. By decomposing the motion of the cells into affine and non-affine components using the metric D^2_min, we quantify local plastic rearrangements and describe the motion of a group of cells in a novel way. We find an increase in plastic rearrangements with increasing cell densities, whereas inanimate systems tend to exhibit less non-affine rearrangements with increasing density.
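
    The non-affine measure referred to here (the Falk-Langer D^2_min) fits the best local affine deformation around each cell and scores the residual motion. A minimal sketch for one cell and its neighbors:

```python
import numpy as np

def d2min(r0, r1, i, neighbors):
    """r0, r1: (n, 2) positions at two times; i: cell index; neighbors: indices."""
    a = r0[neighbors] - r0[i]        # relative positions before
    b = r1[neighbors] - r1[i]        # relative positions after
    X = b.T @ a                      # cross-correlation of after vs. before
    Y = a.T @ a                      # self-correlation of before
    F = X @ np.linalg.inv(Y)         # best-fit local affine transform
    resid = b - a @ F.T              # non-affine residual displacements
    return np.sum(resid**2)          # D^2_min: plastic rearrangement measure
```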

  13. Quantifying the provenance of aeolian sediments using multiple composite fingerprints

    Science.gov (United States)

    Liu, Benli; Niu, Qinghe; Qu, Jianjun; Zu, Ruiping

    2016-09-01

    We introduce a new fingerprinting method that uses multiple composite fingerprints for studies of aeolian sediment provenance. We used this method to quantify the provenance of sediments on both sides of the Qinghai-Tibetan Railway (QTR) in the Cuona Lake section of the Tibetan Plateau (TP), in an environment characterized by aeolian and fluvial interactions. The method involves repeatedly solving a linear mixing model based on mass conservation; the model is not limited to spatial scale or transport types and uses all the tracer groups that passed the range check, Kruskal-Wallis H-test, and a strict analytical solution screening. The proportional estimates that result from using different composite fingerprints are highly variable; however, the average of these fingerprints has a greater accuracy and certainty than any single fingerprint. The results show that sand from the lake beach, hilly surface, and gullies contribute, respectively, 48%, 31% and 21% to the western railway sediments and 43%, 33% and 24% to the eastern railway sediments. The difference between contributions from various sources on either side of the railway, which may increase in the future, was clearly related to variations in local transport characteristics, a conclusion that is supported by grain size analysis. The construction of the QTR changed the local cycling of materials, and the difference in provenance between the sediments that are separated by the railway reflects the changed sedimentary conditions on either side of the railway. The effectiveness of this method suggests that it will be useful in other studies of aeolian sediments.
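
    At the core of such fingerprinting is a linear mixing model: find nonnegative source proportions, summing to one, that best reproduce the mixture's tracer concentrations. A minimal sketch with illustrative numbers (the study's strict solution screening and repeated composite fingerprints are not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize

S = np.array([[12.0,  3.5, 40.0],     # rows: tracers; columns: sources
              [ 0.8,  2.1,  1.3],
              [55.0, 30.0, 18.0]])
m = np.array([18.0, 1.4, 35.0])       # tracer concentrations in the mixture

def misfit(p):
    return np.sum(((S @ p - m) / m) ** 2)   # relative least squares

res = minimize(misfit, x0=np.full(3, 1 / 3), method="SLSQP",
               bounds=[(0.0, 1.0)] * 3,
               constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0})
print(res.x)   # estimated proportional contribution of each source
```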

  14. Quantifying mechanical force in axonal growth and guidance

    Directory of Open Access Journals (Sweden)

    Ahmad Ibrahim Mahmoud Athamneh

    2015-09-01

Mechanical force plays a fundamental role in neuronal development, physiology, and regeneration. In particular, research has shown that force is involved in growth cone-mediated axonal growth and guidance as well as stretch-induced elongation when an organism increases in size after forming initial synaptic connections. However, many details about the exact role of force in these fundamental processes remain unknown. In this review, we highlight (1) standing questions concerning the role of mechanical force in axonal growth and guidance and (2) different experimental techniques used to quantify forces in axons and growth cones. We believe that satisfying answers to these questions will require quantitative information about the relationship between elongation, forces, cytoskeletal dynamics, axonal transport, signaling, substrate adhesion, and stiffness contributing to directional growth advance. Furthermore, we address why a wide range of force values has been reported in the literature, and what these values mean in the context of neuronal mechanics. We hope that this review will provide a guide for those interested in studying the role of force in development and regeneration of neuronal networks.

  15. Quantifying Changes in Accessible Water in the Colorado River Basin

    Science.gov (United States)

    Castle, S.; Thomas, B.; Reager, J. T.; Swenson, S. C.; Famiglietti, J. S.

    2013-12-01

The Colorado River Basin (CRB) in the western United States is heavily managed yet remains one of the most over-allocated rivers in the world, providing water across seven US states and Mexico. Water management strategies in the CRB have employed land surface models to forecast discharges; such approaches have focused on discharge estimates to meet allocation requirements yet ignore groundwater abstractions to meet water demands. In this analysis, we illustrate the impact of changes in accessible water, which we define as the conjunctive use of both surface water reservoir storage and groundwater storage, using remote sensing observations to explore sustainable water management strategies in the CRB. We employ high resolution Landsat Thematic Mapper satellite data to detect changes in reservoir storage in the two largest reservoirs within the CRB, Lakes Mead and Powell, and the Gravity Recovery and Climate Experiment (GRACE) terrestrial water storage anomalies to isolate changes in basin-wide groundwater storage in the Upper and Lower CRB from October 2003 to December 2012. Our approach quantifies reservoir and groundwater storage within the CRB using remote sensing to provide new information to water managers to sustainably and conjunctively manage accessible water.
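
    The storage bookkeeping behind this kind of approach reduces to simple anomaly arithmetic: groundwater is what remains of the GRACE total water storage signal once the independently estimated stores are subtracted. A toy sketch with invented numbers:

    ```python
    import numpy as np

    # Toy monthly storage anomalies in km^3 (synthetic numbers, for illustration).
    tws_grace = np.array([5.0, 2.0, -1.0, -4.0])   # GRACE total water storage
    soil_snow = np.array([3.0, 1.5, 0.0, -1.0])    # modeled soil moisture + snow
    reservoir = np.array([1.0, 0.5, -0.5, -1.5])   # Landsat-derived reservoir storage

    # Groundwater is the residual of total storage after the other stores:
    groundwater = tws_grace - soil_snow - reservoir
    # "Accessible water" combines conjunctively used surface and ground storage:
    accessible = reservoir + groundwater
    print(groundwater, accessible)
    ```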

  16. Quantifying dose to the reconstructed breast: Can we adequately treat?

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Eugene; Marsh, Robin B.; Griffith, Kent A.; Moran, Jean M. [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States); Pierce, Lori J., E-mail: ljpierce@umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States)

    2013-04-01

To evaluate how immediate reconstruction (IR) impacts postmastectomy radiotherapy (PMRT) dose distributions to the reconstructed breast (RB), internal mammary nodes (IMN), heart, and lungs using quantifiable dosimetric end points, 3D conformal plans were developed for 20 IR patients: 10 with autologous reconstruction (AR) and 10 with expander-implant (EI) reconstruction. For each reconstruction type, 5 right- and 5 left-sided reconstructions were selected. Two plans were created for each patient, 1 with RB coverage alone and 1 with RB + IMN coverage. Left-sided EI plans without IMN coverage had a higher heart Dmean than left-sided AR plans (2.97 and 0.84 Gy, p = 0.03). Otherwise, results did not vary by reconstruction type and all remaining metrics were evaluated using a combined AR and EI dataset. RB coverage was adequate regardless of laterality or IMN coverage (Dmean 50.61 Gy, D95 45.76 Gy). When included, IMN Dmean and D95 were 49.57 and 40.96 Gy, respectively. Mean heart doses increased with left-sided treatment plans and IMN inclusion. Right-sided treatment plans and IMN inclusion increased mean lung V20. Using standard field arrangements and 3D planning, we observed excellent coverage of the RB and IMN, regardless of laterality or reconstruction type. Our results demonstrate that adequate doses can be delivered to the RB with or without IMN coverage.
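
    The reported end points (Dmean, D95, V20) are simple functionals of a structure's voxel-dose distribution. As a rough illustration with hypothetical voxel doses rather than clinical data:

    ```python
    import numpy as np

    def dvh_metrics(dose, v_threshold=20.0):
        """Dose metrics from the voxel doses (Gy) of one structure.

        D95 is the minimum dose received by the hottest 95% of the volume,
        i.e. the 5th percentile of the voxel-dose distribution; V20 is the
        fractional volume receiving at least 20 Gy.
        """
        dose = np.asarray(dose, dtype=float)
        d_mean = dose.mean()
        d95 = np.percentile(dose, 5.0)
        v20 = np.mean(dose >= v_threshold)
        return d_mean, d95, v20

    # Demo on synthetic voxel doses centred near the prescription dose:
    rng = np.random.default_rng(0)
    print(dvh_metrics(rng.normal(50.0, 2.0, 10_000)))
    ```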

  17. Quantifying Pemex E and P benefits from foreign strategic associations

    International Nuclear Information System (INIS)

    Baker, G.

    1993-01-01

The recent critique by the Heritage Foundation of the management of Mexico's petroleum sector deserves attention by oil industry observers of Mexico as well as government and academic analysts. The foundation argues that sweeping changes are needed, for Mexico's own good, in upstream and downstream policy. The following analysis presents one way of quantifying the upstream argument: the Mexican government stands to gain $5.7 billion/year in taxable revenue from state petroleum company Petroleos Mexicanos by 2000 from strategic associations with international petroleum companies. In addition, there are efficiency advantages that Pemex would gain by strategic associations with international petroleum companies. The ripple effects would include not only oil production and tax revenue increases from Pemex operations but also the advantages of new management skills, new sources of funding and technology, and lower operating and overhead costs in Pemex. It will not be possible, however, for the Mexican government to choose one or more strategic partners by quantitative methods alone. To the contrary, a significant measure of trust will be required.

  18. Wikipedia Culture Gap: Quantifying Content Imbalances Across 40 Language Editions

    Directory of Open Access Journals (Sweden)

    Marc Miquel-Ribé

    2018-06-01

The online encyclopedia Wikipedia is the largest general information repository created through collaborative efforts from all over the globe. Despite the project's goal of achieving the sum of human knowledge, there are strong content imbalances across the language editions. In order to quantify and investigate these imbalances, we study the impact of cultural context in 40 language editions. To this purpose, we developed a computational method to identify articles that can be related to the editors' cultural context associated with each Wikipedia language edition. We employed a combination of strategies taking into account geolocated articles, specific keywords and categories, as well as links between articles. We verified the method's quality with manual assessment and found an average precision of 0.92 and an average recall of 0.95. The results show that about a quarter of each Wikipedia language edition is dedicated to representing the corresponding cultural context. Although a considerable part of this content was created during the first years of the project, its creation is sustained over time. An analysis of cross-language coverage of this content shows that most of it is unique in its original language, and reveals special links between cultural contexts; at the same time, it highlights gaps where the encyclopedia could extend its content. The approach and findings presented in this study can help to foster participation and inter-cultural enrichment of Wikipedias. The datasets produced are made available for further research.

  19. QUANTIFYING KINEMATIC SUBSTRUCTURE IN THE MILKY WAY'S STELLAR HALO

    International Nuclear Information System (INIS)

    Xue Xiangxiang; Zhao Gang; Luo Ali; Rix, Hans-Walter; Bell, Eric F.; Koposov, Sergey E.; Kang, Xi; Liu, Chao; Yanny, Brian; Beers, Timothy C.; Lee, Young Sun; Bullock, James S.; Johnston, Kathryn V.; Morrison, Heather; Rockosi, Constance; Weaver, Benjamin A.

    2011-01-01

We present and analyze the positions, distances, and radial velocities for over 4000 blue horizontal-branch (BHB) stars in the Milky Way's halo, drawn from SDSS DR8. We search for position-velocity substructure in these data, a signature of the hierarchical assembly of the stellar halo. Using a cumulative 'close pair distribution' as a statistic in the four-dimensional space of sky position, distance, and velocity, we quantify the presence of position-velocity substructure at high statistical significance among the BHB stars: pairs of BHB stars that are close in position on the sky tend to have more similar distances and radial velocities compared to a random sampling of these overall distributions. We make analogous mock observations of 11 numerical halo formation simulations, in which the stellar halo is entirely composed of disrupted satellite debris, and find a level of substructure comparable to that seen in the actually observed BHB star sample. This result quantitatively confirms the hierarchical build-up of the stellar halo through a signature in phase (position-velocity) space. In detail, the structure present in the BHB stars is somewhat less prominent than that seen in most simulated halos, quite possibly because BHB stars represent an older sub-population. BHB stars located beyond 20 kpc from the Galactic center exhibit stronger substructure than those at r_gc < 20 kpc.
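
    A minimal version of such a close-pair statistic, with synthetic stars and invented scaling constants, might look as follows; comparing the observed fraction of close pairs against a sample with shuffled distances and velocities mimics the comparison to a random sampling of the overall distributions.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def close_pair_fraction(ra, dec, dist, vrad, scales, r_max=1.0):
        """Fraction of star pairs with small combined separation in the
        four-dimensional (sky position, distance, velocity) space; each
        coordinate is divided by a characteristic scale so that one
        dimensionless pair separation can be formed."""
        pts = np.column_stack([ra, dec, dist, vrad]) / np.asarray(scales)
        return np.mean(pdist(pts) < r_max)

    rng = np.random.default_rng(2)
    # Synthetic "halo": 40 smooth stars plus a 10-star stream-like clump.
    ra = np.concatenate([rng.uniform(0, 360, 40), 180 + rng.normal(0, 1, 10)])
    dec = np.concatenate([rng.uniform(-30, 60, 40), 20 + rng.normal(0, 1, 10)])
    d = np.concatenate([rng.uniform(5, 60, 40), 30 + rng.normal(0, 0.5, 10)])
    v = np.concatenate([rng.normal(0, 120, 40), 150 + rng.normal(0, 5, 10)])
    scales = (10.0, 10.0, 5.0, 25.0)   # invented characteristic scales

    observed = close_pair_fraction(ra, dec, d, v, scales)
    # Null model: shuffle distances/velocities to erase phase-space correlations.
    null = close_pair_fraction(ra, dec, rng.permutation(d), rng.permutation(v), scales)
    print(f"close pairs: observed {observed:.3f} vs shuffled {null:.3f}")
    ```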

  20. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate the performance and accuracy of the approach and its limitations.
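
    The "single extra dimension" can be made concrete with a toy one-dimensional ODE: along each characteristic the Liouville equation reduces to d(rho)/dt = -rho * div f, so the density value rides along as one extra state variable. A minimal sketch (illustrative, not the paper's code):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy ODE: dx/dt = -x + 1 (a single state), with an uncertain initial condition.
    f = lambda x: -x + 1.0
    df_dx = lambda x: -1.0        # divergence of the vector field

    def characteristics(t, y):
        """Original ODE extended by one dimension carrying the density value:
        along a trajectory, d(rho)/dt = -rho * div f (Liouville equation)."""
        x, rho = y
        return [f(x), -rho * df_dx(x)]

    # Propagate a Gaussian prior density on x(0) along a few characteristics.
    x0 = np.linspace(-2.0, 4.0, 7)
    rho0 = np.exp(-0.5 * x0**2) / np.sqrt(2 * np.pi)
    for xi, ri in zip(x0, rho0):
        sol = solve_ivp(characteristics, (0.0, 2.0), [xi, ri])
        print(f"x(2) = {sol.y[0, -1]:+.3f}, density there = {sol.y[1, -1]:.4f}")
    ```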

  1. Quantifying the dilution effect for models in ecological epidemiology.

    Science.gov (United States)

    Roberts, M G; Heesterbeek, J A P

    2018-03-01

The dilution effect, where an increase in biodiversity results in a reduction in the prevalence of an infectious disease, has been the subject of speculation and controversy. Conversely, an amplification effect occurs when increased biodiversity is related to an increase in prevalence. We explore the conditions under which these effects arise, using multi-species compartmental models that integrate ecological and epidemiological interactions. We introduce three potential metrics for quantifying dilution and amplification: one based on infection prevalence in a focal host species, one based on the size of the infected subpopulation of that species, and one based on the basic reproduction number. We introduce our approach in the simplest epidemiological setting with two species, and show that the existence and strength of a dilution effect is influenced strongly by the choices made to describe the system and the metric used to gauge the effect. We show that our method can be generalized to any number of species and to more complicated ecological and epidemiological dynamics. Our method allows a rigorous analysis of ecological systems where dilution effects have been postulated, and contributes to future progress in understanding the phenomenon of dilution in the context of infectious disease dynamics and infection risk. © 2018 The Author(s).
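
    As a hedged illustration of the third metric, the basic reproduction number of a two-host model can be computed as the spectral radius of the next-generation matrix; the density-dependent SIR skeleton and the parameter values below are invented and far simpler than the paper's eco-epidemiological models.

    ```python
    import numpy as np

    # Hypothetical two-host parameters at the disease-free equilibrium:
    # beta[i, j] = transmission rate from infected host j to susceptible host i,
    # N = host densities, gamma = recovery/removal rates.
    beta = np.array([[0.30, 0.05],
                     [0.10, 0.20]])
    N = np.array([100.0, 50.0])
    gamma = np.array([0.5, 0.4])

    F = beta * N[:, None]          # new infections in host i caused by host j
    V = np.diag(gamma)             # removal of infecteds
    ngm = F @ np.linalg.inv(V)     # next-generation matrix
    R0 = max(abs(np.linalg.eigvals(ngm)))
    print(f"R0 = {R0:.2f}")        # dilution: check how R0 changes as N[1] grows
    ```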

  2. Quantifying soil moisture impacts on light use efficiency across biomes.

    Science.gov (United States)

    Stocker, Benjamin D; Zscheischler, Jakob; Keenan, Trevor F; Prentice, I Colin; Peñuelas, Josep; Seneviratne, Sonia I

    2018-06-01

    Terrestrial primary productivity and carbon cycle impacts of droughts are commonly quantified using vapour pressure deficit (VPD) data and remotely sensed greenness, without accounting for soil moisture. However, soil moisture limitation is known to strongly affect plant physiology. Here, we investigate light use efficiency, the ratio of gross primary productivity (GPP) to absorbed light. We derive its fractional reduction due to soil moisture (fLUE), separated from VPD and greenness changes, using artificial neural networks trained on eddy covariance data, multiple soil moisture datasets and remotely sensed greenness. This reveals substantial impacts of soil moisture alone that reduce GPP by up to 40% at sites located in sub-humid, semi-arid or arid regions. For sites in relatively moist climates, we find, paradoxically, a muted fLUE response to drying soil, but reduced fLUE under wet conditions. fLUE identifies substantial drought impacts that are not captured when relying solely on VPD and greenness changes and, when seasonally recurring, are missed by traditional, anomaly-based drought indices. Counter to common assumptions, fLUE reductions are largest in drought-deciduous vegetation, including grasslands. Our results highlight the necessity to account for soil moisture limitation in terrestrial primary productivity data products, especially for drought-related assessments. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  3. Evolutions in food marketing, quantifying the impact, and policy implications.

    Science.gov (United States)

    Cairns, Georgina

    2013-03-01

A case study on interactive digital marketing examined the adequacy of extant policy controls, and of their underpinning paradigms, to constrain the effects of this rapidly emerging practice. The findings were that interactive digital marketing is expanding the strategies available to promote products, brands and consumer behaviours. It facilitates relational marketing; the collection of personal data for marketing; and integration of the marketing mix, and it provides a platform for consumers to engage in the co-creation of marketing communications. The paradigmatic logic of current policies to constrain youth-oriented food marketing does not address the interactive nature of digital marketing. The evidence base on the effects of HFSS (high in fat, salt and sugar) food marketing and policy interventions is based on conceptualizations of marketing as a force promoting transactions rather than interactions. Digital technologies are generating rich consumer data. Interactive digital technologies increase the complexity of the task of quantifying the impact of marketing, and the rapidity of their uptake increases the urgency of identifying appropriate effects measures. Independent analysis of commercial consumer data (appropriately transformed to protect commercial confidentiality and personal privacy) would provide evidence sources for policy on the impacts of commercial food and beverage marketing and policy controls. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Quantifying the abnormal hemodynamics of sickle cell anemia

    Science.gov (United States)

    Lei, Huan; Karniadakis, George

    2012-02-01

Sickle red blood cells (SS-RBC) exhibit heterogeneous morphologies and abnormal hemodynamics in deoxygenated states. A multi-scale model for SS-RBC is developed based on the Dissipative Particle Dynamics (DPD) method. Different cell morphologies (sickle, granular, elongated shapes) typically observed in deoxygenated states are constructed and quantified by the asphericity and elliptical shape factors. The hemodynamics of SS-RBC suspensions is studied in both shear and pipe flow systems. The flow resistance obtained from both systems exhibits a larger value than that of healthy blood flow due to the abnormal cell properties. Moreover, SS-RBCs exhibit abnormal adhesive interactions with both the vessel endothelium cells and the leukocytes. The effect of the abnormal adhesive interactions on the hemodynamics of sickle blood is investigated using the current model. It is found that both the SS-RBC - endothelium and the SS-RBC - leukocyte interactions can potentially trigger the vicious 'sickling and entrapment' cycles, resulting in the vaso-occlusion phenomena widely observed in micro-circulation experiments.

  5. Quantifying the performance of individual players in a team activity.

    Science.gov (United States)

    Duch, Jordi; Waitzman, Joshua S; Amaral, Luís A Nunes

    2010-06-16

Teamwork is a fundamental aspect of many human activities, from business to art and from sports to science. Recent research suggests that teamwork is of crucial importance to cutting-edge scientific research, but little is known about how teamwork leads to greater creativity. Indeed, for many team activities, it is not even clear how to assign credit to individual team members. Remarkably, at least in the context of sports, there is usually a broad consensus on who the top performers are and on what qualifies as an outstanding performance. In order to determine how individual features can be quantified, and as a test bed for other team-based human activities, we analyze the performance of players in the European Cup 2008 soccer tournament. We develop a network approach that provides a powerful quantification of the contributions of individual players and of overall team performance. We hypothesize that generalizations of our approach could be useful in other contexts where quantification of the contributions of individual team members is important.

  6. Quantifying the performance of individual players in a team activity.

    Directory of Open Access Journals (Sweden)

    Jordi Duch

    2010-06-01

Teamwork is a fundamental aspect of many human activities, from business to art and from sports to science. Recent research suggests that teamwork is of crucial importance to cutting-edge scientific research, but little is known about how teamwork leads to greater creativity. Indeed, for many team activities, it is not even clear how to assign credit to individual team members. Remarkably, at least in the context of sports, there is usually a broad consensus on who the top performers are and on what qualifies as an outstanding performance. In order to determine how individual features can be quantified, and as a test bed for other team-based human activities, we analyze the performance of players in the European Cup 2008 soccer tournament. We develop a network approach that provides a powerful quantification of the contributions of individual players and of overall team performance. We hypothesize that generalizations of our approach could be useful in other contexts where quantification of the contributions of individual team members is important.
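
    As a rough illustration of how a passing network can assign credit to individual players, consider the following absorbing-chain sketch; the pass counts are invented and the calculation is a simplified stand-in in the spirit of, not identical to, the flow-based quantification developed in the paper.

    ```python
    import numpy as np

    # Toy passing network: passes[i, j] counts passes from player i to player j;
    # shots[i] counts possessions player i ends with a shot (hypothetical data).
    passes = np.array([[0, 5, 2],
                       [3, 0, 6],
                       [1, 4, 0]], dtype=float)
    shots = np.array([1.0, 3.0, 2.0])

    # Row-normalize so P[i, j] is the probability the ball moves i -> j,
    # and s[i] the probability a possession at i ends in a shot.
    out = passes.sum(axis=1) + shots
    P = passes / out[:, None]
    s = shots / out

    # Expected visits per possession solve x = start + P^T x, assuming every
    # player starts possessions equally often.
    start = np.ones(3) / 3
    x = np.linalg.solve(np.eye(3) - P.T, start)   # ball visits per possession
    flow = x * s                                  # shots generated at each player
    print(np.round(x, 2), np.round(flow, 2))
    ```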

  7. Quantifying interictal metabolic activity in human temporal lobe epilepsy

    International Nuclear Information System (INIS)

    Henry, T.R.; Mazziotta, J.C.; Engel, J. Jr.; Christenson, P.D.; Zhang, J.X.; Phelps, M.E.; Kuhl, D.E.

    1990-01-01

The majority of patients with complex partial seizures of unilateral temporal lobe origin have interictal temporal hypometabolism on [18F]fluorodeoxyglucose positron emission tomography (FDG PET) studies. Often, this hypometabolism extends to ipsilateral extratemporal sites. The use of accurately quantified metabolic data has been limited by the absence of an equally reliable method of anatomical analysis of PET images. We developed a standardized method for visual placement of anatomically configured regions of interest on FDG PET studies, which is particularly adapted to the widespread, asymmetric, and often severe interictal metabolic alterations of temporal lobe epilepsy. This method was applied by a single investigator, who was blind to the identity of subjects, to 10 normal control and 25 interictal temporal lobe epilepsy studies. All subjects had normal brain anatomical volumes on structural neuroimaging studies. The results demonstrate ipsilateral thalamic and temporal lobe involvement in the interictal hypometabolism of unilateral temporal lobe epilepsy. Ipsilateral frontal, parietal, and basal ganglial metabolism is also reduced, although not as markedly as temporal and thalamic metabolism.

  8. Quantifying repetitive speech in autism spectrum disorders and language impairment.

    Science.gov (United States)

    van Santen, Jan P H; Sproat, Richard W; Hill, Alison Presmanes

    2013-10-01

We report on an automatic technique for quantifying two types of repetitive speech: repetitions of what the child says him/herself (self-repeats) and of what is uttered by an interlocutor (echolalia). We apply this technique to a sample of 111 children between the ages of four and eight: 42 typically developing children (TD), 19 children with specific language impairment (SLI), 25 children with autism spectrum disorders (ASD) plus language impairment (ALI), and 25 children with ASD with normal, non-impaired language (ALN). The results indicate robust differences in echolalia between the TD and ASD groups as a whole (ALN + ALI), and between TD and ALN children. There were no significant differences between ALI and SLI children for echolalia or self-repetitions. The results confirm previous findings that children with ASD repeat the language of others more than other populations of children. On the other hand, self-repetition does not appear to be significantly more frequent in ASD, nor does it matter whether the child's echolalia occurred within one turn (immediate) or two turns (near-immediate) of the adult's original utterance. Furthermore, the non-significant differences between ALN and SLI, between TD and SLI, and between ALI and TD suggest that echolalia may not be specific to ALN or to ASD in general. One important innovation of this work is an objective, fully automatic technique for assessing the amount of repetition in a transcript of a child's utterances. © 2013 International Society for Autism Research, Wiley Periodicals, Inc.
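
    A toy stand-in for such an automatic repetition detector, based on simple token overlap between nearby turns (the thresholds and windowing are illustrative assumptions, not the authors' algorithm):

    ```python
    def repetition_scores(turns, window=2, min_overlap=0.6):
        """Rates of echolalia and self-repeats in a transcript.

        turns: ordered list of (speaker, utterance) pairs. A child utterance
        counts as a repeat when the fraction of its word tokens already present
        in a recent utterance exceeds min_overlap; the nearest matching turn
        decides whether it is echolalia (adult source) or a self-repeat.
        """
        echo = selfrep = child_turns = 0
        for i, (speaker, utterance) in enumerate(turns):
            if speaker != "child":
                continue
            child_turns += 1
            tokens = set(utterance.lower().split())
            if not tokens:
                continue
            # Look back over the most recent turns, nearest first.
            for j in range(i - 1, max(-1, i - window - 1), -1):
                prev_speaker, prev_utterance = turns[j]
                overlap = len(tokens & set(prev_utterance.lower().split())) / len(tokens)
                if overlap >= min_overlap:
                    if prev_speaker == "child":
                        selfrep += 1
                    else:
                        echo += 1
                    break
        return echo / child_turns, selfrep / child_turns

    print(repetition_scores([("adult", "do you want the ball"),
                             ("child", "want the ball"),
                             ("child", "want the ball")]))
    ```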

  9. Quantifying anisotropy and fiber orientation in human brain histological sections

    Directory of Open Access Journals (Sweden)

    Matthew D Budde

    2013-02-01

Diffusion weighted imaging (DWI) has provided unparalleled insight into the microscopic structure and organization of the central nervous system. Diffusion tensor imaging (DTI) and other models of the diffusion MRI signal extract microstructural properties of tissues with relevance to the normal and injured brain. Despite the prevalence of such techniques and applications, accurate and large-scale validation has proven difficult, particularly in the human brain. In this report, human brain sections obtained from a digital public brain bank were employed to quantify anisotropy and fiber orientation using structure tensor analysis. The derived maps depict the intricate complexity of white matter fibers at a resolution not attainable with current DWI experiments. Moreover, the effects of multiple fiber bundles (i.e. crossing fibers) and intravoxel fiber dispersion were demonstrated. Examination of the cortex and hippocampal regions validated specific features of previous in vivo and ex vivo DTI studies of the human brain. Despite the limitation to two dimensions, the resulting images provide a unique depiction of white matter organization at resolutions currently unattainable with DWI. The method of analysis may be used to validate tissue properties derived from DTI and alternative models of the diffusion signal.
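
    Structure tensor analysis itself is compact enough to sketch: smooth the products of image gradients, then read anisotropy off the eigenvalue spread and orientation off the double-angle formula. The following is a minimal, generic implementation, not the authors' pipeline.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, sobel

    def structure_tensor_maps(img, sigma=2.0):
        """Pixelwise anisotropy and orientation via structure tensor analysis.

        J = smooth([Ix*Ix, Ix*Iy; Ix*Iy, Iy*Iy]) is built from image gradients;
        the eigenvalue spread gives an anisotropy index in [0, 1], and the
        double-angle formula gives the dominant gradient orientation
        (fibers run perpendicular to it).
        """
        img = img.astype(float)
        ix, iy = sobel(img, axis=1), sobel(img, axis=0)
        j11 = gaussian_filter(ix * ix, sigma)
        j12 = gaussian_filter(ix * iy, sigma)
        j22 = gaussian_filter(iy * iy, sigma)
        root = np.sqrt(0.25 * (j11 - j22)**2 + j12**2)
        lam1, lam2 = 0.5 * (j11 + j22) + root, 0.5 * (j11 + j22) - root
        anisotropy = (lam1 - lam2) / (lam1 + lam2 + 1e-12)
        orientation = 0.5 * np.arctan2(2 * j12, j11 - j22)   # radians
        return anisotropy, orientation

    # Demo on a synthetic striped "fiber" image:
    yy, xx = np.mgrid[0:64, 0:64]
    ai, theta = structure_tensor_maps(np.sin(0.5 * (xx + yy)))
    print(round(float(ai.mean()), 2), round(float(np.degrees(theta[32, 32])), 1))
    ```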

  10. Quantifying Neonatal Sucking Performance: Promise of New Methods.

    Science.gov (United States)

    Capilouto, Gilson J; Cunningham, Tommy J; Mullineaux, David R; Tamilia, Eleonora; Papadelis, Christos; Giannone, Peter J

    2017-04-01

Neonatal feeding has been traditionally understudied, so guidelines and evidence-based support for common feeding practices are limited. A major contributing factor to the paucity of evidence-based practice in this area has been the lack of simple-to-use, low-cost tools for monitoring sucking performance. We describe new methods for quantifying neonatal sucking performance that hold significant clinical and research promise. We present early results from an ongoing study investigating neonatal sucking as a marker of risk for adverse neurodevelopmental outcomes. We include quantitative measures of sucking performance to better understand how movement variability evolves during skill acquisition. Results showed the coefficient of variation of suck duration was significantly different between preterm neonates at high risk for developmental concerns (HRPT) and preterm neonates at low risk for developmental concerns (LRPT). For HRPT, results indicated the coefficient of variation of suck smoothness increased from initial feeding to discharge and remained significantly greater than that of healthy full-term newborns (FT) at discharge. There was no significant difference in our measures between FT and LRPT at discharge. Our findings highlight the need to include neonatal sucking assessment as part of routine clinical care in order to capture the relative risk of adverse neurodevelopmental outcomes at discharge.

  11. A National Approach to Quantify and Map Biodiversity ...

    Science.gov (United States)

Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human welfare. The degradation of natural ecosystems and climate variation impact the environment and society by affecting ecological integrity and ecosystems' capacity to provide critical services (i.e., the contributions of ecosystems to human well-being). These challenges will require complex management decisions that can often involve significant trade-offs between societal desires and environmental needs. Evaluating trade-offs in terms of ecosystem services and human well-being provides an intuitive and comprehensive way to assess the broad implications of our decisions and to help shape policies that enhance environmental and social sustainability. In answer to this challenge, the U.S. government has created a partnership among the U.S. Environmental Protection Agency, other Federal agencies, academic institutions, and non-governmental organizations to develop the EnviroAtlas, an online decision support tool for users such as planners, policy-makers, resource managers, NGOs, and private industry.

  12. Quantify uncertain emergency search techniques (QUEST) -- Theory and user's guide

    International Nuclear Information System (INIS)

    Johnson, M.M.; Goldsby, M.E.; Plantenga, T.D.; Porter, T.L.; West, T.H.; Wilcox, W.B.; Hensley, W.K.

    1998-01-01

As recent world events show, criminal and terrorist access to nuclear materials is a growing national concern. The national laboratories are taking the lead in developing technologies to counter these potential threats to national security. Sandia National Laboratories, with support from Pacific Northwest National Laboratory and the Bechtel Nevada Remote Sensing Laboratory, has developed QUEST (a model to Quantify Uncertain Emergency Search Techniques) to enhance the performance of organizations in the search for lost or stolen nuclear material. In addition, QUEST supports a wide range of other applications, such as environmental monitoring, nuclear facilities inspections, and searcher training. QUEST simulates the search for nuclear materials and calculates detector response for various source types and locations. The probability of detecting a radioactive source during a search is a function of many different variables, including source type, search location and structure geometry (including shielding), search dynamics (path and speed), and detector type and size. Through calculation of dynamic detector response, QUEST makes possible quantitative comparisons of various sensor technologies and search patterns. The QUEST model can be used as a tool to examine the impact of new detector technologies, explore alternative search concepts, and provide interactive search/inspector training.

  13. Quantifying the limitations of small animal positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Oxley, D.C. [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom)], E-mail: dco@ns.ph.liv.ac.uk; Boston, A.J.; Boston, H.C.; Cooper, R.J.; Cresswell, J.R.; Grint, A.N.; Nolan, P.J.; Scraggs, D.P. [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Lazarus, I.H. [STFC Daresbury Laboratory, Warrington, WA4 4AD Cheshire (United Kingdom); Beveridge, T.E. [School of Materials and Engineering, Monash University, Melbourne (Australia)

    2009-06-01

The application of position sensitive semiconductor detectors in medical imaging is a field of global research interest. The Monte-Carlo simulation toolkit GEANT4 (http://geant4.web.cern.ch/geant4/) was employed to improve the understanding of detailed γ-ray interactions within the small animal Positron Emission Tomography (PET), high-purity germanium (HPGe) imaging system SmartPET [A.J. Boston, et al., Oral contribution, ANL, Chicago, USA, 2006]. This system has shown promising results in the fields of PET [R.J. Cooper, et al., Nucl. Instr. and Meth. A (2009), accepted for publication] and Compton camera imaging [J.E. Gillam, et al., Nucl. Instr. and Meth. A 579 (2007) 76]. Images for a selection of single and multiple point, line and phantom sources were successfully reconstructed using both a filtered-back-projection (FBP) and an iterative reconstruction algorithm [A.R. Mather, Ph.D. Thesis, University of Liverpool, 2007]. Simulated data were exploited as an alternative route to a reconstructed image, allowing full quantification of the image distortions introduced in each phase of the data processing. Quantifying the contribution of uncertainty in all system components, from detector to reconstruction algorithm, allows the areas most in need of attention in the SmartPET project and in semiconductor PET to be addressed.

  14. Quantifying environmental limiting factors on tree cover using geospatial data.

    Science.gov (United States)

    Greenberg, Jonathan A; Santos, Maria J; Dobrowski, Solomon Z; Vanderbilt, Vern C; Ustin, Susan L

    2015-01-01

Environmental limiting factors (ELFs) are the thresholds that determine the maximum or minimum biological response for a given suite of environmental conditions. We asked the following questions: 1) Can we detect ELFs on percent tree cover across the eastern slopes of the Lake Tahoe Basin, NV? 2) How are the ELFs distributed spatially? 3) To what extent are unmeasured environmental factors limiting tree cover? ELFs are difficult to quantify as they require significant sample sizes. We addressed this by using geospatial data over a relatively large spatial extent, where the wall-to-wall sampling ensures the inclusion of rare data points which define the minimum or maximum response to environmental factors. We tested mean temperature, minimum temperature, potential evapotranspiration (PET) and PET minus precipitation (PET-P) as potential limiting factors on percent tree cover. We found that the study area showed system-wide limitations on tree cover, and each of the factors showed evidence of being limiting on tree cover. However, only 1.2% of the total area appeared to be limited by the four environmental factors, suggesting other unmeasured factors are limiting much of the tree cover in the study area. Where sites were near their theoretical maximum, non-forest sites (those with low tree cover) appeared limited by evaporative demand, and closed-canopy forests were not limited by any particular environmental factor. The detection of ELFs is necessary in order to fully understand the width of limitations that species experience within their geographic range.
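
    One generic way to estimate such limiting envelopes is to bin each environmental factor and trace an upper percentile of the response in each bin; a sketch of that idea follows (the binning and percentile choices are illustrative assumptions, not the paper's exact procedure).

    ```python
    import numpy as np

    def limiting_curve(factor, cover, n_bins=50, q=99):
        """Empirical limiting envelope of a response against one factor.

        Bins one environmental factor (e.g. PET - P) and takes an upper
        percentile of percent tree cover in each bin; the envelope
        approximates the maximum response the factor permits, and the gap
        between a pixel's cover and the envelope hints at limitation by
        other, unmeasured factors.
        """
        edges = np.linspace(factor.min(), factor.max(), n_bins + 1)
        centers = 0.5 * (edges[:-1] + edges[1:])
        envelope = np.full(n_bins, np.nan)
        which = np.digitize(factor, edges[1:-1])
        for b in range(n_bins):
            vals = cover[which == b]
            if vals.size > 20:          # need enough pixels for a stable tail
                envelope[b] = np.percentile(vals, q)
        return centers, envelope

    # Demo on synthetic pixels whose cover is capped by a demand-like factor:
    rng = np.random.default_rng(3)
    pet_p = rng.uniform(0, 1000, 50_000)
    cover = rng.uniform(0, 80 * np.exp(-pet_p / 400))
    x, env = limiting_curve(pet_p, cover)
    print(np.round(env[:5], 1))
    ```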

  15. Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?

    Science.gov (United States)

    Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.

    2012-01-01

    The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can lead to resistivity-estimated tracer mass, velocity, and dispersivity that may be inaccurate; (2) mass transfer leads to an overestimate in the mobile tracer mass and an underestimate in velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.
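
    The quantities being compared are standard temporal moments of a breakthrough curve; a minimal computation is shown below, with a synthetic Gaussian breakthrough standing in for measured concentration histories.

    ```python
    import numpy as np

    def temporal_moments(t, c):
        """Zeroth, first and second central temporal moments of a
        breakthrough curve c(t): tracer mass (area under the curve),
        mean arrival time, and temporal variance."""
        m0 = np.trapz(c, t)                        # mass proxy
        t_mean = np.trapz(t * c, t) / m0           # mean arrival time
        var = np.trapz((t - t_mean)**2 * c, t) / m0
        return m0, t_mean, var

    t = np.linspace(0, 10, 500)
    c = np.exp(-0.5 * ((t - 4.0) / 1.2)**2)        # synthetic breakthrough curve
    print(temporal_moments(t, c))                  # ~ (3.0, 4.0, 1.44)
    ```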

  16. Quantifying How Observations Inform a Numerical Reanalysis of Hawaii

    Science.gov (United States)

    Powell, B. S.

    2017-11-01

    When assimilating observations into a model via state-estimation, it is possible to quantify how each observation changes the modeled estimate of a chosen oceanic metric. Using an existing 2 year reanalysis of Hawaii that includes more than 31 million observations from satellites, ships, SeaGliders, and autonomous floats, I assess which observations most improve the estimates of the transport and eddy kinetic energy. When the SeaGliders were in the water, they comprised less than 2.5% of the data, but accounted for 23% of the transport adjustment. Because the model physics constrains advanced state-estimation, the prescribed covariances are propagated in time to identify observation-model covariance. I find that observations that constrain the isopycnal tilt across the transport section provide the greatest impact in the analysis. In the case of eddy kinetic energy, observations that constrain the surface-driven upper ocean have more impact. This information can help to identify optimal sampling strategies to improve both state-estimates and forecasts.

  17. Challenges in quantifying biosphere-atmosphere exchange of nitrogen species

    Energy Technology Data Exchange (ETDEWEB)

Sutton, M.A. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom)], E-mail: ms@ceh.ac.uk; Nemitz, E. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom); Erisman, J.W. [ECN, Clean Fossil Fuels, PO Box 1, 1755 ZG Petten (Netherlands); Beier, C. [Risø National Laboratory, PO Box 49, DK-4000 Roskilde (Denmark); Butterbach-Bahl, K. [Institute of Meteorology and Climate Research, Atmos. Environ. Research (IMK-IFU), Research Centre Karlsruhe GmbH, Kreuzeckbahnstr. 19, 82467 Garmisch-Partenkirchen (Germany); Cellier, P. [INRA Unite Mixte de Recherche, 78850 Thiverval-Grignon (France); Vries, W. de [Alterra, Green World Research, PO Box 47, 6700 AA Wageningen (Netherlands); Cotrufo, F. [Dip. Scienze Ambientali, Seconda Universita degli Studi di Napoli, via Vivaldi 43, 81100 Caserta (Italy); Skiba, U.; Di Marco, C.; Jones, S. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom); Laville, P.; Soussana, J.F.; Loubet, B. [INRA Unite Mixte de Recherche, 78850 Thiverval-Grignon (France); Twigg, M.; Famulari, D. [Centre for Ecology and Hydrology (CEH), Edinburgh Research Station, Bush Estate, Penicuik, EH26 0QB (United Kingdom); Whitehead, J.; Gallagher, M.W. [School of Earth, Atmospheric and Environmental Sciences, University of Manchester, Williamson Building, Oxford Road, Manchester, M13 9PL (United Kingdom); Neftel, A.; Flechard, C.R. [Agroscope FAL Reckenholz, Federal Research Station for Agroecology and Agriculture, PO Box, CH 8046 Zurich (Switzerland)] (and others)

    2007-11-15

Recent research in nitrogen exchange with the atmosphere has separated research communities according to N form. The integrated perspective needed to quantify the net effect of N on the greenhouse-gas balance is being addressed by the NitroEurope Integrated Project (NEU). Recent advances have depended on improved methodologies, while ongoing challenges include gas-aerosol interactions, organic nitrogen and N₂ fluxes. The NEU strategy applies a 3-tier Flux Network together with a Manipulation Network of global-change experiments, linked by common protocols to facilitate model application. Substantial progress has been made in modelling N fluxes, especially for N₂O, NO and bi-directional NH₃ exchange. Landscape analysis represents an emerging challenge to address the spatial interactions between farms, fields, ecosystems, catchments and air dispersion/deposition. European up-scaling of N fluxes is highly uncertain and a key priority is better data on agricultural practices. Finally, attention is needed to develop N flux verification procedures to assess compliance with international protocols. Current N research is separated by form; the challenge is to link N components, scales and issues.

  18. A practical method for quantifying atherosclerotic lesions in rabbits.

    Science.gov (United States)

    Zhang, C; Zheng, H; Yu, Q; Yang, P; Li, Y; Cheng, F; Fan, J; Liu, E

    2010-01-01

    The rabbit has been widely used for the study of human atherosclerosis; however, the method for analysis of the atherosclerotic lesions has not been standardized between laboratories. The present study reports a practical method for quantifying the changes that occur in aortic atherosclerosis of rabbits. Male Japanese white rabbits were fed with either a standard chow or a diet containing 10% fat and 0.3% cholesterol for 16 weeks. Plasma concentrations of glucose, insulin, total cholesterol, triglycerides and high-density lipoprotein were measured. Aortic atherosclerotic lesions were assessed in quantitative fashion using an image analysis system that measured (1) the gross area of the entire aorta affected by atherosclerosis as defined by Sudan IV staining, (2) the microscopical intimal lesion defined by the elastic van Gieson stain and (3) the infiltration of macrophages and smooth muscle cell proliferation as determined immunohistochemically. The rabbits developed severe aortic atherosclerosis without apparent abnormality of glucose metabolism. The quantitative method described here will be useful for the further investigation of atherosclerosis in rabbits. Copyright 2009 Elsevier Ltd. All rights reserved.

  19. Quantifying heterogeneity in human tumours using MRI and PET.

    Science.gov (United States)

    Asselin, Marie-Claude; O'Connor, James P B; Boellaard, Ronald; Thacker, Neil A; Jackson, Alan

    2012-03-01

    Most tumours, even those of the same histological type and grade, demonstrate considerable biological heterogeneity. Variations in genomic subtype, growth factor expression and local microenvironmental factors can result in regional variations within individual tumours. For example, localised variations in tumour cell proliferation, cell death, metabolic activity and vascular structure will be accompanied by variations in oxygenation status, pH and drug delivery that may directly affect therapeutic response. Documenting and quantifying regional heterogeneity within the tumour requires histological or imaging techniques. There is increasing evidence that quantitative imaging biomarkers can be used in vivo to provide important, reproducible and repeatable estimates of tumoural heterogeneity. In this article we review the imaging methods available to provide appropriate biomarkers of tumour structure and function. We also discuss the significant technical issues involved in the quantitative estimation of heterogeneity and the range of descriptive metrics that can be derived. Finally, we have reviewed the existing clinical evidence that heterogeneity metrics provide additional useful information in drug discovery and development and in clinical practice. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Quantifying design trade-offs of beryllium targets on NIF

    Science.gov (United States)

    Yi, S. A.; Zylstra, A. B.; Kline, J. L.; Loomis, E. N.; Kyrala, G. A.; Shah, R. C.; Perry, T. S.; Kanzleiter, R. J.; Batha, S. H.; MacLaren, S. A.; Ralph, J. E.; Masse, L. P.; Salmonson, J. D.; Tipton, R. E.; Callahan, D. A.; Hurricane, O. A.

    2017-10-01

An important determinant of target performance is implosion kinetic energy, which scales with the capsule size. The maximum achievable performance for a given laser is thus related to the largest capsule that can be imploded symmetrically, constrained by drive uniformity. A limiting factor for symmetric radiation drive is the ratio of hohlraum to capsule radii, or case-to-capsule ratio (CCR). For a fixed laser energy, a larger hohlraum allows for driving bigger capsules symmetrically at the cost of reduced peak radiation temperature (Tr). Beryllium ablators may thus allow for unique target design trade-offs due to their higher ablation efficiency at lower Tr. By utilizing larger hohlraum sizes than most modern NIF designs, beryllium capsules thus have the potential to operate in unique regions of the target design parameter space. We present design simulations of beryllium targets with a large CCR of 4.3-3.7. These are scaled surrogates of large-hohlraum, low-Tr beryllium targets, with the goal of quantifying symmetry tunability as a function of CCR. This work was performed under the auspices of the U.S. DOE by LANL under contract DE-AC52-06NA25396, and by LLNL under Contract DE-AC52-07NA27344.

  1. Quantifying evolutionary dynamics from variant-frequency time series

    Science.gov (United States)

    Khatri, Bhavin S.

    2016-09-01

From Kimura's neutral theory of protein evolution to Hubbell's neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher's angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time series.
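
    The transformation itself is one line: x = arcsin(sqrt(f)). Its appeal is variance stabilization: under pure drift the one-generation variance of x is approximately 1/(4N) regardless of the starting frequency, which a short Wright-Fisher simulation confirms (an illustrative sketch, not the paper's estimator).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 500                                   # population size (haploid)

    def wright_fisher(f0, generations, replicates=10_000):
        """Neutral Wright-Fisher resampling (drift only)."""
        f = np.full(replicates, f0)
        for _ in range(generations):
            f = rng.binomial(N, f) / N
        return f

    for f0 in (0.1, 0.5, 0.9):
        f1 = wright_fisher(f0, 1)
        x0, x1 = np.arcsin(np.sqrt(f0)), np.arcsin(np.sqrt(f1))
        # After the angular transformation the one-generation variance is
        # ~1/(4N), independent of the starting frequency:
        print(f"f0={f0}: var(dx) * 4N = {np.var(x1 - x0) * 4 * N:.2f}")
    ```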

  2. Quantifying the Benefits of Combining Offshore Wind and Wave Energy

    Science.gov (United States)

    Stoutenburg, E.; Jacobson, M. Z.

    2009-12-01

For many locations the offshore wind resource and the wave energy resource are collocated, which suggests a natural synergy if both technologies are combined into one offshore marine renewable energy plant. Initial meteorological assessments of the western coast of the United States suggest only a weak correlation between the power levels of wind and wave energy at any given hour, a consequence of the large ocean-basin wave dynamics and storm systems of the North Pacific. This finding indicates that combining the two power sources could reduce the variability in electric power output from a combined wind and wave offshore plant. A combined plant is modeled with offshore wind turbines and Pelamis wave energy converters, with wind and wave data from meteorological buoys operated by the US National Data Buoy Center off the coasts of California, Oregon, and Washington. This study presents results quantifying the benefits of combining wind and wave energy for the electrical power system, to facilitate increased renewable energy penetration and to support reductions in greenhouse gas emissions and in the air and water pollution associated with conventional fossil fuel power plants.
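
    The variability benefit follows directly from the weak correlation: the standard deviation of an average of weakly correlated outputs is smaller than that of either resource alone. A toy demonstration with synthetic capacity factors (all numbers invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic hourly capacity factors for one year, weakly correlated (rho ~ 0.2):
    rho = 0.2
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=8760)
    wind = np.clip(0.4 + 0.25 * z[:, 0], 0, 1)
    wave = np.clip(0.3 + 0.20 * z[:, 1], 0, 1)

    combined = 0.5 * (wind + wave)
    for name, p in [("wind", wind), ("wave", wave), ("combined", combined)]:
        print(f"{name:>8}: mean = {p.mean():.2f}, std = {p.std():.2f}")
    # With weakly correlated resources the combined plant's std drops relative
    # to either resource alone -- the variability benefit quantified in the study.
    ```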

  3. Quantifying Nanoparticle Internalization Using a High Throughput Internalization Assay.

    Science.gov (United States)

    Mann, Sarah K; Czuba, Ewa; Selby, Laura I; Such, Georgina K; Johnston, Angus P R

    2016-10-01

The internalization of nanoparticles into cells is critical for effective nanoparticle-mediated drug delivery. To investigate the kinetics and mechanism of internalization of nanoparticles into cells, we have developed a DNA molecular sensor, termed the Specific Hybridization Internalization Probe (SHIP). Self-assembling polymeric 'pHlexi' nanoparticles were functionalized with a Fluorescent Internalization Probe (FIP) and their interactions with two different cell lines (3T3 and CEM cells) were studied. The kinetics of internalization were quantified, and chemical inhibitors of energy-dependent endocytosis (sodium azide), dynamin-dependent endocytosis (Dyngo-4a) and macropinocytosis (5-(N-ethyl-N-isopropyl)amiloride, EIPA) were used to study the mechanism of internalization. Nanoparticle internalization kinetics were significantly faster in 3T3 cells than in CEM cells. We have shown that ~90% of the nanoparticles associated with 3T3 cells were internalized, compared to only 20% of the nanoparticles associated with CEM cells. Nanoparticle uptake was via a dynamin-dependent pathway, and the nanoparticles were trafficked to lysosomal compartments once internalized. SHIP is able to distinguish nanoparticles associated with the outer cell membrane from nanoparticles that are internalized. This study demonstrates that the assay can be used to probe the kinetics of nanoparticle internalization and the mechanisms by which the nanoparticles are taken up by cells. This information is fundamental for engineering more effective nanoparticle delivery systems. The SHIP assay is a simple and high-throughput technique that could have wide application in therapeutic delivery research.

  4. Plasma as an alternative sample to quantify tetanus antitoxin

    Directory of Open Access Journals (Sweden)

    Ariel Menéndez-Barrios

    2015-08-01

Tetanus antitoxin is quantified at Cuban blood banks from the serum of immunized donors in order to produce specific human gamma globulin. A heterogeneous indirect enzyme immunoassay is used, with serum as the analytical sample. This study evaluated the possible use of plasma obtained from plasmapheresis as an alternative sample, to minimize the volume of whole blood drawn from donors. One hundred plasma donors who came to donate between October and November 2013 were selected by simple random sampling. The serum sample was obtained by drawing 5 mL of blood into a dry glass tube, while the plasma sample (1.5 mL, in a covered plastic tube) was taken directly from the collected plasma unit at the end of the donation. The difference between the means of the two groups was compared using SPSS for Windows. Values obtained in serum were higher than those obtained in plasma, and the difference between the group means was statistically significant (p = 0.00). It is therefore not advisable to use plasma obtained from plasmapheresis as the analytical sample in this assay.

  5. Parkinson's Law quantified: three investigations on bureaucratic inefficiency

    Science.gov (United States)

    Klimek, Peter; Hanel, Rudolf; Thurner, Stefan

    2009-03-01

We formulate three famous descriptive essays of Parkinson on bureaucratic inefficiency in a quantifiable and dynamical socio-physical framework. In the first model we show how recent opinion formation models for small groups can be used to understand Parkinson's observation that decision-making bodies such as cabinets or boards become highly inefficient once their size exceeds a critical 'Coefficient of Inefficiency', typically around 20. A second observation of Parkinson, sometimes referred to as Parkinson's Law, is that the growth of bureaucratic or administrative bodies usually goes hand in hand with a drastic decrease in overall efficiency. In our second model we view a bureaucratic body as a system with a flow of workers, who enter, become promoted to various internal levels within the system over time, and leave the system after having served for a certain time. Promotion is usually associated with an increase in subordinates. Within the proposed model it becomes possible to work out the phase diagram for the conditions under which bureaucratic growth can be confined. In our last model we assign individual efficiency curves to workers throughout their life in administration, and compute the optimum time to give them the old-age pension in order to ensure maximum efficiency within the body; in Parkinson's words, we compute the 'Pension Point'.

  6. A new approach to quantify safety benefits of disaster robots

    Directory of Open Access Journals (Sweden)

    Inn Seock Kim

    2017-10-01

Remote response technology has advanced to the extent that a robot system, if properly designed and deployed, may greatly help respond to beyond-design-basis accidents at nuclear power plants. Particularly in the aftermath of the Fukushima accident, there is increasing interest in developing disaster robots that can be deployed in lieu of a human operator to the field to perform mitigating actions in the harsh environment caused by extreme natural hazards. The nuclear robotics team of the Korea Atomic Energy Research Institute (KAERI) is also endeavoring to construct disaster robots and, first of all, is interested in finding out to what extent safety benefits can be achieved by such a disaster robotic system. This paper discusses a new approach based on the probabilistic risk assessment (PRA) technique, which can be used to quantify the safety benefits associated with disaster robots, along with a case study of a seismic-induced station blackout condition. The results indicate that to avoid core damage in this special case a robot system with reliability > 0.65 is needed, because otherwise core damage is inevitable. Therefore, considerable efforts are needed to improve the reliability of disaster robots, because without assurance of high reliability, remote response techniques will not be practically used.
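
    A stripped-down version of the event-tree arithmetic behind such a PRA argument might look as follows; all numbers are illustrative, not the paper's values, and a real PRA model would track mission time, recovery actions and common-cause failures.

    ```python
    # Minimal event-tree sketch: given a seismic-induced station blackout,
    # a disaster robot is deployed to perform the mitigating field action.
    p_sbo = 1e-4              # frequency of the seismic SBO scenario (per year)
    r_robot = 0.65            # robot system reliability (deploys and works)
    p_cd_no_robot = 1.0       # core damage assumed certain without mitigation

    cdf = p_sbo * (1 - r_robot) * p_cd_no_robot   # robot success averts damage
    print(f"Core damage frequency: {cdf:.2e}/yr")
    # Sweeping r_robot shows how strongly the result hinges on robot
    # reliability, which is why high-reliability robots matter in this case.
    ```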

  7. Quantifying capital goods for collection and transport of waste.

    Science.gov (United States)

    Brogaard, Line K; Christensen, Thomas H

    2012-12-01

The capital goods for the collection and transport of waste were quantified for different types of containers (plastic containers, cubes and steel containers) and an 18-tonne compacting collection truck. The data were collected from producers and vendors of the bins and the truck. The service lifetime and the capacity of the goods were also assessed. Environmental impact assessment of the production of the capital goods revealed that, per tonne of waste handled, the truck had the largest contribution, followed by the steel container. Large high-density polyethylene (HDPE) containers had the lowest impact per tonne of waste handled. The impact of producing the capital goods for waste collection and transport cannot be neglected, as the capital goods dominate (>85%) the categories human toxicity (non-cancer and cancer), ecotoxicity, resource depletion and aquatic eutrophication, but also play a role (>13%) within the other impact categories when compared with the impacts from combustion of fuels for the collection and transport of the waste, assuming a transport distance of 25 km.

  8. Quantifying the direct use value of Condor seamount

    Science.gov (United States)

    Ressurreição, Adriana; Giacomello, Eva

    2013-12-01

Seamounts often serve numerous uses and interests. Multiple uses can generate multiple benefits but also conflicts and impacts, calling, therefore, for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as a proxy for the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting the importance of these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision-making process towards its long-term conservation and sustainable use.

  9. A colorimetric method to quantify endo-polygalacturonase activity.

    Science.gov (United States)

    Torres, Sebastián; Sayago, Jorge E; Ordoñez, Roxana M; Isla, María Inés

    2011-02-08

We report a new colorimetric assay to quantify the activity of endo-polygalacturonase, which hydrolyzes polygalacturonic acid to produce smaller chains of galacturonate. Some reported polygalacturonase assays measure activity by detecting the appearance of reducing ends, such as the Somogyi-Nelson method. Because it responds to reducing groups in general, the Somogyi-Nelson method is not appropriate for studying polygalacturonase and polygalacturonase inhibitors in plant crude extracts, which often have a strong reducing power. Ruthenium Red is an inorganic dye that binds polygalacturonic acid and causes its precipitation. In the presence of polygalacturonase, polygalacturonic acid is hydrolyzed, bringing about a corresponding gain in soluble Ruthenium Red. The described assay utilizes Ruthenium Red as the detection reagent; it has been used previously in plate-based assays but not in liquid-medium reactions. The new method measures the disappearance of the substrate polygalacturonic acid and is compared to the Somogyi-Nelson assay. Experimental results using lemon peel, fern frond and castor leaf crude extracts demonstrate that the new method provides a way to quickly screen for polygalacturonase activity and polygalacturonase inhibitors in plant crude extracts with a strong reducing power. On the other hand, the Ruthenium Red assay does not register the activity of an exo-polygalacturonase as an initial velocity and thus allows endo- and exo-polygalacturonase activities to be distinguished. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. Nudging the Arctic Ocean to quantify Arctic sea ice feedbacks

    Science.gov (United States)

    Dekker, Evelien; Severijns, Camiel; Bintanja, Richard

    2017-04-01

    It is well established that the Arctic is warming 2 to 3 times faster than the rest of the planet. One of the great uncertainties in climate research is to what extent sea ice feedbacks amplify this (seasonally varying) Arctic warming. Earlier studies have analyzed existing climate model output using correlations and energy budget considerations in order to quantify sea ice feedbacks through indirect methods. From these analyses it is regularly inferred that sea ice likely plays an important role, but details remain obscure. Here we will take a different, more direct approach: we will keep the sea ice constant in a sensitivity simulation, using a state-of-the-art climate model (EC-Earth) and applying a technique that has never been attempted before. This experimental technique involves nudging the temperature and salinity of the ocean surface (and possibly some layers below, to maintain the vertical structure and mixing) to a predefined prescribed state. When strongly nudged to existing (seasonally varying) sea surface temperatures, ocean salinity and temperature, we force the sea ice to remain in the regions/seasons where it is located in the prescribed state, despite the changing climate. Once we obtain 'fixed' sea ice, we will run a future scenario, for instance 2 x CO2, with and without prescribed sea ice; the difference between these runs provides a measure of the extent to which sea ice contributes to Arctic warming, including the seasonal and geographical imprint of the effects.
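
    The core of the technique is Newtonian relaxation: at each time step the modeled surface fields are pulled toward the prescribed state with some relaxation timescale. The sketch below is a minimal stand-alone illustration of that update rule; the grid, time step and timescale are assumptions, and EC-Earth's actual nudging is implemented inside the ocean component rather than like this.

    ```python
    # Minimal sketch of Newtonian relaxation ("nudging") of sea surface
    # temperature toward a prescribed climatology. A small relaxation
    # timescale tau gives strong nudging (field pinned near the target).
    import numpy as np

    def nudge(field, target, dt_seconds, tau_seconds):
        """One nudging step: relax `field` toward `target` with timescale tau."""
        return field + (target - field) * dt_seconds / tau_seconds

    sst = np.full((10, 10), 2.0)          # model SST (deg C), illustrative grid
    sst_clim = np.full((10, 10), 0.5)     # prescribed seasonal climatology
    for _ in range(48):                   # two days of hourly steps
        sst = nudge(sst, sst_clim, dt_seconds=3600.0, tau_seconds=86400.0)
    print(sst[0, 0])                      # decays toward 0.5 as nudging acts
    ```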

  11. Quantifying the Extremity of Windstorms for Regions Featuring Infrequent Events

    Science.gov (United States)

    Walz, M. A.; Leckebusch, G. C.; Kruschke, T.; Rust, H.; Ulbrich, U.

    2017-12-01

    This paper introduces the Distribution-Independent Storm Severity Index (DI-SSI). The DI-SSI quantifies the severity of exceptional surface wind speeds in large-scale windstorms and is complementary to the Storm Severity Index (SSI) introduced by Leckebusch et al. (2008). While the SSI approaches the extremeness of a storm from a meteorological and potential-loss (impact) perspective, the DI-SSI defines severity from a more climatological perspective. The idea is to assign equal index values to wind speeds of the same local rarity (e.g., the 99th percentile), taking into account the shape of the tail of the local wind speed climatology. Especially in regions at the edge of the classical storm track, the DI-SSI yields more equitable severity estimates, e.g., for the extratropical cyclone Klaus. Here we compare the integral severity indices for several prominent windstorms in the European domain and discuss the advantages and disadvantages of each index. To compare the indices, we study their relation to the North Atlantic Oscillation (NAO), one of the main large-scale drivers of the intensity of European windstorms. Additionally, we identify a significant relationship between the frequency and intensity of windstorms for large parts of the European domain.
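
    For orientation, the sketch below computes a severity index in the spirit of the SSI of Leckebusch et al. (2008): cubic relative exceedance of a local high percentile, summed over grid cells and time steps. The percentile choice, the absence of area weighting and the synthetic data are assumptions; in practice the threshold comes from a multi-year climatology rather than the event itself.

    ```python
    # Hedged sketch of a storm severity index in the spirit of the SSI:
    # cubic relative exceedance of a local high percentile, summed over
    # grid cells and time steps.
    import numpy as np

    def storm_severity_index(wind, pct=98.0):
        """wind: array (time, ny, nx) of wind speeds for one event/region."""
        v_thresh = np.percentile(wind, pct, axis=0)       # local threshold field
        excess = np.maximum(wind / v_thresh - 1.0, 0.0)   # relative exceedance
        return float(np.sum(excess ** 3))                 # cubic weighting

    rng = np.random.default_rng(0)
    wind = rng.gamma(shape=4.0, scale=3.0, size=(240, 20, 30))  # synthetic winds
    print(storm_severity_index(wind))
    ```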

  12. Quantifying social influence in an online cultural market.

    Science.gov (United States)

    Krumme, Coco; Cebrian, Manuel; Pickard, Galen; Pentland, Sandy

    2012-01-01

    We revisit experimental data from an online cultural market in which 14,000 users interact to download songs, and develop a simple model that can explain seemingly complex outcomes. Our results suggest that individual behavior is characterized by a two-step process--the decision to sample and the decision to download a song. Contrary to conventional wisdom, social influence is material to the first step only. The model also identifies the role of placement in mediating social signals, and suggests that in this market with anonymous feedback cues, social influence serves an informational rather than normative role.
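
    A toy simulation of this two-step account, in which displayed popularity affects only the decision to sample while the download decision depends on the song alone; all parameter values are illustrative, not fitted to the experimental data.

    ```python
    # Toy simulation of the two-step account: displayed popularity affects
    # only the decision to sample a song; the download decision depends on
    # the song's fixed appeal alone. Parameters are illustrative, not fitted.
    import random

    def simulate(n_users=14000, n_songs=48, influence=0.5, seed=1):
        random.seed(seed)
        appeal = [random.random() for _ in range(n_songs)]  # fixed song quality
        downloads = [0] * n_songs
        for _ in range(n_users):
            for s in range(n_songs):
                # step 1: sampling probability rises with displayed popularity
                popularity = downloads[s] / (1 + max(downloads))
                if random.random() < 0.1 + influence * popularity:
                    # step 2: download depends only on the song itself
                    if random.random() < appeal[s]:
                        downloads[s] += 1
        return downloads

    print(sorted(simulate(), reverse=True)[:5])  # download counts are right-skewed
    ```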

  13. Quantifying social influence in an online cultural market.

    Directory of Open Access Journals (Sweden)

    Coco Krumme

    Full Text Available We revisit experimental data from an online cultural market in which 14,000 users interact to download songs, and develop a simple model that can explain seemingly complex outcomes. Our results suggest that individual behavior is characterized by a two-step process--the decision to sample and the decision to download a song. Contrary to conventional wisdom, social influence is material to the first step only. The model also identifies the role of placement in mediating social signals, and suggests that in this market with anonymous feedback cues, social influence serves an informational rather than normative role.

  14. Behavioral Economics

    OpenAIRE

    Sendhil Mullainathan; Richard H. Thaler

    2000-01-01

    Behavioral Economics is the combination of psychology and economics that investigates what happens in markets in which some of the agents display human limitations and complications. We begin with a preliminary question about relevance. Does some combination of market forces, learning and evolution render these human qualities irrelevant? No. Because of limits of arbitrage less than perfect agents survive and influence market outcomes. We then discuss three important ways in which humans devi...

  15. CONSUMER BEHAVIOR

    OpenAIRE

    Ilie BUDICA; Silvia PUIU; Bogdan Andrei BUDICA

    2010-01-01

    The study of consumers helps firms and organizations improve their marketing strategies by understanding issues such as: the psychology of how consumers think, feel, reason, and select between different alternatives; the psychology of how the consumer is influenced by his or her environment; the behavior of consumers while shopping or making other marketing decisions; limitations in consumer knowledge or information processing abilities influence decisions and marke...

  16. OPEC behavior

    Science.gov (United States)

    Yang, Bo

    This thesis aims to contribute to a further understanding of the real dynamics of OPEC production behavior and its impacts on the world oil market. A literature review in this area shows that the existing studies on OPEC still have some major deficiencies in theoretical interpretation and empirical estimation technique. After a brief background review in chapter 1, chapter 2 tests Griffin's market-sharing cartel model on the post-Griffin time horizon with a simultaneous system of equations, and an innovative hypothesis of OPEC's behavior (Saudi Arabia in particular) is then proposed based on the estimation results. Chapter 3 first provides a conceptual analysis of OPEC behavior under the framework of non-cooperative collusion with imperfect information. An empirical model is then constructed and estimated. The results of the empirical studies in this thesis strongly support the hypothesis that OPEC has operated as a market-sharing cartel since the early 1980s. In addition, the results also provide some support of the theory of non-cooperative collusion under imperfect information. OPEC members collude under normal circumstances and behave competitively at times in response to imperfect market signals of cartel compliance and some internal attributes. Periodic joint competition conduct plays an important role in sustaining the collusion in the long run. Saudi Arabia acts as the leader of the cartel, accommodating intermediate unfavorable market development and punishing others with a tit-for-tat strategy in extreme circumstances.

  17. Behavioral epigenetics.

    Science.gov (United States)

    Moore, David S

    2017-01-01

    Why do we grow up to have the traits we do? Most 20th century scientists answered this question by referring only to our genes and our environments. But recent discoveries in the emerging field of behavioral epigenetics have revealed factors at the interface between genes and environments that also play crucial roles in development. These factors affect how genes work; scientists now know that what matters as much as which genes you have (and what environments you encounter) is how your genes are affected by their contexts. The discovery that what our genes do depends in part on our experiences has shed light on how Nature and Nurture interact at the molecular level inside of our bodies. Data emerging from the world's behavioral epigenetics laboratories support the idea that a person's genes alone cannot determine if, for example, he or she will end up shy, suffering from cardiovascular disease, or extremely smart. Among the environmental factors that can influence genetic activity are parenting styles, diets, and social statuses. In addition to influencing how doctors treat diseases, discoveries about behavioral epigenetics are likely to alter how biologists think about evolution, because some epigenetic effects of experience appear to be transmissible from generation to generation. This domain of research will likely change how we think about the origins of human nature. WIREs Syst Biol Med 2017, 9:e1333. doi: 10.1002/wsbm.1333 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  18. Cordilleran forest scaling dynamics and disturbance regimes quantified by aerial lidar

    Science.gov (United States)

    Swetnam, Tyson L.

    Semi-arid forests are in a period of rapid transition as a result of unprecedented landscape scale fires, insect outbreaks, drought, and anthropogenic land use practices. Understanding how historically episodic disturbances led to coherent forest structural and spatial patterns that promoted resilience and resistance is a critical part of addressing change. Here my coauthors and I apply metabolic scaling theory (MST) to examine scaling behavior and structural patterns of semi-arid conifer forests in Arizona and New Mexico. As an extension of MST, we conceptualize a linkage to mechanistic drivers of forest assembly that incorporates the effects of low-intensity disturbance and of physiologic and resource limitations. We use both aerial LiDAR data and field observations to quantify changes in forest structure from the sub-meter to landscape scales. We found: (1) semi-arid forest structure exhibits MST-predicted behaviors regardless of disturbance and that MST can help to quantitatively measure the level of disturbance intensity in a forest, (2) the application of a power law to a forest overstory frequency distribution can help predict understory presence/absence, (3) local indicators of spatial association can help to define first-order effects (e.g. topographic changes) and map where recent disturbances (e.g. logging and fire) have altered forest structure. Lastly, we produced a comprehensive set of above-ground biomass and carbon models for five distinct forest types and ten common species of the southwestern US that are meant for use in aerial LiDAR forest inventory projects. This dissertation presents both a conceptual framework and applications for investigating local scales (stands of trees) up to entire ecosystems for diagnosis of current carbon balances, levels of departure from historical norms, and ecological stability. These tools and models will become more important as we prepare our ecosystems for a future characterized by increased climatic variability.
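
    As an illustration of point (2), the sketch below fits a power-law exponent to an overstory size distribution with the standard maximum-likelihood estimator of Clauset et al. (2009); the size cutoff and the synthetic data are assumptions, not the dissertation's models.

    ```python
    # Hedged sketch: maximum-likelihood power-law exponent for an overstory
    # size distribution (e.g., lidar-derived tree heights or crown widths).
    import numpy as np

    def powerlaw_alpha(x, x_min):
        """MLE exponent alpha for P(x) ~ x^(-alpha), using x >= x_min."""
        x = np.asarray(x, dtype=float)
        x = x[x >= x_min]
        return 1.0 + x.size / np.sum(np.log(x / x_min))

    rng = np.random.default_rng(42)
    # synthetic Pareto(alpha = 2.5) "tree sizes" above a 5 m cutoff
    sizes = 5.0 * (1.0 - rng.random(2000)) ** (-1.0 / 1.5)
    print(powerlaw_alpha(sizes, x_min=5.0))  # recovers ~2.5
    ```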

  19. Quantifying Infra-slow Dynamics of Spectral Power and Heart Rate in Sleeping Mice.

    Science.gov (United States)

    Fernandez, Laura M J; Lecci, Sandro; Cardis, Romain; Vantomme, Gil; Béard, Elidie; Lüthi, Anita

    2017-08-02

    Three vigilance states dominate mammalian life: wakefulness, non-rapid eye movement (non-REM) sleep, and REM sleep. As more neural correlates of behavior are identified in freely moving animals, this three-fold subdivision becomes too simplistic. During wakefulness, ensembles of global and local cortical activities, together with peripheral parameters such as pupillary diameter and sympathovagal balance, define various degrees of arousal. It remains unclear to what extent sleep also forms a continuum of brain states (within which the degree of resilience to sensory stimuli and arousability, and perhaps other sleep functions, vary gradually) and how peripheral physiological states co-vary. Research that advances methods for monitoring multiple parameters during sleep, and that attributes functional roles to constellations of these parameters, is central to refining our understanding of sleep as a multifunctional process during which many beneficial effects must be executed. Identifying novel parameters characterizing sleep states will open opportunities for novel diagnostic avenues in sleep disorders. We present a procedure to describe dynamic variations of mouse non-REM sleep states via the combined monitoring and analysis of electroencephalogram (EEG)/electrocorticogram (ECoG), electromyogram (EMG), and electrocardiogram (ECG) signals using standard polysomnographic recording techniques. Using this approach, we found that mouse non-REM sleep is organized into cycles of coordinated neural and cardiac oscillations that generate successive 25-s intervals of high and low fragility to external stimuli. Therefore, central and autonomic nervous systems are coordinated to form behaviorally distinct sleep states during consolidated non-REM sleep. We present surgical manipulations for polysomnographic (i.e., EEG/EMG combined with ECG) monitoring to track these cycles in the freely sleeping mouse, the analysis to quantify their dynamics, and the acoustic stimulation protocols to
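
    A hedged sketch of the kind of analysis described: band-pass the EEG into the sigma band, form the power time course, and look for an infra-slow spectral peak near 0.02 Hz (one cycle per ~50 s, i.e., alternating ~25-s high- and low-fragility intervals). The sampling rate, filter settings and synthetic signal are assumptions, not the authors' pipeline.

    ```python
    # Illustrative sketch: sigma-band (10-15 Hz) power envelope of the EEG,
    # then a Welch spectrum of that envelope to reveal an infra-slow peak.
    import numpy as np
    from scipy.signal import butter, filtfilt, welch

    fs = 200.0                                   # EEG sampling rate (Hz)
    t = np.arange(0, 600, 1 / fs)                # 10 min of "non-REM sleep"
    # synthetic EEG: 12 Hz sigma activity amplitude-modulated at 0.02 Hz
    eeg = (1 + np.sin(2 * np.pi * 0.02 * t)) * np.sin(2 * np.pi * 12 * t)
    eeg += 0.5 * np.random.default_rng(0).standard_normal(t.size)

    b, a = butter(4, [10 / (fs / 2), 15 / (fs / 2)], btype="band")
    sigma_power = filtfilt(b, a, eeg) ** 2       # sigma-band power time course
    f, pxx = welch(sigma_power, fs=fs, nperseg=int(200 * fs))
    band = (f > 0.005) & (f < 0.1)               # infra-slow frequency range
    print(f[band][np.argmax(pxx[band])])         # expect a peak near 0.02 Hz
    ```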

  20. Quantifying motivation with effort-based decision-making paradigms in health and disease.

    Science.gov (United States)

    Chong, T T-J; Bonnelle, V; Husain, M

    2016-01-01

    Motivation can be characterized as a series of cost-benefit valuations, in which we weigh the amount of effort we are willing to expend (the cost of an action) in return for particular rewards (its benefits). Human motivation has traditionally been measured with self-report and questionnaire-based tools, but an inherent limitation of these methods is that they are unable to provide a mechanistic explanation of the processes underlying motivated behavior. A major goal of current research is to quantify motivation objectively with effort-based decision-making paradigms, by drawing on a rich literature from nonhuman animals. Here, we review this approach by considering the development of these paradigms in the laboratory setting over the last three decades, and their more recent translation to understanding choice behavior in humans. A strength of this effort-based approach to motivation is that it is capable of capturing the wide range of individual differences, and offers the potential to dissect motivation into its component elements, thus providing the basis for more accurate taxonomic classifications. Clinically, modeling approaches might provide greater sensitivity and specificity to diagnosing disorders of motivation, for example, in being able to detect subclinical disorders of motivation, or distinguish a disorder of motivation from related but separate syndromes, such as depression. Despite the great potential in applying effort-based paradigms to index human motivation, we discuss several caveats to interpreting current and future studies, and the challenges in translating these approaches to the clinical setting. © 2016 Elsevier B.V. All rights reserved.
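
    One common way such cost-benefit valuations are modeled in this literature is with a parametric effort-discounting function combined with a softmax choice rule. The quadratic cost and the parameter values below are illustrative modeling choices, not a prescription from the review.

    ```python
    # Hedged sketch of a standard effort-discounting analysis: subjective
    # value SV = reward - k * effort^2, with acceptance of an effortful
    # offer over a baseline modeled by a softmax (logistic) choice rule.
    import numpy as np

    def subjective_value(reward, effort, k):
        return reward - k * effort ** 2          # quadratic effort cost

    def p_accept(reward, effort, k, baseline_sv=0.0, beta=1.0):
        """Softmax probability of choosing the effortful offer."""
        sv = subjective_value(reward, effort, k)
        return 1.0 / (1.0 + np.exp(-beta * (sv - baseline_sv)))

    # a more motivated agent (smaller k) accepts high-effort offers more often
    for k in (0.5, 2.0):
        print(k, p_accept(reward=10.0, effort=3.0, k=k))
    ```

    Fitting k and beta to a participant's accept/reject choices then yields the individual-difference measures the abstract refers to, with k indexing effort sensitivity and beta choice stochasticity.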