WorldWideScience

Sample records for quantifying history-dependent behavior

  1. Memory in microbes: quantifying history-dependent behavior in a bacterium.

    Directory of Open Access Journals (Sweden)

    Denise M Wolf

    Full Text Available Memory is usually associated with higher organisms rather than bacteria. However, evidence is mounting that many regulatory networks within bacteria are capable of complex dynamics and multi-stable behaviors that have been linked to memory in other systems. Moreover, it is recognized that bacteria that have experienced different environmental histories may respond differently to current conditions. These "memory" effects may be more than incidental to the regulatory mechanisms controlling acclimation or to the status of the metabolic stores. Rather, they may be regulated by the cell and confer fitness to the organism in the evolutionary game it participates in. Here, we propose that history-dependent behavior is a potentially important manifestation of memory, worth classifying and quantifying. To this end, we develop an information-theory-based conceptual framework for measuring both the persistence of memory in microbes and the amount of information about the past encoded in history-dependent dynamics. This method produces a phenomenological measure of cellular memory without regard to the specific cellular mechanisms encoding it. We then apply this framework to a strain of Bacillus subtilis engineered to report on commitment to sporulation and degradative enzyme (AprE) synthesis, and estimate the capacity of these systems and growth dynamics to 'remember' 10 distinct cell histories prior to application of a common stressor. The analysis suggests that B. subtilis remembers, both in the short and long term, aspects of its cell history, and that this memory is distributed differently among the observables. While this study does not examine the mechanistic bases for memory, it presents a framework for quantifying memory in cellular behaviors and is thus a starting point for studying new questions about cellular regulation and evolutionary strategy.
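
    At its core, the framework described in this record amounts to estimating the mutual information between a cell's environmental history and a later observable response. The sketch below is not the authors' estimator; the data, function name, and history/response coding are invented for illustration. It computes a plug-in estimate of that quantity in bits:

```python
import numpy as np

def mutual_information(histories, responses):
    """Plug-in estimate of I(history; response) in bits from paired
    discrete observations."""
    h_vals, h_idx = np.unique(histories, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((len(h_vals), len(r_vals)))
    for i, j in zip(h_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()                      # joint distribution p(h, r)
    ph = joint.sum(axis=1, keepdims=True)     # marginal over histories
    pr = joint.sum(axis=0, keepdims=True)     # marginal over responses
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ph @ pr)[nz])).sum())

# Invented example: cells from 3 distinct histories, scored for whether
# they sporulated after a common stressor. A value near log2(#histories)
# would mean the response "remembers" the history almost perfectly.
histories  = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]
sporulated = [1, 1, 0, 0, 0, 0, 1, 1, 1, 0]
print(f"I = {mutual_information(histories, sporulated):.3f} bits")
```

    With 10 equiprobable histories, as in the study design, such a measure is bounded above by log2(10) ≈ 3.32 bits, which is one way to read the claim about the capacity to 'remember' 10 distinct cell histories.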

  2. Memory in Microbes: Quantifying History-Dependent Behavior in a Bacterium

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, Denise M.; Fontaine-Bodin, Lisa; Bischofs, Ilka; Price, Gavin; Keasling, Jay; Arkin, Adam P.

    2007-11-15

    Memory is usually associated with higher organisms rather than bacteria. However, evidence is mounting that many regulatory networks within bacteria are capable of complex dynamics and multi-stable behaviors that have been linked to memory in other systems. Moreover, it is recognized that bacteria that have experienced different environmental histories may respond differently to current conditions. These "memory" effects may be more than incidental to the regulatory mechanisms controlling acclimation or to the status of the metabolic stores. Rather, they may be regulated by the cell and confer fitness to the organism in the evolutionary game it participates in. Here, we propose that history-dependent behavior is a potentially important manifestation of memory, worth classifying and quantifying. To this end, we develop an information-theory-based conceptual framework for measuring both the persistence of memory in microbes and the amount of information about the past encoded in history-dependent dynamics. This method produces a phenomenological measure of cellular memory without regard to the specific cellular mechanisms encoding it. We then apply this framework to a strain of Bacillus subtilis engineered to report on commitment to sporulation and degradative enzyme (AprE) synthesis and estimate the capacity of these systems and growth dynamics to "remember" 10 distinct cell histories prior to application of a common stressor. The analysis suggests that B. subtilis remembers, both in the short and long term, aspects of its cell history, and that this memory is distributed differently among the observables. While this study does not examine the mechanistic bases for memory, it presents a framework for quantifying memory in cellular behaviors and is thus a starting point for studying new questions about cellular regulation and evolutionary strategy.

  3. Memory in microbes: quantifying history-dependent behavior in a bacterium.

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, Denise M.; Fontaine-Bodin, Lisa; Bischofs, Ilka; Price, Gavin; Keasling, Jay; Arkin, Adam P.

    2007-11-15

    Memory is usually associated with higher organisms rather than bacteria. However, evidence is mounting that many regulatory networks within bacteria are capable of complex dynamics and multi-stable behaviors that have been linked to memory in other systems. Moreover, it is recognized that bacteria that have experienced different environmental histories may respond differently to current conditions. These "memory" effects may be more than incidental to the regulatory mechanisms controlling acclimation or to the status of the metabolic stores. Rather, they may be regulated by the cell and confer fitness to the organism in the evolutionary game it participates in. Here, we propose that history-dependent behavior is a potentially important manifestation of memory, worth classifying and quantifying. To this end, we develop an information-theory-based conceptual framework for measuring both the persistence of memory in microbes and the amount of information about the past encoded in history-dependent dynamics. This method produces a phenomenological measure of cellular memory without regard to the specific cellular mechanisms encoding it. We then apply this framework to a strain of Bacillus subtilis engineered to report on commitment to sporulation and degradative enzyme (AprE) synthesis and estimate the capacity of these systems and growth dynamics to 'remember' 10 distinct cell histories prior to application of a common stressor. The analysis suggests that B. subtilis remembers, both in the short and long term, aspects of its cell history, and that this memory is distributed differently among the observables. While this study does not examine the mechanistic bases for memory, it presents a framework for quantifying memory in cellular behaviors and is thus a starting point for studying new questions about cellular regulation and evolutionary strategy.

  4. An exact solution for the history-dependent material and delamination behavior of laminated plates subjected to cylindrical bending

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Todd O [Los Alamos National Laboratory]

    2009-01-01

    The exact solution for the history-dependent behavior of laminated plates subjected to cylindrical bending is presented. The solution extends Pagano's solution to arbitrary types of constitutive behavior for the individual lamina as well as arbitrary types of cohesive zone models (CZMs) for delamination behavior. Examples of the possible types of material behavior are plasticity, viscoelasticity, viscoplasticity, and damage. Examples of possible CZMs are linear, nonlinear with hardening, and nonlinear with softening. The resulting solution is intended as a benchmark for assessing the predictive capabilities of different plate theories. Initial results are presented for several types of history-dependent material behavior. It is shown that the plate response in the presence of history-dependent behaviors can differ dramatically from the elastic response. These results have strong implications for what constitutes an appropriate plate theory for modeling such behaviors.

  5. TIME AND POLING HISTORY DEPENDENT ENERGY STORAGE AND DISCHARGE BEHAVIORS IN POLY(VINYLIDENE FLUORIDE-CO-HEXAFLUOROPROPYLENE) RANDOM COPOLYMERS

    Institute of Scientific and Technical Information of China (English)

    Fang-xiao Guan; Jing Wang; Ji-lin Pan; Qing Wang; Lei Zhu

    2011-01-01

    We studied cycle time (0.01-10 s with triangular input waves) and poling history (continuous versus fresh poling) dependent electric energy storage and discharge behaviors in poly(vinylidene fluoride-co-hexafluoropropylene) [P(VDF-HFP)] films, using electric displacement-electric field (D-E) hysteresis loop measurements. Since the permanent dipoles in PVDF are orientational in nature, it is generally considered that both charging and discharging processes should be time- and poling-history-dependent. Intriguingly, our experimental results showed that the charging process depended heavily on the cycle time and the prior poling history, and thus the D-E hysteresis loops had different shapes accordingly. However, the discharged energy density did not change, regardless of how the D-E loop shape varied across measurements. This experimental result could be explained in terms of reversible and irreversible polarizations. The reversible polarization could be charged and discharged fairly quickly (<5 ms for each process), while the irreversible polarization depended heavily on the poling time and the prior poling history. This study suggests that the discharged energy density is the only meaningful quantity to compare for PVDF and its copolymer films when different cycle times and poling histories are used.

  6. Quantifying Emergent Behavior of Autonomous Robots

    Directory of Open Access Journals (Sweden)

    Georg Martius

    2015-10-01

    Full Text Available Quantifying behaviors of robots that were generated autonomously from task-independent objective functions is an important prerequisite for objective comparisons of algorithms and of animal movements. The temporal sequence of such a behavior can be considered as a time series, and hence complexity measures developed for time series are natural candidates for its quantification. The predictive information and the excess entropy are such complexity measures. They measure the amount of information the past contains about the future and thus quantify the nonrandom structure in the temporal sequence. However, when using these measures for systems with continuous states, one has to deal with the fact that their values depend on the resolution with which the system's states are observed. For deterministic systems, both measures diverge with increasing resolution. We therefore propose a new decomposition of the excess entropy into resolution-dependent and resolution-independent parts, and discuss how they depend on the dimensionality of the dynamics, on correlations, and on the noise level. For practical estimation we propose estimates based on the correlation integral instead of direct estimation of the mutual information based on nearest-neighbor statistics, because the latter allow less control over the scale dependencies. Using our algorithm, we show how autonomous learning generates behavior of increasing complexity with increasing learning duration.
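
    The predictive information discussed above is the mutual information between past and future windows of the time series. The sketch below uses a plug-in histogram estimator on simulated data (the paper instead advocates correlation-integral-based estimates; the function names and parameters here are ours) and makes the resolution dependence visible by varying the number of discretization bins:

```python
import numpy as np
from collections import Counter

def window_codes(symbols, k, n_bins):
    """Encode each length-k window of a symbol sequence as one integer."""
    n = len(symbols) - k + 1
    codes = np.zeros(n, dtype=np.int64)
    for i in range(k):
        codes = codes * n_bins + symbols[i:i + n]
    return codes

def predictive_information(x, n_bins=8, k=3):
    """Plug-in estimate of I(past; future) between adjacent length-k
    windows of a time series, discretized to n_bins levels (bits)."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    s = np.digitize(x, edges)              # the resolution enters here
    past = window_codes(s[:-k], k, n_bins)
    future = window_codes(s[k:], k, n_bins)
    n = len(past)
    joint = Counter(zip(past, future))
    pp, pf = Counter(past), Counter(future)
    return sum(c / n * np.log2((c / n) / ((pp[p] / n) * (pf[f] / n)))
               for (p, f), c in joint.items())

# Noisy oscillation: nonrandom structure survives moderate noise, and
# the estimate grows with resolution, as the abstract discusses.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.normal(size=3000)
for bins in (4, 8, 16):
    print(bins, round(predictive_information(x, n_bins=bins), 3))
```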

  7. An Unsupervised Method for Quantifying the Behavior of Interacting Individuals

    CERN Document Server

    Klibaite, Ugne; Cande, Jessica; Stern, David L; Shaevitz, Joshua W

    2016-01-01

    Social behaviors involving the interaction of multiple individuals are complex and frequently crucial for an animal's survival. These interactions, ranging across sensory modalities, length scales, and time scales, are often subtle and difficult to quantify. Contextual effects on the frequency of behaviors become even more difficult to quantify when physical interaction between animals interferes with conventional data analysis, e.g. due to visual occlusion. We introduce a method for quantifying behavior in courting fruit flies that combines high-throughput video acquisition and tracking of individuals with recent unsupervised methods for capturing an animal's entire behavioral repertoire. We find behavioral differences in paired and solitary flies of both sexes, identifying specific behaviors that are affected by social and spatial context. Our pipeline allows for a comprehensive description of the interaction between multiple individuals using unsupervised machine learning methods, and will be used to answer questions about the depth of complexity and variance in fruit fly courtship.

  8. Quantified trends in the history of verbal behavior research.

    Science.gov (United States)

    Eshleman, J W

    1991-01-01

    The history of scientific research on verbal behavior, especially research based on Verbal Behavior (Skinner, 1957), can be assessed through a frequency and celeration analysis of the published and presented literature. To discover these quantified trends, a comprehensive bibliographic database was developed. Based on several literature searches, the database included papers pertaining to verbal behavior published in the Journal of the Experimental Analysis of Behavior, the Journal of Applied Behavior Analysis, Behaviorism, The Behavior Analyst, and The Analysis of Verbal Behavior. A nonbehavioral journal, the Journal of Verbal Learning and Verbal Behavior, was assessed as a nonexample comparison. The database also included a listing of verbal behavior papers presented at the meetings of the Association for Behavior Analysis. Papers were added to the database if they (a) were about verbal behavior, (b) referenced B. F. Skinner's (1957) book Verbal Behavior, or (c) did both. Because each reference indicated its year of publication or presentation, a count per year was measured. These yearly frequencies were plotted on Standard Celeration Charts. Once plotted, various celeration trends in the literature became visible, not least a greater quantity of verbal behavior research than is generally acknowledged. The data clearly show an acceleration of research across the past decade, and they call into question the notion that a "paucity" of research based on Verbal Behavior currently exists. Explanations of the acceleration of verbal behavior research are suggested, and plausible reasons are offered for the relative lack of verbal behavior research from the mid-1960s through the late 1970s.
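
    A frequency and celeration analysis of this kind reduces to counting publications per year and reading the multiplicative growth rate off a logarithmic count axis. A minimal sketch, using invented yearly counts in place of the bibliographic database:

```python
import numpy as np

# Invented yearly paper counts (year: frequency), standing in for counts
# pulled from a bibliographic database like the one described above.
counts = {1970: 2, 1975: 3, 1980: 5, 1985: 9, 1990: 16}

years = np.array(sorted(counts))
freq = np.array([counts[y] for y in years], dtype=float)

# On a Standard Celeration Chart the count axis is logarithmic, so a
# straight line corresponds to a constant multiplicative rate of change.
slope, _ = np.polyfit(years, np.log10(freq), 1)
celeration = 10 ** slope                  # multiplicative change per year
print(f"celeration: x{celeration:.2f} per year")
```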

  9. An unsupervised method for quantifying the behavior of paired animals

    Science.gov (United States)

    Klibaite, Ugne; Berman, Gordon J.; Cande, Jessica; Stern, David L.; Shaevitz, Joshua W.

    2017-02-01

    Behaviors involving the interaction of multiple individuals are complex and frequently crucial for an animal’s survival. These interactions, ranging across sensory modalities, length scales, and time scales, are often subtle and difficult to characterize. Contextual effects on the frequency of behaviors become even more difficult to quantify when physical interaction between animals interferes with conventional data analysis, e.g. due to visual occlusion. We introduce a method for quantifying behavior in fruit fly interactions that combines high-throughput video acquisition and tracking of individuals with recent unsupervised methods for capturing an animal’s entire behavioral repertoire. We find behavioral differences between solitary flies and those paired with an individual of the opposite sex, identifying specific behaviors that are affected by social and spatial context. Our pipeline allows for a comprehensive description of the interaction between two individuals using unsupervised machine learning methods, and will be used to answer questions about the depth of complexity and variance in fruit fly courtship.

  10. Quantifying nisin adsorption behavior at pendant PEO layers.

    Science.gov (United States)

    Dill, Justen K; Auxier, Julie A; Schilke, Karl F; McGuire, Joseph

    2013-04-01

    The antimicrobial peptide nisin shows potent activity against Gram-positive bacteria, including the most prevalent implant-associated pathogens. Its mechanism of action minimizes the opportunity for the rise of resistant bacteria, and it does not appear to be toxic to humans, suggesting good potential for its use in antibacterial coatings for selected medical devices. A more quantitative understanding of nisin loading and release from polyethylene oxide (PEO) brush layers will inform new strategies for drug storage and delivery. In this work, optical waveguide lightmode spectroscopy was used to record changes in adsorbed mass during cyclic adsorption-elution experiments with nisin at uncoated and PEO-coated surfaces. PEO layers were prepared by radiolytic grafting of Pluronic® surfactant F108 or F68 to silanized silica surfaces, producing long- or short-chain PEO layers, respectively. Kinetic patterns were interpreted with reference to a model accounting for history-dependent adsorption, in order to evaluate rate constants for nisin adsorption and desorption, as well as the effect of pendant PEO on the lateral clustering behavior of nisin. Nisin adsorption was observed at the uncoated and F108-coated surfaces, but not at the F68-coated surfaces. Nisin showed greater resistance to elution by peptide-free buffer at the uncoated surface, and lateral rearrangement and clustering of adsorbed nisin was apparent only at the uncoated surface. We conclude that peptide entrapment at the F108-coated surface is governed by a hydrophobic inner region of the PEO brush layer, and that this region is not sufficient for nisin entrapment in the case of the shorter PEO chains of the F68-coated surface.
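
    The record refers to a model of history-dependent adsorption used to extract rate constants. As a much simpler stand-in, the sketch below integrates a generic Langmuir-type adsorption/desorption balance; the rate constants, saturation density, and bulk concentration are assumed values, not the study's:

```python
from scipy.integrate import solve_ivp

# Assumed parameters in arbitrary but mutually consistent units.
k_a = 0.05        # adsorption rate constant
k_d = 0.002       # desorption rate constant
gamma_max = 2.0   # saturation surface density
c = 1.0           # bulk peptide concentration

def dgamma_dt(t, gamma):
    # Adsorption fills the remaining free sites; desorption empties
    # occupied ones. History dependence (lateral clustering, etc.) would
    # enter as extra state variables in the authors' fuller model.
    return k_a * c * (1 - gamma / gamma_max) - k_d * gamma

sol = solve_ivp(dgamma_dt, (0, 3600), [0.0])   # one hour of adsorption
print(f"adsorbed amount after 1 h: {sol.y[0, -1]:.3f}")
```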

  11. Quantifying social vs. antisocial behavior in email networks

    CERN Document Server

    Gomes, L H; Almeida, V A F; Bettencourt, L M A; Castro, F D O; Almeida, Jussara M.; Almeida, Virgilio A. F.; Bettencourt, Luis M. A.; Castro, Fernando D. O.; Gomes, Luiz H.

    2006-01-01

    Email graphs have been used to illustrate general properties of social networks of communication and collaboration. However, increasingly, the majority of email traffic reflects opportunistic, rather than symbiotic, social relations. Here we use email data drawn from a large university to construct directed graphs of email exchange that quantify the differences between social and antisocial behaviors in networks of communication. We show that while structural characteristics typical of other social networks are shared to a large extent by the legitimate component, they are not characteristic of antisocial traffic. Interestingly, opportunistic patterns of behavior do create nontrivial graphs with certain general characteristics that we identify. To complement the graph analysis, which suffers from incomplete knowledge of users external to the domain, we study temporal patterns of communication to show that the dynamical properties of email traffic can, in principle, distinguish different types of social relatio...

  12. Short- and Long-Term Biomarkers for Bacterial Robustness: A Framework for Quantifying Correlations between Cellular Indicators and Adaptive Behavior

    NARCIS (Netherlands)

    Besten, den H.M.W.; Arvind, A.; Gaballo, H.M.S.; Moezelaar, R.; Zwietering, M.H.; Abee, T.

    2010-01-01

    The ability of microorganisms to adapt to changing environments challenges the prediction of their history-dependent behavior. Cellular biomarkers that are quantitatively correlated to stress adaptive behavior will facilitate our ability to predict the impact of these adaptive traits. Here, we present a framework for identifying cellular biomarkers for mild stress induced enhanced microbial robustness towards lethal stresses.

  13. Spatially quantifying the leadership effectiveness in collective behavior

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Haitao [State Key Laboratory of Digital Manufacturing Equipment and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); Wang Ning [Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Chen, Michael Z Q [Department of Mechanical Engineering, University of Hong Kong, Pok Fu Lam Road, Hong Kong (Hong Kong); Su Riqi; Zhou Tao [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China); Zhou Changsong, E-mail: zht@mail.hust.edu.cn, E-mail: cszhou@hkbu.edu.hk, E-mail: zhutou@ustc.edu [Department of Physics, Centre for Nonlinear Studies, and Beijing-Hong Kong-Singapore Joint Centre for Nonlinear and Complex Systems (Hong Kong), Hong Kong Baptist University, Kowloon Tong (Hong Kong)

    2010-12-15

    Among natural biological flocks/swarms or mass social activities, when the collective behavior of the followers is dominated by the direction or opinion of one leader group, it seems difficult for later-coming leaders to reverse the orientation of the mass followers, especially when they are in the quantitative minority. This paper, however, reports a counter-intuitive phenomenon, i.e. "Following the Later-coming Minority", provided that the later-comers obey a favorable distribution pattern that enables them to spread their influence to as many followers as possible within a given time and that is dense enough to let them govern, from the beginning, the local followers they can influence directly. We introduce a discriminant index to quantify the whole group's orientation under competing leaderships, with which the eventual orientation of the mass followers can be predicted before launching the real dynamical procedure. From an application point of view, this leadership effectiveness index also helps us design an economical way for the minority of later-coming leaders to defeat the dominating majority leaders solely by optimizing their spatial distribution pattern, provided that the premeditated goal is available. Our investigation provides insights into effective leadership in biological systems, with meaningful implications for social and industrial applications.

  14. Quantifying Trading Behavior in Financial Markets Using Google Trends

    Science.gov (United States)

    Preis, Tobias; Moat, Helen Susannah; Stanley, H. Eugene

    2013-04-01

    Crises in financial markets affect humans worldwide. Detailed market data on trading decisions reflect some of the complex human behavior that has led to these crises. We suggest that massive new data sources resulting from human interaction with the Internet may offer a new perspective on the behavior of market participants in periods of large market movements. By analyzing changes in Google query volumes for search terms related to finance, we find patterns that may be interpreted as "early warning signs" of stock market moves. Our results illustrate the potential that combining extensive behavioral data sets offers for a better understanding of collective human behavior.
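
    The abstract does not spell out the analysis pipeline, but the basic object is the relation between changes in query volume and subsequent market moves. A toy sketch on simulated series (both the data and the window length below are invented placeholders):

```python
import numpy as np

rng = np.random.default_rng(2)
weeks = 400
volume = rng.gamma(5.0, 1.0, weeks)     # weekly search-query volume
ret = rng.normal(0, 0.02, weeks)        # weekly index return

window = 3
signals, outcomes = [], []
for t in range(window, weeks - 1):
    # Change in query volume relative to its recent moving average.
    delta = volume[t] - volume[t - window:t].mean()
    signals.append(delta)
    outcomes.append(ret[t + 1])         # the following week's return

# An "early warning" pattern would appear as a negative correlation:
# rising search volume preceding falling prices. Pure noise gives ~0.
print(f"corr: {np.corrcoef(signals, outcomes)[0, 1]:+.3f}")
```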

  15. Quantifying and Disaggregating Consumer Purchasing Behavior for Energy Systems Modeling

    Science.gov (United States)

    Consumer behaviors such as energy conservation, adoption of more efficient technologies, and fuel switching represent significant potential for greenhouse gas mitigation. Current efforts to model future energy outcomes have tended to use simplified economic assumptions ...

  16. History dependence of magnetomechanical properties of steel

    Science.gov (United States)

    Melquiond, F.; Mouroux, A.; Jouglar, J.; Vuillermoz, P. L.; Weinstock, H.

    1996-05-01

    Magnetomechanical measurements using a superconducting quantum interference device (SQUID) magnetic gradiometer and a tensile-testing machine have been performed on a variety of steel specimens to determine the change in magnetization due to applied stress and to assess the possible application of the observed behavior as a new form of nondestructive evaluation of steel. This study builds on earlier related measurements.

  17. Quantifying the behavior of stock correlations under market stress.

    Science.gov (United States)

    Preis, Tobias; Kenett, Dror Y; Stanley, H Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-01-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.
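
    The central quantity here is the mean pairwise correlation among component stocks in a sliding window, compared against the normalized index return over the same window. A sketch on simulated returns (the real analysis uses 72 years of DJIA component closes; the window length and data below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, size=(2500, 30))   # daily returns, 30 stocks
index_ret = returns.mean(axis=1)                 # equal-weight stand-in index

window = 10
mean_corrs, stress = [], []
for start in range(0, len(returns) - window, window):
    r = returns[start:start + window]
    c = np.corrcoef(r.T)                         # 30 x 30 correlation matrix
    iu = np.triu_indices_from(c, k=1)
    mean_corrs.append(c[iu].mean())              # mean pairwise correlation
    w = index_ret[start:start + window].sum()    # index return in the window
    stress.append(abs(w) / index_ret.std())      # normalized market move

# The paper's finding would appear as a clear slope of mean pairwise
# correlation against normalized returns; uncorrelated noise gives ~0.
slope = np.polyfit(stress, mean_corrs, 1)[0]
print(f"slope: {slope:+.4f}")
```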

  18. Quantifying the Behavior of Stock Correlations Under Market Stress

    Science.gov (United States)

    Preis, Tobias; Kenett, Dror Y.; Stanley, H. Eugene; Helbing, Dirk; Ben-Jacob, Eshel

    2012-10-01

    Understanding correlations in complex systems is crucial in the face of turbulence, such as the ongoing financial crisis. However, in complex systems, such as financial systems, correlations are not constant but instead vary in time. Here we address the question of quantifying state-dependent correlations in stock markets. Reliable estimates of correlations are absolutely necessary to protect a portfolio. We analyze 72 years of daily closing prices of the 30 stocks forming the Dow Jones Industrial Average (DJIA). We find the striking result that the average correlation among these stocks scales linearly with market stress reflected by normalized DJIA index returns on various time scales. Consequently, the diversification effect which should protect a portfolio melts away in times of market losses, just when it would most urgently be needed. Our empirical analysis is consistent with the interesting possibility that one could anticipate diversification breakdowns, guiding the design of protected portfolios.

  19. Short- and long-term biomarkers for bacterial robustness: a framework for quantifying correlations between cellular indicators and adaptive behavior.

    Directory of Open Access Journals (Sweden)

    Heidy M W den Besten

    Full Text Available The ability of microorganisms to adapt to changing environments challenges the prediction of their history-dependent behavior. Cellular biomarkers that are quantitatively correlated to stress adaptive behavior will facilitate our ability to predict the impact of these adaptive traits. Here, we present a framework for identifying cellular biomarkers for mild stress induced enhanced microbial robustness towards lethal stresses. Several candidate-biomarkers were selected by comparing the genome-wide transcriptome profiles of our model organism Bacillus cereus upon exposure to four mild stress conditions (mild heat, acid, salt and oxidative stress). These candidate-biomarkers, a transcriptional regulator (activating general stress responses), enzymes (removing reactive oxygen species), and chaperones and proteases (maintaining protein quality), were quantitatively determined at transcript, protein and/or activity level upon exposure to mild heat, acid, salt and oxidative stress for various time intervals. Both unstressed and mild stress treated cells were also exposed to lethal stress conditions (severe heat, acid and oxidative stress) to quantify the robustness advantage provided by mild stress pretreatment. To evaluate whether the candidate-biomarkers could predict the robustness enhancement towards lethal stress elicited by mild stress pretreatment, the biomarker responses upon mild stress treatment were correlated to mild stress induced robustness towards lethal stress. Both short- and long-term biomarkers could be identified whose induction levels were correlated to mild stress induced enhanced robustness towards lethal heat, acid and/or oxidative stress, respectively, and these are therefore predictive cellular indicators of mild stress induced enhanced robustness. The identified biomarkers are among the most consistently induced cellular components in stress responses and are ubiquitous in biology, supporting extrapolation to other microorganisms.

  1. Improved Software for Quantifying the Behavior of Drosophila Larvae

    Science.gov (United States)

    Bernat, Natalie; Gershow, Marc

    A key advantage of small crawling organisms like C. elegans and the Drosophila larva is that their behaviors may be assayed automatically using computer vision software. Current state-of-the-art software is capable of detecting the positions and postures of crawling larvae and of automatically categorizing their behaviors in parallel. However, these algorithms, which are based on frame-by-frame analysis of thresholded black-and-white images, fail to correctly describe the postures of larvae executing sharp bends and have difficulty separating multiple larvae that are physically touching. We present new tracking software that uses intensity information in grayscale images and applies temporal smoothness constraints to positions and postures. We implemented this software as an ImageJ plugin, extending its portability and applicability.

  2. The thermal probe test: A novel behavioral assay to quantify thermal paw withdrawal thresholds in mice

    OpenAIRE

    Deuis, Jennifer R; Vetter, Irina

    2016-01-01

    Rodent models are frequently used to improve our understanding of the molecular mechanisms of pain and to develop novel analgesics. Robust behavioral assays that quantify nociceptive responses to different sensory modalities, such as heat, are therefore needed. Here, we describe a novel behavioral assay to quantify thermal paw withdrawal thresholds in mice, called the thermal probe test, and compared it with other methods commonly used to measure heat thresholds, namely the Hargreav...

  3. History dependence of vital capacity in constricted lungs.

    Science.gov (United States)

    Olson, Thomas P; Wilson, Theodore A; Johnson, Bruce D; Hyatt, Robert E

    2010-07-01

    Measurements of dynamic force-length behavior of maximally activated strips of smooth muscle during oscillatory length changes show that force decreases well below the isometric force during the shortening phase of the oscillation. The magnitude of the decrease depends on the rate of shortening; for slower shortening, the decrease is smaller and force is larger. Modeling of expiratory flow, based on these data, predicts that vital capacity in constricted lungs depends on the rate of expiration. In maximally constricted lungs, forced vital capacity (FVC) is predicted to be 16% smaller than control, and vital capacity for a very slow expiration (SVC), 31% less than control. These predictions were tested by measuring FVC and SVC in constricted normal subjects. In the first group of 9 subjects, four maneuvers were made following the delivery of two doses of methacholine in the order: SVC, FVC, FVC, SVC. In a second group of 11 subjects, two maneuvers were performed at each dose in the order: FVC, SVC. At the highest dose of methacholine, FVC for both trials in group 1 and for the one trial in group 2 were all approximately 13% less than control, a slightly smaller decrease than predicted. SVC for the 1st trial in group 1 was 27% less than control, also slightly smaller than predicted. The difference between FVC and SVC for this trial, 13%, was close to the predicted difference of 15%. However, SVC for the 2nd trial in group 1 (preceded by 3 vital capacity maneuvers) and for group 2 (preceded by 1) were no different from FVC. We conclude that vital capacity in constricted lungs depends on the dynamic force-length properties of smooth muscle and that the history dependence of the dynamic properties of smooth muscle is more complicated than has been inferred from oscillatory force-length behavior.

  4. Identifying and Quantifying Emergent Behavior Through System of Systems Modeling and Simulation

    Science.gov (United States)

    2015-09-01

    Ph.D. dissertation by Mary Ann Cummings, September 2015. Dissertation Supervisor: Man-Tak Shing. ... functionality and interfaces in these SoSs. An inherent deficiency of existing M&S approaches, however, lies in the emergent behavior that occurs as a result ...

  5. Evidence for history-dependence of influenza pandemic emergence

    Science.gov (United States)

    Hill, Edward M.; Tildesley, Michael J.; House, Thomas

    2017-03-01

    Influenza A viruses have caused a number of global pandemics, with considerable mortality in humans. Here, we analyse the time periods between influenza pandemics since 1700 under different assumptions to determine whether the emergence of new pandemic strains is a memoryless or history-dependent process. Bayesian model selection between exponential and gamma distributions for these time periods gives support to the hypothesis of history-dependence under eight out of nine sets of modelling assumptions. Using the fitted parameters to make predictions shows a high level of variability in the modelled number of pandemics from 2010 to 2110. The approach we take here relies on limited data, so it is uncertain, but it provides cheap, safe and direct evidence relating to pandemic emergence, a field where indirect measurements are often made at great risk and cost.
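
    The memoryless-versus-history-dependent question maps onto comparing exponential and gamma models for the inter-pandemic intervals. The sketch below uses maximum-likelihood fits and AIC as a cruder stand-in for the paper's Bayesian model selection, and the interval data are invented, not the historical record:

```python
import numpy as np
from scipy import stats

# Invented inter-pandemic intervals in years.
gaps = np.array([29.0, 12.0, 42.0, 11.0, 18.0, 39.0, 9.0, 41.0])

# Exponential (memoryless) fit: one free parameter (the rate).
loc_e, scale_e = stats.expon.fit(gaps, floc=0)
ll_e = stats.expon.logpdf(gaps, loc_e, scale_e).sum()

# Gamma (history-dependent) fit: shape and scale.
a, loc_g, scale_g = stats.gamma.fit(gaps, floc=0)
ll_g = stats.gamma.logpdf(gaps, a, loc_g, scale_g).sum()

aic_e = 2 * 1 - 2 * ll_e
aic_g = 2 * 2 - 2 * ll_g
print(f"AIC exponential: {aic_e:.1f}, gamma: {aic_g:.1f}")
# A fitted gamma shape a > 1 means a new pandemic is less likely soon
# after the previous one, i.e. the process carries memory of its history.
print(f"gamma shape: {a:.2f}")
```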

  6. Quantifying the impact of Wellington Zoo's persuasive communication campaign on post-visit behavior.

    Science.gov (United States)

    MacDonald, Edith

    2015-01-01

    Zoos' potential to facilitate visitor conservation behavior is commonly articulated. Few studies, however, have quantified whether zoos' conservation messages result in visitors implementing the behavior. To test whether zoo conservation messages are adopted at home, I implemented a persuasive communication campaign advocating keeping cats indoors at night, a behavior that is a potential solution to cats depredating native wildlife. Furthermore, I tested whether a public commitment (signing a pledge card) strengthened the relationship between on-site intention to engage in the behavior and actual implementation of the behavior at home. The conservation behavior was included in the twice-daily animal presentations in the amphitheater. A sample of 691 visitors completed a survey as they exited the amphitheater that measured their recall of the conservation behavior and their intention to engage in the behavior at home. The last 311 visitors to complete the survey were asked to sign a pledge card, which was publicly displayed in the amphitheater. Six weeks after their zoo trip, visitors were contacted and asked if they had implemented the behavior. Recall of the conservation behavior was high (91% for the control group, 100% for the pledge group), and the entire pledge group had implemented the behavior, whereas just half (51%) of the control group had. Furthermore, signing the pledge card strengthened the relationship between on-site intention and at-home behavior (r = 1.0 for the pledge group and r = 0.21 for the control group). Overall, the zoo's conservation message was recalled and the behavior implemented at home.

  7. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (e.g., variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  8. Mapping urban climate zones and quantifying climate behaviors - An application on Toulouse urban area (France)

    Energy Technology Data Exchange (ETDEWEB)

    Houet, Thomas, E-mail: thomas.houet@univ-tlse2.fr [GEODE UMR 5602 CNRS, Université de Toulouse, 5 allée Antonio Machado, 31058 Toulouse Cedex (France)]; Pigeon, Grégoire [Centre National de Recherches Météorologiques, Météo-France/CNRM-GAME, 42 avenue Coriolis, 31057 Toulouse Cedex (France)]

    2011-08-15

    Facing the concern of the population about its environment and about climate change, city planners are now considering the urban climate in their planning choices. The use of climatic maps, such as Urban Climate Zone (UCZ) maps, is adapted for this kind of application. The objective of this paper is to demonstrate that the UCZ classification, integrated in the World Meteorological Organization guidelines, first can be automatically determined for sample areas and second is meaningful with respect to climatic variables. The analysis presented is applied to the Toulouse urban area (France). Results show first that UCZs differentiate according to air and surface temperature. It has been possible to determine the membership of sample areas in a UCZ using landscape descriptors automatically computed with GIS and remote-sensed data. The results also emphasize that the climate behavior and magnitude of a UCZ may vary from winter to summer. Finally, we discuss the influence of climate data and scale of observation on UCZ mapping and climate characterization. Highlights: We propose a method to map Urban Climate Zones and quantify their climate behaviors. UCZ is an expert-based classification integrated in the WMO guidelines. We classified 26 sample areas and quantified their climate behaviors in winter and summer. Results highlight urban heat islands; outskirts are surprisingly hottest in summer. The influence of scale and climate data on UCZ mapping and climate evaluation is discussed.

  9. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    Directory of Open Access Journals (Sweden)

    Portell M

    2015-06-01

    Full Text Available Mariona Portell (Department of Psychobiology and Methodology of Health Sciences, Universitat Autònoma de Barcelona, Cerdanyola del Vallès, Spain), M Teresa Anguera (Department of Methodology of Behavioral Sciences, University of Barcelona, Barcelona, Spain), Antonio Hernández-Mendo (Department of Social Psychology, Social Anthropology, Social Work and Social Services, University of Málaga, Málaga, Spain), Gudberg K Jonsson (Human Behavior Laboratory, University of Iceland, Reykjavík, Iceland). Abstract: Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (e.g., variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects.

  10. Quantifying Equid Behavior - A Research Ethogram for Free-Roaming Feral Horses

    Science.gov (United States)

    Ransom, Jason I.; Cade, Brian S.

    2009-01-01

    Feral horses (Equus caballus) are globally distributed in free-roaming populations on all continents except Antarctica and occupy a wide range of habitats, including forest, grassland, desert, and montane environments. The largest populations occur in Australia and North America and have been the subject of scientific study for decades, yet guidelines and ethograms for feral horse behavioral research are largely absent from the scientific literature. The U.S. Geological Survey (USGS) Fort Collins Science Center conducted research on the influences of the immunocontraceptive porcine zona pellucida (PZP) on feral horse behavior from 2003 to 2006 in three discrete populations in the American West. These populations were the Little Book Cliffs Wild Horse Range in Colorado, the McCullough Peaks Herd Management Area in Wyoming, and the Pryor Mountain Wild Horse Range in Montana; the research effort included over 1,800 hours of behavioral observations of 317 adult free-roaming feral horses. An ethogram was developed during the course of this study to facilitate accurate scientific data collection on feral horse behavior, which is often challenging to quantify. By developing this set of discrete behavioral definitions and a set of strict research protocols, scientists were better able to address both applied questions, such as behavioral changes related to fertility control, and theoretical questions, such as understanding networks and dominance hierarchies within social groups of equids.

  11. Inferring Characteristics of Sensorimotor Behavior by Quantifying Dynamics of Animal Locomotion

    Science.gov (United States)

    Leung, KaWai

    Locomotion is one of the most well-studied topics in animal behavior. Much fundamental and clinical research makes use of the locomotion of an animal model to explore various aspects of sensorimotor behavior. In the past, most of these studies focused on the population average of a specific trait due to limitations in data collection and processing power. With recent advances in computer vision and statistical modeling techniques, it is now possible to track and analyze large amounts of behavioral data. In this thesis, I present two projects that aim to infer the characteristics of sensorimotor behavior by quantifying the dynamics of locomotion of the nematode Caenorhabditis elegans and the fruit fly Drosophila melanogaster, shedding light on the statistical dependence between sensing and behavior. In the first project, I investigate the possibility of inferring noxious sensory information from the behavior of Caenorhabditis elegans. I develop a statistical model to infer the heat stimulus level perceived by individual animals from their stereotyped escape responses after stimulation by an IR laser. The model allows quantification of analgesic-like effects of chemical agents or genetic mutations in the worm. At the same time, the method is able to differentiate perturbations of locomotion behavior that go beyond effects on the sensory system. With this model I propose experimental designs that allow statistically significant identification of analgesic-like effects. In the second project, I investigate the relationship between energy budget and stability of locomotion in determining the walking speed distribution of Drosophila melanogaster during aging. The locomotion stability at different ages is estimated from video recordings using Floquet theory. I calculate the power consumption at different locomotion speeds using a biomechanics model. In conclusion, power consumption, not stability, predicts the locomotion speed distribution at different ages.

  12. Quantifying nonverbal communicative behavior in face-to-face human dialogues

    Science.gov (United States)

    Skhiri, Mustapha; Cerrato, Loredana

    2002-11-01

    This study is based on the assumption that understanding how humans use nonverbal behavior in dialogues can be very useful in the design of more natural-looking animated talking heads. The goal of the study is twofold: (1) to explore how people use specific facial expressions and head movements to serve important dialogue functions, and (2) to show evidence that it is possible to measure and quantify the extent of these movements with the Qualisys MacReflex motion tracking system. Naturally elicited dialogues between humans have been analyzed, focusing on those nonverbal behaviors that serve the very relevant functions of regulating the conversational flow (i.e., turn taking) and producing information about the state of communication (i.e., feedback). The results show that eyebrow raising, head nods, and head shakes are typical signals involved in the exchange of speaking turns, as well as in the production and elicitation of feedback. These movements can be easily measured and quantified, and this measure can be implemented in animated talking heads.

  13. Diagnosis and characterization of mania: Quantifying increased energy and activity in the human behavioral pattern monitor.

    Science.gov (United States)

    Perry, William; McIlwain, Meghan; Kloezeman, Karen; Henry, Brook L; Minassian, Arpi

    2016-06-30

    Increased energy or activity is now an essential feature of the mania of Bipolar Disorder (BD) according to DSM-5. This study examined whether objective measures of increased energy can differentiate manic BD individuals and provide greater diagnostic accuracy compared to rating scales, extending the work of previous studies with smaller samples. We also tested the relationship between objective measures of energy and rating scales. Fifty hospitalized manic BD patients were compared to healthy comparison subjects (HCS, n=39) in the human Behavioral Pattern Monitor (hBPM), which quantifies motor activity and goal-directed behavior in an environment containing novel stimuli. Archival hBPM data from 17 schizophrenia patients were used in sensitivity and specificity analyses. Manic BD patients exhibited higher motor activity than HCS and more novel-object interactions. hBPM activity measures were not correlated with observer-rated symptoms, and hBPM activity was more sensitive in accurately classifying hospitalized BD subjects than observer ratings. Although the findings can only be generalized to inpatient populations, they suggest that increased energy, particularly specific and goal-directed exploration, is a distinguishing feature of BD mania and is best quantified by objective measures of motor activity. A better understanding is needed of the biological underpinnings of this cardinal feature.

  14. A DEM contact model for history-dependent powder flows

    Science.gov (United States)

    Hashibon, Adham; Schubert, Raphael; Breinlinger, Thomas; Kraft, Torsten

    2016-11-01

    Die filling is an important part of the powder-handling process chain that greatly influences the characteristic structure and properties of the final part. Predictive modelling and simulation of the die-filling process can contribute greatly to the optimization of the part and of the whole production procedure, e.g. by predicting the resulting powder compaction structure as a function of filling process parameters. The rheology of powders can be very difficult to model, especially if heterogeneous agglomeration or time-dependent consolidation effects occur. We present a new discrete element contact force model that enables the modelling of complex powder flow characteristics, including direct time-dependent consolidation effects and load-history-dependent cohesion, to describe the filling process of complex, difficult-to-handle powders. The model is demonstrated for simple flow and for an industrial powder flow.
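
    One simple way to realize load-history-dependent cohesion in a DEM contact law is to let the pull-off force grow with the largest compression the contact has experienced. The sketch below is a generic illustration of that idea, not the authors' model; the stiffness and cohesion parameters are assumed:

```python
k_n = 1e4      # normal stiffness (N/m), assumed
c0 = 1e-3      # base cohesion (N), assumed
beta = 5e2     # cohesion gained per unit of peak overlap (N/m), assumed

class Contact:
    def __init__(self):
        self.max_overlap = 0.0              # consolidation memory

    def normal_force(self, overlap):
        if overlap > self.max_overlap:
            self.max_overlap = overlap      # record the loading history
        elastic = k_n * max(overlap, 0.0)
        cohesion = c0 + beta * self.max_overlap
        # Net force: repulsive when compressed, attractive (up to the
        # history-dependent pull-off force) when separating.
        return elastic - cohesion

c = Contact()
for d in [1e-4, 5e-4, 2e-4, 0.0]:           # load, consolidate, unload
    print(f"overlap {d:.0e} m -> force {c.normal_force(d):+.3f} N")
```

    At full separation the contact still pulls with the cohesion accumulated at peak load, which is the history effect the abstract refers to.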

  15. Surface treatment and history-dependent corrosion in lead alloys

    Energy Technology Data Exchange (ETDEWEB)

    Li Ning [Los Alamos National Laboratory, Los Alamos, NM (United States)]. E-mail: ningli@lanl.gov; Zhang Jinsuo [Los Alamos National Laboratory, Los Alamos, NM (United States); Sencer, Bulent H. [Los Alamos National Laboratory, Los Alamos, NM (United States); Koury, Daniel [University of Nevada, Las Vegas, NV (United States)

    2006-06-23

    In oxygen-controlled lead and lead-bismuth eutectic (LBE), steel corrosion may be strongly history dependent. This is due to the competition between liquid metal dissolution corrosion and oxidation as a 'self-healing' protection barrier. Such effects can be observed in corrosion testing of materials given a variety of surface treatments, such as cold working, shot peening, and pre-oxidation. Shot peening of austenitic steels produces surface-layer microstructural damage and grain compression, which could contribute to increased Cr migration to the surface and enhance protection through an impervious oxide. Pre-oxidation under conditions different from operating ones may form more protective oxides, reduce oxygen and metal ion migration through the oxides, and achieve better protection for longer durations. Corrosion and oxidation modeling and analysis reveal the potential for significantly reducing long-term corrosion rates by initial and early-stage conditioning of steels for Pb/LBE service.

  16. Preference for the nearer of otherwise equivalent navigational goals quantifies behavioral motivation and natural selection.

    Directory of Open Access Journals (Sweden)

    Russell E Jackson

    Full Text Available Navigation and environmental perception precede most actions in mobile organisms. Navigation is based upon the fundamental assumption of a ubiquitous Preference for the Nearest of otherwise equivalent navigational goals (PfN). However, the magnitude of and triggers for PfN are unknown, and there has been no clear evidence that PfN exists. I tested for PfN in human participants on a retrieval task. The results of these experiments provide the first evidence for PfN. Further, these data quantify the three primary PfN triggers and provide an experimental structure for using PfN as a behavioral metric across domains. Surprisingly, PfN exists at a high, but not universal, magnitude. Further, PfN derives mostly from the absolute distance to the farthest of multiple goals (d_f), with little influence of the distance to the nearest goal (d_n). These data provide previously unavailable quantification of behavioral motivation across species and may provide a measurable index of selection. These methods hold particular import for behavioral modification because proximity is a powerful determinant of decision outcomes across most behaviors.

  17. History-dependent excitability as a single-cell substrate of transient memory for information discrimination.

    Directory of Open Access Journals (Sweden)

    Fabiano Baroni

    Full Text Available Neurons react differently to incoming stimuli depending upon their previous history of stimulation. This property can be considered as a single-cell substrate for transient memory, or context-dependent information processing: depending upon the current context that the neuron "sees" through the subset of the network impinging on it in the immediate past, the same synaptic event can evoke a postsynaptic spike or just a subthreshold depolarization. We propose a formal definition of History-Dependent Excitability (HDE) as a measure of the propensity to firing in any moment in time, linking the subthreshold history-dependent dynamics with spike generation. This definition allows the quantitative assessment of the intrinsic memory for different single-neuron dynamics and input statistics. We illustrate the concept of HDE by considering two general dynamical mechanisms: the passive behavior of an Integrate and Fire (IF) neuron, and the inductive behavior of a Generalized Integrate and Fire (GIF) neuron with subthreshold damped oscillations. This framework allows us to characterize the sensitivity of different model neurons to the detailed temporal structure of incoming stimuli. While a neuron with intrinsic oscillations discriminates equally well between input trains with the same or different frequency, a passive neuron discriminates better between inputs with different frequencies. This suggests that passive neurons are better suited to rate-based computation, while neurons with subthreshold oscillations are advantageous in a temporal coding scheme. We also address the influence of intrinsic properties in single-cell processing as a function of input statistics, and show that intrinsic oscillations enhance discrimination sensitivity at high input rates. Finally, we discuss how the recognition of these cell-specific discrimination properties might further our understanding of neuronal network computations and their relationships to the distribution and

  18. History-Dependent Excitability as a Single-Cell Substrate of Transient Memory for Information Discrimination

    Science.gov (United States)

    Baroni, Fabiano; Torres, Joaquín J.; Varona, Pablo

    2010-01-01

    Neurons react differently to incoming stimuli depending upon their previous history of stimulation. This property can be considered as a single-cell substrate for transient memory, or context-dependent information processing: depending upon the current context that the neuron “sees” through the subset of the network impinging on it in the immediate past, the same synaptic event can evoke a postsynaptic spike or just a subthreshold depolarization. We propose a formal definition of History-Dependent Excitability (HDE) as a measure of the propensity to firing in any moment in time, linking the subthreshold history-dependent dynamics with spike generation. This definition allows the quantitative assessment of the intrinsic memory for different single-neuron dynamics and input statistics. We illustrate the concept of HDE by considering two general dynamical mechanisms: the passive behavior of an Integrate and Fire (IF) neuron, and the inductive behavior of a Generalized Integrate and Fire (GIF) neuron with subthreshold damped oscillations. This framework allows us to characterize the sensitivity of different model neurons to the detailed temporal structure of incoming stimuli. While a neuron with intrinsic oscillations discriminates equally well between input trains with the same or different frequency, a passive neuron discriminates better between inputs with different frequencies. This suggests that passive neurons are better suited to rate-based computation, while neurons with subthreshold oscillations are advantageous in a temporal coding scheme. We also address the influence of intrinsic properties in single-cell processing as a function of input statistics, and show that intrinsic oscillations enhance discrimination sensitivity at high input rates. Finally, we discuss how the recognition of these cell-specific discrimination properties might further our understanding of neuronal network computations and their relationships to the distribution and functional
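
    The intuition behind history-dependent excitability can be reproduced with even the simplest model mentioned above. In the leaky integrate-and-fire sketch below (all parameters assumed, chosen only to make the effect visible), an identical synaptic pulse at t = 50 ms stays subthreshold on its own but triggers a spike when preceded by another pulse 5 ms earlier:

```python
# Toy leaky integrate-and-fire (LIF) neuron: whether an identical input
# evokes a spike depends on the inputs received just before it.
dt, tau, v_th, v_reset = 0.1, 10.0, 1.0, 0.0   # ms, ms, a.u., a.u.

def run(pulse_times, t_end=100.0):
    v, spikes = 0.0, []
    for step in range(int(t_end / dt)):
        t = step * dt
        v += dt * (-v / tau)                    # membrane leak
        if any(abs(t - tp) < dt / 2 for tp in pulse_times):
            v += 0.7                            # identical synaptic kick
        if v >= v_th:
            spikes.append(round(t, 1))
            v = v_reset
    return spikes

# Same test pulse at t = 50 ms, two different histories:
print(run([50.0]))            # no recent input -> subthreshold, no spike
print(run([45.0, 50.0]))      # a pulse 5 ms earlier -> the same input spikes
```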

  19. Factorial Structure and Validity of the Quantified Behavior Test Plus (Qb+©).

    Science.gov (United States)

    Hirsch, Oliver; Christiansen, Hanna

    2016-03-14

    This study investigates the factorial structure and validity of the Quantified Behavior Test Plus (Qb+©), a computerized test designed to objectively and independently evaluate the three attention-deficit/hyperactivity disorder core symptoms: hyperactivity, inattention, and impulsivity. Confirmatory and exploratory factor analyses were conducted with an outpatient sample of 773 subjects ≥12 years old. In a second sample of 297 patients ≥16 years, a multitrait-multimethod analysis was performed to examine concurrent and discriminant validity. The discriminative power of the Qb+ was investigated using a general linear model and logistic regression analysis. The three-factor structure (Hyperactivity, Inattention, Impulsivity) was verified in the confirmatory factor analysis. Fit indices demonstrated a good model fit, and factor loadings were almost all moderate to high. In the multitrait-multimethod analysis, the criterion for convergent validity was fulfilled. The discriminant validity of the Qb+ was partially supported. Significant but small gender and age effects were found. In the logistic regression analysis, omission errors and reaction time variability, belonging to the Inattention factor, were able to discriminate between subjects with and without attention-deficit/hyperactivity disorder. The internal structure of the Qb+ was verified. Its validity was partially supported. Results regarding discriminative power were mixed.
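
    As a rough illustration of the reported discrimination step, the sketch below fits a logistic regression with omission errors and reaction-time variability as predictors of diagnostic status. The data are synthetic stand-ins with invented effect sizes, not the Qb+ sample or the study's exact model.

```python
# Synthetic illustration of discriminating ADHD status from Qb+-style
# measures (omission errors, reaction-time variability). Effect sizes and
# data are invented; this is not the study's dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 300
adhd = rng.integers(0, 2, n)                # 0 = control, 1 = ADHD (synthetic)
omissions = rng.poisson(5 + 6 * adhd)       # assumed: more omissions in ADHD
rt_sd = rng.normal(120 + 60 * adhd, 25)     # assumed: higher RT variability in ADHD
X = np.column_stack([omissions, rt_sd])

model = LogisticRegression().fit(X, adhd)
print("coefficients (omissions, RT variability):", model.coef_[0])
print("training accuracy:", model.score(X, adhd))
```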

  20. History dependence of vital capacity in constricted lungs

    OpenAIRE

    Olson, Thomas P.; Wilson, Theodore A.; Johnson, Bruce D.; Hyatt, Robert E.

    2010-01-01

    Measurements of dynamic force-length behavior of maximally activated strips of smooth muscle during oscillatory length changes show that force decreases well below the isometric force during the shortening phase of the oscillation. The magnitude of the decrease depends on the rate of shortening; for slower shortening, the decrease is smaller and force is larger. Modeling of expiratory flow, based on these data, predicts that vital capacity in constricted lungs depends on the rate of expiration...

  1. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) exposed to copper, this paper illustrates techniques for quantifying behavioral responses using computer-assisted video and digital image analysis. In previous studies, severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired, to the extent that survival in the field would be jeopardized, as fish would be swept downstream or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures, to determine time to effect during early life stage development, and to understand the significance of these responses relative to the survival of these vulnerable early life stage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us understand the implications for population-level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.
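
    The abstract does not spell out the image-analysis pipeline, but a common way to quantify swimming activity from video is frame differencing. The sketch below computes, for each consecutive frame pair, the fraction of pixels whose intensity changes by more than a threshold; the frames, threshold, and activity index are all illustrative assumptions, not the authors' protocol.

```python
# Frame-differencing activity index: fraction of pixels changing by more
# than a (hypothetical) threshold between consecutive grayscale frames.
import numpy as np

def activity_index(frames, threshold=15):
    index = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = np.abs(curr.astype(int) - prev.astype(int))
        index.append((diff > threshold).mean())
    return np.array(index)

rng = np.random.default_rng(0)
base = rng.integers(0, 255, (120, 160)).astype(np.uint8)
still = [base] * 10                                          # an immobile fish
moving = [np.roll(base, 3 * k, axis=1) for k in range(10)]   # a swimming fish
print("still:", activity_index(still).mean())                # ~0
print("moving:", activity_index(moving).mean())              # close to 1
```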

  2. Managing What We Can Measure: Quantifying the Susceptibility of Automated Scoring Systems to Gaming Behavior

    Science.gov (United States)

    Higgins, Derrick; Heilman, Michael

    2014-01-01

    As methods for automated scoring of constructed-response items become more widely adopted in state assessments, and are used in more consequential operational configurations, it is critical that their susceptibility to gaming behavior be investigated and managed. This article provides a review of research relevant to how construct-irrelevant…

  3. History-Dependent Problems with Applications to Contact Models for Elastic Beams

    Energy Technology Data Exchange (ETDEWEB)

    Bartosz, Krzysztof; Kalita, Piotr; Migórski, Stanisław; Ochal, Anna, E-mail: ochal@ii.uj.edu.pl [Jagiellonian University, Faculty of Mathematics and Computer Science (Poland); Sofonea, Mircea [Université de Perpignan Via Domitia, Laboratoire de Mathématiques et Physique (France)

    2016-02-15

    We prove an existence and uniqueness result for a class of subdifferential inclusions which involve a history-dependent operator. Then we specialize this result in the study of a class of history-dependent hemivariational inequalities. Problems of this kind arise in a large number of mathematical models which describe quasistatic processes of contact. To provide an example we consider an elastic beam in contact with a reactive obstacle. The contact is modeled with a new and nonstandard condition which involves both the subdifferential of a nonconvex and nonsmooth function and a Volterra-type integral term. We derive a variational formulation of the problem which is in the form of a history-dependent hemivariational inequality for the displacement field. Then, we use our abstract result to prove its unique weak solvability. Finally, we consider a numerical approximation of the model, effectively solve the approximate problems, and provide numerical simulations.
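
    The Volterra-type integral term mentioned above is the prototypical history-dependent operator. A minimal numerical sketch, with an illustrative fading-memory kernel and input rather than anything taken from the paper:

```python
# (Su)(t_i) = integral_0^{t_i} b(t_i - s) u(s) ds, discretized with the
# trapezoidal rule. Kernel and input are illustrative, not from the paper.
import numpy as np

def history_operator(u, kernel, dt):
    out = np.zeros_like(u)
    for i in range(1, len(u)):
        s = np.arange(i + 1) * dt
        vals = kernel(s[-1] - s) * u[:i + 1]
        out[i] = dt * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
    return out

dt = 0.01
t = np.arange(0.0, 2.0, dt)
u = np.sin(2 * np.pi * t)                  # a displacement history (illustrative)
b = lambda r: np.exp(-5.0 * r)             # fading-memory kernel (illustrative)
S = history_operator(u, b, dt)
print(S[-1])   # the value now depends on the entire past of u, not just u(t)
```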

  4. Quantifying the yellow signal driver behavior based on naturalistic data from digital enforcement cameras.

    Science.gov (United States)

    Bar-Gera, H; Musicant, O; Schechtman, E; Ze'evi, T

    2016-11-01

    The yellow signal driver behavior, reflecting the dilemma zone behavior, is analyzed using naturalistic data from digital enforcement cameras. The key variable in the analysis is the entrance time after the yellow onset, and its distribution. This distribution can assist in determining two critical outcomes: the safety outcome related to red-light-running angle accidents, and the efficiency outcome. The connection to other approaches for evaluating the yellow signal driver behavior is also discussed. The dataset was obtained from 37 digital enforcement cameras at non-urban signalized intersections in Israel, over a period of nearly two years. The data contain more than 200 million vehicle entrances, of which 2.3% (∼5 million vehicles) entered the intersection during the yellow phase. In all non-urban signalized intersections in Israel the green phase ends with 3 s of flashing green, followed by 3 s of yellow. On most non-urban signalized roads in Israel the posted speed limit is 90 km/h. Our analysis focuses on crossings during the yellow phase and the first 1.5 s of the red phase. The analysis method consists of two stages. In the first stage we tested whether the frequency of crossings is constant at the beginning of the yellow phase. We found that the pattern was stable (i.e., the frequencies were constant) at 18 intersections, nearly stable at 13 intersections, and unstable at 6 intersections. In addition to the 6 intersections with unstable patterns, two other outlying intersections were excluded from subsequent analysis. Logistic regression models were fitted for each of the remaining 29 intersections. We examined both standard (exponential) logistic regression and four-parameter logistic regression. The results show a clear advantage for the former. The estimated parameters show that the time at which the frequency of crossing reduces to half ranges from 1.7 to 2.3 s after yellow onset. The duration of the reduction of the relative frequency from 0.9 to 0.1 ranged
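
    A hedged sketch of the second-stage fit: a standard (exponential-form) logistic curve for the relative crossing frequency as a function of entrance time after yellow onset, recovering the half-crossing time. The observations below are synthetic, not the camera data.

```python
# Fit a standard logistic decay for relative crossing frequency vs. entrance
# time after yellow onset. The observations are synthetic stand-ins.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, t_half, k):
    """Relative crossing frequency: 1 early in yellow, decaying to 0;
    t_half is the time at which it reaches 0.5."""
    return 1.0 / (1.0 + np.exp(k * (t - t_half)))

t = np.linspace(0.0, 4.5, 40)                            # s after yellow onset
rng = np.random.default_rng(1)
freq = logistic(t, 2.0, 3.0) + rng.normal(0, 0.02, t.size)

params, _ = curve_fit(logistic, t, freq, p0=[2.0, 1.0])
print("estimated t_half = %.2f s, steepness k = %.2f" % tuple(params))
```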

  5. Quantifying age-related differences in information processing behaviors when viewing prescription drug labels.

    Directory of Open Access Journals (Sweden)

    Raghav Prashant Sundar

    Full Text Available Adverse drug events (ADEs) are a significant problem in health care. While effective warnings have the potential to reduce the prevalence of ADEs, little is known about how patients access and use prescription labeling. We investigated the effectiveness of prescription warning labels (PWLs, small, colorful stickers applied at the pharmacy) in conveying warning information to two groups of patients (young adults and those 50+). We evaluated the early stages of information processing by tracking eye movements while participants interacted with prescription vials that had PWLs affixed to them. We later tested participants' recognition memory for the PWLs. During viewing, participants often failed to attend to the PWLs; this effect was more pronounced for older than younger participants. Older participants also performed worse on the subsequent memory test. However, when memory performance was conditionalized on whether or not the participant had fixated the PWL, these age-related differences in memory were no longer significant, suggesting that the difference in memory performance between groups was attributable to differences in attention rather than differences in memory encoding or recall. This is important because older adults are recognized to be at greater risk for ADEs. These data provide a compelling case that understanding consumers' attentive behavior is crucial to developing an effective labeling standard for prescription drugs.

  6. Quantifying Age-Related Differences in Information Processing Behaviors When Viewing Prescription Drug Labels

    Science.gov (United States)

    Sundar, Raghav Prashant; Becker, Mark W.; Bello, Nora M.; Bix, Laura

    2012-01-01

    Adverse drug events (ADEs) are a significant problem in health care. While effective warnings have the potential to reduce the prevalence of ADEs, little is known about how patients access and use prescription labeling. We investigated the effectiveness of prescription warning labels (PWLs, small, colorful stickers applied at the pharmacy) in conveying warning information to two groups of patients (young adults and those 50+). We evaluated the early stages of information processing by tracking eye movements while participants interacted with prescription vials that had PWLs affixed to them. We later tested participants’ recognition memory for the PWLs. During viewing, participants often failed to attend to the PWLs; this effect was more pronounced for older than younger participants. Older participants also performed worse on the subsequent memory test. However, when memory performance was conditionalized on whether or not the participant had fixated the PWL, these age-related differences in memory were no longer significant, suggesting that the difference in memory performance between groups was attributable to differences in attention rather than differences in memory encoding or recall. This is important because older adults are recognized to be at greater risk for ADEs. These data provide a compelling case that understanding consumers’ attentive behavior is crucial to developing an effective labeling standard for prescription drugs. PMID:22719955

  7. A new differential equations-based model for nonlinear history-dependent magnetic behaviour

    CERN Document Server

    Aktaa, J

    2000-01-01

    The paper presents a new kind of numerical model describing nonlinear magnetic behaviour. The model is formulated as a set of differential equations taking into account history-dependence phenomena such as magnetisation hysteresis as well as saturation effects. The capability of the model is demonstrated by carrying out comparisons between measurements and calculations.
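
    As a generic illustration of history-dependent magnetisation with saturation (not the differential-equation model proposed in the paper), the sketch below composes a rate-independent play operator, whose lag supplies the memory, with a saturating nonlinearity; the half-width, saturation level, and tanh shape are all invented.

```python
# Generic hysteresis element: a rate-independent play (backlash) operator
# composed with a saturating nonlinearity. Illustrative only; not the model
# in the paper. The lag of the play operator is what stores the history.
import numpy as np

def play(H, r=0.5):
    """Output follows the input only after it has moved by more than r."""
    w = np.zeros_like(H)
    for i in range(1, len(H)):
        w[i] = min(max(w[i - 1], H[i] - r), H[i] + r)
    return w

def magnetisation(H, Ms=1.0, r=0.5):
    return Ms * np.tanh(play(H, r))        # saturation at +/- Ms

H = np.concatenate([np.linspace(0, 3, 300),    # initial magnetisation curve
                    np.linspace(3, -3, 600),   # down-sweep
                    np.linspace(-3, 3, 600)])  # up-sweep
M = magnetisation(H)
i_down = 300 + np.argmin(np.abs(H[300:900]))   # H ~ 0 on the down-sweep
i_up = 900 + np.argmin(np.abs(H[900:]))        # H ~ 0 on the up-sweep
print("M(H=0): %.2f on down-sweep, %.2f on up-sweep" % (M[i_down], M[i_up]))
```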

  8. Tracing Mantle Plumes: Quantifying their Morphology and Behavior from Seismic Tomography

    Science.gov (United States)

    O'Farrell, K. A.; Eakin, C. M.; Jones, T. D.; Garcia, E.; Robson, A.; Mittal, T.; Lithgow-Bertelloni, C. R.; Jackson, M. G.; Lekic, V.; Rudolph, M. L.

    2016-12-01

    Hotspot volcanism provides a direct link between the deep mantle and the surface, but the location, depth and source of the mantle plumes that feed hotspots are highly controversial. In order to address this issue it is important to understand the journey along which plumes have travelled through the mantle. The general behavior of plumes in the mantle also has the potential to tell us about the vigor of mantle convection, net rotation of the mantle, the role of thermal versus chemical anomalies, and important bulk physical properties of the mantle such as the viscosity profile. To address these questions we developed an algorithm to trace plume-like features in shear-wave (Vs) seismic tomographic models based on picking local minima in velocity and searching for continuous features with depth. We apply this method to several of the latest tomographic models and can recover 30 or more continuous plume conduits that are >750 km long. Around half of these can be associated with a known hotspot at the surface. We study the morphology of these plume chains and find that the largest lateral deflections occur near the base of the lower mantle and in the upper mantle. We analyze the preferred orientation of the plume deflections and their gradient to infer large scale mantle flow patterns and the depth of viscosity contrasts in the mantle respectively. We also retrieve Vs profiles for our traced plumes and compare with velocity profiles predicted for different mantle adiabat temperatures. We use this to constrain the thermal anomaly associated with these plumes. This thermal anomaly is then converted to a density anomaly and an upwelling velocity is derived. We compare this to buoyancy fluxes calculated at the surface and use this in conjunction with our measured plume tilts/deflections to estimate the strength of the "mantle wind".
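
    A hedged sketch of the conduit-picking algorithm as described: find local Vs minima in each depth slice and greedily link minima in adjacent slices into vertically continuous chains. The synthetic "tomography" below is a flat background with one slow anomaly drifting laterally with depth; the grid size and linking tolerance are invented.

```python
import numpy as np

def local_minima(vs):
    """Grid points strictly lower than their four lateral neighbors."""
    c = vs[1:-1, 1:-1]
    mask = ((c < vs[:-2, 1:-1]) & (c < vs[2:, 1:-1]) &
            (c < vs[1:-1, :-2]) & (c < vs[1:-1, 2:]))
    return [tuple(int(i) for i in p) for p in np.argwhere(mask) + 1]

def trace_conduits(slices, max_shift=2.0):
    """Greedy nearest-neighbor linking of minima between adjacent depth slices."""
    chains = [[p] for p in local_minima(slices[0])]
    for vs in slices[1:]:
        candidates = local_minima(vs)
        for chain in chains:
            if not candidates:
                break
            d = [np.hypot(p[0] - chain[-1][0], p[1] - chain[-1][1])
                 for p in candidates]
            j = int(np.argmin(d))
            if d[j] <= max_shift:
                chain.append(candidates.pop(j))
    # keep only conduits that are continuous through every depth slice
    return [c for c in chains if len(c) == len(slices)]

slices = []
for k in range(10):                        # 10 depth slices, shallow to deep
    vs = np.zeros((30, 30))
    vs[12 + k // 3, 15] = -1.0             # slow anomaly deflecting with depth
    slices.append(vs)
conduits = trace_conduits(slices)
print("conduits found:", len(conduits), "| path:", conduits[0])
```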

  9. Using data from the Microsoft Kinect 2 to quantify upper limb behavior: a feasibility study.

    Science.gov (United States)

    Dehbandi, Behdad; Barachant, Alexandre; Harary, David; Long, John; Tsagaris, K Zoe; Bumanlag, Silverio; He, Victor; Putrino, David

    2016-09-05

    The objective of this study was to assess whether the novel application of a machine learning approach to data collected from the Microsoft Kinect 2 (MK2) could be used to classify differing levels of upper limb impairment. Twenty-four healthy subjects completed items of the Wolf Motor Function Test (WMFT), which is a clinically validated metric of upper limb function for stroke survivors. Subjects completed the WMFT 3 times: 1) as a healthy individual, 2) emulating mild impairment, and 3) emulating moderate impairment. An MK2 was positioned in front of participants, and collected kinematic data as they completed the WMFT. A classification framework, based on Riemannian geometry and the use of covariance matrices as feature representation of the MK2 data, was developed for these data, and its ability to successfully classify subjects as either "healthy", "mildly impaired" or "moderately impaired" was assessed. Mean accuracy for our classifier was 91.7%, with a specific accuracy breakdown of 100%, 83.3% and 91.7% for the "healthy", "mildly impaired" and "moderately impaired" conditions, respectively. We conclude that data from the MK2 is of sufficient quality to perform objective motor behavior classification in individuals with upper limb impairment. The data collection and analysis framework that we have developed has the potential to disrupt the field of clinical assessment. Future studies will focus on validating this protocol on large populations of individuals with actual upper limb impairments in order to create a toolkit that is clinically validated and available to the clinical community.
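
    A hedged sketch of the classification idea: represent each recording by the covariance matrix of its kinematic channels and classify with a nearest-centroid rule under the log-Euclidean Riemannian metric (one common choice; the paper's exact metric and pipeline may differ). The signals are synthetic stand-ins for MK2 joint trajectories, and the assumption that impairment reduces movement variance is purely illustrative.

```python
import numpy as np
from scipy.linalg import logm

def cov_feature(X):
    """X: (channels, samples) signals -> symmetric positive-definite covariance."""
    X = X - X.mean(axis=1, keepdims=True)
    return X @ X.T / X.shape[1] + 1e-6 * np.eye(X.shape[0])

rng = np.random.default_rng(7)
scales = {"healthy": 1.0, "mildly impaired": 0.6, "moderately impaired": 0.3}
train = {label: [cov_feature(rng.normal(0, s, (6, 200))) for _ in range(20)]
         for label, s in scales.items()}

# Class centroids are means in the matrix-log domain (log-Euclidean mean).
centroids = {label: sum(logm(C) for C in covs) / len(covs)
             for label, covs in train.items()}

test = cov_feature(rng.normal(0, 0.6, (6, 200)))   # an unseen "mild" trial
log_test = logm(test)
pred = min(centroids, key=lambda lbl: np.linalg.norm(log_test - centroids[lbl]))
print("predicted class:", pred)
```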

  10. An investigation of counter-current flow in porous media with history-dependent modeling

    Science.gov (United States)

    Li, G.; Grader, A. S.; Halleck, P. H.; Karpyn, Z. T.

    2003-04-01

    Counter-current fluid flow occurs in many reservoir processes. It is important to understand and model these processes in order to operate them effectively. Both drainage and imbibition processes exist simultaneously when counter-current flow occurs. It has thus proven difficult to model this type of flow, especially when fluid banks form. Previously, counter-current flow experiments have been done in glass bead packs and the spatial and temporal saturation distributions obtained with X-ray computed tomography (CT). In the current paper, a new saturation-history-dependent approach has been developed to simulate the experiments. Hysteresis in both capillary pressure and relative permeabilities is considered during the process of matching the simulation results to experimental data. Capillary pressure and relative permeabilities are extracted with the aid of a deterministic reservoir simulator. During the history matching process, a family of curves (called scanning curves) is constructed connecting the two branches of the capillary hysteresis loop. Each grid block of the sample is assigned a different scanning curve according to its saturation history. Simulation of the experiments reproduced two-dimensional saturation distributions over time with good accuracy. Similar results could not be obtained with traditional simulation using only one capillary pressure curve. History-dependent modeling successfully predicted cross-diameter counter-current flow in a cylindrical geometry. The parameters used in the single capillary pressure method are the average of the parameters used in the history-dependent method. Total effective mobility controls the flow process, being smaller in counter-current flow than in co-current flow. Experiments documented in the literature that exhibited formation of fluid banks were also successfully simulated. We anticipate that application of this method will improve the prediction of full-scale fluid flow processes such as ground water

  11. Quantifying insufficient coping behavior under chronic stress: a cross-cultural study of 1,303 students from Italy, Spain and Argentina.

    Science.gov (United States)

    Delfino, Juan P; Barragán, Elena; Botella, Cristina; Braun, Silke; Bridler, René; Camussi, Elisabetta; Chafrat, Verónica; Lott, Petra; Mohr, Christine; Moragrega, Inés; Papagno, Costanza; Sanchez, Susana; Seifritz, Erich; Soler, Carla; Stassen, Hans H

    2015-01-01

    The question of how to quantify insufficient coping behavior under chronic stress is of major clinical relevance. In fact, chronic stress increasingly dominates modern work conditions and can affect nearly every system of the human body, as suggested by physical, cognitive, affective and behavioral symptoms. Since freshmen students experience constantly high levels of stress due to tight schedules and frequent examinations, we carried out a 3-center study of 1,303 students from Italy, Spain and Argentina in order to develop socioculturally independent means for quantifying coping behavior. The data analysis relied on 2 self-report questionnaires: the Coping Strategies Inventory (COPE) for the assessment of coping behavior and the Zurich Health Questionnaire which assesses consumption behavior and general health dimensions. A neural network approach was used to determine the structural properties inherent in the COPE instrument. Our analyses revealed 2 highly stable, socioculturally independent scales that reflected basic coping behavior in terms of the personality traits activity-passivity and defeatism-resilience. This replicated previous results based on Swiss and US-American data. The percentage of students exhibiting insufficient coping behavior was very similar across the study sites (11.5-18.0%). Given their stability and validity, the newly developed scales enable the quantification of basic coping behavior in a cost-efficient and reliable way, thus clearing the way for the early detection of subjects with insufficient coping skills under chronic stress who may be at risk of physical or mental health problems.

  12. Critical current characteristics and history dependence in superconducting SmFeAsOF bulk

    Energy Technology Data Exchange (ETDEWEB)

    Ni, B; Ge, J [Department of Life, Environment and Materials Science, Fukuoka Institute of Technology, Fukuoka 811-0295 (Japan); Kiuchi, M; Otabe, E S [Faculty of Computer Science and Systems Engineering, Kyushu Institute of Technology, Iizuka 820-8502 (Japan); Gao, Z; Wang, L; Qi, Y; Zhang, X; Ma, Y, E-mail: nee@fit.ac.j [Key Laboratory of Applied Superconductivity, Institute of Electrical Engineering, Chinese Academy of Sciences, PO Box 2703, Beijing 100190 (China)

    2010-06-01

    The superconducting SmFeAsO{sub 1-x}F{sub x} (x=0.2) polycrystalline bulks were prepared by the powder-in-tube (PIT) method. The magnetic field and temperature dependences of critical current densities in the samples were investigated by resistive and ac inductive (Campbell's) methods. It was found that a fairly large shielding current density over 10{sup 9} A/m{sup 2}, which is considered to correspond to the local critical current density, flows locally with the perimeter size similar to the average grain size of the bulk samples, while an extremely low transport current density of about 10{sup 5} A/m{sup 2} corresponding to the global critical current density flows through the whole sample. Furthermore, a unique history dependence of global critical current density was observed, i.e., it shows a smaller value in the increasing-field process than that in the decreasing-field process. The history dependence of global critical current characteristic in our case can be ascribed to the existence of the weak-link property between the grains in SmFeAsO{sub 1-x}F{sub x} bulk.

  13. Critical current characteristics and history dependence in superconducting SmFeAsOF bulk

    Science.gov (United States)

    Ni, B.; Ge, J.; Kiuchi, M.; Otabe, E. S.; Gao, Z.; Wang, L.; Qi, Y.; Zhang, X.; Ma, Y.

    2010-06-01

    The superconducting SmFeAsO1-xFx (x=0.2) polycrystalline bulks were prepared by the powder-in-tube (PIT) method. The magnetic field and temperature dependences of critical current densities in the samples were investigated by resistive and ac inductive (Campbell's) methods. It was found that a fairly large shielding current density over 109 A/m2, which is considered to correspond to the local critical current density, flows locally with the perimeter size similar to the average grain size of the bulk samples, while an extremely low transport current density of about 105 A/m2 corresponding to the global critical current density flows through the whole sample. Furthermore, a unique history dependence of global critical current density was observed, i.e., it shows a smaller value in the increasing-field process than that in the decreasing-field process. The history dependence of global critical current characteristic in our case can be ascribed to the existence of the weak-link property between the grains in SmFeAsO1-xFx bulk.

  14. History-dependent Selection of Primary Dendritic Spacing in Directionally Solidified Alloy

    Institute of Scientific and Technical Information of China (English)

    Linlin WANG; Xin LIN; Guolu DING; Lilin WANG; Weidong HUANG

    2007-01-01

    Directional solidification experiments were carried out for a succinonitrile-1.0 wt pct acetone alloy with the orientation of the dendritic arrays not parallel to the direction of the temperature gradient. Experimental results show that there exists an allowable range of primary dendritic spacing under a given growth condition. The average primary spacing depends not only on the current growth conditions but also on the way by which the conditions were achieved. The upper limit of the allowable range becomes smaller in comparison with that with the direction of the dendrite arrays parallel to the direction of the temperature gradient, which means that the history-dependence of dendritic growth is weaker under this condition. The lower limit obtained is compared with a self-consistent model, which shows a good agreement with experimental results.

  15. Analysis of Cohesive Microsized Particle Packing Structure Using History-Dependent Contact Models

    CERN Document Server

    Tayeb, Raihan; Mao, Yijin; Zhang, Yuwen

    2016-01-01

    Granular packing structures of cohesive micro-sized particles with different sizes and size distributions, including mono-sized, uniform and Gaussian distributions, are investigated by using two different history-dependent contact models with the Discrete Element Method (DEM). The simulation is carried out in the framework of LIGGGHTS, a DEM simulation package extended from the granular package of the widely used open-source code LAMMPS. Contact force caused by translation and rotation, frictional and damping forces due to collision with other particles or container boundaries, cohesive force, van der Waals force, and gravity are considered. The radial distribution functions (RDFs), force distributions, porosities, and coordination numbers under cohesive and non-cohesive conditions are reported. The results indicate that particle size and size distribution have a great influence on the packing density for particle packing under cohesive effects: particles with a Gaussian distribution have the lowest packing...
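
    As a small illustration of one reported observable, the sketch below computes a radial distribution function g(r) from particle positions in a periodic cubic box, using the minimum-image convention. The random positions stand in for a DEM packing; this is not the LIGGGHTS workflow.

```python
import numpy as np

def rdf(pos, box, nbins=50):
    """g(r) for positions in a periodic cubic box of side `box`."""
    n = len(pos)
    rmax = box / 2
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)                 # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=nbins, range=(0, rmax))
    rho = n / box ** 3                           # number density
    shells = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    g = hist / (shells * rho * n / 2)            # normalize by ideal-gas pairs
    return 0.5 * (edges[1:] + edges[:-1]), g

rng = np.random.default_rng(0)
r_centers, g = rdf(rng.uniform(0.0, 10.0, (500, 3)), box=10.0)
print(np.round(g[20:25], 2))   # ideal-gas positions give g(r) ~ 1
```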

  16. Single DNA molecule jamming and history-dependent dynamics during motor-driven viral packaging

    Science.gov (United States)

    Keller, Nicholas; Grimes, Shelley; Jardine, Paul J.; Smith, Douglas E.

    2016-08-01

    In many viruses, molecular motors forcibly pack single DNA molecules to near-crystalline density into ~50-100 nm prohead shells. Unexpectedly, we found that packaging frequently stalls in conditions that induce net attractive DNA-DNA interactions. Here, we present findings suggesting that this stalling occurs because the DNA undergoes a nonequilibrium jamming transition analogous to that observed in many soft-matter systems, such as colloidal and granular systems. Experiments in which conditions are changed during packaging to switch DNA-DNA interactions between purely repulsive and net attractive reveal strongly history-dependent dynamics. An abrupt deceleration is usually observed before stalling, indicating that a transition in DNA conformation causes an abrupt increase in resistance. Our findings suggest that the concept of jamming can be extended to a single polymer molecule. However, compared with macroscopic samples of colloidal particles we find that single DNA molecules jam over a much larger range of densities. We attribute this difference to the nanoscale system size, consistent with theoretical predictions for jamming of attractive athermal particles.

  17. Thermal-history dependent magnetoelastic transition in (Mn,Fe){sub 2}(P,Si)

    Energy Technology Data Exchange (ETDEWEB)

    Miao, X. F., E-mail: x.f.miao@tudelft.nl; Dijk, N. H. van; Brück, E. [Fundamental Aspects of Materials and Energy, Faculty of Applied Sciences, Delft University of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Caron, L. [Fundamental Aspects of Materials and Energy, Faculty of Applied Sciences, Delft University of Technology, Mekelweg 15, 2629 JB Delft (Netherlands); Max Planck Institute for Chemical Physics of Solids, Nöthnitzer Straße 40, D-01187 Dresden (Germany); Gercsi, Z. [Blackett Laboratory, Department of Physics, Imperial College London, London SW7 2AZ (United Kingdom); CRANN and School of Physics, Trinity College Dublin, Dublin (Ireland); Daoud-Aladine, A. [ISIS Facility, Rutherford Appleton Laboratory, Chilton, Didcot, Oxfordshire OX11 0QX (United Kingdom)

    2015-07-27

    The thermal-history dependence of the magnetoelastic transition in (Mn,Fe){sub 2}(P,Si) compounds has been investigated using high-resolution neutron diffraction. As-prepared samples display a large difference in paramagnetic-ferromagnetic (PM-FM) transition temperature compared to cycled samples. The initial metastable state transforms into a lower-energy stable state when the as-prepared sample crosses the PM-FM transition for the first time. This additional transformation is irreversible around the transition temperature and increases the energy barrier which needs to be overcome through the PM-FM transition. Consequently, the transition temperature on first cooling is found to be lower than on subsequent cycles characterizing the so-called “virgin effect.” High-temperature annealing can restore the cycled sample to the high-temperature metastable state, which leads to the recovery of the virgin effect. A model is proposed to interpret the formation and recovery of the virgin effect.

  18. Understanding scaling through history-dependent processes with collapsing sample space

    Science.gov (United States)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-01-01

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf’s law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ∼ x^(−λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α=2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf’s law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as in fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without the recourse to multiplicative, preferential, or self-organized critical processes. PMID:25870294
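
    The SSR process itself is easy to simulate and makes the claimed Zipf behavior directly checkable. A minimal sketch of the noise-free case, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
N, runs = 1000, 20000
visits = np.zeros(N + 1)
for _ in range(runs):                   # many independent SSR cascades
    x = N
    while x > 1:
        visits[x] += 1
        x = rng.integers(1, x)          # jump to a strictly smaller state,
    visits[1] += 1                      # so the sample space shrinks each step

states = np.arange(1, N + 1)
p = visits[1:] / visits[1:].sum()
mask = p > 0
slope = np.polyfit(np.log(states[mask]), np.log(p[mask]), 1)[0]
print("fitted rank-distribution exponent:", round(slope, 2))   # close to -1
```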

  19. Universal quantifier derived from AFM analysis links cellular mechanical properties and cell-surface integration forces with microbial deposition and transport behavior.

    Science.gov (United States)

    Li, Yueyun; Wang, Xin; Onnis-Hayden, Annalisa; Wan, Kai-tak; Gu, April Z

    2014-01-01

    In this study, we employed AFM analysis combined with mathematical modeling for quantifying cell-surface contact mechanics and magnitude and range of cell-surface interaction forces for seven bacterial strains with a wide range of cell morphology, dimension, and surface characteristics. Comprehensive cell-surface characterization including surface charge, extracellular polymeric substance content, hydrophobicity, and cell-cell aggregation analyses were performed. Flow-through column tests were employed to determine the attachment efficiency and deposition-transport behavior of these bacterial strains. No statistically significant correlation between attachment efficiency and any single-cell surface property was identified. Single-cell characterization by atomic force microscopy (AFM) yielded the mechanical deformation and elastic modulus, penetration resistance to AFM probe penetration by cellular surface substances (CSS), range and magnitude of the repulsive-attractive intersurface forces, and geometry of each strain. We proposed and derived a universal dimensionless modified Tabor's parameter to integrate all these properties that account for their collective behavior. Results showed that the Tabor parameter derived from AFM analysis correlated well with experimentally determined attachment efficiency (α), which therefore is able to link microscale cell-surface properties with macroscale bacterial transport behavior. Results suggested that the AFM tests performed between a single cell and a surface captured the key quantities of the interactions between the cell and the surface that dictate overall cell attachment behavior. Tabor's parameter therefore can be potentially incorporated into the microbial transport model.

  20. Quantifying the Land-Atmosphere Coupling Behavior in Modern Reanalysis Products over the U.S. Southern Great Plains

    Science.gov (United States)

    Santanello, J. A.; Roundy, J. K.; Dirmeyer, P.

    2014-12-01

    The coupling of the land with the planetary boundary layer (PBL) on diurnal timescales is critical to regulating the strength of the connection between soil moisture and precipitation. To improve our understanding of land-atmosphere (L-A) interactions, recent studies have focused on the development of diagnostics to quantify the strength and accuracy of the land-PBL coupling at the process-level. In this paper, we apply a suite of local land-atmosphere coupling (LoCo) metrics to modern reanalysis (RA) products and observations during a 17-year period over the U.S. Southern Great Plains. Specifically, a range of diagnostics exploring the links between soil moisture, evaporation, PBL height, temperature, humidity, and precipitation are applied to the summertime monthly mean diurnal cycles of the North American Regional Reanalysis (NARR), Modern-Era Retrospective analysis for Research and Applications (MERRA), and Climate Forecast System Reanalysis (CFSR). Results show that CFSR is the driest and MERRA the wettest of the three RAs in terms of overall surface-PBL coupling. When compared against observations, CFSR has a significant dry bias that impacts all components of the land-PBL system. CFSR and NARR are more similar in terms of PBL dynamics and response to dry and wet extremes, while MERRA is more constrained in terms of evaporation and PBL variability. The implications for moist processes are also discussed, which warrants further investigation into the potential downstream impacts of land-PBL coupling on the diurnal cycle of clouds, convection, and precipitation. Lastly, the results are put into context of community investigations into drought assessment and predictability over the region and underscore that caution should be used when treating RAs as truth, as the coupled water and energy cycle representation in each can vary considerably.

  1. Educational Differences in Postmenopausal Breast Cancer - Quantifying Indirect Effects through Health Behaviors, Body Mass Index and Reproductive Patterns

    DEFF Research Database (Denmark)

    Hvidtfeldt, Ulla Arthur; Lange, Theis; Andersen, Ingelise;

    2013-01-01

    Studying mechanisms underlying social inequality in postmenopausal breast cancer is important for developing prevention strategies. We applied a new method enabling the decomposition of the effect of educational level on breast cancer incidence into indirect effects through reproductive patterns (parity and age at first birth), body mass index and health behavior (alcohol consumption, physical inactivity, and hormone therapy use). The study was based on a pooled cohort of 6 studies from the Copenhagen area including 33,562 women (1,733 breast cancer cases) aged 50-70 years at baseline. A high educational level compared to low was associated with 74 (95% CI 22-125) extra breast cancer cases per 100,000 person-years at risk. Of these, 26% (95% CI 14%-69%) could be attributed to alcohol consumption. Similar effects were observed for age at first birth (32%; 95% CI 10%-257%), parity (19%; 95% CI 10%-45%), and hormone therapy use (10%; 95% CI 6%-18%). Educational level modified the effect of physical activity on breast cancer: women of high educational level may be more vulnerable to physical inactivity compared to women of low educational level.

  2. The Risk of Repetition of Attempted Suicide Among Iranian Women with Psychiatric Disorders as Quantified by the Suicide Behaviors Questionnaire

    Directory of Open Access Journals (Sweden)

    Jalal Shakeri

    2015-05-01

    Full Text Available Objectives: The factors associated with repetition of attempted suicide are poorly categorized in the Iranian population. In this study, the prevalence of different psychiatric disorders among women who attempted suicide and the risk of repetition were assessed. Methods: Participants were women admitted to the Poisoning Emergency Hospital, Kermanshah University of Medical Sciences, following failed suicide attempts. Psychiatric disorders were diagnosed based on the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) symptom checklist. Risk of repetition was evaluated using the Suicide Behaviors Questionnaire-Revised (SBQ-R). Results: About 72% of individuals had an SBQ-R score >8 and were considered to be at high risk for repeated attempted suicide. Adjustment disorders were the most common psychiatric disorders (40.8%). However, the type of psychiatric disorder was not associated with the risk of repetition (p=0.320). Marital status, educational level, employment, substance use, history of suicide among family members, and motivation were not determinant factors for repetition of suicide attempt (p=0.220, 0.880, 0.220, 0.290, 0.350 and 0.270, respectively). Younger women tended to use violent methods of attempted suicide, such as self-cutting, whereas older individuals preferred consumption of poison (p<0.001). Drug overdose was more common among single and married women, whereas widows or divorcees preferred self-burning (p=0.004). Conclusion: About 72% of patients with failed suicide attempts were at high risk for repeated attempts. Age, marital status, and type of psychiatric disorder were the only determinants of suicide method. Adjustment disorders were the most common psychiatric disorders among Iranian women. However, this did not predict the risk of further attempts.

  3. Educational differences in postmenopausal breast cancer--quantifying indirect effects through health behaviors, body mass index and reproductive patterns.

    Science.gov (United States)

    Hvidtfeldt, Ulla Arthur; Lange, Theis; Andersen, Ingelise; Diderichsen, Finn; Keiding, Niels; Prescott, Eva; Sørensen, Thorkild I A; Tjønneland, Anne; Rod, Naja Hulvej

    2013-01-01

    Studying mechanisms underlying social inequality in postmenopausal breast cancer is important in order to develop prevention strategies. Standard methods for investigating indirect effects, by comparing crude models to adjusted, are often biased. We applied a new method enabling the decomposition of the effect of educational level on breast cancer incidence into indirect effects through reproductive patterns (parity and age at first birth), body mass index and health behavior (alcohol consumption, physical inactivity, and hormone therapy use). The study was based on a pooled cohort of 6 studies from the Copenhagen area including 33,562 women (1,733 breast cancer cases) aged 50-70 years at baseline. The crude absolute rate of breast cancer was 399 cases per 100,000 person-years. A high educational level compared to low was associated with 74 (95% CI 22-125) extra breast cancer cases per 100,000 person-years at risk. Of these, 26% (95% CI 14%-69%) could be attributed to alcohol consumption. Similar effects were observed for age at first birth (32%; 95% CI 10%-257%), parity (19%; 95%CI 10%-45%), and hormone therapy use (10%; 95% CI 6%-18%). Educational level modified the effect of physical activity on breast cancer. In conclusion, this analysis suggests that a substantial number of the excess postmenopausal breast cancer events among women with a high educational level compared to a low can be attributed to differences in alcohol consumption, use of hormone therapy, and reproductive patterns. Women of high educational level may be more vulnerable to physical inactivity compared to women of low educational level.

  4. Quantifying Concordance

    CERN Document Server

    Seehars, Sebastian; Amara, Adam; Refregier, Alexandre

    2015-01-01

    Quantifying the concordance between different cosmological experiments is important for testing the validity of theoretical models and systematics in the observations. In earlier work, we thus proposed the Surprise, a concordance measure derived from the relative entropy between posterior distributions. We revisit the properties of the Surprise and describe how it provides a general, versatile, and robust measure for the agreement between datasets. We also compare it to other measures of concordance that have been proposed for cosmology. As an application, we extend our earlier analysis and use the Surprise to quantify the agreement between WMAP 9, Planck 13 and Planck 15 constraints on the $\Lambda$CDM model. Using a principal component analysis in parameter space, we find that the large Surprise between WMAP 9 and Planck 13 (S = 17.6 bits, implying a deviation from consistency at 99.8% confidence) is due to a shift along a direction that is dominated by the amplitude of the power spectrum. The Surprise disa...
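
    The core quantity behind the Surprise is a relative entropy between posteriors. For one-dimensional Gaussian posteriors the KL divergence has a closed form, sketched below in bits; the numbers are illustrative, not the WMAP/Planck constraints.

```python
import numpy as np

def kl_gaussian_bits(mu1, sig1, mu2, sig2):
    """D_KL( N(mu1, sig1^2) || N(mu2, sig2^2) ), converted from nats to bits."""
    nats = (np.log(sig2 / sig1)
            + (sig1 ** 2 + (mu1 - mu2) ** 2) / (2 * sig2 ** 2) - 0.5)
    return nats / np.log(2)

# Two hypothetical 1-D posteriors on a power-spectrum amplitude:
print(kl_gaussian_bits(0.80, 0.02, 0.83, 0.02))   # shifted mean -> sizable KL
print(kl_gaussian_bits(0.80, 0.02, 0.80, 0.02))   # identical -> 0.0 bits
```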

  5. Bayesian deterministic decision making: A normative account of the operant matching law and heavy-tailed reward history dependency of choices

    Directory of Open Access Journals (Sweden)

    Hiroshi eSaito

    2014-03-01

    Full Text Available The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of respective alternatives and decide stochastically with the probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy or not. To answer this question, we propose several deterministic Bayesian decision making models that have certain incorrect beliefs about an environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that the model that has a belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, which is a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.

  6. Mechanical History Dependence in Carbon Black Suspensions for Flow Batteries: A Rheo-Impedance Study

    Science.gov (United States)

    2017-01-01

    We studied the effects of shear and its history on suspensions of carbon black (CB) in lithium ion battery electrolyte via simultaneous rheometry and electrical impedance spectroscopy. Ketjen black (KB) suspensions showed shear thinning and rheopexy and exhibited a yield stress. Shear step experiments revealed a two time scale response. The immediate effect of decreasing the shear rate is an increase in both viscosity and electronic conductivity. In a much slower secondary response, both quantities change in the opposite direction, leading to a reversal of the initial change in the conductivity. Stepwise increases in the shear rate lead to similar responses in the opposite direction. This remarkable behavior is consistent with a picture in which agglomerating KB particles can stick directly on contact, forming open structures, and then slowly interpenetrate and densify. The fact that spherical CB particles show the opposite slow response suggests that the fractal structure of the KB primary units plays an important role. A theoretical scheme was used to analyze the shear and time-dependent viscosity and conductivity. Describing the agglomerates as effective hard spheres with a fractal architecture and using an effective medium approximation for the conductivity, we found the changes in the derived suspension structure to be in agreement with our qualitative mechanistic picture. This behavior of KB in flow has consequences for the properties of the gel network that is formed immediately after the cessation of shear: both the yield stress and the electronic conductivity increase with the previously applied shear rate. Our findings thus have clear implications for the operation and filling strategies of semisolid flow batteries. PMID:28122184

  7. Temperature dependent behavior of thermal conductivity of sub-5 nm Ir film: Defect-electron scattering quantified by residual thermal resistivity

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Zhe; Xu, Zaoli; Xu, Shen [Department of Mechanical Engineering, Iowa State University, 2010 Black Engineering Building, Ames, Iowa 50011 (United States); Wang, Xinwei, E-mail: xwang3@iastate.edu [Department of Mechanical Engineering, Iowa State University, 2010 Black Engineering Building, Ames, Iowa 50011 (United States); School of Urban Development and Environmental Engineering, Shanghai Second Polytechnic University, Shanghai 201209 (China)

    2015-01-14

    By studying the temperature-dependent behavior (300 K down to 43 K) of electron thermal conductivity (κ) in a 3.2 nm-thin Ir film, we quantify the extremely confined defect-electron scatterings and isolate the intrinsic phonon-electron scattering that is shared by the bulk Ir. At low temperatures below 50 K, κ of the film has almost two orders of magnitude reduction from that of bulk Ir. The film has ∂κ/∂T > 0, while the bulk Ir has ∂κ/∂T < 0. We introduce a unified thermal resistivity (Θ = T/κ) to interpret these completely different κ ∼ T relations. It is found that the film and the bulk Ir share a very similar Θ ∼ T trend, while they have a different residual part (Θ{sub 0}) at 0 K limit: Θ{sub 0} ∼ 0 for the bulk Ir, and Θ{sub 0} = 5.5 m·K{sup 2}/W for the film. The Ir film and the bulk Ir have very close ∂Θ/∂T (75–290 K): 6.33 × 10{sup −3} m K/W for the film and 7.62 × 10{sup −3} m K/W for the bulk Ir. This strongly confirms the similar phonon-electron scattering in them. Therefore, the residual thermal resistivity provides an unprecedented way of quantitatively evaluating defect-electron scattering (Θ{sub 0}) in heat conduction. Moreover, the interfacial thermal conductance across the grain boundaries is found to be larger than that of the Al/Cu interface, and its value is proportional to temperature, largely due to the electron's specific heat. A unified interfacial thermal conductance is also defined and firmly supports this relation. Additionally, the electron reflection coefficient is found to be large (88%) and almost temperature independent.
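
    The unified-resistivity analysis reduces to forming Θ(T) = T/κ(T) and reading off the residual Θ0 as the intercept of a linear fit. A minimal sketch with data points fabricated to mirror the reported film values (illustrative, not measurements):

```python
import numpy as np

T = np.array([75.0, 125.0, 175.0, 225.0, 290.0])       # temperatures, K
# Synthetic film conductivity shaped to mirror the reported numbers:
kappa = T / (5.5 + 6.33e-3 * T)                        # W/(m K), illustrative

theta = T / kappa                                      # unified resistivity, m K^2/W
slope, theta0 = np.polyfit(T, theta, 1)
print("dTheta/dT = %.2e m K/W, residual Theta0 = %.2f m K^2/W" % (slope, theta0))
# Theta0 near zero would indicate bulk-like (defect-free) heat conduction.
```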

  8. Quantifying economic fluctuations

    Science.gov (United States)

    Stanley, H. Eugene; Nunes Amaral, Luis A.; Gabaix, Xavier; Gopikrishnan, Parameswaran; Plerou, Vasiliki

    2001-12-01

    This manuscript is a brief summary of a talk designed to address the question of whether two of the pillars of the field of phase transitions and critical phenomena (scale invariance and universality) can be useful in guiding research on interpreting empirical data on economic fluctuations. Using this conceptual framework as a guide, we empirically quantify the relation between trading activity, measured by the number of transactions N, and the price change G(t) for a given stock, over a time interval [t, t+Δt]. We relate the time-dependent standard deviation of price changes (volatility) to two microscopic quantities: the number of transactions N(t) in Δt and the variance W²(t) of the price changes for all transactions in Δt. We find that the long-ranged volatility correlations are largely due to those of N. We then argue that the tail exponent of the distribution of N is insufficient to account for the tail exponent of P{G > x}. Since N and W display only weak interdependency, our results show that the fat tails of the distribution P{G > x} arise from W. Finally, we review recent work on quantifying collective behavior among stocks by applying the conceptual framework of random matrix theory (RMT). RMT makes predictions for “universal” properties that do not depend on the interactions between the elements comprising the system, and deviations from RMT provide clues regarding system-specific properties. We compare the statistics of the cross-correlation matrix C, whose elements Cij are the correlation coefficients of price fluctuations of stocks i and j, against a random matrix having the same symmetry properties. It is found that RMT methods can distinguish random and non-random parts of C. The non-random part of C, which deviates from RMT results, provides information regarding genuine collective behavior among stocks. We also discuss results that are reminiscent of phase transitions in spin systems, where the divergent behavior of the response function at

  9. The effect of the nonlinear velocity and history dependencies of the aerodynamic force on the dynamic response of a rotating wind turbine blade

    Science.gov (United States)

    van der Male, Pim; van Dalen, Karel N.; Metrikine, Andrei V.

    2016-11-01

    Existing models for the analysis of offshore wind turbines account for the aerodynamic action on the turbine rotor in detail, requiring a high computational price. When considering the foundation of an offshore wind turbine, however, a reduced rotor model may be sufficient. To define such a model, the significance of the nonlinear velocity and history dependency of the aerodynamic force on a rotating blade should be known. Aerodynamic interaction renders the dynamics of a rotating blade in an ambient wind field nonlinear in terms of the dependency on the wind velocity relative to the structural motion. Moreover, the development in time of the aerodynamic force does not follow the flow velocity instantaneously, implying a history dependency. In addition, both the non-uniform blade geometry and the aerodynamic interaction couple the blade motions in and out of the rotational plane. Therefore, this study presents the Euler-Bernoulli formulation of a twisted rotating blade connected to a rigid hub, excited by either instantaneous or history-dependent aerodynamic forces. On this basis, the importance of the history dependency is determined. Moreover, to assess the nonlinear contributions, both models are linearized. The structural response is computed for a stand-still and a rotating blade, based on the NREL 5-MW turbine. To this end, the model is reduced on the basis of its first three free-vibration mode shapes. Blade tip response predictions, computed from turbulent excitation, correctly account for both modal and directional couplings, and the added damping resulting from the dependency of the aerodynamic force on the structural motion. Considering the deflection of the blade tip, the history-dependent and the instantaneous force models perform equally well, providing a basis for the potential use of the instantaneous model for the rotor reduction. The linearized instantaneous model provides similar results for the rotating blade, indicating its potential

  10. Technical note: Quantifying and characterizing behavior in dairy calves using the IceTag automatic recording device

    DEFF Research Database (Denmark)

    Trénel, P.; Jensen, Margit Bak; Decker, Erik Luc

    2009-01-01

    The objectives of the current study were 1) to validate the IceTag ( http://www.icerobotics.com/ ) automatic recording device for measuring lying, standing, and moving behavior in dairy calves, and 2) to improve the information yield from this device by applying a filtering procedure allowing...... for the detection of lying versus upright. The IceTag device provides measures of intensity (I) of lying, standing, and activity measured as percent lying, percent standing, and percent active, but does not directly measure lying, standing, and moving behavior because body movements occurring while lying (e...... (LPC) was established empirically, and IceTag data were filtered according to the LPC, providing information on the posture of the animal as lying versus being upright. Third, a new threshold of I was estimated for moving activity conditional on the animal being upright. IceTag recordings from 9 calves...

  11. Calculating radiation exposures during use of (14)C-labeled nutrients, food components, and biopharmaceuticals to quantify metabolic behavior in humans.

    Science.gov (United States)

    Kim, Seung-Hyun; Kelly, Peter B; Clifford, Andrew J

    2010-04-28

    (14)C has long been used as a tracer for quantifying the in vivo human metabolism of food components, biopharmaceuticals, and nutrients. Minute amounts (nutrients to be organized into models suitable for quantitative hypothesis testing and determination of metabolic parameters. In vivo models are important for specification of intake levels for food components, biopharmaceuticals, and nutrients. Accurate estimation of the radiation exposure from ingested (14)C is an essential component of the experimental design. Therefore, this paper illustrates the calculation involved in determining the radiation exposure from a minute dose of orally administered (14)C-beta-carotene, (14)C-alpha-tocopherol, (14)C-lutein, and (14)C-folic acid from four prior experiments. The administered doses ranged from 36 to 100 nCi, and radiation exposure ranged from 0.12 to 5.2 microSv to whole body and from 0.2 to 3.4 microSv to liver with consideration of tissue weighting factor and fractional nutrient. In comparison, radiation exposure experienced during a 4 h airline flight across the United States at 37000 ft was 20 microSv.
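
    The basic dose arithmetic is a unit conversion followed by multiplication with a dose coefficient. In the sketch below, the ingestion coefficient for (14)C (5.8e-10 Sv/Bq, an ICRP-style value) is an assumption for illustration; the paper's calculations additionally apply tissue weighting and fractional-nutrient factors.

```python
NCI_TO_BQ = 37.0                          # 1 nCi = 37 Bq (exact conversion)
DOSE_COEFF = 5.8e-10                      # Sv/Bq, assumed ingestion coefficient

def dose_microsv(activity_nci, coeff=DOSE_COEFF):
    return activity_nci * NCI_TO_BQ * coeff * 1e6    # Sv -> microSv

for nci in (36, 100):                     # the administered-dose range above
    print("%d nCi -> %.2f microSv committed effective dose"
          % (nci, dose_microsv(nci)))
```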

  12. Electricity and Water Conservation on College and University Campuses in Response to National Competitions among Dormitories: Quantifying Relationships between Behavior, Conservation Strategies and Psychological Metrics.

    Science.gov (United States)

    Petersen, John E; Frantz, Cynthia M; Shammin, Md Rumi; Yanisch, Tess M; Tincknell, Evan; Myers, Noel

    2015-01-01

    "Campus Conservation Nationals" (CCN) is a recurring, nation-wide electricity and water-use reduction competition among dormitories on college campuses. We conducted a two year empirical study of the competition's effects on resource consumption and the relationship between conservation, use of web technology and various psychological measures. Significant reductions in electricity and water use occurred during the two CCN competitions examined (n = 105,000 and 197,000 participating dorm residents respectively). In 2010, overall reductions during the competition were 4% for electricity and 6% for water. The top 10% of dorms achieved 28% and 36% reductions in electricity and water respectively. Participation was larger in 2012 and reductions were slightly smaller (i.e. 3% electricity). The fact that no seasonal pattern in electricity use was evident during non-competition periods suggests that results are attributable to the competition. Post competition resource use data collected in 2012 indicates that conservation behavior was sustained beyond the competition. Surveys were used to assess psychological and behavioral responses (n = 2,900 and 2,600 in 2010 and 2012 respectively). Electricity reductions were significantly correlated with: web visitation, specific conservation behaviors, awareness of the competition, motivation and sense of empowerment. However, participants were significantly more motivated than empowered. Perceived benefits of conservation were skewed towards global and future concerns while perceived barriers tended to be local. Results also suggest that competitions may be useful for "preaching beyond the choir"-engaging those who might lack prior intrinsic or political motivation. Although college life is distinct, certain conclusions related to competitions, self-efficacy, and motivation and social norms likely extend to other residential settings.

  13. Electricity and Water Conservation on College and University Campuses in Response to National Competitions among Dormitories: Quantifying Relationships between Behavior, Conservation Strategies and Psychological Metrics.

    Directory of Open Access Journals (Sweden)

    John E Petersen

    Full Text Available "Campus Conservation Nationals" (CCN is a recurring, nation-wide electricity and water-use reduction competition among dormitories on college campuses. We conducted a two year empirical study of the competition's effects on resource consumption and the relationship between conservation, use of web technology and various psychological measures. Significant reductions in electricity and water use occurred during the two CCN competitions examined (n = 105,000 and 197,000 participating dorm residents respectively. In 2010, overall reductions during the competition were 4% for electricity and 6% for water. The top 10% of dorms achieved 28% and 36% reductions in electricity and water respectively. Participation was larger in 2012 and reductions were slightly smaller (i.e. 3% electricity. The fact that no seasonal pattern in electricity use was evident during non-competition periods suggests that results are attributable to the competition. Post competition resource use data collected in 2012 indicates that conservation behavior was sustained beyond the competition. Surveys were used to assess psychological and behavioral responses (n = 2,900 and 2,600 in 2010 and 2012 respectively. Electricity reductions were significantly correlated with: web visitation, specific conservation behaviors, awareness of the competition, motivation and sense of empowerment. However, participants were significantly more motivated than empowered. Perceived benefits of conservation were skewed towards global and future concerns while perceived barriers tended to be local. Results also suggest that competitions may be useful for "preaching beyond the choir"-engaging those who might lack prior intrinsic or political motivation. Although college life is distinct, certain conclusions related to competitions, self-efficacy, and motivation and social norms likely extend to other residential settings.

  14. Quantifiers, Anaphora and Intensionality

    CERN Document Server

    Dalrymple, Mary; Lamping, John; Pereira, Fernando; Saraswat, Vijay

    1995-01-01

    The relationship between Lexical-Functional Grammar (LFG) functional structures (f-structures) for sentences and their semantic interpretations can be expressed directly in a fragment of linear logic in a way that correctly explains the constrained interactions between quantifier scope ambiguity, bound anaphora and intensionality. This deductive approach to semantic interpretation obviates the need for additional mechanisms, such as Cooper storage, to represent the possible scopes of a quantified NP, and explains the interactions between quantified NPs, anaphora and intensional verbs such as `seek'. A single specification in linear logic of the argument requirements of intensional verbs is sufficient to derive the correct reading predictions for intensional-verb clauses both with nonquantified and with quantified direct objects. In particular, both de dicto and de re readings are derived for quantified objects. The effects of type-raising or quantifying-in rules in other frameworks here just follow as li...

  15. Quantifying the Behavioral Response of Spawning Chum Salmon to Elevated Discharges from Bonneville Dam, Columbia River : Annual Report 2005-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Tiffan, Kenneth F.; Haskell, Craig A.; Kock, Tobias J.

    2008-12-01

    In unimpounded rivers, Pacific salmon (Oncorhynchus spp.) typically spawn under relatively stable stream flows, with exceptions occurring during periodic precipitation events. In contrast, hydroelectric development has often resulted in an artificial hydrograph characterized by rapid changes in discharge and tailwater elevation that occur on a daily, or even an hourly basis, due to power generation (Cushman 1985; Moog 1993). Consequently, populations of Pacific salmon that are known to spawn in main-stem habitats below hydroelectric dams face the risks of changing habitat suitability, potential redd dewatering, and uncertain spawning success (Hamilton and Buell 1976; Chapman et al. 1986; Dauble et al. 1999; Garland et al. 2003; Connor and Pflug 2004; McMichael et al. 2005). Although the direct effects of a variable hydrograph, such as redd dewatering are apparent, specific effects on spawning behavior remain largely unexplored. Chum salmon (O. keta) that spawn below Bonneville Dam on the Columbia River are particularly vulnerable to the effects of water level fluctuations. Although chum salmon generally spawn in smaller tributaries (Johnson et al. 1997), many fish spawn in main-stem habitats below Bonneville Dam near Ives Island (Tomaro et al. 2007; Figure 1). The primary spawning area near Ives Island is shallow and sensitive to changes in water level caused by hydroelectric power generation at Bonneville Dam. In the past, fluctuating water levels have dewatered redds and changed the amount of available spawning habitat (Garland et al. 2003). To minimize these effects, fishery managers attempt to maintain a stable tailwater elevation at Bonneville Dam of 3.5 m (above mean sea level) during spawning, which ensures adequate water is provided to the primary chum salmon spawning area below the mouth of Hamilton Creek (Figure 1). Given the uncertainty of winter precipitation and water supply, this strategy has been effective at restricting spawning to a specific

  16. Decomposing generalized quantifiers

    NARCIS (Netherlands)

    Westerståhl, D.

    2008-01-01

    This note explains the circumstances under which a type <1> quantifier can be decomposed into a type <1, 1> quantifier and a set, by fixing the first argument of the former to the latter. The motivation comes from the semantics of Noun Phrases (also called Determiner Phrases) in natural languages,

  17. Decomposing generalized quantifiers

    NARCIS (Netherlands)

    Westerståhl, D.

    2008-01-01

    This note explains the circumstances under which a type <1> quantifier can be decomposed into a type <1, 1> quantifier and a set, by fixing the first argument of the former to the latter. The motivation comes from the semantics of Noun Phrases (also called Determiner Phrases) in natural languages,

  18. Understanding quantifiers in language

    NARCIS (Netherlands)

    Szymanik, J.; Zajenkowski, M.; Taatgen, N.; van Rijn, H.

    2009-01-01

    We compare the time needed for understanding different types of quantifiers. We show that the computational distinction between quantifiers recognized by finite automata and push-down automata is psychologically relevant. Our research improves upon the hypotheses and explanatory power of recent neuroimaging

  19. Connected Car: Quantified Self becomes Quantified Car

    Directory of Open Access Journals (Sweden)

    Melanie Swan

    2015-02-01

    Full Text Available The automotive industry could be facing a situation of profound change and opportunity in the coming decades. There are a number of influencing factors such as increasing urban and aging populations, self-driving cars, 3D parts printing, energy innovation, and new models of transportation service delivery (Zipcar, Uber). The connected car means that vehicles are now part of the connected world, continuously Internet-connected, generating and transmitting data, which on the one hand can be helpfully integrated into applications, like real-time traffic alerts broadcast to smartwatches, but also raises security and privacy concerns. This paper explores the automotive connected world, and describes five killer QS (Quantified Self) auto sensor applications that link quantified-self sensors (sensors that measure the personal biometrics of individuals, like heart rate) and automotive sensors (sensors that measure driver and passenger biometrics or quantitative automotive performance metrics like speed and braking activity). The applications are fatigue detection, real-time assistance for parking and accidents, anger management and stress reduction, keyless authentication and digital identity verification, and DIY diagnostics. These kinds of applications help to demonstrate the benefit of connected world data streams in the automotive industry and beyond where, more fundamentally for human progress, the automation of both physical and now cognitive tasks is underway.

  20. Quantifying linguistic coordination

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Tylén, Kristian

    Language has been defined as a social coordination device (Clark 1996) enabling innovative modalities of joint action. However, the exact coordinative dynamics over time and their effects are still insufficiently investigated and quantified. Relying on the data produced in a collective decision-making task, we employ nominal recurrence analysis (Orsucci et al 2005, Dale et al 2011) on the decision-making conversations between the participants. We report strong correlations between various indexes of recurrence and collective performance. We argue this method allows us to quantify the qualities...

  1. Quantifying synergistic mutual information

    CERN Document Server

    Griffith, Virgil

    2012-01-01

    Quantifying cooperation among random variables in predicting a single target random variable is an important problem in many biological systems with 10s to 1000s of co-dependent variables. We review the prior literature of information theoretical measures of synergy and introduce a novel synergy measure, entitled *synergistic mutual information* and compare it against the three existing measures of cooperation. We apply all four measures against a suite of binary circuits to demonstrate our measure alone quantifies the intuitive concept of synergy across all examples.
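
    As a minimal sketch of the kind of comparison described above, the snippet below applies the whole-minus-sum baseline (one of the prior measures reviewed, not the paper's new measure) to the canonical XOR circuit, where it assigns one full bit of synergy. The distribution construction and function names are our own illustration.

```python
# Sketch: the "whole-minus-sum" synergy baseline on an XOR circuit.
import itertools
import math

def mutual_information(pxy):
    """I(X;Y) in bits from a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# XOR: X1, X2 uniform bits, Y = X1 ^ X2.
joint = {((x1, x2), x1 ^ x2): 0.25
         for x1, x2 in itertools.product((0, 1), repeat=2)}
i_whole = mutual_information(joint)  # I(X1,X2 ; Y) = 1 bit

marg1, marg2 = {}, {}
for ((x1, x2), y), p in joint.items():
    marg1[(x1, y)] = marg1.get((x1, y), 0.0) + p
    marg2[(x2, y)] = marg2.get((x2, y), 0.0) + p
i_parts = mutual_information(marg1) + mutual_information(marg2)  # 0 bits for XOR

print(f"whole-minus-sum synergy: {i_whole - i_parts:.3f} bits")  # -> 1.000
```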

  2. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the realtime domain start to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  3. Behaviorism

    National Research Council Canada - National Science Library

    Moore, J

    2011-01-01

    .... Watson proposed an alternative: classical S-R behaviorism. According to Watson, behavior was a subject matter in its own right, to be studied by the observational methods common to all sciences...

  4. Behaviorism

    Science.gov (United States)

    Moore, J.

    2011-01-01

    Early forms of psychology assumed that mental life was the appropriate subject matter for psychology, and introspection was an appropriate method to engage that subject matter. In 1913, John B. Watson proposed an alternative: classical S-R behaviorism. According to Watson, behavior was a subject matter in its own right, to be studied by the…

  5. Behaviorism

    Science.gov (United States)

    Moore, J.

    2011-01-01

    Early forms of psychology assumed that mental life was the appropriate subject matter for psychology, and introspection was an appropriate method to engage that subject matter. In 1913, John B. Watson proposed an alternative: classical S-R behaviorism. According to Watson, behavior was a subject matter in its own right, to be studied by the…

  6. On Quantifying Semantic Information

    Directory of Open Access Journals (Sweden)

    Simon D’Alfonso

    2011-01-01

    Full Text Available The purpose of this paper is to look at some existing methods of semantic information quantification and suggest some alternatives. It begins with an outline of Bar-Hillel and Carnap’s theory of semantic information before going on to look at Floridi’s theory of strongly semantic information. The latter then serves to initiate an in-depth investigation into the idea of utilising the notion of truthlikeness to quantify semantic information. Firstly, a couple of approaches to measure truthlikeness are drawn from the literature and explored, with a focus on their applicability to semantic information quantification. Secondly, a similar but new approach to measure truthlikeness/information is presented and some supplementary points are made.

  7. Quantifying the adaptive cycle

    Science.gov (United States)

    Angeler, David G.; Allen, Craig R.; Garmestani, Ahjond S.; Gunderson, Lance H.; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994–2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about reorganisation (spring and summer blooms comprise distinct community states), conservatism (community trajectories during individual adaptive cycles are conservative), and adaptation (phytoplankton species during blooms change in the long term). All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  8. Quantifying traffic exposure.

    Science.gov (United States)

    Pratt, Gregory C; Parson, Kris; Shinoda, Naomi; Lindgren, Paula; Dunlap, Sara; Yawn, Barbara; Wollan, Peter; Johnson, Jean

    2014-01-01

    Living near traffic adversely affects health outcomes. Traffic exposure metrics include distance to high-traffic roads, traffic volume on nearby roads, traffic within buffer distances, measured pollutant concentrations, land-use regression estimates of pollution concentrations, and others. We used Geographic Information System software to explore a new approach using traffic count data and a kernel density calculation to generate a traffic density surface with a resolution of 50 m. The density value in each cell reflects all the traffic on all the roads within the distance specified in the kernel density algorithm. The effect of a given roadway on the raster cell value depends on the amount of traffic on the road segment, its distance from the raster cell, and the form of the algorithm. We used a Gaussian algorithm in which traffic influence became insignificant beyond 300 m. This metric integrates the deleterious effects of traffic rather than focusing on one pollutant. The density surface can be used to impute exposure at any point, and it can be used to quantify integrated exposure along a global positioning system route. The traffic density calculation compares favorably with other metrics for assessing traffic exposure and can be used in a variety of applications.
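
    The kernel-density surface described above can be sketched in a few lines. The 50 m cells and 300 m cutoff come from the abstract; the grid origin, the Gaussian bandwidth, and the toy traffic counts are our illustrative assumptions.

```python
# Sketch: accumulate Gaussian-weighted traffic from road-segment points
# onto a 50 m raster; influence is zeroed beyond the 300 m cutoff.
import numpy as np

def traffic_density(points, counts, xmin, ymin, ncols, nrows,
                    cell=50.0, sigma=100.0, cutoff=300.0):
    """points: (N, 2) segment coordinates in meters; counts: (N,) traffic volumes."""
    xs = xmin + cell * (np.arange(ncols) + 0.5)   # cell-center coordinates
    ys = ymin + cell * (np.arange(nrows) + 0.5)
    gx, gy = np.meshgrid(xs, ys)
    density = np.zeros((nrows, ncols))
    for (px, py), volume in zip(points, counts):
        d2 = (gx - px) ** 2 + (gy - py) ** 2
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        w[d2 > cutoff ** 2] = 0.0                 # insignificant beyond 300 m
        density += volume * w
    return density

# Toy example: two road points with different traffic volumes.
pts = np.array([[500.0, 500.0], [900.0, 500.0]])
volumes = np.array([20000.0, 5000.0])
surface = traffic_density(pts, volumes, 0.0, 0.0, ncols=20, nrows=20)
print(surface.max(), surface[10, 10])
```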

  9. Quantifying loopy network architectures.

    Directory of Open Access Journals (Sweden)

    Eleni Katifori

    Full Text Available Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes from the metric topology (connectivity and edge weight and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.

  10. Uncertainty quantified trait predictions

    Science.gov (United States)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state-of-the-art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.

  11. Quantifying innovation in surgery.

    Science.gov (United States)

    Hughes-Hallett, Archie; Mayer, Erik K; Marcus, Hani J; Cundy, Thomas P; Pratt, Philip J; Parston, Greg; Vale, Justin A; Darzi, Ara W

    2014-08-01

    The objectives of this study were to assess the applicability of patents and publications as metrics of surgical technology and innovation; evaluate the historical relationship between patents and publications; and develop a methodology that can be used to determine the rate of innovation growth in any given health care technology. The study of health care innovation represents an emerging academic field, yet it is limited by a lack of valid scientific methods for quantitative analysis. This article explores and cross-validates 2 innovation metrics using surgical technology as an exemplar. Electronic patenting databases and the MEDLINE database were searched between 1980 and 2010 for "surgeon" OR "surgical" OR "surgery." Resulting patent codes were grouped into technology clusters. Growth curves were plotted for these technology clusters to establish the rate and characteristics of growth. The initial search retrieved 52,046 patents and 1,801,075 publications. The top performing technology cluster of the last 30 years was minimally invasive surgery. Robotic surgery, surgical staplers, and image guidance were the most emergent technology clusters. When examining the growth curves for these clusters they were found to follow an S-shaped pattern of growth, with the emergent technologies lying on the exponential phases of their respective growth curves. In addition, publication and patent counts were closely correlated in areas of technology expansion. This article demonstrates the utility of publicly available patent and publication data to quantify innovations within surgical technology and proposes a novel methodology for assessing and forecasting areas of technological innovation.
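
    As a rough illustration of the growth-curve step, one can fit a logistic (S-shaped) curve to a cluster's cumulative patent counts. The data below are synthetic and the fitting choices are ours, not the authors' exact methodology.

```python
# Sketch: fit a logistic growth curve to yearly cumulative patent counts.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Cumulative count: plateau K, growth rate r, inflection year t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(1980, 2011)
counts = (logistic(years, K=5000, r=0.35, t0=2000)
          + np.random.default_rng(0).normal(0, 50, years.size))  # made-up data

popt, _ = curve_fit(logistic, years, counts,
                    p0=(counts.max(), 0.1, years.mean()))
K, r, t0 = popt
print(f"plateau ~{K:.0f} patents, inflection ~{t0:.0f}")
# A cluster still on its exponential phase has t0 beyond the last observed year.
```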

  12. Quantifying uncertainty from material inhomogeneity.

    Energy Technology Data Exchange (ETDEWEB)

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  13. An opinion-driven behavioral dynamics model for addictive behaviors

    Science.gov (United States)

    Moore, Thomas W.; Finley, Patrick D.; Apelberg, Benjamin J.; Ambrose, Bridget K.; Brodsky, Nancy S.; Brown, Theresa J.; Husten, Corinne; Glass, Robert J.

    2015-04-01

    We present a model of behavioral dynamics that combines a social network-based opinion dynamics model with behavioral mapping. The behavioral component is discrete and history-dependent to represent situations in which an individual's behavior is initially driven by opinion and later constrained by physiological or psychological conditions that serve to maintain the behavior. Individuals are modeled as nodes in a social network connected by directed edges. Parameter sweeps illustrate model behavior and the effects of individual parameters and parameter interactions on model results. Mapping a continuous opinion variable into a discrete behavioral space induces clustering on directed networks. Clusters provide targets of opportunity for influencing the network state; however, the smaller the network the greater the stochasticity and potential variability in outcomes. This has implications both for behaviors that are influenced by close relationships versus those influenced by societal norms and for the effectiveness of strategies for influencing those behaviors.
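
    A minimal sketch of this model class follows, under our own illustrative parameters (network density, update weights, and adoption/quitting thresholds): continuous opinions diffuse over a directed network, while behavior switches discretely with hysteresis, which is what makes it history-dependent.

```python
# Sketch: opinion diffusion on a directed network plus a hysteretic,
# history-dependent opinion-to-behavior mapping. Thresholds and weights
# are illustrative assumptions, not the paper's calibrated values.
import numpy as np

rng = np.random.default_rng(1)
n, steps = 50, 200
adj = (rng.random((n, n)) < 0.1).astype(float)    # adj[i, j] = 1: j influences i
np.fill_diagonal(adj, 0.0)
opinion = rng.uniform(-1, 1, n)
behavior = (opinion > 0.5).astype(int)            # 1 = engaging in the behavior

adopt, quit_ = 0.5, -0.2                          # hysteresis: quit well below adopt
for _ in range(steps):
    indeg = adj.sum(axis=1)
    neighbor_mean = np.where(indeg > 0,
                             adj @ opinion / np.maximum(indeg, 1), opinion)
    opinion = 0.9 * opinion + 0.1 * neighbor_mean
    # Current behavior determines which threshold applies (history dependence).
    behavior = np.where(behavior == 1,
                        (opinion > quit_).astype(int),
                        (opinion > adopt).astype(int))

print("fraction engaging:", behavior.mean())
```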

  14. Quantifying Stock Return Distributions in Financial Markets.

    Science.gov (United States)

    Botta, Federico; Moat, Helen Susannah; Stanley, H Eugene; Preis, Tobias

    2015-01-01

    Being able to quantify the probability of large price changes in stock markets is of crucial importance in understanding financial crises that affect the lives of people worldwide. Large changes in stock market prices can arise abruptly, within a matter of minutes, or develop across much longer time scales. Here, we analyze a dataset comprising the stocks forming the Dow Jones Industrial Average at a second-by-second resolution in the period from January 2008 to July 2010 in order to quantify the distribution of changes in market prices at a range of time scales. We find that the tails of the distributions of logarithmic price changes, or returns, exhibit power law decays for time scales ranging from 300 seconds to 3600 seconds. For larger time scales, we find that the tails of the distributions exhibit exponential decay. Our findings may inform the development of models of market behavior across varying time scales.
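
    The tail analysis can be sketched on synthetic prices, assuming a Hill-type estimator for the power-law exponent (the abstract does not specify the estimator; all data below are simulated).

```python
# Sketch: log returns at two time scales and a Hill estimate of the
# power-law exponent of the upper tail of |returns|.
import numpy as np

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.standard_t(df=3, size=500_000) * 1e-4))

def hill_tail_exponent(returns, tail_fraction=0.01):
    """Hill estimator over the largest absolute returns."""
    x = np.sort(np.abs(returns))[::-1]
    k = max(int(tail_fraction * x.size), 10)
    tail = x[:k]
    return 1.0 / np.mean(np.log(tail[:-1] / tail[-1]))

for scale in (300, 3600):                  # seconds, as in the study
    r = np.diff(np.log(prices[::scale]))   # returns at this time scale
    print(scale, "s: tail exponent ~", round(hill_tail_exponent(r), 2))
```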

  15. Quantifying information leakage of randomized protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Legay, Axel; Malacaria, Pasquale;

    2015-01-01

    The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the usage of Markovian processes to model deterministic and probabilistic systems. By using a methodology generalizing the lattice of information approach we model refined attackers capable to observe the internal behavior of the system, and quantify the information leakage of such systems. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and non-timed attacks.

  16. Quantifying information leakage of randomized protocols

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Legay, Axel; Malacaria, Pasquale

    2015-01-01

    The quantification of information leakage provides a quantitative evaluation of the security of a system. We propose the usage of Markovian processes to model deterministic and probabilistic systems. By using a methodology generalizing the lattice of information approach we model refined attackers...... capable to observe the internal behavior of the system, and quantify the information leakage of such systems. We also use our method to obtain an algorithm for the computation of channel capacity from our Markovian models. Finally, we show how to use the method to analyze timed and non-timed attacks...

  17. Animal biometrics: quantifying and detecting phenotypic appearance.

    Science.gov (United States)

    Kühl, Hjalmar S; Burghardt, Tilo

    2013-07-01

    Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life.

  18. Quantifying Pilot Visual Attention in Low Visibility Terminal Operations

    Science.gov (United States)

    Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.

    2012-01-01

    Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also holds implications for specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, especially the data reduction algorithms and logic to transform raw eye tracking data into quantified visual behavior metrics, and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
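
    The reduction from raw gaze samples to visual-behavior metrics can be illustrated with a dwell-time computation over areas of interest (AOIs). The AOI bounds, the 60 Hz sample rate, and the random gaze data below are hypothetical, not the study's actual geometry.

```python
# Sketch: map raw gaze samples to AOIs and compute dwell-time fractions.
import numpy as np

rng = np.random.default_rng(5)
gaze = rng.uniform(0, 1, (6000, 2))        # 100 s of (x, y) at 60 Hz, normalized

aois = {"HUD": (0.3, 0.7, 0.5, 1.0),       # (xmin, xmax, ymin, ymax), made up
        "HDD": (0.2, 0.8, 0.0, 0.4)}

for name, (x0, x1, y0, y1) in aois.items():
    inside = ((gaze[:, 0] >= x0) & (gaze[:, 0] <= x1) &
              (gaze[:, 1] >= y0) & (gaze[:, 1] <= y1))
    print(f"{name}: {100 * inside.mean():.1f}% dwell time")
```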

  19. Quantifying resource use in computations

    NARCIS (Netherlands)

    van Son, R.J.J.H.

    2009-01-01

    It is currently not possible to quantify the resources needed to perform a computation. As a consequence, it is not possible to reliably evaluate the hardware resources needed for the application of algorithms or the running of programs. This is apparent in both computer science, for instance, in

  20. Quantifying resource use in computations

    NARCIS (Netherlands)

    van Son, R.J.J.H.

    2009-01-01

    It is currently not possible to quantify the resources needed to perform a computation. As a consequence, it is not possible to reliably evaluate the hardware resources needed for the application of algorithms or the running of programs. This is apparent in both computer science, for instance, in

  1. Asteroid Geophysics and Quantifying the Impact Hazard

    Science.gov (United States)

    Sears, D.; Wooden, D. H.; Korycanksy, D. G.

    2015-01-01

    Probably the major challenge in understanding, quantifying, and mitigating the effects of an impact on Earth is understanding the nature of the impactor. Of the roughly 25 meteorite craters on the Earth that have associated meteorites, all but one were produced by iron meteorites and only one was produced by a stony meteorite. Equally important, even meteorites of a given chemical class produce a wide variety of behavior in the atmosphere. This is because they show considerable diversity in their mechanical properties, which have a profound influence on the behavior of meteorites during atmospheric passage. Some stony meteorites are weak and do not reach the surface, or reach the surface as thousands of relatively harmless pieces. Some stony meteorites roll into a maximum drag configuration and are strong enough to remain intact so a large single object reaches the surface. Others have high concentrations of water that may facilitate disruption. However, while meteorite falls and meteorites provide invaluable information on the physical nature of the objects entering the atmosphere, there are many unknowns concerning size and scale that can only be determined from the pre-atmospheric properties of the asteroids. Their internal structure, their thermal properties, and their internal strength and composition will all play a role in determining the behavior of the object as it passes through the atmosphere, whether it produces an airblast and at what height, and the nature of the impact and the amount and distribution of ejecta.

  2. Meditations on Quantified Constraint Satisfaction

    CERN Document Server

    Chen, Hubie

    2012-01-01

    The quantified constraint satisfaction problem (QCSP) is the problem of deciding, given a structure and a first-order prenex sentence whose quantifier-free part is the conjunction of atoms, whether or not the sentence holds on the structure. One obtains a family of problems by defining, for each structure B, the problem QCSP(B) to be the QCSP where the structure is fixed to be B. In this article, we offer a viewpoint on the research program of understanding the complexity of the problems QCSP(B) on finite structures. In particular, we propose and discuss a group of conjectures; throughout, we attempt to place the conjectures in relation to existing results and to emphasize open issues and potential research directions.

  3. Quantifier Elimination by Dependency Sequents

    CERN Document Server

    Goldberg, Eugene

    2012-01-01

    We consider the problem of existential quantifier elimination for Boolean formulas in Conjunctive Normal Form (CNF). We present a new method for solving this problem called Derivation of Dependency-Sequents (DDS). A Dependency-sequent (D-sequent) is used to record that a set of quantified variables is redundant under a partial assignment. We show that D-sequents can be resolved to obtain new, non-trivial D-sequents. We also show that DDS is compositional, i.e. if our input formula is a conjunction of independent formulas, DDS automatically recognizes and exploits this information. We introduce an algorithm based on DDS and present experimental results demonstrating its potential.

  4. Quantifying and measuring cyber resiliency

    Science.gov (United States)

    Cybenko, George

    2016-05-01

    Cyber resiliency has become an increasingly attractive research and operational concept in cyber security. While several metrics have been proposed for quantifying cyber resiliency, a considerable gap remains between those metrics and operationally measurable and meaningful concepts that can be empirically determined in a scientific manner. This paper describes a concrete notion of cyber resiliency that can be tailored to meet specific needs of organizations that seek to introduce resiliency into their assessment of their cyber security posture.

  5. Quantifying strain variability in modeling growth of Listeria monocytogenes

    NARCIS (Netherlands)

    Aryani, D.; Besten, den H.M.W.; Hazeleger, W.C.; Zwietering, M.H.

    2015-01-01

    Prediction of microbial growth kinetics can differ from the actual behavior of the target microorganisms. In the present study, the impact of strain variability on the maximum specific growth rate (µmax, h⁻¹) was quantified using twenty Listeria monocytogenes strains. The µmax was determined as functi

  6. Towards Quantifying a Wider Reality: Shannon Exonerata

    Directory of Open Access Journals (Sweden)

    Robert E. Ulanowicz

    2011-10-01

    Full Text Available In 1872 Ludwig von Boltzmann derived a statistical formula to represent the entropy (an apophasis) of a highly simplistic system. In 1948 Claude Shannon independently formulated the same expression to capture the positivist essence of information. Such contradictory thrusts engendered decades of ambiguity concerning exactly what is conveyed by the expression. Resolution of widespread confusion is possible by invoking the third law of thermodynamics, which requires that entropy be treated in a relativistic fashion. Doing so parses the Boltzmann expression into separate terms that segregate apophatic entropy from positivist information. Possibly more importantly, the decomposition itself portrays a dialectic-like agonism between constraint and disorder that may provide a more appropriate description of the behavior of living systems than is possible using conventional dynamics. By quantifying the apophatic side of evolution, the Shannon approach to information achieves what no other treatment of the subject affords: It opens the window on a more encompassing perception of reality.

  7. Quantifying mixing using equilibrium reactions

    Science.gov (United States)

    Wheat, Philip M.; Posner, Jonathan D.

    2009-03-01

    A method of quantifying equilibrium reactions in a microchannel using a fluorometric reaction of Fluo-4 and Ca2+ ions is presented. Under the proper conditions, equilibrium reactions can be used to quantify fluid mixing without the challenges associated with constituent mixing measures, such as limited imaging spatial resolution and viewing angle coupled with three-dimensional structure. Quantitative measurements of the mixing of CaCl2 and the calcium-indicating fluorescent dye Fluo-4 are made in Y-shaped microchannels. Reactant and product concentration distributions are modeled using Green's function solutions and a numerical solution to the advection-diffusion equation. Equilibrium reactions provide for an unambiguous, quantitative measure of mixing when the reactant concentrations are greater than 100 times their dissociation constant and the diffusivities are equal. At lower concentrations and for dissimilar diffusivities, the area averaged fluorescence signal reaches a maximum before the species have interdiffused, suggesting that reactant concentrations and diffusivities must be carefully selected to provide unambiguous, quantitative mixing measures. Fluorometric equilibrium reactions work over a wide range of pH and background concentrations such that they can be used for a wide variety of fluid mixing measures including industrial or microscale flows.

  8. Lexical NP and VP quantifiers in Bulgarian

    Directory of Open Access Journals (Sweden)

    Kristina Kalpakchieva

    2015-11-01

    Full Text Available Lexical NP and VP quantifiers in Bulgarian The paper focuses on uniqueness, existential and universal quantification within the Bulgarian noun and verb phrase. Quantifier scope is considered with respect to whether the quantifiers are used alone or in a group with other expressions. Another factor that affects the strength of quantifiers is whether the expression contains additional specifying functions or sets some circumstance or condition. Quantifiers within the verb phrase are particularly strongly affected by other conditions, while quantifiers within the subject NP have a broad scope and are not affected by the additional conditions of the situation described.

  9. Quantifying Resource Use in Computations

    CERN Document Server

    van Son, R J J H

    2009-01-01

    It is currently not possible to quantify the resources needed to perform a computation. As a consequence, it is not possible to reliably evaluate the hardware resources needed for the application of algorithms or the running of programs. This is apparent in both computer science, for instance, in cryptanalysis, and in neuroscience, for instance, comparative neuro-anatomy. A System versus Environment game formalism is proposed, based on Computability Logic, that allows one to define a computational work function describing the theoretical and physical resources needed to perform any purely algorithmic computation. Within this formalism, the cost of a computation is defined as the sum of information storage over the steps of the computation. The size of the computational device, e.g., the action table of a Universal Turing Machine, the number of transistors in silicon, or the number and complexity of synapses in a neural net, is explicitly included in the computational cost. The proposed cost function leads in a na...

  10. Quantifying and simulating human sensation

    DEFF Research Database (Denmark)

    Quantifying and simulating human sensation – relating science and technology of indoor climate research. In his doctoral thesis from 1970, civil engineer Povl Ole Fanger proposed that the understanding of indoor climate should focus on the comfort of the individual rather than averaged... Archival material related to Lund Madsen's efforts is preserved at the Technical University of Denmark and I have used these artefacts as the point of departure for my investigation. In this paper I will examine which factors the researchers perceived as important for human indoor comfort and how this understanding of human sensation was adjusted to technology. I will look into the construction of the equipment, what it measures and the relationship between theory, equipment and tradition.

  11. Quantifying Evaporation in a Permeable Pavement System

    Science.gov (United States)

    Studies quantifying evaporation from permeable pavement systems are limited to a few laboratory studies and one field application. This research quantifies evaporation for a larger-scale field application by measuring the water balance from lined permeable pavement sections. Th...

  12. An adaptive method for history dependent materials

    NARCIS (Netherlands)

    Quak, W.; van den Boogaard, Antonius H.; Khalili, Nasser; Valliappan, Somasundaram; Li, Qing; Russel, Adrian

    2010-01-01

    Introduction: Finite element simulations of bulk forming processes, like extrusion or forging, can fail due to element distortion. Simulating a forming process requires either many re-meshing steps or an Eulerian formulation to avoid this problem. This re-meshing or usage of an Eulerian formulation,

  13. Quantifier Scope in Categorical Compositional Distributional Semantics

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Sadrzadeh

    2016-08-01

    Full Text Available In previous work with J. Hedges, we formalised a generalised quantifiers theory of natural language in categorical compositional distributional semantics with the help of bialgebras. In this paper, we show how quantifier scope ambiguity can be represented in that setting and how this representation can be generalised to branching quantifiers.

  14. Quantifying the semantics of search behavior before stock market moves.

    Science.gov (United States)

    Curme, Chester; Preis, Tobias; Stanley, H Eugene; Moat, Helen Susannah

    2014-08-12

    Technology is becoming deeply interwoven into the fabric of society. The Internet has become a central source of information for many people when making day-to-day decisions. Here, we present a method to mine the vast data Internet users create when searching for information online, to identify topics of interest before stock market moves. In an analysis of historic data from 2004 until 2012, we draw on records from the search engine Google and online encyclopedia Wikipedia as well as judgments from the service Amazon Mechanical Turk. We find evidence of links between Internet searches relating to politics or business and subsequent stock market moves. In particular, we find that an increase in search volume for these topics tends to precede stock market falls. We suggest that extensions of these analyses could offer insight into large-scale information flow before a range of real-world events.
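
    A minimal sketch of the signal logic used in this line of work, on made-up weekly volumes: compare a topic's current search volume against its recent average and treat a rise as a bearish signal. The window length and all data below are illustrative assumptions.

```python
# Sketch: rising search interest in a topic flags a potential market fall.
import numpy as np

rng = np.random.default_rng(2)
weekly_volume = rng.poisson(100, 400).astype(float)   # stand-in search data
window = 3                                            # weeks of history

signals = []
for t in range(window, len(weekly_volume)):
    delta = weekly_volume[t] - weekly_volume[t - window:t].mean()
    signals.append(-1 if delta > 0 else +1)           # rising searches -> sell

print("fraction of sell weeks:", (np.array(signals) == -1).mean())
```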

  15. Field history dependence of nonlinear dielectric properties of Ba0.6Sr0.4TiO3 ceramics under bias electric field: Polarization behavior of polar nano-regions

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Xiaofei [School of Materials Science and Engineering, Wuhan University of Technology, Wuhan 430070 (China); Xu Qing, E-mail: xuqing@whut.edu.c [School of Materials Science and Engineering, Wuhan University of Technology, Wuhan 430070 (China); Liu Hanxing; Chen Wen [School of Materials Science and Engineering, Wuhan University of Technology, Wuhan 430070 (China); Chen Min; Kim, Bok-Hee [Faculty of Advanced Materials Engineering, Chonbuk National University, Jeonju 561-756 (Korea, Republic of)

    2011-04-01

    Nonlinear dielectric properties of Ba0.6Sr0.4TiO3 ceramics prepared by citrate method were investigated under bias electric field with respect to field history. X-ray diffraction analysis and temperature dependence of the dielectric constant (εr) confirmed a macroscopically paraelectric state for the specimen at room temperature. A slim polarization versus electric field (P-E) hysteresis loop of the specimen at room temperature indicated the existence of polar nano-regions (PNRs) superimposed on the paraelectric background. The nonlinear dielectric properties in continuous cycles of bias field sweep displayed a strong sensitivity to the field history. This phenomenon was qualitatively explained in terms of an irreversible polarization evolution of the PNRs under the bias fields. A considerable decline of the tunability with the cycle number suggests an appreciable contribution of the PNRs to the dielectric nonlinearity. The polarization and size of the PNRs were determined by fitting the dielectric constants to a multipolarization mechanism model.

  16. Quantifying Cricket Fast Bowling Skill.

    Science.gov (United States)

    Feros, Simon A; Young, Warren B; O'Brien, Brendan J

    2017-09-27

    To evaluate the current evidence regarding the quantification of cricket fast bowling skill. Studies that assessed fast bowling skill (bowling speed and accuracy) were identified from searches in SPORTDiscus (EBSCO) in June 2017. The reference lists of identified papers were also examined for relevant investigations. Sixteen papers matched the inclusion criteria, and discrepancies in assessment procedures were evident. Differences in: test environment, pitch and cricket ball characteristics, the warm-up prior to test, test familiarisation procedures, permitted run-up lengths, bowling spell length, delivery sequence, test instructions, collection of bowling speed data, collection and reportage of bowling accuracy data were apparent throughout the literature. The reliability and sensitivity of fast bowling skill measures has rarely been reported across the literature. Only one study has attempted to assess the construct validity of their skill measures. There are several discrepancies in how fast bowling skill has been assessed and subsequently quantified in the literature to date. This is a problem, as comparisons between studies are often difficult. Therefore, a strong rationale exists for the creation of match-specific standardised fast bowling assessments that offer greater ecological validity while maintaining acceptable reliability and sensitivity of the skill measures. If prospective research can act on the proposed recommendations from this review, then coaches will be able to make more informed decisions surrounding player selection, talent identification, return to skill following injury, and the efficacy of short- and long-term training interventions for fast bowlers.

  17. Quantifying Periodicity in Omics Data

    Directory of Open Access Journals (Sweden)

    Cornelia eAmariei

    2014-08-01

    Full Text Available Oscillations play a significant role in biological systems, with many examples in the fast, ultradian, circadian, circalunar and yearly time domains. However, determining periodicity in such data can be problematic. There are a number of computational methods to identify the periodic components in large datasets, such as signal-to-noise based Fourier decomposition, Fisher's g-test and autocorrelation. However, the available methods assume a sinusoidal model and do not attempt to quantify the waveform shape and the presence of multiple periodicities, which provide vital clues in determining the underlying dynamics. Here, we developed a Fourier based measure that generates a de-noised waveform from multiple significant frequencies. This waveform is then correlated with the raw data from the respiratory oscillation found in yeast, to provide oscillation statistics including waveform metrics and multi-periods. The method is compared and contrasted to commonly used statistics. Moreover we show the utility of the program in the analysis of noisy datasets and other high-throughput analyses, such as metabolomics and flow cytometry, respectively.
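
    A sketch of the pipeline described above, assuming a simple median-based significance threshold (the paper's exact signal-to-noise rule may differ): keep the significant Fourier components, rebuild the de-noised waveform, correlate it with the raw series, and read off the multi-periods.

```python
# Sketch: Fourier-based de-noised waveform from multiple significant frequencies.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1024)
raw = (np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 32)
       + rng.normal(0, 0.4, t.size))               # synthetic two-period signal

spec = np.fft.rfft(raw - raw.mean())
power = np.abs(spec) ** 2
keep = power > 5.0 * np.median(power)              # "significant" frequencies
denoised = np.fft.irfft(np.where(keep, spec, 0), n=t.size) + raw.mean()

r = np.corrcoef(raw, denoised)[0, 1]               # correlate with the raw data
periods = t.size / np.flatnonzero(keep)            # multi-periods, in samples
print("correlation:", round(r, 3), "periods:", np.round(periods, 1))
```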

  18. Quantifying the vitamin D economy.

    Science.gov (United States)

    Heaney, Robert P; Armas, Laura A G

    2015-01-01

    Vitamin D enters the body through multiple routes and in a variety of chemical forms. Utilization varies with input, demand, and genetics. Vitamin D and its metabolites are carried in the blood on a Gc protein that has three principal alleles with differing binding affinities and ethnic prevalences. Three major metabolites are produced, which act via two routes, endocrine and autocrine/paracrine, and in two compartments, extracellular and intracellular. Metabolic consumption is influenced by physiological controls, noxious stimuli, and tissue demand. When administered as a supplement, varying dosing schedules produce major differences in serum metabolite profiles. To understand vitamin D's role in human physiology, it is necessary both to identify the foregoing entities, mechanisms, and pathways and, specifically, to quantify them. This review was performed to delineate the principal entities and transitions involved in the vitamin D economy, summarize the status of present knowledge of the applicable rates and masses, draw inferences about functions that are implicit in these quantifications, and point out implications for the determination of adequacy.

  19. Polymer microlenses for quantifying cell sheet mechanics.

    Science.gov (United States)

    Miquelard-Garnier, Guillaume; Zimberlin, Jessica A; Sikora, Christian B; Wadsworth, Patricia; Crosby, Alfred

    2010-01-01

    Mechanical interactions between individual cells and their substrate have been studied extensively over the past decade; however, understanding how these interactions change as cells interact with neighboring cells in the development of a cell sheet, or early stage tissue, is less developed. We use a recently developed experimental technique for quantifying the mechanics of confluent cell sheets. Living cells are cultured on a thin film of polystyrene [PS], which is attached to a patterned substrate of crosslinked poly(dimethyl siloxane) [PDMS] microwells. As cells attach to the substrate and begin to form a sheet, they apply sufficient contractile force to buckle the PS film over individual microwells to form a microlens array. The curvature for each microlens is measured by confocal microscopy and can be related to the strain and stress applied by the cell sheet using simple mechanical analysis for the buckling of thin films. We demonstrate that this technique can provide insight into the important materials properties and length scales that govern cell sheet responses, especially the role of stiffness of the substrate. We show that intercellular forces can lead to significantly different behaviors than the ones observed for individual cells, where focal adhesion is the relevant parameter.

  20. Quantifying uncertainty in climate change science through empirical information theory.

    Science.gov (United States)

    Majda, Andrew J; Gershgorin, Boris

    2010-08-24

    Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science including the prototype behavior of tracer gases such as CO2. Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper.
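
    For Gaussian approximations, an information metric of the kind described above is the standard relative-entropy decomposition into a signal and a dispersion term; the notation below (mean ū, covariance R, state dimension N) is ours, not the paper's.

```latex
% Relative entropy between the climate pdf \pi = N(\bar{u}, R) and a model
% pdf \pi_M = N(\bar{u}_M, R_M): the first term ("signal") captures the
% coarse-grained mean model error, the second ("dispersion") the covariance
% ratio, matching the mean-error-plus-covariance-ratio structure above.
\mathcal{P}(\pi,\pi_M)
  = \tfrac{1}{2}\,(\bar{u}-\bar{u}_M)^{\mathsf T} R_M^{-1}(\bar{u}-\bar{u}_M)
  + \tfrac{1}{2}\left[\operatorname{tr}\!\left(R R_M^{-1}\right) - N
  - \ln\det\!\left(R R_M^{-1}\right)\right]
```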

  1. Quantifying uncertainty in climate change science through empirical information theory

    Science.gov (United States)

    Majda, Andrew J.; Gershgorin, Boris

    2010-01-01

    Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science including the prototype behavior of tracer gases such as CO2. Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper. PMID:20696940

  2. Quantifying synergistic information remains an unsolved problem

    CERN Document Server

    Griffith, Virgil

    2011-01-01

    We review the prior literature of information theoretical measures of synergy or synergistic information. We draw the hereto unnamed conceptual distinction between synergistic and holistic information and analyze six prior measures based on whether they aim to quantify synergy or holism. We apply all measures against a suite of examples to demonstrate no existing measure correctly quantifies synergy under all circumstances.

  3. Quantifying drug-protein binding in vivo.

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, B; Bench, G; Keating III, G; Palmblad, M; Vogel, J; Grant, P G; Hillegonds, D

    2004-02-17

    Accelerator mass spectrometry (AMS) provides precise quantitation of isotope-labeled compounds that are bound to biological macromolecules such as DNA or proteins. The sensitivity is high enough to allow for sub-pharmacological ("micro-") dosing to determine macromolecular targets without inducing toxicities or altering the system under study, whether it is healthy or diseased. We demonstrated an application of AMS in quantifying the physiologic effects of one dosed chemical compound upon the binding level of another compound in vivo at sub-toxic doses [4]. We are using tissues left from this study to develop protocols for quantifying specific binding to isolated and identified proteins. We also developed a new technique to quantify nanogram to milligram amounts of isolated protein at precisions that are comparable to those for quantifying the bound compound by AMS.

  4. A stochastic approach for quantifying immigrant integration: the Spanish test case

    CERN Document Server

    Agliari, Elena; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia

    2014-01-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of "time" and the quantifier the role of "space", it becomes possible to analyze the behavior of the quantifiers by means of continuous time random walks. Two classes of results are obtained. First, we show that social integration quantifiers evolve following a pure diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best and worst case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. F...

  5. Evaluation of the status of rotary machines by time causal Information Theory quantifiers

    Science.gov (United States)

    Redelico, Francisco O.; Traversaro, Francisco; Oyarzabal, Nicolás; Vilaboa, Ivan; Rosso, Osvaldo A.

    2017-03-01

    In this paper several causal Information Theory quantifiers, namely Shannon entropy, statistical complexity and Fisher information computed with the Bandt and Pompe permutation probability distribution, are applied to describe the behavior of a rotating machine. An experiment was conducted in which a rotating machine first ran balanced and then, after a misalignment, ran unbalanced. All the causal Information Theory quantifiers applied are capable of distinguishing between the two states and capture the corresponding transition between them.
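
    A minimal sketch of the Bandt and Pompe step named above: estimate the ordinal-pattern distribution of a vibration signal and compute its normalized permutation entropy. The embedding dimension, delay, and the synthetic balanced/unbalanced signals are our illustrative choices, not the paper's setup.

```python
# Sketch: Bandt-Pompe ordinal patterns and normalized permutation entropy.
import math
from collections import Counter

import numpy as np

def permutation_entropy(signal, dim=4, delay=1):
    """Normalized Shannon entropy of ordinal patterns (0 = regular, 1 = noise)."""
    patterns = Counter(
        tuple(np.argsort(signal[i:i + dim * delay:delay]))
        for i in range(len(signal) - (dim - 1) * delay)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(dim))

rng = np.random.default_rng(4)
t = np.linspace(0, 40 * np.pi, 4000)
balanced = np.sin(t) + 0.05 * rng.normal(size=t.size)    # smooth rotation
unbalanced = np.sin(t) + 0.6 * rng.normal(size=t.size)   # noisier vibration
print(round(permutation_entropy(balanced), 3),
      round(permutation_entropy(unbalanced), 3))         # lower vs higher entropy
```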

  6. Quantifying renewable groundwater stress with GRACE

    Science.gov (United States)

    Richey, Alexandra S.; Thomas, Brian F.; Lo, Min‐Hui; Reager, John T.; Voss, Katalyn; Swenson, Sean; Rodell, Matthew

    2015-01-01

    Groundwater is an increasingly important water supply source globally. Understanding the amount of groundwater used versus the volume available is crucial to evaluate future water availability. We present a groundwater stress assessment to quantify the relationship between groundwater use and availability in the world's 37 largest aquifer systems. We quantify stress according to a ratio of groundwater use to availability, which we call the Renewable Groundwater Stress ratio. The impact of quantifying groundwater use based on nationally reported groundwater withdrawal statistics is compared to a novel approach to quantify use based on remote sensing observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Four characteristic stress regimes are defined: Overstressed, Variable Stress, Human-dominated Stress, and Unstressed. The regimes are a function of the sign of use (positive or negative) and the sign of groundwater availability, defined as mean annual recharge. The ability to mitigate and adapt to stressed conditions, where use exceeds sustainable water availability, is a function of economic capacity and land use patterns. Therefore, we qualitatively explore the relationship between stress and anthropogenic biomes. We find that estimates of groundwater stress based on withdrawal statistics are unable to capture the range of characteristic stress regimes, especially in regions dominated by sparsely populated biome types with limited cropland. GRACE-based estimates of use and stress can holistically quantify the impact of groundwater use on stress, resulting in both greater magnitudes of stress and more variability of stress between regions. PMID:26900185
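
    In code, the core of the assessment reduces to a ratio plus a sign-based classification. The Python sketch below is one plausible reading of the regime logic summarized in the abstract; the authoritative regime definitions are those of the paper, so treat this mapping as illustrative only.

        def rgs_ratio(use, availability):
            """Renewable Groundwater Stress ratio: groundwater use divided
            by availability (mean annual recharge)."""
            return use / availability

        def stress_regime(use, availability):
            # Illustrative sign-based mapping onto the four regimes named
            # in the abstract; the paper gives the authoritative rules.
            if availability > 0:
                if use <= 0:
                    return "Unstressed"
                return "Overstressed" if use > availability else "Variable Stress"
            return "Human-dominated Stress" if use > 0 else "Unstressed"

        print(rgs_ratio(use=120.0, availability=80.0))      # 1.5
        print(stress_regime(use=120.0, availability=80.0))  # Overstressed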

  7. Availability Analysis of a Repairable k-out-of-n:G System with a History-Dependent Critical State

    Institute of Scientific and Technical Information of China (English)

    吴玉旦

    2012-01-01

    For a repairable k-out-of-n:G system with a history-dependent critical state, this paper derives the availability, the mean up time, and the mean down time in one renewal cycle once the system reaches the steady state. A comparison between this system and a repairable k-out-of-n:G system without a history-dependent critical state is conducted as well.
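
    For the comparison system without a history-dependent critical state, and under the further simplifying assumption of independent, identical components, the steady-state availability has a closed form: each component is up with probability MUT/(MUT + MDT), and the system is up when at least k of the n components are. A short Python check of that textbook baseline:

        from math import comb

        def component_availability(mut, mdt):
            """Steady-state availability of one repairable component."""
            return mut / (mut + mdt)

        def k_out_of_n_availability(k, n, a):
            """Availability of a k-out-of-n:G system of independent,
            identical components, each available with probability a."""
            return sum(comb(n, i) * a**i * (1 - a)**(n - i)
                       for i in range(k, n + 1))

        a = component_availability(mut=100.0, mdt=5.0)
        print(k_out_of_n_availability(k=2, n=3, a=a))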

  8. Quantifying robustness of biochemical network models

    Directory of Open Access Journals (Sweden)

    Iglesias Pablo A

    2002-12-01

    Background: Robustness of mathematical models of biochemical networks is important for validation purposes and can be used as a means of selecting between different competing models. Tools for quantifying parametric robustness are needed. Results: Two techniques for quantitatively describing the robustness of an oscillatory model were presented and contrasted. Single-parameter bifurcation analysis was used to evaluate the stability robustness of the limit cycle oscillation as well as the frequency and amplitude of oscillations. A tool from control engineering, the structural singular value (SSV), was used to quantify robust stability of the limit cycle. Using SSV analysis, we find very poor robustness when the model's parameters are allowed to vary. Conclusion: The results show the usefulness of adding SSV analysis to single-parameter sensitivity analysis to quantify robustness.

  9. Quantifying brain microstructure with diffusion MRI

    DEFF Research Database (Denmark)

    Novikov, Dmitry S.; Jespersen, Sune N.; Kiselev, Valerij G.

    2016-01-01

    We review, systematize and discuss models of diffusion in neuronal tissue, by putting them into an overarching physical context of coarse-graining over an increasing diffusion length scale. From this perspective, we view research on quantifying brain microstructure as occurring along the three ma...

  10. Quantifying the Reuse of Learning Objects

    Science.gov (United States)

    Elliott, Kristine; Sweeney, Kevin

    2008-01-01

    This paper reports the findings of one case study from a larger project, which aims to quantify the claimed efficiencies of reusing learning objects to develop e-learning resources. The case study describes how an online inquiry project "Diabetes: A waste of energy" was developed by searching for, evaluating, modifying and then…

  11. QS Spiral: Visualizing Periodic Quantified Self Data

    DEFF Research Database (Denmark)

    Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann

    2013-01-01

    In this paper we propose an interactive visualization technique QS Spiral that aims to capture the periodic properties of quantified self data and let the user explore those recurring patterns. The approach is based on time-series data visualized as a spiral structure. The interactivity includes ...

  12. Periodontal inflamed surface area : quantifying inflammatory burden

    NARCIS (Netherlands)

    Nesse, Willem; Abbas, Frank; van der Ploeg, Ids; Spijkervet, Frederik Karst Lucien; Dijkstra, Pieter Ubele; Vissink, Arjan

    2008-01-01

    Background: Currently, a large variety of classifications is used for periodontitis as a risk factor for other diseases. None of these classifications quantifies the amount of inflamed periodontal tissue, while this information is needed to assess the inflammatory burden posed by periodontitis. Aim:

  14. The Emergence of the Quantified Child

    Science.gov (United States)

    Smith, Rebecca

    2017-01-01

    Using document analysis, this paper examines the historical emergence of the quantified child, revealing how the collection and use of data has become normalized through legitimizing discourses. First, following in the traditions of Foucault's genealogy and studies examining the sociology of numbers, this paper traces the evolution of data…

  15. Bosonic behavior of entangled fermions

    DEFF Research Database (Denmark)

    C. Tichy, Malte; Alexander Bouvrie, Peter; Mølmer, Klaus

    2012-01-01

    Two bound, entangled fermions form a composite boson, which can be treated as an elementary boson as long as the Pauli principle does not affect the behavior of many such composite bosons. The departure from ideal bosonic behavior is quantified by the normalization ratio of multi-composite-boson states.

  16. Power Curve Measurements, quantify the production increase

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Vesth, Allan

    The purpose of this report is to quantify the production increase of a given turbine with respect to another given turbine. The methodology used is the "side by side" comparison method, provided by the client. This method involves the use of two neighboring turbines and is based on the assumption that the wind field in front of the tested turbines is statistically the same (i.e., it has on average the same mean wind speed conditions in front of both turbines). The method is only used for the evaluation of a relative change in the AEP, not of the AEP itself.

  17. Quantifying the robustness of metro networks

    CERN Document Server

    Wang, Xiangrong; Derrible, Sybil; Ahmad, Sk Nasir; Kooij, Robert E

    2015-01-01

    Metros (heavy rail transit systems) are integral parts of urban transportation systems. Failures in their operations can have serious impacts on urban mobility, and measuring their robustness is therefore critical. Moreover, as physical networks, metros can be viewed as network topological entities, and as such they possess measurable network properties. In this paper, using network science and graph theoretical concepts, we investigate both theoretical and experimental robustness metrics (i.e., the robustness indicator, the effective graph conductance, and the critical thresholds) and their performance in quantifying the robustness of metro networks under random failures or targeted attacks. We find that the theoretical metrics quantify different aspects of the robustness of metro networks. In particular, the robustness indicator captures the number of alternative paths, while the effective graph conductance focuses on the length of each path. Moreover, the high positive correlation between the theoretical m...
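
    As a hedged illustration of the experimental side, the sketch below estimates a simple robustness curve under random failures with networkx: the average fraction of all nodes that remain in the giant component as nodes are removed at random. The toy graph and the metric are illustrative; the paper's indicators (e.g., the effective graph conductance) are more refined.

        import random
        import networkx as nx

        def random_failure_curve(G, trials=50, seed=0):
            """Average fraction of all nodes left in the giant component
            after each successive random node removal."""
            rng = random.Random(seed)
            n = G.number_of_nodes()
            curve = [0.0] * n
            for _ in range(trials):
                H = G.copy()
                order = list(H.nodes)
                rng.shuffle(order)
                for step, node in enumerate(order):
                    H.remove_node(node)
                    if H.number_of_nodes() > 0:
                        giant = max(nx.connected_components(H), key=len)
                        curve[step] += len(giant) / n
            return [c / trials for c in curve]

        # Toy "metro": a ring line with two transfer chords.
        G = nx.cycle_graph(20)
        G.add_edges_from([(0, 10), (5, 15)])
        print(random_failure_curve(G)[:5])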

  18. Quantifying Shannon's Work Function for Cryptanalytic Attacks

    CERN Document Server

    van Son, R J J H

    2010-01-01

    Attacks on cryptographic systems are limited by the available computational resources. A theoretical understanding of these resource limitations is needed to evaluate the security of cryptographic primitives and procedures. This study uses an Attacker versus Environment game formalism based on computability logic to quantify Shannon's work function and evaluate resource use in cryptanalysis. A simple cost function is defined that allows a wide range of theoretical and real computational resources to be quantified. With this approach the use of custom hardware, e.g., FPGA boards, in cryptanalysis can be analyzed. Applied to real cryptanalytic problems, it raises, for instance, the expectation that the computer time needed to break some simple 90-bit-strong cryptographic primitives might theoretically be less than two years.

  19. Quantifying reliability uncertainty : a proof of concept.

    Energy Technology Data Exchange (ETDEWEB)

    Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F.; Fatherley, Quinn (Los Alamos National Laboratory, Los Alamos, NM); Anderson-Cook, Christine (Los Alamos National Laboratory, Los Alamos, NM); Wilson, Alyson G. (Los Alamos National Laboratory, Los Alamos, NM); Zurn, Rena M.

    2009-10-01

    This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
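
    For a single component with go/no-go data and a conjugate prior, the Bayesian side of the problem reduces to a Beta-Binomial update. The sketch below is illustrative only (the paper handles mixed series/parallel systems and variables data as well) and shows the zero-failure sensitivity noted in the abstract.

        from scipy import stats

        def reliability_posterior(successes, trials, a0=1.0, b0=1.0):
            """Posterior mean and 90% credible interval for component
            reliability: Beta(a0, b0) prior with binomial go/no-go data."""
            a, b = a0 + successes, b0 + trials - successes
            lo, hi = stats.beta.ppf([0.05, 0.95], a, b)
            return a / (a + b), (lo, hi)

        # Zero observed failures: the interval stays wide with few tests ...
        print(reliability_posterior(successes=10, trials=10))
        # ... and tightens as the number of failure-free tests grows.
        print(reliability_posterior(successes=100, trials=100))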

  20. Quantifying energy condition violations in traversable wormholes

    Indian Academy of Sciences (India)

    Sayan Kar; Naresh Dadhich; Matt Visser

    2004-10-01

    The 'theoretical' existence of traversable Lorentzian wormholes in the classical, macroscopic world is plagued by the violation of the well-known energy conditions of general relativity. In this brief article we show: (i) how the extent of violation can be quantified using certain volume integrals and (ii) whether this 'amount of violation' can be minimised for some specific cut-and-paste geometric constructions. Examples and possibilities are also outlined.

  1. Quantifying sediment production in steepland environments

    OpenAIRE

    2009-01-01

    Five published contributions to our understanding of the impacts of erosion processes on sustainable land management are reviewed and discussed. These focus on rapid shallow landsliding and gully erosion which are among the most prevalent forms of environmental degradation in New Zealand's hill country. The over-arching goal of this research has been to quantify the on-site (e.g., soil erosion, land productivity) impacts of these processes. Rather than measure erosion rates over long periods ...

  2. Quantifying habitat interactions: sediment transport and freshwater mussels

    Science.gov (United States)

    Kozarek, J. L.; MacGregor, K. R.; Hornbach, D.; Hove, M.

    2016-12-01

    Freshwater mussel abundance and distribution are integrally linked with their habitat through sediment transport processes in moving waters, including suspended sediment loads and bed mobility. This research seeks to quantify these complex interactions using a combination of field data collection in the intensively agricultural Minnesota River Basin and laboratory experiments in the Outdoor StreamLab (OSL) and flumes at St. Anthony Falls Laboratory (SAFL) at the University of Minnesota. The OSL is a field-scale sand-bed meandering stream channel with independent control over sediment feed (recirculated) and water flow (diverted from the Mississippi River). Experiments in the OSL focused on the interactions between moving bedload and freshwater mussel behavior. Flooding experiments were used to quantify movement during and after floods for three mussel species with different shell sculptures: threeridge (Amblema plicata), plain pocketbook (Lampsilis cardium), and white heelsplitter (Lasmigona complanata). Flow fields, bed shear stress, bedform migration, and bar topography were measured during each flooding event with and without mussels present (density = 4/m2) to examine the influence of flooding on mussel movement, and to quantify the influence of mussels on channel morphology under steady-state bedload transport. Additional experiments were conducted with threeridge at low flow (no bedload), under aggrading and degrading bed conditions, and at doubled mussel density (8/m2). Mussel response to suspended sediment loads was examined in a complementary series of experiments in an indoor flume with Mississippi River water. Mussels outfitted with gape sensors were used in paired control/treatment experiments to examine the influence of moderate-term (48 hour) exposure to elevated suspended sediment loads on mussel filtering activity. Together, these experiments provide multiple measures of mussel stress under high sediment loads and reveal how freshwater mussels

  3. Ground-based LIDAR: a novel approach to quantify fine-scale fuelbed characteristics

    Science.gov (United States)

    E.L. Loudermilk; J.K. Hiers; J.J. O’Brien; R.J. Mitchell; A. Singhania; J.C. Fernandez; W.P. Cropper; K.C. Slatton

    2009-01-01

    Ground-based LIDAR (also known as laser ranging) is a novel technique that may precisely quantify fuelbed characteristics important in determining fire behavior. We measured fuel properties within a south-eastern US longleaf pine woodland at the individual plant and fuelbed scale. Data were collected using a mobile terrestrial LIDAR unit at sub-cm scale for individual...

  4. Quantifying Diffuse Contamination: Method and Application to Pb in Soil.

    Science.gov (United States)

    Fabian, Karl; Reimann, Clemens; de Caritat, Patrice

    2017-06-20

    A new method for detecting and quantifying diffuse contamination at the continental to regional scale is based on the analysis of cumulative distribution functions (CDFs). It uses cumulative probability (CP) plots for spatially representative data sets, preferably containing >1000 determinations. Simulations demonstrate how different types of contamination influence elemental CDFs of different sample media. It is found that diffuse contamination is characterized by a distinctive shift of the low-concentration end of the distribution of the studied element in its CP plot. Diffuse contamination can be detected and quantified via either (1) comparing the distribution of the contaminating element to that of an element with geochemically comparable behavior but no contamination source (e.g., Pb vs Rb), or (2) comparing the topsoil distribution of an element to the distribution of the same element in subsoil samples from the same area, taking soil-forming processes into consideration. Both procedures are demonstrated for geochemical soil data sets from Europe, Australia, and the U.S.A. Several different data sets from Europe deliver comparable results at different scales. Diffuse Pb contamination in surface soil can thus be estimated without knowledge of the specific contamination sources, and the approach can be used to efficiently monitor diffuse contamination at the continental to regional scale.
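
    Procedure (1) can be sketched numerically by comparing the lower quantiles of the contaminant's distribution against those of a geochemically similar, uncontaminated reference element. On synthetic lognormal "soil" data with an added diffuse offset (all values hypothetical), the low-concentration shift shows up as quantile ratios well above 1.

        import numpy as np

        def low_end_quantile_ratio(element, reference, q=np.linspace(0.01, 0.5, 50)):
            """Ratio of the lower quantiles of a possibly contaminated element
            to those of a geochemically similar reference element; values
            systematically above 1 at the low end suggest diffuse input."""
            return np.quantile(element, q) / np.quantile(reference, q)

        rng = np.random.default_rng(2)
        rb = rng.lognormal(mean=3.0, sigma=0.6, size=2000)        # reference
        pb = rng.lognormal(mean=3.0, sigma=0.6, size=2000) + 8.0  # diffuse offset
        print(low_end_quantile_ratio(pb, rb)[:5])                 # clearly > 1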

  5. Quantifying the relationship between financial news and the stock market.

    Science.gov (United States)

    Alanyali, Merve; Moat, Helen Susannah; Preis, Tobias

    2013-12-20

    The complex behavior of financial markets emerges from decisions made by many traders. Here, we exploit a large corpus of daily print issues of the Financial Times from 2nd January 2007 until 31st December 2012 to quantify the relationship between decisions taken in financial markets and developments in financial news. We find a positive correlation between the daily number of mentions of a company in the Financial Times and the daily transaction volume of a company's stock both on the day before the news is released, and on the same day as the news is released. Our results provide quantitative support for the suggestion that movements in financial markets and movements in financial news are intrinsically interlinked.

  6. Quantifying consumption rates of dissolved oxygen along bed forms

    Science.gov (United States)

    Boano, Fulvio; De Falco, Natalie; Arnon, Shai

    2016-04-01

    Streambed interfaces represent hotspots for nutrient transformations because they host different microbial species, and the evaluation of these reaction rates is important to assess the fate of nutrients in riverine environments. In this work we analyze a series of flume experiments on oxygen demand in dune-shaped hyporheic sediments under losing and gaining flow conditions. We employ a new modeling code to quantify oxygen consumption rates from observed vertical profiles of oxygen concentration. The code accounts for transport by molecular diffusion and water advection, and automatically determines the reaction rates that provide the best fit between observed and modeled concentration values. The results show that reaction rates are not uniformly distributed across the streambed, in agreement with the expected behavior predicted by hyporheic exchange theory. Oxygen consumption was found to be highly influenced by the presence of gaining or losing flow conditions, which controlled the delivery of labile DOC to streambed microorganisms.

  7. Gains and Pitfalls of Quantifier Elimination as a Teaching Tool

    Science.gov (United States)

    Oldenburg, Reinhard

    2015-01-01

    Quantifier Elimination is a procedure that allows simplification of logical formulas that contain quantifiers. Many mathematical concepts are defined in terms of quantifiers, and especially in calculus their use has been identified as an obstacle in the learning process. The automatic deduction provided by quantifier elimination thus allows…

  8. Issues in the study of floating universal numeric quantifiers

    NARCIS (Netherlands)

    R. Cirillo

    2010-01-01

    In the Germanic and Romance languages (among others) a universal quantifier can combine with a numeral and form a floating quantifier. I refer to these quantifiers as universal numeric quantifiers or simply ∀NumQ. The following examples from Dutch and Romanian demonstrate this phenomenon: The aim of

  9. Certain Verbs Are Syntactically Explicit Quantifiers

    Directory of Open Access Journals (Sweden)

    Anna Szabolcsi

    2010-12-01

    Quantification over individuals, times, and worlds can in principle be made explicit in the syntax of the object language, or left to the semantics and spelled out in the meta-language. The traditional view is that quantification over individuals is syntactically explicit, whereas quantification over times and worlds is not. But a growing body of literature proposes a uniform treatment. This paper examines the scopal interaction of aspectual raising verbs (begin), modals (can), and intensional raising verbs (threaten) with quantificational subjects in Shupamem, Dutch, and English. It appears that aspectual raising verbs and at least modals may undergo the same kind of overt or covert scope-changing operations as nominal quantifiers; the case of intensional raising verbs is less clear. Scope interaction is thus shown to be a new potential diagnostic of object-linguistic quantification, and the similarity in the scope behavior of nominal and verbal quantifiers supports the grammatical plausibility of ontological symmetry, explored in Schlenker (2006).

  10. Quantifier spreading: children misled by ostensive cues

    Directory of Open Access Journals (Sweden)

    Katalin É. Kiss

    2017-04-01

    This paper calls attention to a methodological problem of acquisition experiments. It shows that the economy of the stimulus employed in child language experiments may lend an increased ostensive effect to the message communicated to the child. Thus, when the visual stimulus in a sentence-picture matching task is a minimal model abstracting away from the details of the situation, children often regard all the elements of the stimulus as ostensive clues to be represented in the corresponding sentence. The use of such minimal stimuli is mistaken when the experiment aims to test whether or not a certain element of the stimulus is relevant for the linguistic representation or interpretation. The paper illustrates this point with an experiment involving quantifier spreading. It is claimed that children find a universally quantified sentence like 'Every girl is riding a bicycle' to be a false description of a picture showing three girls riding bicycles and a solo bicycle because they are misled to believe that all the elements in the visual stimulus are relevant, hence all of them are to be represented by the corresponding linguistic description. When the iconic drawings were replaced by photos taken in a natural environment rich in accidental details, the occurrence of quantifier spreading was radically reduced. It is shown that an extra object in the visual stimulus can lead to the rejection of the sentence also in the case of sentences involving no quantification, which gives further support to the claim that the source of the problem is not (or not only) the grammatical or cognitive difficulty of quantification but the unintended ostensive effect of the extra object. This article is part of the special collection: Acquisition of Quantification.

  11. Quantifying graininess of glossy food products

    DEFF Research Database (Denmark)

    Møller, Flemming; Carstensen, Jens Michael

    The sensory quality of yoghurt can be altered when changing the milk composition or processing conditions. Part of the sensory quality may be assessed visually. It is described how a non-contact method for quantifying surface gloss and grains in yoghurt can be made. It was found that the standard deviation of the entire image, evaluated at different scales in a Gaussian image pyramid, was a measure of the graininess of yoghurt. This methodology is used to predict graininess (or grittiness) and to evaluate the effect of yoghurt composition and processing.
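
    A minimal version of the described measure can be written with a Gaussian pyramid built by repeated blur-and-downsample steps, recording the image standard deviation at each level. The test images below are synthetic stand-ins for yoghurt surface images, and the parameters are arbitrary.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def pyramid_graininess(image, levels=4):
            """Standard deviation of the image at successive levels of a
            Gaussian pyramid (blur, then downsample by two per level)."""
            img = image.astype(float)
            stds = []
            for _ in range(levels):
                stds.append(float(img.std()))
                img = gaussian_filter(img, sigma=1.0)[::2, ::2]
            return stds

        rng = np.random.default_rng(3)
        smooth = gaussian_filter(rng.normal(size=(256, 256)), sigma=8.0)
        grainy = smooth + 0.5 * rng.normal(size=(256, 256))
        print(pyramid_graininess(smooth))
        print(pyramid_graininess(grainy))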

  12. Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.

    Science.gov (United States)

    Richie, Megan; Josephson, S Andrew

    2017-07-28

    Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained

  13. Quantifying meta-correlations in financial markets

    Science.gov (United States)

    Kenett, Dror Y.; Preis, Tobias; Gur-Gershgoren, Gitit; Ben-Jacob, Eshel

    2012-08-01

    Financial markets are modular multi-level systems, in which the relationships between the individual components are not constant in time. Sudden changes in these relationships significantly affect the stability of the entire system, and vice versa. Our analysis is based on historical daily closing prices of the 30 components of the Dow Jones Industrial Average (DJIA) from March 15th, 1939 until December 31st, 2010. We quantify the correlation among these components by determining Pearson correlation coefficients, to investigate whether mean correlation of the entire portfolio can be used as a precursor for changes in the index return. To this end, we quantify the meta-correlation - the correlation of mean correlation and index return. We find that changes in index returns are significantly correlated with changes in mean correlation. Furthermore, we study the relationship between the index return and correlation volatility - the standard deviation of correlations for a given time interval. This parameter provides further evidence of the effect of the index on market correlations and their fluctuations. Our empirical findings provide new information and quantification of the index leverage effect, and have implications to risk management, portfolio optimization, and to the increased stability of financial markets.
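
    Once returns are in hand, the meta-correlation is straightforward to compute: take the rolling mean pairwise correlation of the components and correlate it with the index return. A pandas sketch on synthetic prices (not DJIA data) follows; the rolling mean here includes the unit diagonal of each correlation matrix, which is harmless for a relative measure.

        import numpy as np
        import pandas as pd

        def meta_correlation(prices, window=22):
            """Correlation between the rolling mean pairwise correlation of
            the components and the equal-weight index return."""
            returns = prices.pct_change().dropna()
            index_return = returns.mean(axis=1)
            mean_corr = (returns.rolling(window).corr()   # (date, asset) index
                         .groupby(level=0).mean().mean(axis=1)
                         .dropna())
            both = pd.concat([mean_corr, index_return], axis=1, join="inner")
            return both.corr().iloc[0, 1]

        rng = np.random.default_rng(4)
        shocks = 0.3 * rng.normal(size=(500, 1)) + rng.normal(size=(500, 8))
        prices = pd.DataFrame(100.0 * np.exp(np.cumsum(0.01 * shocks, axis=0)))
        print(meta_correlation(prices))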

  14. Quantifying the synchronizability of externally driven oscillators.

    Science.gov (United States)

    Stefański, Andrzej

    2008-03-01

    This paper is focused on the problem of complete synchronization in arrays of externally driven identical or slightly different oscillators. These oscillators are coupled by common driving, which makes an occurrence of generalized synchronization between a driving signal and response oscillators possible. Therefore, the phenomenon of generalized synchronization is also analyzed here. The research is concentrated on the cases of an irregular (chaotic or stochastic) driving signal acting on continuous-time (Duffing systems) and discrete-time (Henon maps) response oscillators. As a tool for quantifying the robustness of the synchronized state, response (conditional) Lyapunov exponents are applied. The most significant result presented in this paper is a novel method of estimating the largest response Lyapunov exponent. This approach is based on the complete synchronization of two twin response subsystems via an additional master-slave coupling between them. Examples of the method's application and its comparison with the classical algorithm for the calculation of Lyapunov exponents are demonstrated extensively. Finally, the idea of effective response Lyapunov exponents, which allows us to quantify the synchronizability in the case of slightly different response oscillators, is introduced.
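
    The essence of the twin-subsystem estimate can be shown with a driven one-dimensional map: run two copies of the response from slightly different states under the same chaotic drive and track the renormalized growth rate of their separation, which converges to the largest conditional Lyapunov exponent. The coupling scheme and parameters below are illustrative and are not taken from the paper.

        import numpy as np

        def conditional_lyapunov(drive, f, y0, eps=1e-8, discard=100):
            """Largest conditional (response) Lyapunov exponent of a driven
            map y_{n+1} = f(y_n, x_n), from two twin response copies whose
            separation is renormalized to eps after every step."""
            y, y2 = y0, y0 + eps
            acc, count = 0.0, 0
            for n, x in enumerate(drive):
                y, y2 = f(y, x), f(y2, x)
                d = abs(y2 - y)
                if d == 0.0:
                    d, y2 = eps, y + eps          # re-seed a tiny separation
                if n >= discard:
                    acc += np.log(d / eps)
                    count += 1
                y2 = y + eps * (y2 - y) / d       # renormalize the twin
            return acc / count

        # Chaotic logistic drive and a linearly coupled logistic response.
        x, drive = 0.3, []
        for _ in range(5000):
            x = 4.0 * x * (1.0 - x)
            drive.append(x)

        c = 0.7   # coupling strength; stronger coupling -> more negative exponent
        f = lambda y, x: (1.0 - c) * 3.9 * y * (1.0 - y) + c * x
        print(conditional_lyapunov(drive, f, y0=0.4))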

  15. An optimised method for quantifying glenoid orientation

    Directory of Open Access Journals (Sweden)

    Amadi Hippolite

    2008-01-01

    A robust quantification method is essential for inter-subject glenoid comparison and planning of total shoulder arthroplasty. This study compared various scapular and glenoid axes with each other in order to optimally define the most appropriate method of quantifying glenoid version and inclination. Six glenoid and eight scapular axes were defined and quantified from identifiable landmarks of twenty-one scapular image scans. The pathology independence of each axis, and its insensitivity to inter-subject morphological variation within its region, were tested. Glenoid version and inclination were calculated using the best axes from the two regions. The best glenoid axis was the normal to a least-squares plane fit on the glenoid rim, directed approximately medio-laterally. The best scapular axis was the normal to a plane formed by the spine root and the lateral border ridge. Glenoid inclination was 15.7° ± 5.1° superiorly, and version was 4.9° ± 6.1° of retroversion. The choice of axes in the present technique makes it insensitive to pathology and scapular morphological variability. Its application would effectively improve inter-subject glenoid version comparison, surgical planning, and the design of prostheses for shoulder arthroplasty.
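
    The "normal to a least-squares plane fit on the glenoid rim" step is a small singular-value-decomposition computation. The sketch below, with made-up rim coordinates, also shows how an angle between that normal and a chosen scapular axis would be obtained; the anatomical sign conventions are left aside.

        import numpy as np

        def plane_normal(points):
            """Unit normal of the least-squares plane through 3-D points
            (e.g., digitized glenoid rim landmarks), via SVD."""
            centered = points - points.mean(axis=0)
            return np.linalg.svd(centered)[2][-1]   # direction of least spread

        def axis_angle_deg(a, b):
            """Unsigned angle (degrees) between two axes."""
            cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

        rim = np.array([[0.0, 0.0, 0.10], [1.0, 0.0, -0.05], [1.0, 1.0, 0.00],
                        [0.0, 1.0, 0.05], [0.5, 1.2, 0.02], [1.2, 0.5, -0.02]])
        scapular_axis = np.array([0.0, 0.0, 1.0])   # stand-in reference axis
        print(axis_angle_deg(plane_normal(rim), scapular_axis))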

  16. Quantifying chemical reactions by using mixing analysis.

    Science.gov (United States)

    Jurado, Anna; Vázquez-Suñé, Enric; Carrera, Jesús; Tubau, Isabel; Pujades, Estanislao

    2015-01-01

    This work is motivated by a sound understanding of the chemical processes that affect organic pollutants in an urban aquifer. We propose an approach to quantify such processes using mixing calculations. The methodology consists of the following steps: (1) identification of the recharge sources (end-members) and selection of the species (conservative and non-conservative) to be used, (2) identification of the chemical processes, and (3) evaluation of mixing ratios including the chemical processes. This methodology has been applied in the Besòs River Delta (NE Barcelona, Spain), where the River Besòs is the main aquifer recharge source. A total of 51 groundwater samples were collected from July 2007 to May 2010 during four field campaigns. Three river end-members were necessary to explain the temporal variability of the River Besòs: one river end-member from wet periods (W1) and two from dry periods (D1 and D2). This methodology has proved useful not only for computing the mixing ratios but also for quantifying processes such as calcite and magnesite dissolution, aerobic respiration, and denitrification at each observation point.
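
    Step (3), the mixing calculation itself, amounts to solving a small linear system: each conservative species contributes one equation relating end-member concentrations to the sample, and a sum-to-one condition on the mixing ratios is appended as an extra (soft) equation. A Python sketch with hypothetical concentrations, not the Besòs data:

        import numpy as np

        def mixing_ratios(endmembers, sample):
            """Least-squares mixing ratios reproducing a sample's conservative
            species from end-member concentrations; a sum-to-one row is
            appended as a soft constraint (non-negativity not enforced).

            endmembers: (n_species, n_endmembers); sample: (n_species,)."""
            n_em = endmembers.shape[1]
            A = np.vstack([endmembers, np.ones((1, n_em))])
            b = np.append(sample, 1.0)
            return np.linalg.lstsq(A, b, rcond=None)[0]

        # Hypothetical Cl and SO4 concentrations for three river end-members.
        em = np.array([[100.0, 60.0, 20.0],
                       [ 40.0, 80.0, 10.0]])
        print(mixing_ratios(em, np.array([70.0, 55.0])))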

  17. Quantifying the efficiency of river regulation

    Directory of Open Access Journals (Sweden)

    R. Rödel

    2005-01-01

    Dam-affected hydrologic time series give rise to uncertainties when they are used for calibrating large-scale hydrologic models or for analysing runoff records. It is therefore necessary to identify and to quantify the impact of impoundments on runoff time series. Two different approaches were employed. The first, classic approach compares the volume of the dams that are located upstream from a station with the annual discharge. The catchment areas of the stations are calculated and then related to geo-referenced dam attributes. The paper introduces a data set of geo-referenced dams linked with 677 gauging stations in Europe. Second, the intensity of the impoundment impact on runoff time series can be quantified more exactly and directly when long-term runoff records are available. Dams cause a change in the variability of flow regimes. This effect can be measured using a linear single-storage model. The dam-caused storage change ΔS can be assessed through the volume of the emptying process between two flow regimes. As an example, the storage change ΔS is calculated for regulated long-term series of the Luleälven in northern Sweden.

  18. Quantifying lateral tissue heterogeneities in hadron therapy.

    Science.gov (United States)

    Pflugfelder, D; Wilkens, J J; Szymanowski, H; Oelfke, U

    2007-04-01

    In radiotherapy with scanned particle beams, tissue heterogeneities lateral to the beam direction are problematic in two ways: they pose a challenge to dose calculation algorithms, and they lead to a high sensitivity to setup errors. In order to quantify and avoid these problems, a heterogeneity number H(i) is introduced to quantify the lateral tissue heterogeneity of a single beam spot i. To evaluate this new concept, two kinds of potential errors were investigated for single beam spots. First, the dose calculation error was obtained by comparing the dose distribution computed by a simple pencil beam algorithm to more accurate Monte Carlo simulations. The resulting error is clearly correlated with H(i). Second, the analysis of the sensitivity of single beam spots to setup errors also showed a dependence on H(i). From these data it is concluded that H(i) can be used as a criterion to assess the risk of a compromised delivered dose due to lateral tissue heterogeneities. Furthermore, a method for incorporating this information into the inverse planning process for intensity-modulated proton therapy is presented. By suppressing beam spots with a high value of H(i), the unfavorable impact of lateral tissue heterogeneities can be reduced, leading to treatment plans that are more robust to dose calculation errors of the pencil beam algorithm. Additional possibilities for using the information of H(i) are outlined in the discussion.

  19. Computed tomography to quantify tooth abrasion

    Science.gov (United States)

    Kofmehl, Lukas; Schulz, Georg; Deyhle, Hans; Filippi, Andreas; Hotz, Gerhard; Berndt-Dagassan, Dorothea; Kramis, Simon; Beckmann, Felix; Müller, Bert

    2010-09-01

    Cone-beam computed tomography, also termed digital volume tomography, has become a standard technique in dentistry, allowing for fast 3D jaw imaging, including the denture, at moderate spatial resolution. More detailed X-ray images of restricted volumes for post-mortem studies in dental anthropology are obtained by means of micro computed tomography. The present study evaluates the impact of pipe-smoking wear on tooth morphology by comparing the abraded tooth with its contralateral counterpart. A set of 60 teeth, loose or anchored in the jaw, from 12 dentitions has been analyzed. After the two contralateral teeth were scanned, one dataset was mirrored before the two datasets were registered using affine and rigid registration algorithms. Rigid registration provides three translational and three rotational parameters to maximize the overlap of two rigid bodies. For the affine registration, three scaling factors are incorporated. Within the present investigation, affine and rigid registrations yield comparable values. The restriction to the six parameters of the rigid registration is not a limitation. The differences in size and shape between the tooth and its contralateral counterpart generally amount to only a few percent in the non-abraded volume, validating that the contralateral tooth is a reasonable approximation for quantifying, for example, the volume loss resulting from long-term clay pipe smoking. Therefore, this approach allows quantifying the impact of pipe abrasion on the internal tooth morphology, including root canal, dentin, and enamel volumes.

  20. Methods for quantifying training in sprint kayak.

    Science.gov (United States)

    Borges, Thiago Oliveira; Bullock, Nicola; Duff, Christine; Coutts, Aaron J

    2014-02-01

    The aims of this study were to determine the validity of the session rating of perceived exertion (session-RPE) method by comparing 3 different scales of perceived exertion with common measures of training load (TL). A secondary aim was to verify the relationship between TLs, fitness, and performance in Sprint Kayak athletes. After laboratory assessment of maximal oxygen uptake (V̇O2peak) and lactate threshold, the athletes performed on-water time trials over 200 and 1,000 m. Training load was quantified for external (distance and speed) and internal (session-RPE: 6-20, category ratio [CR]-10 and CR-100 scales, training impulse [TRIMP], and individual TRIMP) measures. Ten (6 male, 4 female) well-trained junior Sprint Kayak athletes (age 17.1 ± 1.2 years; V̇O2peak 4.2 ± 0.7 L·min⁻¹) were monitored over a 7-week period. There were large-to-very large within-individual correlations between session distance and the various heart rate (HR) and RPE-based methods for quantifying TL (0.58-0.91). Correlations between mean session speed and the various HR- and RPE-based methods for quantifying TL were small to large (0.12-0.50). The within-individual relationships between the various objective and subjective methods of internal TL were large to very large (0.62-0.94). Moderate-to-large inverse relationships were found between mean session-RPE TL and various aerobic fitness variables (-0.58 to -0.37). Large-to-very large relationships were found between mean session-RPE TL and on-water performance (0.57-0.75). In conclusion, session-RPE is a valid method for monitoring TL for junior Sprint Kayak athletes, regardless of the RPE scale used. The session-RPE TL relates to fitness and performance, supporting the use of session-RPE in Sprint Kayak training.
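
    The internal-load quantities compared in the study are simple to compute. Below is a sketch of the session-RPE load (RPE times session duration) and of one common variant of Banister's TRIMP; the exponential weighting constants are the usual published ones, and the example numbers are made up.

        import math

        def session_rpe_load(rpe, duration_min):
            """Session-RPE training load: post-session RPE times duration (min)."""
            return rpe * duration_min

        def banister_trimp(duration_min, hr_rest, hr_max, hr_avg, male=True):
            """Banister's TRIMP: duration times heart-rate-reserve fraction,
            weighted exponentially (1.92 for men, 1.67 for women)."""
            x = (hr_avg - hr_rest) / (hr_max - hr_rest)
            return duration_min * x * 0.64 * math.exp((1.92 if male else 1.67) * x)

        print(session_rpe_load(rpe=7, duration_min=60))              # 420 AU
        print(round(banister_trimp(60, 55, 195, 155, male=True), 1))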

  1. Quantifying the topology of porous structures

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, J. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    Computerized x-ray tomography, with microscopic resolution, has been used to volumetrically visualize the evolution of porosity in a ceramic matrix composite during processing. The topological variables describing the porosity have been measured. The evolution of the porosity exhibits critical scaling behavior near final consolidation, and appears to be independent of the structure (universality).

  2. How to quantify conduits in wood?

    Directory of Open Access Journals (Sweden)

    Alexander Scholz

    2013-03-01

    Vessels and tracheids represent the most important xylem cells with respect to long distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and a wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environment parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.

  3. Quantifying creativity: can measures span the spectrum?

    Science.gov (United States)

    Simonton, Dean Keith

    2012-01-01

    Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum. PMID:22577309

  4. Message passing for quantified Boolean formulas

    CERN Document Server

    Zhang, Pan; Zdeborová, Lenka; Zecchina, Riccardo

    2012-01-01

    We introduce two types of message passing algorithms for quantified Boolean formulas (QBF). The first type is a message-passing-based heuristic that can prove unsatisfiability of a QBF by assigning the universal variables in such a way that the remaining formula is unsatisfiable. In the second type, we use message passing to guide the branching heuristics of a Davis-Putnam-Logemann-Loveland (DPLL) complete solver. Numerical experiments show that on random QBFs our branching heuristic gives a robust exponential efficiency gain with respect to state-of-the-art solvers. We also manage to solve some previously unsolved benchmarks from the QBFLIB library. Apart from this, our study sheds light on using message passing in small systems and as subroutines in complete solvers.

  5. Quantifying decoherence in continuous variable systems

    Energy Technology Data Exchange (ETDEWEB)

    Serafini, A [Dipartimento di Fisica 'ER Caianiello', Università di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)]; Paris, M G A [Dipartimento di Fisica and INFM, Università di Milano, Milan (Italy)]; Illuminati, F [Dipartimento di Fisica 'ER Caianiello', Università di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)]; De Siena, S [Dipartimento di Fisica 'ER Caianiello', Università di Salerno, INFM UdR Salerno, INFN Sezione Napoli, Gruppo Collegato Salerno, Via S Allende, 84081 Baronissi, SA (Italy)]

    2005-04-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some non-classicality indicators in phase space, and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wavepackets. (review article)

  6. Quantifying truncation errors in effective field theory

    CERN Document Server

    Furnstahl, R J; Phillips, D R; Wesolowski, S

    2015-01-01

    Bayesian procedures designed to quantify truncation errors in perturbative calculations of quantum chromodynamics observables are adapted to expansions in effective field theory (EFT). In the Bayesian approach, such truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. Computation of these intervals requires specification of prior probability distributions ("priors") for the expansion coefficients. By encoding expectations about the naturalness of these coefficients, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. It also permits exploration of the ways in which such error bars are, and are not, sensitive to assumptions about EFT-coefficient naturalness. We first demonstrate the calculation of Bayesian probability distributions for the EFT truncation error in some representative examples, and then focus on the application of chiral EFT to neutron-pr...

  7. Quantifying interspecific coagulation efficiency of phytoplankton

    DEFF Research Database (Denmark)

    Hansen, J.L.S.; Kiørboe, Thomas

    1997-01-01

    Non-sticky latex beads and sticky diatoms were used as models to describe mutual coagulation between sticky and non-sticky particles. In mixed suspensions of beads and Thalassiosira nordenskjoeldii, both types of particles coagulated into mixed aggregates at specific rates, from which the interspecific coagulation efficiency between the beads and T. nordenskjoeldii could be estimated. Mutual coagulation between Skeletonema costatum and the non-sticky cells of Ditylum brightwellii also proceeded with half the efficiency of S. costatum alone. The latex beads were suitable to be used as 'standard particles' to quantify the ability of phytoplankton to prime aggregation...

  8. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the Deutsche Forschungsgemeinschaft (DFG) approved the Priority Program 1324 "Mathematical Methods for Extracting Quantifiable Information from Complex Systems." This volume presents a comprehensive overview of the most important results obtained over the course of the program. Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance. Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges. Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  9. Quantifying Power Grid Risk from Geomagnetic Storms

    Science.gov (United States)

    Homeier, N.; Wei, L. H.; Gannon, J. L.

    2012-12-01

    We are creating a statistical model of the geophysical environment that can be used to quantify the geomagnetic storm hazard to power grid infrastructure. Our model is developed using a database of surface electric fields for the continental United States during a set of historical geomagnetic storms. These electric fields are derived from the SUPERMAG compilation of worldwide magnetometer data and surface impedances from the United States Geological Survey. This electric field data can be combined with a power grid model to determine GICs per node and reactive MVARs at each minute during a storm. Using publicly available substation locations, we derive relative risk maps by location by combining magnetic latitude and ground conductivity. We also estimate the surface electric fields during the August 1972 geomagnetic storm that caused a telephone cable outage across the middle of the United States. This event produced the largest surface electric fields in the continental U.S. in at least the past 40 years.

  10. A Simulation Platform for Quantifying Survival Bias

    DEFF Research Database (Denmark)

    Mayeda, Elizabeth Rose; Tchetgen Tchetgen, Eric J; Power, Melinda C

    2016-01-01

    Bias due to selective mortality is a potential concern in many studies and is especially relevant in cognitive aging research because cognitive impairment strongly predicts subsequent mortality. Biased estimation of the effect of an exposure on the rate of cognitive decline can occur when mortality depends on both the exposure and determinants of cognitive decline. We present a simulation platform with which to quantify the expected bias in longitudinal studies of determinants of cognitive decline. We evaluated potential survival bias in naive analyses under several selective survival scenarios, assuming that exposure had no effect on cognitive decline for anyone in the population. This simulation platform provides a flexible tool for evaluating biases in studies with high mortality, as is common in cognitive aging research.
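
    A toy version of the kind of scenario such a platform explores: exposure has no effect on decline for anyone, but both exposure and an unmeasured determinant of decline raise mortality, so a naive comparison among survivors shows a spurious protective effect. All parameters below are arbitrary.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 200_000

        # Exposure has NO effect on cognitive decline in this population.
        exposure = rng.binomial(1, 0.5, n)
        frailty = rng.normal(size=n)                  # unmeasured determinant
        decline = 1.0 + 0.5 * frailty                 # true slope, exposure-free

        # Mortality before follow-up depends on exposure AND frailty.
        logit = -1.0 + 1.2 * exposure + 1.2 * frailty
        alive = rng.uniform(size=n) > 1 / (1 + np.exp(-logit))

        # Naive survivor-only contrast suggests a spurious "protective" effect.
        bias = (decline[alive & (exposure == 1)].mean()
                - decline[alive & (exposure == 0)].mean())
        print(f"naive exposure 'effect' on decline among survivors: {bias:.3f}")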

  11. Quantifying the risk of extreme aviation accidents

    Science.gov (United States)

    Das, Kumer Pial; Dey, Asim Kumer

    2016-12-01

    Air travel is considered a safe means of transportation, but when aviation accidents do occur they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year of the past decade, with 111 plane crashes, among which the worst four caused 298, 239, 162 and 116 deaths. In this study, we assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from an aviation accident in the future. The fitted model is compared with some of its competitor models. The uncertainty in the inferences is quantified using simulated aviation accident series generated by bootstrap resampling and Monte Carlo simulations.
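
    The peaks-over-threshold step can be sketched with scipy: fit a generalized Pareto distribution to fatality counts in excess of a high threshold and read off a return level. Everything below is synthetic; the study fits actual historical accident records and compares competing models.

        from scipy.stats import genpareto

        # Synthetic excesses over a fatality threshold (hypothetical data).
        threshold = 50
        excesses = genpareto.rvs(c=0.3, scale=40.0, size=300, random_state=7)

        shape, _, scale = genpareto.fit(excesses, floc=0)   # fit shape and scale
        # Fatality level exceeded, on average, once per 100 threshold exceedances.
        level = threshold + genpareto.ppf(1 - 1 / 100, shape, loc=0, scale=scale)
        print(f"shape={shape:.2f}, scale={scale:.1f}, 1-in-100 level={level:.0f}")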

  12. Historic Food Production Shocks: Quantifying the Extremes

    Directory of Open Access Journals (Sweden)

    Aled W. Jones

    2016-04-01

    Understanding global food production trends is vital for ensuring food security and for allowing the world to develop appropriate policies to manage the food system. Over the past few years, there has been increasing attention on the global food system, particularly after the extreme shocks seen in food prices after 2007. Several papers and working groups have explored the links between food production and various societal impacts; however, they often categorise production shocks in different ways, even to the extent of identifying different levels, countries and timings for shocks. In this paper we present a simple method to quantify and categorise cereal production shocks at a country level. This method can be used as a baseline for other studies that examine the impact of these production shocks on the global food system.

  13. Quantifying the Anthropogenic Footprint in Eastern China

    Science.gov (United States)

    Meng, Chunlei; Dou, Youjun

    2016-04-01

    Urban heat island (UHI) is one of the main foci of urban climate studies. The parameterization of anthropogenic heat (AH) is crucially important in UHI studies, but a universal method to parameterize the spatial pattern of AH is still lacking. This paper uses NOAA DMSP/OLS nighttime light data to parameterize the spatial pattern of AH. Two experiments were designed and performed to quantify the influence of AH on land surface temperature (LST) in eastern China and in 24 big cities. The annual mean heating caused by AH is up to 1 K in eastern China. This paper uses the relative LST differences, rather than the absolute LST differences, between the control run and the contrast run of the Common Land Model (CoLM) to find the drivers. The heating effect of the anthropogenic footprint has less influence on relatively warm and wet cities.

  14. Quantifying decoherence in continuous variable systems

    CERN Document Server

    Serafini, A; Illuminati, F; De Siena, S

    2005-01-01

    We present a detailed report on the decoherence of quantum states of continuous variable systems under the action of a quantum optical master equation resulting from the interaction with general Gaussian uncorrelated environments. The rate of decoherence is quantified by relating it to the decay rates of various, complementary measures of the quantum nature of a state, such as the purity, some nonclassicality indicators in phase space and, for two-mode states, entanglement measures and total correlations between the modes. Different sets of physically relevant initial configurations are considered, including one- and two-mode Gaussian states, number states, and coherent superpositions. Our analysis shows that, generally, the use of initially squeezed configurations does not help to preserve the coherence of Gaussian states, whereas it can be effective in protecting coherent superpositions of both number states and Gaussian wave packets.

  15. Another analytic view about quantifying social forces

    CERN Document Server

    Ausloos, Marcel

    2012-01-01

    Montroll had considered a Verhulst evolution approach for introducing a notion he called a "social force", to describe a jump in some economic output when a new technology or product outcompetes a previous one. In fact, Montroll's adaptation of the Verhulst equation is more like an economic field description than a "social force". The empirical Verhulst logistic function and the Gompertz double exponential law are used here in order to present an alternative view, within a similar mechanistic physics framework. As an example, a "social force" modifying the rate at which temples were constructed by a religious movement, the Antoinist community, between 1910 and 1940 in Belgium is found and quantified. Practically, two temple inauguration regimes are seen to exist over different time spans, separated by a gap attributed to a specific "constraint", a taxation system, but allowing for a different, smooth evolution rather than a jump. The impulse force duration is also emphasized as being better taken into account w...

  16. Quantifying the Cognitive Extent of Science

    CERN Document Server

    Milojević, Staša

    2015-01-01

    While modern science is characterized by an exponential growth in scientific literature, the increase in publication volume clearly does not reflect the expansion of the cognitive boundaries of science. Nevertheless, most of the metrics for assessing the vitality of science or for making funding and policy decisions are based on productivity. Similarly, the increasing level of knowledge production by large science teams, whose results often enjoy greater visibility, does not necessarily mean that "big science" leads to cognitive expansion. Here we present a novel, big-data method to quantify the extents of the cognitive domains of different bodies of scientific literature independently of publication volume, and apply it to 20 million articles published over 60-130 years in physics, astronomy, and biomedicine. The method is based on the lexical diversity of titles of fixed quotas of research articles. Owing to the large size of the quotas, the method overcomes the inherent stochasticity of article titles to achieve...
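
    The abstract describes the method only at a high level. A toy version of title-based lexical diversity, counting distinct word types in a fixed quota of article titles, could look like this (the regex tokenization and the quota size are illustrative assumptions):

        import re

        def lexical_diversity(titles, quota=1000):
            """Count distinct word types in a fixed quota of article titles.
            Fixing the quota, rather than using all titles, removes the direct
            dependence on publication volume (illustrative implementation)."""
            types = set()
            for title in titles[:quota]:
                types.update(re.findall(r"[a-z]+", title.lower()))
            return len(types)

        titles = ["Quantifying the cognitive extent of science",
                  "Memory in microbes", "Quantifying decoherence"]
        print(lexical_diversity(titles, quota=3))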

  17. How to quantify conduits in wood?

    Science.gov (United States)

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. Because of high anatomical variation and the wide range of techniques available, there is no standard protocol for quantifying conduits; this paper therefore aims to provide an overview of the various techniques. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, measuring various characters remains time-consuming and tedious. Quantification of vessels and tracheids is not only important for better understanding functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.

  18. Quantifying capital goods for waste landfilling

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Stentsøe, Steen; Willumsen, Hans Christian

    2013-01-01

    Materials and energy used for construction of a hill-type landfill of 4 million m3 were quantified in detail. The landfill is engineered with a liner and leachate collection system, as well as a gas collection and control system. Gravel and clay were the most common materials used, amounting to approximately 260 kg per tonne of waste landfilled. The environmental burdens from the extraction and manufacturing of the materials used in the landfill, as well as from the construction of the landfill, were modelled as potential environmental impacts. For example, the potential impact on global warming was 2.5 kg carbon dioxide (CO2) equivalents or 0.32 milli person equivalents per tonne of waste. The potential impacts from the use of materials and construction of the landfill are low-to-insignificant compared with data reported in the literature on impact potentials of landfills in operation...

  19. Quantifying capital goods for waste incineration

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Riber, C.; Christensen, Thomas Højlund

    2013-01-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed...

  20. Quantifying creativity: can measures span the spectrum?

    Science.gov (United States)

    Simonton, Dean Keith

    2012-03-01

    Because cognitive neuroscientists have become increasingly interested in the phenomenon of creativity, the issue arises of how creativity is to be optimally measured. Unlike intelligence, which can be assessed across the full range of intellectual ability, creativity measures tend to concentrate on different sections of the overall spectrum. After first defining creativity in terms of the three criteria of novelty, usefulness, and surprise, this article provides an overview of the available measures. Not only do these instruments vary according to whether they focus on the creative process, person, or product, but they differ regarding whether they tap into "little-c" versus "Big-C" creativity; only productivity and eminence measures reach into genius-level manifestations of the phenomenon. The article closes by discussing whether various alternative assessment techniques can be integrated into a single measure that quantifies creativity across the full spectrum.

  1. Quantifying structural states of soft mudrocks

    Science.gov (United States)

    Li, B.; Wong, R. C. K.

    2016-05-01

    In this paper, a cm model is proposed to quantify the structural states of soft mudrocks, which are dependent on clay fractions and porosities. Physical properties of natural and reconstituted soft mudrock samples are used to derive the two parameters in the cm model. With the cm model, a simplified homogenization approach is proposed to estimate geomechanical properties and fabric orientation distributions of soft mudrocks based on mixture theory. Soft mudrocks are treated as a mixture of nonclay minerals and clay-water composites. Nonclay minerals have a high stiffness and serve as a structural framework of mudrocks when they have a high volume fraction. Clay-water composites occupy the void space among nonclay minerals and serve as an in-fill matrix. With increasing volume fraction of clay-water composites, there is a transition in the structural state from framework supported to matrix supported. The decreases in shear strength and pore size, as well as the increases in compressibility and fabric anisotropy, are quantitatively related to this transition. The new homogenization approach based on the proposed cm model performs better than common effective-medium modeling approaches because the interactions among nonclay minerals and clay-water composites are considered. With wireline logging data, the cm model is applied to quantify the structural states of Colorado shale formations at different depths in the Cold Lake area, Alberta, Canada. Key geomechanical parameters are estimated with the proposed homogenization approach, and critical intervals with low-strength shale formations are identified.
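
    For orientation only, the stiffness of such a two-phase mixture is classically bracketed by the Voigt and Reuss averages (generic reference formulas, not the paper's cm model or its homogenization scheme):

        % Classical Voigt (upper) and Reuss (lower) stiffness bounds for a
        % two-phase mixture; generic formulas, not the paper's cm model.
        % f = volume fraction of nonclay minerals, E_1, E_2 = phase moduli:
        E_{\mathrm{Voigt}} = f E_{1} + (1 - f) E_{2}, \qquad
        E_{\mathrm{Reuss}} = \left( \frac{f}{E_{1}} + \frac{1 - f}{E_{2}} \right)^{-1} .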

  2. Quantifier hierarchies over the first-Order definable tree languages

    Institute of Scientific and Technical Information of China (English)

    沈云付

    1996-01-01

    Using Boolean operations and the concatenation product with respect to special trees, quantifier hierarchies are given by alternating existential and universal quantifiers for the first-order definable tree languages.

  3. Quantifying capital goods for biological treatment of organic waste

    DEFF Research Database (Denmark)

    Brogaard, Line Kai-Sørensen; Petersen, Per H.; Nielsen, Peter D.

    2015-01-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified bas...

  4. QUANTIFYING LIFE STYLE IMPACT ON LIFESPAN

    Directory of Open Access Journals (Sweden)

    Antonello Lorenzini

    2012-12-01

    Full Text Available A healthy diet, physical activity and avoiding dangerous habits such as smoking are effective ways of increasing health and lifespan. Although a significant portion of the world's population, especially children, still suffers from malnutrition, the most common causes of death in the world today are non-communicable diseases. Overweight and obesity significantly increase the relative risk for the most relevant non-communicable diseases: cardiovascular disease, type II diabetes and some cancers. Childhood overweight also seems to increase the likelihood of disease in adulthood through epigenetic mechanisms. This worrisome trend, now termed "globesity", will deeply impact society unless preventive strategies are put into effect. Researchers of the basic biology of aging have clearly established that animals with short lifespans live longer when their diet is calorie restricted. Although similar experiments carried out on rhesus monkeys, a longer-lived species more closely related to humans, yielded mixed results, overall the available scientific data suggest that keeping the body mass index in the "normal" range increases the chances of living a longer, healthier life. This can be successfully achieved both by maintaining a healthy diet and by engaging in physical activity. In this review we try to quantify the relative impact of lifestyle choices on lifespan.

  5. Quantifying the reheating temperature of the universe

    Energy Technology Data Exchange (ETDEWEB)

    Mazumdar, Anupam [Consortium for Fundamental Physics, Lancaster University, Lancaster LA1 4YB (United Kingdom); Zaldívar, Bryan [Instituto de Fisica Teorica, IFT-UAM/CSIC, 28049 Madrid (Spain)

    2014-09-15

    The aim of this paper is to determine an exact definition of the reheat temperature for a generic perturbative decay of the inflaton. In order to estimate the reheat temperature, there are two important conditions one needs to satisfy: (a) the decay products of the inflaton must dominate the energy density of the universe, i.e. the universe becomes completely radiation dominated, and (b) the decay products of the inflaton have attained local thermodynamical equilibrium. For some choices of parameters, the latter is a more stringent condition, such that the decay products may thermalise well after the beginning of radiation domination. Consequently, we find that the reheat temperature can be much lower than the standard-lore estimation. In this paper we describe under what conditions our universe could have efficient or inefficient thermalisation, and quantify the reheat temperature for both scenarios. This result has an immediate impact on many applications which rely on the thermal history of the universe, in particular gravitino abundance.
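
    For context, the standard-lore estimate that the paper improves upon follows from equating the inflaton decay rate to the Hubble rate at the onset of radiation domination (textbook expression; Γφ is the inflaton decay width, g* the number of relativistic degrees of freedom, MP the reduced Planck mass):

        % Standard-lore reheat temperature from \Gamma_\phi \simeq H(T_rh)
        % (textbook estimate, quoted for context only):
        T_{\mathrm{rh}} \simeq \left( \frac{90}{\pi^{2} g_{*}} \right)^{1/4}
        \sqrt{\Gamma_{\phi} M_{\mathrm{P}}} .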

  6. Quantifying the reheating temperature of the universe

    Directory of Open Access Journals (Sweden)

    Anupam Mazumdar

    2014-09-01

    Full Text Available The aim of this paper is to determine an exact definition of the reheat temperature for a generic perturbative decay of the inflaton. In order to estimate the reheat temperature, there are two important conditions one needs to satisfy: (a) the decay products of the inflaton must dominate the energy density of the universe, i.e. the universe becomes completely radiation dominated, and (b) the decay products of the inflaton have attained local thermodynamical equilibrium. For some choices of parameters, the latter is a more stringent condition, such that the decay products may thermalise well after the beginning of radiation domination. Consequently, we find that the reheat temperature can be much lower than the standard-lore estimation. In this paper we describe under what conditions our universe could have efficient or inefficient thermalisation, and quantify the reheat temperature for both scenarios. This result has an immediate impact on many applications which rely on the thermal history of the universe, in particular gravitino abundance.

  7. Quantifying and Mapping Global Data Poverty.

    Directory of Open Access Journals (Sweden)

    Mathias Leidig

    Full Text Available Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  8. Quantifying and Mapping Global Data Poverty.

    Science.gov (United States)

    Leidig, Mathias; Teeuw, Richard M

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction.

  9. Quantifying the evolutionary dynamics of language.

    Science.gov (United States)

    Lieberman, Erez; Michel, Jean-Baptiste; Jackson, Joe; Tang, Tina; Nowak, Martin A

    2007-10-11

    Human language is based on grammatical rules. Cultural evolution allows these rules to change over time. Rules compete with each other: as new rules rise to prominence, old ones die away. To quantify the dynamics of language evolution, we studied the regularization of English verbs over the past 1,200 years. Although an elaborate system of productive conjugations existed in English's proto-Germanic ancestor, Modern English uses the dental suffix, '-ed', to signify past tense. Here we describe the emergence of this linguistic rule amidst the evolutionary decay of its exceptions, known to us as irregular verbs. We have generated a data set of verbs whose conjugations have been evolving for more than a millennium, tracking inflectional changes to 177 Old-English irregular verbs. Of these irregular verbs, 145 remained irregular in Middle English and 98 are still irregular today. We study how the rate of regularization depends on the frequency of word usage. The half-life of an irregular verb scales as the square root of its usage frequency: a verb that is 100 times less frequent regularizes 10 times as fast. Our study provides a quantitative analysis of the regularization process by which ancestral forms gradually yield to an emerging linguistic rule.
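
    The reported scaling can be restated compactly in symbols (this only rewrites the abstract's own claim; f denotes usage frequency):

        % Square-root scaling of irregular-verb half-life with usage
        % frequency f (restates the abstract's claim):
        t_{1/2} \propto \sqrt{f}
        \quad\Longrightarrow\quad
        t_{1/2}(f/100) = t_{1/2}(f)/10 .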

  10. Quantifying survival in patients with Proteus syndrome.

    Science.gov (United States)

    Sapp, Julie C; Hu, Lian; Zhao, Jean; Gruber, Ashlyn; Schwartz, Brian; Ferrari, Dora; Biesecker, Leslie G

    2017-06-29

    Purpose: Proteus syndrome is a rare mosaic overgrowth disorder that is associated with severe complications. While anecdotal data have suggested that the life span of affected patients is reduced, this has not been measured. Mortality data on rare diseases are critical for assessing treatments and other interventions. Methods: To address this, we used the clinical research records of 64 patients in a longitudinal natural history cohort at the National Institutes of Health to ascertain the data in an organized manner and estimate survival using a Kaplan-Meier approach. Results: The median age of diagnosis was 19 months. Based on this analysis, there was a 25% probability of death by 22 years of age. Ten of the 11 patients who died were younger than 22 years of age, and there was only a single death after this age. Conclusion: These data quantify the risk of premature death in Proteus syndrome, which can be used to support interventions and trials. Although the risk of death is substantial, the fact that only one patient died after 22 years of age supports anecdotal evidence that the disease process moderates after the end of adolescence. Interventions to reduce mortality should be targeted to the pediatric age range. GENETICS in MEDICINE advance online publication, 29 June 2017; doi:10.1038/gim.2017.65.

  11. Stimfit: quantifying electrophysiological data with Python

    Directory of Open Access Journals (Sweden)

    Segundo Jose Guzman

    2014-02-01

    Full Text Available Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals.
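
    Stimfit's own scripting interface is not reproduced in the abstract. As a generic illustration of the kind of analysis it automates, a minimal threshold-crossing detector for spontaneous events might look like this (plain NumPy; the threshold rule and refractory window are assumptions, not Stimfit's algorithms):

        import numpy as np

        def detect_events(trace, dt, threshold, refractory=5e-3):
            """Return times (s) where the signal crosses `threshold` upward,
            enforcing a refractory window to avoid double counting.
            Illustrative only; Stimfit ships more sophisticated detectors."""
            above = trace > threshold
            crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
            events, last_t = [], -np.inf
            for i in crossings:
                t = i * dt
                if t - last_t >= refractory:
                    events.append(t)
                    last_t = t
            return events

        rng = np.random.default_rng(0)
        trace = rng.normal(0, 1, 10000)
        trace[[2000, 2010, 6000]] += 8          # three synthetic events
        print(detect_events(trace, dt=1e-4, threshold=5.0))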

  12. Deciphering faces: quantifiable visual cues to weight.

    Science.gov (United States)

    Coetzee, Vinet; Chen, Jingying; Perrett, David I; Stephen, Ian D

    2010-01-01

    Body weight plays a crucial role in mate choice, as weight is related to both attractiveness and health. People are quite accurate at judging weight in faces, but the cues used to make these judgments have not been defined. This study consisted of two parts. First, we wanted to identify quantifiable facial cues that are related to body weight, as defined by body mass index (BMI). Second, we wanted to test whether people use these cues to judge weight. In study 1, we recruited two groups of Caucasian and two groups of African participants, determined their BMI and measured their 2-D facial images for: width-to-height ratio, perimeter-to-area ratio, and cheek-to-jaw-width ratio. All three measures were significantly related to BMI in males, while the width-to-height and cheek-to-jaw-width ratios were significantly related to BMI in females. In study 2, these images were rated for perceived weight by Caucasian observers. We showed that these observers use all three cues to judge weight in African and Caucasian faces of both sexes. These three facial cues, width-to-height ratio, perimeter-to-area ratio, and cheek-to-jaw-width ratio, are therefore not only related to actual weight but provide a basis for perceptual attributes as well.
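
    The study's measurement code is not given in the abstract. A toy version of the three ratios, computed from hypothetical 2-D landmark coordinates, is sketched below (the landmark definitions are assumptions for illustration):

        import numpy as np

        def polygon_perimeter_area(outline):
            """Perimeter and area of a closed 2-D polygon (shoelace formula)."""
            closed = np.vstack([outline, outline[:1]])
            seg = np.diff(closed, axis=0)
            perimeter = np.linalg.norm(seg, axis=1).sum()
            x, y = outline[:, 0], outline[:, 1]
            area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
            return perimeter, area

        def facial_cues(cheek_l, cheek_r, jaw_l, jaw_r, brow, lip, outline):
            """Three weight-related cues from hypothetical 2-D landmarks;
            the landmark choices are illustrative, not the study's protocol."""
            width = np.linalg.norm(cheek_r - cheek_l)   # cheekbone width
            height = np.linalg.norm(brow - lip)         # brow to upper lip
            jaw = np.linalg.norm(jaw_r - jaw_l)
            perimeter, area = polygon_perimeter_area(outline)
            return {"width_to_height": width / height,
                    "perimeter_to_area": perimeter / area,
                    "cheek_to_jaw_width": width / jaw}

        square = np.array([[0, 0], [4, 0], [4, 4], [0, 4]], dtype=float)
        print(polygon_perimeter_area(square))   # (16.0, 16.0)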

  13. Quantifying acoustic damping using flame chemiluminescence

    Science.gov (United States)

    Boujo, E.; Denisov, A.; Schuermans, B.; Noiray, N.

    2016-12-01

    Thermoacoustic instabilities in gas turbines and aeroengine combustors fall within the category of complex systems. They can be described phenomenologically using nonlinear stochastic differential equations, which constitute the grounds for output-only model-based system identification. It has been shown recently that one can extract the governing parameters of the instabilities, namely the linear growth rate and the nonlinear component of the thermoacoustic feedback, using dynamic pressure time series only. This is highly relevant for practical systems, which cannot be actively controlled owing to a lack of cost-effective actuators. The thermoacoustic stability is given by the linear growth rate, which results from the combination of the acoustic damping and the coherent feedback from the flame. In this paper, it is shown that it is possible to quantify the acoustic damping of the system, and thus to separate its contribution to the linear growth rate from that of the flame. This is achieved by post-processing, in a simple way, simultaneously acquired chemiluminescence and acoustic pressure data. It provides an additional approach to further unravel from observed time series the key mechanisms governing the system dynamics. This straightforward method is illustrated here using experimental data from a combustion chamber operated at several linearly stable and unstable operating conditions.
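
    In the generic notation used for such systems (the symbols are not necessarily the paper's), the linear growth rate decomposes as flame gain minus acoustic damping, so measuring the damping isolates the flame contribution:

        % Linear growth rate as flame gain minus acoustic damping
        % (generic thermoacoustic notation; stable when \nu < 0):
        \nu = \beta - \alpha .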

  14. Quantifying Supply Risk at a Cellulosic Biorefinery

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Jason K.; Jacobson, Jacob J.; Cafferty, Kara G.; Lamers, Patrick; Roni, Mohammad S.

    2015-07-01

    In order to increase the sustainability and security of the nation's energy supply, the U.S. Department of Energy, through its Bioenergy Technologies Office, has set a vision for one billion tons of biomass to be processed for renewable energy and bioproducts annually by the year 2030. The Renewable Fuel Standard caps the amount of corn-grain ethanol that can be sold in the U.S., and that cap has already been reached. Therefore, making the DOE's vision a reality requires significant growth in the advanced biofuels industry, where currently three cellulosic biorefineries convert cellulosic biomass to ethanol. Risk mitigation is central to growing the industry beyond its infancy to the level necessary to achieve the DOE vision. This paper focuses on reducing the supply risk faced by a firm that owns a cellulosic biorefinery. It uses risk theory and simulation modeling to build a risk-assessment model based on causal relationships among the underlying, uncertain, supply-driving variables. Using the model, the paper quantifies the supply-risk reduction achieved by converting the supply chain from a conventional supply system (bales and trucks) to an advanced supply system (depots, pellets, and trains). Results imply that the advanced supply system reduces supply-system risk, defined as the probability of a unit-cost overrun, from 83% in the conventional system to 4% in the advanced system. Reducing cost risk in this nascent industry improves the odds of realizing the desired growth.
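
    The abstract defines supply risk as the probability of a unit-cost overrun. A bare-bones Monte Carlo estimate of that probability, with made-up cost distributions standing in for the model's supply-driving variables, might read:

        import numpy as np

        def prob_cost_overrun(target, n=100_000, seed=1):
            """Estimate P(unit cost > target) by Monte Carlo.
            The lognormal feedstock and normal logistics costs are
            made-up stand-ins for the model's supply-driving variables."""
            rng = np.random.default_rng(seed)
            feedstock = rng.lognormal(mean=4.0, sigma=0.25, size=n)  # $/tonne
            logistics = rng.normal(loc=25.0, scale=5.0, size=n)      # $/tonne
            unit_cost = feedstock + logistics
            return (unit_cost > target).mean()

        print(prob_cost_overrun(target=110.0))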

  15. Quantifying and Mapping Global Data Poverty

    Science.gov (United States)

    2015-01-01

    Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. However, access to these technologies, as well as their associated software and training materials, is not evenly distributed: since the 1990s there has been concern about a "Digital Divide" between the data-rich and the data-poor. We present an innovative metric for evaluating international variations in access to digital data: the Data Poverty Index (DPI). The DPI is based on Internet speeds, numbers of computer owners and Internet users, mobile phone ownership and network coverage, as well as provision of higher education. The datasets used to produce the DPI are provided annually for almost all the countries of the world and can be freely downloaded. The index that we present in this 'proof of concept' study is the first to quantify and visualise the problem of global data poverty, using the most recent datasets, for 2013. The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. The DPI highlights countries where support is needed for improving access to the Internet and for the provision of training in geoinformatics. We conclude that the DPI is of value as a potential metric for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction. PMID:26560884

  16. Quantifying uncertainty in the chemical master equation

    Science.gov (United States)

    Bayati, Basil S.

    2017-06-01

    We describe a novel approach to quantifying the uncertainty inherent in the chemical kinetic master equation with stochastic coefficients. A stochastic collocation method is coupled to an analytical expansion of the master equation to analyze the effects of both extrinsic and intrinsic noise. The method consists of an analytical moment-closure method resulting in a large set of differential equations with stochastic coefficients that are in turn solved via a Smolyak sparse grid collocation method. We discuss the error of the method relative to the dimension of the model and clarify which methods are most suitable for the problem. We apply the method to two typical problems arising in chemical kinetics with time-independent extrinsic noise. Additionally, we show agreement with classical Monte Carlo simulations and calculate the variance over time as the sum of two expectations. The method presented here has better convergence properties for low to moderate dimensions than standard Monte Carlo methods and is therefore a superior alternative in this regime.
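
    For reference, the chemical master equation whose uncertainty is being quantified has the standard form (standard notation: the a_j are propensity functions and the ν_j stoichiometric change vectors):

        % Chemical master equation (standard form):
        \frac{\partial P(\mathbf{x}, t)}{\partial t}
          = \sum_{j=1}^{M} \left[ a_{j}(\mathbf{x} - \boldsymbol{\nu}_{j})
            P(\mathbf{x} - \boldsymbol{\nu}_{j}, t)
            - a_{j}(\mathbf{x}) P(\mathbf{x}, t) \right] .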

  17. Quantifying capital goods for waste incineration.

    Science.gov (United States)

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted.

  18. Quantifying self-organization in fusion plasmas

    Science.gov (United States)

    Rajković, M.; Milovanović, M.; Škorić, M. M.

    2017-05-01

    A multifaceted framework for understanding self-organization in fusion plasma dynamics is presented which concurrently addresses several important issues related to the nonlinear and multiscale phenomena involved, namely: (1) it chooses the optimal template wavelet for the analysis of temporal or spatio-temporal plasma dynamics, (2) it detects parameter values at which bifurcations occur, (3) it quantifies complexity and self-organization, (4) it enables short-term prediction of nonlinear dynamics, and (5) it extracts coherent structures in turbulence by separating them from the incoherent component. The first two aspects, including the detection of changes in the dynamics of a nonlinear system, are illustrated by analyzing Stimulated Raman Scattering in a bounded, weakly dissipative plasma. Self-organization in the fusion plasma is quantitatively analyzed based on numerical simulations of the Gyrokinetic-Vlasov (GKV) model of plasma dynamics. The parameters for the standard and inward-shifted magnetic configurations, relevant for the Large Helical Device, were used in order to quantitatively compare self-organization and complexity in the two configurations. Finally, self-organization is analyzed for three different confinement regimes of the MAST device.

  19. Quantifying capital goods for waste landfilling.

    Science.gov (United States)

    Brogaard, Line K; Stentsøe, Steen; Willumsen, Hans Christian; Christensen, Thomas H

    2013-06-01

    Materials and energy used for construction of a hill-type landfill of 4 million m3 were quantified in detail. The landfill is engineered with a liner and leachate collection system, as well as a gas collection and control system. Gravel and clay were the most common materials used, amounting to approximately 260 kg per tonne of waste landfilled. The environmental burdens from the extraction and manufacturing of the materials used in the landfill, as well as from the construction of the landfill, were modelled as potential environmental impacts. For example, the potential impact on global warming was 2.5 kg carbon dioxide (CO2) equivalents or 0.32 milli person equivalents per tonne of waste. The potential impacts from the use of materials and construction of the landfill are low-to-insignificant compared with data reported in the literature on impact potentials of landfills in operation. The construction of the landfill is only a significant contributor to the impact of resource depletion owing to the high use of gravel and steel.

  20. Quantifying Rapid Variability in Accreting Compact Objects

    CERN Document Server

    Van der Klis, M

    1997-01-01

    I discuss some practical aspects of the analysis of millisecond time variability X-ray data obtained from accreting neutron stars and black holes. First I give an account of the statistical methods that are at present commonly applied in this field. These are mostly based on Fourier techniques. To a large extent these methods work well: they give astronomers the answers they need. Then I discuss a number of statistical questions that astronomers don't really know how to solve properly and that statisticians may have ideas about. These questions have to do with the highest and the lowest frequency ranges accessible in the Fourier analysis: how do you determine the shortest time scale present in the variability, how do you measure steep low-frequency noise. The point is stressed that in order for any method that resolves these issues to become popular, it is necessary to retain the capabilities the current methods already have in quantifying the complex, concurrent variability processes characteristic of accret...
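
    As a minimal illustration of the Fourier techniques the review surveys, the power spectrum of an evenly sampled photon-count series can be computed as follows (a plain periodogram; no normalization convention from the paper is implied):

        import numpy as np

        def power_spectrum(counts, dt):
            """Periodogram of an evenly sampled count series.
            Returns frequencies (Hz) and |FFT|^2 power (arbitrary norm)."""
            counts = np.asarray(counts, dtype=float)
            fft = np.fft.rfft(counts - counts.mean())   # remove DC level
            freqs = np.fft.rfftfreq(counts.size, d=dt)
            return freqs[1:], np.abs(fft[1:]) ** 2      # drop zero frequency

        # 16 s of fake data at 1 ms resolution with a 40 Hz modulation
        t = np.arange(0, 16.0, 1e-3)
        rng = np.random.default_rng(2)
        counts = rng.poisson(100 + 20 * np.sin(2 * np.pi * 40 * t))
        f, p = power_spectrum(counts, dt=1e-3)
        print(f[np.argmax(p)])   # ~40 Hz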

  1. Quantifying Potential Groundwater Recharge In South Texas

    Science.gov (United States)

    Basant, S.; Zhou, Y.; Leite, P. A.; Wilcox, B. P.

    2015-12-01

    Groundwater in South Texas is heavily relied on for human consumption and for irrigation of food crops. As in much of the southwestern US, woody encroachment has altered the grassland ecosystems here too. While brush removal has been widely implemented in Texas with the objective of increasing groundwater recharge, the linkage between vegetation and groundwater recharge in South Texas is still unclear. Studies have been conducted to understand plant-root-water dynamics at the scale of individual plants. However, little work has been done to quantify the changes in soil water and deep percolation at the landscape scale. Modeling water flow through soil profiles can provide an estimate of the total water flowing into deep percolation. These models are especially powerful when parameterized and calibrated with long-term soil water data. In this study we parameterize the HYDRUS soil water model using long-term soil water data collected in Jim Wells County in South Texas. Soil water was measured at 20 cm intervals down to a depth of 200 cm. The parameterized model will be used to simulate soil water dynamics under a variety of precipitation regimes ranging from well above normal to severe drought conditions. The results from the model will be compared with the changes in soil moisture profile observed in response to vegetation cover and treatments from a study in a similar region. Comparative studies like this can be used to build new and strengthen existing hypotheses regarding deep percolation and the role of soil texture and vegetation in groundwater recharge.

  2. A stochastic approach for quantifying immigrant integration: the Spanish test case

    Science.gov (United States)

    Agliari, Elena; Barra, Adriano; Contucci, Pierluigi; Sandell, Richard; Vernia, Cecilia

    2014-10-01

    We apply stochastic process theory to the analysis of immigrant integration. Using a unique and detailed data set from Spain, we study the relationship between local immigrant density and two social and two economic immigration quantifiers for the period 1999-2010. As opposed to the classic time-series approach, by letting immigrant density play the role of ‘time’ and the quantifier the role of ‘space,’ it becomes possible to analyse the behavior of the quantifiers by means of continuous-time random walks. Two classes of results are then obtained. First, we show that social integration quantifiers evolve following a diffusion law, while the evolution of economic quantifiers exhibits ballistic dynamics. Second, we make predictions of best- and worst-case scenarios taking into account large local fluctuations. Our stochastic process approach to integration lends itself to interesting forecasting scenarios which, in the hands of policy makers, have the potential to improve political responses to integration problems. For instance, estimating the standard first-passage time and maximum-span walk reveals local differences in integration performance for different immigration scenarios. Thus, by recognizing the importance of local fluctuations around national means, this research constitutes an important tool to assess the impact of immigration phenomena on municipal budgets and to set up solid multi-ethnic plans at the municipal level as immigration pressures build.
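
    The distinction between the two classes of quantifiers corresponds to the standard random-walk scaling laws (generic forms; Q is a quantifier and the immigrant density ρ plays the role of time):

        % Mean-square spread of a quantifier Q versus immigrant density \rho
        % (generic random-walk scaling):
        \langle \Delta Q^{2} \rangle \sim \rho^{\gamma}, \qquad
        \gamma = 1 \ (\text{diffusive}), \quad \gamma = 2 \ (\text{ballistic}) .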

  3. Quantifying the dynamic of OSA brain using multifractal formalism: A novel measure for sleep fragmentation.

    Science.gov (United States)

    Raiesdana, Somayeh

    2017-01-01

    It is thought that the critical brain dynamics of sleep are modulated during frequent periods of wakefulness. This paper utilizes the capacity of EEG-based scaling analysis to quantify sleep fragmentation in patients with obstructive sleep apnea (OSA). Scale-free (fractal) behavior refers to a state in which no characteristic scale dominates the dynamics of the underlying process, which is evident as long-range correlations in a time series. Here, the multiscaling (multifractal) spectrum is utilized to quantify the disturbed dynamics of an OSA brain with fragmented sleep. The whole-night multichannel sleep EEG recordings of 18 subjects were employed to compute and quantify variable power-law long-range correlations and singularity spectra. Based on this characteristic, a new marker for sleep fragmentation named "scaling-based sleep fragmentation" was introduced. This measure takes into account the sleep run length and stage-transition quality within a fuzzy inference system to improve decisions made on sleep fragmentation. The proposed index was implemented, validated against sleepiness parameters and compared to several common indexes, including the sleep fragmentation index, arousal index, sleep diversity index, and sleep efficiency index. Correlations were almost significant, suggesting that the sleep-characterizing measure, based on the singularity spectrum range, can properly detect fragmentations and quantify their rate. This method can become an alternative for quantifying sleep fragmentation in clinical practice once it has been validated experimentally. Control of sleep fragmentation and, subsequently, suppression of excessive daytime sleepiness is a promising outlook for this line of research.

  4. Quantifying Riverscape Connectivity with Graph Theory

    Science.gov (United States)

    Carbonneau, P.; Milledge, D.; Sinha, R.; Tandon, S. K.

    2013-12-01

    Fluvial catchments convey fluxes of water, sediment, nutrients and aquatic biota. At continental scales, crustal topography defines the overall path of channels, whilst at local scales depositional and/or erosional features generally determine the exact path of a channel. Furthermore, constructions such as dams, for either water abstraction or hydropower, often have a significant impact on channel networks. The concept of 'connectivity' is commonly invoked when conceptualising the structure of a river network. This concept is easy to grasp, but efforts to actually quantify connectivity have been uneven across the environmental sciences. To date, only a few studies have reported quantitative indices of connectivity in the river sciences, notably in the study of avulsion processes. The majority of current work describing some form of environmental connectivity in a quantitative manner is in the field of landscape ecology. Driven by the need to quantify habitat fragmentation, landscape ecologists have returned to graph theory. Within this formal setting, landscape ecologists have successfully developed a range of indices which can model connectivity loss. Such formal connectivity metrics are currently needed for a range of applications in the fluvial sciences. One of the most urgent needs relates to dam construction. In the developed world, hydropower development has generally slowed, and in many countries dams are actually being removed. However, this is not the case in the developing world, where hydropower is seen as a key element of low-emissions power security. For example, several dam projects are envisaged in Himalayan catchments in the next two decades. This region is already under severe pressure from climate change and urbanisation, and a better understanding of the network fragmentation which can be expected in this system is urgently needed. In this paper, we apply and adapt connectivity metrics from landscape ecology. We then examine the
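
    As a minimal illustration of the kind of graph-theoretic index involved (not the specific metrics adapted in the paper), fragmentation by a dam can be expressed as the drop in the number of mutually reachable node pairs when an edge is removed:

        import networkx as nx

        def connectivity_index(g):
            """Number of mutually reachable node pairs in an undirected
            river network (one simple fragmentation-sensitive index)."""
            return sum(n * (n - 1) // 2
                       for n in (len(c) for c in nx.connected_components(g)))

        # Toy river network: six reaches in a Y-shaped tree
        g = nx.Graph([(1, 2), (2, 3), (3, 4), (3, 5), (5, 6)])
        before = connectivity_index(g)
        g.remove_edge(2, 3)          # a dam severs reach 2 from reach 3
        after = connectivity_index(g)
        print(before, after)         # 15 -> 7 (= 1 pair + 6 pairs)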

  5. Quantifying landscape resilience using vegetation indices

    Science.gov (United States)

    Eddy, I. M. S.; Gergel, S. E.

    2014-12-01

    Landscape resilience refers to the ability of systems to adapt to and recover from disturbance. In pastoral landscapes, degradation can be measured in terms of increased desertification and/or shrub encroachment. In many countries across Central Asia, the use and resilience of pastoral systems has changed markedly over the past 25 years, influenced by centralized Soviet governance, private property rights and, recently, communal resource governance. In Kyrgyzstan, recent governance reforms were a response to the increasing degradation of pastures attributed to livestock overgrazing. Our goal is to examine and map the landscape-level factors that influence overgrazing throughout successive governance periods. Here, we map and examine some of the spatial factors influencing landscape resilience in agro-pastoral systems in the Kyrgyz Republic, where pastures occupy >50% of the country's area. We ask three questions: (1) which mechanisms of pasture degradation (desertification vs. shrub encroachment) are detectable using remote sensing vegetation indices?; (2) are these degraded pastures associated with landscape features that influence herder mobility and accessibility (e.g., terrain, distance to other pastures)?; and (3) have these patterns changed through successive governance periods? Using a chronosequence of Landsat imagery (1999-2014), NDVI and other vegetation indices were used to identify trends in pasture condition during the growing season. Least-cost path distances as well as graph-theoretic indices were derived from topographic factors to assess landscape connectivity (from villages to pastures and among pastures). Fieldwork was used to assess the feasibility and accuracy of this approach using the most recent imagery. Previous research concluded that low herder mobility hindered pasture use; thus we expect the distance from pasture to village to be an important predictor of pasture condition. This research will quantify the magnitude of pastoral degradation and test
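
    For reference, the core vegetation index used is standard (textbook definition, independent of this study; ρNIR and ρred are near-infrared and red reflectances):

        % Normalized difference vegetation index (textbook definition):
        \mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{red}}}
                             {\rho_{\mathrm{NIR}} + \rho_{\mathrm{red}}},
        \qquad -1 \le \mathrm{NDVI} \le 1 .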

  6. Quantifying uncertainty in stable isotope mixing models

    Science.gov (United States)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing-fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated
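
    All three methods sample from the same underlying mass balance, the standard linear mixing system (generic notation; the f_i are source fractions, and the equation is written once per tracer, e.g. δ15N and δ18O for nitrate):

        % Standard linear isotope mass balance for n sources, one equation
        % per tracer (generic notation; f_i = source mixing fractions):
        \delta_{\mathrm{mix}} = \sum_{i=1}^{n} f_{i} \delta_{i}, \qquad
        \sum_{i=1}^{n} f_{i} = 1, \qquad f_{i} \ge 0 .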

  7. Quantifying human vitamin kinetics using AMS

    Energy Technology Data Exchange (ETDEWEB)

    Hillegonds, D; Dueker, S; Ognibene, T; Buchholz, B; Lin, Y; Vogel, J; Clifford, A

    2004-02-19

    Tracing vitamin kinetics at physiologic concentrations has been hampered by a lack of quantitative sensitivity for chemically equivalent tracers that could be used safely in healthy people. Instead, elderly or ill volunteers were sought for studies involving pharmacologic doses with radioisotopic labels. These studies fail to be relevant in two ways: vitamins are inherently micronutrients, whose biochemical paths are saturated and distorted by pharmacological doses; and while vitamins remain important for health in the elderly or ill, their greatest effects may be in preventing slow and cumulative diseases by proper consumption throughout youth and adulthood. Neither the target dose nor the target population are available for nutrient metabolic studies through decay counting of radioisotopes at high levels. Stable isotopic labels are quantified by isotope ratio mass spectrometry at levels that trace physiologic vitamin doses, but the natural background of stable isotopes severely limits the time span over which the tracer is distinguishable. Indeed, study periods seldom ranged over a single biological mean life of the labeled nutrients, failing to provide data on the important final elimination phase of the compound. Kinetic data for the absorption phase is similarly rare in micronutrient research because the phase is rapid, requiring many consecutive plasma samples for accurate representation. However, repeated blood samples of sufficient volume for precise stable or radio-isotope quantitations consume an indefensible amount of the volunteer's blood over a short period. Thus, vitamin pharmacokinetics in humans has often relied on compartmental modeling based upon assumptions and tested only for the short period of maximal blood circulation, a period that poorly reflects absorption or final elimination kinetics except for the most simple models.

  8. Quantifying Training Loads in Contemporary Dance.

    Science.gov (United States)

    Jeffries, Annie C; Wallace, Lee; Coutts, Aaron J

    2017-07-01

    To describe the training demands of contemporary dance and determine the validity of using the session rating of perceived exertion (sRPE) to monitor exercise intensity and training load in this activity. In addition, the authors examined the contribution of training (ie, accelerometry and heart rate) and non-training-related factors (ie, sleep and wellness) to perceived exertion during dance training. Training load and actigraphy data for 16 elite amateur contemporary dancers were collected during a 49-d period, using heart-rate monitors, accelerometry, and sRPE. Within-individual correlation analysis was used to determine relationships between sRPE and several other measures of training intensity and load. Stepwise multiple regressions were used to determine a predictive equation to estimate sRPE during dance training. Average weekly training load was 4283 ± 2442 arbitrary units (AU), monotony 2.13 ± 0.92 AU, strain 10,677 ± 9,438 AU, and average weekly vector magnitude load 1,809,707 ± 1,015,402 AU. There were large to very large within-individual correlations between training-load sRPE and various other internal and external measures of intensity and load. The stepwise multiple-regression analysis also revealed that 49.7% of the adjusted variance in training-load sRPE was explained by peak heart rate, metabolic equivalents, soreness, motivation, and sleep quality (y = -4.637 + 13.817 %HRpeak + 0.316 METS + 0.100 soreness + 0.116 motivation - 0.204 sleep quality). The current findings demonstrate the validity of the sRPE method for quantifying training load in dance, that dancers undertake very high training loads, and that a combination of training and non-training factors contributes to perceived exertion in dance training.
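
    The sRPE bookkeeping itself is simple enough to state in code. The load, monotony and strain definitions below follow the widely used Foster formulation, with which the arbitrary units above are consistent; the numbers are invented:

        import statistics

        def session_load(rpe, duration_min):
            """Session RPE load (AU) = CR-10 rating x session minutes."""
            return rpe * duration_min

        def weekly_summary(daily_loads):
            """Weekly load, monotony and strain in the Foster formulation."""
            load = sum(daily_loads)
            monotony = statistics.mean(daily_loads) / statistics.stdev(daily_loads)
            return {"load": load, "monotony": monotony, "strain": load * monotony}

        week = [session_load(r, m) for r, m in
                [(6, 120), (7, 150), (5, 90), (8, 180), (0, 0), (6, 120), (7, 150)]]
        print(weekly_summary(week))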

  9. Quantifying Sentiment and Influence in Blogspaces

    Energy Technology Data Exchange (ETDEWEB)

    Hui, Peter SY; Gregory, Michelle L.

    2010-07-25

    The weblog, or blog, has become a popular form of social media, through which authors can write posts, which can in turn generate feedback in the form of user comments. When considered in totality, a collection of blogs can thus be viewed as an informal collection of mass sentiment and opinion. An obvious topic of interest is to mine this collection to obtain some gauge of public sentiment over the wide variety of topics contained therein. However, the sheer size of the so-called blogosphere, combined with the fact that the subjects of posts can vary over a practically limitless number of topics, poses serious challenges when any meaningful analysis is attempted. Namely, the fact that largely anyone with access to the Internet can author their own blog raises the serious issue of credibility: should some blogs be considered more influential than others, and consequently, when gauging sentiment with respect to a topic, should some blogs be weighted more heavily than others? In addition, as new posts and comments can be made on an almost constant basis, any blog analysis algorithm must be able to handle such updates efficiently. In this paper, we give a formalization of the blog model. We give formal methods of quantifying sentiment and influence with respect to a hierarchy of topics, with the specific aim of facilitating the computation of a per-topic, influence-weighted sentiment measure. Finally, as efficiency is a specific end goal, we give upper bounds on the time required to update these values with new posts, showing that our analysis and algorithms are scalable.
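
    The paper's formal definitions are not reproduced in the abstract. The core idea of a per-topic, influence-weighted sentiment can be sketched as follows (the simple weighted average is an illustrative assumption, not the paper's model):

        def weighted_sentiment(posts, topic):
            """Influence-weighted mean sentiment for one topic.
            `posts`: iterable of (topic, sentiment in [-1, 1], influence >= 0).
            The weighted average is illustrative, not the paper's model."""
            num = den = 0.0
            for t, sentiment, influence in posts:
                if t == topic:
                    num += influence * sentiment
                    den += influence
            return num / den if den else 0.0

        posts = [("economy", -0.4, 10.0), ("economy", 0.2, 1.0), ("sports", 0.9, 5.0)]
        print(weighted_sentiment(posts, "economy"))   # influence pulls result negative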

  10. Quantifying antimicrobial resistance at veal calf farms.

    Directory of Open Access Journals (Sweden)

    Angela B Bosman

    Full Text Available This study was performed to determine a sampling strategy for quantifying the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and from one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of testing an isolate resistant between the two test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of E. coli isolates testing resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p ≤ 0.05). Pooled samples showed in general lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that within each antimicrobial the various compositions of a pooled sample provided consistent estimates for the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy from the perspective of precision of the estimated levels of resistance and practicality consists of a pooled faecal sample from 20 individual animals, of which

  11. Quantifying the Clinical Significance of Cannabis Withdrawal

    Science.gov (United States)

    Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.

    2012-01-01

    Background and Aims: Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis-induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results: A volunteer sample of 49 non-treatment-seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis, with a one-month follow-up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high-dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one-month follow-up (p = 0.001). Conclusions: Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small, and the use of a non-treatment-seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760

  12. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    Science.gov (United States)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

    Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1-4% (3-12 K) over desert and 1-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  13. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    Science.gov (United States)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

    Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.
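
    The decomposition used above can be illustrated with a short sketch. This is not the authors' processing chain, just the standard way to split the disagreement between two co-located emissivity time series into a systematic part (mean bias) and a random part (residual scatter):

        import numpy as np

        def emissivity_differences(retrieval_a, retrieval_b):
            """Split the disagreement between two emissivity time series
            (same site, same channel) into systematic and random errors."""
            diff = np.asarray(retrieval_a, float) - np.asarray(retrieval_b, float)
            systematic = diff.mean()                 # persistent inter-dataset bias
            random_err = (diff - systematic).std()   # scatter around that bias
            return systematic, random_err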

  14. Quantifying the Frictional Forces between Skin and Nonwoven Fabrics

    Science.gov (United States)

    Jayawardana, Kavinda; Ovenden, Nicholas C.; Cottenden, Alan

    2017-01-01

    When a compliant sheet of material is dragged over a curved surface of a body, the frictional forces generated can be many times greater than they would be for a planar interface. This phenomenon is known to contribute to the abrasion damage to skin often suffered by wearers of incontinence pads and bed/chairbound people susceptible to pressure sores. Experiments that attempt to quantify these forces often use a simple capstan-type equation to obtain a characteristic coefficient of friction. In general, the capstan approach assumes the ratio of applied tensions depends only on the arc of contact and the coefficient of friction, and ignores other geometric and physical considerations; this approach makes it straightforward to obtain explicitly a coefficient of friction from the tensions measured. In this paper, two mathematical models are presented that compute the material displacements and surface forces generated by, firstly, a membrane under tension in moving contact with a rigid obstacle and, secondly, a shell-membrane under tension in contact with a deformable substrate. The results show that, while the use of a capstan equation remains fairly robust in some cases, effects such as the curvature and flaccidness of the underlying body, and the mass density of the fabric can lead to significant variations in stresses generated in the contact region. Thus, the coefficient of friction determined by a capstan model may not be an accurate reflection of the true frictional behavior of the contact region. PMID:28321192
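
    For reference, the capstan-type relation the authors discuss takes the classical form (with T_2 the applied tension, T_1 the holding tension, theta the arc of contact in radians, and mu the coefficient of friction):

        T_2 = T_1 \, e^{\mu \theta}
        \quad \Longrightarrow \quad
        \mu = \frac{1}{\theta} \ln\!\left(\frac{T_2}{T_1}\right)

    The paper's point is that solving for mu this way silently assumes the arc of contact and friction are the only relevant quantities; curvature and flaccidness of the underlying body and the mass density of the fabric all perturb the measured tension ratio.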

  15. Quantifying and predicting Drosophila larvae crawling phenotypes

    Science.gov (United States)

    Günther, Maximilian N.; Nettesheim, Guilherme; Shubeita, George T.

    2016-06-01

    The fruit fly Drosophila melanogaster is a widely used model for cell biology, development, disease, and neuroscience. The fly's power as a genetic model for disease and neuroscience can be augmented by a quantitative description of its behavior. Here we show that we can accurately account for the complex and unique crawling patterns exhibited by individual Drosophila larvae using a small set of four parameters obtained from the trajectories of a few crawling larvae. The values of these parameters change for larvae from different genetic mutants, as we demonstrate for fly models of Alzheimer's disease and Fragile X syndrome, allowing applications such as genetic or drug screens. Using the quantitative model of larval crawling developed here, we use the mutant-specific parameters to robustly simulate larval crawling, which allows estimating the feasibility of laborious experimental assays and aids in their design.

  16. Quantifying emission reduction contributions by emerging economies

    Energy Technology Data Exchange (ETDEWEB)

    Moltmann, Sara; Hagemann, Markus; Eisbrenner, Katja; Hoehne, Niklas [Ecofys GmbH, Koeln (Germany); Sterk, Wolfgang; Mersmann, Florian; Ott, Hermann E.; Watanabe, Rie [Wuppertal Institut (Germany)

    2011-04-15

    Further action is needed that goes far beyond what has been agreed so far under the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol to 'prevent dangerous anthropogenic interference with the climate system', the ultimate objective of the UNFCCC. It is beyond question that developed countries (Annex I countries) will have to take a leading role. They will have to commit to substantial emission reductions and financing commitments due to their historical responsibility and their financial capability. However, the stabilisation of the climate system will require global emissions to peak within the next decade and decline to well below half of current levels by the middle of the century. It is hence a global issue and, thus, depends on the participation of as many countries as possible. This report provides a comparative analysis of the greenhouse gas (GHG) emissions, including the national climate plans, of the major emitting developing countries Brazil, China, India, Mexico, South Africa and South Korea. It includes an overview of emissions and economic development and of existing national climate change strategies, uses a consistent methodology for estimating emission reduction potential and the costs of mitigation options, provides an estimate of the reductions to be achieved through the national climate plans, and finally compares the results to the allocation of emission rights according to different global effort-sharing approaches. In addition, the report discusses possible nationally appropriate mitigation actions (NAMAs) the six countries could take, based on the analysis of mitigation options. This report is an output of the project 'Proposals for quantifying emission reduction contributions by emerging economies' by Ecofys and the Wuppertal Institute for the Federal Environment Agency in Dessau. It builds upon earlier joint work "Proposals for contributions of emerging economies to the climate

  17. Quantifying missing heritability at known GWAS loci.

    Directory of Open Access Journals (Sweden)

    Alexander Gusev

    Full Text Available Recent work has shown that much of the missing heritability of complex traits can be resolved by estimates of heritability explained by all genotyped SNPs. However, it is currently unknown how much heritability is missing due to poor tagging or additional causal variants at known GWAS loci. Here, we use variance components to quantify the heritability explained by all SNPs at known GWAS loci in nine diseases from WTCCC1 and WTCCC2. After accounting for expectation, we observed all SNPs at known GWAS loci to explain 1.29 x more heritability than GWAS-associated SNPs on average (P = 3.3 x 10⁻⁵). For some diseases, this increase was individually significant: 2.07 x for Multiple Sclerosis (MS) (P = 6.5 x 10⁻⁹) and 1.48 x for Crohn's Disease (CD) (P = 1.3 x 10⁻³); all analyses of autoimmune diseases excluded the well-studied MHC region. Additionally, we found that GWAS loci from other related traits also explained significant heritability. The union of all autoimmune disease loci explained 7.15 x more MS heritability than known MS SNPs. In 20,000 Rheumatoid Arthritis (RA) samples typed on ImmunoChip, we observed 2.37 x more heritability from all SNPs at GWAS loci (P = 2.3 x 10⁻⁶) and 5.33 x more heritability from all autoimmune disease loci (P < 1 x 10⁻¹⁶) compared to known RA SNPs (including those identified in this cohort). Our methods adjust for LD between SNPs, which can bias standard estimates of heritability from SNPs even if all causal variants are typed. By comparing adjusted estimates, we hypothesize that the genome-wide distribution of causal variants is enriched for low-frequency alleles, but that causal variants at known GWAS loci are skewed towards common alleles. These findings have important ramifications for fine-mapping study design and our understanding of complex disease architecture.

  18. Quantifying geocode location error using GIS methods

    Directory of Open Access Journals (Sweden)

    Gardner Bennett R

    2007-04-01

    Full Text Available Abstract Background The Metropolitan Atlanta Congenital Defects Program (MACDP) collects maternal address information at the time of delivery for infants and fetuses with birth defects. These addresses have been geocoded by two independent agencies: (1) the Georgia Division of Public Health Office of Health Information and Policy (OHIP) and (2) a commercial vendor. Geographic information system (GIS) methods were used to quantify uncertainty in the two sets of geocodes using orthoimagery and tax parcel datasets. Methods We sampled 599 infants and fetuses with birth defects delivered during 1994–2002 with maternal residence in either Fulton or Gwinnett County. Tax parcel datasets were obtained from the tax assessor's offices of Fulton and Gwinnett County. High-resolution orthoimagery for these counties was acquired from the U.S. Geological Survey. For each of the 599 addresses we attempted to locate the tax parcel corresponding to the maternal address. If the tax parcel was identified, the distance and the angle between the geocode and the residence were calculated. We used simulated data to characterize the impact of geocode location error. In each county 5,000 geocodes were generated and assigned their corresponding Census 2000 tract. Each geocode was then displaced at a random angle by a random distance drawn from the distribution of observed geocode location errors. The census tract of the displaced geocode was determined. We repeated this process 5,000 times and report the percentage of geocodes that resolved into incorrect census tracts. Results Median location error was less than 100 meters for both OHIP and commercial vendor geocodes; the distribution of angles appeared uniform. Median location error was approximately 35% larger in Gwinnett (a suburban county) relative to Fulton (a county with urban and suburban areas). Location error occasionally caused the simulated geocodes to be displaced into incorrect census tracts; the median percentage
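
    The displacement simulation lends itself to a compact sketch. This is a minimal re-implementation of the procedure described above, not the authors' code; `tract_of` stands in for a hypothetical point-in-polygon lookup returning the Census 2000 tract of an (x, y) coordinate:

        import numpy as np

        def tract_mismatch_rate(geocodes_xy, observed_errors_m, tract_of, n=5000, seed=0):
            rng = np.random.default_rng(seed)
            mismatches = 0
            for _ in range(n):
                x, y = geocodes_xy[rng.integers(len(geocodes_xy))]
                angle = rng.uniform(0.0, 2.0 * np.pi)    # random bearing
                dist = rng.choice(observed_errors_m)     # draw from empirical errors
                x2, y2 = x + dist * np.cos(angle), y + dist * np.sin(angle)
                mismatches += tract_of(x2, y2) != tract_of(x, y)
            return mismatches / n                        # fraction misassigned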

  19. A Methodological Approach to Quantifying Plyometric Intensity.

    Science.gov (United States)

    Jarvis, Mark M; Graham-Smith, Phil; Comfort, Paul

    2016-09-01

    Jarvis, MM, Graham-Smith, P, and Comfort, P. A methodological approach to quantifying plyometric intensity. J Strength Cond Res 30(9): 2522-2532, 2016-In contrast to other methods of training, the quantification of plyometric exercise intensity is poorly defined. The purpose of this study was to evaluate the suitability of a range of neuromuscular and mechanical variables to describe the intensity of plyometric exercises. Seven male recreationally active subjects performed a series of 7 plyometric exercises. Neuromuscular activity was measured using surface electromyography (SEMG) at vastus lateralis (VL) and biceps femoris (BF). Surface electromyography data were divided into concentric (CON) and eccentric (ECC) phases of movement. Mechanical output was measured by ground reaction forces and processed to provide peak impact ground reaction force (PF), peak eccentric power (PEP), and impulse (IMP). Statistical analysis was conducted to assess the reliability (intraclass correlation coefficient) and sensitivity (smallest detectable difference) of all variables. Mean values of SEMG demonstrate high reliability (r ≥ 0.82), excluding ECC VL during a 40-cm drop jump (r = 0.74). PF, PEP, and IMP demonstrated high reliability (r ≥ 0.85). Statistical power for force variables was excellent (power = 1.0), and good for SEMG (power ≥ 0.86) excluding CON BF (power = 0.57). There was no significant difference (p > 0.05) in CON SEMG between exercises. Eccentric phase SEMG only distinguished between exercises involving a landing and those that did not (percentage of maximal voluntary isometric contraction [%MVIC]: no landing = 65 ± 5, landing = 140 ± 8). Peak eccentric power, PF, and IMP all distinguished between exercises. In conclusion, CON neuromuscular activity does not appear to vary when intent is maximal, whereas ECC activity is dependent on the presence of a landing. Force characteristics provide a reliable and sensitive measure enabling precise description of intensity

  20. POST BEHAVIORAL FINANCE ADOLESCENCE

    Directory of Open Access Journals (Sweden)

    ADRIAN MITROI

    2016-12-01

    Full Text Available The study of behavioral finance combines insights from research and practice into the construction of smart portfolios for individual investors. Understanding cognitive errors and misleading emotions helps drive investors toward their long-term goals of financial prosperity and capital preservation. Ten years ago, behavioral finance was still considered an incipient, adolescent science. The first Nobel Prize in Economics awarded for the study of behavioral economics, in 2002, established the field as a new, respected branch of economics. The 2013 Nobel Prize was awarded to three economists, one of them considered among the founders of behavioral finance. As such, we are now entering the coming of age of behavioral finance. It is now recognized as a science of understanding investor behavior and its biased patterns. It applies quantitative finance and provides practical models grounded in a robust understanding of investors' behavior toward financial risk. Financial personality influences investment decisions. Behavioral portfolio construction methods combine classic finance with rigorously quantified psychological metrics and improve models for financial advice, enhancing investors' chances of reaching their lifetime financial goals. Behavioral finance helps in understanding how psychological profiles differ across individuals and how these differences manifest in the investment decision process. This new science has become a must-know topic in modern finance.

  1. Quantifier spreading in child eye movements: A case of the Russian quantifier kazhdyj ‘every'

    Directory of Open Access Journals (Sweden)

    Irina A. Sekerina

    2017-07-01

    Full Text Available Extensive cross-linguistic work has documented that children up to the age of 9–10 make errors when performing a sentence-picture verification task that pairs spoken sentences with the universal quantifier 'every' and pictures with entities in partial one-to-one correspondence. These errors stem from children's difficulties in restricting the domain of a universal quantifier to the appropriate noun phrase and are referred to in the literature as 'quantifier-spreading' ('q'-spreading). We adapted the task to be performed in conjunction with eye-movement recordings using the Visual World Paradigm. Russian-speaking 5-to-6-year-old children (N = 31) listened to sentences like 'Kazhdyj alligator lezhit v vanne' ('Every alligator is lying in a bathtub') and viewed pictures with three alligators, each in a bathtub, and two extra empty bathtubs. Non-spreader children (N = 12) were adult-like in their accuracy, whereas q-spreading ones (N = 19) were only 43% correct in interpreting such sentences compared to the control sentences. Eye movements of q-spreading children revealed that more looks to the extra containers (two empty bathtubs) correlated with higher error rates, reflecting the processing pattern of q-spreading. In contrast, more looks to the distractors in control sentences did not lead to errors in interpretation. We argue that q-spreading errors are caused by interference from the extra entities in the visual context, and our results support the processing-difficulty account of the acquisition of quantification. Interference results in cognitive overload as children have to integrate multiple sources of information, i.e., the visual context with salient extra entities and the spoken sentence in which these entities are mentioned, in real-time processing. This article is part of the special collection: Acquisition of Quantification

  2. Psychomotor Behavior: A Practical Approach in Drosophila

    OpenAIRE

    Iliadi, Konstantin G.; Gluscencova, Oxana B.; Boulianne, Gabrielle L

    2016-01-01

    Psychomotor behaviors are governed by fine relationships between physical activity and cognitive functions. Disturbances in psychomotor development and performance are a hallmark of many mental illnesses and often appear as observable and measurable behaviors. Here, we describe a new method called an “equilibrist test,” which can be used to quantify psychomotor learning and performance in Drosophila. We also show how this test can be used to quantify motor disturbances at relatively early sta...

  3. On Quantified Propositional Logics and the Exponential Time Hierarchy

    Directory of Open Access Journals (Sweden)

    Miika Hannula

    2016-09-01

    Full Text Available We study quantified propositional logics from the complexity-theoretic point of view. First we introduce alternating dependency quantified boolean formulae (ADQBF), which generalize both quantified and dependency quantified boolean formulae. We show that the truth evaluation for ADQBF is AEXPTIME(poly)-complete. We also identify fragments for which the problem is complete for the levels of the exponential hierarchy. Second, we study propositional team-based logics. We show that DQBF formulae correspond naturally to quantified propositional dependence logic and present a general NEXPTIME upper bound for quantified propositional logic with a large class of generalized dependence atoms. Moreover, we show AEXPTIME(poly)-completeness for extensions of propositional team logic with generalized dependence atoms.

  4. Quantifying data worth toward reducing predictive uncertainty.

    Science.gov (United States)

    Dausman, Alyssa M; Doherty, John; Langevin, Christian D; Sukop, Michael C

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement.
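
    The premise that an observation's worth equals the predictive uncertainty it removes can be written down directly in linear-Bayesian form. The sketch below is the textbook first-order calculation, not the authors' implementation: C is the prior parameter covariance, y the prediction's parameter sensitivities, J the candidate observations' sensitivities, and R their noise covariance:

        import numpy as np

        def data_worth(C, y, J, R):
            """Reduction in predictive variance y'Cy gained by collecting
            observations with sensitivity matrix J and noise covariance R."""
            S = J @ C @ J.T + R                            # observation covariance
            C_post = C - C @ J.T @ np.linalg.solve(S, J @ C)
            return y @ C @ y - y @ C_post @ y              # variance removed

    Ranking candidate measurement types or locations then amounts to evaluating data_worth once per candidate J and sorting the results.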

  5. Quantifying data worth toward reducing predictive uncertainty

    Science.gov (United States)

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.

  6. Mechanical behaviors of nanowires

    Science.gov (United States)

    Chen, Yujie; An, Xianghai; Liao, Xiaozhou

    2017-09-01

    The mechanical behaviors of nanowires (NWs) are significantly different from those of their bulk materials because of their small dimensions. Determining the mechanical performance of NWs and understanding their deformation behavior are crucial for designing and manufacturing NW-based devices with predictable and reproducible operation. Owing to the difficulty of manipulating these nanoscale materials, nanomechanical testing of NWs is challenging, and errors can readily be introduced into the measured mechanical data. Here, we survey the techniques that have been developed to quantify the mechanical properties and to understand the deformation mechanisms of NWs. We also provide a general review of the mechanical properties and deformation behaviors of NWs and discuss possible sources responsible for the discrepancies in measured mechanical properties. The effects of planar defects on the mechanical behavior of NWs are also reviewed.

  7. A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation

    Science.gov (United States)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah

    2016-06-01

    Bayesian Network (BN) has been regarded as a successful representation of the inter-relationships among factors affecting human behavior during an emergency. This paper extends earlier work on quantifying the variables involved in a BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the expert's burden in providing precise probability values, a new approach to elicitation is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. Three major phases of methodology are involved, namely (1) development of a qualitative model representing human factors during an evacuation, (2) quantification of the BN model using fuzzy probabilities, and (3) inferring and interpreting the BN results. A case study of three inter-dependent human evacuation factors (danger assessment ability, information about the threat, and stressful conditions) is used to illustrate the application of the proposed method. This approach serves as an alternative to the conventional probability elicitation technique for understanding human behavior during an evacuation.
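
    A minimal sketch of the idea, with made-up numbers: an expert gives each conditional probability as a triangular fuzzy number (lower bound, mode, upper bound), which is defuzzified by its centroid before ordinary BN inference. This illustrates the general approach, not the authors' model:

        def centroid(lo, mode, hi):
            # centroid defuzzification of a triangular fuzzy number
            return (lo + mode + hi) / 3.0

        p_stress = centroid(0.3, 0.5, 0.6)              # fuzzy "about half"
        p_flee_given_stress = centroid(0.6, 0.8, 0.9)   # fuzzy "very likely"
        p_flee_given_calm = centroid(0.1, 0.2, 0.4)     # fuzzy "unlikely"

        # two-node network: marginalise the fleeing behaviour over stress
        p_flee = (p_stress * p_flee_given_stress
                  + (1.0 - p_stress) * p_flee_given_calm)
        print(f"P(flee) = {p_flee:.3f}")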

  8. Quantifying uncertainty in observational rainfall datasets

    Science.gov (United States)

    Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen

    2015-04-01

    rainfall datasets available over Africa on monthly, daily and sub-daily time scales as appropriate to quantify spatial and temporal differences between the datasets. We find regional wet and dry biases between datasets (using the ensemble mean as a reference), with generally larger biases in reanalysis products. Rainfall intensity is poorly represented in some datasets, which demonstrates that those datasets should not be used for rainfall intensity analyses. Using 10 CORDEX models we show, in east Africa, that the spread between observed datasets is often similar to the spread between models. We recommend that specific observational rainfall datasets be used for specific investigations, and also that where many datasets are applicable to an investigation, a probabilistic view be adopted for rainfall studies over Africa. Endris, H. S., P. Omondi, S. Jain, C. Lennard, B. Hewitson, L. Chang'a, J. L. Awange, A. Dosio, P. Ketiem, G. Nikulin, H-J. Panitz, M. Büchner, F. Stordal, and L. Tazalika (2013) Assessment of the Performance of CORDEX Regional Climate Models in Simulating East African Rainfall. J. Climate, 26, 8453-8475. DOI: 10.1175/JCLI-D-12-00708.1 Gbobaniyi, E., A. Sarr, M. B. Sylla, I. Diallo, C. Lennard, A. Dosio, A. Diédhiou, A. Kamga, N. A. B. Klutse, B. Hewitson, and B. Lamptey (2013) Climatology, annual cycle and interannual variability of precipitation and temperature in CORDEX simulations over West Africa. Int. J. Climatol., DOI: 10.1002/joc.3834 Hernández-Díaz, L., R. Laprise, L. Sushama, A. Martynov, K. Winger, and B. Dugas (2013) Climate simulation over CORDEX Africa domain using the fifth-generation Canadian Regional Climate Model (CRCM5). Clim. Dyn. 40, 1415-1433. DOI: 10.1007/s00382-012-1387-z Kalognomou, E., C. Lennard, M. Shongwe, I. Pinto, A. Favre, M. Kent, B. Hewitson, A. Dosio, G. Nikulin, H. Panitz, and M. Büchner (2013) A diagnostic evaluation of precipitation in CORDEX models over southern Africa. Journal of Climate, 26, 9477-9506. DOI:10

  9. Quantifying Oldowan Stone Tool Production at Olduvai Gorge, Tanzania.

    Directory of Open Access Journals (Sweden)

    Jay S Reti

    Full Text Available Recent research suggests that variation exists among and between Oldowan stone tool assemblages. Oldowan variation might represent differential constraints on raw materials used to produce these stone implements. Alternatively, variation among Oldowan assemblages could represent different methods that Oldowan producing hominins utilized to produce these lithic implements. Identifying differential patterns of stone tool production within the Oldowan has implications for assessing how stone tool technology evolved, how traditions of lithic production might have been culturally transmitted, and for defining the timing and scope of these evolutionary events. At present there is no null model to predict what morphological variation in the Oldowan should look like. Without such a model, quantifying whether Oldowan assemblages vary due to raw material constraints or whether they vary due to differences in production technique is not possible. This research establishes a null model for Oldowan lithic artifact morphological variation. To establish these expectations this research (1) models the expected range of variation through large scale reduction experiments, (2) develops an algorithm to categorize archaeological flakes based on how they are produced, and (3) statistically assesses the methods of production behavior used by Oldowan producing hominins at the site of DK from Olduvai Gorge, Tanzania via the experimental model. Results indicate that a subset of quartzite flakes deviate from the null expectations in a manner that demonstrates efficiency in flake manufacture, while some basalt flakes deviate from null expectations in a manner that demonstrates inefficiency in flake manufacture. The simultaneous presence of efficiency in stone tool production for one raw material (quartzite) and inefficiency in stone tool production for another raw material (basalt) suggests that Oldowan producing hominins at DK were able to mediate the economic costs associated

  10. Quantifying Oldowan Stone Tool Production at Olduvai Gorge, Tanzania.

    Science.gov (United States)

    Reti, Jay S

    2016-01-01

    Recent research suggests that variation exists among and between Oldowan stone tool assemblages. Oldowan variation might represent differential constraints on raw materials used to produce these stone implements. Alternatively, variation among Oldowan assemblages could represent different methods that Oldowan producing hominins utilized to produce these lithic implements. Identifying differential patterns of stone tool production within the Oldowan has implications for assessing how stone tool technology evolved, how traditions of lithic production might have been culturally transmitted, and for defining the timing and scope of these evolutionary events. At present there is no null model to predict what morphological variation in the Oldowan should look like. Without such a model, quantifying whether Oldowan assemblages vary due to raw material constraints or whether they vary due to differences in production technique is not possible. This research establishes a null model for Oldowan lithic artifact morphological variation. To establish these expectations this research 1) models the expected range of variation through large scale reduction experiments, 2) develops an algorithm to categorize archaeological flakes based on how they are produced, and 3) statistically assesses the methods of production behavior used by Oldowan producing hominins at the site of DK from Olduvai Gorge, Tanzania via the experimental model. Results indicate that a subset of quartzite flakes deviate from the null expectations in a manner that demonstrates efficiency in flake manufacture, while some basalt flakes deviate from null expectations in a manner that demonstrates inefficiency in flake manufacture. The simultaneous presence of efficiency in stone tool production for one raw material (quartzite) and inefficiency in stone tool production for another raw material (basalt) suggests that Oldowan producing hominins at DK were able to mediate the economic costs associated with stone tool

  11. Quantifying object and material surface areas in residences

    Energy Technology Data Exchange (ETDEWEB)

    Hodgson, Alfred T.; Ming, Katherine Y.; Singer, Brett C.

    2005-01-05

    The dynamic behavior of volatile organic compounds (VOCs) in indoor environments depends, in part, on sorptive interactions between VOCs in the gas phase and material surfaces. Since information on the types and quantities of interior material surfaces is not generally available, this pilot-scale study was conducted in occupied residences to develop and demonstrate a method for quantifying surface areas of objects and materials in rooms. Access to 33 rooms in nine residences consisting of bathrooms, bedroom/offices and common areas was solicited from among research group members living in the East San Francisco Bay Area. A systematic approach was implemented for measuring rooms and objects of 300 cm² and larger. The ventilated air volumes of the rooms were estimated and surface area-to-volume ratios were calculated for objects and materials, each segregated into 20 or more categories. Total surface area-to-volume ratios also were determined for each room. The bathrooms had the highest total surface area-to-volume ratios. Bedrooms generally had higher ratios than common areas consisting of kitchens, living/dining rooms and transitional rooms. Total surface area-to-volume ratios for the 12 bedrooms ranged between 2.3 and 4.7 m² m⁻³. The importance of individual objects and materials with respect to sorption will depend upon the sorption coefficients for the various VOC/material combinations. When combined, the highly permeable material categories, which may contribute to significant interactions, had a median ratio of about 0.5 m² m⁻³ for all three types of rooms.

  12. Visual Attention and Quantifier-Spreading in Heritage Russian Bilinguals

    Science.gov (United States)

    Sekerina, Irina A.; Sauermann, Antje

    2015-01-01

    It is well established in language acquisition research that monolingual children and adult second language learners misinterpret sentences with the universal quantifier "every" and make quantifier-spreading errors that are attributed to a preference for a match in number between two sets of objects. The present Visual World eye-tracking…

  14. Children's Knowledge of the Quantifier "Dou" in Mandarin Chinese

    Science.gov (United States)

    Zhou, Peng; Crain, Stephen

    2011-01-01

    The quantifier "dou" (roughly corresponding to English "all") in Mandarin Chinese has been the topic of much discussion in the theoretical literature. This study investigated children's knowledge of this quantifier using a new methodological technique, which we dubbed the Question-Statement Task. Three questions were addressed: (i) whether young…

  15. On the contrast between Germanic and Romance negated quantifiers

    NARCIS (Netherlands)

    Cirillo, R.

    2009-01-01

    Universal quantifiers can be stranded in the manner described by Sportiche (1988), Giusti (1990) and Shlonsky (1991) in both the Romance and Germanic languages, but a negated universal quantifier can only be stranded in the Germanic languages. The goal of this paper is to show that this contrast

  16. Anatomy of Alternating Quantifier Satisfiability (Work in progress)

    DEFF Research Database (Denmark)

    Dung, Phan Anh; Bjørner, Nikolaj; Monniaux, David

    We report on work in progress to generalize an algorithm recently introduced in [10] for checking satisfiability of formulas with quantifier alternation. The algorithm uses two auxiliary procedures: a procedure for producing a candidate formula for quantifier elimination and a procedure for elimi...

  17. Quantifying terpenes in rumen fluid, serum, and plasma from sheep

    Science.gov (United States)

    Determining the fate of terpenes consumed by browsing ruminants requires methods to quantify their presence in blood and rumen fluid. Our objective was to modify an existing procedure for plasma terpenes to quantify 25 structurally diverse mono- and sesquiterpenes in serum, plasma, and rumen fluid fr...

  18. Quantifying discipline practices using absolute versus relative frequencies: clinical and research implications for child welfare.

    Science.gov (United States)

    Lindhiem, Oliver; Shaffer, Anne; Kolko, David J

    2014-01-01

    In the parent intervention outcome literature, discipline practices are generally quantified as absolute frequencies or, less commonly, as relative frequencies. These differences in methodology warrant direct comparison, as they have critical implications for study results and conclusions among treatments targeted at reducing parental aggression and harsh discipline. In this study, we directly compared the absolute frequency method and the relative frequency method for quantifying physically aggressive, psychologically aggressive, and nonaggressive discipline practices. Longitudinal data over a 3-year period came from an existing data set of a clinical trial examining the effectiveness of a psychosocial treatment in reducing parental physical and psychological aggression and improving child behavior (N = 139). Discipline practices (aggressive and nonaggressive) were assessed using the Conflict Tactics Scale. The two methods yielded different patterns of results, particularly for nonaggressive discipline strategies. We suggest that each method makes its own unique contribution to a more complete understanding of the association between parental aggression and intervention effects.
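
    The methodological contrast is easy to see with hypothetical counts. In the example below, a parent uses aggressive discipline less often in absolute terms after treatment, yet aggressive acts make up a larger share of all discipline instances, so the two methods point in opposite directions:

        aggressive_pre, total_pre = 20, 50      # hypothetical pre-treatment counts
        aggressive_post, total_post = 15, 25    # hypothetical post-treatment counts

        absolute_change = aggressive_post - aggressive_pre   # -5: looks improved
        relative_pre = aggressive_pre / total_pre            # 0.40
        relative_post = aggressive_post / total_post         # 0.60: looks worse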

  19. Probability distributions of whisker-surface contact: quantifying elements of the rat vibrissotactile natural scene.

    Science.gov (United States)

    Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z

    2015-08-01

    Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.
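
    Estimating concomitant-contact probabilities from simulation output reduces to a conditional-probability computation. A minimal sketch, assuming a boolean matrix with one row per simulated whisk and one column per whisker (True where that whisker contacted the surface):

        import numpy as np

        def concomitant_contact(contacts):
            """Return matrix P where P[i, j] = P(whisker j contacts |
            whisker i contacts), estimated across whisks."""
            c = np.asarray(contacts, float)
            joint = c.T @ c / len(c)      # P(i and j both contact)
            marginal = c.mean(axis=0)     # P(i contacts); assumed nonzero
            return joint / marginal[:, None]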

  20. Wireless accelerometer iPod application for quantifying gait characteristics.

    Science.gov (United States)

    LeMoyne, Robert; Mastroianni, Timothy; Grundfest, Warren

    2011-01-01

    The capability to quantify gait characteristics through a wireless accelerometer iPod application in an effectively autonomous environment may alleviate the progressive strain on highly specific medical resources. The iPod provides the attributes required for robust gait quantification, such as a three-dimensional accelerometer, data storage, flexible software, and the capacity for wireless transmission of the gait data through email. Building on these integral components, a wireless accelerometer iPod application for quantifying gait characteristics has been tested and evaluated in an essentially autonomous environment. The quantified gait acceleration waveforms were wirelessly transmitted using email for postprocessing. The gait experiment took place in a location remote from the site where the postprocessing was conducted. The wireless accelerometer iPod application for quantifying gait characteristics demonstrated sufficient accuracy and consistency.
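
    One common way to turn such tri-axial recordings into a gait characteristic is to read the stride period off the autocorrelation of the acceleration magnitude. The sketch below assumes a uniformly sampled recording at fs Hz and a stride period between 0.4 s and 2.5 s; it is an illustrative pipeline, not the application's own algorithm:

        import numpy as np

        def stride_period(ax, ay, az, fs):
            mag = np.sqrt(np.square(ax) + np.square(ay) + np.square(az))
            mag = mag - mag.mean()
            ac = np.correlate(mag, mag, mode="full")[len(mag) - 1:]
            lo, hi = int(0.4 * fs), int(2.5 * fs)   # plausible stride lags
            lag = lo + np.argmax(ac[lo:hi])
            return lag / fs                          # dominant period, seconds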

  1. Analyzing complex networks evolution through Information Theory quantifiers

    Energy Technology Data Exchange (ETDEWEB)

    Carpi, Laura C., E-mail: Laura.Carpi@studentmail.newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Rosso, Osvaldo A., E-mail: rosso@fisica.ufmg.b [Departamento de Fisica, Instituto de Ciencias Exatas, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, Belo Horizonte (31270-901), MG (Brazil); Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina); Saco, Patricia M., E-mail: Patricia.Saco@newcastle.edu.a [Civil, Surveying and Environmental Engineering, University of Newcastle, University Drive, Callaghan NSW 2308 (Australia); Departamento de Hidraulica, Facultad de Ciencias Exactas, Ingenieria y Agrimensura, Universidad Nacional de Rosario, Avenida Pellegrini 250, Rosario (Argentina); Ravetti, Martin Gomez, E-mail: martin.ravetti@dep.ufmg.b [Departamento de Engenharia de Producao, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, Belo Horizonte (31270-901), MG (Brazil)

    2011-01-24

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
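
    The first quantifier is straightforward to compute. A minimal sketch, using base-2 logarithms so the square root of the Jensen-Shannon divergence lies in [0, 1]:

        import numpy as np

        def jsd_sqrt(p, q):
            """Square root of the Jensen-Shannon divergence between two
            discrete probability distributions."""
            p = np.asarray(p, float); p = p / p.sum()
            q = np.asarray(q, float); q = q / q.sum()
            m = 0.5 * (p + q)

            def kl(a, b):    # Kullback-Leibler divergence, base 2
                mask = a > 0
                return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

            return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))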

  2. Quantifying within-household transmission of ESBL-producing bacteria

    NARCIS (Netherlands)

    Haverkate, Manon R; Platteel, Tamara N; Fluit, A C; Cohen Stuart, James W; Leverstein-van Hall, Maurine A; Thijsen, Steven F T; Scharringa, J.; Kloosterman, Fieke R C; Bonten, Marc J M; Bootsma, Martin C J

    OBJECTIVES: Patients can acquire extended-spectrum beta-lactamase (ESBL)-producing Enterobacteriaceae during hospitalisation and colonised patients may transmit these bacteria after discharge, most likely to household contacts. In this study, ESBL transmission was quantified in households. METHODS:

  3. Quantifier-Free Interpolation of a Theory of Arrays

    CERN Document Server

    Bruttomesso, Roberto; Ranise, Silvio

    2012-01-01

    The use of interpolants in model checking is becoming an enabling technology to allow fast and robust verification of hardware and software. The application of encodings based on the theory of arrays, however, is limited by the impossibility of deriving quantifier-free interpolants in general. In this paper, we show that it is possible to obtain quantifier-free interpolants for a Skolemized version of the extensional theory of arrays. We prove this in two ways: (1) non-constructively, by using the model theoretic notion of amalgamation, which is known to be equivalent to admitting quantifier-free interpolation for universal theories; and (2) constructively, by designing an interpolating procedure, based on solving equations between array updates. (Interestingly, rewriting techniques are used in the key steps of the solver and its proof of correctness.) To the best of our knowledge, this is the first successful attempt at computing quantifier-free interpolants for a variant of the theory of arrays with extension...

  4. Removal of Quantifiers by Elimination of Boundary Points

    CERN Document Server

    Goldberg, Eugene

    2012-01-01

    We consider the problem of elimination of existential quantifiers from a Boolean CNF formula. Our approach is based on the following observation. One can get rid of dependency on a set of variables of a quantified CNF formula F by adding resolvent clauses of F eliminating boundary points. This approach is similar to the method of quantifier elimination described in [9]. The method described in the present paper differs in two ways: branching is performed only on quantified variables, and an explicit search for boundary points is performed by calls to a SAT-solver. Although we published the paper [9] before this one, chronologically the method of the present report was developed first. Preliminary presentations of this method were made in [10], [11]. We postponed a publication of this method due to preparation of a patent application [8].

  5. Quantifying Arsenic Leaching from Soils Using a Fractional-Derivative Model

    Science.gov (United States)

    Lu, B.; Zhang, Y.; LU, B.

    2015-12-01

    Arsenic leaching from soils can exhibit multiple-rate kinetics due to the heterogeneous nature of the medium, motivating the development of a fractional-order derivative model (FDM). The sorption-desorption process in saturated natural soils may not be limited to a single rate, nor reach equilibrium quickly, even at the laboratory scale. Applications of the FDM show that the multi-rate mass transfer quantifies the multi-stage desorption in arsenic leaching characterized by heavy late-time tailing behavior.
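
    The abstract does not state the governing equation; a common time-fractional form used for such multi-rate transport (written here as an assumed example, not necessarily the authors' exact model) is

        \frac{\partial^{\gamma} C}{\partial t^{\gamma}}
            = -v \frac{\partial C}{\partial x}
              + D \frac{\partial^{2} C}{\partial x^{2}},
        \qquad 0 < \gamma \le 1,

    where C is the dissolved arsenic concentration, v and D are the effective velocity and dispersion coefficient, and the fractional order gamma controls the late-time tailing: the smaller gamma is, the heavier the tail of the leaching curve.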

  6. Pitfalls in quantifying species turnover: the residency effect

    OpenAIRE

    Kevin Chase Burns

    2014-01-01

    The composition of ecological communities changes continuously through time and space. Understanding this turnover in species composition is a central goal in biogeography, but quantifying species turnover can be problematic. Here, I describe an underappreciated source of bias in quantifying species turnover, namely ‘the residency effect’, which occurs when the contiguous distributions of species across sampling domains are small relative to census intervals. I present the results of a simula...

  7. A miniaturized assay to quantify effects of chemicals or physical stimuli upon locust activity

    Institute of Scientific and Technical Information of China (English)

    BRUNO HOSTE; FILIP SAS; TIM VANDERSMISSEN; ARNOLD DE LOOF; MICHAEL BREUER; JURGEN HUYBRECHTS

    2006-01-01

    Solitary and gregarious locusts differ in many traits, such as body color, morphometrics and behavior. With respect to behavior, solitary animals shun each other, while gregarious animals seek each other's company, hence their crowding behavior. General activity, depending on the temperature, occurs throughout the day but is much lower in solitary locusts. Solitary locusts occasionally fly by night while gregarious locusts fly regularly during daytime (swarming). In search of new assays to identify substances that control or modify aspects of (phase) behavior, we designed a simple activity assay, meant to complement existing behavioral measurement tools. The general activity is reflected in the number of wall hits, that is, the number of contacts between the locust and the vertical walls of a small arena. Using this single parameter we were able to quantify differences in total activity of both nymphs and adults of isolation-reared (solitary), regrouped- and crowd-reared (gregarious) locusts under different conditions. Furthermore, we demonstrate that there are inter- and intra-phase dependent differences in the activities of 5th instar nymphs after injections of the three different adipokinetic hormones.

  8. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue has been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  9. Quantifying Cutting and Wearing Behaviors of TiN- and CrN-Coated AISI 1070 Steel

    Directory of Open Access Journals (Sweden)

    Ahmet Cakan

    2008-11-01

    Full Text Available Hard coatings such as titanium nitride (TiN) and chromium nitride (CrN) are widely used in cutting and forming tools against wear and corrosion. In the present study, hard coating films were deposited onto AISI 1070 steels by a cathodic arc evaporation plating (CAVP) technique. These samples were subjected to wear in a conventional lathe for investigating the tribological behaviour of coating structure, and prenitrided subsurface composition was characterized using scanning electron microscopy (SEM), line scan analyses and X-ray diffraction (XRD). The wear properties of TiN- and CrN-coated samples were determined using an on-line monitoring system. The results show that TiN-coated samples demonstrate higher wear resistance than CrN-coated samples.

  10. Quantifying Cutting and Wearing Behaviors of TiN- and CrN-Coated AISI 1070 Steel.

    Science.gov (United States)

    Cakan, Ahmet; Ozkaner, Vedat; Yildirim, Mustafa M

    2008-11-05

    Hard coatings such as titanium nitride (TiN) and chromium nitride (CrN) are widely used in cutting and forming tools against wear and corrosion. In the present study, hard coating films were deposited onto AISI 1070 steels by a cathodic arc evaporation plating (CAVP) technique. These samples were subjected to wear in a conventional lathe for investigating the tribological behaviour of coating structure, and prenitrided subsurface composition was characterized using scanning electron microscopy (SEM), line scan analyses and X-ray diffraction (XRD). The wear properties of TiN- and CrN-coated samples were determined using an on-line monitoring system. The results show that TiN-coated samples demonstrate higher wear resistance than CrN-coated samples.

  11. Quantifying the behavior of price dynamics at opening time in stock market

    Science.gov (United States)

    Ochiai, Tomoshiro; Takada, Hideyuki; Nacher, Jose C.

    2014-11-01

    The availability of huge volumes of financial data has offered the possibility of understanding markets as complex systems characterized by several stylized facts. Here we first show that the time evolution of Japan's Nikkei stock average index (Nikkei 225) futures follows the resistance and breaking-acceleration effects when the complete time series data is analyzed. However, in stock markets there are periods where no regular trades occur between the close of the market on one day and the next day's open. To examine these time gaps we decompose the time series data into opening time and intermediate time. Our analysis indicates that for the intermediate time, both the resistance and the breaking-acceleration effects are still observed. However, for the opening time there are almost no resistance and breaking-acceleration effects, and volatility is always constantly high. These findings highlight unique dynamic differences between stock markets and the forex market and suggest that current risk management strategies may need to be revised to address the absence of these dynamic effects at the opening time.
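
    The opening-versus-intermediate comparison can be reproduced schematically by bucketing log-returns by time since the open and comparing their dispersion. A sketch under assumed inputs (1-minute prices with timestamps given in minutes after the open):

        import numpy as np

        def window_volatility(prices, minutes_after_open, lo, hi):
            """Std. dev. of log-returns whose timestamps fall in [lo, hi)."""
            r = np.diff(np.log(np.asarray(prices, float)))
            t = np.asarray(minutes_after_open)[1:]
            return r[(t >= lo) & (t < hi)].std()

        # e.g. compare the first 5 minutes with the rest of the session:
        # window_volatility(p, m, 0, 5) vs window_volatility(p, m, 5, 360)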

  12. Microfluidic Device to Quantify the Behavior of Therapeutic Bacteria in Three-Dimensional Tumor Tissue

    Science.gov (United States)

    Brackett, Emily L.; Swofford, Charles A.; Forbes, Neil S.

    2016-01-01

    Summary Microfluidic devices enable precise quantification of the interactions between anticancer bacteria and tumor tissue. Direct observation of bacterial movement and gene expression in tissue is not possible with either monolayers of cells or tumor-bearing mice. Quantification of these interactions is necessary to understand the inherent mechanisms of bacterial targeting and to develop modified organisms with enhanced therapeutic properties. Here we describe the procedures for designing, printing and assembling microfluidic tumor-on-a-chip devices. We also describe the procedures for inserting three-dimensional tumor-cell masses, exposing them to bacteria, and analyzing the resultant images. PMID:26846800

  13. Quantifying Grain Level Stress-Strain Behavior for AM40 via Instrumented Microindentation

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Guang; Barker, Erin I.; Stephens, Elizabeth V.; Choi, Kyoo Sil; Sun, Xin

    2016-01-13

    Nanoindentation is performed on hot isostatically pressed (HIP) Mg-Al (AM40) alloy samples produced by high-pressure die casting (HPDC) for the purpose of characterizing the mechanical properties of the α-grains. Extracting elastic modulus and hardness from the resulting load-depth curves is well established; a new inverse method is developed here to extract plastic material properties as well. The method combines an empirical yield strength-hardness relationship reported in the literature with finite element modeling of the indentation. Because of the shallow depth of the indentation, the indentation size effect (ISE) is taken into account when determining plastic properties. Elastic and plastic properties are determined for a series of indents, and the resulting average values and standard deviations are calculated for use as input distributions for microstructure modeling of AM40.
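
    The inverse step can be sketched with stand-in relations: Tabor's empirical rule H ≈ 3 sigma_y for the hardness-yield link and a Nix-Gao-style correction for the indentation size effect. Both the constraint factor and the characteristic depth h* below are assumptions for illustration, not the calibrated values used in the study:

        def yield_strength_MPa(H_GPa, depth_um, h_star_um=0.2, constraint=3.0):
            # Nix-Gao: H(h) = H0 * sqrt(1 + h*/h); invert for the
            # depth-independent hardness H0, then apply Tabor's rule.
            H0 = H_GPa / (1.0 + h_star_um / depth_um) ** 0.5
            return 1000.0 * H0 / constraint

        print(yield_strength_MPa(H_GPa=1.1, depth_um=0.5))   # ~ 310 MPa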

  14. Odontocete Cetaceans: Quantifying Behavioral Ecology and Response to Predators Using a Multi-Species Approach

    Science.gov (United States)

    2016-03-21

    factors promoting the evolution of sociality in mammals (Connor 2000). Group living can provide protection to an individual in a variety of ways... mammal-eating killer whales (Orcinus orca), some of which have similarities to certain military sonars. A secondary objective of the project was to... [Figure 2.1: Spectrograms of calls from mammal-eating killer whales used in...]

  15. Cross-linguistic patterns in the acquisition of quantifiers.

    Science.gov (United States)

    Katsos, Napoleon; Cummins, Chris; Ezeizabarrena, Maria-José; Gavarró, Anna; Kuvač Kraljević, Jelena; Hrzica, Gordana; Grohmann, Kleanthes K; Skordi, Athina; Jensen de López, Kristine; Sundahl, Lone; van Hout, Angeliek; Hollebrandse, Bart; Overweg, Jessica; Faber, Myrthe; van Koert, Margreet; Smith, Nafsika; Vija, Maigi; Zupping, Sirli; Kunnari, Sari; Morisseau, Tiffany; Rusieshvili, Manana; Yatsushiro, Kazuko; Fengler, Anja; Varlokosta, Spyridoula; Konstantzou, Katerina; Farby, Shira; Guasti, Maria Teresa; Vernice, Mirta; Okabe, Reiko; Isobe, Miwa; Crosthwaite, Peter; Hong, Yoonjee; Balčiūnienė, Ingrida; Ahmad Nizar, Yanti Marina; Grech, Helen; Gatt, Daniela; Cheong, Win Nee; Asbjørnsen, Arve; Torkildsen, Janne von Koss; Haman, Ewa; Miękisz, Aneta; Gagarina, Natalia; Puzanova, Julia; Anđelković, Darinka; Savić, Maja; Jošić, Smiljana; Slančová, Daniela; Kapalková, Svetlana; Barberán, Tania; Özge, Duygu; Hassan, Saima; Chan, Cecilia Yuet Hung; Okubo, Tomoya; van der Lely, Heather; Sauerland, Uli; Noveck, Ira

    2016-08-16

    Learners of most languages are faced with the task of acquiring words to talk about number and quantity. Much is known about the order of acquisition of number words as well as the cognitive and perceptual systems and cultural practices that shape it. Substantially less is known about the acquisition of quantifiers. Here, we consider the extent to which systems and practices that support number word acquisition can be applied to quantifier acquisition and conclude that the two domains are largely distinct in this respect. Consequently, we hypothesize that the acquisition of quantifiers is constrained by a set of factors related to each quantifier's specific meaning. We investigate competence with the expressions for "all," "none," "some," "some…not," and "most" in 31 languages, representing 11 language types, by testing 768 5-y-old children and 536 adults. We found a cross-linguistically similar order of acquisition of quantifiers, explicable in terms of four factors relating to their meaning and use. In addition, exploratory analyses reveal that language- and learner-specific factors, such as negative concord and gender, are significant predictors of variation.

  16. An indicator-based method for quantifying farm multifunctionality

    DEFF Research Database (Denmark)

    Andersen, Peter Stubkjær; Vejre, Henrik; Dalgaard, Tommy

    2013-01-01

    Production of food and fibres has traditionally been the main function of agriculture. In the last decades an increased focus on the importance of other functions has been discussed within the framework of agricultural and general land use multifunctionality. To a large extent farmers' decisions and actions determine which functions their farming practices support. The extent of the production function is straightforward to identify and quantify, but problems persist in rating functions such as ecosystem maintenance, housing, and amenity values. This paper presents a method to quantify and compare… described as balance among functions, whilst smaller and bigger farms were biased towards mainly residence and production concerns, respectively. Challenges in quantifying functions still persist, but the suggested approach offers a method by which functionality can be compared among farms and among functions…

  17. Studying Behavioral Ecology on High School & College Campuses: A Practical Guide to Measuring Foraging Behavior Using Urban Wildlife

    Science.gov (United States)

    Baker, Mohammad A. Abu; Emerson, Sara E.; Brown, Joel S.

    2015-01-01

    We present a practical field exercise for ecology and animal behavior classes that can be carried out on campus, using urban wildlife. Students document an animal's feeding behavior to study its interactions with the surrounding environment. In this approach, an animal's feeding behavior is quantified at experimental food patches placed within its…

  19. Determinants of ventilation behavior in naturally ventilated dwellings: Identification and quantification of relationships

    NARCIS (Netherlands)

    Levie, D.; Kluizenaar, Y. de; Hoes-van Oeffelen, E.C.M.; Hofstetter, H.; Janssen, S.A.; Spiekman, M.E.; Koene, F.G.H.

    2014-01-01

    Background: Ventilation in dwellings is essential for well-being and health. However, insight into determinants of ventilation behavior is still limited. Aim: Identifying determinants of ventilation behavior and quantifying relationships. Secondly, identifying household characteristics associated with…

  20. How the brain learns how few are "many": An fMRI study of the flexibility of quantifier semantics.

    Science.gov (United States)

    Heim, Stefan; McMillan, Corey T; Clark, Robin; Baehr, Laura; Ternes, Kylie; Olm, Christopher; Min, Nam Eun; Grossman, Murray

    2016-01-15

    Previous work has shown that the meaning of a quantifier such as "many" or "few" depends in part on quantity. However, the meaning of a quantifier may vary depending on the context, e.g. in the case of common entities such as "many ants" (perhaps several thousand) compared to endangered species such as "many pandas" (perhaps a dozen). In a recent study (Heim et al., 2015, Front. Psychol.) we demonstrated that the relative meaning of "many" and "few" may be changed experimentally. In a truth value judgment task, displays with 40% of circles in a named color initially had a low probability of being labeled "many". After a training phase, the likelihood of accepting 40% as "many" increased. Moreover, the semantic learning effect also generalized to the related quantifier "few", which had not been mentioned in the training phase: fewer 40% arrays were considered "few." In the present study, we tested the hypothesis that this semantic adaptation effect was supported by cytoarchitectonic Brodmann area (BA) 45 in Broca's region, which may contribute to semantic evaluation in the context of language and quantification. In an event-related fMRI study, 17 healthy volunteers performed the same paradigm as in the previous behavioral study. We found a relative signal increase when comparing the critical, trained proportion to untrained proportions. This specific effect was found in left BA 45 for the trained quantifier "many", and in left BA 44 for both quantifiers, reflecting the semantic adjustment for the untrained but related quantifier "few." These findings demonstrate the neural basis for processing the flexible meaning of a quantifier, and illustrate the neuroanatomical structures that contribute to the variable meanings that can be associated with a word used in different contexts.

  1. Quantifying show jumping horse rider expertise using IMUs.

    Science.gov (United States)

    Patterson, M; Doyle, J; Cahill, E; Caulfield, B; McCarthy Persson, U

    2010-01-01

    Horse rider ability has long been measured using horse performance, competition results and visual observation. Scientific methods of measuring rider ability on the flat are emerging, such as measuring position angles and the harmony of the horse-rider system. To date, no research has quantified rider ability in show jumping. Kinematic analysis and motion sensors have been used in sports other than show jumping to measure the quality of motor control patterns in humans. The aim of this study was to quantify rider ability in show jumping using body-mounted IMUs. Preliminary results indicate that there are clear differences between experienced and novice riders during show jumping.
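
    One way to turn body-mounted IMU traces into a rider-ability number is a smoothness or variability statistic; the sketch below (Python, synthetic gyroscope data, not from the study) uses the RMS of a zero-mean angular-velocity trace, where lower values would suggest a steadier motor control pattern:

        import numpy as np

        def rms(signal):
            """Root-mean-square amplitude of a zero-mean IMU trace."""
            signal = np.asarray(signal, dtype=float)
            return float(np.sqrt(np.mean((signal - signal.mean()) ** 2)))

        # Hypothetical trunk angular-velocity traces (rad/s) over a jumping effort.
        rng = np.random.default_rng(0)
        experienced = 0.2 * rng.standard_normal(500)  # steadier seat
        novice = 0.6 * rng.standard_normal(500)       # larger corrective movements

        print("experienced RMS:", round(rms(experienced), 3))
        print("novice RMS:     ", round(rms(novice), 3))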

  2. From Strong Amalgamability to Modularity of Quantifier-Free Interpolation

    CERN Document Server

    Bruttomesso, Roberto; Ranise, Silvio

    2012-01-01

    The use of interpolants in verification is gaining more and more importance. Since theories used in applications are usually obtained as (disjoint) combinations of simpler theories, it is important to modularly re-use interpolation algorithms for the component theories. We show that a sufficient and necessary condition to do this for quantifier-free interpolation is that the component theories have the 'strong (sub-)amalgamation' property. Then, we provide an equivalent syntactic characterization, identify a sufficient condition, and design a combined quantifier-free interpolation algorithm capable of handling both convex and non-convex theories, that subsumes and extends most existing work on combined interpolation.

  3. Applications of Quantified Constraint Solving over the Reals - Bibliography

    CERN Document Server

    Ratschan, Stefan

    2012-01-01

    Quantified constraints over the reals appear in numerous contexts. Usually existential quantification occurs when some parameter can be chosen by the user of a system, and universal quantification when the exact value of a parameter is either unknown, or when it occurs in infinitely many, similar versions. The following is a list of application areas and publications that contain applications for solving quantified constraints over the reals. The list is certainly not complete, but grows as the author encounters new items. Contributions are very welcome!

  4. Hybrid Tractable Classes of Binary Quantified Constraint Satisfaction Problems

    CERN Document Server

    Gao, Jian; Zhou, Junping

    2011-01-01

    In this paper, we investigate the hybrid tractability of binary Quantified Constraint Satisfaction Problems (QCSPs). First, a basic tractable class of binary QCSPs is identified by using the broken-triangle property. In this class, the variable ordering for the broken-triangle property must be the same as that in the prefix of the QCSP. Second, we break this restriction to allow existentially quantified variables to be shifted within or out of their blocks, and thus identify some novel tractable classes by introducing the broken-angle property. Finally, we identify a more generalized tractable class, i.e., the min-of-max extendable class for QCSPs.

  5. Differential scanning calorimetry to quantify the stability of protein cages.

    Science.gov (United States)

    Zhang, Yu; Ardejani, Maziar S

    2015-01-01

    Differential scanning calorimetry (DSC) is an experimental technique in which the difference in the amount of heat required to maintain a sample cell and a reference cell at equal temperature is measured across a range of temperatures. The quantified heat relates to differences in the apparent heat capacity of the analytes, so DSC data provide direct information about the energetics of thermally induced processes in the sample. Here we present a detailed protocol to quantify the thermostability of a protein cage, bacterioferritin (BFR), using differential scanning calorimetry.
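
    As a toy illustration of what a DSC thermogram yields, the Python sketch below locates the apparent melting temperature as the peak of a synthetic excess heat capacity curve and integrates it for a calorimetric enthalpy; the Gaussian peak shape and all numbers are illustrative assumptions, not values from the protocol:

        import numpy as np

        # Synthetic DSC thermogram: temperature (K) vs excess heat capacity (kJ/mol/K).
        temperature = np.linspace(300.0, 380.0, 801)
        tm_true, width = 345.0, 4.0
        heat_capacity = 50.0 * np.exp(-((temperature - tm_true) / width) ** 2)

        # Apparent melting temperature: peak of the excess heat capacity.
        tm = temperature[np.argmax(heat_capacity)]

        # Calorimetric unfolding enthalpy: area under the peak (trapezoid rule).
        dH = float(np.sum(0.5 * (heat_capacity[1:] + heat_capacity[:-1])
                          * np.diff(temperature)))

        print(f"Tm ~ {tm:.1f} K, calorimetric dH ~ {dH:.0f} kJ/mol")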

  6. Degree quantifiers, bare quantifiers and intensifiers in the midfield: A new look at quantification at a distance

    Directory of Open Access Journals (Sweden)

    J.-Marc Authier

    2016-07-01

    Nearly all of the theories of Quantification at a Distance (QAD) that have been put forth in the past fifteen years have assumed that degree quantifiers are first merged in the derivation as midfield (VP-) event-quantifying adverbs. This has one important consequence, pointed out in Bouchard and Burnett (2007: 8), which is that if the restriction of the quantifier in QAD is assumed to be a set of events and if the event variable is introduced in the left periphery of the VP, "the term Quantification at a Distance [...] is, in fact, a misnomer. There is nothing 'long distance' about the semantic composition of QAD; it simply proceeds via adjacency." In this article, I aim to challenge this view. I first introduce novel empirical evidence, which I believe unambiguously supports a movement derivation of QAD. Specifically, I show that the degree quantifiers in QAD have the same distribution as bare quantifiers like tout 'everything' and rien 'nothing', which are arguments of the verb and are therefore first-merged VP-internally, yet are spelled out in the midfield. This leads me to re-examine the data that have led to the hypothesis that a movement analysis of QAD is undesirable and show that alternative explanations can be provided for them. Finally, I offer a new account of QAD, one that reconciles a movement derivation with the facts that have led to its demise.

  7. Evaluation of Quantified Social Perception Circuit Activity as a Neurobiological Marker of Autism Spectrum Disorder.

    Science.gov (United States)

    Björnsdotter, Malin; Wang, Nancy; Pelphrey, Kevin; Kaiser, Martha D

    2016-06-01

    Autism spectrum disorder (ASD) is marked by social disability and is associated with dysfunction in brain circuits supporting social cue perception. The degree to which neural functioning reflects individual-level behavioral phenotype is unclear, slowing the search for functional neuroimaging biomarkers of ASD. To examine whether quantified neural function in social perception circuits may serve as an individual-level marker of ASD in children and adolescents. The cohort study was conducted at the Yale Child Study Center and involved children and adolescents diagnosed as having ASD and typically developing participants. Participants included a discovery cohort and a larger replication cohort. Individual-level social perception circuit functioning was assessed as functional magnetic resonance imaging brain responses to point-light displays of coherent vs scrambled human motion. Outcome measures included performance of quantified brain responses in affected male and female participants in terms of area under the receiver operating characteristic curve (AUC), sensitivity and specificity, and correlations between brain responses and social behavior. Of the 39 participants in the discovery cohort aged 4 to 17 years, 22 had ASD and 30 were boys. Of the 75 participants in the replication cohort aged 7 to 20 years, 37 had ASD and 52 were boys. A relative reduction in social perception circuit responses was identified in discovery cohort boys with ASD at an AUC of 0.75 (95% CI, 0.52-0.89; P = .01); however, typically developing girls and girls with ASD could not be distinguished (P = .54). The results were confirmed in the replication cohort, where brain responses identified boys with ASD at an AUC of 0.79 (95% CI, 0.64-0.91), and brain responses were associated with social behavior in boys but not in girls. Quantified social perception circuit activity is a promising individual-level candidate neural marker of the male ASD behavioral phenotype. Our findings highlight the need to better…
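
    The AUC figures above are the standard ROC summary; the Python sketch below shows the computation on simulated per-participant scores (a generic illustration with scikit-learn, not the study's data or pipeline), where lower brain responses are treated as higher predicted ASD risk by negating the score:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        # Hypothetical brain-response scores: lower on average in the ASD group.
        scores = np.concatenate([rng.normal(0.0, 1.0, 37),    # ASD
                                 rng.normal(1.0, 1.0, 38)])   # typically developing
        labels = np.concatenate([np.ones(37), np.zeros(38)])  # 1 = ASD

        # Negate scores so that reduced responses map to higher predicted risk.
        print(f"AUC = {roc_auc_score(labels, -scores):.2f}")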

  8. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    Science.gov (United States)

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  9. A Note on Spector's Quantifier-Free Rule of Extensionality

    DEFF Research Database (Denmark)

    Kohlenbach, Ulrich

    2001-01-01

    In this note we show that the so-called weakly extensional arithmetic in all finite types, which is based on a quantifier-free rule of extensionality due to C. Spector and which is of significance in the context of Gödel's functional interpretation, does not satisfy the deduction theorem…

  10. A framework for quantifying net benefits of alternative prognostic models

    DEFF Research Database (Denmark)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit)…

  11. Quantifying plate-cleaning using 'The Restaurant of the Future'

    NARCIS (Netherlands)

    Hinton, E.C.; Brunstrom, J.M.; Fay, S.H.; Wilkinson, L.L.; Ferriday, D.; Rogers, P.J.; Wijk, de R.A.

    2013-01-01

    Laboratory-based studies of human dietary behaviour benefit from highly controlled conditions; however, this approach can lack ecological validity. Identifying a reliable method to capture and quantify natural dietary behaviours represents an important challenge for researchers. In this study, we…

  12. Quantifying Poincare’s Continuation Method for Nonlinear Oscillators

    Directory of Open Access Journals (Sweden)

    Daniel Núñez

    2015-01-01

    In the sixties, Loud obtained interesting results of continuation on periodic solutions in driven nonlinear oscillators with small parameter (Loud, 1964). In this paper Loud's results are extended to periodically driven Duffing equations with odd symmetry, quantifying the continuation parameter for a periodic odd solution which is elliptic and emanates from the equilibrium of the unperturbed problem.
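
    For readers unfamiliar with the setting, a standard form of the symmetric forced Duffing problem is sketched below in generic textbook notation (the paper's exact equation and parameter names may differ):

        % Periodically driven Duffing oscillator with odd symmetry; epsilon is
        % the small continuation parameter.
        \[
          \ddot{x} + a\,x + b\,x^{3} = \epsilon\, p(t),
          \qquad p(t+T) = p(t), \qquad p(-t) = -p(t).
        \]
        % Odd symmetry of p is what admits an odd T-periodic solution
        % x(-t) = -x(t) continued from the equilibrium x = 0 at epsilon = 0.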

  13. Quantifying the lag time to detect barriers in landscape genetics

    Science.gov (United States)

    E. L. Landguth; S. A Cushman; M. K. Schwartz; K. S. McKelvey; M. Murphy; G. Luikart

    2010-01-01

    Understanding how spatial genetic patterns respond to landscape change is crucial for advancing the emerging field of landscape genetics. We quantified the number of generations for new landscape barrier signatures to become detectable and for old signatures to disappear after barrier removal. We used spatially explicit, individual-based simulations to examine the…

  14. Quantifying and specifying the solar influence on terrestrial surface temperature

    NARCIS (Netherlands)

    de Jager, C.; Duhau, S.; van Geel, B.

    2010-01-01

    This investigation is a follow-up of a paper in which we showed that both major magnetic components of the solar dynamo, viz. the toroidal and the poloidal ones, are correlated with average terrestrial surface temperatures. Here, we quantify, improve and specify that result and search for their causes…

  15. Quantifying [F-18]fluorodeoxyglucose uptake in the arterial wall

    DEFF Research Database (Denmark)

    Blomberg, Björn Alexander; Bashyam, A.; Ramachandran, A.

    2015-01-01

    Purpose: The human arterial wall is smaller than the spatial resolution of current positron emission tomographs. Therefore, partial volume effects should be considered when quantifying arterial wall F-18-FDG uptake. We evaluated the impact of a novel method for partial volume effect (PVE) correction…

  16. Dealing with Quantifier Scope Ambiguity in Natural Language Understanding

    Science.gov (United States)

    Hafezi Manshadi, Mohammad

    2014-01-01

    Quantifier scope disambiguation (QSD) is one of the most challenging problems in deep natural language understanding (NLU) systems. The most popular approach for dealing with QSD is to simply leave the semantic representation (scope-) underspecified and to incrementally add constraints to filter out unwanted readings. Scope underspecification has…

  17. Quantifying folate bioavailability: a critical appraisal of methods

    NARCIS (Netherlands)

    Boonstra, A.; Verhoef, P.; West, C.E.

    2004-01-01

    Purpose of review: Dietary reference intakes for folate rely on a good estimate of folate bioavailability from the general diet. In this review, current methods for quantifying the bioavailability of dietary folate and specific folate vitamers in humans are reviewed. Emphasis is on isotopic labeling…

  18. Quantifying Water Stress Using Total Water Volumes and GRACE

    Science.gov (United States)

    Richey, A. S.; Famiglietti, J. S.; Druffel-Rodriguez, R.

    2011-12-01

    Water will follow oil as the next critical resource leading to unrest and uprisings globally. To better manage this threat, an improved understanding of the distribution of water stress is required today. This study builds upon previous efforts to characterize water stress by improving both the quantification of human water use and the definition of water availability. Current statistics on human water use are often outdated or inaccurately reported nationally, especially for groundwater. This study improves these estimates by defining human water use in two ways. First, we use NASA's Gravity Recovery and Climate Experiment (GRACE) to isolate the anthropogenic signal in water storage anomalies, which we equate to water use. Second, we quantify an ideal water demand by using average water requirements for the domestic, industrial, and agricultural water use sectors. Water availability has traditionally been limited to "renewable" water, which ignores large, stored water sources that humans use. We compare water stress estimates derived using either renewable water or the total volume of water globally. We use the best-available data to quantify total aquifer and surface water volumes, as compared to groundwater recharge and surface water runoff from land-surface models. The work presented here should provide a more realistic image of water stress by explicitly quantifying groundwater, defining water availability as total water supply, and using GRACE to more accurately quantify water use.
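
    The core accounting in such an analysis is a stress ratio of use to availability; a minimal Python sketch under assumed placeholder volumes (none of these numbers come from the abstract) contrasts the two availability definitions discussed above:

        def water_stress(use_km3_per_yr, availability_km3_per_yr):
            """Dimensionless stress ratio; values near or above 1 indicate stress."""
            return use_km3_per_yr / availability_km3_per_yr

        use = 25.0               # km^3/yr, e.g. inferred from a GRACE storage trend
        renewable_supply = 30.0  # km^3/yr, recharge plus runoff
        total_storage = 400.0    # km^3 of stored ground and surface water
        horizon = 50.0           # yr over which stored water is amortized

        print("stress vs renewable supply:", water_stress(use, renewable_supply))
        print("stress vs total volume:    ", water_stress(use, total_storage / horizon))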

  19. A Sustainability Initiative to Quantify Carbon Sequestration by Campus Trees

    Science.gov (United States)

    Cox, Helen M.

    2012-01-01

    Over 3,900 trees on a university campus were inventoried by an instructor-led team of geography undergraduates in order to quantify the carbon sequestration associated with biomass growth. The setting of the project is described, together with its logistics, methodology, outcomes, and benefits. This hands-on project provided a team of students…

  20. A framework for quantifying net benefits of alternative prognostic models

    NARCIS (Netherlands)

    Rapsomaniki, Eleni; White, Ian R.; Wood, Angela M.; Thompson, Simon G.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of…

  1. Quantifying capital goods for biological treatment of organic waste.

    Science.gov (United States)

    Brogaard, Line K; Petersen, Per H; Nielsen, Peter D; Christensen, Thomas H

    2015-02-01

    Materials and energy used for construction of anaerobic digestion (AD) and windrow composting plants were quantified in detail. The two technologies were quantified in collaboration with consultants and producers of the parts used to construct the plants. The composting plants were quantified at different sizes for three types of waste (garden and park waste, food waste and sludge from wastewater treatment), in amounts of 10,000 or 50,000 tonnes per year. The AD plant was quantified for a capacity of 80,000 tonnes per year. Concrete and steel for the tanks were the main materials for the AD plant. For the composting plants, gravel and concrete slabs for the pavement were used in large amounts. To frame the quantification, environmental impact assessments (EIAs) showed that the steel used for tanks at the AD plant and the concrete slabs at the composting plants made the highest contribution to Global Warming. Compared to the operational impacts reported in the literature, the capital goods of the AD plant made an insignificant contribution of 1-2% to the total impact on Global Warming. For the composting plants, the capital goods accounted for 10-22% of the total impact on Global Warming from composting.

  2. Quantifying soil respiration at landscape scales. Chapter 11

    Science.gov (United States)

    John B. Bradford; Michael G. Ryan

    2008-01-01

    Soil CO2 efflux, or soil respiration, represents a substantial component of carbon cycling in terrestrial ecosystems. Consequently, quantifying soil respiration over large areas and long time periods is an increasingly important goal. However, soil respiration rates vary dramatically in space and time in response to both environmental conditions…

  3. Coupling and quantifying resilience and sustainability in facilities management

    DEFF Research Database (Denmark)

    Cox, Rimante Andrasiunaite; Nielsen, Susanne Balslev; Rode, Carsten

    2015-01-01

    Purpose – The purpose of this paper is to consider how to couple and quantify resilience and sustainability, where sustainability refers not only to environmental impact, but also to economic and social impacts. The way a particular function of a building is provisioned may have significant repercussions beyond just resilience. The goal is to develop a decision support tool for facilities managers. Design/methodology/approach – A risk framework is used to quantify both resilience and sustainability in monetary terms. The risk framework allows resilience and sustainability to be coupled, so that the provisioning of a particular building can be investigated with consideration of functional, environmental, economic and, possibly, social dimensions. Findings – The method of coupling and quantifying resilience and sustainability (CQRS) is illustrated with a simple example that highlights how very different…

  4. Realizability Semantics for Quantified Modal Logic : Generalizing Flagg's 1985 Construction

    NARCIS (Netherlands)

    Rin, B.G.; Walsh, Sean

    2016-01-01

    A semantics for quantified modal logic is presented that is based on Kleene’s notion of realizability. This semantics generalizes Flagg’s 1985 construction of a model of a modal version of Church’s Thesis and first-order arithmetic. While the bulk of the paper is devoted to developing the details of

  5. Lecture Note on Discrete Mathematics: Predicates and Quantifiers

    DEFF Research Database (Denmark)

    Nordbjerg, Finn Ebertsen

    2016-01-01

    This lecture note supplements the treatment of predicates and quantifiers given in standard textbooks on Discrete Mathematics (e.g.: [1]) and introduces the notation used in this course. We will present central concepts that are important, when predicate logic is used for specification...

  6. FRAGSTATS: spatial pattern analysis program for quantifying landscape structure.

    Science.gov (United States)

    Kevin McGarigal; Barbara J. Marks

    1995-01-01

    This report describes a program, FRAGSTATS, developed to quantify landscape structure. FRAGSTATS offers a comprehensive choice of landscape metrics and was designed to be as versatile as possible. The program is almost completely automated and thus requires little technical training. Two separate versions of FRAGSTATS exist: one for vector images and one for raster...

  7. Quantifying Ladder Fuels: A New Approach Using LiDAR

    Science.gov (United States)

    Heather Kramer; Brandon Collins; Maggi Kelly; Scott Stephens

    2014-01-01

    We investigated the relationship between LiDAR and ladder fuels in the northern Sierra Nevada, California USA. Ladder fuels are often targeted in hazardous fuel reduction treatments due to their role in propagating fire from the forest floor to tree crowns. Despite their importance, ladder fuels are difficult to quantify. One common approach is to calculate canopy base...

  10. Quantifying Stakeholder Values of VET Provision in the Netherlands

    Science.gov (United States)

    van der Sluis, Margriet E.; Reezigt, Gerry J.; Borghans, Lex

    2014-01-01

    It is well-known that the quality of vocational education and training (VET) depends on how well a given programme aligns with the values and interests of its stakeholders, but it is less well-known what these values and interests are and to what extent they are shared across different groups of stakeholders. We use vignettes to quantify the…

  11. Field sampling method for quantifying odorants in humid environments

    Science.gov (United States)

    Air quality studies in agricultural environments typically use thermal desorption analysis for quantifying volatile organic compounds (VOCs) associated with odor. Carbon molecular sieves (CMS) are popular sorbent materials used in these studies. However, there is little information on the effects…

  12. DOE: Quantifying the Value of Hydropower in the Electric Grid

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-12-31

    The report summarizes research to quantify the value of hydropower in the electric grid. This 3-year DOE study focused on defining the value of hydropower assets in a changing electric grid. Methods are described for valuation and planning of pumped storage and conventional hydropower. The project team conducted plant case studies, electric system modeling, market analysis, cost data gathering, and evaluations of operating strategies and constraints. Five other reports detailing these research results are available at the project website, www.epri.com/hydrogrid. With increasing deployment of wind and solar renewable generation, many owners, operators, and developers of hydropower have recognized the opportunity to provide more flexibility and ancillary services to the electric grid. To quantify the value of these services, this study focused on the Western Electric Coordinating Council region. A security-constrained unit commitment and economic dispatch model was used to quantify the role of hydropower for several future energy scenarios up to 2020. This hourly production simulation considered transmission requirements to deliver energy, including future expansion plans. Both energy and ancillary service values were considered. Addressing specifically the quantification of pumped storage value, no single value stream dominated predicted plant contributions in the various energy futures. Modeling confirmed that service value depends greatly on location and on competition with other available grid support resources. In this summary, ten different value streams related to hydropower are described. These fell into three categories: operational improvements, new technologies, and electricity market opportunities. Of these ten, the study was able to quantify a monetary value for six by applying both present-day and future scenarios for operating the electric grid. This study confirmed that hydropower resources across the United States contribute significantly to operation of the grid in terms of…

  13. Behavioral coaching.

    Science.gov (United States)

    Seniuk, Holly A; Witts, Benjamin N; Williams, W Larry; Ghezzi, Patrick M

    2013-01-01

    The term behavioral coaching has been used inconsistently in and outside the field of behavior analysis. In the sports literature, the term has been used to describe various intervention strategies, and in the organizational behavior management literature it has been used to describe an approach to training management personnel and staff. This inconsistency is problematic in terms of the replication of behavioral coaching across studies and aligning with Baer, Wolf, and Risley's (1968) technological dimension of applied behavior analysis. The current paper will outline and critique the discrepancies in the use of the term and suggest how Martin and Hrycaiko's (1983) characteristics of behavioral coaching in sports may be used to bring us closer to establishing a consistent definition of the term. In addition, we will suggest how these characteristics can also be applicable to the use of the term behavioral coaching in other domains of behavior analysis.

  14. Localized bending fatigue behavior of high-strength steel monostrands

    DEFF Research Database (Denmark)

    Winkler, Jan; Fischer, Gregor; Georgakis, Christos T.

    2012-01-01

    In this paper, the localized bending fatigue behavior of pretensioned high-strength steel monostrands is investigated. Furthermore, a new methodology using an optical photogrammetry system, which can quantify surface deformations on the strand, is presented. The system allows measurement of the…

  15. Verbal behavior

    OpenAIRE

    Michael, Jack

    1984-01-01

    The recent history and current status of the area of verbal behavior are considered in terms of three major thematic lines: the operant conditioning of adult verbal behavior, learning to be an effective speaker and listener, and developments directly related to Skinner's Verbal Behavior. Other topics not directly related to the main themes are also considered: the work of Kurt Salzinger, ape-language research, and human operant research related to rule-governed behavior.

  17. Periodic behaviors

    CERN Document Server

    Napp, Diego; Shankar, Shiva

    2010-01-01

    This paper studies behaviors that are defined on a torus, or equivalently, behaviors defined in spaces of periodic functions, and establishes their basic properties analogous to classical results of Malgrange, Palamodov, Oberst et al. for behaviors on R^n. These properties - in particular the Nullstellensatz describing the Willems closure - are closely related to integral and rational points on affine algebraic varieties.

  19. Behaviorally Speaking.

    Science.gov (United States)

    Porter, Elias H.; Dutton, Darell W. J.

    1987-01-01

    Consists of two articles focusing on (1) a modern behavioral model that takes cues from Hippocrates' Four Temperaments and (2) use of a behavioral approach to improve the effectiveness of meetings. Lists positive and negative behaviors within the meeting context. (CH)

  20. Information: theory, brain, and behavior.

    Science.gov (United States)

    Jensen, Greg; Ward, Ryan D; Balsam, Peter D

    2013-11-01

    In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified.
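
    The basic quantity behind this approach is the mutual information between, say, stimuli and responses; below is a self-contained Python sketch computing it from a joint count table (hypothetical counts, shown only to make the definition concrete):

        import numpy as np

        def mutual_information(joint_counts):
            """Mutual information (bits) from a joint count table over (x, y)."""
            joint = np.asarray(joint_counts, dtype=float)
            joint = joint / joint.sum()
            px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
            py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
            nz = joint > 0                          # skip zero cells: 0*log0 -> 0
            return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

        # Hypothetical stimulus-by-response counts: rows = stimuli, cols = responses.
        counts = np.array([[40, 10],
                           [12, 38]])
        print(f"I(stimulus; response) = {mutual_information(counts):.3f} bits")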

  2. Quantifying the role of motor imagery in brain-machine interfaces

    Science.gov (United States)

    Marchesotti, Silvia; Bassolino, Michela; Serino, Andrea; Bleuler, Hannes; Blanke, Olaf

    2016-04-01

    Despite technical advances in brain machine interfaces (BMI), for as-yet unknown reasons the ability to control a BMI remains limited to a subset of users. We investigate whether individual differences in BMI control based on motor imagery (MI) are related to differences in MI ability. We assessed whether differences in kinesthetic and visual MI, in the behavioral accuracy of MI, and in electroencephalographic variables, were able to differentiate between high- versus low-aptitude BMI users. High-aptitude BMI users showed higher MI accuracy as captured by subjective and behavioral measurements, pointing to a prominent role of kinesthetic rather than visual imagery. Additionally, for the first time, we applied mental chronometry, a measure quantifying the degree to which imagined and executed movements share a similar temporal profile. We also identified enhanced lateralized μ-band oscillations over sensorimotor cortices during MI in high- versus low-aptitude BMI users. These findings reveal that subjective, behavioral, and EEG measurements of MI are intimately linked to BMI control. We propose that poor BMI control cannot be ascribed only to intrinsic limitations of EEG recordings and that specific questionnaires and mental chronometry can be used as predictors of BMI performance (without the need to record EEG activity).

  3. Making Behavioral Activation More Behavioral

    Science.gov (United States)

    Kanter, Jonathan W.; Manos, Rachel C.; Busch, Andrew M.; Rusch, Laura C.

    2008-01-01

    Behavioral Activation, an efficacious treatment for depression, presents a behavioral theory of depression--emphasizing the need for clients to contact positive reinforcement--and a set of therapeutic techniques--emphasizing provision of instructions rather than therapeutic provision of reinforcement. An integration of Behavioral Activation with…

  4. Wiki surveys: Open and quantifiable social data collection

    CERN Document Server

    Salganik, Matthew J

    2012-01-01

    Research about attitudes and opinions is central to social science and relies on two common methodological approaches: surveys and interviews. While surveys enable the quantification of large amounts of information quickly and at a reasonable cost, they are routinely criticized for being "top-down" and rigid. In contrast, interviews allow unanticipated information to "bubble up" directly from respondents, but are slow, expensive, and difficult to quantify. Advances in computing technology now enable a hybrid approach that combines the quantifiability of a survey and the openness of an interview; we call this new class of data collection tools wiki surveys. Drawing on principles underlying successful information aggregation projects, such as Wikipedia, we propose three general criteria that wiki surveys should satisfy: they should be greedy, collaborative, and adaptive. We then present results from www.allourideas.org, a free and open-source website we created that enables groups all over the world to deploy wiki surveys…
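
    The simplest aggregation behind a pairwise wiki survey is a per-item win rate; the Python sketch below shows that naive scoring on made-up votes (the live site uses a more elaborate estimator, so treat this only as the basic idea):

        from collections import defaultdict

        # Hypothetical pairwise votes: (winner, loser) from "which do you prefer?".
        votes = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]

        wins = defaultdict(int)
        appearances = defaultdict(int)
        for winner, loser in votes:
            wins[winner] += 1
            appearances[winner] += 1
            appearances[loser] += 1

        # Score each idea by the fraction of its appearances it won.
        for idea in sorted(appearances, key=lambda k: wins[k] / appearances[k],
                           reverse=True):
            print(idea, round(wins[idea] / appearances[idea], 2))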

  5. Pitfalls in quantifying species turnover: the residency effect

    Directory of Open Access Journals (Sweden)

    Kevin Chase Burns

    2014-03-01

    The composition of ecological communities changes continuously through time and space. Understanding this turnover in species composition is a central goal in biogeography, but quantifying species turnover can be problematic. Here, I describe an underappreciated source of bias in quantifying species turnover, namely 'the residency effect', which occurs when the contiguous distributions of species across sampling domains are small relative to census intervals. I present the results of a simulation model that illustrates the problem theoretically and then I demonstrate the problem empirically using a long-term dataset of plant species turnover on islands. Results from both exercises indicate that empirical estimates of species turnover may be susceptible to significant observer bias, which may potentially cloud a better understanding of how the composition of ecological communities changes through time.
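
    To make the residency effect concrete: any turnover index computed from two censuses is blind to species that arrive and vanish between them. Below is a minimal Python sketch using one common (Whittaker-style) formulation and invented census lists:

        def turnover(census_a, census_b):
            """(gains + losses) / total species observed across two censuses --
            one common turnover formulation among several."""
            a, b = set(census_a), set(census_b)
            return (len(b - a) + len(a - b)) / len(a | b)

        # Hypothetical island plant censuses one interval apart. A short-lived
        # colonist present only between the censuses contributes nothing here,
        # which is exactly the residency-effect bias described above.
        print(turnover(["kanuka", "flax", "mahoe"], ["kanuka", "flax", "ngaio"]))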

  6. A new paradigm of quantifying ecosystem stress through chemical signatures

    Energy Technology Data Exchange (ETDEWEB)

    Kravitz, Ben [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, P.O. Box 999, MSIN K9-30 Richland Washington 99352 USA; Guenther, Alex B. [Department of Earth System Science, University of California Irvine, 3200 Croul Hall Street Irvine California 92697 USA; Gu, Lianhong [Environmental Sciences Division, Oak Ridge National Laboratory, Oak Ridge Tennessee 37831 USA; Karl, Thomas [Institute of Atmospheric and Crysopheric Sciences, University of Innsbruck, Innrain 52f A-6020 Innsbruck Austria; Kaser, Lisa [National Center for Atmospheric Research, P.O. Box 3000 Boulder Colorado 80307 USA; Pallardy, Stephen G. [Department of Forestry, University of Missouri, 203 Anheuser-Busch Natural Resources Building Columbia Missouri 65211 USA; Peñuelas, Josep [CREAF, Cerdanyola del Vallès 08193 Catalonia Spain; Global Ecology Unit CREAF-CSIC-UAB, CSIC, Cerdanyola del Vallès 08193 Catalonia Spain; Potosnak, Mark J. [Department of Environmental Science and Studies, DePaul University, McGowan South, Suite 203 Chicago Illinois 60604 USA; Seco, Roger [Department of Earth System Science, University of California Irvine, 3200 Croul Hall Street Irvine California 92697 USA

    2016-11-01

    Stress-induced emissions of biogenic volatile organic compounds (VOCs) from terrestrial ecosystems may be one of the dominant sources of VOC emissions world-wide. Understanding the ecosystem stress response could reveal how ecosystems will respond and adapt to climate change and, in turn, quantify changes in the atmospheric burden of VOC oxidants and secondary organic aerosols. Here we argue, based on preliminary evidence from several opportunistic measurement sources, that chemical signatures of stress can be identified and quantified at the ecosystem scale. We also outline future endeavors that we see as next steps toward uncovering quantitative signatures of stress, including new advances in both VOC data collection and analysis of "big data."

  7. Resolving and quantifying overlapped chromatographic bands by transmutation

    Science.gov (United States)

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  8. Quantifying the Impact of Scenic Environments on Health

    Science.gov (United States)

    Seresinhe, Chanuki Illushka; Preis, Tobias; Moat, Helen Susannah

    2015-11-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of “scenicness” for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health, across urban, suburban and rural areas, even when taking core socioeconomic indicators of deprivation into account, such as income, employment and access to services. Our results provide evidence in line with the striking hypothesis that the aesthetics of the environment may have quantifiable consequences for our wellbeing.

  9. A Photometric Method for Quantifying Asymmetries in Disk Galaxies

    CERN Document Server

    Kornreich, D A; Lovelace, R V E; Kornreich, David A.; Haynes, Martha P.; Lovelace, Richard V.E.

    1998-01-01

    A photometric method for quantifying deviations from axisymmetry in optical images of disk galaxies is applied to a sample of 32 face-on and nearly face-on spirals. The method involves comparing the relative fluxes contained within trapezoidal sectors arranged symmetrically about the galaxy center of light, excluding the bulge and/or barred regions. Such a method has several advantages over others, especially when quantifying asymmetry in flocculent galaxies. Specifically, the averaging of large regions improves the signal-to-noise in the measurements; the method is not strongly affected by the presence of spiral arms; and it identifies the kinds of asymmetry that are likely to be dynamically important. Application of this "method of sectors" to R-band images of 32 disk galaxies indicates that about 30% of spirals show deviations from axisymmetry at the 5-sigma level.
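
    A stripped-down Python version of the sector-comparison idea (the paper's exact statistic, sector geometry, and normalization may differ; the fluxes below are invented) compares diametrically opposite sectors about the centre of light:

        import numpy as np

        def sector_asymmetry(sector_fluxes):
            """Mean normalized flux difference between diametrically opposite
            sectors; assumes an even number of sectors about the centre."""
            f = np.asarray(sector_fluxes, dtype=float)
            half = len(f) // 2
            opposite = f[half:]  # sector i faces sector i + half
            return float(np.mean(np.abs(f[:half] - opposite) / (f[:half] + opposite)))

        # Hypothetical R-band fluxes in 8 sectors (bulge excluded), mildly lopsided.
        print(sector_asymmetry([1.0, 1.1, 0.9, 1.0, 0.8, 1.0, 1.1, 0.9]))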

  10. ASPRS research on quantifying the geometric quality of lidar data

    Science.gov (United States)

    Sampath, Aparajithan; Heidemann, Hans K.; Stensaas, Gregory L.; Christopherson, Jon B.

    2014-01-01

    The ASPRS Lidar Cal/Val (calibration/validation) Working Group led by the US Geological Survey (USGS) to establish “Guidelines on Geometric Accuracy and Quality of Lidar Data” has made excellent progress via regular teleconferences and meetings. The group is focused on identifying data quality metrics and establishing a set of guidelines for quantifying the quality of lidar data. The working group has defined and agreed on lidar Data Quality Measures (DQMs) to be used for this purpose. The DQMs are envisaged as the first ever consistent way of checking lidar data. It is expected that these metrics will be used as standard methods for quantifying the geometric quality of lidar data. The goal of this article is to communicate these developments to the readers and the larger geospatial community and invite them to participate in the process.  

  11. Thermophoresis in nanoliter droplets to quantify aptamer binding.

    Science.gov (United States)

    Seidel, Susanne A I; Markwardt, Niklas A; Lanzmich, Simon A; Braun, Dieter

    2014-07-21

    Biomolecule interactions are central to pharmacology and diagnostics. These interactions can be quantified by thermophoresis, the directed molecule movement along a temperature gradient. It is sensitive to binding induced changes in size, charge, or conformation. Established capillary measurements require at least 0.5 μL per sample. We cut down sample consumption by a factor of 50, using 10 nL droplets produced with acoustic droplet robotics (Labcyte). Droplets were stabilized in an oil-surfactant mix and locally heated with an IR laser. Temperature increase, Marangoni flow, and concentration distribution were analyzed by fluorescence microscopy and numerical simulation. In 10 nL droplets, we quantified AMP-aptamer affinity, cooperativity, and buffer dependence. Miniaturization and the 1536-well plate format make the method high-throughput and automation friendly. This promotes innovative applications for diagnostic assays in human serum or label-free drug discovery screening.
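
    Affinities like the AMP-aptamer Kd above are typically extracted by fitting a 1:1 binding isotherm to the titration signal; here is a generic Python sketch with scipy (synthetic data, not the paper's measurements):

        import numpy as np
        from scipy.optimize import curve_fit

        def fraction_bound(conc, kd):
            """1:1 binding isotherm with ligand in excess: theta = c / (c + Kd)."""
            return conc / (conc + kd)

        # Hypothetical titration: ligand concentration (uM) vs normalized signal.
        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
        signal = np.array([0.04, 0.12, 0.31, 0.58, 0.82, 0.93, 0.98])

        (kd,), _ = curve_fit(fraction_bound, conc, signal, p0=[1.0])
        print(f"Kd ~ {kd:.2f} uM")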

  12. How to quantify coherence: distinguishing speakable and unspeakable notions

    CERN Document Server

    Marvian, Iman

    2016-01-01

    Quantum coherence is a critical resource for many operational tasks. Understanding how to quantify and manipulate it also promises to have applications for a diverse set of problems in theoretical physics. For certain applications, however, one requires coherence between the eigenspaces of specific physical observables, such as energy, angular momentum, or photon number, and it makes a difference which eigenspaces appear in the superposition. For others, there is a preferred set of subspaces relative to which coherence is deemed a resource, but it is irrelevant which of the subspaces appear in the superposition. We term these two types of coherence unspeakable and speakable respectively. We argue that a useful approach to quantifying and characterizing unspeakable coherence is provided by the resource theory of asymmetry when the symmetry group is a group of translations, and we translate a number of prior results on asymmetry into the language of coherence. We also highlight some of the applications of this…

  13. Quantifying temporal change in biodiversity: challenges and opportunities

    Science.gov (United States)

    Dornelas, Maria; Magurran, Anne E.; Buckland, Stephen T.; Chao, Anne; Chazdon, Robin L.; Colwell, Robert K.; Curtis, Tom; Gaston, Kevin J.; Gotelli, Nicholas J.; Kosnik, Matthew A.; McGill, Brian; McCune, Jenny L.; Morlon, Hélène; Mumby, Peter J.; Øvreås, Lise; Studeny, Angelika; Vellend, Mark

    2013-01-01

    Growing concern about biodiversity loss underscores the need to quantify and understand temporal change. Here, we review the opportunities presented by biodiversity time series, and address three related issues: (i) recognizing the characteristics of temporal data; (ii) selecting appropriate statistical procedures for analysing temporal data; and (iii) inferring and forecasting biodiversity change. With regard to the first issue, we draw attention to defining characteristics of biodiversity time series—lack of physical boundaries, uni-dimensionality, autocorrelation and directionality—that inform the choice of analytic methods. Second, we explore methods of quantifying change in biodiversity at different timescales, noting that autocorrelation can be viewed as a feature that sheds light on the underlying structure of temporal change. Finally, we address the transition from inferring to forecasting biodiversity change, highlighting potential pitfalls associated with phase-shifts and novel conditions. PMID:23097514

  14. Properties and relative measure for quantifying quantum synchronization

    Science.gov (United States)

    Li, Wenlin; Zhang, Wenzhao; Li, Chong; Song, Heshan

    2017-07-01

    Although quantum synchronization phenomena and corresponding measures have been widely discussed recently, it is still an open question how to characterize directly the influence of nonlocal correlation, which is the key distinction for identifying classical and quantum synchronizations. In this paper, we present basic postulates for quantifying quantum synchronization based on the related theory in Mari's work [Phys. Rev. Lett. 111, 103605 (2013), 10.1103/PhysRevLett.111.103605], and we give a general formula of a quantum synchronization measure with clear physical interpretations. By introducing Pearson's parameter, we show that the obvious characteristics of our measure are the relativity and monotonicity. As an example, the measure is applied to describe synchronization among quantum optomechanical systems under a Markovian bath. We also show the potential by quantifying generalized synchronization and discrete variable synchronization with this measure.
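
    As a purely classical toy version of the Pearson-parameter idea (the paper's measure is a quantum one; this Python sketch only illustrates correlation-based synchronization scoring on simulated trajectories):

        import numpy as np

        def pearson_sync(x, y):
            """Pearson correlation of two trajectories as a crude sync score."""
            return float(np.corrcoef(x, y)[0, 1])

        t = np.linspace(0.0, 20.0 * np.pi, 2000)
        x = np.sin(t)
        y = np.sin(t + 0.3)   # nearly synchronized, small phase lag
        z = np.sin(1.3 * t)   # detuned oscillator

        print("sync(x, y):", round(pearson_sync(x, y), 3))
        print("sync(x, z):", round(pearson_sync(x, z), 3))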

  15. A new method for quantifying entanglement of multipartite entangled states

    Science.gov (United States)

    Su, Pei-Yuan; Li, Wen-Dong; Ma, Xiao-Ping; Liu, Kai; Wang, Zhao-Ming; Gu, Yong-Jian

    2017-08-01

    We propose a new way for quantifying entanglement of multipartite entangled states which have a symmetrical structure and can be expressed as valence-bond-solid states. We put forward a new concept, the 'unit': the entangled state can be decomposed into a series of units or be reconstructed by multiplying the units successively, which simplifies the analyses of multipartite entanglement greatly. We compute and add up the generalized concurrence of each unit to quantify the entanglement of the whole state. We verify that the new method coincides with concurrence for two-partite pure states. We prove that the new method is a good entanglement measure obeying the three necessary conditions for all good entanglement quantification methods. Based on the method, we compute the entanglement of multipartite GHZ, cluster and AKLT states.

  16. Quantifying social vulnerability for flood disasters of insurance company

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Social vulnerability assessments are largely ignored when compared with biophysical vulnerability assessments. This is mainly due to the fact that there are more difficulties in quantifying them. Aiming at several pitfalls still existing in the Hoovering approach, which is widely accepted, a suitable modified model is provided. In this modified model, the integrated vulnerability is made analogous to the elasticity coefficient of a spring, and an objective evaluation criterion is established. With the evaluation…

  17. The quantified self a sociology of self-tracking

    CERN Document Server

    Lupton, Deborah

    2016-01-01

    With the advent of digital devices and software, self-tracking practices have gained new adherents and have spread into a wide array of social domains. The Quantified Self movement has emerged to promote 'self knowledge through numbers'. In this ground-breaking book, Deborah Lupton critically analyses the social, cultural and political dimensions of contemporary self-tracking and identifies the concepts of selfhood, human embodiment and the value of data that underpin them.

  18. Quantifying the Impact of Scenic Environments on Health

    OpenAIRE

    Chanuki Illushka Seresinhe; Tobias Preis; Helen Susannah Moat

    2015-01-01

    Few people would deny an intuitive sense of increased wellbeing when spending time in beautiful locations. Here, we ask: can we quantify the relationship between environmental aesthetics and human health? We draw on data from Scenic-Or-Not, a website that crowdsources ratings of "scenicness" for geotagged photographs across Great Britain, in combination with data on citizen-reported health from the Census for England and Wales. We find that inhabitants of more scenic environments report better health…

  20. Quantifying hypoxia in human cancers using static PET imaging

    Science.gov (United States)

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G.; Milosevic, Michael; Hedley, David W.; Jaffray, David A.

    2016-11-01

    Compared to FDG, the signal of 18F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties—well-perfused without substantial necrosis or partitioning—for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in ‘inter-corporal’ transport properties—blood volume and clearance rate—as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K 3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.
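
    As a rough illustration of the pharmacokinetics involved (not the authors' reaction-diffusion model), a generic irreversible two-tissue compartment model can be integrated and the late-time tumour activity normalized by blood, as the abstract recommends. All rate constants and the blood input function below are invented for illustration.

    import numpy as np
    from scipy.integrate import odeint

    def two_tissue_irreversible(y, t, K1, k2, k3, blood):
        """Generic irreversible two-tissue compartment model: free tracer Cf
        exchanges with blood Cb; irreversible binding at rate k3 is the
        hypoxia-related signal (analogous to the FAZA binding rate K3)."""
        Cf, Cbound = y
        Cb = blood(t)
        dCf = K1 * Cb - (k2 + k3) * Cf
        dCbound = k3 * Cf
        return [dCf, dCbound]

    # Assumed mono-exponential blood input function (illustrative only).
    blood = lambda t: np.exp(-t / 60.0)

    t = np.linspace(0.0, 240.0, 481)  # minutes
    sol = odeint(two_tissue_irreversible, [0.0, 0.0], t,
                 args=(0.1, 0.2, 0.05, blood))
    uptake = sol.sum(axis=1)

    # Tumour-to-blood normalization at a late "static" imaging time point.
    tb_ratio = uptake[-1] / blood(t[-1])
    print(tb_ratio)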

  1. A strong completeness theorem in intuitionistic quantified modal logic

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the intuitionistic first-order predicate calculus H given by Thomason, together with the modal machinery of MIPC put forward by Prior, this paper obtains the intuitionistic quantified modal logic system MIPC*, gives it a semantic interpretation, and proves its strong (thus also weak) completeness theorem and soundness theorem with respect to that semantics. Since Zorn's lemma plays a decisive role in our discussion, the result is, methodologically, even farther from the intuitionistic point of view than Thomason's.

  2. A strong completeness theorem in intuitionistic quantified modal logic

    Institute of Scientific and Technical Information of China (English)

    高恒珊

    2000-01-01

    Based on the intuitionistic first-order predicate calculus H given by Thomason, together with the modal machinery of MIPC put forward by Prior, this paper obtains the intuitionistic quantified modal logic system MIPC*, gives it a semantic interpretation, and proves its strong (thus also weak) completeness theorem and soundness theorem with respect to that semantics. Since Zorn's lemma plays a decisive role in our discussion, the result is, methodologically, even farther from the intuitionistic point of view than Thomason's.

  3. Quantifying self-motives: functional links between dispositional desires

    OpenAIRE

    Gregg, Aiden P.; Hepper, Erica G.; Sedikides, Constantine

    2011-01-01

    Previous research has sought to establish the existence, or gauge the relative strength, of key self-evaluation motives (i.e., self-enhancement, self-verification, self-assessment, self-improvement). Here, we attempted, across five samples, to quantify individual differences in self-motive strength, and explore their empirical ramifications. We devised brief self-report indices for each self-motive, and checked their factor structure, reliability, and validity. We found that self-enhancement ...

  4. Quantifying the Dynamic Ocean Surface Using Underwater Radiometric Measurements

    Science.gov (United States)

    2015-03-31

    Report documentation page only; recoverable details: final report covering March 2013 - February 2015, 'Quantifying the Dynamic Ocean Surface Using Underwater Radiometric Measurements,' contract N00014-13-1-0352, Yue, Dick K.P.

  5. Quantifying the Dynamic Ocean Surface Using Underwater Radiometric Measurement

    Science.gov (United States)

    2013-09-30

    Report documentation page only; recoverable details: report dated 30 September 2013, 'Quantifying the Dynamic Ocean Surface Using Underwater Radiometric Measurement,' Lian Shen, Department of Mechanical Engineering & St. Anthony Falls Laboratory, University of Minnesota, Minneapolis, MN.

  6. Quantifying hypoxia in human cancers using static PET imaging.

    Science.gov (United States)

    Taylor, Edward; Yeung, Ivan; Keller, Harald; Wouters, Bradley G; Milosevic, Michael; Hedley, David W; Jaffray, David A

    2016-11-21

    Compared to FDG, the signal of (18)F-labelled hypoxia-sensitive tracers in tumours is low. This means that in addition to the presence of hypoxic cells, transport properties contribute significantly to the uptake signal in static PET images. This sensitivity to transport must be minimized in order for static PET to provide a reliable standard for hypoxia quantification. A dynamic compartmental model based on a reaction-diffusion formalism was developed to interpret tracer pharmacokinetics and applied to static images of FAZA in twenty patients with pancreatic cancer. We use our model to identify tumour properties-well-perfused without substantial necrosis or partitioning-for which static PET images can reliably quantify hypoxia. Normalizing the measured activity in a tumour voxel by the value in blood leads to a reduction in the sensitivity to variations in 'inter-corporal' transport properties-blood volume and clearance rate-as well as imaging study protocols. Normalization thus enhances the correlation between static PET images and the FAZA binding rate K 3, a quantity which quantifies hypoxia in a biologically significant way. The ratio of FAZA uptake in spinal muscle and blood can vary substantially across patients due to long muscle equilibration times. Normalized static PET images of hypoxia-sensitive tracers can reliably quantify hypoxia for homogeneously well-perfused tumours with minimal tissue partitioning. The ideal normalizing reference tissue is blood, either drawn from the patient before PET scanning or imaged using PET. If blood is not available, uniform, homogeneously well-perfused muscle can be used. For tumours that are not homogeneously well-perfused or for which partitioning is significant, only an analysis of dynamic PET scans can reliably quantify hypoxia.

  7. Towards Quantifying Programmable Logic Controller Resilience Against Intentional Exploits

    Science.gov (United States)

    2012-03-22

    Queiroz (2010) and Germanus (2010) present individual models for SCADA security analysis at a macro-level, while Shah (2008) explores SCADA security ... systems. Queiroz (2010) presents a model to quantify SCADA system performance against a denial of service (DoS) attack.

  8. Video chat technology to remotely quantify dietary, supplement, and medication adherence in clinical trials

    Science.gov (United States)

    Peterson, Courtney M.; Apolzan, John W.; Wright, Courtney; Martin, Corby K.

    2017-01-01

    We conducted a pair of studies to test the validity, reliability, feasibility, and acceptability of using video chat technology as a novel method to quantify dietary and pill-taking (i.e., supplement and medication) adherence. In the first study, we investigated whether video chat technology can accurately quantify adherence to dietary and pill-taking interventions. Mock study participants ate food items and swallowed pills while performing randomized scripted "cheating" behaviors designed to mimic non-adherence. Monitoring was conducted in a crossover design, with two monitors watching in-person and two watching remotely by Skype on a smartphone. For the second study, a 22-question online survey was sent to an email listserv with more than 20,000 unique email addresses of past and present study participants to assess the feasibility and acceptability of the technology. For the dietary adherence tests, monitors detected 86% of non-adherent events (sensitivity) in-person versus 78% of events via video chat monitoring (p=0.12), with comparable inter-rater agreement (0.88 vs. 0.85; p=0.62). However, for pill-taking, non-adherence trended towards being more easily detected in-person than by video chat (77% vs. 60%; p=0.08), with non-significantly higher inter-rater agreement (0.85 vs. 0.69; p=0.21). Survey results from the second study (N=1,076 respondents; at least a 5% response rate) indicated that 86.4% of study participants had video chatting hardware, 73.3% were comfortable using the technology, and 79.8% were willing to use it for clinical research. Given the capability of video chat technology to reduce participant burden and to outperform other adherence monitoring methods such as dietary self-report and pill counts, video chatting is a novel and highly promising platform to quantify dietary and pill-taking adherence. PMID:27753427
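
    The two statistics the study reports, sensitivity and inter-rater agreement, are standard; a minimal sketch of how they might be computed from paired monitor codes follows. The event codes are made up, and Cohen's kappa is assumed to be the agreement measure used.

    import numpy as np

    def sensitivity(truth, detected):
        """Fraction of true non-adherent events flagged by the monitor."""
        truth, detected = np.asarray(truth, bool), np.asarray(detected, bool)
        return (truth & detected).sum() / truth.sum()

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two raters (binary codes)."""
        r1, r2 = np.asarray(r1, bool), np.asarray(r2, bool)
        po = (r1 == r2).mean()                      # observed agreement
        pe = (r1.mean() * r2.mean()
              + (1 - r1.mean()) * (1 - r2.mean()))  # chance agreement
        return (po - pe) / (1 - pe)

    # Illustrative made-up codes: 1 = non-adherent event flagged.
    truth  = [1, 1, 1, 1, 0, 0, 0, 1, 1, 0]
    rater1 = [1, 1, 0, 1, 0, 0, 0, 1, 1, 0]
    rater2 = [1, 0, 0, 1, 0, 1, 0, 1, 1, 0]
    print(sensitivity(truth, rater1), cohens_kappa(rater1, rater2))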

  9. Information criteria for quantifying loss of reversibility in parallelized KMC

    Energy Technology Data Exchange (ETDEWEB)

    Gourgoulias, Konstantinos, E-mail: gourgoul@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu; Rey-Bellet, Luc, E-mail: luc@math.umass.edu

    2017-01-01

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.
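
    The paper's a posteriori estimators are not reproduced in the abstract; a naive trajectory-based stand-in for the entropy production rate, comparing the frequencies of forward and reverse jumps, conveys the idea. The estimator below is a textbook approximation, not the authors' construction.

    import numpy as np
    from collections import Counter

    def entropy_production_rate(trajectory, dt=1.0):
        """Naive estimate of entropy production per unit time from a
        discrete-state trajectory: sum over observed jumps i->j of
        C_ij * ln(C_ij / C_ji), where C counts transitions in each
        direction. Vanishes (in expectation) for a reversible process.
        Pairs never observed in reverse are skipped, which biases the
        estimate; a proper estimator would regularize them."""
        counts = Counter(zip(trajectory[:-1], trajectory[1:]))
        total_time = (len(trajectory) - 1) * dt
        ep = 0.0
        for (i, j), cij in counts.items():
            if i == j:
                continue
            cji = counts.get((j, i), 0)
            if cji > 0:
                ep += cij * np.log(cij / cji)
        return ep / total_time

    # A biased random walk on a ring is irreversible; an unbiased one is not.
    rng = np.random.default_rng(0)
    steps = rng.choice([1, -1], p=[0.7, 0.3], size=100_000)
    traj = np.cumsum(steps) % 10
    print(entropy_production_rate(traj))  # positive for the biased walk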

  10. Using multilevel models to quantify heterogeneity in resource selection

    Science.gov (United States)

    Wagner, T.; Diefenbach, D.R.; Christensen, S.A.; Norton, A.S.

    2011-01-01

    Models of resource selection are being used increasingly to predict or model the effects of management actions rather than simply quantifying habitat selection. Multilevel, or hierarchical, models are an increasingly popular method to analyze animal resource selection because they impose a relatively weak stochastic constraint to model heterogeneity in habitat use and also account for unequal sample sizes among individuals. However, few studies have used multilevel models to model coefficients as a function of predictors that may influence habitat use at different scales or quantify differences in resource selection among groups. We used an example with white-tailed deer (Odocoileus virginianus) to illustrate how to model resource use as a function of distance to road that varies among deer by road density at the home range scale. We found that deer avoidance of roads decreased as road density increased. Also, we used multilevel models with sika deer (Cervus nippon) and white-tailed deer to examine whether resource selection differed between species. We failed to detect differences in resource use between these two species and showed how information-theoretic and graphical measures can be used to assess how resource use may have differed. Multilevel models can improve our understanding of how resource selection varies among individuals and provides an objective, quantifiable approach to assess differences or changes in resource selection. ?? The Wildlife Society, 2011.

  11. Information criteria for quantifying loss of reversibility in parallelized KMC

    Science.gov (United States)

    Gourgoulias, Konstantinos; Katsoulakis, Markos A.; Rey-Bellet, Luc

    2017-01-01

    Parallel Kinetic Monte Carlo (KMC) is a potent tool to simulate stochastic particle systems efficiently. However, despite literature on quantifying domain decomposition errors of the particle system for this class of algorithms in the short and in the long time regime, no study yet explores and quantifies the loss of time-reversibility in Parallel KMC. Inspired by concepts from non-equilibrium statistical mechanics, we propose the entropy production per unit time, or entropy production rate, given in terms of an observable and a corresponding estimator, as a metric that quantifies the loss of reversibility. Typically, this is a quantity that cannot be computed explicitly for Parallel KMC, which is why we develop a posteriori estimators that have good scaling properties with respect to the size of the system. Through these estimators, we can connect the different parameters of the scheme, such as the communication time step of the parallelization, the choice of the domain decomposition, and the computational schedule, with its performance in controlling the loss of reversibility. From this point of view, the entropy production rate can be seen both as an information criterion to compare the reversibility of different parallel schemes and as a tool to diagnose reversibility issues with a particular scheme. As a demonstration, we use Sandia Lab's SPPARKS software to compare different parallelization schemes and different domain (lattice) decompositions.

  12. MEI AND DOU IN CHINESE: A TALE OF TWO QUANTIFIERS

    Directory of Open Access Journals (Sweden)

    Qiong-peng Luo

    2011-12-01

    Full Text Available This study addresses two outstanding puzzles about the two well-known quantifiers mei and dou in Chinese: (i) the indefinite/definite asymmetry when mei leads the subject NP: dou is not needed when there is an indefinite or a reflexive object within the scope of mei, and (ii) the subject/object asymmetry: when mei leads the subject NP, its distribution is restricted, depending on the type of the object, whereas when it leads the object NP, its distribution is much freer. We propose a novel account for these puzzles. We argue that (i) the indefinite/definite asymmetry can be explained away if we assume that mei is a distributive quantifier with a portmanteau semantic structure, i.e., a standard universal quantifier plus a matching function; (ii) mei can be domain-shifted into a distributive determiner to satisfy interpretability, which explains the subject/object asymmetry; and (iii) this domain-shifting is regulated by the Principle of Economy (cf. Reinhart 2006), which is a last resort to satisfy interpretability.

  13. Radiology reports: a quantifiable and objective textual approach.

    Science.gov (United States)

    Scott, J A; Palmer, E L

    2015-11-01

    To examine the feasibility of using automated lexical analysis in conjunction with machine learning to create a means of objectively characterising radiology reports for quality improvement. Twelve lexical parameters were quantified from the collected reports of four radiologists. These included the number of different words used, number of sentences, reading grade, readability, usage of the passive voice, and lexical metrics of concreteness, ambivalence, complexity, passivity, embellishment, communication and cognition. Each radiologist was statistically compared to the mean of the group for each parameter to determine outlying report characteristics. The reproducibility of these parameters in a given radiologist's reporting style was tested by using only these 12 parameters as input to a neural network designed to establish the authorship of 60 unknown reports. Significant differences in report characteristics were observed between radiologists, quantifying and characterising deviations of individuals from the group reporting style. The 12 metrics employed in a neural network correctly identified the author in each of 60 unknown reports tested, indicating a robust parametric signature. Automated and quantifiable methods can be used to analyse reporting style and provide impartial and objective feedback as well as to detect and characterise significant differences from the group. The parameters examined are sufficiently specific to identify the authors of reports and can potentially be useful in quality improvement and residency training. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
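
    Two of the twelve lexical parameters, sentence counts and reading grade, are straightforward to compute; a sketch using the standard Flesch-Kincaid grade formula follows. The syllable counter is a crude heuristic, and whether this is the paper's exact readability metric is an assumption.

    import re

    def count_syllables(word):
        """Crude heuristic: count groups of consecutive vowels."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        """Flesch-Kincaid grade level:
        0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * len(words) / len(sentences)
                + 11.8 * syllables / len(words) - 15.59)

    report = ("There is a focal opacity in the right lower lobe. "
              "Pneumonia should be considered in the proper clinical setting.")
    print(flesch_kincaid_grade(report))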

  14. Quantifying the underlying landscape and paths of cancer

    Science.gov (United States)

    Li, Chunhe; Wang, Jin

    2014-01-01

    Cancer is a disease regulated by the underlying gene networks. The emergence of normal and cancer states as well as the transformation between them can be thought of as a result of the gene network interactions and associated changes. We developed a global potential landscape and path framework to quantify cancer and associated processes. We constructed a cancer gene regulatory network based on the experimental evidences and uncovered the underlying landscape. The resulting tristable landscape characterizes important biological states: normal, cancer and apoptosis. The landscape topography in terms of barrier heights between stable state attractors quantifies the global stability of the cancer network system. We propose two mechanisms of cancerization: one is by the changes of landscape topography through the changes in regulation strengths of the gene networks. The other is by the fluctuations that help the system to go over the critical barrier at fixed landscape topography. The kinetic paths from least action principle quantify the transition processes among normal state, cancer state and apoptosis state. The kinetic rates provide the quantification of transition speeds among normal, cancer and apoptosis attractors. By the global sensitivity analysis of the gene network parameters on the landscape topography, we uncovered some key gene regulations determining the transitions between cancer and normal states. This can be used to guide the design of new anti-cancer tactics, through cocktail strategy of targeting multiple key regulation links simultaneously, for preventing cancer occurrence or transforming the early cancer state back to normal state. PMID:25232051

  15. How to quantify coherence: Distinguishing speakable and unspeakable notions

    Science.gov (United States)

    Marvian, Iman; Spekkens, Robert W.

    2016-11-01

    Quantum coherence is a critical resource for many operational tasks. Understanding how to quantify and manipulate it also promises to have applications for a diverse set of problems in theoretical physics. For certain applications, however, one requires coherence between the eigenspaces of specific physical observables, such as energy, angular momentum, or photon number, and it makes a difference which eigenspaces appear in the superposition. For others, there is a preferred set of subspaces relative to which coherence is deemed a resource, but it is irrelevant which of the subspaces appear in the superposition. We term these two types of coherence unspeakable and speakable, respectively. We argue that a useful approach to quantifying and characterizing unspeakable coherence is provided by the resource theory of asymmetry when the symmetry group is a group of translations, and we translate a number of prior results on asymmetry into the language of coherence. We also highlight some of the applications of this approach, for instance, in the context of quantum metrology, quantum speed limits, quantum thermodynamics, and nuclear magnetic resonance (NMR). The question of how best to treat speakable coherence as a resource is also considered. We review a popular approach in terms of operations that preserve the set of incoherent states, propose an alternative approach in terms of operations that are covariant under dephasing, and we outline the challenge of providing a physical justification for either approach. Finally, we note some mathematical connections that hold among the different approaches to quantifying coherence.
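
    One widely used quantifier in the incoherent-states framework the authors review is the relative entropy of coherence, C(ρ) = S(Δ(ρ)) − S(ρ), where Δ dephases in the fixed reference basis. A minimal sketch:

    import numpy as np
    from scipy.linalg import eigvalsh

    def von_neumann_entropy(rho):
        """S(rho) = -sum_k lambda_k log2 lambda_k over nonzero eigenvalues."""
        lam = eigvalsh(rho)
        lam = lam[lam > 1e-12]
        return float(-(lam * np.log2(lam)).sum())

    def relative_entropy_of_coherence(rho):
        """C(rho) = S(diag(rho)) - S(rho) in the reference basis."""
        dephased = np.diag(np.diag(rho))
        return von_neumann_entropy(dephased) - von_neumann_entropy(rho)

    # The maximally coherent qubit state |+><+| carries one bit of coherence.
    plus = np.array([[0.5, 0.5], [0.5, 0.5]])
    print(relative_entropy_of_coherence(plus))  # ~1.0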

  16. Quantifying uncertainty in proxy-based paleoclimate reconstructions

    Science.gov (United States)

    D'Andrea, William J.; Polissar, Pratigya J.

    2017-04-01

    There are now numerous geochemical tools that can provide quantitative estimates of past temperatures and hydrologic variables. These proxies for past climate are usually calibrated using empirical relationships with quantifiable uncertainty. Laboratory analysis introduces additional sources of uncertainty that must also be accounted for in order to fully propagate the uncertainty in an estimated climate variable. The aim of this presentation is to review the sources of uncertainty that can be quantified and reported when using different geochemical climate proxies for paleoclimate reconstruction. I will consider a number of widely used climate proxies, discuss the relative importance of various sources of uncertainty for each, and attempt to identify ways in which the scientific community might reduce these uncertainties. For example, compound-specific δD measurements can be used for quantitative estimation of source water δD values, a useful tracer for paleohydrologic changes. Such estimates have quantifiable levels of uncertainty that are often miscalculated, resulting in inaccurate error reporting in the scientific literature that can impact paleohydrologic interpretations. Here, I will summarize the uncertainties inherent to molecular δD measurements and the quantification of source water δD values, and discuss the assumptions involved when omitting various sources of uncertainty. The analytical uncertainty of δD measurements is often improperly estimated and is secondary to the apparent fractionation between the δD values of source water and molecule; normalization of the data to the VSMOW scale introduces the largest amount of uncertainty.
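
    The propagation step described here, for independent error sources, amounts to combination in quadrature. The sketch below is generic; the three variance components and their magnitudes are placeholders, not values from the presentation.

    import numpy as np

    def combined_uncertainty(*sigmas):
        """Total 1-sigma uncertainty for independent error sources,
        combined in quadrature."""
        return float(np.sqrt(np.sum(np.square(sigmas))))

    # Placeholder 1-sigma components (per mil) for a source-water dD estimate:
    sigma_analytical = 2.0   # instrument repeatability
    sigma_vsmow      = 3.0   # normalization to the VSMOW scale
    sigma_epsilon    = 10.0  # apparent fractionation (calibration scatter)
    print(combined_uncertainty(sigma_analytical, sigma_vsmow, sigma_epsilon))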

  17. A novel approach to quantify cybersecurity for electric power systems

    Science.gov (United States)

    Kaster, Paul R., Jr.

    Electric power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a commonly accepted method of quantifying and evaluating a system's security is lacking. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail at one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  18. Psychological behaviorism and behaviorizing psychology

    Science.gov (United States)

    Staats, Arthur W.

    1994-01-01

    Paradigmatic or psychological behaviorism (PB), in a four-decade history of development, has been shaped by its goal, the establishment of a behaviorism that can also serve as the approach in psychology (Watson's original goal). In the process, PB has become a new generation of behaviorism with abundant heuristic avenues for development in theory, philosophy, methodology, and research. Psychology has resources, purview and problem areas, and nascent developments of many kinds, gathered in chaotic diversity, needing unification (and other things) that cognitivism cannot provide. Behaviorism can, within PB's multilevel framework for connecting and advancing both psychology and behaviorism. PMID:22478175

  19. Psychological behaviorism and behaviorizing psychology.

    Science.gov (United States)

    Staats, A W

    1994-01-01

    Paradigmatic or psychological behaviorism (PB), in a four-decade history of development, has been shaped by its goal, the establishment of a behaviorism that can also serve as the approach in psychology (Watson's original goal). In the process, PB has become a new generation of behaviorism with abundant heuristic avenues for development in theory, philosophy, methodology, and research. Psychology has resources, purview and problem areas, and nascent developments of many kinds, gathered in chaotic diversity, needing unification (and other things) that cognitivism cannot provide. Behaviorism can, within PB's multilevel framework for connecting and advancing both psychology and behaviorism.

  20. Metrics to quantify the importance of mixing state for CCN activity

    Science.gov (United States)

    Ching, Joseph; Fast, Jerome; West, Matthew; Riemer, Nicole

    2017-06-01

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75 %. For more externally mixed populations (χ below 20 %) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40 % to about 150 %, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.
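
    A sketch of the mixing state index as defined by Riemer and West (2013), to the best of our reading: per-particle Shannon entropies of species mass fractions give a mass-weighted average (alpha) diversity, the bulk mass fractions give a (gamma) diversity, and χ interpolates between fully external (0) and fully internal (1) mixing.

    import numpy as np

    def mixing_state_index(masses):
        """Mixing state index chi of Riemer and West (2013).
        masses: (n_particles, n_species) array of per-particle species masses.
        chi = (D_alpha - 1) / (D_gamma - 1), where D = exp(Shannon entropy)
        of mass fractions, per particle (alpha, mass-weighted) or bulk (gamma)."""
        masses = np.asarray(masses, float)
        particle_mass = masses.sum(axis=1)
        p = masses / particle_mass[:, None]        # per-particle mass fractions
        with np.errstate(divide="ignore", invalid="ignore"):
            h = -np.where(p > 0, p * np.log(p), 0.0).sum(axis=1)
            P = masses.sum(axis=0) / masses.sum()  # bulk mass fractions
            H_gamma = -np.where(P > 0, P * np.log(P), 0.0).sum()
        H_alpha = (particle_mass * h).sum() / particle_mass.sum()
        return (np.exp(H_alpha) - 1) / (np.exp(H_gamma) - 1)

    external = [[1.0, 0.0], [0.0, 1.0]]   # every particle pure: chi = 0
    internal = [[0.5, 0.5], [0.5, 0.5]]   # identical composition: chi = 1
    print(mixing_state_index(external), mixing_state_index(internal))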

  1. Metrics to quantify the importance of mixing state for CCN activity

    Energy Technology Data Exchange (ETDEWEB)

    Ching, Joseph; Fast, Jerome; West, Matthew; Riemer, Nicole

    2017-01-01

    It is commonly assumed that models are more prone to errors in predicted cloud condensation nuclei (CCN) concentrations when the aerosol populations are externally mixed. In this work we investigate this assumption by using the mixing state index (χ) proposed by Riemer and West (2013) to quantify the degree of external and internal mixing of aerosol populations. We combine this metric with particle-resolved model simulations to quantify error in CCN predictions when mixing state information is neglected, exploring a range of scenarios that cover different conditions of aerosol aging. We show that mixing state information does indeed become unimportant for more internally mixed populations, more precisely for populations with χ larger than 75 %. For more externally mixed populations (χ below 20 %) the relationship of χ and the error in CCN predictions is not unique and ranges from lower than -40 % to about 150 %, depending on the underlying aerosol population and the environmental supersaturation. We explain the reasons for this behavior with detailed process analyses.

  2. Fish dispersal in fragmented landscapes: a modeling framework for quantifying the permeability of structural barriers.

    Science.gov (United States)

    Pépino, Marc; Rodríguez, Marco A; Magnan, Pierre

    2012-07-01

    Dispersal is a key determinant of the spatial distribution and abundance of populations, but human-made fragmentation can create barriers that hinder dispersal and reduce population viability. This study presents a modeling framework based on dispersal kernels (modified Laplace distributions) that describe stream fish dispersal in the presence of obstacles to passage. We used mark-recapture trials to quantify summer dispersal of brook trout (Salvelinus fontinalis) in four streams crossed by a highway. The analysis identified population heterogeneity in dispersal behavior, as revealed by the presence of a dominant sedentary component (48-72% of all individuals) characterized by short mean dispersal distance, and a mobile component characterized by long mean dispersal distance (56-1086 m). We did not detect evidence of barrier effects on dispersal through highway crossings. Simulation of various plausible scenarios indicated that detectability of barrier effects was strongly dependent on features of sampling design, such as spatial configuration of the sampling area, barrier extent, and sample size. The proposed modeling framework extends conventional dispersal kernels by incorporating structural barriers. A major strength of the approach is that the ecological process (dispersal model) and the sampling design (observation model) are incorporated simultaneously into the analysis. This feature can facilitate the use of prior knowledge to improve the sampling efficiency of mark-recapture trials in movement studies. Model-based estimation of barrier permeability and its associated uncertainty provides a rigorous approach for quantifying the effect of barriers on stream fish dispersal and assessing population dynamics of stream fish in fragmented landscapes.
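
    A minimal sketch of the kernel family described, assuming a two-component (sedentary plus mobile) Laplace mixture and a barrier that simply scales density beyond it by a permeability factor; the authors' full likelihood, which couples this to an observation model, is not reproduced.

    import numpy as np

    def laplace_pdf(x, mu, b):
        """Laplace (double-exponential) dispersal kernel."""
        return np.exp(-np.abs(x - mu) / b) / (2.0 * b)

    def dispersal_density(x, p_sedentary, b_sed, b_mob,
                          barrier_at=None, permeability=1.0):
        """Two-component (sedentary + mobile) kernel centred on the release
        point; density beyond a barrier is scaled by its permeability
        (0 = impassable, 1 = no effect). Unnormalized for permeability < 1."""
        f = (p_sedentary * laplace_pdf(x, 0.0, b_sed)
             + (1.0 - p_sedentary) * laplace_pdf(x, 0.0, b_mob))
        if barrier_at is not None:
            f = np.where(np.abs(x) > abs(barrier_at), permeability * f, f)
        return f

    # Illustrative parameters only (scale values are not the fitted ones).
    x = np.linspace(-2000, 2000, 5)
    print(dispersal_density(x, 0.6, 25.0, 400.0,
                            barrier_at=300, permeability=0.5))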

  3. The dolognawmeter: a novel instrument and assay to quantify nociception in rodent models of orofacial pain.

    Science.gov (United States)

    Dolan, John C; Lam, David K; Achdjian, Stacy H; Schmidt, Brian L

    2010-03-30

    Rodent pain models play an important role in understanding the mechanisms of nociception and have accelerated the search for new treatment approaches for pain. Creating an objective metric for orofacial nociception in these models presents significant technical obstacles. No animal assay accurately measures pain-induced orofacial dysfunction that is directly comparable to human orofacial dysfunction. We developed and validated a high throughput, objective, operant, nociceptive animal assay, and an instrument to perform the assay termed the dolognawmeter, for evaluation of conditions known to elicit orofacial pain in humans. Using the device our assay quantifies gnawing function in the mouse. We quantified a behavioral index of nociception and demonstrated blockade of nociception in three models of orofacial pain: (1) TMJ inflammation, (2) masticatory myositis, and (3) head and neck cancer. This assay will be useful in the study of nociceptive mediators involved in the development and progression of orofacial pain conditions and it will also provide a unique tool for development and assessment of new therapeutic approaches.

  4. The Dolognawmeter: A Novel Instrument and Assay to Quantify Nociception in Rodent Models of Orofacial Pain

    Science.gov (United States)

    Dolan, John C.; Lam, David K.; Achdjian, Stacy H.; Schmidt, Brian L.

    2010-01-01

    Rodent pain models play an important role in understanding the mechanisms of nociception and have accelerated the search for new treatment approaches for pain. Creating an objective metric for orofacial nociception in these models presents significant technical obstacles. No animal assay accurately measures pain-induced orofacial dysfunction that is directly comparable to human orofacial dysfunction. We developed and validated a high throughput, objective, operant, nociceptive animal assay, and an instrument to perform the assay termed the dolognawmeter, for evaluation of conditions known to elicit orofacial pain in humans. Using the device our assay quantifies gnawing function in the mouse. We quantified a behavioral index of nociception and demonstrated blockade of nociception in three models of orofacial pain: (1) TMJ inflammation, (2) masticatory myositis, and (3) head and neck cancer. This assay will be useful in the study of nociceptive mediators involved in the development and progression of orofacial pain conditions and it will also provide a unique tool for development and assessment of new therapeutic approaches. PMID:20096303

  5. Quantifying diagnostic uncertainty using item response theory: the Posterior Probability of Diagnosis Index.

    Science.gov (United States)

    Lindhiem, Oliver; Kolko, David J; Yu, Lan

    2013-06-01

    Using traditional Diagnostic and Statistical Manual of Mental Disorders, fourth edition, text revision (American Psychiatric Association, 2000) diagnostic criteria, clinicians are forced to make categorical decisions (diagnosis vs. no diagnosis). This forced choice implies that mental and behavioral health disorders are categorical and does not fully characterize varying degrees of uncertainty associated with a particular diagnosis. Using an item response theory (latent trait model) framework, we describe the development of the Posterior Probability of Diagnosis (PPOD) Index, which answers the question: What is the likelihood that a patient meets or exceeds the latent trait threshold for a diagnosis? The PPOD Index is based on the posterior distribution of θ (latent trait score) for each patient's profile of symptoms. The PPOD Index allows clinicians to quantify and communicate the degree of uncertainty associated with each diagnosis in probabilistic terms. We illustrate the advantages of the PPOD Index in a clinical sample (N = 321) of children and adolescents with oppositional defiant disorder.
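
    Once posterior draws of the latent trait θ are available from the IRT fit, the index reduces to the posterior mass at or above the diagnostic threshold. A sketch, with an assumed normal posterior standing in for real model output:

    import numpy as np

    def ppod_index(theta_samples, threshold):
        """Posterior Probability of Diagnosis: the share of posterior draws
        of the latent trait theta that meet or exceed the diagnostic cutoff."""
        theta_samples = np.asarray(theta_samples, float)
        return float((theta_samples >= threshold).mean())

    # Illustrative posterior for one patient's latent trait (assumed normal).
    rng = np.random.default_rng(1)
    draws = rng.normal(loc=0.8, scale=0.4, size=10_000)
    print(ppod_index(draws, threshold=1.0))  # ~0.31 -> diagnosis is uncertain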

  6. Measuring the jitter of ring oscillators by means of information theory quantifiers

    Science.gov (United States)

    Antonelli, M.; De Micco, L.; Larrondo, H. A.

    2017-02-01

    Ring oscillators (ROs) are elementary blocks widely used in digital design. Jitter is unavoidable in ROs; its presence is undesired in many applications, such as clock generators. On the contrary, jitter may be used as the noise source in RO-based true-random number generators (TRNGs). Consequently, measuring jitter is a relevant issue in characterizing an RO, and it is the subject of this paper. The main contribution is the use of Information Theory Quantifiers (ITQ) as measures of RO jitter. It is shown that, among several ITQ evaluated, two emerge as good measures because they are independent of the parameters used for their statistical determination. They turned out to be robust and may be implemented experimentally. We found that a dual entropy plane allows a visual comparison of results.
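
    Information Theory Quantifiers of this kind are typically built from the Bandt-Pompe ordinal-pattern distribution; the abstract does not name the two winning quantifiers, so the sketch below shows normalized permutation entropy as a representative member of the family, not necessarily one of the paper's measures.

    import numpy as np
    from math import factorial, log

    def permutation_entropy(series, order=3, delay=1, normalize=True):
        """Bandt-Pompe permutation entropy: Shannon entropy of the
        distribution of ordinal patterns of length `order`."""
        series = np.asarray(series, float)
        n = len(series) - (order - 1) * delay
        patterns = {}
        for i in range(n):
            window = series[i:i + order * delay:delay]
            key = tuple(np.argsort(window))
            patterns[key] = patterns.get(key, 0) + 1
        probs = np.array(list(patterns.values()), float) / n
        h = -(probs * np.log(probs)).sum()
        return h / log(factorial(order)) if normalize else h

    # Noisy (jittery) timing scores near 1; a clean ramp scores 0.
    rng = np.random.default_rng(2)
    print(permutation_entropy(rng.normal(size=5000)))   # ~1.0
    print(permutation_entropy(np.arange(5000.0)))       # 0.0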

  7. Tuning and Quantifying Steric and Electronic Effects of N-Heterocyclic Carbenes

    KAUST Repository

    Falivene, Laura

    2014-07-12

    This chapter states that the main handles for tuning steric and electronic effects are the substituents on N atoms, the nature of the C4-C5 bridge (either saturated or unsaturated), and the substituents on the C4 and C5 atoms. The initial intuition that steric properties of N-heterocyclic carbenes (NHCs) could be modulated and could impact catalytic behavior stimulated the development of steric descriptors to quantify the steric requirement of different NHCs and, possibly, to compare them with tertiary phosphines. NHCs can be classified as typically strong σ-basic/π-acid ligands, although they have been also shown to exhibit reasonable π-basic properties. This electronic modularity allows NHC ligands to adapt flexibly to different chemical environments represented by a transition metal and the other ligands. © 2014 Wiley-VCH Verlag GmbH & Co. KGaA. All rights reserved.

  8. Realization of quantifying interfacial interactions between a randomly rough membrane surface and a foulant particle.

    Science.gov (United States)

    Chen, Jianrong; Lin, Hongjun; Shen, Liguo; He, Yiming; Zhang, Meijia; Liao, Bao-Qiang

    2017-02-01

    Quantification of interfacial interactions with a randomly rough surface is a prerequisite to quantitatively understanding and controlling interface behaviors such as adhesion, flocculation and membrane fouling. In this study, it was found that the membrane surface was randomly rough, with obvious fractal characteristics. The randomly rough membrane surface could be well reconstructed by fractal geometry represented by a modified Weierstrass-Mandelbrot function. A novel method, combining the composite Simpson's approach, the surface element integration method and approximation by computer programming, was developed. Using this method, this study provides the first quantification of the interfacial energy between the randomly rough surface of a membrane and a foulant particle. The calculated interactions with the randomly rough membrane surface differed significantly from those with a smooth membrane surface, indicating the significant effect of surface topography on interactions. The proposed method could also potentially be used to investigate various natural interface environmental phenomena.

  9. Agent-Based Model to Study and Quantify the Evolution Dynamics of Android Malware Infection

    Directory of Open Access Journals (Sweden)

    Juan Alegre-Sanahuja

    2014-01-01

    Full Text Available In recent years the number of malware apps that users download to their devices has risen. In this paper, we propose an agent-based model to quantify the evolution of Android malware infection, modeling the behavior of the users and the different markets where they may download apps. The model predicts the number of infected smartphones depending on the type of malware. Additionally, we estimate the cost that users must bear when the malware reaches their devices. We are thus able to analyze which part is more critical: the users, who give indiscriminate permissions to apps or fail to protect their devices with antivirus software, or the Android platform, whose vulnerabilities permit devices to be rooted. We focus on the community of Valencia, Spain, although the obtained results can be extrapolated to other places where the number of Android smartphones remains fairly stable.
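
    A minimal agent-based sketch in the spirit described: each day agents may download an app, a fraction of downloads is malicious, antivirus-protected devices block infection, and infected devices are cleaned at a fixed rate. All parameters are illustrative, not the calibrated values for Valencia.

    import random

    def simulate(n_agents=10_000, days=180, p_download=0.02,
                 p_malicious=0.05, p_antivirus=0.4, p_clean=0.1, seed=0):
        """Minimal agent-based sketch: each day an agent may download an
        app; a download is malicious with some probability, and agents
        running antivirus software block the infection. Infected devices
        are cleaned at a fixed daily rate. Returns daily infection counts."""
        rng = random.Random(seed)
        protected = [rng.random() < p_antivirus for _ in range(n_agents)]
        infected = [False] * n_agents
        history = []
        for _ in range(days):
            for i in range(n_agents):
                if infected[i]:
                    if rng.random() < p_clean:
                        infected[i] = False
                elif (rng.random() < p_download
                      and rng.random() < p_malicious
                      and not protected[i]):
                    infected[i] = True
            history.append(sum(infected))
        return history

    print(simulate()[-1], "devices infected at the end of the run")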

  10. Derivation of evolutionary payoffs from observable behavior

    OpenAIRE

    Feigel, Alexander; Englander, Avraham; Engel, Assaf

    2008-01-01

    Interpretation of animal behavior, especially as cooperative or selfish, is a challenge for evolutionary theory. Strategy of a competition should follow from corresponding Darwinian payoffs for the available behavioral options. The payoffs and decision making processes, however, are difficult to observe and quantify. Here we present a general method for the derivation of evolutionary payoffs from observable statistics of interactions. The method is applied to combat of male bowl and doily spi...

  11. Plug Load Behavioral Change Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Metzger, I.; Kandt, A.; VanGeet, O.

    2011-08-01

    This report documents the methods and results of a plug load study of the Environmental Protection Agency's Region 8 Headquarters in Denver, Colorado, conducted by the National Renewable Energy Laboratory. The study quantified the effect of mechanical and behavioral change approaches on plug load energy reduction and identified effective ways to reduce plug load energy. Load reduction approaches included automated energy management systems and behavioral change strategies.

  12. Teleological behaviorism.

    Science.gov (United States)

    Rachlin, H

    1992-11-01

    A psychological science of efficient causes, using internal mechanisms to explain overt behavior, is distinguished from another psychological science, based on Aristotelian final causes, using external objects and goals to explain overt behavior. Efficient-cause psychology is designed to answer the question of how a particular act is emitted; final-cause psychology is designed to answer the question of why a particular act is emitted. Physiological psychology, modern cognitive psychology, and some parts of behaviorism including Skinnerian behaviorism are efficient-cause psychologies; final-cause psychology, a development of Skinnerian behaviorism, is here called teleological behaviorism. Each of these two conceptions of causality in psychology implies a different view of the mind, hence a different meaning of mental terms.

  13. Quantifying spatial habitat loss from hydrocarbon development through assessing habitat selection patterns of mule deer.

    Science.gov (United States)

    Northrup, Joseph M; Anderson, Charles R; Wittemyer, George

    2015-11-01

    Extraction of oil and natural gas (hydrocarbons) from shale is increasing rapidly in North America, with documented impacts to native species and ecosystems. With shale oil and gas resources on nearly every continent, this development is set to become a major driver of global land-use change. It is increasingly critical to quantify spatial habitat loss driven by this development to implement effective mitigation strategies and develop habitat offsets. Habitat selection is a fundamental ecological process, influencing both individual fitness and population-level distribution on the landscape. Examinations of habitat selection provide a natural means for understanding spatial impacts. We examined the impact of natural gas development on habitat selection patterns of mule deer on their winter range in Colorado. We fit resource selection functions in a Bayesian hierarchical framework, with habitat availability defined using a movement-based modeling approach. Energy development drove considerable alterations to deer habitat selection patterns, with the most substantial impacts manifested as avoidance of well pads with active drilling to a distance of at least 800 m. Deer displayed more nuanced responses to other infrastructure, avoiding pads with active production and roads to a greater degree during the day than at night. In aggregate, these responses equate to alteration of behavior by human development in over 50% of the critical winter range in our study area during the day and over 25% at night. Compared to other regions, the topographic and vegetative diversity in the study area appear to provide refugia that allow deer to behaviorally mediate some of the impacts of development. This study and the methods we employed provide a template for quantifying spatial take by industrial activities in natural areas, and the results offer guidance for policy makers, managers, and industry when attempting to mitigate habitat loss due to energy development.

  14. Quantifying Volume of Groundwater in High Elevation Meadows

    Science.gov (United States)

    Ciruzzi, D.; Lowry, C.

    2013-12-01

    Assessing the current and future water needs of high elevation meadows depends on quantifying the volume of groundwater stored within the meadow sediment. As groundwater dependent ecosystems, these meadows rely on their ability to capture and store water in order to support ecologic function and base flow to streams. Previous research on these meadows simplified storage by assuming a homogeneous reservoir of constant thickness. These previous storage models were able to close the water mass balance, but it is unclear whether these assumptions will hold under future anthropogenic impacts, such as increased air temperature resulting in drier and longer growing seasons. Applying a geophysical approach, ground-penetrating radar was used at Tuolumne Meadows, CA to qualitatively and quantitatively identify the controls on the volume of groundwater storage. From the geophysical results, a three-dimensional model of Tuolumne Meadows was created, which identified meadow thickness and bedrock geometry. This physical model was used in a suite of numerical models simulating high elevation meadows in order to quantify the volume of groundwater stored with temporal and spatial variability. Modeling efforts tested both wet and dry water years in order to quantify the variability in the volume of groundwater storage for a range of aquifer properties. Each model was evaluated based on the seasonal depth to water in order to evaluate a particular scenario's ability to support ecological function and base flow. Depending on the simulated meadow's ability or inability to support its ecosystem, each representative meadow was categorized as successful or unsuccessful. Restoration techniques to increase active storage volume were suggested for unsuccessful meadows.

  15. A simplified score to quantify comorbidity in COPD.

    Directory of Open Access Journals (Sweden)

    Nirupama Putcha

    Full Text Available Comorbidities are common in COPD, but quantifying their burden is difficult. Currently there is a COPD-specific comorbidity index to predict mortality and another to predict general quality of life. We sought to develop and validate a COPD-specific comorbidity score that reflects comorbidity burden on patient-centered outcomes. Using the COPDGene study (GOLD II-IV COPD), we developed comorbidity scores to describe patient-centered outcomes employing three techniques: 1) a simple count, 2) a weighted score, and 3) a weighted score based upon a statistical selection procedure. We tested associations, area under the curve (AUC) and calibration statistics to validate scores internally with outcomes of respiratory disease-specific quality of life (St. George's Respiratory Questionnaire, SGRQ), six-minute walk distance (6MWD), modified Medical Research Council (mMRC) dyspnea score and exacerbation risk, ultimately choosing one score for external validation in SPIROMICS. Associations between comorbidities and all outcomes were comparable across the three scores. All scores added predictive ability to models including age, gender, race, current smoking status, pack-years smoked and FEV1 (p<0.001 for all comparisons). Area under the curve (AUC) was similar between all three scores across outcomes: SGRQ (range 0·7624-0·7676), mMRC (0·7590-0·7644), 6MWD (0·7531-0·7560) and exacerbation risk (0·6831-0·6919). Because of similar performance, the comorbidity count was used for external validation. In the SPIROMICS cohort, the comorbidity count performed well to predict SGRQ (AUC 0·7891), mMRC (AUC 0·7611), 6MWD (AUC 0·7086), and exacerbation risk (AUC 0·7341). Quantifying comorbidity provides a more thorough understanding of the risk for patient-centered outcomes in COPD. A comorbidity count performs well to quantify comorbidity in a diverse population with COPD.
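
    The comorbidity count is simply the number of conditions present, and its discriminative ability for a dichotomized outcome can be summarized by the AUC, here computed via the Mann-Whitney statistic. The patient data below are invented for illustration.

    import numpy as np

    def comorbidity_count(conditions):
        """Simple comorbidity score: number of conditions present (0/1 flags)."""
        return int(sum(conditions))

    def auc(scores, outcome):
        """Area under the ROC curve via the Mann-Whitney statistic: the
        probability that a randomly chosen case outscores a randomly
        chosen control (ties count one half)."""
        scores = np.asarray(scores, float)
        outcome = np.asarray(outcome, bool)
        pos, neg = scores[outcome], scores[~outcome]
        wins = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (wins + 0.5 * ties) / (len(pos) * len(neg))

    # Illustrative: condition flags for 8 patients, dichotomized poor-QoL outcome.
    counts = [comorbidity_count(c) for c in
              [(1, 1, 0), (0, 0, 0), (1, 1, 1), (0, 1, 0),
               (1, 0, 0), (0, 0, 1), (1, 1, 0), (0, 0, 0)]]
    poor_qol = [1, 0, 1, 0, 1, 0, 1, 0]
    print(auc(counts, poor_qol))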

  16. Information quantifiers, entropy squeezing and entanglement properties of superconducting qubit-deformed bosonic field system under dephasing effect

    Science.gov (United States)

    Berrada, K.; Al-Rajhi, M. A.

    2017-10-01

    In this paper, we present a detailed study of the evolution of some measures of nonclassicality and entanglement in the framework of the interaction between a superconducting qubit and deformed bosonic fields under a decoherence effect. We compare the dynamical behavior of the different quantum quantifiers by exploiting a large set of nonlinear bosonic fields characterized by the deformation parameter. Additionally, we demonstrate the connection between the appearance of nonlinearity in the deformed field and the quantum information quantifiers. The time correlation between entropy squeezing, purity, and entanglement is examined in terms of the physical parameters involved in the whole system. Lastly, we explore the exact ranges of the physical parameters that combat the decoherence effect and maintain a high amount of entanglement during the time evolution.

  17. Quantifying risks with exact analytical solutions of derivative pricing distribution

    Science.gov (United States)

    Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin

    2017-04-01

    Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, exact analytical forms of derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain exact analytical solutions of the statistical distribution for bond and bond-option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing, characterized by the distribution tail, and their associations with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
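
    For reference, the expected-value backbone of this analysis, the closed-form Vasicek zero-coupon bond price, is standard; the paper's contribution is the full pricing distribution around it. A sketch of the closed form, with illustrative parameters:

    import numpy as np

    def vasicek_bond_price(r0, tau, a, b, sigma):
        """Closed-form zero-coupon bond price P(0, tau) under the Vasicek
        short-rate model dr = a*(b - r)*dt + sigma*dW:
            P = A * exp(-B * r0),
            B = (1 - exp(-a*tau)) / a,
            ln A = (B - tau)*(a^2*b - sigma^2/2)/a^2 - sigma^2*B^2/(4a)."""
        B = (1.0 - np.exp(-a * tau)) / a
        lnA = ((B - tau) * (a**2 * b - 0.5 * sigma**2) / a**2
               - sigma**2 * B**2 / (4.0 * a))
        return np.exp(lnA) * np.exp(-B * r0)

    # Example: 5-year bond, 3% short rate, mean reversion toward 4%.
    print(vasicek_bond_price(r0=0.03, tau=5.0, a=0.5, b=0.04, sigma=0.01))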

  18. QUANTIFIED COST-BALANCED ROUTING SCHEME FOR OVERLAY MULTICAST

    Institute of Scientific and Technical Information of China (English)

    Lu Jun; Ruan Qiuqi

    2006-01-01

    This paper focuses on the quantitative analysis of the routing-metrics tradeoff problem and presents a Quantified Cost-Balanced overlay multicast routing scheme (QCost-Balanced) for the tradeoff between overlay path delay and access bandwidth at Multicast Server Nodes (MSN) for real-time applications over the Internet. Besides assigning a dynamic priority to MSNs by weighing the size of their service clientele for better efficiency, QCost-Balanced trades off these two metrics via a unified tradeoff metric based on quantitative analysis. Simulation experiments demonstrate that the scheme achieves a better tradeoff gain in both metrics and effective performance in quantitative metric control.

  19. Quantifying population genetic differentiation from next-generation sequencing data

    DEFF Research Database (Denmark)

    Fumagalli, Matteo; Garrett Vieira, Filipe Jorge; Korneliussen, Thorfinn Sand

    2013-01-01

    Over the last few years, new high-throughput DNA sequencing technologies have dramatically increased speed and reduced sequencing costs. However, the use of these sequencing technologies is often challenged by errors and biases associated with the bioinformatical methods used for analyzing the data...... method for quantifying population genetic differentiation from next-generation sequencing data. In addition, we present a strategy to investigate population structure via Principal Components Analysis. Through extensive simulations, we compare the new method herein proposed to approaches based...... individuals, suggesting that employing this new method is useful for investigating the genetic relationships of populations sampled at low coverage....

  20. Quantifying Time Dependent Moisture Storage and Transport Properties

    DEFF Research Database (Denmark)

    Peuhkuri, Ruut H

    2003-01-01

    This paper describes an experimental and numerical approach to quantifying the time dependence of sorption mechanisms for some hygroscopic building materials, mostly insulation materials. Investigations of retarded sorption and non-Fickian phenomena, mostly on wood, have inspired the present... analysis on these other materials. The true moisture capacity of a material cannot be described by the slope of the sorption isotherms alone when the material is exposed to dynamic changes in moisture conditions. Still, the assumption of an immediate equilibrium is well accepted in the simulation

  1. Quantifying fluvial bedrock erosion using repeat terrestrial Lidar

    Science.gov (United States)

    Cook, Kristen

    2013-04-01

    The Da'an River Gorge in western Taiwan provides a unique opportunity to observe the formation and evolution of a natural bedrock gorge. The 1.2 km long and up to 20 m deep gorge has formed since 1999 in response to uplift of the riverbed during the Chi-Chi earthquake. The extremely rapid pace of erosion enables us to observe both downcutting and channel widening over short time periods. We have monitored the evolution of the gorge since 2009 using repeat RTK GPS surveys and terrestrial Lidar scans. GPS surveys of the channel profile are conducted frequently, with 24 surveys to date, while Lidar scans are conducted after major floods, or after 5-9 months without a flood, for a total of 8 scans to date. The Lidar data are most useful for recording erosion of channel walls, which is quite episodic and highly variable along the channel. By quantifying the distribution of wall erosion in space and time, we can improve our understanding of channel widening processes and of the development of the channel planform, particularly the growth of bends. During the summer of 2012, the Da'an catchment experienced two large storm events, a meiyu (plum rain) event on June 10-13 that brought 800 mm of rain and a typhoon on August 1-3 that brought 650 mm of rain. The resulting floods had significant geomorphic effects on the Da'an gorge, including up to 10s of meters of erosion in some sections of the gorge walls. We quantify these changes using Lidar surveys conducted on June 7, July 3, and August 30. Channel wall collapses also occur in the absence of large floods, and we use scans from August 23, 2011 and June 7, 2012 to quantify erosion during a period that included a number of small floods, but no large ones. This allows us to compare the impact of 9 months of normal conditions to the impact of short-duration extreme events. The observed variability of erosion in space and time highlights the need for 3D techniques such as terrestrial Lidar to properly quantify erosion in this

  2. Challenges in quantifying biosphere-atmosphere exchange of nitrogen species

    DEFF Research Database (Denmark)

    Sutton, M.A.; Nemitz, E.; Erisman, J.W.

    2007-01-01

    Recent research in nitrogen exchange with the atmosphere has separated research communities according to N form. The integrated perspective needed to quantify the net effect of N on greenhouse-gas balance is being addressed by the NitroEurope Integrated Project (NEU). Recent advances have depended...... progress has been made in modelling N fluxes, especially for N2O, NO and bi-directional NH3 exchange. Landscape analysis represents an emerging challenge to address the spatial interactions between farms, fields, ecosystems, catchments and air dispersion/deposition. European up-scaling of N fluxes...

  3. A Bayesian approach to simultaneously quantify assignments and linguistic uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Booker, Jane M [BOOKER SCIENTIFIC FREDERICKSBURG; Ross, Timothy J [UNM

    2010-10-07

    Subject matter expert assessments can include both assignment and linguistic uncertainty. This paper examines assessments containing linguistic uncertainty associated with a qualitative description of a specific state of interest and the assignment uncertainty associated with assigning a qualitative value to that state. A Bayesian approach is examined to simultaneously quantify both assignment and linguistic uncertainty in the posterior probability. The approach is applied to a simplified damage assessment model involving both assignment and linguistic uncertainty. The utility of the approach and the conditions under which the approach is feasible are examined and identified.
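
    A toy version of the idea, not the authors' formulation: fuzzy membership weights for the observed phrase soften the likelihood over report categories, and Bayes' rule then yields a posterior that carries both assignment and linguistic uncertainty. All states, likelihoods, and memberships below are invented.

    import numpy as np

    def fuzzy_bayes(prior, likelihood, membership):
        """Toy posterior combining assignment uncertainty (the likelihood of
        each qualitative report given the true state) with linguistic
        uncertainty (fuzzy membership of the observed phrase in each report
        category): posterior ~ prior * sum_r membership[r] * P(r | state)."""
        prior = np.asarray(prior, float)
        evidence = likelihood @ np.asarray(membership, float)
        post = prior * evidence
        return post / post.sum()

    # States: damage = {none, minor, severe}; report categories the same.
    prior = [0.5, 0.3, 0.2]
    likelihood = np.array([[0.80, 0.15, 0.05],   # P(report | true state)
                           [0.20, 0.60, 0.20],
                           [0.05, 0.25, 0.70]])
    # The expert said "somewhat damaged": fuzzy membership in each category.
    membership = [0.1, 0.8, 0.3]
    print(fuzzy_bayes(prior, likelihood, membership))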

  4. Quantifying camouflage: how to predict detectability from appearance.

    Science.gov (United States)

    Troscianko, Jolyon; Skelhorn, John; Stevens, Martin

    2017-01-06

    Quantifying the conspicuousness of objects against particular backgrounds is key to understanding the evolution and adaptive value of animal coloration, and in designing effective camouflage. Quantifying detectability can reveal how colour patterns affect survival, how animals' appearances influence habitat preferences, and how receiver visual systems work. Advances in calibrated digital imaging are enabling the capture of objective visual information, but it remains unclear which methods are best for measuring detectability. Numerous descriptions and models of appearance have been used to infer the detectability of animals, but these models are rarely empirically validated or directly compared to one another. We compared the performance of human 'predators' to a bank of contemporary methods for quantifying the appearance of camouflaged prey. Background matching was assessed using several established methods, including sophisticated feature-based pattern analysis, granularity approaches and a range of luminance and contrast difference measures. Disruptive coloration is a further camouflage strategy where high contrast patterns disrupt the prey's tell-tale outline, making it more difficult to detect. Disruptive camouflage has been studied intensely over the past decade, yet defining and measuring it have proven far more problematic. We assessed how well existing disruptive coloration measures predicted capture times. Additionally, we developed a new method for measuring edge disruption based on an understanding of sensory processing and the way in which false edges are thought to interfere with animal outlines. Our novel measure of disruptive coloration was the best predictor of capture times overall, highlighting the importance of false edges in concealment over and above pattern or luminance matching. The efficacy of our new method for measuring disruptive camouflage together with its biological plausibility and computational efficiency represents a substantial

  5. Quantifying the Efficiency Advantages of High Viscosity Index Hydraulic Fluids

    Institute of Scientific and Technical Information of China (English)

    Christian D. Neveu; Michael D. Zink; Alex Tsay

    2006-01-01

    By providing higher in-use viscosity at elevated operating temperatures, hydraulic fluids with high viscosity index improve the efficiency of the hydraulic system. For mobile hydraulic equipment this efficiency can be quantified as an increase in fuel economy. This paper reviews the research that demonstrates these efficiency advantages in gear, vane and piston pumps and presents a method for predicting the overall fuel economy for a fleet of hydraulic equipment in operation, allowing the equipment operator to easily improve the performance of the system and reduce fuel consumption.

  6. Quantifying environmental performance using an environmental footprint calculator

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.B.; Loney, A.C.; Chan, V. [Conestoga-Rovers & Associates, Waterloo, Ontario (Canada)

    2009-07-01

    This paper provides a case study using relevant key performance indicators (KPIs) to evaluate the environmental performance of a business. Using recognized calculation and reporting frameworks, Conestoga-Rovers & Associates (CRA) designed the Environmental Footprint Calculator to quantify the environmental performance of a Canadian construction materials company. CRA designed the Environmental Footprint calculator for our client to track and report their environmental performance in accordance with their targets, based on requirements of relevant guidance documents. The objective was to design a tool that effectively manages, calculates, and reports environmental performance to various stakeholders in a user-friendly format. (author)

  7. Behavioral economics.

    Science.gov (United States)

    Camerer, Colin F

    2014-09-22

    Behavioral economics uses evidence from psychology and other social sciences to create a precise and fruitful alternative to traditional economic theories, which are based on optimization. Behavioral economics may interest some biologists, as it shifts the basis for theories of economic choice away from logical calculation and maximization and toward biologically plausible mechanisms.

  8. Aggressive behavior

    NARCIS (Netherlands)

    Didden, H.C.M.; Lindsay, W.R.; Lang, R.; Sigafoos, J.; Deb, S.; Wiersma, J.; Peters-Scheffer, N.C.; Marschik, P.B.; O’Reilly, M.F.; Lancioni, G.E.

    2016-01-01

    Aggressive behavior is common in individuals with intellectual and developmental disabilities (IDDs), and it is most often targeted for intervention. Psychological, contextual, and biological risk factors may contribute to the risk of aggressive behavior. Risk factors are gender (males), level of ID

  10. Quantifying sublethal effects of glyphosate and Roundup® to Daphnia magna using a fluorescence based enzyme activity assay and video tracking

    DEFF Research Database (Denmark)

    Roslev, Peter; R. Hansen, Lone; Ørsted, Michael

    on endpoints such as immobility and mortality. In this study, we investigated sublethal effects of glyphosate and Roundup® to D. magna using video tracking for quantifying behavioral changes, and a novel fluorescence based assay for measuring in vivo hydrolytic enzyme activity (FLEA assay). Roundup® exposure...... resulted in behavioral effects quantified as decreases in average swimming velocity and distance moved whereas the inactive time in defined arenas increased. Exposure of D. magna to binary mixtures of glyphosate and copper (Cu) attenuated acute metal toxicity. The results suggest that a combination...... of assays targeting in vivo enzyme activity and behavioral changes may be applied as a quantitative and sensitive tool for detecting sublethal effects of glyphosate and Roundup® to D. magna. The inhibitory effect of Roundup® on alkaline phosphatase in non-target organisms warrants further investigations...

  11. Quantifying inflow uncertainties in RANS simulations of urban pollutant dispersion

    Science.gov (United States)

    García-Sánchez, C.; Van Tendeloo, G.; Gorlé, C.

    2017-07-01

    Numerical simulations of flow and pollutant dispersion in urban environments have the potential to support design and policy decisions that could reduce the population's exposure to air pollution. Reynolds-averaged Navier-Stokes simulations are a common modeling technique for urban flow and dispersion, but several sources of uncertainty in the simulations can affect the accuracy of the results. The present study proposes a method to quantify the uncertainty related to variability in the inflow boundary conditions. The method is applied to predict flow and pollutant dispersion in downtown Oklahoma City and the results are compared to field measurements available from the Joint Urban 2003 measurement campaign. Three uncertain parameters that define the inflow profiles for velocity, turbulence kinetic energy and turbulence dissipation are defined: the velocity magnitude and direction, and the terrain roughness length. The uncertain parameter space is defined based on the available measurement data, and a non-intrusive propagation approach that employs 729 simulations is used to quantify the uncertainty in the simulation output. A variance based sensitivity analysis is performed to identify the most influential uncertain parameters, and it is shown that the predicted tracer concentrations are influenced by all three uncertain variables. Subsequently, we specify different probability distributions for the uncertain inflow variables based on the available measurement data and calculate the corresponding means and 95% confidence intervals for comparison with the field measurements at 35 locations in downtown Oklahoma City.

  12. A kernel plus method for quantifying wind turbine performance upgrades

    KAUST Repository

    Lee, Giwhyun

    2014-04-21

    Power curves are commonly estimated using the binning method recommended by the International Electrotechnical Commission, which primarily incorporates wind speed information. When such power curves are used to quantify a turbine's upgrade, the results may not be accurate because many other environmental factors in addition to wind speed, such as temperature, air pressure, turbulence intensity, wind shear and humidity, all potentially affect the turbine's power output. Wind industry practitioners are aware of the need to filter out effects from environmental conditions. Toward that objective, we developed a kernel plus method that allows incorporation of multivariate environmental factors in a power curve model, thereby controlling the effects from environmental factors while comparing power outputs. We demonstrate that the kernel plus method can serve as a useful tool for quantifying a turbine's upgrade because it is sensitive to small and moderate changes caused by certain turbine upgrades. Although we demonstrate the utility of the kernel plus method in this specific application, the resulting method is a general, multivariate model that can connect other physical factors, as long as their measurements are available, with a turbine's power output, which may allow us to explore new physical properties associated with wind turbine performance. © 2014 John Wiley & Sons, Ltd.
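
    As a rough illustration of the underlying idea, the sketch below implements a generic multivariate Nadaraya-Watson kernel regression rather than the authors' exact "kernel plus" estimator; the data, bandwidths, and the assumed 3% upgrade are all hypothetical.

```python
import numpy as np

def kernel_power_curve(X_train, y_train, X_query, bandwidths):
    """Nadaraya-Watson estimate of expected power at the query conditions.

    X_train : (n, d) environmental covariates (wind speed, temperature, ...)
    y_train : (n,) measured power output
    X_query : (m, d) conditions at which to evaluate the power curve
    bandwidths : (d,) kernel bandwidth per covariate
    """
    # Gaussian product kernel over all covariates
    diffs = (X_query[:, None, :] - X_train[None, :, :]) / bandwidths  # (m, n, d)
    weights = np.exp(-0.5 * np.sum(diffs**2, axis=2))                 # (m, n)
    return weights @ y_train / weights.sum(axis=1)

# Compare pre- and post-upgrade output under matched environmental conditions
rng = np.random.default_rng(0)
X_pre = rng.uniform([3, 0], [15, 25], size=(500, 2))   # wind speed, temperature
y_pre = 0.4 * X_pre[:, 0]**3 + rng.normal(0, 20, 500)  # toy power signal
X_post, y_post = X_pre.copy(), y_pre * 1.03            # pretend a 3% upgrade
grid = np.column_stack([np.linspace(4, 14, 50), np.full(50, 12.0)])
h = np.array([0.5, 2.0])
gain = kernel_power_curve(X_post, y_post, grid, h) - kernel_power_curve(X_pre, y_pre, grid, h)
print("mean estimated power gain:", gain.mean())
```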

  13. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Full Text Available Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding the past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they make it possible to take into account the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods that are used to quantify them. Here, we assess how the true length of the fire cycle, the short-term temporal variations in fire activity, and the sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. Then, we apply those results in a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regressions appear to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162–407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
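
    A minimal sketch of the exponential (constant-hazard) estimator discussed above, under which the fire cycle is the total time at risk divided by the number of observed fires; the interval data and censoring flags below are invented for illustration.

```python
import numpy as np

# Time since last fire for each sampled stand (years); observed = a fire was
# dated, False = censored (the record ends before a fire is observed).
times    = np.array([ 80, 145, 260, 310,  60, 400, 220, 180, 500,  95], float)
observed = np.array([  1,   1,   1,   0,   1,   0,   1,   1,   0,   1], bool)

# Exponential survival model: hazard = 1/cycle, so the MLE of the fire
# cycle is total exposure time divided by the number of observed fires.
fire_cycle = times.sum() / observed.sum()

# Approximate 95% CI from the asymptotic variance of the log rate (1/sqrt(d))
se_log = 1.0 / np.sqrt(observed.sum())
lo, hi = fire_cycle * np.exp(-1.96 * se_log), fire_cycle * np.exp(1.96 * se_log)
print(f"fire cycle: {fire_cycle:.0f} years (95% CI {lo:.0f}-{hi:.0f})")
```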

  14. Quantifying Head Impact Exposure in Collegiate Women's Soccer.

    Science.gov (United States)

    Press, Jaclyn N; Rowson, Steven

    2017-03-01

    The aim of this study was to quantify head impact exposure for a collegiate women's soccer team over the course of the 2014 season. Observational and prospective study. Virginia Tech women's soccer games and practices. Twenty-six collegiate level women's soccer players with a mean player age of 19 ± 1. Participating players were instrumented with head impact sensors for biomechanical analysis. Video recordings of each event were used to manually verify each impact sustained. Head impact counts by player position and impact situation. The sensors collected data from a total of 17 865 accelerative events, 8999 of which were classified as head impacts. Of these, a total of 1703 impacts were positively identified (19% of total real impacts recorded by sensor), 90% of which were associated with heading the ball. The average number of impacts per player per practice or game was 1.86 ± 1.42. Exposure to head impact varied by player position. Head impact exposure was quantified through 2 different methods, which illustrated the challenges associated with autonomously collecting acceleration data with head impact sensors. Users of head impact data must exercise caution when interpreting on-field head impact sensor data.

  15. Visualizing and quantifying Fusarium oxysporum in the plant host.

    Science.gov (United States)

    Diener, Andrew

    2012-12-01

    Host-specific forms of Fusarium oxysporum infect the roots of numerous plant species. I present a novel application of familiar methodology to visualize and quantify F. oxysporum in roots. Infection in the roots of Arabidopsis thaliana, tomato, and cotton was detected with colorimetric reagents that are substrates for Fusarium spp.-derived arabinofuranosidase and N-acetyl-glucosaminidase activities and without the need for genetic modification of either plant host or fungal pathogen. Similar patterns of blue precipitation were produced by treatment with 5-bromo-4-chloro-3-indoxyl-α-l-arabinofuranoside and 5-bromo-4-chloro-3-indoxyl-2-acetamido-2-deoxy-β-d-glucopyranoside, and these patterns were consistent with prior histological descriptions of F. oxysporum in roots. Infection was quantified in roots of wild-type and mutant Arabidopsis using 4-nitrophenyl-α-l-arabinofuranoside. In keeping with an expectation that disease severity above ground is correlated with F. oxysporum infection below ground, elevated levels of arabinofuranosidase activity were measured in the roots of susceptible agb1 and rfo1 while a reduced level was detected in the resistant eir1. In contrast, disease severity and F. oxysporum infection were uncoupled in tir3. The distribution of staining patterns in roots suggests that AGB1 and RFO1 restrict colonization of the vascular cylinder by F. oxysporum whereas EIR1 promotes colonization of root apices.

  16. Quantifying dynamic characteristics of human walking for comprehensive gait cycle.

    Science.gov (United States)

    Mummolo, Carlotta; Mangialardi, Luigi; Kim, Joo H

    2013-09-01

    Normal human walking typically consists of phases during which the body is statically unbalanced while maintaining dynamic stability. Quantifying the dynamic characteristics of human walking can provide better understanding of gait principles. We introduce a novel quantitative index, the dynamic gait measure (DGM), for comprehensive gait cycle. The DGM quantifies the effects of inertia and the static balance instability in terms of zero-moment point and ground projection of center of mass and incorporates the time-varying foot support region (FSR) and the threshold between static and dynamic walking. Also, a framework of determining the DGM from experimental data is introduced, in which the gait cycle segmentation is further refined. A multisegmental foot model is integrated into a biped system to reconstruct the walking motion from experiments, which demonstrates the time-varying FSR for different subphases. The proof-of-concept results of the DGM from a gait experiment are demonstrated. The DGM results are analyzed along with other established features and indices of normal human walking. The DGM provides a measure of static balance instability of biped walking during each (sub)phase as well as the entire gait cycle. The DGM of normal human walking has the potential to provide some scientific insights in understanding biped walking principles, which can also be useful for their engineering and clinical applications.

  17. Quantifying nonadditive selection caused by indirect ecological effects.

    Science.gov (United States)

    TerHorst, Casey P; Lau, Jennifer A; Cooper, Idelle A; Keller, Kane R; La Rosa, Raffica J; Royer, Anne M; Schultheis, Elizabeth H; Suwa, Tomomi; Conner, Jeffrey K

    2015-09-01

    In natural biological communities, species interact with many other species. Multiple species interactions can lead to indirect ecological effects that have important fitness consequences and can cause nonadditive patterns of natural selection. Given that indirect ecological effects are common in nature, nonadditive selection may also be quite common. As a result, quantifying nonadditive selection resulting from indirect ecological effects may be critical for understanding adaptation in natural communities composed of many interacting species. We describe how to quantify the relative strength of nonadditive selection resulting from indirect ecological effects compared to the strength of pairwise selection. We develop a clear method for testing for nonadditive selection caused by indirect ecological effects and consider how it might affect adaptation in multispecies communities. We use two case studies to illustrate how our method can be applied to empirical data sets. Our results suggest that nonadditive selection caused by indirect ecological effects may be common in nature. Our hope is that trait-based approaches, combined with multifactorial experiments, will result in more estimates of nonadditive selection that reveal the relative importance of indirect ecological effects for evolution in a community context.

  18. The Local Dimension: a method to quantify the Cosmic Web

    CERN Document Server

    Sarkar, Prakash

    2008-01-01

    It is now well accepted that the galaxies are distributed in filaments, sheets and clusters all of which form an interconnected network known as the Cosmic Web. It is a big challenge to quantify the shapes of the interconnected structural elements that form this network. Tools like the Minkowski functionals which use global properties, though well suited for an isolated object like a single sheet or filament, are not suited for an interconnected network of such objects. We consider the Local Dimension $D$, defined through $N(R)=A R^D$, where $N(R)$ is the galaxy number count within a sphere of comoving radius $R$ centered on a particular galaxy, as a tool to locally quantify the shape in the neighbourhood of different galaxies along the Cosmic Web. We expect $D \sim 1,2$ and 3 for a galaxy located in a filament, sheet and cluster respectively. Using LCDM N-body simulations we find that it is possible to determine $D$ through a power law fit to $N(R)$ across the length-scales 2 to $10 {\rm Mpc}$ for $\sim 33\%$...
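
    The power-law fit itself is straightforward; the sketch below assumes galaxy counts $N(R)$ are already available for a set of radii and recovers $D$ from a log-log least-squares fit (toy data generated with $D = 2$, i.e. a sheet-like neighbourhood).

```python
import numpy as np

def local_dimension(radii, counts):
    """Fit N(R) = A * R**D by least squares in log-log space; return D."""
    D, logA = np.polyfit(np.log(radii), np.log(counts), 1)
    return D

# Toy example: counts within spheres of comoving radius 2-10 Mpc around a
# galaxy embedded in a sheet-like structure (D should come out near 2).
R = np.linspace(2.0, 10.0, 9)
N = 5.0 * R**2 * np.exp(np.random.default_rng(1).normal(0, 0.05, R.size))
print("Local Dimension D =", round(local_dimension(R, N), 2))
```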

  19. A revised metric for quantifying body shape in vertebrates.

    Science.gov (United States)

    Collar, David C; Reynaga, Crystal M; Ward, Andrea B; Mehta, Rita S

    2013-08-01

    Vertebrates exhibit tremendous diversity in body shape, though quantifying this variation has been challenging. In the past, researchers have used simplified metrics that either describe overall shape but reveal little about its anatomical basis or that characterize only a subset of the morphological features that contribute to shape variation. Here, we present a revised metric of body shape, the vertebrate shape index (VSI), which combines the four primary morphological components that lead to shape diversity in vertebrates: head shape, length of the second major body axis (depth or width), and shape of the precaudal and caudal regions of the vertebral column. We illustrate the usefulness of VSI on a data set of 194 species, primarily representing five major vertebrate clades: Actinopterygii, Lissamphibia, Squamata, Aves, and Mammalia. We quantify VSI diversity within each of these clades and, in the course of doing so, show how measurements of the morphological components of VSI can be obtained from radiographs, articulated skeletons, and cleared and stained specimens. We also demonstrate that head shape, secondary body axis, and vertebral characteristics are important independent contributors to body shape diversity, though their importance varies across vertebrate groups. Finally, we present a functional application of VSI to test a hypothesized relationship between body shape and the degree of axial bending associated with locomotor modes in ray-finned fishes. Altogether, our study highlights the promise VSI holds for identifying the morphological variation underlying body shape diversity as well as the selective factors driving shape evolution.

  20. Basinsoft, a computer program to quantify drainage basin characteristics

    Science.gov (United States)

    Harvey, Craig A.; Eash, David A.

    2001-01-01

    Surface water runoff is a function of many interrelated factors including climate, soils, landuse, and the physiography of the drainage basin. A practical and effective method to quantify drainage basin characteristics would allow analysis of the interrelations of these factors, leading to an improved understanding of the effects of drainage basin characteristics on surface-water runoff. Historically, the quantification of drainage basin characteristics has been a tedious and time-consuming process. Recent improvements in computer hardware and software technology have enabled the developers of a program called Basinsoft to automate this process. Basinsoft requires minimal preprocessing of data and provides an efficient, automated procedure for quantifying selected morphometric characteristics and the option to area-weight characteristics for a drainage basin. The user of Basinsoft is assumed to have a limited amount of experience in the use of ARC/INFO, a proprietary geographic information system (GIS). (The use of brand names in this chapter is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey [USGS].)

  1. Quantifying Contributions of Climate Feedbacks to Global Warming Pattern Formation

    Science.gov (United States)

    Song, X.; Zhang, G. J.; Cai, M.

    2013-12-01

    The "climate feedback-response analysis method" (CFRAM) was applied to the NCAR CCSM3.0 simulation to analyze the strength and spatial distribution of climate feedbacks and to quantify their contributions to global and regional surface temperature changes in response to a doubling of CO2. Instead of analyzing the climate sensitivity, the CFRAM directly attributes the temperature change to individual radiative and non-radiative feedbacks. The radiative feedback decomposition is based on hourly model output rather than monthly mean data that are commonly used in climate feedback analysis. This gives a more accurate quantification of the cloud and albedo feedbacks. The process-based decomposition of non-radiative feedback enables us to understand the roles of GCM physical and dynamic processes in climate change. The pattern correlation, the centered root-mean-square (RMS) difference and the ratio of variations (represented by standard deviations) between the partial surface temperature change due to each feedback process and the total surface temperature change in the CCSM3.0 simulation are examined to quantify the roles of each feedback process in the global warming pattern formation. The contributions of climate feedbacks to the regional warming are also discussed.
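
    The three comparison statistics named above (pattern correlation, centered RMS difference, and ratio of standard deviations) can be computed as in the following sketch; the warming fields and latitude weighting are hypothetical placeholders, not CCSM3.0 output.

```python
import numpy as np

def pattern_stats(partial, total, weights):
    """Area-weighted pattern correlation, centered RMS difference, and ratio
    of standard deviations between a partial warming field (one feedback's
    contribution) and the total warming field."""
    w = weights / weights.sum()
    pa = partial - np.sum(w * partial)   # remove area-weighted means
    ta = total - np.sum(w * total)
    sp = np.sqrt(np.sum(w * pa**2))
    st = np.sqrt(np.sum(w * ta**2))
    corr = np.sum(w * pa * ta) / (sp * st)
    rms = np.sqrt(np.sum(w * (pa - ta)**2))
    return corr, rms, sp / st

# Hypothetical zonal-mean warming fields on a latitude grid
lat = np.linspace(-90, 90, 73)
weights = np.cos(np.deg2rad(lat))
total = 2.0 + 3.0 * np.exp(-((lat - 80) / 30)**2)   # polar-amplified warming
albedo = 0.5 * np.exp(-((lat - 80) / 20)**2)        # ice-albedo contribution
print(pattern_stats(albedo, total, weights))
```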

  2. Quantifying the ventilatory control contribution to sleep apnoea using polysomnography.

    Science.gov (United States)

    Terrill, Philip I; Edwards, Bradley A; Nemati, Shamim; Butler, James P; Owens, Robert L; Eckert, Danny J; White, David P; Malhotra, Atul; Wellman, Andrew; Sands, Scott A

    2015-02-01

    Elevated loop gain, consequent to hypersensitive ventilatory control, is a primary nonanatomical cause of obstructive sleep apnoea (OSA) but it is not possible to quantify this in the clinic. Here we provide a novel method to estimate loop gain in OSA patients using routine clinical polysomnography alone. We use the concept that spontaneous ventilatory fluctuations due to apnoeas/hypopnoeas (disturbance) result in opposing changes in ventilatory drive (response) as determined by loop gain (response/disturbance). Fitting a simple ventilatory control model (including chemical and arousal contributions to ventilatory drive) to the ventilatory pattern of OSA reveals the underlying loop gain. Following mathematical-model validation, we critically tested our method in patients with OSA by comparison with a standard (continuous positive airway pressure (CPAP) drop method), and by assessing its ability to detect the known reduction in loop gain with oxygen and acetazolamide. Our method quantified loop gain from baseline polysomnography (correlation versus CPAP-estimated loop gain: n=28; r=0.63), enabling identification of likely responders to therapies targeting ventilatory control.

  3. Quantifying population recovery rates for ecological risk assessment.

    Science.gov (United States)

    Barnthouse, Lawrence W

    2004-02-01

    Ecological effects of modern agrochemicals are typically limited to brief episodes of increased mortality or reduced growth that are qualitatively similar to natural disturbance regimes. The long-term ecological consequences of agrochemical exposures depend on the intensity and frequency of the exposures relative to the rates of recovery of the exposed populations. This paper explores the feasibility of using readily available life history information to quantify recovery rates of aquatic populations. A simple modeling framework based on the logistic population growth model is used to compare population recovery rates for different types of organisms and to evaluate the influence of life history, initial percent reduction, disturbance frequency, and immigration on the time required for populations to recover from simulated agrochemical exposures. Recovery models are developed for aquatic biota ranging in size and longevity from unicellular algae to fish and turtles. Population growth rates and recovery times derived from life history data are consistent with measured recovery times reported in mesocosm and enclosure experiments, thus supporting the use of the models for quantifying population recovery rates for ecological risk assessment.
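
    A worked sketch of the logistic-recovery idea: solving N(t) = K/(1 + A e^{-rt}) for the time to return to a given fraction of carrying capacity after a simulated exposure. The growth rates below are illustrative orders of magnitude, not values from the paper.

```python
import numpy as np

def logistic_recovery_time(r, initial_fraction, recovered_fraction=0.95):
    """Years for a logistic population to grow from initial_fraction*K back to
    recovered_fraction*K: N(t) = K / (1 + A*exp(-r*t)) with A = (K-N0)/N0,
    which gives t = ln(A*f/(1-f)) / r at N(t) = f*K."""
    A = (1.0 - initial_fraction) / initial_fraction
    f = recovered_fraction
    return np.log(A * f / (1.0 - f)) / r

# Intrinsic growth rates (per year; illustrative orders of magnitude only)
taxa = {"unicellular algae": 100.0, "daphnids": 10.0, "fish": 0.5}
for name, r in taxa.items():
    t = logistic_recovery_time(r, initial_fraction=0.1)  # 90% reduction
    print(f"{name}: recovery in ~{t:.2f} years")
```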

  4. Quantifying facial paralysis using the Kinect v2.

    Science.gov (United States)

    Gaber, Amira; Taher, Mona F; Wahed, Manal Abdel

    2015-01-01

    Assessment of facial paralysis (FP) and quantitative grading of facial asymmetry are essential in order to quantify the extent of the condition as well as to follow its improvement or progression. As such, there is a need for an accurate quantitative grading system that is easy to use, inexpensive and has minimal inter-observer variability. A comprehensive automated system to quantify and grade FP is the main objective of this work. An initial prototype has been presented by the authors. The present research aims to enhance the accuracy and robustness of one of this system's modules: the resting symmetry module. This is achieved by including several modifications to the computation method of the symmetry index (SI) for the eyebrows, eyes and mouth. These modifications are the gamma correction technique, the area of the eyes, and the slope of the mouth. The system was tested on normal subjects and showed promising results. With the modified method, the mean SI of the eyebrows decreased slightly from 98.42% to 98.04%, while the mean SI for the eyes and mouth increased from 96.93% to 99.63% and from 95.6% to 98.11%, respectively. The system is easy to use, inexpensive, automated and fast, has no inter-observer variability and is thus well suited for clinical use.

  5. The SEGUE K Giant Survey. III. Quantifying Galactic Halo Substructure

    CERN Document Server

    Janesh, William; Ma, Zhibo; Harding, Paul; Rockosi, Constance; Starkenburg, Else; Xue, Xiang Xiang; Rix, Hans-Walter; Beers, Timothy C; Johnson, Jennifer; Lee, Young Sun; Schneider, Donald P

    2015-01-01

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5-125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey's SEGUE project. We use a position-velocity clustering estimator (the 4distance) and a smooth stellar halo model to quantify the amount of substructure in the halo. Overall, we find that the halo as a whole is highly structured, and confirm earlier work using BHB stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius. In addition, we find that the amount of substructure in the halo increases with increasing metallicity, and that the K giant sample shows significantly stronger substructure than the BHB stars, which only sample the most metal poor stars. Using a friends-of-friends algorithm to identify groups, we find that a large fraction ($\sim 33\%$) of the st...

  6. Quantifying complexity in translational research: an integrated approach

    Science.gov (United States)

    Munoz, David A.; Nembhard, Harriet Black; Kraschnewski, Jennifer L.

    2014-01-01

    Purpose This article quantifies complexity in translational research. The impact of major operational steps and technical requirements (TR) is calculated with respect to their ability to accelerate moving new discoveries into clinical practice. Design/Methodology/Approach A three-phase integrated Quality Function Deployment (QFD) and Analytic Hierarchy Process (AHP) method was used to quantify complexity in translational research. A case study in obesity was used to demonstrate usability. Findings Generally, the evidence generated was valuable for understanding various components in translational research. Particularly, we found that collaboration networks, multidisciplinary team capacity and community engagement are crucial for translating new discoveries into practice. Research limitations/implications As the method is mainly based on subjective opinion, some argue that the results may be biased. However, a consistency ratio is calculated and used as a guide to subjectivity. Alternatively, a larger sample may be incorporated to reduce bias. Practical implications The integrated QFD-AHP framework provides evidence that could be helpful to generate agreement, develop guidelines, allocate resources wisely, identify benchmarks and enhance collaboration among similar projects. Originality/value Current conceptual models in translational research provide little or no clue to assess complexity. The proposed method aimed to fill this gap. Additionally, the literature review includes various features that have not been explored in translational research. PMID:25417380

  7. Part mutual information for quantifying direct associations in networks.

    Science.gov (United States)

    Zhao, Juan; Zhou, Yiwei; Zhang, Xiujun; Chen, Luonan

    2016-05-03

    Quantitatively identifying direct dependencies between variables is an important task in data analysis, in particular for reconstructing various types of networks and causal relations in science and engineering. One of the most widely used criteria is partial correlation, but it can only measure linearly direct association and miss nonlinear associations. However, based on conditional independence, conditional mutual information (CMI) is able to quantify nonlinearly direct relationships among variables from the observed data, superior to linear measures, but suffers from a serious problem of underestimation, in particular for those variables with tight associations in a network, which severely limits its applications. In this work, we propose a new concept, "partial independence," with a new measure, "part mutual information" (PMI), which not only can overcome the problem of CMI but also retains the quantification properties of both mutual information (MI) and CMI. Specifically, we first defined PMI to measure nonlinearly direct dependencies between variables and then derived its relations with MI and CMI. Finally, we used a number of simulated data as benchmark examples to numerically demonstrate PMI features and further real gene expression data from Escherichia coli and yeast to reconstruct gene regulatory networks, which all validated the advantages of PMI for accurately quantifying nonlinearly direct associations in networks.
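
    The paper's exact PMI definition is not reproduced here; as background, the sketch below shows a plug-in estimator of the conditional mutual information (CMI) baseline that PMI is designed to improve on, applied to synthetic binary data.

```python
import numpy as np
from collections import Counter

def conditional_mutual_information(x, y, z):
    """Plug-in estimate of I(X;Y|Z) for discrete sequences:
    I(X;Y|Z) = sum_{x,y,z} p(x,y,z) * log[ p(z)p(x,y,z) / (p(x,z)p(y,z)) ]."""
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz = Counter(zip(x, z))
    pyz = Counter(zip(y, z))
    pz = Counter(z)
    cmi = 0.0
    for (xi, yi, zi), c in pxyz.items():
        cmi += (c / n) * np.log(pz[zi] * c / (pxz[(xi, zi)] * pyz[(yi, zi)]))
    return cmi

rng = np.random.default_rng(2)
z = rng.integers(0, 2, 5000)
x = (z + rng.integers(0, 2, 5000)) % 2   # x depends on z
flip = rng.random(5000) < 0.15
y = (x + flip) % 2                       # y is a noisy copy of x
print("I(X;Y|Z) ~", round(conditional_mutual_information(x, y, z), 3))
```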

  8. Quantifying historical carbon and climate debts among nations

    Science.gov (United States)

    Matthews, H. Damon

    2016-01-01

    Contributions to historical climate change have varied substantially among nations. These differences reflect underlying inequalities in wealth and development, and pose a fundamental challenge to the implementation of a globally equitable climate mitigation strategy. This Letter presents a new way to quantify historical inequalities among nations using carbon and climate debts, defined as the amount by which national climate contributions have exceeded a hypothetical equal per-capita share over time. Considering only national CO2 emissions from fossil fuel combustion, accumulated carbon debts across all nations from 1990 to 2013 total 250 billion tonnes of CO2, representing 40% of cumulative world emissions since 1990. Expanding this to reflect the temperature response to a range of emissions, historical climate debts accrued between 1990 and 2010 total 0.11 °C, close to a third of observed warming over that period. Large fractions of this debt are carried by industrialized countries, but also by countries with high levels of deforestation and agriculture. These calculations could contribute to discussions of climate responsibility by providing a tangible way to quantify historical inequalities, which could then inform the funding of mitigation, adaptation and the costs of loss and damages in those countries that have contributed less to historical warming.
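
    The debt definition above reduces to simple arithmetic once national and world emissions and populations are tabulated; the sketch below uses invented numbers purely to show the calculation.

```python
import numpy as np

def carbon_debt(emissions, population, world_emissions, world_population):
    """Cumulative amount (same units as emissions) by which a nation's
    emissions exceeded an equal per-capita share of world emissions over
    the years provided. Positive = debt, negative = credit."""
    equal_share = world_emissions * population / world_population
    return np.sum(emissions - equal_share)

# Hypothetical two-year example (GtCO2, persons)
nation_e = np.array([5.0, 5.2])
nation_p = np.array([3.2e8, 3.2e8])
world_e = np.array([35.0, 36.0])
world_p = np.array([7.0e9, 7.1e9])
print(f"carbon debt: {carbon_debt(nation_e, nation_p, world_e, world_p):.2f} GtCO2")
```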

  9. Methods for quantifying uncertainty in fast reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Fischer, P. F.

    2008-04-07

    Liquid-metal-cooled fast reactors in the form of sodium-cooled fast reactors have been successfully built and tested in the U.S. and throughout the world. However, no fast reactor has operated in the U.S. for nearly fourteen years. More importantly, the U.S. has not constructed a fast reactor in nearly 30 years. In addition to reestablishing the necessary industrial infrastructure, the development, testing, and licensing of a new, advanced fast reactor concept will likely require a significant base technology program that will rely more heavily on modeling and simulation than has been done in the past. The ability to quantify uncertainty in modeling and simulations will be an important part of any experimental program and can provide added confidence that established design limits and safety margins are appropriate. In addition, there is an increasing demand from the nuclear industry for best-estimate analysis methods to provide confidence bounds along with their results. The ability to quantify uncertainty will be an important component of modeling that is used to support design, testing, and experimental programs. Three avenues of UQ investigation are proposed. Two relatively new approaches are described which can be directly coupled to simulation codes currently being developed under the Advanced Simulation and Modeling program within the Reactor Campaign. A third approach, based on robust Monte Carlo methods, can be used in conjunction with existing reactor analysis codes as a means of verification and validation of the more detailed approaches.

  10. A method to quantify mouse coat-color proportions.

    Directory of Open Access Journals (Sweden)

    Songthip Ounpraseuth

    Full Text Available Coat-color proportions and patterns in mice are used as assays for many processes such as transgene expression, chimerism, and epigenetics. In many studies, coat-color readouts are estimated from subjective scoring of individual mice. Here we show a method by which mouse coat color is quantified as the proportion of coat shown in one or more digital images. We use the yellow-agouti mouse model of epigenetic variegation to demonstrate this method. We apply this method to live mice using a conventional digital camera for data collection. We use a raster graphics editing program to convert agouti regions of the coat to a standard, uniform, brown color and the yellow regions of the coat to a standard, uniform, yellow color. We use a second program to quantify the proportions of these standard colors. This method provides quantification that relates directly to the visual appearance of the live animal. It also provides an objective analysis with a traceable record, and it should allow for precise comparisons of mouse coats and mouse cohorts within and between studies.
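
    Once the coat regions have been recolored to the two standard reference colors, the proportion is a pixel count; a minimal sketch with a toy image and assumed reference RGB values follows.

```python
import numpy as np

def coat_color_proportion(image, yellow_rgb=(255, 255, 0), brown_rgb=(139, 69, 19)):
    """Fraction of standardized coat pixels that are yellow, given an RGB
    image in which coat regions were already recolored to the two uniform
    reference colors (background pixels match neither)."""
    yellow = np.all(image == np.array(yellow_rgb), axis=-1).sum()
    brown = np.all(image == np.array(brown_rgb), axis=-1).sum()
    return yellow / (yellow + brown)

# Toy 4x4 "image": 5 yellow and 11 brown coat pixels
img = np.tile(np.array([139, 69, 19], np.uint8), (4, 4, 1))
img[0, :] = (255, 255, 0)
img[1, 0] = (255, 255, 0)
print(f"yellow proportion: {coat_color_proportion(img):.2f}")  # 5/16 = 0.31
```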

  11. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
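
    A minimal sketch of the straight-line Isc fit with an ordinary-least-squares uncertainty (not the paper's objective Bayesian treatment or its evidence-based window selection); the I-V points below are simulated.

```python
import numpy as np

def isc_from_linear_fit(v, i):
    """Straight-line fit of I-V points near short circuit; returns Isc
    (the current intercept at V=0) and its standard uncertainty from the
    ordinary-least-squares parameter covariance."""
    X = np.column_stack([np.ones_like(v), v])
    coef, res, *_ = np.linalg.lstsq(X, i, rcond=None)
    dof = len(v) - 2
    sigma2 = res[0] / dof                     # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # parameter covariance
    return coef[0], np.sqrt(cov[0, 0])

# Simulated I-V points in a window near V=0 (amps, volts; illustrative)
rng = np.random.default_rng(3)
v = np.linspace(0.0, 0.1, 12)
i = 8.20 - 0.5 * v + rng.normal(0, 0.005, v.size)
isc, u_isc = isc_from_linear_fit(v, i)
print(f"Isc = {isc:.4f} A, u(Isc) = {u_isc:.4f} A")
```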

  12. Quantifying three dimensional reconnection in fragmented current layers

    Energy Technology Data Exchange (ETDEWEB)

    Wyper, P. F., E-mail: peter.f.wyper@nasa.gov; Hesse, M., E-mail: michael.hesse-1@nasa.gov [Goddard Space Flight Center, 8800 Greenbelt Rd, Greenbelt, Maryland 20771 (United States)

    2015-04-15

    There is growing evidence that when magnetic reconnection occurs in high Lundquist number plasmas such as in the Solar Corona or the Earth's Magnetosphere it does so within a fragmented, rather than a smooth current layer. Within the extent of these fragmented current regions, the associated magnetic flux transfer and energy release occur simultaneously in many different places. This investigation focusses on how best to quantify the rate at which reconnection occurs in such layers. An analytical theory is developed which describes the manner in which new connections form within fragmented current layers in the absence of magnetic nulls. It is shown that the collective rate at which new connections form can be characterized by two measures; a total rate which measures the true rate at which new connections are formed and a net rate which measures the net change of connection associated with the largest value of the integral of $E_{\parallel}$ through all of the non-ideal regions. Two simple analytical models are presented which demonstrate how each should be applied and what they quantify.

  13. Quantifying prosthetic gait deviation using simple outcome measures

    Science.gov (United States)

    Kark, Lauren; Odell, Ross; McIntosh, Andrew S; Simmons, Anne

    2016-01-01

    AIM: To develop a subset of simple outcome measures to quantify prosthetic gait deviation without needing three-dimensional gait analysis (3DGA). METHODS: Eight unilateral, transfemoral amputees and 12 unilateral, transtibial amputees were recruited. Twenty-eight able-bodied controls were recruited. All participants underwent 3DGA, the timed-up-and-go test and the six-minute walk test (6MWT). The lower-limb amputees also completed the Prosthesis Evaluation Questionnaire. Results from 3DGA were summarised using the gait deviation index (GDI), which was subsequently regressed, using stepwise regression, against the other measures. RESULTS: Step-length (SL), self-selected walking speed (SSWS) and the distance walked during the 6MWT (6MWD) were significantly correlated with GDI. The 6MWD was the strongest single predictor of the GDI, followed by SL and SSWS. The predictive ability of the regression equations was improved following inclusion of self-report data related to mobility and prosthetic utility. CONCLUSION: This study offers a practicable alternative for quantifying kinematic deviation without the need to conduct complete 3DGA. PMID:27335814

  14. Quantifying the savings of an industry energy efficiency programme

    Energy Technology Data Exchange (ETDEWEB)

    Cahill, C.J. [Sustainable Energy Research Group, Department of Civil and Environmental Engineering, University College Cork, Cork (Ireland); Gallachoir, B.P.O. [Environmental Research Institute, University College Cork, Lee Road, Cork (Ireland)

    2012-05-15

    In a developed economy, improving the energy performance of the industry sector can make an important contribution to achieving national energy efficiency goals. Policy measures aimed at increasing energy efficiency in industry must be proven to be effective. In Ireland one such measure is the Large Industry Energy Network (LIEN) programme. LIEN is a voluntary network of large energy users, facilitated by the national energy agency, working to maintain strong energy management practices. In this paper, we combine top-down methods for analysing national energy statistics with company-level figures supplied by LIEN participants to quantify the energy savings achieved by the companies and to determine the fraction of national savings that can be attributed to them. By comparing the collective performance of participant companies with the performance of the rest of industry we provide an indication of the effectiveness of the programme and quantify the savings that may be directly attributable to it. These figures when combined with national energy forecasts for industry help us assess the likely contribution of the programme to future national energy savings targets.

  15. Methods to Quantify Nickel in Soils and Plant Tissues

    Directory of Open Access Journals (Sweden)

    Bruna Wurr Rodak

    2015-06-01

    Full Text Available In comparison with other micronutrients, the levels of nickel (Ni) available in soils and plant tissues are very low, making quantification very difficult. The objective of this paper is to present optimized methods for determining Ni availability in soils by extractants and total Ni content in plant tissues for routine commercial laboratory analyses. Samples of natural and agricultural soils were processed and analyzed by Mehlich-1 extraction and by DTPA. To quantify Ni in the plant tissues, samples were digested with nitric acid in a closed system in a microwave oven. The measurement was performed by inductively coupled plasma/optical emission spectrometry (ICP-OES). There was a positive and significant correlation between the levels of available Ni in the soils subjected to Mehlich-1 and DTPA extraction, while for plant tissue samples the Ni levels recovered were high and similar to the reference materials. The availability of Ni in some of the natural soil and plant tissue samples was lower than the limits of quantification. Concentrations of this micronutrient were higher in the soil samples in which Ni had been applied. Nickel concentration differed among the plant parts analyzed, with the highest levels in the grains of soybean. Grain concentrations, in comparison with those of the shoot and leaf, were better correlated with the soil-available levels for both extractants. The methods described in this article were efficient in quantifying Ni and can be used for routine laboratory analysis of soils and plant tissues.

  16. Quantifying Urban Fragmentation under Economic Transition in Shanghai City, China

    Directory of Open Access Journals (Sweden)

    Heyuan You

    2015-12-01

    Full Text Available Urban fragmentation affects sustainability through multiple impacts on economic, social, and environmental cost. Characterizing the dynamics of urban fragmentation in relation to economic transition should provide implications for sustainability. However, rather few efforts have been made on this issue. Using the case of Shanghai (China), this paper quantifies urban fragmentation in relation to economic transition. In particular, urban fragmentation is quantified by a time-series of remotely sensed images and a set of landscape metrics; and economic transition is described by a set of indicators from three aspects (globalization, decentralization, and marketization). Results show that urban fragmentation presents an increasing linear trend. Multivariate regression identifies positive linear correlation between urban fragmentation and economic transition. More specifically, the relative influence is different for the three components of economic transition. The relative influence of decentralization is stronger than that of globalization and marketization. The joint influences of decentralization and globalization are the strongest for urban fragmentation. The demonstrated methodology can be applicable to other places after making suitable adjustment of the economic transition indicators and fragmentation metrics.
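
    Landscape-metric details vary by study; as a sketch only, the code below computes two common fragmentation metrics (patch density and edge density, which may differ from the metric set used here) on a toy binary urban raster.

```python
import numpy as np
from scipy import ndimage

def fragmentation_metrics(urban, cell_size=30.0):
    """Two common landscape metrics for a binary urban raster:
    patch density (patches per km^2 of map) and edge density
    (km of urban edge per km^2 of map); map borders are ignored."""
    _, n_patches = ndimage.label(urban)
    # Count urban/non-urban boundaries along both axes
    edges = (np.sum(urban[:, 1:] != urban[:, :-1])
             + np.sum(urban[1:, :] != urban[:-1, :]))
    area_km2 = urban.size * cell_size**2 / 1e6
    return n_patches / area_km2, edges * cell_size / 1000.0 / area_km2

rng = np.random.default_rng(5)
raster = rng.random((200, 200)) < 0.3   # toy 30 m urban/non-urban grid
pd, ed = fragmentation_metrics(raster)
print(f"patch density: {pd:.1f} /km^2, edge density: {ed:.1f} km/km^2")
```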

  17. Using multiscale norms to quantify mixing and transport

    CERN Document Server

    Thiffeault, Jean-Luc

    2012-01-01

    Mixing is relevant to many areas of science and engineering, including the pharmaceutical and food industries, oceanography, atmospheric sciences, and civil engineering. In all these situations one goal is to quantify and often then to improve the degree of homogenisation of a substance being stirred, referred to as a passive scalar or tracer. A classical measure of mixing is the variance of the concentration of the scalar, which can be related to the $L^2$ norm of the concentration field. Recently other norms have been used to quantify mixing, in particular the mix-norm as well as negative Sobolev norms. These norms have the advantage that unlike variance they decay even in the absence of diffusion, and their decay corresponds to the flow being mixing in the sense of ergodic theory. General Sobolev norms weigh scalar gradients differently, and are known as multiscale norms for mixing. We review the applications of such norms to mixing and transport, and show how they can be used to optimise the stirring and ...
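
    The mix-norm and its relatives are easy to compute spectrally; the sketch below evaluates the $H^s$ norm of a 1-D periodic scalar field and shows that moving variance to small scales lowers the mix-norm ($s = -1/2$) while leaving the $L^2$ norm unchanged.

```python
import numpy as np

def sobolev_norm(c, s, L=2 * np.pi):
    """H^s norm of a 1-D periodic scalar field c(x) on [0, L), computed
    spectrally: ||c||_{H^s}^2 = sum_k (1 + |k|^2)^s |c_k|^2. Negative s
    (e.g. the mix-norm, s = -1/2) down-weights small scales, so the norm
    decays as stirring transfers variance to small scales even without
    diffusion."""
    n = c.size
    ck = np.fft.fft(c) / n                       # Fourier coefficients
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
    return np.sqrt(np.sum((1 + k**2)**s * np.abs(ck)**2))

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
coarse, fine = np.cos(x), np.cos(16 * x)   # same variance, different scales
for name, c in [("coarse", coarse), ("fine", fine)]:
    print(name, "L2:", round(sobolev_norm(c, 0), 3),
          "mix-norm:", round(sobolev_norm(c, -0.5), 3))
```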

  18. Quantifying the Traction Force of a Single Cell by Aligned Silicon Nanowire Array

    KAUST Repository

    Li, Zhou

    2009-10-14

    The physical behaviors of stationary cells, such as the morphology, motility, adhesion, anchorage, invasion and metastasis, are likely to be important for governing their biological characteristics. A change in the physical properties of mammalian cells could be an indication of disease. In this paper, we present a silicon-nanowire-array based technique for quantifying the mechanical behavior of single cells representing three distinct groups: normal mammalian cells, benign cells (L929), and malignant cells (HeLa). By culturing the cells on top of NW arrays, the maximum traction forces of two different tumor cells (HeLa, L929) have been measured by quantitatively analyzing the bending of the nanowires. The cancer cell exhibits a larger traction force than the normal cell by ∼20% for a HeLa cell and ∼50% for a L929 cell. The traction forces have been measured for the L929 cells and mechanocytes as a function of culture time. The relationship between cells extending area and their traction force has been investigated. Our study is likely important for studying the mechanical properties of single cells and their migration characteristics, possibly providing a new cellular level diagnostic technique. © 2009 American Chemical Society.

  19. Quantifying the traction force of a single cell by aligned silicon nanowire array.

    Science.gov (United States)

    Li, Zhou; Song, Jinhui; Mantini, Giulia; Lu, Ming-Yen; Fang, Hao; Falconi, Christian; Chen, Lih-Juann; Wang, Zhong Lin

    2009-10-01

    The physical behaviors of stationary cells, such as the morphology, motility, adhesion, anchorage, invasion and metastasis, are likely to be important for governing their biological characteristics. A change in the physical properties of mammalian cells could be an indication of disease. In this paper, we present a silicon-nanowire-array based technique for quantifying the mechanical behavior of single cells representing three distinct groups: normal mammalian cells, benign cells (L929), and malignant cells (HeLa). By culturing the cells on top of NW arrays, the maximum traction forces of two different tumor cells (HeLa, L929) have been measured by quantitatively analyzing the bending of the nanowires. The cancer cell exhibits a larger traction force than the normal cell by approximately 20% for a HeLa cell and approximately 50% for a L929 cell. The traction forces have been measured for the L929 cells and mechanocytes as a function of culture time. The relationship between cells extending area and their traction force has been investigated. Our study is likely important for studying the mechanical properties of single cells and their migration characteristics, possibly providing a new cellular level diagnostic technique.
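
    The conversion from measured nanowire deflection to force is not spelled out in the abstract; a common assumption for such measurements is the Euler-Bernoulli cantilever formula F = 3EIδ/L³, sketched below with hypothetical wire dimensions and a silicon-range Young's modulus.

```python
import numpy as np

def cantilever_force(deflection_m, length_m, diameter_m, E_pa=170e9):
    """Lateral force needed to bend a vertical nanowire, modeled as an
    Euler-Bernoulli cantilever loaded at its tip: F = 3*E*I*delta / L^3,
    with I = pi*d^4/64 for a circular cross-section. The default E is a
    value in the range commonly quoted for silicon."""
    I = np.pi * diameter_m**4 / 64.0
    return 3.0 * E_pa * I * deflection_m / length_m**3

# A 10-um-long, 200-nm-diameter wire deflected 100 nm at the tip
F = cantilever_force(100e-9, 10e-6, 200e-9)
print(f"traction force per wire: {F * 1e9:.1f} nN")
```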

  20. What the laboratory rat has taught us about social play behavior: role in behavioral development and neural mechanisms

    NARCIS (Netherlands)

    Vanderschuren, L.J.M.J.; Trezza, V.

    2014-01-01

    Social play behavior is the most vigorous and characteristic form of social interaction displayed by developing mammals. The laboratory rat is an ideal species to study this behavior, since it shows ample social play that can be easily recognized and quantified. In this chapter, we will first briefl

  1. Superlative Quantifiers as Modifiers of Meta-Speech Acts

    Directory of Open Access Journals (Sweden)

    Ariel Cohen

    2010-12-01

    Full Text Available The superlative quantifiers, at least and at most, are commonly assumed to have the same truth-conditions as the comparative quantifiers more than and fewer than. However, as Geurts & Nouwen (2007) have demonstrated, this is wrong, and several theories have been proposed to account for them. In this paper we propose that superlative quantifiers are illocutionary operators; specifically, they modify meta-speech acts. Meta speech-acts are operators that do not express a speech act, but a willingness to make or refrain from making a certain speech act. The classic example is speech act denegation, e.g. I don't promise to come, where the speaker is explicitly refraining from performing the speech act of promising. What denegations do is to delimit the future development of conversation, that is, they delimit future admissible speech acts. Hence we call them meta-speech acts. They are not moves in a game, but rather commitments to behave in certain ways in the future. We formalize the notion of meta speech acts as commitment development spaces, which are rooted graphs: the root of the graph describes the commitment development up to the current point in conversation; the continuations from the root describe the admissible future directions. We define and formalize the meta-speech act GRANT, which indicates that the speaker, while not necessarily subscribing to a proposition, refrains from asserting its negation. We propose that superlative quantifiers are quantifiers over GRANTs. Thus, Mary petted at least three rabbits means that the minimal number n such that the speaker GRANTs that Mary petted n rabbits is n = 3. In other words, the speaker denies that Mary petted two, one, or no rabbits, but GRANTs that she petted more. We formalize this interpretation of superlative quantifiers in terms of commitment development spaces, and show how the truth conditions that are derived from it are partly entailed and partly conversationally

  2. Cost Behavior

    DEFF Research Database (Denmark)

    Hoffmann, Kira

    The objective of this dissertation is to investigate determinants and consequences of asymmetric cost behavior. Asymmetric cost behavior arises if the change in costs is different for increases in activity compared to equivalent decreases in activity. In this case, costs are termed “sticky......” if the change is less when activity falls than when activity rises, whereas costs are termed “anti-sticky” if the change is more when activity falls than when activity rises. Understanding such cost behavior is especially relevant for decision-makers and financial analysts that rely on accurate cost information...

  3. Behavioral epigenetics

    Science.gov (United States)

    Lester, Barry M.; Tronick, Edward; Nestler, Eric; Abel, Ted; Kosofsky, Barry; Kuzawa, Christopher W.; Marsit, Carmen J.; Maze, Ian; Meaney, Michael J.; Monteggia, Lisa M.; Reul, Johannes M. H. M.; Skuse, David H.; Sweatt, J. David; Wood, Marcelo A.

    2013-01-01

    Sponsored by the New York Academy of Sciences, the Warren Alpert Medical School of Brown University and the University of Massachusetts Boston, “Behavioral Epigenetics” was held on October 29–30, 2010 at the University of Massachusetts Boston Campus Center, Boston, Massachusetts. This meeting featured speakers and panel discussions exploring the emerging field of behavioral epigenetics, from basic biochemical and cellular mechanisms to the epigenetic modulation of normative development, developmental disorders, and psychopathology. This report provides an overview of the research presented by leading scientists and lively discussion about the future of investigation at the behavioral epigenetic level. PMID:21615751

  5. Characterization of electric load with Information Theory quantifiers

    Science.gov (United States)

    Aquino, Andre L. L.; Ramos, Heitor S.; Frery, Alejandro C.; Viana, Leonardo P.; Cavalcante, Tamer S. G.; Rosso, Osvaldo A.

    2017-01-01

    This paper presents a study of the electric load behavior based on the Causality Complexity-Entropy Plane. We use a public data set, namely REDD, which contains detailed power usage information from several domestic appliances. In our characterization, we use the available power data of the circuit/devices of all houses. The Bandt-Pompe methodology combined with the Causality Complexity-Entropy Plane was used to identify and characterize regimes and behaviors over these data. The results showed that this characterization provides a useful insight into the underlying dynamics that govern the electric load.
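
    A minimal sketch of the Bandt-Pompe step: the normalized permutation entropy of a series from its ordinal-pattern distribution (the statistical-complexity coordinate of the plane is omitted); the test signals are synthetic, not REDD data.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, d=4, tau=1):
    """Normalized Bandt-Pompe permutation entropy: map each length-d window
    (with delay tau) to the permutation that sorts it, then compute the
    Shannon entropy of the pattern distribution, normalized by log(d!)."""
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        window = x[i : i + d * tau : tau]
        patterns[tuple(np.argsort(window))] += 1
    counts = np.array([c for c in patterns.values() if c > 0], float)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum() / log(factorial(d))

rng = np.random.default_rng(4)
noise = rng.normal(size=5000)           # should be near 1 (maximal entropy)
t = np.arange(5000)
periodic = np.sin(2 * np.pi * t / 50)   # should be much lower
print("noise:", round(permutation_entropy(noise), 3),
      "periodic:", round(permutation_entropy(periodic), 3))
```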

  6. Quantifying platelet gel coagulation using Sonoclot and Thrombelastograph hemostasis analyzer.

    Science.gov (United States)

    Cassidy, Lynsay K; Finney, Angela S; Ellis, William Cory; Spiwak, Allison J; Riley, Jeffrey B

    2005-03-01

    Little in vitro research exists discussing platelet gel composition and the resulting strength and degradation characteristics using point-of-care technologies. There must be a quantifiable way of determining the structural integrity of the resulting formed platelet gel thrombus. The Thrombelastograph Hemostasis Analyzer (TEG) and Sonoclot measure the elasticity of a clot as it forms and subsequently degrades naturally. The objective of this study was to determine the application of TEG and Sonoclot technologies as point-of-care devices for technicians using platelet gel therapy. The collected bovine blood was anticoagulated with CPD and processed using a previously published plasma sequestration protocol, using normal saline as a wash solution. The resulting platelet-rich plasma was stored in a sequestration bag in a water bath to maintain the blood temperature at 37 degrees C. Sequestered bovine platelet-rich plasma was made into platelet gel using three different thrombin concentrations. A total of 30 experiments were performed on the platelet gel product using both the TEG and the Sonoclot. We discovered that 6 of the Sonoclot tests and 15 of the TEG tests were valid. None of the TEG clot signatures and nine of the Sonoclot signatures were discovered to be invalid. A chi2 test was performed on the resultant data. The value of the chi2 test was calculated to be 12.86, which translated into a p value of less than 0.001. Despite the vast use and growing popularity of platelet gels, a method in which to quantify platelet gels has yet to be reported. There remains a possibility that gels formed with different concentrations of components may prove useful in different areas of surgery or their uses may expand to a broader spectrum of medicine. However, technology to quantify platelet gels must first be standardized. On the basis of the data collected in this study, it was determined that the TEG and the Sonoclot are not equally capable of analyzing platelet gel clots
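
    The reported statistic can be reproduced from the stated counts (TEG: 15 valid, 0 invalid; Sonoclot: 6 valid, 9 invalid) with a 2x2 chi-squared test without continuity correction:

```python
from scipy.stats import chi2_contingency

# Valid vs. invalid clot signatures reported for each device
table = [[15, 0],   # TEG: 15 valid, 0 invalid
         [6, 9]]    # Sonoclot: 6 valid, 9 invalid
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.5f}")  # chi2 = 12.86, p < 0.001, as reported
```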

  7. Quantifying Stream Bed Gravel Mobility from Friction Angle Measurements

    Science.gov (United States)

    Meyers, M. A.; Dunne, T.

    2012-12-01

    A method to measure friction angles using force gauges was field tested to determine its utility for quantifying critical shear stress in a gravel-bedded reach of the San Joaquin River in California. Predictions of mobility from friction angles were compared with observations of the movement of tagged particles from locations for which local shear stress was quantified with a validated 2-D flow model. The observations of movement, distance of travel, and location of the end of travel were made after extended flow releases from Friant dam. Determining the critical shear stress for gravel bed material transport currently depends upon bedload sampling or tracer studies. Often, such measurements can only be made during occasional and untimely flow events, and at limited, suboptimal locations. Yet, theoretical studies conclude that the friction angle is an important control on the critical shear stress for mobility of any grain size, and therefore on the excess shear stress which strongly influences bedload transport rate. The ability to predict bed mobility at ungauged and unmonitored locations is also an important requirement for the planning of flow regimes and channel design. Therefore, a method to measure friction angles that can be performed quickly in low flow conditions would prove useful for river management and research. To investigate this promising method, friction angle surveys were performed at two riffle sites that differed in bed material size and distribution, and in channel slope. The friction angle surveys are sensitive enough to detect differences between the sites as well as spatial and temporal differences within a single riffle. Low friction angles were observed along the inside of a long bend where sand content was greater (by ~20%) than at other surveyed locations. Friction angles decreased slightly after a depositional event associated with transient large woody debris and bank erosion, and increased again after a 5-year return interval flow

  8. Behaviorally inadequate

    DEFF Research Database (Denmark)

    Kasperbauer, Tyler Joshua

    2014-01-01

    According to situationism in psychology, behavior is primarily influenced by external situational factors rather than internal traits or motivations such as virtues. Environmental ethicists wish to promote pro-environmental behaviors capable of providing adequate protection for the environment, but situationist critiques suggest that character traits, and environmental virtues, are not as behaviorally robust as is typically supposed. Their views present a dilemma. Because ethicists cannot rely on virtues to produce pro-environmental behaviors, the only real way of salvaging environmental virtue theory is to reject or at least minimize the requirement that environmental ethics must provide protection and assistance to the environment. Virtue theory is often favored by environmentalists precisely because it does matter what one's reasons are for acting, even if one's actions are ineffective at producing...

  9. Behavioral finance

    Directory of Open Access Journals (Sweden)

    Kapor Predrag

    2014-01-01

    This paper discusses some general principles of behavioral finance. Behavioral finance is a dynamic and promising field of research that merges concepts from financial economics and cognitive psychology in an attempt to better understand systematic biases in the decision-making processes of financial agents. While standard academic finance emphasizes theories such as modern portfolio theory and the efficient market hypothesis, behavioral finance investigates the psychological and sociological issues that impact the decision-making processes of individuals, groups and organizations. Most of the research behind behavioral finance has been empirical in nature, concentrating on what people do and why. The research has shown that people do not always act rationally, nor do they fully utilise all the information available to them.

  10. Behavioral epigenetics

    OpenAIRE

    Lester, Barry M.; Tronick, Edward; Nestler, Eric; Abel, Ted; Kosofsky, Barry; Kuzawa, Christopher W.; Marsit, Carmen J; Maze, Ian; Meaney, Michael J.; Monteggia, Lisa M.; Reul, Johannes M. H. M.; Skuse, David H.; Sweatt, J. David; Wood, Marcelo A.

    2011-01-01

    Sponsored by the New York Academy of Sciences, the Warren Alpert Medical School of Brown University and the University of Massachusetts Boston, “Behavioral Epigenetics” was held on October 29–30, 2010 at the University of Massachusetts Boston Campus Center, Boston, Massachusetts. This meeting featured speakers and panel discussions exploring the emerging field of behavioral epigenetics, from basic biochemical and cellular mechanisms to the epigenetic modulation of normative development, devel...

  11. Uncommon Law: Understanding and Quantifying the Sovereign Citizen Movement

    Science.gov (United States)

    2016-12-01

    SUBJECT TERMS: sovereign citizen, domestic terrorism, terrorism statute, social identity theory, behavioral markers, common law, Posse Comitatus. Only front-matter fragments of this record survive, including the section headings "Perceptions and Portrayals" and "Social Identity Theory," and a passage noting that the theory revolves around the gold standard's role in the American economy, with the bulletin observing that some sovereign citizens believe that "when the...

  12. A method to quantify movement activity of groups of animals using automated image analysis

    Science.gov (United States)

    Xu, Jianyu; Yu, Haizhen; Liu, Ying

    2009-07-01

    Most physiological and environmental changes are capable of inducing variations in animal behavior. Behavioral parameters can be measured continuously in situ by a non-invasive, non-contact approach, and have the potential to be used in actual production settings to predict stress conditions. Most vertebrates tend to live in groups, herds, flocks, shoals, bands, or packs of conspecific individuals. Under culture conditions, livestock or fish live in groups and interact with each other, so the aggregate behavior of the group should be studied rather than that of individuals. This paper presents a method to calculate the movement speed of a group of animals in an enclosure or a tank, expressed as a body-length speed corresponding to group activity, using computer vision techniques. Frame sequences captured at a fixed time interval were subtracted in pairs after image segmentation and identification. By labeling the connected components caused by object movement in each difference frame, the projected area swept by the movement of every object in the capture interval was calculated; this projected area was divided by the projected area of every object in the later frame to obtain the body-length moving distance of each object, from which the relative body-length speed follows. The average speed of all objects reflects the activity of the group. The group activity of a tilapia (Oreochromis niloticus) school exposed to a high (2.65 mg/L) level of unionized ammonia (UIA) was quantified using this method. The high UIA condition elicited a marked increase in school activity in the first hour (P<0.05), exhibiting an avoidance reaction (trying to flee from the high UIA condition), after which activity decreased gradually.
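
    The core of the described measurement is a frame-differencing step followed by connected-component labeling. The sketch below is a schematic reconstruction, not the authors' implementation; the synthetic masks and the use of OpenCV are assumptions. It estimates a body-length speed from a pair of segmented binary frames.

    ```python
    import cv2
    import numpy as np

    def body_length_speed(prev_mask, curr_mask, interval_s):
        """Estimate mean body-length speed of a group from two binary masks.

        prev_mask, curr_mask: uint8 foreground masks (255 = animal) taken
        interval_s seconds apart, after segmentation/identification.
        """
        # Area swept by movement between the two frames.
        diff = cv2.absdiff(curr_mask, prev_mask)
        # Label connected components of the difference image.
        n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(diff)
        moved_area = stats[1:, cv2.CC_STAT_AREA].sum()  # skip background label 0

        # Projected area of the objects in the later frame.
        object_area = int(np.count_nonzero(curr_mask))
        if object_area == 0:
            return 0.0

        # Moved area per unit object area ~ distance travelled in body lengths;
        # dividing by the interval gives body lengths per second.
        return moved_area / object_area / interval_s

    # Example usage with two synthetic masks (a blob shifted by a few pixels):
    prev = np.zeros((100, 100), np.uint8); prev[40:60, 20:40] = 255
    curr = np.zeros((100, 100), np.uint8); curr[40:60, 26:46] = 255
    print(body_length_speed(prev, curr, interval_s=1.0))
    ```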

  13. Quantifying the interdisciplinarity of scientific journals and fields

    CERN Document Server

    Silva, Filipi Nascimento; Junior, Osvaldo Novais de Oliveira; Costa, Luciano da Fontoura

    2012-01-01

    There is an overall perception of increased interdisciplinarity in science, but this is difficult to confirm quantitatively owing to the lack of adequate methods to evaluate subjective phenomena. This is no different from the difficulties in establishing quantitative relationships in human and social sciences. In this paper we quantified the interdisciplinarity of scientific journals and science fields by using an entropy measurement based on the diversity of the subject categories of journals citing a specific journal. The methodology consisted in building citation networks using the Journal Citation Reports database, in which the nodes were journals and edges were established based on citations among journals. The overall network for the 11-year period (1999-2009) studied was small-world and scale free with regard to the in-strength. Upon visualizing the network topology an overall structure of the various science fields could be inferred, especially their interconnections. We confirmed quantitatively that ...
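
    For a single journal, the entropy measurement described here reduces to the Shannon entropy of the distribution of subject categories over its citing journals. A minimal sketch of that idea follows; the category counts are invented for illustration and are not from the Journal Citation Reports data used in the paper.

    ```python
    import math

    def interdisciplinarity(category_counts):
        """Shannon entropy (in nats) of the subject categories of citing journals.

        Higher entropy = citations spread over more fields = more interdisciplinary.
        """
        total = sum(category_counts.values())
        probs = [c / total for c in category_counts.values() if c > 0]
        return -sum(p * math.log(p) for p in probs)

    # Hypothetical citing-category counts for two journals:
    specialized = {"Physics, Condensed Matter": 950, "Materials Science": 50}
    broad = {"Physics": 300, "Biology": 250, "Computer Science": 250,
             "Sociology": 200}
    print(interdisciplinarity(specialized))  # low entropy: a narrow field
    print(interdisciplinarity(broad))        # higher entropy: interdisciplinary
    ```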

  14. Quantifying mast cells in bladder pain syndrome by immunohistochemical analysis

    DEFF Research Database (Denmark)

    Larsen, M.S.; Mortensen, S.; Nordling, J.;

    2008-01-01

    OBJECTIVES: To evaluate a simple method for counting mast cells, thought to have a role in the pathophysiology of bladder pain syndrome (BPS, formerly interstitial cystitis, a syndrome of pelvic pain perceived to be related to the urinary bladder and accompanied by other urinary symptoms, e.g. frequency and nocturia), as >28 mast cells/mm² is defined as mastocytosis and correlated with clinical outcome. PATIENTS AND METHODS: The current enzymatic staining method (naphthol esterase) on 10 μm sections for quantifying mast cells is complicated. In the present study, 61 patients had detrusor ... sections between, respectively. Mast cells were counted according to a well-defined procedure. RESULTS: The old and the new methods, on 10 and 3 μm sections, showed a good correlation between mast cell counts. When using tryptase staining and 3 μm sections, the mast cell number correlated well...

  15. Dust as interstellar catalyst I. Quantifying the chemical desorption process

    CERN Document Server

    Minissale, M; Cazaux, S; Hocuk, S

    2015-01-01

    Context. The presence of dust in the interstellar medium has profound consequences on the chemical composition of regions where stars are forming. Recent observations show that many species formed onto dust are populating the gas phase, especially in cold environments where UV and CR induced photons do not account for such processes. Aims. The aim of this paper is to understand and quantify the process that releases solid species into the gas phase, the so-called chemical desorption process, so that an explicit formula can be derived that can be included into astrochemical models. Methods. We present a collection of experimental results of more than 10 reactive systems. For each reaction, different substrates such as oxidized graphite and compact amorphous water ice are used. We derive a formula to reproduce the efficiencies of the chemical desorption process, which considers the equipartition of the energy of newly formed products, followed by classical bounce on the surface. In part II we extend these resul...

  16. Quantifying population genetic differentiation from next-generation sequencing data

    DEFF Research Database (Denmark)

    Fumagalli, Matteo; Vieira, Filipe G.; Korneliussen, Thorfinn Sand;

    2013-01-01

    Over the last few years, new high-throughput DNA sequencing technologies have dramatically increased speed and reduced sequencing costs. However, the use of these sequencing technologies is often challenged by errors and biases associated with the bioinformatical methods used for analyzing the data. Here we present a method for quantifying population genetic differentiation from next-generation sequencing data. In addition, we present a strategy to investigate population structure via Principal Components Analysis. Through extensive simulations, we compare the new method herein proposed to approaches based on genotype calling and demonstrate a marked improvement in estimation accuracy for a wide range of conditions. We apply the method to a large-scale genomic data set of domesticated and wild silkworms sequenced at low coverage. We find that we can infer the fine-scale genetic structure of the sampled individuals, suggesting that employing this new method is useful for investigating the genetic relationships of populations sampled at low coverage.

  17. Quantifying the Use of Gestures in Autism Spectrum Disorder

    DEFF Research Database (Denmark)

    Lambrechts, Anna; Yarrow, K.; Maras, Katie

    Background: Autism Spectrum Disorder (ASD) is characterized by difficulties in communication and social interaction. In the absence of a biomarker, a diagnosis of ASD is reached in settings such as the ADOS (Lord et al., 2000) by observing disturbances of social interaction such as abnormalities in the use of gestures or flow of conversation. These observations rely exclusively on clinical judgement and are thus prone to error and inconsistency across contexts and clinicians. While studies in children show that co-speech gestures are fewer (e.g. Wetherby et al., 1998) ... that abnormal temporal processes contribute to impaired social skills in ASD (Allman, 2011). Objectives: to quantify the production of gestures in ASD in naturally occurring language, and to characterise the temporal dynamics of speech and gesture coordination in ASD using two acoustic indices: pitch and volume...

  18. Quantifying the global cellular thiol-disulfide status

    DEFF Research Database (Denmark)

    Hansen, Rosa E; Roth, Doris; Winther, Jakob R

    2009-01-01

    It is widely accepted that the redox status of protein thiols is of central importance to protein structure and folding and that glutathione is an important low-molecular-mass redox regulator. However, the total cellular pools of thiols and disulfides and their relative abundance have never been determined. In this study, we have assembled a global picture of the cellular thiol-disulfide status in cultured mammalian cells. We have quantified the absolute levels of protein thiols, protein disulfides, and glutathionylated protein (PSSG) in all cellular protein, including membrane proteins. These data ... cell types. However, when cells are exposed to a sublethal dose of the thiol-specific oxidant diamide, PSSG levels increase to >15% of all protein cysteine. Glutathione is typically characterized as the "cellular redox buffer"; nevertheless, our data show that protein thiols represent a larger active...

  19. Quantified Risk Assessment for Plants Producing and Storing Explosives

    Institute of Scientific and Technical Information of China (English)

    Ioannis A. Papazoglou; Panagiotis Saravanos; Ieronymos Giakoumatos; Olga N. Aneziris

    2006-01-01

    This paper presents a methodology for risk assessment of plants producing and storing explosives. The major procedural steps for quantified risk assessment (QRA) in explosive plants are the following: hazard identification, accident sequence modeling, data acquisition, accident sequence quantification, consequence assessment and integration of results. This methodology is demonstrated and applied in an explosive plant consisting of four separate units, which produce detonating cord, nitroglycol, dynamites and ammonium nitrate fuel oil (ANFO). A GIS platform is used for depicting individual risk from explosions in this plant. Total individual risk is equal to 1.0 × 10⁻⁴/y at a distance of 340 m from the center of the plant, and 1.0 × 10⁻⁶/y at a distance of 390 m from the center of the plant.

  20. Quantifying disorder through conditional entropy: an application to fluid mixing.

    Science.gov (United States)

    Brandani, Giovanni B; Schor, Marieke; Macphee, Cait E; Grubmüller, Helmut; Zachariae, Ulrich; Marenduzzo, Davide

    2013-01-01

    In this paper, we present a method to quantify the extent of disorder in a system by using conditional entropies. Our approach is especially useful when other global, or mean field, measures of disorder fail. The method is equally suited for both continuum and lattice models, and it can be made rigorous for the latter. We apply it to mixing and demixing in multicomponent fluid membranes, and show that it has advantages over previous measures based on Shannon entropies, such as a much diminished dependence on binning and the ability to capture local correlations. Further potential applications are very diverse, and could include the study of local and global order in fluid mixtures, liquid crystals, magnetic materials, and particularly biomolecular systems.
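
    On a lattice, the measure described here can be computed from the joint statistics of neighboring sites. A minimal sketch follows (the binary lattice, the right-hand-neighbor convention, and the examples are illustrative assumptions); it computes the conditional entropy H(X|Y) = H(X,Y) - H(Y) of a site's state given its neighbor, and shows how a locally anticorrelated pattern is distinguished from a random mixture, which a global composition measure would not detect.

    ```python
    import math
    import random
    from collections import Counter

    def shannon(counts):
        """Shannon entropy (nats) of a Counter of occurrence counts."""
        total = sum(counts.values())
        return -sum(c / total * math.log(c / total) for c in counts.values() if c)

    def conditional_entropy(lattice):
        """H(site | right neighbor) for a 2D lattice of discrete states."""
        joint, marginal = Counter(), Counter()
        for row in lattice:
            for x, y in zip(row, row[1:]):  # horizontal neighbor pairs
                joint[(x, y)] += 1
                marginal[y] += 1
        return shannon(joint) - shannon(marginal)  # H(X,Y) - H(Y)

    # Demixed stripes and a checkerboard (locally anticorrelated) both score low;
    # a random mixture scores near ln(2), the maximum for two states.
    demixed = [[0] * 8 + [1] * 8] * 4
    checker = [[i % 2 for i in range(16)]] * 4
    random.seed(0)
    mixed = [[random.randint(0, 1) for _ in range(16)] for _ in range(4)]
    for name, lat in [("demixed", demixed), ("checker", checker), ("mixed", mixed)]:
        print(name, round(conditional_entropy(lat), 3))
    ```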

  1. Quantifying the Dynamical Complexity of Chaotic Time Series

    Science.gov (United States)

    Politi, Antonio

    2017-04-01

    A powerful approach is proposed for the characterization of chaotic signals. It is based on the combined use of two classes of indicators: (i) the probability of suitable symbolic sequences (obtained from the ordinal patterns of the corresponding time series); (ii) the width of the corresponding cylinder sets. In this way, much information can be extracted and used to quantify the complexity of a given signal. As an example of the potential of the method, I introduce a modified permutation entropy which allows for quantitative estimates of the Kolmogorov-Sinai entropy in hyperchaotic models, where other methods would be impractical. As a by-product, estimates of the fractal dimension of the underlying attractors are possible as well.

  2. Quantifying cardiovascular disease risk factors in patients with psoriasis

    DEFF Research Database (Denmark)

    Miller, I M; Skaaby, T; Ellervik, C

    2013-01-01

    BACKGROUND: In a previous meta-analysis on categorical data we found an association between psoriasis and cardiovascular disease and associated risk factors. OBJECTIVES: To quantify the level of cardiovascular disease risk factors in order to provide additional data for clinical management. ... RESULTS: We included 59 studies with up to 18,666 cases and 50,724 controls. Psoriasis cases had a higher total cholesterol [weighted mean difference 8.83 mg dL⁻¹, 95% confidence interval (CI) 2.94-14.72, P = 0.003 (= 0.23 mmol L⁻¹)], higher low-density lipoprotein cholesterol [9.90 mg dL⁻¹, 95% ... (= 0.65 mmol L⁻¹)] and a higher HbA1c [1.09 mmol mol⁻¹, 95% CI 0.87-1.31, P ...] ... psoriasis...

  3. Quantifying uncertainty in NIF implosion performance across target scales

    Science.gov (United States)

    Spears, Brian; Baker, K.; Brandon, S.; Buchoff, M.; Callahan, D.; Casey, D.; Field, J.; Gaffney, J.; Hammer, J.; Humbird, K.; Hurricane, O.; Kruse, M.; Munro, D.; Nora, R.; Peterson, L.; Springer, P.; Thomas, C.

    2016-10-01

    Ignition experiments at NIF are being performed at a variety of target scales. Smaller targets require less energy and can be fielded more frequently. Successful small target designs can be scaled up to take advantage of the full NIF laser energy and power. In this talk, we will consider a rigorous framework for scaling from smaller to larger targets. The framework uses both simulation and experimental results to build a statistical prediction of target performance as scale is increased. Our emphasis is on quantifying uncertainty in scaling predictions, with the goal of identifying the dominant contributors to that uncertainty. We take as a particular example the Big Foot platform, which produces a round, 0.8-scale implosion with the potential to scale to full NIF size (1.0 scale). This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  4. Quantifying Solar Cell Cracks in Photovoltaic Modules by Electroluminescence Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Spataru, Sergiu; Hacke, Peter; Sera, Dezso; Glick, Stephen; Kerekes, Tamas; Teodorescu, Remus

    2015-06-14

    This article proposes a method for quantifying the percentage of partially and totally disconnected solar cell cracks by analyzing electroluminescence images of the photovoltaic module taken under high- and low-current forward bias. The method is based on the analysis of the module's electroluminescence intensity distribution, applied at module and cell level. These concepts are demonstrated on a crystalline silicon photovoltaic module that was subjected to several rounds of mechanical loading and humidity-freeze cycling, causing increasing levels of solar cell cracks. The proposed method can be used as a diagnostic tool to rate cell damage or quality of modules after transportation. Moreover, the method can be automated and used in quality control for module manufacturers, installers, or as a diagnostic tool by plant operators and diagnostic service providers.

  5. Quantifying human response capabilities towards tsunami threats at community level

    Science.gov (United States)

    Post, J.; Mück, M.; Zosseder, K.; Wegscheider, S.; Taubenböck, H.; Strunz, G.; Muhari, A.; Anwar, H. Z.; Birkmann, J.; Gebert, N.

    2009-04-01

    Decision makers at the community level need detailed information on tsunami risks in their area. Knowledge of potential hazard impact, exposed elements such as people, critical facilities and lifelines, people's coping capacity and recovery potential is crucial to plan precautionary measures for adaptation and to mitigate potential impacts of tsunamis on society and the environment. A crucial point within a people-centred tsunami risk assessment is to quantify the human response capabilities towards tsunami threats. Based on this quantification and its spatial representation in maps, tsunami-affected and safe areas, difficult-to-evacuate areas, evacuation target points and evacuation routes can be assigned and used as an important contribution to, e.g., community-level evacuation planning. The major component in the quantification of human response capabilities towards tsunami impacts is the factor time. The human response capabilities depend on the estimated time of arrival (ETA) of a tsunami, the time until technical or natural warning signs (ToNW) can be received, the reaction time (RT) of the population (human understanding of a tsunami warning and the decision to take appropriate action), the evacuation time (ET, the time people need to reach a safe area) and the actually available response time (RsT = ETA - ToNW - RT). If RsT is larger than ET, people in the respective areas are able to reach a safe area and rescue themselves. Critical areas have RsT values equal to or even smaller than ET, and hence people within these areas will be directly affected by a tsunami. Quantifying the factor time is challenging, and an attempt is presented here. The ETA can be derived by analyzing pre-computed tsunami scenarios for a respective area. For ToNW we assume that the early warning center is able to fulfil the Indonesian presidential decree to issue a warning within 5 minutes. RT is difficult to estimate, as human intrinsic factors such as educational level, beliefs, tsunami knowledge and experience
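
    The time budget described above is a simple arithmetic relation. The sketch below flags areas where the available response time falls short of the required evacuation time; all numerical values are invented placeholders, not figures from the study.

    ```python
    def available_response_time(eta_min, tonw_min, rt_min):
        """RsT = ETA - ToNW - RT, all in minutes."""
        return eta_min - tonw_min - rt_min

    # Hypothetical area: tsunami arrives 35 min after the quake, warning issued
    # within 5 min (per the presidential decree), 10 min average reaction time,
    # and 25 min needed to reach safe ground.
    rst = available_response_time(eta_min=35, tonw_min=5, rt_min=10)
    et = 25
    print("safe" if rst > et else "critical")  # critical: 20 min < 25 min
    ```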

  6. Novel Material to Quantify Sharpness and Traction of Vitreous Cutters

    Directory of Open Access Journals (Sweden)

    Nazafarin Honarpisheh

    2011-01-01

    There is no available method for evaluating the cutting quality of vitreotomes. The available methods of assessment allow only indirect judgment of their quality and are difficult to apply in clinical practice. We propose using a collagen film with a maximum thickness of 1-2 microns to test the sharpness of instruments under conditions resembling clinical ones. The collagen film is fixed by a special device, placed in physiological saline, and then cut under the operating microscope. This shows whether the cutting edges are sharp enough and whether the collagen film is cut smoothly. We also use an Electroforce 3100 machine and Dynamic Mechanical Analysis software to quantify the vitreoretinal force applied to the retina during vitrectomy.

  7. Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity

    Science.gov (United States)

    Zhang, Jihui; Xu, Junqin

    A supply chain is a special kind of complex network. Its complexity and uncertainty make it very difficult to control and manage. Supply chains are faced with a rising complexity of products, structures, and processes. Because of the strong link between a supply chain’s complexity and its efficiency, supply chain complexity management has become a major challenge of today’s business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a ‘Supply Chain Network Analysis’ (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and subsequently investigated by network analysis. The results of this study show that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.

  8. Quantifying Mapping Orbit Performance in the Vicinity of Primitive Bodies

    Science.gov (United States)

    Pavlak, Thomas A.; Broschart, Stephen B.; Lantoine, Gregory

    2015-01-01

    Predicting and quantifying the capability of mapping orbits in the vicinity of primitive bodies is challenging given the complex orbit geometries that exist and the irregular shape of the bodies themselves. This paper employs various quantitative metrics to characterize the performance and relative effectiveness of various types of mapping orbits including terminator, quasi-terminator, hovering, ping-pong, and conic-like trajectories. Metrics of interest include surface area coverage, lighting conditions, and the variety of viewing angles achieved. The metrics discussed in this investigation are intended to enable mission designers and project stakeholders to better characterize candidate mapping orbits during preliminary mission formulation activities. The goal of this investigation is to understand the trade space associated with carrying out remote-sensing campaigns at small primitive bodies in the context of a robotic space mission. Specifically, this study seeks to understand the surface viewing geometries, ranges, etc. that are available from several commonly proposed mapping orbit architectures.

  9. Quantifiable Assessment of SWNT Dispersion in Polymer Composites

    Science.gov (United States)

    Park, Cheol; Kim, Jae-Woo; Wise, Kristopher E.; Working, Dennis; Siochi, Mia; Harrison, Joycelyn; Gibbons, Luke; Siochi, Emilie J.; Lillehei, Peter T.; Cantrell, Sean; Cantrell, John

    2007-01-01

    NASA LaRC has established a new protocol for visualizing nanomaterials in structural polymer matrix resins. Using this new technique and reconstructing the 3D distribution of the nanomaterials allows us to compare this distribution against a theoretically perfect distribution. Additional tertiary structural information can now be obtained and quantified with electron tomography studies. These tools will be necessary to establish the structure-function relationships between the nanoscale and the bulk, and will also help define the critical length scales needed for functional properties. Field-ready tool development and calibration can begin by using these same samples and comparing the response against gold standards of good and bad dispersion.

  10. Review: quantifying mitochondrial dysfunction in complex diseases of aging.

    Science.gov (United States)

    Horan, Martin P; Pichaud, Nicolas; Ballard, J William O

    2012-10-01

    There is accumulating evidence that mitochondrial respiratory malfunction is associated with aging-associated complex diseases. However, progress in our understanding of these diseases has been hampered by the sensitivity and throughput of systems employed to quantify dysfunction and inherent limitations of the biological systems studied. In this review, we describe and contrast two methodologies that have been developed for measuring mitochondrial function to address the need for improved sensitivity and increased throughput. We then consider the utility of each methodology in studying three biological systems: isolated mitochondria, cultured cells, and cell fibers and tissues. Finally, we discuss the application of each methodology in the study of mitochondrial dysfunction in Alzheimer's disease, type 2 diabetes mellitus, and aging-associated autophagy impairment and mitochondrial malfunction. We conclude that the methodologies are complementary, and researchers may need to examine multiple biological systems to unravel complex diseases of aging.

  11. Quantifying Variability of Manual Annotation in Cryo-Electron Tomograms.

    Science.gov (United States)

    Hecksel, Corey W; Darrow, Michele C; Dai, Wei; Galaz-Montoya, Jesús G; Chin, Jessica A; Mitchell, Patrick G; Chen, Shurui; Jakana, Jemba; Schmid, Michael F; Chiu, Wah

    2016-06-01

    Although acknowledged to be variable and subjective, manual annotation of cryo-electron tomography data is commonly used to answer structural questions and to create a "ground truth" for evaluation of automated segmentation algorithms. Validation of such annotation is lacking, but is critical for understanding the reproducibility of manual annotations. Here, we used voxel-based similarity scores for a variety of specimens, ranging in complexity and segmented by several annotators, to quantify the variation among their annotations. In addition, we have identified procedures for merging annotations to reduce variability, thereby increasing the reliability of manual annotation. Based on our analyses, we find that it is necessary to combine multiple manual annotations to increase the confidence level for answering structural questions. We also make recommendations to guide algorithm development for automated annotation of features of interest.
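
    Voxel-based similarity between two annotations is typically scored with measures such as the Dice coefficient; the abstract does not state which voxel-wise score was used here, so the sketch below is only a generic illustration of the idea, including a simple majority-vote merge of several annotations. The binary masks and the vote threshold are assumptions.

    ```python
    import numpy as np

    def dice(mask_a, mask_b):
        """Dice similarity of two binary voxel annotations: 2|A∩B| / (|A| + |B|)."""
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    def merge_by_vote(masks, threshold=0.5):
        """Merge annotations: keep voxels marked by >= threshold of annotators."""
        stack = np.stack([m.astype(bool) for m in masks])
        return stack.mean(axis=0) >= threshold

    # Three hypothetical annotators labeling the same 3D volume:
    rng = np.random.default_rng(0)
    annotations = [rng.random((20, 20, 20)) > 0.7 for _ in range(3)]
    consensus = merge_by_vote(annotations)
    for i, ann in enumerate(annotations):
        print(f"annotator {i} vs consensus: {dice(ann, consensus):.2f}")
    ```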

  12. Quantifying greenhouse gas emissions from waste treatment facilities

    DEFF Research Database (Denmark)

    Mønster, Jacob

    ... gas, and that this error becomes smaller with increasing measurement distance. A measurement protocol was developed and the methane emission was quantified from a series of landfills with different size, age and gas recovery and mitigation conditions. The landfills were measured between one and four times, and the emissions ranged from 2.6 to 60.8 kg methane per hour, with the lowest emissions from the oldest and smallest landfills and the highest emissions from the bigger landfills. It was not possible to correlate the measured emission with a single factor such as landfill age, size or mitigation actions. As an example, the highest emission was measured at a landfill with active methane recovery and utilization. Compared with national and European greenhouse gas reporting schemes, the measurements showed a large difference, with reporting ranging from a factor of 100 above to a factor of 10 below...

  13. Quantifying Long-term Retention of Excised Fat Grafts

    DEFF Research Database (Denmark)

    bxg165, bxg165; Ørholt, Mathias; Glovinski, Peter V.

    2017-01-01

    Background: Predicting the degree of fat graft retention is essential when planning reconstruction or augmentation with free fat grafting. Most surgeons observe volume loss over time after fat grafting; however, the portion lost to resorption after surgery is still poorly defined, and the time to reach steady state is unknown. Methods: The authors compiled a retrospective, longitudinal cohort of patients with vestibular schwannoma who had undergone ablative surgery and reconstruction with excised fat between the years 2006 and 2015. Fat volume retention was quantified by computed tomography. ... The average baseline graft volume was 18.1 ± 4.8 ml. The average time to reach steady state was 806 days after transplantation. By this time, the average fat graft retention was 50.6 percent (95 percent CI, 46.4 to 54.7 percent). No statistically significant association was found between baseline graft volume...

  14. Quantifying the effect of baryon physics on weak lensing tomography

    CERN Document Server

    Semboloni, Elisabetta; Schaye, Joop; van Daalen, Marcel P; McCarthy, Ian J

    2011-01-01

    We use matter power spectra from cosmological hydrodynamic simulations to quantify the effect of baryon physics on the weak gravitational lensing shear signal. The simulations consider a number of processes, such as radiative cooling, star formation, supernovae and feedback from active galactic nuclei (AGN). Van Daalen et al. (2011) used the same simulations to show that baryon physics, in particular the strong feedback that is required to solve the overcooling problem, modifies the matter power spectrum on scales relevant for cosmological weak lensing studies. As a result, the use of power spectra from dark matter simulations can lead to significant biases in the inferred cosmological parameters. We show that the typical biases are much larger than the precision with which future missions aim to constrain the dark energy equation of state, w_0. For instance, the simulation with AGN feedback, which reproduces X-ray and optical properties of groups of galaxies, gives rise to a ~40% bias in w_0. We demonstrate ...

  15. The allostery landscape: quantifying thermodynamic couplings in biomolecular systems

    CERN Document Server

    Cuendet, Michel A; LeVine, Michael V

    2016-01-01

    Allostery plays a fundamental role in most biological processes. However, little theory is available to describe it outside of two-state models. Here we use a statistical mechanical approach to show that the allosteric coupling between two collective variables is not a single number, but instead a two-dimensional thermodynamic coupling function that is directly related to the mutual information from information theory and the copula density function from probability theory. On this basis, we demonstrate how to quantify the contribution of specific energy terms to this thermodynamic coupling function, enabling a decomposition that reveals the mechanism of allostery. We illustrate the thermodynamic coupling function and its use by showing how allosteric coupling in the alanine dipeptide molecule contributes to the overall shape of the Φ/Ψ free energy surface, and by identifying the interactions that are necessary for this coupling.
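
    The link drawn here between thermodynamic coupling and mutual information can be illustrated numerically: for two sampled collective variables, the mutual information of their joint histogram quantifies their coupling. This is a rough sketch of that general idea, not the paper's coupling function; the binning and the synthetic Gaussian samples are assumptions.

    ```python
    import numpy as np

    def mutual_information(x, y, bins=20):
        """Estimate I(X;Y) in nats from samples of two collective variables."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
        py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
        mask = pxy > 0
        return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

    # Two coupled dihedral-like variables (synthetic stand-ins for Phi/Psi):
    rng = np.random.default_rng(1)
    phi = rng.normal(size=50_000)
    psi = 0.8 * phi + 0.6 * rng.normal(size=50_000)  # correlated with phi
    print(mutual_information(phi, psi))              # > 0: the variables are coupled
    ```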

  16. Quantifying classical entanglement using polarimetry: Spatially-inhomogeneously polarized beams

    CERN Document Server

    Samlan, C T

    2015-01-01

    Generation of spatially inhomogeneously polarized (SIP) beams due to polarization-spatial-mode nonseparability is demonstrated using a polarization Sagnac interferometer. Counter-propagating horizontal (H) and vertical (V) polarized Laguerre-Gaussian beams of opposite topological charges (LG01), corresponding to the two binary degrees of freedom (DoF), are coherently superposed in the interferometer. Quantum-inspired entanglement witnesses and quantifiers, such as Bell state measurement, violation of the CHSH form of Bell's inequality, and concurrence measurements, are experimentally measured to establish nonseparability between the two DoF. As the SIP beams are equivalent to bipartite pure states, pointwise Stokes polarimetry measurements are used to calculate the reduced coherency matrix corresponding to the polarization DoF. Using these, we calculate the linear entropy and the maximized Bell measure to quantify the degree of entanglement of the SIP beams.

  17. Quantifying similarity of pore-geometry in nanoporous materials

    Science.gov (United States)

    Lee, Yongjin; Barthel, Senja D.; Dłotko, Paweł; Moosavi, S. Mohamad; Hess, Kathryn; Smit, Berend

    2017-05-01

    In most applications of nanoporous materials the pore structure is as important as the chemical composition as a determinant of performance. For example, one can alter performance in applications like carbon capture or methane storage by orders of magnitude by only modifying the pore structure. For these applications it is therefore important to identify the optimal pore geometry and use this information to find similar materials. However, the mathematical language and tools to identify materials with similar pore structures, but different composition, has been lacking. We develop a pore recognition approach to quantify similarity of pore structures and classify them using topological data analysis. This allows us to identify materials with similar pore geometries, and to screen for materials that are similar to given top-performing structures. Using methane storage as a case study, we also show that materials can be divided into topologically distinct classes requiring different optimization strategies.

  18. Species determination - can we detect and quantify meat adulteration?

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2009-01-01

    Proper labelling of meat products is important to help fair trade and to enable consumers to make informed choices. However, it has been shown that labelling of species, expressed as weight/weight (w/w), on meat product labels was incorrect in more than 20% of cases. Enforcement of labelling regulations requires reliable analytical methods. Analytical methods are often based on protein or DNA measurements, which are not directly comparable to labelled meat expressed as w/w. This review discusses a wide range of analytical methods with focus on their ability to quantify and their limits of detection (LOD). In particular, problems associated with a correlation from quantitative DNA-based results to meat content (w/w) are discussed. The hope is to make researchers aware of the problems of expressing DNA results as meat content (w/w) in order to find better alternatives. One alternative...

  19. Quantifying chaotic dynamics from integrate-and-fire processes

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, A. N. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Saratov State Technical University, Politehnicheskaya Str. 77, 410054 Saratov (Russian Federation); Pavlova, O. N. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Mohammad, Y. K. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Tikrit University Salahudin, Tikrit Qadisiyah, University Str. P.O. Box 42, Tikrit (Iraq); Kurths, J. [Potsdam Institute for Climate Impact Research, Telegraphenberg A 31, 14473 Potsdam (Germany); Institute of Physics, Humboldt University Berlin, 12489 Berlin (Germany)

    2015-01-15

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easily performed at high firing rates. When the firing rate is low, a correct estimation of the Lyapunov exponents (LEs) describing the dynamical features of complex oscillations reflected in IF ISI sequences becomes more complicated. In this work we discuss peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimation of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.

  20. Quantifying the exploratory behaviour of Amphibalanus amphitrite cyprids.

    Science.gov (United States)

    Chaw, Kuan Chun; Birch, William R

    2009-10-01

    The behavioural response of cypris larvae from A. amphitrite (=Balanus amphitrite) exploring three model glass surfaces is quantified by close-range microscopy. Step length and step duration measurements reveal a response to both surface properties and flow. Without flow, 2-day-old cyprids took larger steps with shorter step duration on hydrophilic glass surfaces (bare and NH2-treated) vs hydrophobic glass (CH3-treated). These parameters suggest a more detailed, local inspection of hydrophobic surfaces and a more extensive exploration for hydrophilic surfaces. Cyprids under flow took longer steps and exhibited shorter probing times on hydrophobic glass. On hydrophilic glass, cyprids increased their step duration under flow. This active response is attributed to drag and lift forces challenging the cyprids' temporary anchoring to the substratum. Seven-day-old cyprids showed almost no discrimination between the model surfaces. Microscopic-scale observation of cyprid exploration is expected to provide new insights into interactions between cyprids and surfaces.

  1. Quantifying the Impact of Unavailability in Cyber-Physical Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [Université de Tunis El Manar, Tunisia; Abercrombie, Robert K [ORNL; Sheldon, Federick T. [University of Memphis; Mili, Ali [New Jersey Insitute of Technology

    2014-01-01

    The Supervisory Control and Data Acquisition (SCADA) system discussed in this work manages a distributed control network for the Tunisian Electric & Gas Utility. The network is dispersed over a large geographic area that monitors and controls the flow of electricity/gas from both remote and centralized locations. The availability of the SCADA system in this context is critical to ensuring the uninterrupted delivery of energy, including safety, security, continuity of operations and revenue. Such SCADA systems are the backbone of national critical cyber-physical infrastructures. Herein, we propose adapting the Mean Failure Cost (MFC) metric for quantifying the cost of unavailability. This new metric combines the classic availability formulation with MFC. The resulting metric, so-called Econometric Availability (EA), offers a computational basis to evaluate a system in terms of the gain/loss ($/hour of operation) that affects each stakeholder due to unavailability.
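
    As a rough numerical illustration of the idea, each stakeholder's mean failure cost can be weighted by the fraction of time the system is unavailable to obtain a loss rate in $/hour. The formula below is only a plausible reading of the abstract, not the authors' published derivation, and all stakeholder names and dollar figures are invented.

    ```python
    def econometric_availability(mfc_per_hour, availability):
        """Stakeholder loss rate ($/hour) attributable to unavailability.

        mfc_per_hour: mean failure cost of the stakeholder when the system is down.
        availability: fraction of time the system is up (classic A = MTBF/(MTBF+MTTR)).
        """
        return mfc_per_hour * (1.0 - availability)

    # Hypothetical stakeholders of the SCADA system:
    stakeholders = {"grid operator": 12_000.0, "gas dispatcher": 7_500.0}
    availability = 0.995  # 99.5% uptime
    for name, mfc in stakeholders.items():
        print(name, econometric_availability(mfc, availability), "$/hour")
    ```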

  2. Quantifying Interparticle Forces and Heterogeneity in 3D Granular Materials.

    Science.gov (United States)

    Hurley, R C; Hall, S A; Andrade, J E; Wright, J

    2016-08-26

    Interparticle forces in granular materials are intimately linked to mechanical properties and are known to self-organize into heterogeneous structures, or force chains, under external load. Despite progress in understanding the statistics and spatial distribution of interparticle forces in recent decades, a systematic method for measuring forces in opaque, three-dimensional (3D), frictional, stiff granular media has yet to emerge. In this Letter, we present results from an experiment that combines 3D x-ray diffraction, x-ray tomography, and a numerical force inference technique to quantify interparticle forces and their heterogeneity in an assembly of quartz grains undergoing a one-dimensional compression cycle. Forces exhibit an exponential decay above the mean and partition into strong and weak networks. We find a surprising inverse relationship between macroscopic load and the heterogeneity of interparticle forces, despite the clear emergence of two force chains that span the system.

  3. Front tracking for characterizing and quantifying reactive mixing

    Science.gov (United States)

    Kelley, Douglas; Nevins, Thomas

    2016-11-01

    Mixing in industrial chemical reactors involves complicated interactions between advection, reaction, and diffusion that are difficult to simulate or measure in detail. However, in large-Damköhler-number systems which show sharp fronts between reacted and unreacted regions, reactor dynamics might be more simply and usefully characterized in terms of the reaction fronts themselves. In fact, prior work has already shown that the reaction rate and material diffusivity can be calculated directly if front speed and front thickness are known. We have developed methods to optically track reaction fronts, measuring their speed and thickness throughout space and time. We will present such measurements in both simulation and experiment, consider their statistics, and discuss future efforts to characterize and quantify mixing in chemical reactors.
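
    The claim that reaction rate and diffusivity follow from front speed and thickness can be made concrete with the classic FKPP scalings v ~ sqrt(D*k) and l ~ sqrt(D/k), which invert to D ~ v*l and k ~ v/l up to order-one, model-dependent prefactors. The sketch below assumes those scalings (the abstract does not specify which relations the authors use) and invented measurement values.

    ```python
    def infer_reaction_and_diffusion(front_speed, front_thickness):
        """Invert the FKPP-type scalings v ~ sqrt(D*k), l ~ sqrt(D/k).

        Returns (D, k) up to order-one, model-dependent prefactors.
        """
        diffusivity = front_speed * front_thickness    # D ~ v * l
        reaction_rate = front_speed / front_thickness  # k ~ v / l
        return diffusivity, reaction_rate

    # Hypothetical tracked-front measurements: 0.1 mm/s speed, 0.5 mm thickness.
    D, k = infer_reaction_and_diffusion(front_speed=0.1, front_thickness=0.5)
    print(f"D ~ {D} mm^2/s, k ~ {k} 1/s")
    ```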

  4. Quantifying the clinical relevance of a laboratory observer performance paradigm.

    Science.gov (United States)

    Chakraborty, D P; Haygood, T M; Ryan, J; Marom, E M; Evanoff, M; McEntee, M F; Brennan, P C

    2012-09-01

    Laboratory observer performance measurements, receiver operating characteristic (ROC) and free-response ROC (FROC), differ from actual clinical interpretations in several respects, which could compromise their clinical relevance. The objective of this study was to develop a method for quantifying the clinical relevance of a laboratory paradigm and apply it to compare the ROC and FROC paradigms in a nodule detection task. The original prospective interpretations of 80 digital chest radiographs were classified by the truth panel as correct (C=1) or incorrect (C=0), depending on correlation with additional imaging, and the average of C was interpreted as the clinical figure of merit. FROC data were acquired for 21 radiologists and ROC data were inferred using the highest ratings. The areas under the ROC and alternative FROC curves were used as laboratory figures of merit. Bootstrap analysis was conducted to estimate conventional agreement measures between laboratory and clinical figures of merit. Also computed was a pseudovalue-based image-level correctness measure of the laboratory interpretations, whose association with C, as measured by the area (rAUC) under an appropriately defined relevance ROC curve, serves as a measure of the clinical relevance of a laboratory paradigm. Low correlations (e.g. κ=0.244) and near-chance-level rAUC values (e.g. 0.598), attributable to differences between the clinical and laboratory paradigms, were observed. The absolute width of the confidence interval was 0.38 for the interparadigm differences of the conventional measures and 0.14 for the difference of the rAUCs. The rAUC measure was consistent with the traditional measures but was more sensitive to the differences in clinical relevance. A new relevance ROC method for quantifying the clinical relevance of a laboratory paradigm is proposed.

  5. Current challenges in quantifying preferential flow through the vadose zone

    Science.gov (United States)

    Koestel, John; Larsbo, Mats; Jarvis, Nick

    2017-04-01

    In this presentation, we give an overview of current challenges in quantifying preferential flow through the vadose zone. A review of the literature suggests that current generation models do not fully reflect the present state of process understanding and empirical knowledge of preferential flow. We believe that the development of improved models will be stimulated by the increasingly widespread application of novel imaging technologies as well as future advances in computational power and numerical techniques. One of the main challenges in this respect is to bridge the large gap between the scales at which preferential flow occurs (pore to Darcy scales) and the scale of interest for management (fields, catchments, regions). Studies at the pore scale are being supported by the development of 3-D non-invasive imaging and numerical simulation techniques. These studies are leading to a better understanding of how macropore network topology and initial/boundary conditions control key state variables like matric potential and thus the strength of preferential flow. Extrapolation of this knowledge to larger scales would require support from theoretical frameworks such as key concepts from percolation and network theory, since we lack measurement technologies to quantify macropore networks at these large scales. Linked hydro-geophysical measurement techniques that produce highly spatially and temporally resolved data enable investigation of the larger-scale heterogeneities that can generate preferential flow patterns at pedon, hillslope and field scales. At larger regional and global scales, improved methods of data-mining and analyses of large datasets (machine learning) may help in parameterizing models as well as lead to new insights into the relationships between soil susceptibility to preferential flow and site attributes (climate, land uses, soil types).

  6. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    Science.gov (United States)

    Gove, Jamison M; Williams, Gareth J; McManus, Margaret A; Heron, Scott F; Sandin, Stuart A; Vetter, Oliver J; Foley, David G

    2013-01-01

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations; Jarvis, Howland, Baker, Palmyra and Kingman are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations. These metrics will help

  7. Quantifying climatological ranges and anomalies for Pacific coral reef ecosystems.

    Directory of Open Access Journals (Sweden)

    Jamison M Gove

    Coral reef ecosystems are exposed to a range of environmental forcings that vary on daily to decadal time scales and across spatial scales spanning from reefs to archipelagos. Environmental variability is a major determinant of reef ecosystem structure and function, including coral reef extent and growth rates, and the abundance, diversity, and morphology of reef organisms. Proper characterization of environmental forcings on coral reef ecosystems is critical if we are to understand the dynamics and implications of abiotic-biotic interactions on reef ecosystems. This study combines high-resolution bathymetric information with remotely sensed sea surface temperature, chlorophyll-a and irradiance data, and modeled wave data to quantify environmental forcings on coral reefs. We present a methodological approach to develop spatially constrained, island- and atoll-scale metrics that quantify climatological range limits and anomalous environmental forcings across U.S. Pacific coral reef ecosystems. Our results indicate considerable spatial heterogeneity in climatological ranges and anomalies across 41 islands and atolls, with emergent spatial patterns specific to each environmental forcing. For example, wave energy was greatest at northern latitudes and generally decreased with latitude. In contrast, chlorophyll-a was greatest at reef ecosystems proximate to the equator and northern-most locations, showing little synchrony with latitude. In addition, we find that the reef ecosystems with the highest chlorophyll-a concentrations; Jarvis, Howland, Baker, Palmyra and Kingman are each uninhabited and are characterized by high hard coral cover and large numbers of predatory fishes. Finally, we find that scaling environmental data to the spatial footprint of individual islands and atolls is more likely to capture local environmental forcings, as chlorophyll-a concentrations decreased at relatively short distances (>7 km) from 85% of our study locations

  8. Quantifying Particle Numbers and Mass Flux in Drifting Snow

    Science.gov (United States)

    Crivelli, Philip; Paterna, Enrico; Horender, Stefan; Lehning, Michael

    2016-12-01

    We compare two of the most common methods of quantifying mass flux, particle numbers and particle-size distribution for drifting snow events: the snow-particle counter (SPC), a laser-diode-based particle detector, and particle tracking velocimetry based on digital shadowgraphic imaging. The two methods were correlated for mass flux and particle number flux. For the SPC measurements, the device was calibrated by the manufacturer beforehand. The shadowgraphic imaging method measures particle size and velocity directly from consecutive images, and before each new test the image pixel length is newly calibrated. A calibration study with artificially scattered sand particles and glass beads provided suitable settings for the shadowgraphic imaging as well as a first correlation of the two methods in a controlled environment. In addition, using snow collected in trays during snowfall, several experiments were performed to observe drifting snow events in a cold wind tunnel. The results demonstrate a high correlation between the mass fluxes obtained in the calibration studies (r ≥ 0.93) and a good correlation for the drifting snow experiments (r ≥ 0.81). The impact of measurement settings is discussed in order to reliably quantify particle numbers and mass flux in drifting snow. The study was designed and performed to optimize the settings of the digital shadowgraphic imaging system for both the acquisition and the processing of particles in a drifting snow event. Our results suggest that these optimal settings can be transferred to different imaging set-ups to investigate sediment transport processes.

  9. A robust nonparametric method for quantifying undetected extinctions.

    Science.gov (United States)

    Chisholm, Ryan A; Giam, Xingli; Sadanandan, Keren R; Fung, Tak; Rheindt, Frank E

    2016-06-01

    How many species have gone extinct in modern times before being described by science? To answer this question, and thereby get a full assessment of humanity's impact on biodiversity, statistical methods that quantify undetected extinctions are required. Such methods have been developed recently, but they are limited by their reliance on parametric assumptions; specifically, they assume the pools of extant and undetected species decay exponentially, whereas real detection rates vary temporally with survey effort and real extinction rates vary with the waxing and waning of threatening processes. We devised a new, nonparametric method for estimating undetected extinctions. As inputs, the method requires only the first and last date at which each species in an ensemble was recorded. As outputs, the method provides estimates of the proportion of species that have gone extinct, detected, or undetected and, in the special case where the number of undetected extant species in the present day is assumed close to zero, of the absolute number of undetected extinct species. The main assumption of the method is that the per-species extinction rate is independent of whether a species has been detected or not. We applied the method to the resident native bird fauna of Singapore. Of 195 recorded species, 58 (29.7%) have gone extinct in the last 200 years. Our method projected that an additional 9.6 species (95% CI 3.4, 19.8) have gone extinct without first being recorded, implying a true extinction rate of 33.0% (95% CI 31.0%, 36.2%). We provide R code for implementing our method. Because our method does not depend on strong assumptions, we expect it to be broadly useful for quantifying undetected extinctions. © 2016 Society for Conservation Biology.

  10. Quantifying Selective Pressures Driving Bacterial Evolution Using Lineage Analysis

    Science.gov (United States)

    Lambert, Guillaume; Kussell, Edo

    2015-01-01

    Organisms use a variety of strategies to adapt to their environments and maximize long-term growth potential, but quantitative characterization of the benefits conferred by the use of such strategies, as well as their impact on the whole population's rate of growth, remains challenging. Here, we use a path-integral framework that describes how selection acts on lineages—i.e., the life histories of individuals and their ancestors—to demonstrate that lineage-based measurements can be used to quantify the selective pressures acting on a population. We apply this analysis to Escherichia coli bacteria exposed to cyclical treatments of carbenicillin, an antibiotic that interferes with cell-wall synthesis and affects cells in an age-dependent manner. While the extensive characterization of the life history of thousands of cells is necessary to accurately extract the age-dependent selective pressures caused by carbenicillin, the same measurement can be recapitulated using lineage-based statistics of a single surviving cell. Population-wide evolutionary pressures can be extracted from the properties of the surviving lineages within a population, providing an alternative and efficient procedure to quantify the evolutionary forces acting on a population. Importantly, this approach is not limited to age-dependent selection, and the framework can be generalized to detect signatures of other trait-specific selection using lineage-based measurements. Our results establish a powerful way to study the evolutionary dynamics of life under selection and may be broadly useful in elucidating selective pressures driving the emergence of antibiotic resistance and the evolution of survival strategies in biological systems.

  11. THE SEGUE K GIANT SURVEY. III. QUANTIFYING GALACTIC HALO SUBSTRUCTURE

    Energy Technology Data Exchange (ETDEWEB)

    Janesh, William; Morrison, Heather L.; Ma, Zhibo; Harding, Paul [Department of Astronomy, Case Western Reserve University, Cleveland, OH 44106 (United States); Rockosi, Constance [UCO/Lick Observatory, University of California, Santa Cruz, 1156 High Street, Santa Cruz, CA 95064 (United States); Starkenburg, Else [Department of Physics and Astronomy, University of Victoria, P.O. Box 1700, STN CSC, Victoria BC V8W 3P6 (Canada); Xue, Xiang Xiang; Rix, Hans-Walter [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Beers, Timothy C. [Department of Physics and JINA Center for the Evolution of the Elements, University of Notre Dame, Notre Dame, IN 46556 (United States); Johnson, Jennifer [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Lee, Young Sun [Department of Astronomy and Space Science, Chungnam National University, Daejeon 34134 (Korea, Republic of); Schneider, Donald P. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2016-01-10

    We statistically quantify the amount of substructure in the Milky Way stellar halo using a sample of 4568 halo K giant stars at Galactocentric distances ranging over 5–125 kpc. These stars have been selected photometrically and confirmed spectroscopically as K giants from the Sloan Digital Sky Survey’s Sloan Extension for Galactic Understanding and Exploration project. Using a position–velocity clustering estimator (the 4distance) and a model of a smooth stellar halo, we quantify the amount of substructure in the halo, divided by distance and metallicity. Overall, we find that the halo as a whole is highly structured. We also confirm earlier work using blue horizontal branch (BHB) stars which showed that there is an increasing amount of substructure with increasing Galactocentric radius, and additionally find that the amount of substructure in the halo increases with increasing metallicity. Comparing to resampled BHB stars, we find that K giants and BHBs have similar amounts of substructure over equivalent ranges of Galactocentric radius. Using a friends-of-friends algorithm to identify members of individual groups, we find that a large fraction (∼33%) of grouped stars are associated with Sgr, and identify stars belonging to other halo star streams: the Orphan Stream, the Cetus Polar Stream, and others, including previously unknown substructures. A large fraction of sample K giants (more than 50%) are not grouped into any substructure. We find also that the Sgr stream strongly dominates groups in the outer halo for all except the most metal-poor stars, and suggest that this is the source of the increase of substructure with Galactocentric radius and metallicity.
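
    As a concrete illustration of the grouping step, a minimal friends-of-friends sketch in Python follows. It links points in a plain Euclidean space with a fixed linking length; the actual study clusters stars with the position-velocity 4distance estimator, so the metric, the linking length, and the toy data here are all placeholder assumptions:

```python
import numpy as np

def friends_of_friends(points, linking_length):
    """Single-linkage grouping: two points belong to the same group if they
    are connected by a chain of pairs closer than linking_length."""
    n = len(points)
    parent = np.arange(n)                   # union-find parent array

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < linking_length:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[rj] = ri         # merge the two groups

    return np.array([find(i) for i in range(n)])

# toy usage: two clumps and one isolated point
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.1], [20.0, 20.0]])
print(friends_of_friends(pts, linking_length=0.5))   # [0 0 2 2 4]
```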

  12. Quantifying Uncertainty of Wind Power Production Through an Analog Ensemble

    Science.gov (United States)

    Shahriari, M.; Cervone, G.

    2016-12-01

    The Analog Ensemble (AnEn) method is used to generate probabilistic weather forecasts that quantify the uncertainty in power estimates at hypothetical wind farm locations. The data are from the NREL Eastern Wind Dataset, which includes more than 1,300 modeled wind farms. The AnEn model uses a two-dimensional grid to estimate the probability distribution of wind speed (the predictand) given the values of predictor variables such as temperature, pressure, geopotential height, and the U- and V-components of wind. The meteorological data are taken from the NCEP GFS, which is available on a 0.25 degree grid resolution. The methodology first divides the data into two parts: a training period and a verification period. The AnEn selects a point in the verification period and searches for the best matching estimates (analogs) in the training period. The predictand values at those analogs form the ensemble prediction for the point in the verification period. The model provides a grid of wind speed values and the uncertainty (probability index) associated with each estimate. Each wind farm is associated with a probability index which quantifies the degree of difficulty in estimating wind power. Further, the uncertainty in estimation is related to other factors such as topography, land cover and wind resources. This is achieved by using a GIS system to compute the correlation between the probability index and geographical characteristics. This study has significant applications for investors in the renewable energy sector, especially wind farm developers. A lower level of uncertainty facilitates the process of submitting bids into day-ahead and real-time electricity markets. Thus, building wind farms in regions with lower levels of uncertainty will reduce real-time operational risks and create a hedge against volatile real-time prices. Further, the links between wind estimate uncertainty and factors such as topography and wind resources provide wind farm developers with valuable
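
    The analog-selection logic lends itself to a compact sketch. The following Python fragment is a simplified stand-in, assuming an unweighted Euclidean distance over the predictors and synthetic data; the operational AnEn additionally weights predictors and matches over short time windows:

```python
import numpy as np

def analog_ensemble(train_predictors, train_predictand, verif_predictors, k=20):
    """For each verification case, return the predictand observed at the k
    training cases with the smallest Euclidean predictor distance (analogs)."""
    ensembles = []
    for x in verif_predictors:
        dist = np.linalg.norm(train_predictors - x, axis=1)
        analogs = np.argsort(dist)[:k]        # indices of the k best analogs
        ensembles.append(train_predictand[analogs])
    return np.array(ensembles)                # shape (n_verif, k)

# toy usage: five predictors, as in the abstract (T, p, geopotential, U, V)
rng = np.random.default_rng(0)
Xtr = rng.normal(size=(1000, 5))              # training-period predictors
ytr = rng.gamma(2.0, 4.0, size=1000)          # stand-in wind speeds
Xv = rng.normal(size=(10, 5))                 # verification-period predictors
ens = analog_ensemble(Xtr, ytr, Xv)
print(ens.mean(axis=1))                       # ensemble mean wind speed
print(ens.std(axis=1))                        # spread = uncertainty estimate
```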

  13. Quantifying Particle Numbers and Mass Flux in Drifting Snow

    Science.gov (United States)

    Crivelli, Philip; Paterna, Enrico; Horender, Stefan; Lehning, Michael

    2016-06-01

    We compare two of the most common methods of quantifying mass flux, particle numbers and particle-size distribution for drifting snow events: the snow-particle counter (SPC), a laser-diode-based particle detector, and particle tracking velocimetry based on digital shadowgraphic imaging. The two methods were correlated for mass flux and particle number flux. For the SPC measurements, the device was calibrated by the manufacturer beforehand. The shadowgraphic imaging method measures particle size and velocity directly from consecutive images, and before each new test the image pixel length is newly calibrated. A calibration study with artificially scattered sand particles and glass beads provides suitable settings for the shadowgraphic imaging as well as a first correlation of the two methods in a controlled environment. In addition, using snow collected in trays during snowfall, several experiments were performed to observe drifting snow events in a cold wind tunnel. The results demonstrate a high correlation between the mass flux obtained for the calibration studies (r ≥ 0.93) and good correlation for the drifting snow experiments (r ≥ 0.81). The impact of measurement settings is discussed in order to reliably quantify particle numbers and mass flux in drifting snow. The study was designed and performed to optimize the settings of the digital shadowgraphic imaging system for both the acquisition and the processing of particles in a drifting snow event. Our results suggest that these optimal settings can be transferred to different imaging set-ups to investigate sediment transport processes.

  14. Quantifiably secure power grid operation, management, and evolution

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne.; Watson, Jean-Paul; Silva Monroy, Cesar Augusto; Gramacy, Robert B.

    2013-09-01

    This report summarizes findings and results of the Quantifiably Secure Power Grid Operation, Management, and Evolution LDRD. The focus of the LDRD was to develop decision-support technologies to enable rational and quantifiable risk management for two key grid operational timescales: scheduling (day-ahead) and planning (month-to-year-ahead). Risk or resiliency metrics are foundational in this effort. The 2003 Northeast Blackout investigative report stressed the criticality of enforceable metrics for system resiliency: the grid's ability to satisfy demands subject to perturbation. However, we neither have well-defined risk metrics for addressing the pervasive uncertainties in a renewable energy era, nor decision-support tools for their enforcement, which severely impacts efforts to rationally improve grid security. For day-ahead unit commitment, decision-support tools must account for topological security constraints, loss-of-load (economic) costs, and supply and demand variability, especially given high renewables penetration. For long-term planning, transmission and generation expansion must ensure realized demand is satisfied for various projected technological, climate, and growth scenarios. The decision-support tools investigated in this project paid particular attention to tail-oriented risk metrics for explicitly addressing high-consequence events. Historically, decision-support tools for the grid consider expected cost minimization, largely ignoring risk and instead penalizing loss-of-load through artificial parameters. The technical focus of this work was the development of scalable solvers for enforcing risk metrics. Advanced stochastic programming solvers were developed to address generation and transmission expansion and unit commitment, minimizing cost subject to pre-specified risk thresholds. Particular attention was paid to renewables, where security critically depends on production and demand prediction accuracy. To address this
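
    To make "minimizing cost subject to pre-specified risk thresholds" concrete, the sketch below solves a deliberately tiny capacity problem with a CVaR constraint, using the standard Rockafellar-Uryasev linearization. All numbers are hypothetical, and the toy model is far simpler than the unit commitment and expansion problems described here:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical scenario data: demand outcomes with equal probability.
d = np.array([80.0, 90.0, 100.0, 110.0, 120.0])   # demand per scenario
p = np.full(5, 0.2)                                # scenario probabilities
c_cap, voll = 1.0, 10.0        # capacity cost; value of lost load (per unit)
alpha, cvar_cap = 0.8, 50.0    # CVaR confidence level; risk threshold

S = len(d)
# Decision vector z = [x, eta, u_1..u_S]: capacity, VaR proxy, scenario excesses
cost = np.r_[c_cap, 0.0, np.zeros(S)]

A_ub, b_ub = [], []
for s in range(S):             # u_s >= voll*(d_s - x) - eta
    row = np.zeros(2 + S)
    row[0], row[1], row[2 + s] = -voll, -1.0, -1.0
    A_ub.append(row)
    b_ub.append(-voll * d[s])
row = np.zeros(2 + S)          # eta + sum_s p_s*u_s/(1-alpha) <= cvar_cap
row[1], row[2:] = 1.0, p / (1.0 - alpha)
A_ub.append(row)
b_ub.append(cvar_cap)

res = linprog(cost, A_ub=np.array(A_ub), b_ub=b_ub,
              bounds=[(0, None)] * (2 + S))
print(res.x[0], res.fun)       # capacity 115.0 for these numbers
```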

  15. Quantifying the Digital Traces of Hurricane Sandy on Flickr

    OpenAIRE

    Preis, Tobias; Moat, Helen Susannah; Bishop, Steven R.; Treleaven, Philip; Stanley, H. Eugene

    2013-01-01

    Society’s increasing interactions with technology are creating extensive “digital traces” of our collective human behavior. These new data sources are fuelling the rapid development of the new field of computational social science. To investigate user attention to the Hurricane Sandy disaster in 2012, we analyze data from Flickr, a popular website for sharing personal photographs. In this case study, we find that the number of photos taken and subsequently uploaded to Flickr with titles, desc...

  16. Quantifying Trust, Distrust, and Suspicion in Human-System Interactions

    Science.gov (United States)

    2015-10-26

  17. Quantifying price fluctuations in the Brazilian stock market

    Science.gov (United States)

    Tabak, B. M.; Takami, M. Y.; Cajueiro, D. O.; Petitinga, A.

    2009-01-01

    This paper investigates price fluctuations in the Brazilian stock market. We employ a recently developed methodology to test whether the Brazilian stock price returns present a power law distribution and find that we cannot reject such behavior. Empirical results for sub-partitions of the time series suggest that for most of the time the power law is not rejected, but that in some cases the data set does not conform to a power law distribution.
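
    The abstract does not reproduce the test itself, but the flavor of tail-exponent estimation can be sketched with the classical Hill estimator (not necessarily the methodology used in the paper); the data below are synthetic Student-t returns with a known tail index of 3:

```python
import numpy as np

def hill_estimator(returns, tail_fraction=0.02):
    """Hill (1975) estimate of the tail exponent alpha of |returns|, using
    the largest tail_fraction of the absolute observations."""
    x = np.sort(np.abs(returns))[::-1]          # descending magnitudes
    k = max(int(len(x) * tail_fraction), 2)     # number of tail observations
    return 1.0 / np.mean(np.log(x[:k] / x[k - 1]))

# usage on synthetic heavy-tailed data: Student-t with 3 dof has tail index 3
rng = np.random.default_rng(1)
r = rng.standard_t(df=3, size=100_000)
print(hill_estimator(r))   # roughly 3 (Hill has some finite-sample bias)
```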

  18. Baltimore WATERS Test Bed -- Quantifying Groundwater in Urban Areas

    Science.gov (United States)

    Welty, C.; Miller, A. J.; Ryan, R. J.; Crook, N.; Kerchkof, T.; Larson, P.; Smith, J.; Baeck, M. L.; Kaushal, S.; Belt, K.; McGuire, M.; Scanlon, T.; Warner, J.; Shedlock, R.; Band, L.; Groffman, P.

    2007-12-01

    The purpose of this project is to quantify the urban water cycle, with an emphasis on urban groundwater, using investigations at multiple spatial scales. The overall study focuses on the 171 sq km Gwynns Falls watershed, which spans an urban to rural gradient of land cover and is part of the Baltimore Ecosystem Study LTER. Within the Gwynns Falls, finer-scale studies focus on the 14.3 sq km Dead Run and its subwatersheds. A coarse-grid MODFLOW model has been set up to quantify groundwater flow magnitude and direction at the larger watershed scale. Existing wells in this urban area are sparse, but are being located through mining of USGS NWIS and local well data bases. Wet and dry season water level synoptics, stream seepage transects, and existing permeability data are being used in model calibration. In collaboration with CUAHSI HMF Geophysics, a regional-scale microgravity survey was conducted over the watershed in July 2007 and will be repeated in spring 2008. This will enable calculation of the change in groundwater levels for use in model calibration. At the smaller spatial scale (Dead Run catchment), three types of data have been collected to refine our understanding of the groundwater system. (1) Multiple bromide tracer tests were conducted along a 4 km reach of Dead Run under low-flow conditions to examine groundwater- surface water exchange as a function of land cover type and stream position in the watershed. The tests will be repeated under higher base flow conditions in early spring 2008. Tracer test data will be interpreted using the USGS OTIS model and results will be incorporated into the MODFLOW model. (2) Riparian zone geophysical surveys were carried out with support from CUAHSI HMF Geophysics to delineate depth to bedrock and the water table topography as a function of distance from the stream channel. Resistivity, ground penetrating radar, and seismic refraction surveys were run in ten transects across and around the stream channels. (3) A finer

  19. Quantifying the multiple, environmental benefits of reintroducing the Eurasian Beaver

    Science.gov (United States)

    Brazier, Richard; Puttock, Alan; Graham, Hugh; Anderson, Karen; Cunliffe, Andrew; Elliott, Mark

    2016-04-01

    Beavers are ecological engineers with an ability to modify the structure and flow of fluvial systems and create complex wetland environments with dams, ponds and canals. Consequently, beaver activity has potential for river restoration, management and the provision of multiple environmental ecosystem services including biodiversity, flood risk mitigation, water quality and sustainable drinking water provision. With the current debate surrounding the reintroduction of beavers into the United Kingdom, it is critical to monitor the impact of beavers upon the environment. We have developed and implemented a monitoring strategy to quantify the impact of reintroducing the Eurasian Beaver on multiple environmental ecosystem services and river systems at a range of scales. First, the experimental design and preliminary results will be presented from the Mid-Devon Beaver Trial, where a family of beavers has been introduced to a 3 ha enclosure situated upon a first order tributary of the River Tamar. The site was instrumented to monitor the flow rate and quality of water entering and leaving the site. Additionally, the impacts of beavers upon riparian vegetation structure and water/carbon storage were investigated. Preliminary results indicate that beaver activity, particularly the building of ponds and dams, increases water storage within the landscape and moderates the river response to rainfall. Baseflow is enhanced during dry periods and storm flow is attenuated, potentially reducing the risk of flooding downstream. Initial analysis of water quality indicates that water entering the site (running off intensively managed grasslands upslope) has higher suspended sediment loads and nitrate levels than that leaving the site, after moving through the series of beaver ponds. These results suggest beaver activity may also act as a means by which the negative impact of diffuse water pollution from agriculture can be mitigated, thus providing cleaner water in rivers downstream.

  20. Cost Behavior

    DEFF Research Database (Denmark)

    Hoffmann, Kira

    The objective of this dissertation is to investigate determinants and consequences of asymmetric cost behavior. Asymmetric cost behavior arises if the change in costs is different for increases in activity compared to equivalent decreases in activity. In this case, costs are termed “sticky” if the change is less when activity falls than when activity rises, whereas costs are termed “anti-sticky” if the change is more when activity falls than when activity rises. Understanding such cost behavior is especially relevant for decision-makers and financial analysts that rely on accurate cost information to facilitate resource planning and earnings forecasting. As such, this dissertation relates to the topic of firm profitability and the interpretation of cost variability. The dissertation consists of three parts that are written in the form of separate academic papers. The following section briefly summarizes...
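
    Asymmetric cost behavior is conventionally estimated with the Anderson-Banker-Janakiraman (2003) log-change regression, in which a negative interaction coefficient indicates sticky costs. The sketch below illustrates that canonical specification on synthetic data; it is not necessarily the dissertation's own model:

```python
import numpy as np

def abj_stickiness(cost, activity):
    """Fit dlog(cost) = b0 + b1*dlog(act) + b2*(dec*dlog(act)), where dec = 1
    in periods when activity fell; b2 < 0 indicates sticky costs."""
    dc = np.diff(np.log(cost))
    da = np.diff(np.log(activity))
    dec = (da < 0).astype(float)                 # activity-decrease indicator
    X = np.column_stack([np.ones_like(da), da, dec * da])
    beta, *_ = np.linalg.lstsq(X, dc, rcond=None)
    return beta                                  # [b0, b1, b2]

# synthetic sticky firm: costs rise 0.9%/1% activity up, fall only 0.5%/1% down
rng = np.random.default_rng(2)
act = np.exp(np.cumsum(rng.normal(0.0, 0.05, 200)))
da = np.diff(np.log(act))
dc = np.where(da >= 0, 0.9 * da, 0.5 * da) + rng.normal(0.0, 0.005, 199)
cost = np.exp(np.concatenate([[0.0], np.cumsum(dc)]))
print(abj_stickiness(cost, act))   # b1 near 0.9, b2 near -0.4
```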

  1. Behavioral addictions.

    Science.gov (United States)

    Robbins, T W; Clark, L

    2015-02-01

    Behavioral addictions are slowly becoming recognized as a valid category of psychiatric disorder as shown by the recent allocation of pathological gambling to this category in DSM-5. However, several other types of psychiatric disorder proposed to be examples of behavioral addictions have yet to be accorded this formal acknowledgment and are dispersed across other sections of the DSM-5. This brief review marks this important point in the evolution of this concept and looks to future investigation of behavioral addictions with the theoretical frameworks currently being used successfully to investigate substance addiction and obsessive-compulsive disorder, in a potentially new spectrum of impulsive-compulsive disorders. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Quantifying catchment water balances and their uncertainties by expert elicitation

    Science.gov (United States)

    Sebok, Eva; Refsgaard, Jens Christian; Warmink, Jord J.; Stisen, Simon; Høgh Jensen, Karsten

    2017-04-01

    The increasing demand on water resources necessitates more responsible and sustainable water management, requiring a thorough understanding of hydrological processes both on the small scale and on the catchment scale. On the catchment scale, the characterization of hydrological processes is often carried out by calculating a water balance based on the principle of mass conservation in hydrological fluxes. Assuming a perfect water balance closure and estimating one of these fluxes as a residual of the water balance is a common practice, although this estimate will contain uncertainties related to uncertainties in the other components. Water balance closure on the catchment scale is also an issue in Denmark; thus, it was one of the research objectives of the HOBE hydrological observatory, which has been collecting data in the Skjern river catchment since 2008. Water balance components in the 1050 km2 Ahlergaarde catchment and the nested 120 km2 Holtum catchment, located in the glacial outwash plain of the Skjern catchment, were estimated using a multitude of methods. As the collected data enable the complex assessment of uncertainty of both the individual water balance components and catchment-scale water balances, the expert elicitation approach was chosen to integrate the results of the hydrological observatory. This approach relies on the subjective opinion of experts whose available knowledge and experience about the subject allow them to integrate complex information from multiple sources. In this study 35 experts were involved in a multi-step elicitation process with the aim of (1) eliciting average annual values of water balance components for two nested catchments and quantifying the contribution of different sources of uncertainties to the total uncertainty in these average annual estimates; (2) calculating water balances for two catchments by reaching consensus among experts interacting in the form of group discussions. To address the complex problem of water balance closure

  3. Quantifying modern and ancient drainage basin erosion with detrital thermochronology

    Science.gov (United States)

    Ehlers, T. A.; Stock, G. M.; Rahl, J. M.; Farley, K. A.; van der Pluijm, B. A.

    2006-12-01

    Studies of drainage basin erosion and landform evolution are often limited by not knowing where sediment is sourced from and how erosion rates vary over different time scales. Detrital thermochronometer cooling ages collected from modern river sediments and basin deposits provide a promising tool to address these problems. We present two applications of detrital thermochronology to quantify: (1) spatial variations in erosion using modern river sediments; and (2) temporal variations in erosion calculated using syntectonic sedimentary deposits. In our first application, the elevation dependence of detrital apatite (U-Th)/He (AHe) ages is used to track the elevations where sediment is produced from bedrock. The ages are measured in river sediments from the mouth of two catchments in the Sierra Nevada, California, and used as sediment tracers to quantify spatial variations in erosion. We measured ~54 AHe single grain ages from each catchment. Measured AHe age probability density functions (PDFs) were compared with predicted PDFs, calculated by convolving catchment hypsometry with bedrock age-elevation relationships. Statistical comparison of the PDFs evaluates the spatial distribution of erosion in the catchments. Predicted and observed PDFs are statistically identical for the nonglaciated Inyo Creek catchment, indicating uniform erosion. However, a statistically significant lack of older ages is observed in the recently deglaciated Lone Pine catchment, suggesting sediment is derived from the lower half of the catchment; possibly due to sediment storage at higher elevations and/or enhanced erosion at intermediate elevations. Second, we evaluate the ability of detrital thermochronology to record transients in drainage basin erosion on million year time scales. A transient 1D thermal model is used to predict cooling ages in a syntectonic stratigraphic section where sediment is sourced from a region with temporally variable erosion. In simulations with transient erosion
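
    The prediction step, convolving catchment hypsometry with a bedrock age-elevation relationship and comparing the resulting age distributions, can be sketched as follows. Every number here (elevation range, age gradient, hypsometric curve, 6% analytical error) is a placeholder assumption, and a two-sample Kolmogorov-Smirnov test stands in for the paper's statistical comparison of PDFs:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical inputs: a linear bedrock age-elevation relationship and a
# Gaussian-shaped hypsometry (relative catchment area per elevation band).
z = np.linspace(1000.0, 4000.0, 301)                 # elevation grid (m)
hyps = np.exp(-0.5 * ((z - 2500.0) / 600.0) ** 2)
hyps /= hyps.sum()

def predicted_ages(n_grains, erosion_weights):
    """Predicted detrital age sample: grain source elevations drawn from the
    hypsometry times an erosion pattern, mapped through the age-elevation
    curve, with ~6% (1-sigma) analytical scatter added."""
    w = hyps * erosion_weights
    src = rng.choice(z, size=n_grains, p=w / w.sum())
    return (2.0 + (src - 1000.0) * 0.004) * rng.normal(1.0, 0.06, n_grains)

uniform = predicted_ages(54, np.ones_like(z))        # uniform erosion
lower_half = predicted_ages(54, (z < 2500.0) * 1.0)  # sourcing below 2500 m

# A two-sample KS test stands in for the PDF comparison: a small p-value
# flags a non-uniform erosion pattern, as in the deglaciated catchment.
print(stats.ks_2samp(uniform, lower_half))
```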

  4. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps

    Science.gov (United States)

    Nicholas, Jennifer; Christensen, Helen

    2016-01-01

    Background: For many mental health conditions, mobile health apps offer the ability to deliver information, support, and intervention outside the clinical setting. However, there are difficulties with the use of a commercial app store to distribute health care resources, including turnover of apps, irrelevance of apps, and discordance with evidence-based practice. Objective: The primary aim of this study was to quantify the longevity and rate of turnover of mental health apps within the official Android and iOS app stores. The secondary aim was to quantify the proportion of apps that were clinically relevant and assess whether the longevity of these apps differed from clinically nonrelevant apps. The tertiary aim was to establish the proportion of clinically relevant apps that included claims of clinical effectiveness. We performed additional subgroup analyses using additional data from the app stores, including search result ranking, user ratings, and number of downloads. Methods: We searched iTunes (iOS) and the Google Play (Android) app stores each day over a 9-month period for apps related to depression, bipolar disorder, and suicide. We performed additional app-specific searches if an app no longer appeared within the main search results. Results: On the Android platform, 50% of the search results changed after 130 days (depression), 195 days (bipolar disorder), and 115 days (suicide). Search results were more stable on the iOS platform, with 50% of the search results remaining at the end of the study period. Approximately 75% of Android and 90% of iOS apps were still available to download at the end of the study. We identified only 35.3% (347/982) of apps as being clinically relevant for depression, of which 9 (2.6%) claimed clinical effectiveness. Only 3 included a full citation to a published study. Conclusions: The mental health app environment is volatile, with a clinically relevant app for depression becoming unavailable to download every 2.9 days. This poses

  5. Quantifying Livestock Heat Stress Impacts in the Sahel

    Science.gov (United States)

    Broman, D.; Rajagopalan, B.; Hopson, T. M.

    2014-12-01

    Livestock heat stress, especially in regions of the developing world with limited adaptive capacity, has a largely unquantified impact on food supply. Though dominated by ambient air temperature, heat stress is also affected by relative humidity, wind speed, and solar radiation; it can decrease livestock growth, milk production, and reproduction rates, and increase mortality. Indices like the thermal-humidity index (THI) are used to quantify the heat stress experienced from climate variables. Livestock experience differing impacts at different index critical thresholds that are empirically determined and specific to species and breed. This lack of understanding has been highlighted in several studies, with limited knowledge of the critical thresholds of heat stress in native livestock breeds, as well as of the current and future impact of heat stress. As adaptation and mitigation strategies to climate change depend on a solid quantitative foundation, this knowledge gap has limited such efforts. To address this lack of study, we have investigated heat stress impacts in the pastoral system of Sub-Saharan West Africa. We used a stochastic weather generator to quantify both the historic and future variability of heat stress. This approach models temperature, relative humidity, and precipitation, the climate variables controlling heat stress. Incorporating large-scale climate as covariates into this framework provides a better historical fit and allows us to include future CMIP5 GCM projections to examine the climate change impacts on heat stress. Health and production data allow us to examine the influence of this variability on livestock directly, and are considered in conjunction with the confounding impacts of fodder and water access. This understanding provides useful information to decision makers looking to mitigate the impacts of climate change and can provide useful seasonal forecasts of heat stress risk. A comparison of the current and future heat stress conditions based on
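
    For reference, one widely used THI formulation (the NRC 1971 form) and a threshold classification can be sketched as below. The cutoff values are placeholders: as the abstract stresses, critical thresholds are empirically determined and specific to species and breed:

```python
def thi(temp_c, rh_percent):
    """One common thermal-humidity index form (NRC 1971):
    THI = 0.8*T + (RH/100)*(T - 14.4) + 46.4, with T in degrees Celsius."""
    return 0.8 * temp_c + (rh_percent / 100.0) * (temp_c - 14.4) + 46.4

def stress_class(thi_value, thresholds=(74, 79, 84)):
    """Bin a THI value into stress classes; these cutoffs are placeholders,
    since real critical thresholds are species- and breed-specific."""
    labels = ["none", "mild", "moderate", "severe"]
    return labels[sum(thi_value >= t for t in thresholds)]

value = thi(35.0, 40.0)
print(value, stress_class(value))   # ~82.6 -> "moderate" under these cutoffs
```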

  6. BOARD-INVITED REVIEW: Quantifying water use in ruminant production.

    Science.gov (United States)

    Legesse, G; Ominski, K H; Beauchemin, K A; Pfister, S; Martel, M; McGeough, E J; Hoekstra, A Y; Kroebel, R; Cordeiro, M R C; McAllister, T A

    2017-05-01

    The depletion of water resources, in terms of both quantity and quality, has become a major concern both locally and globally. Ruminants, in particular, are under increased public scrutiny due to their relatively high water use per unit of meat or milk produced. Estimating the water footprint of livestock production is a relatively new field of research for which methods are still evolving. This review describes the approaches used to quantify water use in ruminant production systems as well as the methodological and conceptual issues associated with each approach. Water use estimates for the main products from ruminant production systems are also presented, along with possible management strategies to reduce water use. In the past, quantifying water withdrawal in ruminant production focused on the water demand for drinking or operational purposes. Recently, the recognition of water as a scarce resource has led to the development of several methodologies including water footprint assessment, life cycle assessment, and livestock water productivity to assess water use and its environmental impacts. These methods differ with respect to their target outcome (efficiency or environmental impacts), geographic focus (local or global), description of water sources (green, blue, and gray), handling of water quality concerns, the interpretation of environmental impacts, and the metric by which results are communicated (volumetric units or impact equivalents). Ruminant production is a complex activity where animals are often reared at different sites using a range of resources over their lifetime. Additional water use occurs during slaughter, product processing, and packaging. Estimating water use at the various stages of meat and milk production and communicating those estimates will help producers and other stakeholders identify hotspots and implement strategies to improve water use efficiency. Improvements in ruminant productivity (i.e., BW and milk production) and

  7. Quantifying Pollutant Emissions from Office Equipment: Phase I Report

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.; McKone, T.E.; Perino, C.

    2006-12-01

    Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment

  8. Quantifying the Influence of Urbanization on a Coastal Floodplain

    Science.gov (United States)

    Sebastian, A.; Juan, A.; Bedient, P. B.

    2016-12-01

    The U.S. Gulf Coast is the fastest growing region in the United States; between 1960 and 2010, the number of housing units along the Gulf of Mexico increased by 246%, vastly outpacing growth in other parts of the country (NOAA 2013). Numerous studies have shown that increases in impervious surface associated with urbanization reduce infiltration and increase surface runoff. While empirical evidence suggests that changes in land use are leading to increased flood damage in overland areas, earlier studies have largely focused on the impacts of urbanization on surface runoff and watershed hydrology, rather than quantifying its influence on the spatial extent of flooding. In this study, we conduct a longitudinal assessment of the evolution of flood risk since 1970 in an urbanizing coastal watershed. Utilizing the distributed hydrologic model, Vflo®, in combination with the hydraulic model, HEC-RAS, we quantify the impact of localized land use/land cover (LULC) change on the spatial extent of flooding in the watershed and the underlying flood hazard structure. The results demonstrate that increases in impervious cover between 1970 and 2010 (34%) and 2010 and 2040 (18%) increase the size of the floodplain by 26 and 17%, respectively. Furthermore, the results indicate that the depth and frequency of flooding in neighborhoods within the 1% floodplain have increased substantially (see attached figure). Finally, this analysis provides evidence that outdated FEMA floodplain maps could be underestimating the extent of the floodplain by upwards of 25%, depending on the rate of urbanization in the watershed; and, that by incorporating physics-based distributed hydrologic models into floodplain studies, floodplain maps can be easily updated to reflect the most recent LULC information available. The methods presented in this study have important implications for the development of mitigation strategies in coastal areas, such as deterring future development in flood prone areas

  9. Hydrology & isotope tools to quantify carbon sources and sinks

    Science.gov (United States)

    Barth, Johannes A. C.; Lischeid, Gunnar; Gessler, Arthur

    2010-05-01

    Vegetation is fundamental for carbon uptake and usually accounts for the largest portion of the evapotranspiration term. While interception can be separated by mapping various plant types in a catchment, the water isotope method yields numbers for pure evaporation. The latter causes enrichment of the heavier isotope in the remaining water phase, while transpiration leaves the isotope signal of water unaltered over longer time periods. Evaporation can thus be quantified in an integral manner over large areas by measuring water stable isotopes at points of river discharge and comparing them to incoming precipitation. This method has been applied on scales of several thousand square kilometres, and its calibration on scales of a few square kilometres will allow uncertainties to be better constrained. This necessitates comparison with hydrometric methods in well-instrumented catchments in several climatic regimes. Innovative small-scale methods involve determination of effective rainfall by time series analyses of hydrological data. This in turn requires temporal resolution of daily to hourly values to apply methods such as runoff recession or principal component analyses. It is also known that continental water fluxes are related to carbon fluxes through photosynthesis, which in turn recycles large amounts of water via transpiration. This is usually described by the water use efficiency (WUE) term, which quantifies how many moles of water transpire to accumulate one mole of CO2. However, so far only a few empirical numbers are available for the spatio-temporal variability in WUE of plants and plant communities, and further field experiments combined with isoscape approaches are necessary to constrain this term on a regional scale and its dependencies on factors such as light, temperature, water availability, plant type and height. Combined data can then serve to determine catchment-wide carbon uptake via the transpiration rates. Carbon accumulation can also be determined with eddy
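
    The isotope-enrichment argument can be illustrated with a strongly simplified Rayleigh model (real catchment studies account for humidity and kinetic effects, for example via Craig-Gordon-type models). The enrichment factor and delta values below are placeholder assumptions:

```python
import numpy as np

def evaporated_fraction(delta_precip, delta_discharge, epsilon=-10.0):
    """Invert the simplified Rayleigh relation
    delta_discharge = delta_precip + epsilon * ln(f) for the residual liquid
    fraction f; the evaporated fraction is 1 - f. epsilon is an effective
    liquid-vapour enrichment in per mil (a placeholder value here)."""
    f = np.exp((delta_discharge - delta_precip) / epsilon)
    return 1.0 - f

# e.g. precipitation at -8.0 per mil d18O and river discharge at -7.2 per mil
print(evaporated_fraction(-8.0, -7.2))   # ~0.077, i.e. ~8% evaporated
```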

  10. Quantifying uncertainties of permafrost carbon–climate feedbacks

    Directory of Open Access Journals (Sweden)

    E. J. Burke

    2017-06-01

    The land surface models JULES (Joint UK Land Environment Simulator, two versions) and ORCHIDEE-MICT (Organizing Carbon and Hydrology in Dynamic Ecosystems), each with a revised representation of permafrost carbon, were coupled to the Integrated Model Of Global Effects of climatic aNomalies (IMOGEN) intermediate-complexity climate and ocean carbon uptake model. IMOGEN calculates atmospheric carbon dioxide (CO2) and local monthly surface climate for a given emission scenario with the land–atmosphere CO2 flux exchange from either JULES or ORCHIDEE-MICT. These simulations include feedbacks associated with permafrost carbon changes in a warming world. Both IMOGEN–JULES and IMOGEN–ORCHIDEE-MICT were forced by historical and three alternative future-CO2-emission scenarios. Those simulations were performed for different climate sensitivities and regional climate change patterns based on 22 different Earth system models (ESMs) used for CMIP3 (phase 3 of the Coupled Model Intercomparison Project), allowing us to explore climate uncertainties in the context of permafrost carbon–climate feedbacks. Three future emission scenarios consistent with three representative concentration pathways were used: RCP2.6, RCP4.5 and RCP8.5. Paired simulations with and without frozen carbon processes were required to quantify the impact of the permafrost carbon feedback on climate change. The additional warming from the permafrost carbon feedback is between 0.2 and 12 % of the change in the global mean temperature (ΔT) by the year 2100 and 0.5 and 17 % of ΔT by 2300, with these ranges reflecting differences in land surface models, climate models and emissions pathway. As a percentage of ΔT, the permafrost carbon feedback has a greater impact on the low-emissions scenario (RCP2.6) than on the higher-emissions scenarios, suggesting that permafrost carbon should be taken into account when evaluating scenarios of heavy mitigation and stabilization

  11. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps.

    Science.gov (United States)

    Larsen, Mark Erik; Nicholas, Jennifer; Christensen, Helen

    2016-08-09

    For many mental health conditions, mobile health apps offer the ability to deliver information, support, and intervention outside the clinical setting. However, there are difficulties with the use of a commercial app store to distribute health care resources, including turnover of apps, irrelevance of apps, and discordance with evidence-based practice. The primary aim of this study was to quantify the longevity and rate of turnover of mental health apps within the official Android and iOS app stores. The secondary aim was to quantify the proportion of apps that were clinically relevant and assess whether the longevity of these apps differed from clinically nonrelevant apps. The tertiary aim was to establish the proportion of clinically relevant apps that included claims of clinical effectiveness. We performed additional subgroup analyses using additional data from the app stores, including search result ranking, user ratings, and number of downloads. We searched iTunes (iOS) and the Google Play (Android) app stores each day over a 9-month period for apps related to depression, bipolar disorder, and suicide. We performed additional app-specific searches if an app no longer appeared within the main search results. On the Android platform, 50% of the search results changed after 130 days (depression), 195 days (bipolar disorder), and 115 days (suicide). Search results were more stable on the iOS platform, with 50% of the search results remaining at the end of the study period. Approximately 75% of Android and 90% of iOS apps were still available to download at the end of the study. We identified only 35.3% (347/982) of apps as being clinically relevant for depression, of which 9 (2.6%) claimed clinical effectiveness. Only 3 included a full citation to a published study. The mental health app environment is volatile, with a clinically relevant app for depression becoming unavailable to download every 2.9 days. This poses challenges for consumers and clinicians seeking relevant

  12. Quantifying soil burn severity for hydrologic modeling to assess post-fire effects on sediment delivery

    Science.gov (United States)

    Dobre, Mariana; Brooks, Erin; Lew, Roger; Kolden, Crystal; Quinn, Dylan; Elliot, William; Robichaud, Pete

    2017-04-01

    Soil erosion is a secondary fire effect with great implications for many ecosystem resources. Depending on the burn severity, topography, and the weather immediately after the fire, soil erosion can impact municipal water supplies, degrade water quality, and reduce reservoirs' storage capacity. Scientists and managers use field and remotely sensed data to quickly assess post-fire burn severity in ecologically-sensitive areas. From these assessments, mitigation activities are implemented to minimize post-fire flood and soil erosion and to facilitate post-fire vegetation recovery. Alternatively, land managers can use fire behavior and spread models (e.g. FlamMap, FARSITE, FOFEM, or CONSUME) to identify sensitive areas a priori, and apply strategies such as fuel reduction treatments to proactively minimize the risk of wildfire spread and increased burn severity. There is a growing interest in linking fire behavior and spread models with hydrology-based soil erosion models to provide site-specific assessment of mitigation treatments on post-fire runoff and erosion. The challenge remains, however, that many burn severity mapping and modeling products quantify vegetation loss rather than soil burn severity. Wildfire burn severity is spatially heterogeneous and depends on the pre-fire vegetation cover, fuel load, topography, and weather. Severities also differ depending on the variable of interest (e.g. soil, vegetation). In the United States, Burned Area Reflectance Classification (BARC) maps, derived from Landsat satellite images, are used as an initial burn severity assessment. BARC maps are classified from either a Normalized Burn Ratio (NBR) or differenced Normalized Burn Ratio (dNBR) scene into four classes (Unburned, Low, Moderate, and High severity). The development of soil burn severity maps requires further manual field validation efforts to transform the BARC maps into a product more applicable for post-fire soil rehabilitation activities
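
    The NBR and dNBR computations referred to above are simple band arithmetic; a minimal sketch follows. The class breakpoints used here are commonly cited illustrative values, whereas operational BARC breakpoints are scene-specific and set by analysts:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from near- and shortwave-infrared reflectance."""
    return (nir - swir) / (nir + swir)

def dnbr_severity(nir_pre, swir_pre, nir_post, swir_post,
                  thresholds=(0.1, 0.27, 0.66)):
    """dNBR = NBR(pre-fire) - NBR(post-fire), binned into four classes
    (0=Unburned, 1=Low, 2=Moderate, 3=High). The breakpoints here are
    commonly cited illustrative values; BARC breakpoints are scene-specific."""
    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    return dnbr, np.digitize(dnbr, thresholds)

# toy 2x2 scene: reflectances before and after a fire
nir_pre = np.full((2, 2), 0.5)
swir_pre = np.full((2, 2), 0.2)
nir_post = np.array([[0.45, 0.30], [0.20, 0.10]])
swir_post = np.array([[0.20, 0.25], [0.30, 0.40]])
d, severity = dnbr_severity(nir_pre, swir_pre, nir_post, swir_post)
print(severity)   # [[0 2] [2 3]]: unburned to high-severity pixels
```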

  13. Effect of downed woody debris on small mammal anti-predator behavior

    Science.gov (United States)

    Travis M. Hinkelman; John L. Orrock; Susan C Loeb

    2011-01-01

    Anti-predator behavior can affect prey growth, reproduction, and survival, and can generate emergent effects in food webs. Small mammals often lower the cost of predation by altering their behavior in response to shrubs, but the importance of other microhabitat features, such as downed woody debris, for anti-predator behavior is unknown. We used giving-up densities to quantify...

  14. Behavior and neuropsychiatric manifestations in Angelman syndrome.

    Science.gov (United States)

    Pelc, Karine; Cheron, Guy; Dan, Bernard

    2008-06-01

    Angelman syndrome has been suggested as a disease model of neurogenetic developmental condition with a specific behavioral phenotype. It is due to lack of expression of the UBE3A gene, an imprinted gene located on chromosome 15q. Here we review the main features of this phenotype, characterized by happy demeanor with prominent smiling, poorly specific laughing and general exuberance, associated with hypermotor behavior, stereotypies, and reduced behavioral adaptive skills despite proactive social contact. All these phenotypic characteristics are currently difficult to quantify and have been subject to some differences in interpretation. For example, prevalence of autistic disorder is still debated. Many of these features may occur in other syndromic or nonsyndromic forms of severe intellectual disability, but their combination, with particularly prominent laughter and smiling may be specific of Angelman syndrome. Management of problematic behaviors is primarily based on behavioral approaches, though psychoactive medication (eg, neuroleptics or antidepressants) may be required.

  15. Discounting Behavior

    DEFF Research Database (Denmark)

    Andersen, Steffen; Harrison, Glenn W.; Lau, Morten

    2014-01-01

    We re-evaluate the theory, experimental design and econometrics behind claims that individuals exhibit non-constant discounting behavior. Theory points to the importance of controlling for the non-linearity of the utility function of individuals, since the discount rate is defined over time-dated utility flows and not flows of money. It also points to a menagerie of functional forms to characterize different types of non-constant discounting behavior. The implied experimental design calls for individuals to undertake several tasks to allow us to identify these models, and to several treatments...
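
    The contrast at issue, constant versus non-constant discounting, is between functional forms such as the two sketched below (a standard exponential form and a simple hyperbolic alternative; the menagerie referred to above contains several more):

```python
import numpy as np

def exponential_discount(t, r):
    return np.exp(-r * t)         # constant discount rate r at every horizon

def hyperbolic_discount(t, k):
    return 1.0 / (1.0 + k * t)    # implied discount rate declines with t

t = np.array([0.5, 1.0, 2.0, 5.0, 10.0])   # horizons in years
print(exponential_discount(t, 0.10))
print(hyperbolic_discount(t, 0.10))
# The hyperbolic form discounts the near future relatively more heavily and
# the far future relatively less: the signature of non-constant discounting.
```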

  16. Quantifying protein adsorption and function at nanostructured materials: enzymatic activity of glucose oxidase at GLAD structured electrodes.

    Science.gov (United States)

    Jensen, Uffe B; Ferapontova, Elena E; Sutherland, Duncan S

    2012-07-31

    Nanostructured materials strongly modulate the behavior of adsorbed proteins; however, the characterization of such interactions is challenging. Here we present a novel method combining protein adsorption studies at nanostructured quartz crystal microbalance sensor surfaces (QCM-D) with optical (surface plasmon resonance SPR) and electrochemical methods (cyclic voltammetry CV) allowing quantification of both bound protein amount and activity. The redox enzyme glucose oxidase is studied as a model system to explore alterations in protein functional behavior caused by adsorption onto flat and nanostructured surfaces. This enzyme and such materials interactions are relevant for biosensor applications. Novel nanostructured gold electrode surfaces with controlled curvature were fabricated using colloidal lithography and glancing angle deposition (GLAD). The adsorption of enzyme to nanostructured interfaces was found to be significantly larger compared to flat interfaces even after normalization for the increased surface area, and no substantial desorption was observed within 24 h. A decreased enzymatic activity was observed over the same period of time, which indicates a slow conformational change of the adsorbed enzyme induced by the materials interface. Additionally, we make use of inherent localized surface plasmon resonances in these nanostructured materials to directly quantify the protein binding. We hereby demonstrate a QCM-D-based methodology to quantify protein binding at complex nanostructured materials. Our approach allows label free quantification of protein binding at nanostructured interfaces.

  17. Quantifying Transmission of Clostridium difficile within and outside Healthcare Settings.

    Science.gov (United States)

    Durham, David P; Olsen, Margaret A; Dubberke, Erik R; Galvani, Alison P; Townsend, Jeffrey P

    2016-04-01

    To quantify the effect of hospital and community-based transmission and control measures on Clostridium difficile infection (CDI), we constructed a transmission model within and between hospital, community, and long-term care-facility settings. By parameterizing the model from national databases and calibrating it to C. difficile prevalence and CDI incidence, we found that hospitalized patients with CDI transmit C. difficile at a rate 15 (95% CI 7.2-32) times that of asymptomatic patients. Long-term care facility residents transmit at a rate of 27% (95% CI 13%-51%) that of hospitalized patients, and persons in the community at a rate of 0.1% (95% CI 0.062%-0.2%) that of hospitalized patients. Despite lower transmission rates for asymptomatic carriers and community sources, these transmission routes have a substantial effect on hospital-onset CDI because of the larger reservoir of hospitalized carriers and persons in the community. Asymptomatic carriers and community sources should be accounted for when designing and evaluating control interventions.

  18. A comparison of methods to quantify prolamin contents in cereals

    Directory of Open Access Journals (Sweden)

    Gianluca Giuberti

    2011-02-01

    Hydrophobic prolamins are endosperm storage proteins accounting for about 40% of the total protein in most cereal seeds. Despite the absence of a reference method, several procedures have been periodically published to quantify prolamins in cereals. The aim of this study was to compare a conventional fractionation assay (LND) vs three other methods: one based on sequential extractions (HAM) and two rapid turbidimetric procedures (L&H and DRO). Prolamins were extracted in duplicate on barley, corn and wheat samples. For the turbidimetric prolamin evaluation in barley and wheat, a universally available purified gliadin, as an alternative to purified zein, was also tested as standard reference material (SRM). The extracted prolamin values differed among grain types (P<0.05). LND agreed sufficiently well both with HAM and with L&H methods (R2=0.664 and R2=0.703, respectively; P<0.05), whereas a higher prolamin quantification was obtained using HAM (P<0.05). Overall, DRO did not provide comparable agreement and performance parameters with respect to the other method comparisons. The effect of replacing purified zein with purified gliadin was noteworthy only for L&H, both for wheat and barley samples (P<0.01). Considering the increasing attention of animal nutritionists to prolamins, our results provide useful information for routine laboratory analysis.

  19. Quantifying solute transport processes: are chemically "conservative" tracers electrically conservative?

    Science.gov (United States)

    Singha, Kamini; Li, Li; Day-Lewis, Frederick D.; Regberg, Aaron B.

    2012-01-01

    The concept of a nonreactive or conservative tracer, commonly invoked in investigations of solute transport, requires additional study in the context of electrical geophysical monitoring. Tracers that are commonly considered conservative may undergo reactive processes, such as ion exchange, thus changing the aqueous composition of the system. As a result, the measured electrical conductivity may reflect not only solute transport but also reactive processes. We have evaluated the impacts of ion exchange reactions, rate-limited mass transfer, and surface conduction on quantifying tracer mass, mean arrival time, and temporal variance in laboratory-scale column experiments. Numerical examples showed that (1) ion exchange can lead to resistivity-estimated tracer mass, velocity, and dispersivity that may be inaccurate; (2) mass transfer leads to an overestimate in the mobile tracer mass and an underestimate in velocity when using electrical methods; and (3) surface conductance does not notably affect estimated moments when high-concentration tracers are used, although this phenomenon may be important at low concentrations or in sediments with high and/or spatially variable cation-exchange capacity. In all cases, colocated groundwater concentration measurements are of high importance for interpreting geophysical data with respect to the controlling transport processes of interest.

  20. A Method for Quantifying, Visualising, and Analysing Gastropod Shell Form.

    Science.gov (United States)

    Liew, Thor-Seng; Schilthuizen, Menno

    2016-01-01

    Quantitative analysis of organismal form is an important component for almost every branch of biology. Although generally considered an easily-measurable structure, the quantification of gastropod shell form is still a challenge because many shells lack homologous structures and have a spiral form that is difficult to capture with linear measurements. In view of this, we adopt the idea of theoretical modelling of shell form, in which the shell form is the product of aperture ontogeny profiles in terms of aperture growth trajectory that is quantified as curvature and torsion, and of aperture form that is represented by size and shape. We develop a workflow for the analysis of shell forms based on the aperture ontogeny profile, starting from the procedure of data preparation (retopologising the shell model), via data acquisition (calculation of aperture growth trajectory, aperture form and ontogeny axis), and data presentation (qualitative comparison between shell forms) and ending with data analysis (quantitative comparison between shell forms). We evaluate our methods on representative shells of the genera Opisthostoma and Plectostoma, which exhibit great variability in shell form. The outcome suggests that our method is a robust, reproducible, and versatile approach for the analysis of shell form. Finally, we propose several potential applications of our methods in functional morphology, theoretical modelling, taxonomy, and evolutionary biology.
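
    The growth-trajectory quantities, curvature and torsion, follow from standard differential geometry; a minimal numerical sketch (not the authors' workflow) is given below, validated on a helix where both quantities are known in closed form:

```python
import numpy as np

def curvature_torsion(points):
    """Discrete curvature and torsion of a 3D curve sampled as an (n, 3)
    array, via kappa = |r' x r''| / |r'|^3 and
    tau = (r' x r'') . r''' / |r' x r''|^2 with finite differences."""
    r1 = np.gradient(points, axis=0)
    r2 = np.gradient(r1, axis=0)
    r3 = np.gradient(r2, axis=0)
    cross = np.cross(r1, r2)
    cross_norm = np.linalg.norm(cross, axis=1)
    kappa = cross_norm / np.linalg.norm(r1, axis=1) ** 3
    tau = np.einsum('ij,ij->i', cross, r3) / cross_norm ** 2
    return kappa, tau

# sanity check on a helix: kappa = a/(a^2+b^2), tau = b/(a^2+b^2)
t = np.linspace(0.0, 6.0 * np.pi, 2000)
a, b = 2.0, 0.5
helix = np.column_stack([a * np.cos(t), a * np.sin(t), b * t])
kappa, tau = curvature_torsion(helix)
print(kappa[1000], tau[1000])   # ~0.4706 and ~0.1176 away from the endpoints
```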

  1. Using tangles to quantify topological mixing of fluids

    Science.gov (United States)

    Chen, Qianting; Sattari, Sulimon; Mitchell, Kevin

    2014-11-01

    Topological mixing is important in understanding complex fluid problems, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy (TE), the exponential growth rate of material lines, is used to quantify topological mixing. Computing TE from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Previous work has focused on braiding by "ghost rods" (see, e.g., work by Boyland, Aref, Stremler, Thiffeault, and Finn). Following Grover et al. [Chaos 22, 043135 (2012)], we study topological mixing in a two-dimensional lid-driven cavity flow. For a certain parameter range, the TE is dominated by a period-3 braid. However, this braid alone cannot explain all the TE within this range, nor the TE outside the range of existence of the braid. By contrast, we explain TE through the topology of intersecting stable and unstable manifolds, i.e. heteroclinic tangles, using homotopic lobe dynamics (HLD). In the HLD approach, stirring originates from "ghost rods" placed on heteroclinic orbits. We demonstrate that these heteroclinic orbits generate excess TE not accounted for in Grover et al. Furthermore, in the limit of utilizing arbitrarily long manifolds, the HLD technique converges to the true TE. Supported by the US National Science Foundation under Grant PHY-0748828.
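
    The "exponential growth rate of material lines" can be estimated directly by advecting and refining a polyline. The sketch below does this for Arnold's cat map on the torus, a standard example whose topological entropy is known exactly, rather than for the lid-driven cavity flow studied here:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # Arnold cat map; TE = ln((3+sqrt(5))/2)

def refine(pts, max_seg):
    """Insert minimum-image midpoints until every polyline segment on the
    torus is shorter than max_seg."""
    while True:
        d = pts[1:] - pts[:-1]
        d -= np.round(d)                  # minimum-image segment vectors
        idx = np.where(np.linalg.norm(d, axis=1) > max_seg)[0]
        if idx.size == 0:
            return pts
        mids = (pts[idx] + d[idx] / 2.0) % 1.0
        pts = np.insert(pts, idx + 1, mids, axis=0)

def line_growth_rate(n_iter=10, max_seg=0.05):
    """Advect a material line, keeping it resolved, and fit the slope of
    log(length) versus iteration number."""
    pts = np.column_stack([np.linspace(0.1, 0.4, 50), np.full(50, 0.3)])
    lengths = []
    for _ in range(n_iter):
        pts = refine(pts, max_seg)        # keep the line resolved...
        pts = (pts @ A.T) % 1.0           # ...then advect it one period
        d = pts[1:] - pts[:-1]
        d -= np.round(d)
        lengths.append(np.linalg.norm(d, axis=1).sum())
    k = np.arange(n_iter)
    return np.polyfit(k[2:], np.log(lengths)[2:], 1)[0]  # skip the transient

print(line_growth_rate())                 # ~0.96
print(np.log((3 + np.sqrt(5)) / 2))       # 0.9624..., the exact TE
```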

  2. A Method for Quantifying, Visualising, and Analysing Gastropod Shell Form

    Science.gov (United States)

    Liew, Thor-Seng; Schilthuizen, Menno

    2016-01-01

    Quantitative analysis of organismal form is an important component for almost every branch of biology. Although generally considered an easily-measurable structure, the quantification of gastropod shell form is still a challenge because many shells lack homologous structures and have a spiral form that is difficult to capture with linear measurements. In view of this, we adopt the idea of theoretical modelling of shell form, in which the shell form is the product of aperture ontogeny profiles in terms of aperture growth trajectory that is quantified as curvature and torsion, and of aperture form that is represented by size and shape. We develop a workflow for the analysis of shell forms based on the aperture ontogeny profile, starting from the procedure of data preparation (retopologising the shell model), via data acquisition (calculation of aperture growth trajectory, aperture form and ontogeny axis), and data presentation (qualitative comparison between shell forms) and ending with data analysis (quantitative comparison between shell forms). We evaluate our methods on representative shells of the genera Opisthostoma and Plectostoma, which exhibit great variability in shell form. The outcome suggests that our method is a robust, reproducible, and versatile approach for the analysis of shell form. Finally, we propose several potential applications of our methods in functional morphology, theoretical modelling, taxonomy, and evolutionary biology. PMID:27280463

  3. Quantifying the contribution of genetic variants for survival phenotypes.

    Science.gov (United States)

    Müller, Martina; Döring, Angela; Küchenhoff, Helmut; Lamina, Claudia; Malzahn, Dörthe; Bickeböller, Heike; Vollmert, Caren; Klopp, Norman; Meisinger, Christa; Heinrich, Joachim; Kronenberg, Florian; Wichmann, H Erich; Heid, Iris M

    2008-09-01

    Particularly in studies based on population-representative samples, it is of major interest what impact a genetic variant has on the phenotype of interest, a question that cannot be answered by association estimates alone. One possible measure for quantifying the phenotype's variance explained by the genetic variant is R². However, for survival outcomes, no clear definition of R² is available in the presence of censored observations. We selected three criteria proposed for this purpose in the literature and compared their performance for single nucleotide polymorphism (SNP) data through simulation studies and for mortality data with candidate SNPs in the general population-based KORA cohort. The evaluated criteria were based on: (1) the difference of deviance residuals, (2) the variation of individual survival curves, and (3) the variation of Schoenfeld residuals. Our simulation studies included various censoring and genetic scenarios. They revealed that the deviance residuals' criterion depended strongly on the censoring percentage and was generally not limited to the range [0, 1], so it lacked interpretation as a percentage of explained variation. The second criterion (variation of survival curves) hardly reached values above 60%. Our requirements were best fulfilled by the criterion based on Schoenfeld residuals. Our mortality data analysis also supported the findings of the simulation studies. With the criterion based on Schoenfeld residuals, we recommend a powerful and flexible tool for genetic epidemiological studies to refine genetic association studies by judging the contribution of genetic variants to survival phenotypes.
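
    For concreteness, the sketch below computes one simple residual-based explained-variation measure for a Cox model with a SNP covariate: the relative reduction in squared deviance residuals when the SNP is added, in the spirit of criterion (1). It is a generic stand-in, not the exact formulas compared in the paper, and the simulated effect size, allele frequency, and censoring rate are assumptions. It uses the lifelines package.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter, NelsonAalenFitter

    rng = np.random.default_rng(1)
    n = 2000
    snp = rng.binomial(2, 0.3, n)                        # additive coding 0/1/2
    t_event = rng.exponential(1.0 / np.exp(0.4 * snp))   # assumed log-HR 0.4/allele
    t_cens = rng.exponential(2.0, n)                     # independent censoring
    df = pd.DataFrame({"T": np.minimum(t_event, t_cens),
                       "E": (t_event <= t_cens).astype(int),
                       "snp": snp})

    # full model: Cox regression on the SNP; deviance residuals via lifelines
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    dev_full = cph.compute_residuals(df, kind="deviance").to_numpy().ravel()

    # null model: deviance residuals from the covariate-free Nelson-Aalen hazard,
    # d_i = sign(M_i) * sqrt(-2 * [M_i + delta_i * ln(delta_i - M_i)])
    naf = NelsonAalenFitter().fit(df["T"], event_observed=df["E"])
    H0 = naf.cumulative_hazard_at_times(df["T"]).to_numpy()
    delta = df["E"].to_numpy()
    mart = delta - H0
    inner = mart + np.where(delta == 1, np.log(H0), 0.0)
    dev_null = np.sign(mart) * np.sqrt(-2.0 * inner)

    r2 = 1.0 - np.sum(dev_full**2) / np.sum(dev_null**2)
    print(f"explained variation (deviance-residual measure): {r2:.3f}")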

  4. Evolutions in food marketing, quantifying the impact, and policy implications.

    Science.gov (United States)

    Cairns, Georgina

    2013-03-01

    A case study on interactive digital marketing examined the adequacy of extant policy controls, and their underpinning paradigms, to constrain the effects of this rapidly emerging practice. The findings were that interactive digital marketing is expanding the strategies available to promote products, brands and consumer behaviours. It facilitates relational marketing, the collection of personal data for marketing, and integration of the marketing mix, and it provides a platform for consumers to engage in the co-creation of marketing communications. The paradigmatic logic of current policies to constrain youth-oriented food marketing does not address the interactive nature of digital marketing. The evidence base on the effects of HFSS (high in fat, salt, and sugar) marketing and policy interventions is built on conceptualizations of marketing as a force promoting transactions rather than interactions. Digital technologies are generating rich consumer data, and interactive digital technologies increase the complexity of the task of quantifying the impact of marketing. The rapidity of its uptake also increases the urgency of identifying appropriate effects measures. Independent analysis of commercial consumer data (appropriately transformed to protect commercial confidentiality and personal privacy) would provide evidence for policy on the impacts of commercial food and beverage marketing and of policy controls.

  5. Quantifying the pharmacology of antimalarial drug combination therapy

    Science.gov (United States)

    Hastings, Ian M.; Hodel, Eva Maria; Kay, Katherine

    2016-01-01

    Most current antimalarial drugs are combinations of an artemisinin plus a 'partner' drug from another class, and are known as artemisinin-based combination therapies (ACTs). They are the frontline drugs in treating human malaria infections. They also have a public-health role as an essential component of recent, comprehensive scale-ups of malaria interventions and containment efforts conceived as part of longer-term malaria elimination efforts. Recent reports that resistance to artemisinins has arisen have caused considerable concern. We investigate the likely impact of artemisinin resistance by quantifying the contribution artemisinins make to the overall therapeutic capacity of ACTs. We achieve this using a simple, easily understood, algebraic approach and by more sophisticated pharmacokinetic/pharmacodynamic analyses of drug action; the two approaches gave consistent results. Surprisingly, the artemisinin component typically makes a negligible contribution (≪0.0001%) to the therapeutic capacity of the most widely used ACTs and only starts to make a significant contribution to therapeutic outcome once resistance has started to evolve to the partner drugs. The main threat to antimalarial drug effectiveness and control comes from resistance evolving to the partner drugs. We therefore argue that public-health policies should be re-focussed to maximise the likely long-term effectiveness of the partner drugs. PMID:27604175
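
    The toy model below illustrates the kind of pharmacokinetic/pharmacodynamic reasoning described: each drug concentration decays first-order from an initial dose, kills parasites at a saturating (Hill-type) rate, and the integrated log-kill is compared with and without the artemisinin component. All parameter values are invented for illustration and are not the paper's calibrated estimates; with these toy numbers the long-lived partner drug dominates the integrated kill capacity, qualitatively echoing the paper's conclusion.

    import numpy as np

    def total_log10_kill(include_art=True, days=28.0, dt=0.01):
        """Integrated log10 parasite kill from one assumed dose of each drug."""
        # (half-life [days], EC50 [mg/L], max kill rate [1/day], initial conc [mg/L])
        art = (0.05, 0.01, 27.6, 1.0)     # artemisinin: short-lived, fast killing
        partner = (4.0, 0.10, 3.45, 1.0)  # partner drug: long-lived, slower killing
        drugs = [art, partner] if include_art else [partner]
        kill_ln = 0.0
        for step in range(int(days / dt)):
            t = step * dt
            for half_life, ec50, vmax, c0 in drugs:
                c = c0 * 0.5 ** (t / half_life)        # first-order elimination
                kill_ln += vmax * c / (c + ec50) * dt  # saturating kill rate
        return kill_ln / np.log(10.0)

    both = total_log10_kill(include_art=True)
    partner_only = total_log10_kill(include_art=False)
    print(f"log10 kill, ACT: {both:.1f}; partner alone: {partner_only:.1f}")
    print(f"artemisinin share of kill capacity: {100*(both-partner_only)/both:.1f}%")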

  6. Can global positioning systems quantify participation in cerebral palsy?

    Science.gov (United States)

    Ben-Pazi, Hilla; Barzilay, Yair; Shoval, Noam

    2014-06-01

    This study examined whether motor-related participation can be assessed by global positioning systems in individuals with cerebral palsy. Global positioning system monitoring devices were given to two adolescent girls (a 14-year-old with diplegic cerebral palsy and her 15-year-old healthy sister). Outcome measures were traveling distances, time spent outdoors, and Children's Assessment of Participation and Enjoyment questionnaires. The global positioning data documented that the girl with cerebral palsy did not visit nearby friends, spent less time outdoors, and traveled shorter distances than her sister (P = .02). The participation questionnaire corroborated that the girl with cerebral palsy performed most activities at home, alone. Outdoor activity of the girl with cerebral palsy measured by the global positioning system was 29% to 53% of that of her sibling, similar to the ratio derived from the participation questionnaires (44%). Global positioning devices thus objectively documented low outdoor activity in an adolescent with cerebral palsy compared to her sibling, reflecting the participation reported by validated questionnaires. Global positioning systems can potentially quantify certain aspects of participation.
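
    Deriving such outcome measures from raw GPS fixes is straightforward; the sketch below computes total distance traveled and time spent away from home from a list of (timestamp, latitude, longitude) fixes. The home-radius threshold and the toy track are illustrative assumptions, not the study's protocol.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres."""
        R = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp/2)**2 + math.cos(p1)*math.cos(p2)*math.sin(dl/2)**2
        return 2 * R * math.asin(math.sqrt(a))

    def participation_metrics(track, home, home_radius_m=50.0):
        """track: list of (t_seconds, lat, lon); home: (lat, lon).
        Returns (distance traveled [m], time spent away from home [s])."""
        dist, time_out = 0.0, 0.0
        for (t0, la0, lo0), (t1, la1, lo1) in zip(track[:-1], track[1:]):
            dist += haversine_m(la0, lo0, la1, lo1)
            if haversine_m(la0, lo0, *home) > home_radius_m:
                time_out += t1 - t0
        return dist, time_out

    # toy usage: a short walk away from home and back
    track = [(0, 31.771, 35.217), (600, 31.773, 35.219), (1200, 31.771, 35.217)]
    dist_m, secs_out = participation_metrics(track, home=(31.771, 35.217))
    print(f"distance: {dist_m:.0f} m, time outdoors: {secs_out/60:.0f} min")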

  7. Quantifying the performance of individual players in a team activity.

    Directory of Open Access Journals (Sweden)

    Jordi Duch

    Full Text Available BACKGROUND: Teamwork is a fundamental aspect of many human activities, from business to art and from sports to science. Recent research suggests that teamwork is of crucial importance to cutting-edge scientific research, but little is known about how teamwork leads to greater creativity. Indeed, for many team activities, it is not even clear how to assign credit to individual team members. Remarkably, at least in the context of sports, there is usually broad consensus on who the top performers are and on what qualifies as an outstanding performance. METHODOLOGY/PRINCIPAL FINDINGS: In order to determine how individual features can be quantified, and as a test bed for other team-based human activities, we analyze the performance of players in the European Cup 2008 soccer tournament. We develop a network approach that provides a powerful quantification of the contributions of individual players and of overall team performance. CONCLUSIONS/SIGNIFICANCE: We hypothesize that generalizations of our approach could be useful in other contexts where quantification of the contributions of individual team members is important.
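
    The abstract does not spell out the network approach, so the sketch below uses a generic stand-in: a directed, weighted passing network with betweenness centrality as a crude proxy for a player's contribution (the paper's own metric is a flow-centrality-style measure over pass sequences; betweenness is used here only as a simple substitute). The pass counts are invented.

    import networkx as nx

    passes = [  # (from_player, to_player, count) -- illustrative only
        ("GK", "DF1", 12), ("DF1", "MF1", 18), ("DF2", "MF1", 14),
        ("MF1", "MF2", 22), ("MF2", "FW1", 9), ("MF1", "FW1", 7),
        ("MF2", "FW2", 11), ("FW1", "FW2", 5),
    ]

    G = nx.DiGraph()
    for a, b, n in passes:
        # betweenness treats weights as distances, so invert pass counts:
        # frequently used links become "short"
        G.add_edge(a, b, distance=1.0 / n)

    centrality = nx.betweenness_centrality(G, weight="distance")
    for player, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(f"{player:4s} {score:.3f}")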

  8. Quantifying uncertainty in LCA-modelling of waste management systems.

    Science.gov (United States)

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has seen major progress in recent years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis, and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties; (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results; (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty; and (Step 4), as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
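
    A minimal sketch of Steps 2 and 3 on a toy waste-LCA model: Monte Carlo propagation of input uncertainties, followed by a first-order contribution analysis using squared input-output correlations. The inventory model, the parameter distributions, and the CH4 characterisation factor are illustrative assumptions, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 20000

    # toy inventory parameters (means and spreads are made up)
    params = {
        "kg_CO2_per_kWh":     rng.normal(0.45, 0.05, N),
        "kWh_per_t_waste":    rng.normal(60.0, 10.0, N),
        "CH4_capture_rate":   rng.uniform(0.4, 0.8, N),
        "kg_CH4_per_t_waste": rng.normal(50.0, 8.0, N),
    }

    def impact(p):
        """kg CO2-eq per tonne of waste (GWP of CH4 taken as 28 here)."""
        electricity = p["kg_CO2_per_kWh"] * p["kWh_per_t_waste"]
        landfill_gas = 28.0 * p["kg_CH4_per_t_waste"] * (1 - p["CH4_capture_rate"])
        return electricity + landfill_gas

    # Step 2: uncertainty propagation by Monte Carlo sampling
    y = impact(params)
    print(f"impact: mean {y.mean():.0f}, std {y.std():.0f} kg CO2-eq/t")

    # Step 3: contribution analysis -- share of output variance linearly
    # attributable to each input (squared Pearson correlation, first order)
    for name, x in params.items():
        r = np.corrcoef(x, y)[0, 1]
        print(f"{name:20s} ~{100 * r**2:.0f}% of variance")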

  9. Workshop report on quantifying environmental damage from energy activities

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P D; Rowe, M D; Morris, S C; Hamilton, L D

    1977-09-11

    Data and methods for quantifying environmental damage from energy activities were evaluated. Specifically, discussions were designed to identify the types and amounts of pollutants emitted by energy technologies that may affect the environment adversely, methods of estimating spatial and temporal changes in air and water quality resulting from these emissions, spatial and temporal distributions of ecosystems at risk, dose-response functions for pollutants and ecosystems at risk, and environmental and economic variables to be used to measure damage. Emphasis was on available data and on several methods for quantitative estimation of effects of energy on the environment. Damage functions that could be used to quantitate effects of ozone and sulfur oxide on agricultural crops and trees, effects of altered stream depth and velocity patterns on river fish species, and sensitivities of lake chemistry and biology to acid rainfall are listed. Also described are methods for estimating effects of carbon dioxide, sulfur dioxide, ozone, and several other atmospheric pollutants on selected terrestrial communities by using computer modeling techniques. With these techniques, quantitative estimates of the effects of energy on the environment could be developed within one to two years. Brief discussions about effects of nutrient and trace metal discharges on terrestrial ecosystems and about impacts of petroleum hydrocarbon, heat, biocides, and entrainment on aquatic ecosystems are also included.

  10. Quantifying ground impact fatality rate for small unmanned aircraft

    DEFF Research Database (Denmark)

    La Cour-Harbo, Anders

    2017-01-01

    One of the major challenges of conducting operations with unmanned aircraft, especially operations beyond visual line-of-sight (BVLOS), is to make a realistic and sufficiently detailed risk assessment. An important part of such an assessment is to identify the risk of fatalities, preferably in a quantitative way, since this allows for comparison with manned aviation to determine whether an equivalent level of safety is achievable. This work presents a method for quantifying the probability of fatalities resulting from an uncontrolled descent of an unmanned aircraft conducting a BVLOS flight. The method is based on a standard stochastic model, and employs a parameterized high-fidelity ground impact distribution model that accounts for aircraft specifications, parameter uncertainties, and wind. The method also samples the flight path to create an almost continuous quantification of the risk.
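
    The sketch below shows the general shape of such a computation for a single uncontrolled-descent event: sample wind-dispersed impact points, look up the local population density, and combine the probability of striking someone with a fatality-given-impact-energy model. The dispersion, density, and fatality models here are generic stand-ins with invented numbers, not the paper's parameterized high-fidelity models.

    import numpy as np

    rng = np.random.default_rng(7)
    N = 100000

    mass, v_impact = 10.0, 30.0            # kg, m/s (assumed aircraft)
    E = 0.5 * mass * v_impact**2           # impact kinetic energy [J]

    # wind-blown dispersion of impact points around the loss-of-control spot
    impact_xy = rng.normal([40.0, 0.0], [25.0, 15.0], size=(N, 2))  # metres

    # toy population density: a village occupying x in [0, 100] m along track
    density = np.where((impact_xy[:, 0] > 0) & (impact_xy[:, 0] < 100),
                       5e-4, 1e-6)          # persons per m^2

    A_exposed = 2.0                         # lethal area per impact [m^2], assumed
    p_hit = np.minimum(density * A_exposed, 1.0)

    # logistic fatality-given-hit model in impact energy (generic parameters)
    p_fatal_given_hit = 1.0 / (1.0 + np.exp(-(E - 100.0) / 30.0))

    p_fatality = np.mean(p_hit * p_fatal_given_hit)
    print(f"P(fatality) per uncontrolled descent ~ {p_fatality:.2e}")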

  11. Quantifying Snow Volume Uncertainty from Repeat Terrestrial Laser Scanning Observations

    Science.gov (United States)

    Gadomski, P. J.; Hartzell, P. J.; Finnegan, D. C.; Glennie, C. L.; Deems, J. S.

    2014-12-01

    Terrestrial laser scanning (TLS) systems are capable of providing rapid, high-density, 3D topographic measurements of snow surfaces from increasing standoff distances. By differencing snow-covered and snow-free measurements of a common scene, snow depths and volumes can be estimated. These data can support operational water-management decision-making when combined with measured or modeled snow densities to estimate basin water content, evaluate in-situ data, or drive operational hydrologic models. Change maps from differential TLS scans can also support avalanche control operations by quantifying loading patterns for both pre-control planning and post-control assessment. However, while methods for computing volume from TLS point cloud data are well documented, a rigorous quantification of the volumetric uncertainty has yet to be presented. Using repeat TLS data collected at the Arapahoe Basin Ski Area in Summit County, Colorado, we demonstrate the propagation of TLS point measurement and cloud registration uncertainties into 3D covariance matrices at the point level. The point covariances are then propagated through a volume computation to arrive at a single volume uncertainty value. Results from two volume computation methods are compared and the influence of data voids produced by occlusions is examined.
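
    To make the propagation step concrete, the sketch below pushes per-cell depth uncertainty through a simple gridded volume computation. It assumes independent per-cell errors, whereas the paper's point-level 3D covariances (including registration error) introduce correlations between cells; all numbers are illustrative.

    import numpy as np

    rng = np.random.default_rng(3)
    ny, nx = 40, 50
    cell_area = 0.25          # m^2 per grid cell (0.5 m grid), assumed

    depth = np.abs(rng.normal(1.2, 0.4, (ny, nx)))   # snow-depth surface [m]
    sigma = np.full((ny, nx), 0.02)                  # per-cell depth std [m]

    volume = cell_area * depth.sum()
    # V = A * sum(d_i)  ->  var(V) = A^2 * sum(sigma_i^2) under independence
    volume_std = cell_area * np.sqrt(np.sum(sigma**2))

    print(f"snow volume: {volume:.1f} m^3 +/- {volume_std:.2f} m^3 (1-sigma)")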

  12. Quantifying the direct use value of Condor seamount

    Science.gov (United States)

    Ressurreição, Adriana; Giacomello, Eva

    2013-12-01

    Seamounts often attract numerous uses and interests. Multiple uses can generate multiple benefits, but also conflicts and impacts, calling therefore for integrated and sustainable management. To assist in developing comprehensive management strategies, policymakers recognise the need to include measures of socioeconomic analysis alongside ecological data so that practical compromises can be made. This study assessed the direct output impact (DOI) of the relevant marine activities operating at Condor seamount (Azores, central northeast Atlantic) as proxies of the direct use values provided by the resource system. Results demonstrated that Condor seamount supported a wide range of uses yielding distinct economic outputs. Demersal fisheries, scientific research and shark diving were the top three activities generating the highest revenues, while tuna fisheries, whale watching and scuba-diving had marginal economic significance. Results also indicated that the economic importance of non-extractive uses of Condor is considerable, highlighting these uses as alternative income-generating opportunities for local communities. It is hoped that quantifying the direct use values provided by Condor seamount will contribute to the decision-making process towards its long-term conservation and sustainable use.

  13. Identifying and Quantifying Recurrent Novae Masquerading as Classical Novae

    CERN Document Server

    Pagnotta, Ashley

    2014-01-01

    Recurrent novae (RNe) are cataclysmic variables with two or more nova eruptions within a century. Classical novae (CNe) are similar systems with only one such eruption. Many of the so-called 'CNe' are actually RNe for which only one eruption has been discovered. Since RNe are candidate Type Ia supernova progenitors, it is important to know whether there are enough in our galaxy to provide the supernova rate, and therefore to know how many RNe are masquerading as CNe. To quantify this, we collected all available information on the light curves and spectra of a Galactic, time-limited sample of 237 CNe and the 10 known RNe, as well as exhaustive discovery efficiency records. We recognize RNe as having (a) outburst amplitude smaller than 14.5 - 4.5 * log(t_3), (b) orbital period >0.6 days, (c) infrared colors of J-H > 0.7 mag and H-K > 0.1 mag, (d) FWHM of H-alpha > 2000 km/s, (e) high excitation lines, such as Fe X or He II near peak, (f) eruption light curves with a plateau, and (g) white dwarf mass greater tha...
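
    The criteria read naturally as a candidate-scoring function, sketched below. Thresholds (a)-(f) are taken from the abstract (with log taken as base 10, the usual convention for nova decline times); the white-dwarf mass cut in criterion (g) is truncated in the source text, so it is left as an open parameter. The example values are illustrative.

    import math

    def rn_score(amplitude_mag, t3_days, period_days, j_h, h_k, fwhm_halpha_kms,
                 high_excitation_lines, plateau, wd_mass=None, wd_mass_cut=None):
        """Count how many RN indicators a 'classical' nova satisfies."""
        score = 0
        score += amplitude_mag < 14.5 - 4.5 * math.log10(t3_days)  # (a)
        score += period_days > 0.6                                 # (b)
        score += (j_h > 0.7) and (h_k > 0.1)                       # (c)
        score += fwhm_halpha_kms > 2000                            # (d)
        score += bool(high_excitation_lines)                       # (e) Fe X / He II
        score += bool(plateau)                                     # (f)
        if wd_mass is not None and wd_mass_cut is not None:        # (g) cut unknown
            score += wd_mass > wd_mass_cut
        return score

    # illustrative candidate with RN-like properties:
    print(rn_score(7.5, 4.0, 1.23, 0.8, 0.2, 5000, True, True))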

  14. Technique to accurately quantify collagen content in hyperconfluent cell culture.

    Science.gov (United States)

    See, Eugene Yong-Shun; Toh, Siew Lok; Goh, James Cho Hong

    2008-12-01

    Tissue engineering aims to regenerate tissues that can successfully take over the functions of the native tissue when it is damaged or diseased. In most tissues, collagen makes up the bulk of the extracellular matrix, so there is great emphasis on its accurate quantification in tissue engineering. It has already been reported that pepsin digestion is able to solubilize the collagen deposited within the cell layer for accurate quantification of collagen content in cultures, but this method has drawbacks when cultured cells are hyperconfluent. Under this condition, pepsin digestion results in fragments of the cell layer that cannot be completely solubilized. These fragments of the undigested cell sheet are visible to the naked eye and can bias the final results. To the best of our knowledge, no method has been reported to accurately quantify the collagen content in hyperconfluent cell sheets. This study therefore aims to illustrate that sonication can aid pepsin digestion of hyperconfluent cell layers of fibroblasts and bone marrow mesenchymal stem cells, solubilizing all the collagen for accurate quantification.

  15. Quantifying uncertainty in the phylogenetics of Australian numeral systems.

    Science.gov (United States)

    Zhou, Kevin; Bowern, Claire

    2015-09-22

    Researchers have long been interested in the evolution of culture and the ways in which change in cultural systems can be reconstructed and tracked. Within the realm of language, these questions are increasingly investigated with Bayesian phylogenetic methods. However, such work in cultural phylogenetics could be improved by more explicit quantification of reconstruction and transition probabilities. We apply such methods to numerals in the languages of Australia. As a large phylogeny with almost universal 'low-limit' systems, Australian languages are ideal for investigating numeral change over time. We reconstruct the most likely extent of the system at the root and use that information to explore the ways numerals evolve. We show that these systems do not increment serially, but most commonly vary their upper limits between 3 and 5. While there is evidence for rapid system elaboration beyond the lower limits, languages lose numerals as well as gain them. We investigate the ways larger numerals build on smaller bases, and show that there is a general tendency to both gain and replace 4 by combining 2 + 2 (rather than inventing a new unanalysable word 'four'). We develop a series of methods for quantifying and visualizing the results.

  16. Quantifying the benefits of Mediterranean diet in terms of survival.

    Science.gov (United States)

    Bellavia, Andrea; Tektonidis, Thanasis G; Orsini, Nicola; Wolk, Alicja; Larsson, Susanna C

    2016-05-01

    Beneficial effects of the Mediterranean diet (MD) have been consistently documented. However, to fully understand the public-health implications of MD adherence, an informative step is to quantify these effects in terms of survival-time differences. The aim of this study was to evaluate the impact of the MD on survival, presenting results as differences in median age at death. We used data from 71,333 participants in a large population-based cohort of Swedish men and women, followed up between January 1, 1998, and December 31, 2012. A total MD score, ranging from 0 to 8, was calculated from information on consumption of vegetables and fruits, legumes and nuts, non-refined/high-fiber grains, fermented dairy products, fish, and red meat, use of olive oil/rapeseed oil, and moderate alcohol intake. Multivariable-adjusted differences in median age at death were estimated with Laplace regression and presented as a function of the MD score. During 15 years of follow-up we documented 14,697 deaths. We observed a linear dose-response association between the MD score and median age at death, with a higher score associated with longer survival. The difference in median age at death between participants with the extreme scores (0 vs 8) of MD was up to 2 years (23 months, 95% CI: 16-29). In this study we documented that adherence to the MD may confer up to 2 years of longer survival.

  17. Challenges in quantifying biosphere-atmosphere exchange of nitrogen species.

    Science.gov (United States)

    Sutton, M A; Nemitz, E; Erisman, J W; Beier, C; Bahl, K Butterbach; Cellier, P; de Vries, W; Cotrufo, F; Skiba, U; Di Marco, C; Jones, S; Laville, P; Soussana, J F; Loubet, B; Twigg, M; Famulari, D; Whitehead, J; Gallagher, M W; Neftel, A; Flechard, C R; Herrmann, B; Calanca, P L; Schjoerring, J K; Daemmgen, U; Horvath, L; Tang, Y S; Emmett, B A; Tietema, A; Peñuelas, J; Kesik, M; Brueggemann, N; Pilegaard, K; Vesala, T; Campbell, C L; Olesen, J E; Dragosits, U; Theobald, M R; Levy, P; Mobbs, D C; Milne, R; Viovy, N; Vuichard, N; Smith, J U; Smith, P; Bergamaschi, P; Fowler, D; Reis, S

    2007-11-01

    Recent research in nitrogen exchange with the atmosphere has separated research communities according to N form. The integrated perspective needed to quantify the net effect of N on the greenhouse-gas balance is being addressed by the NitroEurope Integrated Project (NEU). Recent advances have depended on improved methodologies, while ongoing challenges include gas-aerosol interactions, organic nitrogen and N₂ fluxes. The NEU strategy applies a 3-tier Flux Network together with a Manipulation Network of global-change experiments, linked by common protocols to facilitate model application. Substantial progress has been made in modelling N fluxes, especially for N₂O, NO and bi-directional NH₃ exchange. Landscape analysis represents an emerging challenge to address the spatial interactions between farms, fields, ecosystems, catchments and air dispersion/deposition. European up-scaling of N fluxes is highly uncertain, and a key priority is better data on agricultural practices. Finally, attention is needed to develop N flux verification procedures to assess compliance with international protocols.

  18. Quantifying the abnormal hemodynamics of sickle cell anemia

    Science.gov (United States)

    Lei, Huan; Karniadakis, George

    2012-02-01

    Sickle red blood cells (SS-RBC) exhibit heterogeneous morphologies and abnormal hemodynamics in deoxygenated states. A multi-scale model for SS-RBC is developed based on the Dissipative Particle Dynamics (DPD) method. Different cell morphologies (sickle, granular, elongated shapes) typically observed in deoxygenated states are constructed and quantified by the asphericity and elliptical shape factors. The hemodynamics of SS-RBC suspensions is studied in both shear and pipe flow systems. The flow resistance obtained from both systems exhibits a larger value than healthy blood flow due to the abnormal cell properties. Moreover, SS-RBCs exhibit abnormal adhesive interactions with both the vessel endothelium cells and the leukocytes. The effect of these abnormal adhesive interactions on the hemodynamics of sickle blood is investigated using the current model. It is found that both the SS-RBC-endothelium and the SS-RBC-leukocyte interactions can potentially trigger the vicious "sickling and entrapment" cycles, resulting in the vaso-occlusion phenomena widely observed in micro-circulation experiments.
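
    The asphericity factor mentioned above has a standard definition from the eigenvalues of a shape's gyration tensor; a sketch, assuming a point-cloud representation of the cell membrane, is given below. The ellipsoidal test cloud is illustrative, and the exact normalisation used in the paper may differ.

    import numpy as np

    def asphericity(points):
        """points: (N, 3). Normalised asphericity in [0, 1]: 0 for a sphere,
        1 for a rod, computed from gyration-tensor eigenvalues as
        [(l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2] / [2 (l1+l2+l3)^2]."""
        centered = points - points.mean(axis=0)
        gyr = centered.T @ centered / len(points)
        l1, l2, l3 = np.linalg.eigvalsh(gyr)
        num = (l1 - l2)**2 + (l2 - l3)**2 + (l3 - l1)**2
        return 0.5 * num / (l1 + l2 + l3)**2

    rng = np.random.default_rng(0)
    u = rng.normal(size=(5000, 3))
    u /= np.linalg.norm(u, axis=1, keepdims=True)        # points on unit sphere
    print(asphericity(u))                                # ~0 (spherical)
    print(asphericity(u * np.array([4.0, 1.0, 1.0])))    # elongated -> larger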

  19. Quantifying stretching and rearrangement in epithelial sheet migration

    CERN Document Server

    Lee, Rachel M; Nordstrom, Kerstin N; Ouellette, Nicholas T; Losert, Wolfgang; 10.1088/1367-2630/15/2/025036

    2013-01-01

    Although understanding the collective migration of cells, such as that seen in epithelial sheets, is essential for understanding diseases such as metastatic cancer, this motion is not yet as well characterized as individual cell migration. Here we adapt quantitative metrics used to characterize the flow and deformation of soft matter to contrast different types of motion within a migrating sheet of cells. Using a finite-time Lyapunov exponent (FTLE) analysis, we find that, in spite of large fluctuations, the flow field of an epithelial cell sheet is not chaotic. Stretching of a sheet of cells (i.e., positive FTLE) is localized at the leading edge of migration. By decomposing the motion of the cells into affine and non-affine components using the metric D²_min, we quantify local plastic rearrangements and describe the motion of a group of cells in a novel way. We find an increase in plastic rearrangements with increasing cell density, whereas inanimate systems tend to exhibit less non-affine rearrangement.
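
    A minimal FTLE computation is sketched below: integrate a grid of tracers through the velocity field, form the Cauchy-Green deformation tensor from flow-map gradients, and take FTLE = ln(sqrt(lambda_max))/T. A synthetic analytic velocity field stands in for the PIV-derived cell-sheet velocities, and the integration parameters are illustrative.

    import numpy as np

    def velocity(x, y, t):
        # synthetic shear + vortex field (illustrative stand-in)
        u = 0.5 + 0.3 * np.sin(2 * np.pi * y) * np.cos(t)
        v = 0.3 * np.sin(2 * np.pi * x) * np.sin(t)
        return u, v

    # seed a grid of tracers and integrate the flow map with Euler steps
    n, dt, T = 120, 0.01, 2.0
    x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
    X, Y = x.copy(), y.copy()
    for step in range(int(T / dt)):
        u, v = velocity(X, Y, step * dt)
        X, Y = X + dt * u, Y + dt * v

    # Cauchy-Green tensor from flow-map gradients; FTLE = log(sqrt(lmax)) / T
    dx = 1.0 / (n - 1)
    Xx, Xy = np.gradient(X, dx, axis=1), np.gradient(X, dx, axis=0)
    Yx, Yy = np.gradient(Y, dx, axis=1), np.gradient(Y, dx, axis=0)
    trace = Xx**2 + Xy**2 + Yx**2 + Yy**2           # tr(C) for C = F^T F
    det = (Xx * Yy - Xy * Yx) ** 2                  # det(C) = det(F)^2
    lmax = 0.5 * (trace + np.sqrt(np.maximum(trace**2 - 4 * det, 0.0)))
    ftle = np.log(np.sqrt(lmax)) / T
    print(f"FTLE range: {ftle.min():.2f} to {ftle.max():.2f} per unit time")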

  20. Quantifying leaf venation patterns: two-dimensional maps.

    Science.gov (United States)

    Rolland-Lagan, Anne-Gaëlle; Amin, Mira; Pakulska, Malgosia

    2009-01-01

    The leaf vasculature plays crucial roles in transport and mechanical support. Understanding how vein patterns develop, and what underlies pattern variation between species, has many implications from both physiological and evolutionary perspectives. We developed a method for extracting spatial vein-pattern data from leaf images, such as vein densities as well as the sizes and shapes of the vein reticulations. We used this method to quantify leaf venation patterns of the first rosette leaf of Arabidopsis thaliana throughout a series of developmental stages. In particular, we characterized the size and shape of vein network areoles (loops), which enlarge and are split by new veins as a leaf develops. Pattern parameters varied in time and space: we observed a distal-to-proximal gradient in loop shape (length/width ratio) that varied over time, and a margin-to-center gradient in loop sizes. Quantitative analyses of vein patterns at the tissue level provide a two-way link between theoretical models of patterning and molecular experimental work to further explore patterning mechanisms during development. Such analyses could also be used to investigate the effect of environmental factors on vein patterns, or to compare venation patterns from different species for evolutionary studies. The method also provides a framework for gathering and overlaying two-dimensional maps of point, line and surface morphological data.
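
    Once a vein network is segmented to a binary image, areole (loop) statistics of the kind described can be read off the connected background regions; the sketch below does this with scikit-image, using fitted-ellipse axes for the length/width ratio. The synthetic vein grid is a stand-in for real segmented leaf images, and the authors' exact shape measures may differ.

    import numpy as np
    from skimage.measure import label, regionprops

    # synthetic "vein" image: foreground lines delimit rectangular areoles
    img = np.zeros((200, 300), dtype=bool)
    img[::40, :] = True      # horizontal veins
    img[:, ::60] = True      # vertical veins

    # areoles are the connected background regions enclosed by veins
    areoles = label(~img, connectivity=1)
    for region in regionprops(areoles):
        if region.area < 20:     # skip slivers at the image border
            continue
        ratio = region.major_axis_length / region.minor_axis_length
        print(f"areole area: {region.area:4d} px, length/width ~ {ratio:.2f}")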