WorldWideScience

Sample records for sophisticated analysis including

  1. Sophisticated Players and Sophisticated Agents

    NARCIS (Netherlands)

    Rustichini, A.

    1998-01-01

    A sophisticated player is an individual who takes the actions of opponents in a strategic situation as determined by the decisions of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic…

  2. Low Level RF Including a Sophisticated Phase Control System for CTF3

    CERN Document Server

    Mourier, J; Nonglaton, J M; Syratchev, I V; Tanner, L

    2004-01-01

    CTF3 (CLIC Test Facility 3), currently under construction at CERN, is a test facility designed to demonstrate the key feasibility issues of the CLIC (Compact LInear Collider) two-beam scheme. When completed, this facility will consist of a 150 MeV linac followed by two rings for bunch-interleaving, and a test stand where 30 GHz power will be generated. In this paper, the work that has been carried out on the linac's low power RF system is described. This includes, in particular, a sophisticated phase control system for the RF pulse compressor to produce a flat-top rectangular pulse over 1.4 µs.

  3. Library of sophisticated functions for analysis of nuclear spectra

    Science.gov (United States)

    Morháč, Miroslav; Matoušek, Vladislav

    2009-10-01

    In the paper we present a compact library for the analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data.

    Program summary
    Program title: SpecAnalysLib 1.1
    Catalogue identifier: AEDZ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 42 154
    No. of bytes in distributed program, including test data, etc.: 2 379 437
    Distribution format: tar.gz
    Programming language: C++
    Computer: Pentium 3 PC 2.4 GHz or higher; Borland C++ Builder v. 6. A precompiled Windows version is included in the distribution package
    Operating system: Windows 32-bit versions
    RAM: 10 MB
    Word size: 32 bits
    Classification: 17.6
    Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to giving physicists the possibility to use advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for the analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting.
    Solution method: The algorithms of background estimation are based on the Sensitive Non-linear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and algorithms based on discrete…
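
    The solution method names the SNIP clipping algorithm for background estimation. As a quick illustration of the idea only (not the library's actual C++ implementation), a minimal one-dimensional SNIP pass might look like the following sketch; the window depth and toy spectrum are invented:

      import numpy as np

      def snip_background(spectrum, iterations=24):
          """Estimate a smooth background by iterative min-clipping (the SNIP idea):
          at half-width p, clip each channel to the mean of its p-distant
          neighbours whenever that mean lies below the current value."""
          bg = spectrum.astype(float).copy()
          n = len(bg)
          for p in range(1, iterations + 1):
              clipped = bg.copy()
              for i in range(p, n - p):
                  clipped[i] = min(bg[i], 0.5 * (bg[i - p] + bg[i + p]))
              bg = clipped
          return bg

      counts = np.random.default_rng(1).poisson(100, size=512)  # toy spectrum
      net = counts - snip_background(counts)                    # background-subtracted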

  4. Sophistication and Performance of Italian Agri‐food Exports

    Directory of Open Access Journals (Sweden)

    Anna Carbone

    2012-06-01

    Nonprice competition is increasingly important in world food markets. Recently, the expression ‘export sophistication’ has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Hausmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, starting from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.
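
    The per-capita-GDP-weighted index referred to here has a standard construction in this literature: each product gets a PRODY score (an RCA-weighted average of exporter incomes), and a country's export basket gets an EXPY score (the export-share-weighted average of its products' PRODY). A toy sketch under that reading, with invented trade values:

      import numpy as np

      # Sketch of the PRODY/EXPY export-sophistication index
      # (Lall et al., 2006; Hausmann et al., 2007). Data are illustrative.
      x = np.array([[4.0, 1.0],   # exports: rows = countries, cols = products
                    [1.0, 5.0],
                    [2.0, 2.0]])
      gdp_pc = np.array([40_000.0, 25_000.0, 8_000.0])  # per capita GDP

      shares = x / x.sum(axis=1, keepdims=True)  # product shares in each country's exports
      rca = shares / shares.sum(axis=0)          # normalized revealed comparative advantage
      prody = rca.T @ gdp_pc                     # income level associated with each product
      expy = shares @ prody                      # sophistication of each export basket

      print("PRODY per product:", prody)
      print("EXPY per country :", expy)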

  5. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

    This study developed a detailed estimation by applying a comprehensive sensitivity analysis program to the reliability of TRU waste repository concepts in a crystalline rock condition. We examined each component and groundwater scenario of the geological repository and prepared systematic bases to examine the reliability from the point of comprehensiveness. Models and data were refined to examine the reliability. Based on existing TRU waste repository concepts, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which are to be determined quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of those specifications, reliability is re-examined for combinations of those parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including the fractured medium and permeable matrix medium, and refinement of the tools used to develop reliable combinations of parameters. It is significant to continue this study because the disposal concepts and specifications for TRU-nuclide-containing waste at various sites shall be determined rationally and safely through these studies. (author)

  6. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  7. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    Science.gov (United States)

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  8. Pension fund sophistication and investment policy

    NARCIS (Netherlands)

    de Dreu, J.|info:eu-repo/dai/nl/364537906; Bikker, J.A.|info:eu-repo/dai/nl/06912261X

    This paper assesses the sophistication of pension funds’ investment policies using data on 748 Dutch pension funds during the 1999–2006 period. We develop three indicators of sophistication: gross rounding of investment choices, investments in alternative sophisticated asset classes and ‘home bias’.

  9. In Praise of the Sophists.

    Science.gov (United States)

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  10. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
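
    The variance-explained figures come from using the index scores as predictors of human ratings. A minimal sketch of that modeling step, assuming a hypothetical CSV of TAALES output joined with ratings (file and column names are placeholders, not the study's data):

      import pandas as pd
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      # Sketch: modeling lexical proficiency scores from TAALES-style indices.
      df = pd.read_csv("taales_indices.csv")        # one row per text
      X = df.drop(columns=["proficiency_score"])    # lexical index columns
      y = df["proficiency_score"]                   # holistic human rating

      model = LinearRegression()
      r2 = cross_val_score(model, X, y, cv=10, scoring="r2").mean()
      print(f"Cross-validated variance explained: {r2:.2f}")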

  11. Financial Literacy and Financial Sophistication in the Older Population

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  12. Financial Literacy and Financial Sophistication in the Older Population.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  13. Moral foundations and political attitudes: The moderating role of political sophistication.

    Science.gov (United States)

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.

  14. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification. In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.
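
    The record does not define the dollar-loss metric precisely; one common reading is the root mean squared deviation between model and market option prices, in dollars. A toy ranking under that assumption (synthetic numbers; the model names are illustrative labels for real model families, not the paper's full set):

      import numpy as np

      # Sketch: ranking option-pricing models by dollar losses (RMSE of
      # model price minus market price). Synthetic data for illustration.
      rng = np.random.default_rng(0)
      market = rng.uniform(5, 50, size=200)         # observed option prices
      models = {                                    # model prices, differing error levels
          "scalar BEKK": market + rng.normal(0, 1.2, 200),
          "DCC":         market + rng.normal(0, 0.8, 200),
          "DCC, Laplace innovations": market + rng.normal(0, 0.7, 200),
      }

      losses = {name: np.sqrt(np.mean((p - market) ** 2)) for name, p in models.items()}
      for name, loss in sorted(losses.items(), key=lambda kv: kv[1]):
          print(f"{name:26s} RMSE dollar loss: {loss:.3f}")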

  15. Lexical Sophistication as a Multidimensional Phenomenon: Relations to Second Language Lexical Proficiency, Development, and Writing Quality

    Science.gov (United States)

    Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher

    2018-01-01

    This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…

  16. The First Sophists and the Uses of History.

    Science.gov (United States)

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  17. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  18. The Relationship between Logistics Sophistication and Drivers of the Outsourcing of Logistics Activities

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2008-10-01

    A strong link has been established between operational excellence and the degree of sophistication of logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT] and the formalization of logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have been increasingly outsourcing their logistics activities to third-party providers. This paper, based on a survey of large Brazilian shippers, addresses a gap in the literature by investigating the relationship between dimensions of logistics organization sophistication and drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to characteristics of their logistics organization, which may be particularly useful to logistics service providers.
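
    The two-step analysis described (factor analysis to extract sophistication dimensions, then logistic regressions against outsourcing drivers) can be sketched as follows; the survey file and column names are hypothetical placeholders:

      import pandas as pd
      from sklearn.decomposition import FactorAnalysis
      from sklearn.linear_model import LogisticRegression

      # Sketch: reduce survey items to sophistication dimensions, then relate
      # them to an outsourcing driver.
      survey = pd.read_csv("shipper_survey.csv")           # Likert-scale items
      items = survey.filter(like="item_")                  # sophistication questions

      fa = FactorAnalysis(n_components=6, random_state=0)  # six dimensions, as in the paper
      scores = fa.fit_transform(items)

      # Does each dimension predict outsourcing driven by cost savings (0/1)?
      logit = LogisticRegression().fit(scores, survey["outsources_for_cost_savings"])
      print("Coefficients per dimension:", logit.coef_.round(2))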

  19. Lexical Complexity Development from Dynamic Systems Theory Perspective: Lexical Density, Diversity, and Sophistication

    Directory of Open Access Journals (Sweden)

    Reza Kalantari

    2017-10-01

    This longitudinal case study explored Iranian EFL learners’ lexical complexity (LC) through the lenses of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to only lexical sophistication. Positive and significant relationships were found between time and mean values in Academic Word List and Beyond-2000 as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among LC indices indicated that lexical density enjoyed positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that the DST perspective specifies a viable foundation for analyzing lexical complexity.
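
    The RM ANOVA reported here tests whether an index changes across measurement occasions within learners. A minimal sketch with statsmodels, assuming a hypothetical long-format file (file and column names are placeholders):

      import pandas as pd
      from statsmodels.stats.anova import AnovaRM

      # Sketch: repeated measures ANOVA on a lexical index tracked over time.
      df = pd.read_csv("lexical_indices_long.csv")  # columns: learner, month, awl_ratio
      res = AnovaRM(df, depvar="awl_ratio", subject="learner", within=["month"]).fit()
      print(res)  # F-test for change in Academic Word List coverage over time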

  20. Does Investors' Sophistication Affect Persistence and Pricing of Discretionary Accruals?

    OpenAIRE

    Lanfeng Kao

    2007-01-01

    This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choice, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms faced with naive investors is lower than that for firms faced with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

  1. The conceptualization and measurement of cognitive health sophistication.

    Science.gov (United States)

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  2. Obfuscation, Learning, and the Evolution of Investor Sophistication

    OpenAIRE

    Bruce Ian Carlin; Gustavo Manso

    2011-01-01

    Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

  3. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  4. Gamma spectrum analysis including NAA with SAMPO for Windows

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1995-01-01

    SAMPO for Windows is a high-performance gamma spectrum analysis program. All the measurement, analysis and NAA phases can be done either under full interactive user control, or user-defined tasks can be used for automated measurement and analysis sequences, including control of MCAs and sample changers. High-resolution gamma-ray spectroscopy, together with the possibility to resolve complex multiplets with high accuracy, makes SAMPO very suitable for INAA. On the other hand, the possibility to automate analysis sequences allows it to be used effectively in routine NAA measurements as well. NAA in SAMPO is accomplished using comparative methods. Spectra of standards, flux monitors, controls and actual samples are analyzed normally to obtain the peak areas, which are optionally corrected for decay. In the comparison, the flux monitor results are used to correct for variations in the effective neutron flux. An optional irradiation position correction can also be applied. The controls are used to raise an alarm for possible deviations in the results. The sophisticated spectrum analysis methods used, together with the comparative NAA and monitors, give accurate results limited by systematic effects only. The Windows environment provides ease of use, and further processing power is available through the interface to expert-system identification of nuclides. (author) 19 refs.; 1 tab
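
    Comparative NAA as described reduces to scaling the standard's known concentration by decay-corrected peak-area ratios with a flux-monitor correction. A simplified sketch of that arithmetic with invented values (SAMPO's own implementation is not shown in the record):

      import math

      # Sketch of comparative NAA: concentration from peak areas of sample vs.
      # standard, with decay and flux-monitor corrections. Values are illustrative.
      HALF_LIFE_H = 2.58                    # e.g. a short-lived activation product
      lam = math.log(2) / HALF_LIFE_H

      def decay_corrected(area, decay_time_h):
          """Correct a measured peak area back to the end of irradiation."""
          return area * math.exp(lam * decay_time_h)

      a_sample = decay_corrected(area=12_400, decay_time_h=3.0)
      a_standard = decay_corrected(area=25_100, decay_time_h=1.5)

      c_standard = 50.0                     # µg/g in the standard
      flux_ratio = 0.97                     # flux monitor: sample vs. standard position
      m_sample, m_standard = 0.102, 0.100   # masses in g

      c_sample = c_standard * (a_sample / m_sample) / (a_standard / m_standard) / flux_ratio
      print(f"Sample concentration: {c_sample:.1f} µg/g")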

  5. EU-Korea FTA and Its Impact on V4 Economies. A Comparative Analysis of Trade Sophistication and Intra-Industry Trade

    Directory of Open Access Journals (Sweden)

    Michalski Bartosz

    2018-03-01

    This paper investigates selected short- and mid-term effects on trade in goods between the Visegrad countries (V4: the Czech Republic, Hungary, Poland and the Slovak Republic) and the Republic of Korea under the framework of the Free Trade Agreement between the European Union and the Republic of Korea. This Agreement is described in the “Trade for All” (2015: 9) strategy as the most ambitious trade deal ever implemented by the EU. The primary purpose of our analysis is to identify, compare, and evaluate the evolution of the technological sophistication of bilateral exports and imports. Another dimension of the paper concentrates on the developments within intra-industry trade. Moreover, these objectives are approached taking into account the context of the South Korean direct investment inflow to the V4. The evaluation of technological sophistication is based on UNCTAD’s methodology, while the intensity of intra-industry trade is measured by the GL-index and the identification of its subcategories (horizontal and vertical trade). The analysis covers the timespan 2001–2015. The novelty of the paper lies in the fact that the study of South Korean-V4 trade relations has not so far been carried out from this perspective. Thus this paper investigates interesting phenomena identified in the trade between the Republic of Korea (ROK) and the V4 economies. The main findings imply an impact of South Korean direct investments on trade. This is represented by the trade deficit of the V4 with the ROK and the structure of bilateral trade in terms of its technological sophistication. South Korean investments might also have had positive consequences for the evolution of IIT, particularly in the machinery sector. The political interpretation indicates that they may strengthen common threats associated with the middle-income trap, particularly the technological gap and the emphasis placed on lower costs of production.
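
    The GL-index mentioned here is the Grubel-Lloyd measure of intra-industry trade. A small sketch of the sector-level index and its trade-weighted aggregate, with toy export/import values rather than the paper's data:

      # Sketch: Grubel-Lloyd (GL) index of intra-industry trade per sector and
      # its trade-weighted aggregate. Toy values for illustration.
      def gl_index(exports: float, imports: float) -> float:
          """GL = 1 - |X - M| / (X + M); 1 = pure intra-industry trade."""
          return 1 - abs(exports - imports) / (exports + imports)

      sectors = {"machinery": (120.0, 340.0), "chemicals": (80.0, 75.0)}
      total = sum(x + m for x, m in sectors.values())

      weighted_gl = sum((x + m) / total * gl_index(x, m) for x, m in sectors.values())
      for name, (x, m) in sectors.items():
          print(f"{name:10s} GL = {gl_index(x, m):.3f}")
      print(f"aggregate GL = {weighted_gl:.3f}")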

  6. The role of sophisticated accounting system in strategy management

    OpenAIRE

    Naranjo Gil, David

    2004-01-01

    Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

  7. Cognitive Load and Strategic Sophistication

    OpenAIRE

    Allred, Sarah; Duffy, Sean; Smith, John

    2013-01-01

    We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game; then they are asked to recall...

  8. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    Science.gov (United States)

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  9. Putin’s Russia: Russian Mentality and Sophisticated Imperialism in Military Policies

    OpenAIRE

    Szénási, Lieutenant-Colonel Endre

    2016-01-01

    In my experience, the Western world hopelessly fails to understand the Russian mentality, or misinterprets it. During my analysis of the Russian way of thinking I devoted special attention to the examination of military mentality. I have connected the issue of the Russian way of thinking to the contemporary imperial policies of Putin’s Russia. I have also attempted to prove the level of sophistication of both. I hope that a better understanding of both the Russian mentality and imperi...

  10. The predictors of economic sophistication: media, interpersonal communication and negative economic experiences

    NARCIS (Netherlands)

    Kalogeropoulos, A.; Albæk, E.; de Vreese, C.H.; van Dalen, A.

    2015-01-01

    In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

  11. PAUL AND SOPHISTIC RHETORIC: A PERSPECTIVE ON HIS ...

    African Journals Online (AJOL)

    …use of modern rhetorical theories but analyses the letter in terms of the clas-… If a critical reader would have had the traditional anti-sophistic arsenal… …pressions and that ‘rhetoric’ is mainly a matter of communicating these thoughts.

  12. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    Science.gov (United States)

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum…

  13. SMEs and new ventures need business model sophistication

    DEFF Research Database (Denmark)

    Kesting, Peter; Günzel-Jensen, Franziska

    2015-01-01

    …, and Spreadshirt, this article develops a framework that introduces five business model sophistication strategies: (1) uncover additional functions of your product, (2) identify strategic benefits for third parties, (3) take advantage of economies of scope, (4) utilize cross-selling opportunities, and (5) involve...

  14. Sophisticated Approval Voting, Ignorance Priors, and Plurality Heuristics: A Behavioral Social Choice Analysis in a Thurstonian Framework

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia

    2007-01-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…

  15. Cognitive ability rivals the effect of political sophistication on ideological voting

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig

    2016-01-01

    This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals the effect of the traditionally strongest predictor of ideological voting, political sophistication. Furthermore, the results are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability remains, however, and is not explained by differences in education or Openness to experience either. The implications of these results for democratic theory are discussed.

  16. Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5

    Science.gov (United States)

    Candiotto, Laura

    2018-01-01

    This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…

  17. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Directory of Open Access Journals (Sweden)

    Daniel Müllensiefen

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI), to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  18. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Science.gov (United States)

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  19. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    Science.gov (United States)

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

  20. Procles the Carthaginian: A North African Sophist in Pausanias’ Periegesis

    Directory of Open Access Journals (Sweden)

    Juan Pablo Sánchez Hernández

    2010-11-01

    Procles, cited by Pausanias (in the imperfect tense) about a display in Rome and for an opinion about Pyrrhus of Epirus, probably was not a historian of Hellenistic date, but a contemporary sophist whom Pausanias encountered in person in Rome.

  1. Does underground storage still require sophisticated studies?

    International Nuclear Information System (INIS)

    Marsily, G. de

    1997-01-01

    Most countries agree on the necessity of burying high- or medium-level wastes in geological layers situated a few hundred meters below ground level. The advantages and disadvantages of different types of rock, such as salt, clay, granite and volcanic material, are examined. Sophisticated studies are conducted to determine the best geological confinement, but questions arise about the time for which safety must be ensured. France has chosen three possible sites. These sites are geologically described in the article. The final site will be proposed after a testing phase of about five years in an underground facility. (A.C.)

  2. The sophisticated control of the tram bogie on track

    Directory of Open Access Journals (Sweden)

    Radovan DOLECEK

    2015-09-01

    The paper deals with routing control algorithms for a new conception of tram vehicle bogie. The main goal of these research activities is to reduce wear of rail wheels and tracks, reduce traction energy losses and increase running comfort. An experimental tram vehicle with a special bogie construction, powered by a traction battery, is utilized for these purposes. This vehicle has a rotary bogie with independent rotating wheels driven by permanent magnet synchronous motors and a solid axle. The wheel forces in the bogie are measured by a large number of sensors placed on the experimental tram vehicle. The designed control algorithms are currently implemented in the vehicle's supervisory control system. Traction requirements and track characteristics affect these control algorithms. This control, including sophisticated routing, brings further improvements, which are verified and corrected according to individual traction and driving characteristics, and opens new possibilities.

  3. Finding the Fabulous Few: Why Your Program Needs Sophisticated Research.

    Science.gov (United States)

    Pfizenmaier, Emily

    1981-01-01

    Fund raising, it is argued, needs sophisticated prospect research. Professional prospect researchers play an important role in helping to identify prospective donors and also in helping to stimulate interest in gift giving. A sample of an individual work-up on a donor and a bibliography are provided. (MLW)

  4. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    Directory of Open Access Journals (Sweden)

    Marie Devaine

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  5. Few remarks on chiral theories with sophisticated topology

    International Nuclear Information System (INIS)

    Golo, V.L.; Perelomov, A.M.

    1978-01-01

    Two classes of two-dimensional Euclidean chiral field theories are singled out: 1) the field phi(x) takes values in a compact Hermitian symmetric space; 2) the field phi(x) takes values in an orbit of the adjoint representation of a compact Lie group. The theories have sophisticated topological and rich analytical structures. They are considered with the help of topological invariants (topological charges). Explicit formulae for the topological charges are indicated, and a lower bound estimate for the action is given

  6. STOCK EXCHANGE LISTING INDUCES SOPHISTICATION OF CAPITAL BUDGETING

    Directory of Open Access Journals (Sweden)

    Wesley Mendes-da-Silva

    2014-08-01

    This article compares capital budgeting techniques employed in listed and unlisted companies in Brazil. We surveyed the Chief Financial Officers (CFOs) of 398 listed companies and 300 large unlisted companies, and based on 91 respondents, the results suggest that the CFOs of listed companies tend to use less simplistic methods more often, for example NPV and CAPM, and that CFOs of unlisted companies are less likely to estimate the cost of equity, despite being large companies. These findings indicate that stock exchange listing may require greater sophistication of the capital budgeting process.
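
    The two "less simplistic" methods named, NPV and CAPM, combine naturally: CAPM supplies the discount rate and NPV discounts the cash flows. A minimal sketch with illustrative inputs (none of the numbers come from the survey):

      # Sketch: NPV with a CAPM-based discount rate. All inputs are illustrative.
      def capm_cost_of_equity(risk_free: float, beta: float, market_return: float) -> float:
          """CAPM: r_e = r_f + beta * (r_m - r_f)."""
          return risk_free + beta * (market_return - risk_free)

      def npv(rate: float, cash_flows: list[float]) -> float:
          """Cash flow at t=0 is the (negative) initial investment."""
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

      r_e = capm_cost_of_equity(risk_free=0.06, beta=1.2, market_return=0.13)
      project = [-1000.0, 300.0, 400.0, 500.0, 300.0]
      print(f"Cost of equity: {r_e:.1%}, NPV: {npv(r_e, project):.1f}")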

  7. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight

    Science.gov (United States)

    Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.

    2013-01-01

    Children born very low birth weight (VLBW) are at risk for delays in the development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish speaking Hispanic, English speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, and cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed by play sophistication. Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as

  8. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)
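
    In a probabilistic erosion framework of the kind referred to, sampled storm forcing is passed through a structural function (the erosion model) to produce exceedance estimates. A deliberately crude sketch of that pipeline, not Callaghan et al.'s actual method; the distributions and the structural function are placeholders:

      import numpy as np

      # Sketch: sample storm forcing, apply a structural function, read off
      # exceedance statistics. The structural function is a crude placeholder.
      rng = np.random.default_rng(42)
      n_years, storms_per_year = 10_000, 5

      hs = rng.weibull(1.5, size=(n_years, storms_per_year)) * 2.0 + 1.0  # wave height (m)
      duration = rng.exponential(12.0, size=hs.shape)                     # duration (h)

      def erosion_volume(hs, duration):
          """Placeholder: erosion grows with wave energy and storm duration."""
          return 0.8 * hs ** 2 * np.sqrt(duration)

      annual_max = erosion_volume(hs, duration).max(axis=1)
      for rp in (10, 100):   # return periods in years
          print(f"{rp}-year erosion: {np.quantile(annual_max, 1 - 1/rp):.1f} m^3/m")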

  9. Sophisticating a naive Liapunov function

    International Nuclear Information System (INIS)

    Smith, D.; Lewins, J.D.

    1985-01-01

    The art of the direct method of Liapunov to determine system stability is to construct a suitable Liapunov or V function, where V is to be positive definite (PD) and to shrink to a center, which may be conveniently chosen as the origin, and where its time derivative V̇ is negative definite (ND). One aid to the art is to solve an approximation to the system equations in order to provide a candidate V function. It can happen, however, that V̇ is not strictly ND but vanishes at a finite number of isolated points. Naively, one anticipates that stability has been demonstrated, since the trajectory of the system at such points is only momentarily tangential and immediately enters a region of inward-directed trajectories. To demonstrate stability rigorously requires the construction of a sophisticated Liapunov function from what can be called the naive original choice. In this paper, the authors demonstrate the method of perturbing the naive function in the context of the well-known second-order oscillator and then apply the method to a more complicated problem based on a prompt jump model for a nuclear fission reactor
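
    For the second-order oscillator mentioned, the perturbation idea can be made concrete. A worked sketch in the classic setting, where the naive derivative vanishes on a whole line rather than only at isolated points (the damping and the perturbation term are my choices, not necessarily the paper's):

      For \ddot{x} + \dot{x} + x = 0, i.e. \dot{x} = y, \; \dot{y} = -x - y, the naive choice
          V(x,y) = \tfrac{1}{2}(x^2 + y^2)
      gives
          \dot{V} = xy + y(-x - y) = -y^2,
      which is only negative semidefinite: it vanishes on the whole line y = 0. Perturbing with a small cross term,
          V_\varepsilon(x,y) = \tfrac{1}{2}(x^2 + y^2) + \varepsilon xy, \qquad
          \dot{V}_\varepsilon = -\varepsilon x^2 - \varepsilon xy - (1 - \varepsilon)y^2,
      makes \dot{V}_\varepsilon negative definite whenever \varepsilon^2 < 4\varepsilon(1 - \varepsilon), i.e. 0 < \varepsilon < \tfrac{4}{5}, while V_\varepsilon itself remains positive definite for \varepsilon < 1.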

  10. Strategic sophistication of individuals and teams. Experimental evidence

    Science.gov (United States)

    Sutter, Matthias; Czermak, Simon; Feri, Francesco

    2013-01-01

    Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability to play strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher order beliefs. PMID:24926100
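
    The "best response to stated first-order beliefs" check used here is mechanical: compute expected payoffs under the stated belief and see whether the actual choice maximizes them. A toy sketch (the payoff matrix, belief, and choice are invented for illustration):

      import numpy as np

      # Sketch: is a subject's choice a best response to their stated
      # first-order belief about the opponent?
      payoffs = np.array([[8, 2, 0],   # row player's payoffs:
                          [5, 5, 1],   # rows = own actions, cols = opponent actions
                          [3, 4, 9]])
      belief = np.array([0.6, 0.3, 0.1])  # stated probabilities over opponent actions

      expected = payoffs @ belief          # expected payoff of each own action
      choice = 0                           # action the subject actually played
      print("Expected payoffs:", expected)
      print("Best response?", choice == int(np.argmax(expected)))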

  11. Development Strategies for Tourism Destinations: Tourism Sophistication vs. Resource Investments

    OpenAIRE

    Rainer Andergassen; Guido Candela

    2010-01-01

    This paper investigates the effectiveness of development strategies for tourism destinations. We argue that resource investments unambiguously increase tourism revenues and that increasing the degree of tourism sophistication, that is increasing the variety of tourism related goods and services, increases tourism activity and decreases the perceived quality of the destination's resource endowment, leading to an ambiguous effect on tourism revenues. We disentangle these two effects and charact...

  12. Do organizations adopt sophisticated capital budgeting practices to deal with uncertainty in the investment decision? : A research note

    NARCIS (Netherlands)

    Verbeeten, Frank H M

    This study examines the impact of uncertainty on the sophistication of capital budgeting practices. While the theoretical applications of sophisticated capital budgeting practices (defined as the use of real option reasoning and/or game theory decision rules) have been well documented, empirical

  13. Sophistication of burnup analysis system for fast reactor

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hirai, Yasushi; Hyoudou, Hideaki; Tatsumi, Masahiro

    2010-02-01

    Improvement of the prediction accuracy for neutronics properties of fast reactor cores is one of the most important study domains, in terms both of achieving high economical plant efficiency based on reasonably advanced designs and of increasing reliability and safety margins. In a former study, considerable improvement in the prediction accuracy of neutronics design was achieved through the development of the unified constants library, a fruit of a series of critical experiments such as JUPITER, applied via reactor constant adjustments. For the design of fast reactor cores, however, improvement of not only static properties but also burnup properties is very important. For this purpose, it is necessary to improve the prediction accuracy of burnup properties using actual burnup data from 'JOYO' and 'MONJU', experimental and prototype fast reactors. Recently, the study of effective burnup methods for minor actinides has also become an important theme. However, there is a problem that analysis work tends to become inefficient, for lack of functionality suitable for the analysis of composition changes due to burnup, since the conventional analysis system is targeted at critical assemblies. Therefore, a burnup analysis system for fast reactors with modularity and flexibility is being developed that would contribute to actual core design work and to improvement of the prediction accuracy. In previous research, we developed a prototype system with functions for performing core and burnup calculations using given constant files (PDS files) and information based on simple user input data. It also has fuel shuffling functions, which are indispensable for production systems. In the present study, we implemented functions for cell calculations and burnup calculations. With this, all steps in the analysis can be carried out with this system alone. In addition, we modified the specification of the user input to improve the convenience of the system. Since implementations being done so…

  14. "SOCRATICS" AS ADDRESSES OF ISOCRATES’ EPIDEICTIC SPEECHES (Against the Sophists, Encomium of Helen, Busiris

    Directory of Open Access Journals (Sweden)

    Anna Usacheva

    2012-06-01

    This article analyses the three epideictic orations of Isocrates, which are in themselves a precious testimony to the quality of intellectual life at the close of the fourth century before Christ. To this period also belong the Socratics, who are generally seen as an important link between Socrates and Plato. The author of this article proposes a more productive approach to the study of Antisthenes, Euclid of Megara and other so-called Socratics, revealing them not as independent thinkers but rather as adherents of the sophistic school and also as teachers, thereby including them among those who took part in the educative activity of their time.

  15. Impact of sophisticated fog spray models on accident analyses

    International Nuclear Information System (INIS)

    Roblyer, S.P.; Owzarski, P.C.

    1978-01-01

    The N-Reactor confinement system release dose to the public in a postulated accident is reduced by washing the confinement atmosphere with fog sprays. This allows a low-pressure release of confinement atmosphere containing fission products through filters and out an elevated stack. The current accident analysis required revision of the CORRAL code and of other codes such as CONTEMPT to properly model the N Reactor confinement as a system of multiple fog-sprayed compartments. In revising these codes, more sophisticated models for the fog sprays and iodine plateout were incorporated, removing some of the conservatism in the steam condensing rate, fission product washout and iodine plateout used in previous studies. The CORRAL code, which was used to describe the transport and deposition of airborne fission products in LWR containment systems for the Rasmussen Study, was revised to describe fog spray removal of molecular iodine (I2) and particulates in multiple compartments for sprays having individual characteristics of on-off times, flow rates, fall heights, and drop sizes in changing containment atmospheres. During postulated accidents, the code determined the fission product removal rates internally rather than from input decontamination factors. A discussion is given of how the calculated plateout and washout rates vary with time throughout the analysis. The results of the accident analyses indicated that more credit could be given to fission product washout and plateout. An important finding was that the release of fission products to the atmosphere and the adsorption of fission products on the filters were significantly lower than previous studies had indicated

  16. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average options

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco

    We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ… …innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.

  17. Sophistic Ethics in the Technical Writing Classroom: Teaching "Nomos," Deliberation, and Action.

    Science.gov (United States)

    Scott, J. Blake

    1995-01-01

    Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

  18. Sophisticated Fowl: The Complex Behaviour and Cognitive Skills of Chickens and Red Junglefowl

    Directory of Open Access Journals (Sweden)

    Laura Garnham

    2018-01-01

    The world’s most numerous bird, the domestic chicken, and their wild ancestor, the red junglefowl, have long been used as model species for animal behaviour research. Recently, this research has advanced our understanding of the social behaviour, personality, and cognition of fowl, and demonstrated their sophisticated behaviour and cognitive skills. Here, we overview some of this research, starting by describing research investigating the well-developed senses of fowl, before presenting how socially and cognitively complex they can be. The realisation that domestic chickens, our most abundant production animal, are behaviourally and cognitively sophisticated should encourage an increase in general appreciation of, and fascination with, them. In turn, this should inspire increased use of them as both research and hobby animals, as well as improvements in their unfortunately often poor welfare.

  19. Assessing Epistemic Sophistication by Considering Domain-Specific Absolute and Multiplicistic Beliefs Separately

    Science.gov (United States)

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-01-01

    Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

  20. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since, in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; however, more musically sophisticated speakers do show enhanced pitch discrimination compared to Finnish speakers with less musical experience, as well as greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as into the effects of interaction of specific language features with musical experiences. PMID:28450829

  1. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers.

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since, in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; however, more musically sophisticated speakers do show enhanced pitch discrimination compared to Finnish speakers with less musical experience, as well as greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as into the effects of interaction of specific language features with musical experiences.

  2. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    Science.gov (United States)

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and on patient outcomes in these settings deserve further attention. In this research, we describe a mixed-method approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups, to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze the data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions: as more processes in resident care and administrative activities are supported by technology, the number of observed unique interactions decreases. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. These results provide evidence for improving the design and implementation of IT in long term care systems to support
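
    The reported negative correlation is, computationally, a straightforward association between a facility-level ITS score and the count of unique observed interactions. A minimal sketch with hypothetical numbers (not the study's data):

```python
# Minimal sketch (hypothetical data): correlating an ITS dimension score
# with the number of unique care-provider interactions observed per site.
import numpy as np
from scipy.stats import pearsonr

its_score = np.array([72, 65, 58, 80, 45, 51, 69, 74])            # illustrative
unique_interactions = np.array([14, 18, 22, 11, 27, 25, 16, 13])  # illustrative

r, p = pearsonr(its_score, unique_interactions)
print(f"r = {r:.2f}, p = {p:.3f}")  # a negative r mirrors the reported direction
```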

  3. Reacting to Neighborhood Cues?: Political Sophistication Moderates the Effect of Exposure to Immigrants

    DEFF Research Database (Denmark)

    Danckert, Bolette; Dinesen, Peter Thisted; Sønderskov, Kim Mannemar

    2017-01-01

…is founded on politically sophisticated individuals having a greater comprehension of news and other mass-mediated sources, which makes them less likely to rely on neighborhood cues as sources of information relevant for political attitudes. Based on a unique panel data set with fine-grained information...

  4. Sophistication of burnup analysis system for fast reactor (2)

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hirai, Yasushi; Tatsumi, Masahiro

    2010-10-01

Improvement of prediction accuracy for the neutronics characteristics of fast reactor cores is one of the most important study domains, both for achieving high plant economy based on reasonably advanced designs and for increasing reliability and safety margins. In a former study, considerable improvement of prediction accuracy in neutronics design was achieved through the development of the unified cross-section set, a fruit of a series of critical experiments such as JUPITER, applied via reactor constant adjustments. For the design of fast reactor cores, improvement not only of static characteristics but also of burnup characteristics is very important. For this purpose, it is necessary to improve the prediction accuracy for burnup characteristics using actual burnup data of 'JOYO' and 'MONJU', the experimental and prototype fast reactors. Recently, the study of effective burnup methods for minor actinides has also become an important theme. However, analysis work tends to become inefficient because the conventional analysis system is targeted at critical assemblies and lacks functionality suitable for analyzing composition changes due to burnup. Therefore, a burnup analysis system for fast reactors with modularity and flexibility is being developed that would contribute to actual core design work and to improvement of prediction accuracy. In the previous study, we developed a prototype system with functions for performing core and burnup calculations using given constant files (PDS files) and information based on simple user input data. It also has fuel shuffling functions, which are indispensable for power reactor analysis systems. In the present study, by extending the prototype system, features for the handling of control rods and the energy collapse of group constants have been designed and implemented. Computational results from the present analysis system are stored in restart files which can be accessible by
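
    The core numerical task in such a burnup module is a depletion step. A toy sketch of one such step for a three-nuclide chain, using a matrix exponential (decay constants and cross sections are illustrative, and each removal is assumed to feed the next nuclide):

```python
import numpy as np
from scipy.linalg import expm

lam = np.array([1e-9, 2e-6, 5e-7])        # decay constants [1/s] (illustrative)
sigma_phi = np.array([3e-9, 1e-8, 4e-9])  # one-group absorption x flux [1/s]

r = lam + sigma_phi                       # total removal rates
A = np.diag(-r)                           # depletion matrix: dN/dt = A N
A[1, 0] = r[0]                            # nuclide 0 feeds nuclide 1 (toy chain)
A[2, 1] = r[1]                            # nuclide 1 feeds nuclide 2

N0 = np.array([1.0e24, 0.0, 0.0])         # initial number densities [1/m^3]
dt = 30 * 24 * 3600.0                     # one 30-day burnup step [s]
N = expm(A * dt) @ N0
print(N)
```

    A real system wraps many such steps with flux recalculation, fuel shuffling, and restart-file handling, as described above.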

  5. Financial Sophistication and the Distribution of the Welfare Cost of Inflation

    OpenAIRE

    Paola Boel; Gabriele Camera

    2009-01-01

    The welfare cost of anticipated inflation is quantified in a calibrated model of the U.S. economy that exhibits tractable equilibrium dispersion in wealth and earnings. Inflation does not generate large losses in societal welfare, yet its impact varies noticeably across segments of society depending also on the financial sophistication of the economy. If money is the only asset, then inflation hurts mostly the wealthier and more productive agents, while those poorer and less productive may ev...

  6. The relation between maturity and sophistication shall be properly dealt with in nuclear power development

    International Nuclear Information System (INIS)

    Li Yongjiang

    2009-01-01

The paper analyses the advantages and disadvantages, in terms of safety and economy, of the second generation improved technologies and the third generation technologies mainly developed in China. The paper also discusses the maturity of the second generation improved technologies and the sophistication of the third generation technologies, respectively. Meanwhile, the paper proposes that the advantages and disadvantages of the second generation improved technologies and the third generation technologies should be carefully taken into consideration, and that the relationship between maturity and sophistication should be properly dealt with at the current stage. A two-step strategy should be adopted to solve the problem of insufficient nuclear power capacity while tracking and developing the third generation technologies, so as to ensure the sound and fast development of nuclear power. (authors)

  7. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Science.gov (United States)

    de Sá, Fábio P; Zina, Juliana; Haddad, Célio F B

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  8. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Directory of Open Access Journals (Sweden)

    Fábio P de Sá

Full Text Available Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  9. Close to the Clothes : Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  10. Close to the Clothes: Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  11. A Snapshot of Serial Rape: An Investigation of Criminal Sophistication and Use of Force on Victim Injury and Severity of the Assault.

    Science.gov (United States)

    de Heer, Brooke

    2016-02-01

Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers of serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the rapist's level of criminal sophistication and use of force were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes between a White rapist and a White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of the number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.

  12. Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508

    International Nuclear Information System (INIS)

    Hayek, A; Al Bokhaiti, M; Schwarz, M H; Boercsoek, J

    2012-01-01

With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and the mean time to failure (MTTF), in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4-architecture (one out of four) presented in recent work; sophisticated calculations for the required parameters are introduced. The provided 1oo4-architecture represents an advanced safety architecture based on on-chip redundancy, which is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
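
    For orientation, the simplest textbook approximation for a 1ooN group, PFD_avg ≈ (λ_DU·T)^N/(N+1), already shows why 1oo4 is so robust. The paper's calculations are considerably more refined (common-cause failures, diagnostics, repair), so the sketch below is only a first-order illustration with made-up numbers:

```python
# Simplified 1ooN average probability of failure on demand; neglects
# common-cause failures, diagnostic coverage, and repair.
lambda_du = 1e-6   # dangerous undetected failure rate [1/h] (illustrative)
t_proof = 8760.0   # proof-test interval [h] (illustrative)

def pfd_avg_1ooN(lam, t, n):
    return (lam * t) ** n / (n + 1)

for n in (1, 2, 4):
    print(f"1oo{n}: PFD_avg = {pfd_avg_1ooN(lambda_du, t_proof, n):.3e}")
```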

  13. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.
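
    The three decision rules are easy to simulate; a toy sketch with random utilities (all numbers hypothetical), modeling the ignorance-prior approval rule as approving every candidate above one's own mean utility:

```python
import numpy as np

rng = np.random.default_rng(0)
n_voters, n_cands = 1000, 4
u = rng.random((n_voters, n_cands))       # each voter's utilities (hypothetical)

# Approval voting with ignorance prior: approve candidates above mean utility
approvals = (u > u.mean(axis=1, keepdims=True)).sum(axis=0)

# Sincere plurality heuristic: vote only for the single favorite
sincere = np.bincount(u.argmax(axis=1), minlength=n_cands)

# Strategic plurality heuristic: vote for the preferred of the 2 front-runners
front = np.argsort(sincere)[-2:]
strategic = np.bincount(front[u[:, front].argmax(axis=1)], minlength=n_cands)

print(approvals, sincere, strategic)
```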

  14. Interaction Analysis: Theory, Research and Application.

    Science.gov (United States)

    Amidon, Edmund J., Ed.; Hough, John J., Ed.

    This volume of selected readings developed for students and practitioners at various levels of sophistication is intended to be representative of work done to date on interaction analysis. The contents include journal articles, papers read at professional meetings, abstracts of doctoral dissertations, and selections from larger monographs, plus 12…

  15. Analysis of Qualitative Interviews about the Impact of Information Technology on Pressure Ulcer Prevention Programs: Implications for the Wound Ostomy Continence Nurse

    Science.gov (United States)

    Shepherd, Marilyn Murphy; Wipke-Tevis, Deidre D.; Alexander, Gregory L.

    2015-01-01

Purpose: The purpose of this study was to compare pressure ulcer prevention programs in 2 long-term care (LTC) facilities with diverse information technology sophistication (ITS), one with high sophistication and one with low sophistication, and to identify implications for the Wound Ostomy Continence (WOC) nurse. Design: Secondary analysis of narrative data obtained from a mixed-methods study. Subjects and Setting: The study setting was 2 LTC facilities in the Midwestern United States. The sample comprised 39 staff from the 2 facilities, including 26 from the high-ITS facility and 13 from the low-ITS facility. Respondents included certified nurse assistants, certified medical technicians, restorative medical technicians, social workers, registered nurses, licensed practical nurses, information technology staff, administrators, and directors. Methods: This study is a secondary analysis of interviews regarding communication and education strategies in the two LTC agencies. The analysis focused on the focus-group interviews, which included both direct and non-direct care providers. Results: Eight themes (codes) were identified in the analysis. Three themes are presented individually, with exemplars of communication and education strategies. The analysis revealed specific differences between the high-ITS and the low-ITS facility in regard to education and communication involving pressure ulcer prevention. These differences have direct implications for WOC nurses consulting in the LTC setting. Conclusions: Findings from this study suggest that effective strategies for staff education and communication regarding pressure ulcer (PU) prevention differ based on the level of ITS within a given facility. Specific strategies for education and communication are suggested for agencies with high ITS and for agencies with low ITS sophistication. PMID:25945822

  16. Sophisticated Search Capabilities in the ADS Abstract Service

    Science.gov (United States)

    Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Henneken, E.; Kurtz, M. J.; Murray, S. S.

    2003-12-01

The ADS provides access to over 940,000 references from astronomy and planetary sciences publications and 1.5 million records from physics publications. It is funded by NASA and provides free access to these references, as well as to 2.4 million scanned pages from the astronomical literature. These include most of the major astronomy and several planetary sciences journals, as well as many historical observatory publications. The references now include the abstracts from all volumes of the Journal of Geophysical Research (JGR) since the beginning of 2002. We receive these abstracts on a regular basis. The Kluwer journal Solar Physics has been scanned back to volume 1 and is available through the ADS. We have extracted the reference lists from this and many other journals and included them in the reference and citation database of the ADS. We have recently scanned Earth, Moon and Planets, another Kluwer journal, and will scan other Kluwer journals in the future as well; we plan to extract references from these journals in the near future. The ADS has many sophisticated query features that allow the user to formulate complex queries. Using results lists to get further information about the selected articles provides the means to quickly find important and relevant articles from the database. Three advanced feedback queries are available from the bottom of the ADS results list (in addition to the regular feedback queries already available from the abstract page and from the bottom of the results list): 1. Get reference list for selected articles: this query returns all known references for the selected articles (or for all articles in the first list). The resulting list will be ranked according to how often each article is referred to and will show the most referenced articles in the field of study that created the first list; it presumably shows the most important articles in that field. 2. Get citation list for selected articles: this returns all known articles

  17. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    Directory of Open Access Journals (Sweden)

    Thomas Akam

    2015-12-01

    Full Text Available The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.
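
    To make the model-free baseline concrete, here is a minimal sketch of a purely model-free learner on a two-step-like task, assuming the common 70/30 transition structure and fixed (rather than drifting) reward probabilities; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, eps, n_trials = 0.1, 0.1, 5000
q1 = np.zeros(2)                  # first-stage action values
q2 = np.zeros(2)                  # second-stage state values
p_reward = np.array([0.8, 0.2])   # reward probability per second-stage state

for _ in range(n_trials):
    a = rng.integers(2) if rng.random() < eps else int(q1[1] > q1[0])
    s2 = a if rng.random() < 0.7 else 1 - a   # common vs rare transition
    r = float(rng.random() < p_reward[s2])
    q2[s2] += alpha * (r - q2[s2])            # update the state visited
    q1[a] += alpha * (r - q1[a])              # model-free: credit the action taken

print(q1, q2)
```

    A model-based learner would instead credit first-stage actions through the transition model, which is exactly the difference the task is designed to expose.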

  18. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task

    Science.gov (United States)

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-01-01

    The recently developed ‘two-step’ behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects’ investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues. PMID:26657806

  19. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    Science.gov (United States)

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-12-01

    The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.

  20. Safe distance car-following model including backward-looking and its stability analysis

    Science.gov (United States)

    Yang, Da; Jin, Peter Jing; Pu, Yun; Ran, Bin

    2013-03-01

The focus of this paper is car-following behavior that includes backward-looking, simply called bi-directional looking car-following behavior. This study is motivated by the potential changes in the physical properties of traffic flow caused by the fast developing intelligent transportation system (ITS), especially the new connected vehicle technology. Existing studies on this topic have focused on general motors (GM) models and optimal velocity (OV) models. The safe-distance car-following model, Gipps' model, which is more widely used in practice, has not drawn much attention in the bi-directional looking context. This paper explores the properties of the bi-directional looking extension of Gipps' safe-distance model. The stability condition of the proposed model is derived using linear stability theory and is verified using numerical simulations. The impacts of the driver and vehicle characteristics appearing in the proposed model on traffic flow stability are also investigated. It is found that taking the backward-looking effect into account in car-following has three types of effect on traffic flow: stabilizing, destabilizing, and producing non-physical phenomena. This conclusion is more sophisticated than the results of studies based on the OV bi-directional looking car-following models. Moreover, drivers who have smaller reaction times or larger additional delays, and who assume that other vehicles have larger maximum decelerations, can stabilize traffic flow.
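
    As a structural illustration only (an OV-style form, not the Gipps-type safe-distance extension analyzed in the paper), a bi-directional looking acceleration can be written as a weighted sum of a forward-looking and a backward-looking term:

```python
import numpy as np

def v_opt(gap, v_max=30.0, d0=25.0):
    # illustrative optimal-velocity function: gap [m] -> desired speed [m/s]
    return v_max * (np.tanh((gap - d0) / 10.0) + np.tanh(d0 / 10.0)) / 2.0

def accel(v, gap_front, gap_back, k_f=0.6, k_b=0.2):
    # forward term: relax toward the speed suggested by the leading gap;
    # backward term: a close follower (small gap_back) adds forward pressure
    return k_f * (v_opt(gap_front) - v) + k_b * (v - v_opt(gap_back))

print(accel(v=20.0, gap_front=30.0, gap_back=15.0))
```

    The linear stability analysis then asks for which gains (here k_f and k_b) a small perturbation of the uniform flow grows or decays.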

  1. 40 CFR 60.1125 - What must I include in my siting analysis?

    Science.gov (United States)

    2010-07-01

40 CFR Protection of Environment, § 60.1125: What must I include in my siting analysis? (a) Include an analysis of how your municipal… (…) Vegetation. (b) Include an analysis of alternatives for controlling air pollution that minimize potential…

  2. Yersinia virulence factors - a sophisticated arsenal for combating host defences [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Steve Atkinson

    2016-06-01

    Full Text Available The human pathogens Yersinia pseudotuberculosis and Yersinia enterocolitica cause enterocolitis, while Yersinia pestis is responsible for pneumonic, bubonic, and septicaemic plague. All three share an infection strategy that relies on a virulence factor arsenal to enable them to enter, adhere to, and colonise the host while evading host defences to avoid untimely clearance. Their arsenal includes a number of adhesins that allow the invading pathogens to establish a foothold in the host and to adhere to specific tissues later during infection. When the host innate immune system has been activated, all three pathogens produce a structure analogous to a hypodermic needle. In conjunction with the translocon, which forms a pore in the host membrane, the channel that is formed enables the transfer of six ‘effector’ proteins into the host cell cytoplasm. These proteins mimic host cell proteins but are more efficient than their native counterparts at modifying the host cell cytoskeleton, triggering the host cell suicide response. Such a sophisticated arsenal ensures that yersiniae maintain the upper hand despite the best efforts of the host to counteract the infecting pathogen.

  3. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    Science.gov (United States)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.
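
    For orientation, XFEM represents the crack by enriching the standard finite element approximation; a generic form (not this paper's specific cohesive formulation) is

```latex
u^h(\mathbf{x}) = \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i
  + \sum_{j \in J} N_j(\mathbf{x})\,H(\mathbf{x})\,\mathbf{a}_j
  + \sum_{k \in K} N_k(\mathbf{x}) \sum_{\alpha=1}^{4} F_\alpha(\mathbf{x})\,\mathbf{b}_{k\alpha}
```

    where H is the Heaviside jump function across the crack faces and the F_α are crack-tip branch functions; the rigid-plastic interface model described above enters through the traction-separation law imposed on the crack faces.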

  4. Electrochemical Sensors for Clinic Analysis

    Directory of Open Access Journals (Sweden)

    Guang Li

    2008-03-01

    Full Text Available Demanded by modern medical diagnosis, advances in microfabrication technology have led to the development of fast, sensitive and selective electrochemical sensors for clinic analysis. This review addresses the principles behind electrochemical sensor design and fabrication, and introduces recent progress in the application of electrochemical sensors to analysis of clinical chemicals such as blood gases, electrolytes, metabolites, DNA and antibodies, including basic and applied research. Miniaturized commercial electrochemical biosensors will form the basis of inexpensive and easy to use devices for acquiring chemical information to bring sophisticated analytical capabilities to the non-specialist and general public alike in the future.

  5. Instructions included? Make safety training part of medical device procurement process.

    Science.gov (United States)

    Keller, James P

    2010-04-01

    Before hospitals embrace new technologies, it's important that medical personnel agree on how best to use them. Likewise, hospitals must provide the support to operate these sophisticated devices safely. With this in mind, it's wise for hospitals to include medical device training in the procurement process. Moreover, purchasing professionals can play a key role in helping to increase the amount of user training for medical devices and systems. What steps should you take to help ensure that new medical devices are implemented safely? Here are some tips.

  6. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    Science.gov (United States)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to systematically manipulate alloys, chemically and metallurgically, at the micro scale to produce adherent precious metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as much as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  7. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suitable for solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe its straight evolution, driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.)

  8. Advanced calculus a transition to analysis

    CERN Document Server

    Dence, Thomas P

    2010-01-01

    Designed for a one-semester advanced calculus course, Advanced Calculus explores the theory of calculus and highlights the connections between calculus and real analysis -- providing a mathematically sophisticated introduction to functional analytical concepts. The text is interesting to read and includes many illustrative worked-out examples and instructive exercises, and precise historical notes to aid in further exploration of calculus. Ancillary list: * Companion website, Ebook- http://www.elsevierdirect.com/product.jsp?isbn=9780123749550 * Student Solutions Manual- To come * Instructor

  9. The Analysis of Sophisticated Direction of Arrival Estimation Methods in Passive Coherent Locators

    National Research Council Canada - National Science Library

    Ozcetin, Ahmet

    2002-01-01

...). The goal is to compare the ACMA to the MUSIC and CBF algorithms for application to PCL. The results and analysis presented here support the use of constant modulus information, where available, as an important addition to DOA estimation...

  10. Sophisticated Online Learning Scheme for Green Resource Allocation in 5G Heterogeneous Cloud Radio Access Networks

    KAUST Repository

    Alqerm, Ismail

    2018-01-23

5G is the upcoming evolution of current cellular networks that aims at satisfying the future demand for data services. Heterogeneous cloud radio access networks (H-CRANs) are envisioned as a new trend of 5G that exploits the advantages of heterogeneous and cloud radio access networks to enhance spectral and energy efficiency. Remote radio heads (RRHs) are small cells utilized to provide high data rates for users with high quality of service (QoS) requirements, while a high power macro base station (BS) is deployed for coverage maintenance and service of low QoS users. Inter-tier interference between macro BSs and RRHs and energy efficiency are critical challenges that accompany resource allocation in H-CRANs. Therefore, we propose an efficient resource allocation scheme using online learning, which mitigates interference and maximizes energy efficiency while maintaining QoS requirements for all users. The resource allocation includes resource blocks (RBs) and power. The proposed scheme is implemented using two approaches: centralized, where the resource allocation is processed at a controller integrated with the baseband processing unit, and decentralized, where macro BSs cooperate to achieve an optimal resource allocation strategy. To foster the performance of such a sophisticated scheme with model-free learning, we consider users' priority in RB allocation and a compact state representation learning methodology to improve the speed of convergence and account for the curse of dimensionality during the learning process. The proposed scheme, including both approaches, is implemented on a software defined radio testbed. The obtained experimental and simulation results confirm that the proposed resource allocation solution in H-CRANs increases energy efficiency significantly and maintains users' QoS.

  11. A Case Study on E-Banking Security – When Security Becomes Too Sophisticated for the User to Access Their Information

    OpenAIRE

    Aaron M. French

    2012-01-01

While eBanking security continues to increase in sophistication to protect against threats, the usability of eBanking decreases, resulting in poor security behaviors by users. The current research evaluates security risks and measures taken for eBanking solutions. A case study is presented describing how increased complexity decreases vulnerabilities online but increases vulnerabilities from internal threats and eBanking users.

  12. Nurturing Opportunity Identification for Business Sophistication in a Cross-disciplinary Study Environment

    Directory of Open Access Journals (Sweden)

    Karine Oganisjana

    2012-12-01

Full Text Available Opportunity identification is the key element of the entrepreneurial process; therefore the issue of developing this skill in students is a crucial task in contemporary European education, which has recognized entrepreneurship as one of the key competences for lifelong learning. The earlier opportunity identification becomes a habitual way of thinking and behavior across a broad range of contexts, the more likely it is that an entrepreneurial disposition will steadily reside in students. In order to nurture opportunity identification in students so as to make them able to organize sophisticated businesses in the future, certain demands ought to be placed on the teacher as well – the person who is to promote these qualities in their students. The paper reflects some findings of research conducted within the framework of a workplace learning project for the teachers of one of Riga's secondary schools (Latvia). The main goal of the project was to teach the teachers to identify hidden inner links between apparently unrelated things, phenomena and events within the 10th grade study curriculum, to connect them together, and to create new opportunities. The creation and solution of cross-disciplinary tasks were the means for achieving this goal.

  13. Sophisticated Epistemologies of Physics versus High-Stakes Tests: How Do Elite High School Students Respond to Competing Influences about How to Learn Physics?

    Science.gov (United States)

    Yerdelen-Damar, Sevda; Elby, Andrew

    2016-01-01

    This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to…

  14. Comparative Analysis of Investment Decision Models

    Directory of Open Access Journals (Sweden)

    Ieva Kekytė

    2017-06-01

Full Text Available The rapid development of financial markets has resulted in new challenges for both investors and investment practice. This has increased demand for innovative, modern investment and portfolio management decisions adequate to market conditions. Financial markets receive special attention, with new models being created that include financial risk management and investment decision support systems. Researchers recognize the need to deal with financial problems using models consistent with reality and based on sophisticated quantitative analysis techniques. Thus, the role of mathematical modeling in finance becomes important. This article deals with various investment decision-making models, which include forecasting, optimization, stochastic processes, artificial intelligence, etc., and which have become useful tools for investment decisions.
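
    As one concrete example of the optimization class mentioned above, a minimal unconstrained mean-variance (Markowitz) allocation, with illustrative numbers:

```python
import numpy as np

mu = np.array([0.08, 0.05, 0.11])        # expected returns (illustrative)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.02, 0.00],
                [0.00, 0.00, 0.09]])     # return covariance (illustrative)
gamma = 3.0                              # investor risk aversion

w = np.linalg.solve(cov, mu) / gamma     # w = (1/gamma) * inv(cov) @ mu
w = w / w.sum()                          # normalize to a fully invested portfolio
print(w.round(3))
```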

  15. Real analysis foundations and functions of one variable

    CERN Document Server

    Laczkovich, Miklós

    2015-01-01

    Based on courses given at Eötvös Loránd University (Hungary) over the past 30 years, this introductory textbook develops the central concepts of the analysis of functions of one variable - systematically, with many examples and illustrations, and in a manner that builds upon, and sharpens, the students' mathematical intuition. The modular organization of the book makes it adaptable for either semester or year-long introductory courses, while the wealth of material allows for it to be used at various levels of student sophistication in all programs where analysis is a part of the curriculum, including teachers' education. In the spirit of learning-by-doing, Real Analysis includes more than 500 engaging exercises for the student keen on mastering the basics of analysis. There are frequent hints and occasional complete solutions provided for the more challenging exercises making it an ideal choice for independent study. The book includes a solid grounding in the basics of logic and proofs, sets, and real numb...

  16. Atmosphere-soil-vegetation model including CO2 exchange processes: SOLVEG2

    International Nuclear Information System (INIS)

    Nagai, Haruyasu

    2004-11-01

A new atmosphere-soil-vegetation model named SOLVEG2 (SOLVEG version 2) was developed to study the heat, water, and CO2 exchanges between the atmosphere and land surface. The model consists of one-dimensional multilayer sub-models for the atmosphere, soil, and vegetation. It also includes sophisticated processes for solar and long-wave radiation transmission in the vegetation canopy and CO2 exchanges among the atmosphere, soil, and vegetation. Although the model usually simulates only the vertical variation of variables in the surface-layer atmosphere, soil, and vegetation canopy, using meteorological data as top boundary conditions, it can also be coupled with a three-dimensional atmosphere model. In this paper, details of SOLVEG2, which includes the function of coupling with the atmosphere model MM5, are described. (author)

  17. Reactive polymer coatings: A robust platform towards sophisticated surface engineering for biotechnology

    Science.gov (United States)

    Chen, Hsien-Yeh

Functionalized poly(p-xylylenes), or so-called reactive polymers, can be synthesized via chemical vapor deposition (CVD) polymerization. The resulting ultra-thin coatings are pinhole-free and can be conformally deposited on a wide range of substrates and materials. More importantly, the functional groups they carry can serve as anchoring sites for tailoring surface properties, making these reactive coatings a robust platform for the sophisticated challenges faced at biointerfaces. In the work presented herein, surface coatings presenting various functional groups were prepared by the CVD process. Such surfaces include an aldehyde-functionalized coating to precisely immobilize saccharide molecules onto well-defined areas and an alkyne-functionalized coating to click azide-modified molecules via the Huisgen 1,3-dipolar cycloaddition reaction. Moreover, CVD copolymerization was conducted to prepare multifunctional coatings, and their specific functions were demonstrated by the immobilization of biotin and NHS-ester molecules. By using a photodefinable coating, polyethylene oxides were immobilized onto a wide range of substrates through photo-immobilization. Spatially controlled protein-resistant properties were characterized by selective adsorption of fibrinogen and bovine serum albumin as model systems. Alternatively, surface-initiator coatings were used for polymer grafting of poly(ethylene glycol) methyl ether methacrylate, and the resultant protein- and cell-resistant properties were characterized by adsorption of kinesin motor proteins, fibrinogen, and murine fibroblasts (NIH3T3). The accessibility of reactive coatings within confined microgeometries was systematically studied, and the preparation of homogeneous polymer thin films on the inner surface of microchannels was demonstrated. Moreover, these advanced coatings were applied to develop a dry adhesion process for microfluidic devices. This process provides (i) excellent bonding strength, (ii) extended

  18. A consistent response spectrum analysis including the resonance range

    International Nuclear Information System (INIS)

    Schmitz, D.; Simmchen, A.

    1983-01-01

The report provides a complete consistent Response Spectrum Analysis for any component. The effect of supports with different excitations is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply both smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA) which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because this method is today preferentially applied as an important tool for the earthquake proof of components in nuclear power plants. (orig./HP)
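
    For context, a conventional (non-consistent) response spectrum analysis combines modal responses with rules such as SRSS or CQC,

```latex
R_{\mathrm{SRSS}} = \sqrt{\sum_{n} R_n^{2}}, \qquad
R_{\mathrm{CQC}} = \sqrt{\sum_{m}\sum_{n} \rho_{mn}\, R_m R_n},
```

    where R_n is the peak response in mode n and ρ_mn is a modal correlation coefficient; the consistent treatment described in the report modifies this picture, in particular in the resonance range and for components with differently excited supports.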

  19. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

Full Text Available Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients or linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which 2 genes’ expression levels are clustered in different space locations under the control of a third gene’s expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data based on an optimization procedure involving the usage of UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html.
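
    The flavor of such a 3-way interaction can be conveyed with a much simpler score than xSyn's UPGMA-plus-synergy optimization: compare the X-Y relationship between low and high expression of the controlling gene Z (illustrative data, not the xSyn algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
z = rng.normal(size=n)                                  # controlling gene
x = rng.normal(size=n)
y = np.where(z > 0, x, -x) + 0.3 * rng.normal(size=n)   # Z switches the X-Y link

lo, hi = z <= np.median(z), z > np.median(z)
r_lo = np.corrcoef(x[lo], y[lo])[0, 1]
r_hi = np.corrcoef(x[hi], y[hi])[0, 1]
print(f"interaction score = {abs(r_hi - r_lo):.2f}")    # near 2 for a full switch
```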

  20. When not to copy: female fruit flies use sophisticated public information to avoid mated males

    Science.gov (United States)

    Loyau, Adeline; Blanchet, Simon; van Laere, Pauline; Clobert, Jean; Danchin, Etienne

    2012-10-01

Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs on female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible, given that semen limitation is temporary. We first demonstrate that the number of offspring produced by male Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they have just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given time to replenish their sperm reserves, females no longer avoided the males they had previously seen copulating. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

  1. Dynamic malware analysis using IntroVirt: a modified hypervisor-based system

    Science.gov (United States)

    White, Joshua S.; Pape, Stephen R.; Meily, Adam T.; Gloo, Richard M.

    2013-05-01

In this paper, we present a system for dynamic malware analysis which incorporates the use of IntroVirt™. IntroVirt is an introspective hypervisor architecture and infrastructure that supports advanced analysis techniques for stealth-malware analysis. This system allows for complete guest monitoring and interaction, including the manipulation and blocking of system calls. IntroVirt is capable of bypassing the virtual machine detection capabilities of even the most sophisticated malware by spoofing the returns of system call responses. Additional fuzzing capabilities can be employed to detect both malware vulnerabilities and polymorphism.

  2. Theoretical study of electronic transitions using simple and sophisticated methods

    Directory of Open Access Journals (Sweden)

    Nelson H. Morgon

    2013-01-01

Full Text Available In this paper, the use of both simple and sophisticated models in the study of electronic transitions was explored for a set of molecular systems: C2H4, C4H4, C4H6, C6H6, C6H8, "C8", C60, and [H2NCHCH(CHCH)kCHNH2]+, where k = 0 to 4. The simple model of the free particle (1D, 2D, and 3D boxes, rings or spherical surfaces), considering the boundary conditions, was found to yield results similar to those of sophisticated theoretical methods such as EOM-CCSD/6-311++G** or TD(NStates=5,Root=1)-M06-2X/6-311++G**.
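
    The simple free-particle model is easy to reproduce. A sketch of the 1-D particle-in-a-box estimate of the lowest transition of a linear polyene (the box length is an illustrative assumption):

```python
H = 6.626e-34   # Planck constant [J s]
M = 9.109e-31   # electron mass [kg]
C = 2.998e8     # speed of light [m/s]

def transition_wavelength_nm(n_pi_electrons, box_length_m):
    # HOMO -> LUMO gap in a 1-D box: dE = (2*n_homo + 1) * h^2 / (8 m L^2)
    n_homo = n_pi_electrons // 2
    d_e = (2 * n_homo + 1) * H**2 / (8 * M * box_length_m**2)
    return H * C / d_e * 1e9

# hexatriene-like chain: 6 pi electrons in a ~0.7 nm box (illustrative)
print(f"{transition_wavelength_nm(6, 7.0e-10):.0f} nm")
```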

  3. Education Program on Fossil Resources Including Coal

    Science.gov (United States)

    Usami, Masahiro

    Fossil fuels, including coal, play a key role as crucial energy sources contributing to economic development in Asia. On the other hand, their limited quantity and the environmental problems caused by their usage have become serious global issues, and countermeasures to solve such problems are very much in demand. Along with the pursuit of sustainable development, the environmentally friendly use of highly efficient fossil resources should therefore be pursued. Kyushu University's sophisticated research, built on long years of accumulated experience in the fossil resources and environmental sectors, together with its advanced large-scale commercial and empirical equipment, enables us to foster cooperative research and provide an internship program for future researchers. This program was executed as a consignment business from the Ministry of Economy, Trade and Industry from fiscal year 2007 to fiscal year 2009. A course of lectures using the textbooks developed by this program is scheduled to start in fiscal year 2010.

  4. A review of the reliability analysis of LPRS including the components repairs

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Fleming, P.V.; Frutuoso e Melo, P.F.F.; Tayt-Sohn, L.C.

    1983-01-01

    The reliability analysis of the low pressure recirculation system in its long-term recirculation phase before 24 h is presented. The possibility of repairing the components outside the containment is included. A general revision of the analysis of the short-term recirculation phase is also given. (author) [pt

  5. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Science.gov (United States)

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  6. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    2011-03-01

    Full Text Available We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  7. CITATION ANALYSIS OF URBAN PLANNING SCHOLARS IN THE U.S.

    Directory of Open Access Journals (Sweden)

    Sanchez Thomas W

    2015-03-01

    Full Text Available This article provides a complete citation analysis for the field of urban planning in the U.S. Urban planning is multi-disciplinary, with a rich tradition of debate about the knowledge domain of both research and practice. Urban planning includes consideration of social, economic, technological, environmental, and political systems that are highly sophisticated, and the field therefore has an extensive body of scholarship. The article argues that Google Scholar is an appropriate source of citation data for urban planning and includes a brief example of one urban planning scholar to demonstrate GS citation patterns. This is followed by the results of a descriptive analysis showing general patterns of citation activity for urban planning schools. A greater depth of analysis is required to better understand the dynamics of these scholarly activities.

  8. Seismic analysis and design of NPP structures

    International Nuclear Information System (INIS)

    de Carvalho Santos, S.H.; da Silva, R.E.

    1989-01-01

    Numerical methods for the static and dynamic analysis of structures, as well as for the design of individual structural elements under the applied loads, are under continuous development, and very sophisticated methods are nowadays available for engineering practice. Nevertheless, this sophistication will be useless if some important aspects necessary to assure full compatibility between analysis and design are disregarded. Some of these aspects are discussed herein. This paper presents an integrated approach for the seismic analysis and design of NPP structures: the development of models for the seismic analysis, the distribution of the global seismic forces among the seismic-resistant elements, and the criteria for the design of the individual elements for combined static and dynamic forces are the main topics discussed. The proposed methodology is illustrated; some examples taken from project practice are presented to illustrate the concepts discussed

  9. Lectures on harmonic analysis

    CERN Document Server

    Wolff, Thomas H; Shubin, Carol

    2003-01-01

    This book demonstrates how harmonic analysis can provide penetrating insights into deep aspects of modern analysis. It is both an introduction to the subject as a whole and an overview of those branches of harmonic analysis that are relevant to the Kakeya conjecture. The usual background material is covered in the first few chapters: the Fourier transform, convolution, the inversion theorem, the uncertainty principle and the method of stationary phase. However, the choice of topics is highly selective, with emphasis on those frequently used in research inspired by the problems discussed in the later chapters. These include questions related to the restriction conjecture and the Kakeya conjecture, distance sets, and Fourier transforms of singular measures. These problems are diverse, but often interconnected; they all combine sophisticated Fourier analysis with intriguing links to other areas of mathematics and they continue to stimulate first-rate work. The book focuses on laying out a solid foundation for fu...

  10. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    Science.gov (United States)

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to use them effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  11. How novice, skilled and advanced clinical researchers include variables in a case report form for clinical research: a qualitative study.

    Science.gov (United States)

    Chu, Hongling; Zeng, Lin; Fetters, Micheal D; Li, Nan; Tao, Liyuan; Shi, Yanyan; Zhang, Hua; Wang, Xiaoxiao; Li, Fengwei; Zhao, Yiming

    2017-09-18

    Despite varying degrees of research training, most academic clinicians are expected to conduct clinical research. The objective of this research was to understand how clinical researchers of different skill levels include variables in a case report form for their clinical research. The setting for this research was a major academic institution in Beijing, China. The target population was clinical researchers with three levels of experience, namely, clinicians with limited clinical research experience, clinicians with rich clinical research experience and clinical research experts. Using a qualitative approach, we conducted 13 individual interviews (face to face) and one group interview (n=4) with clinical researchers from June to September 2016. Maximum variation sampling was used to identify researchers with three levels of research experience: eight clinicians with limited clinical research experience, five clinicians with rich clinical research experience and four clinical research experts. These 17 researchers had diverse hospital-based medical specialties and/or specialisation in clinical research. Our analysis yields a typology of three processes for developing a case report form that vary according to research experience level. Novice clinician researchers often have an incomplete protocol or none at all, and conduct data collection and publication based on a general framework. Experienced clinician researchers include variables in the case report form based on previous experience, with attention to including domains or items at risk for omission and to eliminating unnecessary variables. Expert researchers comprehensively consider data collection and implementation needs in advance and plan accordingly. These results illustrate increasing levels of sophistication in research planning that increase the sophistication of variable selection in the case report form. These findings suggest that novice and intermediate-level researchers could benefit by emulating the comprehensive

  12. The Impact of Services on Economic Complexity: Service Sophistication as Route for Economic Growth.

    Science.gov (United States)

    Stojkoski, Viktor; Utkovski, Zoran; Kocarev, Ljupco

    2016-01-01

    Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country's productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. By adopting a terminology which classifies manufactured goods and delivered services as products, we investigate the influence of services on the country's productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with developed service sector higher than countries with economy centred on manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that diversification of service exports and its sophistication can provide an additional route for economic growth in both developing and developed countries.

  13. The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.

    Science.gov (United States)

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L

    2017-06-01

    To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey January 2014 to July 2015 to NH in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) among 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimensions and domains. Least square means and Tukey's method were used for multiple comparisons. Methods yielded 815/1,799 surveys (45% response rate). In every health care domain (resident care, clinical support, and administrative activities) statistical differences in facility ITS occurred in larger (metropolitan or micropolitan) and smaller (small town or rural) populated areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, results are encouraging as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.

  14. SAMPO 90 high resolution interactive gamma-spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1992-01-01

    SAMPO 90 is a high performance gamma-spectrum analysis program for personal computers. It uses color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control, or macros and programmable function keys can be used for completely automated measurement and analysis sequences including the control of MCAs and sample changers. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear and mixed mode fitting. Nuclide identification is done using associated lines techniques allowing interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. The analysis reports and program parameters are fully customizable. (author) 13 refs.; 1 fig

  15. Dose profile analysis of small fields in intensity modulated radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Medel B, E. [IMSS, Centro Medico Nacional Manuel Avila Camacho, Calle 2 Nte. 2004, Barrio de San Francisco, 72090 Puebla, Pue. (Mexico); Tejeda M, G.; Romero S, K., E-mail: romsakaren@gmail.com [Benemerita Universidad Autonoma de Puebla, Facultad de Ciencias Fisico Matematicas, Av. San Claudio y 18 Sur, Ciudad Universitaria, 72570 Puebla, Pue.(Mexico)

    2015-10-15

    Full text: Small field dosimetry is becoming a very important task worldwide. The use of fields of a few centimeters is increasingly common with the introduction of sophisticated radiation therapy techniques such as Intensity Modulated Radiotherapy (IMRT). In our country the implementation of such techniques is just getting started, and with it the need for baseline data acquisition. Dosimetry under small field conditions represents a challenge for the physics community. In this work, a dose profile analysis was done using various types of dosimeters for comparison. This analysis includes the study of quality parameters such as flatness, symmetry, penumbra, and other in-axis measurements. (Author)

  16. Two-dimensional analysis of motion artifacts, including flow effects

    International Nuclear Information System (INIS)

    Litt, A.M.; Brody, A.S.; Spangler, R.A.; Scott, P.D.

    1990-01-01

    The effects of motion on magnetic resonance images have been theoretically analyzed for the case of a point-like object in simple harmonic motion and for other one-dimensional trajectories. The authors of this paper extend this analysis to a generalized two-dimensional magnetization with an arbitrary motion trajectory. The authors provide specific solutions for the clinically relevant cases of the cross-sections of cylindrical objects in the body, such as the aorta, which has a roughly one-dimensional, simple harmonic motion during respiration. By extending the solution to include inhomogeneous magnetizations, the authors present a model which allows the effects of motion artifacts and flow artifacts to be analyzed simultaneously

  17. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    Science.gov (United States)

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
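    EEGLAB itself is a MATLAB toolbox, but the core ICA step it popularized can be illustrated in Python; the sketch below unmixes synthetic "channels" with scikit-learn's FastICA. This is a conceptual parallel only, not EEGLAB's runica, and the signals are invented.

        # Unmix three synthetic sources from their linear channel mixtures.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(1)
        t = np.linspace(0, 8, 2000)
        sources = np.c_[np.sin(7 * t),              # oscillatory source
                        np.sign(np.sin(3 * t)),     # square-wave "artifact"
                        rng.laplace(size=t.size)]   # noise source
        channels = sources @ rng.normal(size=(3, 3)).T   # observed mixtures

        ica = FastICA(n_components=3, random_state=0)
        activations = ica.fit_transform(channels)   # estimated source activations
        print(activations.shape, ica.mixing_.shape)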

  18. SAMPO 90 - High resolution interactive gamma spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1991-01-01

    SAMPO 90 is a high performance gamma spectrum analysis program for personal computers. It uses high resolution color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control or by using macros for automated measurement and analysis sequences including the control of MCAs and sample changers. Semi-automated calibrations for peak shapes (Gaussian with exponential tails), detector efficiency, and energy are available with a possibility for user intervention through interactive graphics. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear, non-linear and mixed mode fitting, where the component energies and areas can be either frozen or allowed to float in arbitrary combinations. Nuclide identification is done using associated lines techniques which allow interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. Attenuation corrections can be taken into account in detector efficiency calculation. The most common PC-based MCA spectrum formats (Canberra S100, Ortec ACE, Nucleus PCA, ND AccuSpec) are supported as well as ASCII spectrum files. A gamma-line library is included together with an editor for user configurable libraries. The analysis reports and program parameters are fully customizable. Function key macros can be used to automate the most common analysis procedures. Small batch type modules are additionally available for routine work. SAMPO 90 is a result of over twenty man years of programming and contains 25,000 lines of Fortran, 10,000 lines of C, and 12,000 lines of assembler
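    As a much-simplified illustration of the fitting step described here, the sketch below fits a single Gaussian peak on a linear background by nonlinear least squares; SAMPO 90's actual shape model (Gaussians with exponential tails, multiplets of up to 32 components) is considerably richer, and all values below are synthetic.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak(x, area, centroid, sigma, b0, b1):
            g = area / (sigma * np.sqrt(2 * np.pi)) \
                * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2))
            return g + b0 + b1 * x                  # Gaussian + linear background

        x = np.arange(100.0)
        y = np.random.default_rng(2).poisson(peak(x, 500, 50, 3, 10, 0.05)).astype(float)

        popt, pcov = curve_fit(peak, x, y, p0=(400, 48, 2.5, 8, 0))
        print("fitted area: %.1f +/- %.1f" % (popt[0], np.sqrt(pcov[0, 0])))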

  19. Multi trace element analysis of dry biological materials by neutron activation analysis including a chemical group separation

    International Nuclear Information System (INIS)

    Weers, C.A.

    1980-07-01

    Multi-element analysis of dry biological material by neutron activation analysis has to include radiochemical separation. The evaporation process is described in terms of the half-volume. The pretreatment of the samples and the development of the destruction-evaporation apparatus are described. The successive adsorption steps with active charcoal and Al2O3 and the coprecipitation with Fe(OH)3 are described. Results obtained for standard reference materials are summarized. (G.T.H.)

  20. Combining Conversation Analysis and Nexus Analysis to explore hospital practices

    DEFF Research Database (Denmark)

    Paasch, Bettina Sletten

    , ethnographic observations, interviews, photos and documents were obtained. Inspired by the analytical manoeuvre of zooming in and zooming out proposed by Nicolini (Nicolini, 2009; Nicolini, 2013), the present study uses Conversation Analysis (Sacks, Schegloff, & Jefferson, 1974) and Embodied Interaction...... of interaction. In the conducted interviews, nurses report that mobile work phones disturb interactions with patients when they ring; however, analysing the recorded interactions with tools from Conversation Analysis and Embodied Interaction Analysis displays how nurses demonstrate sophisticated awareness...... interrelationships influencing it. The present study thus showcases how Conversation Analysis and Nexus Analysis can be combined to achieve a multi-layered perspective on interactions between nurses, patients and mobile work phones....

  1. Aerodynamic analysis of the Darrieus rotor including secondary effects

    Science.gov (United States)

    Paraschivoiu, I.; Delclaux, F.; Fraunie, P.; Beguier, C.

    1983-10-01

    An aerodynamic analysis is made of two variants of the two-actuator-disk theory for modeling the Darrieus wind turbine. The double-multiple-streamtube model with constant and variable interference factors, including secondary effects, is examined for a Darrieus rotor. The influence of the secondary effects, namely, the blade geometry and profile type, the rotating tower, and the presence of struts and aerodynamic spoilers, is relatively significant, especially at high tip-speed ratios. Variation of the induced velocity as a function of the azimuthal angle allows a more accurate calculation of the aerodynamic loads on the downwind zone of the rotor with respect to the assumed constant interference factors. The theoretical results were compared with available experimental data for the Magdalen Islands wind turbine and Sandia-type machines (straight-line/circular-arc shape).

  2. A sophisticated cad tool for the creation of complex models for electromagnetic interaction analysis

    Science.gov (United States)

    Dion, Marc; Kashyap, Satish; Louie, Aloisius

    1991-06-01

    This report describes the essential features of the MS-DOS version of DIDEC-DREO, an interactive program for creating wire grid, surface patch, and cell models of complex structures for electromagnetic interaction analysis. It uses the device-independent graphics library DIGRAF and the graphics kernel system HALO, and can be executed on systems with various graphics devices. Complicated structures can be created by direct alphanumeric keyboard entry, digitization of blueprints, conversion from existing geometric structure files, and merging of simple geometric shapes. A completed DIDEC geometric file may then be converted to the format required for input to a variety of time domain and frequency domain electromagnetic interaction codes. This report gives a detailed description of the program DIDEC-DREO, its installation, and its theoretical background. Each available interactive command is described. The associated program HEDRON which generates simple geometric shapes, and other programs that extract the current amplitude data from electromagnetic interaction code outputs, are also discussed.

  3. Dynamic Analysis of Wind Turbines Including Soil-Structure Interaction

    DEFF Research Database (Denmark)

    Harte, M.; Basu, B.; Nielsen, Søren R.K.

    2012-01-01

    This paper investigates the along-wind forced vibration response of an onshore wind turbine. The study includes the dynamic interaction effects between the foundation and the underlying soil, as softer soils can influence the dynamic response of wind turbines. A Multi-Degree-of-Freedom (MDOF......) horizontal axis onshore wind turbine model is developed for dynamic analysis using an Euler–Lagrangian approach. The model comprises a rotor blade system, a nacelle and a flexible tower connected to a foundation system using a substructuring approach. The rotor blade system consists of three rotating...... for displacement of the turbine system are obtained and the modal frequencies of the combined turbine-foundation system are estimated. Simulations are presented for the MDOF turbine structure subjected to wind loading for different soil stiffness conditions. Steady state and turbulent wind loading, developed using...

  4. Core analysis: new features and applications

    International Nuclear Information System (INIS)

    Edenius, M.; Kurcyusz, E.; Molina, D.; Wiksell, G.

    1995-01-01

    Today, core analysis may be performed with sophisticated software capable of both steady state and transient analysis using a common methodology for BWRs and PWRs. General trends in core analysis software development are: improved accuracy; automated engineering functions; three-dimensional transient capability; and graphical user interfaces. As a demonstration of such software, new features of Studsvik-CMS (Core Management System) and examples of applications are discussed in this article. 2 figs., 8 refs

  5. Data Analysis Strategies in Medical Imaging.

    Science.gov (United States)

    Parmar, Chintan; Barry, Joseph D; Hosny, Ahmed; Quackenbush, John; Aerts, Hugo Jwl

    2018-03-26

    Radiographic imaging continues to be one of the most effective and clinically useful tools within oncology. Sophistication of artificial intelligence (AI) has allowed for detailed quantification of radiographic characteristics of tissues using predefined engineered algorithms or deep learning methods. Precedents in radiology as well as a wealth of research studies hint at the clinical relevance of these characteristics. However, there are critical challenges associated with the analysis of medical imaging data. While some of these challenges are specific to the imaging field, many others like reproducibility and batch effects are generic and have already been addressed in other quantitative fields such as genomics. Here, we identify these pitfalls and provide recommendations for analysis strategies of medical imaging data including data normalization, development of robust models, and rigorous statistical analyses. Adhering to these recommendations will not only improve analysis quality, but will also enhance precision medicine by allowing better integration of imaging data with other biomedical data sources. Copyright ©2018, American Association for Cancer Research.
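    One of the recommendations above, data normalization, is easy to make concrete; the hedged sketch below z-scores a matrix of quantitative imaging features so that downstream models are not dominated by scale differences. The feature values are synthetic stand-ins.

        import numpy as np
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        # 50 patients x 3 hypothetical radiographic features on very
        # different scales (e.g. intensity, shape ratio, texture energy).
        features = rng.normal(loc=[100.0, 0.5, 2000.0],
                              scale=[15.0, 0.1, 400.0], size=(50, 3))

        z = StandardScaler().fit_transform(features)  # per-feature mean 0, sd 1
        print(z.mean(axis=0).round(2), z.std(axis=0).round(2))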

  6. Global Analysis of Solar Neutrino Oscillations Including SNO CC Measurement

    CERN Document Server

    Bahcall, J N; Peña-Garay, C; Bahcall, John N; Peña-Garay, Carlos

    2001-01-01

    For active and sterile neutrinos, we present the globally allowed solutions for two neutrino oscillations. We include the SNO CC measurement and all other relevant solar neutrino and reactor data. Five active neutrino oscillation solutions (LMA, LOW, SMA, VAC, and Just So2) are currently allowed at 3 sigma; three sterile neutrino solutions (Just So2, SMA, and VAC) are allowed at 3 sigma. The goodness of fit is satisfactory for all eight solutions. We also investigate the robustness of the allowed solutions by carrying out global analyses with and without: 1) imposing solar model constraints on the 8B neutrino flux, 2) including the Super-Kamiokande spectral energy distribution and day-night data, 3) using an enhanced CC cross section for deuterium (due to radiative corrections), and 4) an optimistic, hypothetical reduction by a factor of three in the error of the SNO CC rate. For every analysis strategy used in this paper, the most favored solutions all involve large mixing angles: LMA, LOW, or VAC. The favore...

  7. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    Science.gov (United States)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc
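    The kind of sub-setting and aggregation the tool exposes interactively can be sketched offline with xarray; the snippet below builds a synthetic gridded temperature field, selects a lat/lon window and computes monthly means. The variable name and grid are assumptions for illustration, not TrikeND-iGlobe code.

        import numpy as np
        import pandas as pd
        import xarray as xr

        time = pd.date_range("2000-01-01", periods=365)
        lat = np.arange(-90.0, 91.0, 10.0)
        lon = np.arange(-180.0, 180.0, 10.0)
        tas = xr.DataArray(
            15 + np.random.default_rng(7).standard_normal((365, lat.size, lon.size)),
            coords={"time": time, "lat": lat, "lon": lon}, name="tas")

        region = tas.sel(lat=slice(30, 60), lon=slice(-10, 40))   # spatial subset
        monthly = region.groupby("time.month").mean()             # time aggregation
        print(monthly.mean(dim=("lat", "lon")).values[:3])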

  8. A study on the influence of eWOM using content analysis: how do comments on value for money, product sophistication and experiential feeling affect our choices?

    Science.gov (United States)

    Cho, Vincent; Chan, Alpha

    2017-07-01

    The influence of electronic word of mouth (eWOM) has been heavily investigated in relation to online ratings. However, only a few studies examined the content of eWOM. From the perspective of the consideration sets model, consumers formulate an awareness set, a consideration set and a choice set before making a purchase. We argue that the formulation of these sets is influenced by eWOM based on its volume, valance and content relating to product attributes such as value for money, product sophistication and experiential feeling. In this study, the content of posts relating to Shure professional earphones in the online forum Mingo (www.mingo-hmw.com/forum) was captured and annotated. During the data collection period, Mingo was the sole online forum relating to professional earphones. Without much interference from other online forums, the circumstances of this study closely approximate a laboratory setting. In addition, we collected the actual sales, marketing costs, fault rates and number of retail stores selling the Shure professional earphones for 126 weeks. Our findings show that the weekly volume of posts, their relative number of positive (negative) comments, especially regarding value for money and sound quality, and those posts from the earlier week impinged strongly on weekly sales of Shure products. From the regression models, the explained variance in sales jumps from 0.236 to 0.732 due to the influence of eWOM.

  9. Mesh Processing in Medical-Image Analysis-a Tutorial

    DEFF Research Database (Denmark)

    Levine, Joshua A.; Paulsen, Rasmus Reinhold; Zhang, Yongjie

    2012-01-01

    Medical-image analysis requires an understanding of sophisticated scanning modalities, constructing geometric models, building meshes to represent domains, and downstream biological applications. These four steps form an image-to-mesh pipeline. For research in this field to progress, the imaging...

  10. Experimental design and quantitative analysis of microbial community multiomics.

    Science.gov (United States)

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  11. Gap analysis: rethinking the conceptual foundations

    OpenAIRE

    Langford, Gary O.; Franck, Raymond; Huynh, Tom; Lewis, Ira A.

    2007-01-01

    Acquisition research (Graduate School of Business & Public Policy) Gap Analysis is widely regarded as a useful tool to facilitate commercial and defense system acquisitions. This paper is a rethinking of the theoretical foundations and systematics of Gap Analysis with practical extensions to illustrate its utility and limitations. It also provides a new perspective on those theoretical foundations from the perspectives of systems and value engineering. The growing sophistication and comple...

  12. Invitation to classical analysis

    CERN Document Server

    Duren, Peter

    2012-01-01

    This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ

  13. Thermodynamic analysis of a Stirling engine including regenerator dead volume

    Energy Technology Data Exchange (ETDEWEB)

    Puech, Pascal; Tishkova, Victoria [Universite de Toulouse, UPS, CNRS, CEMES, 29 rue Jeanne Marvig, F-31055 Toulouse (France)

    2011-02-15

    This paper provides a theoretical investigation of the thermodynamic analysis of a Stirling engine with linear and sinusoidal variations of the volume. The regenerator in a Stirling engine is an internal heat exchanger that allows high efficiency to be reached. We used an isothermal model to analyse the net work and the heat stored in the regenerator during a complete cycle. We show that the engine efficiency with perfect regeneration does not depend on the regenerator dead volume, but this dead volume strongly amplifies the effect of imperfect regeneration. An analytical expression to estimate the improvement due to the regenerator has been proposed, including the combined effects of dead volume and imperfect regeneration. This could be used at the very preliminary stage of the engine design process. (author)
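    The isothermal model discussed here lends itself to a short numerical sketch: sinusoidal volume variations, an ideal gas shared among hot space, cold space and a regenerator dead volume at an effective temperature, and net work obtained by integrating p dV around the cycle. All parameter values below are illustrative assumptions, not taken from the paper.

        import numpy as np

        Th, Tc = 600.0, 300.0                    # hot/cold space temperatures (K)
        Tr = (Th - Tc) / np.log(Th / Tc)         # effective regenerator temperature
        Vs, Vdead = 1e-4, 5e-5                   # swept and dead volumes (m^3)
        nR = 8.314 * 0.01                        # gas amount x R (0.01 mol)

        theta = np.linspace(0.0, 2.0 * np.pi, 5001)
        Vh = 0.5 * Vs * (1 + np.cos(theta))               # hot space leads
        Vc = 0.5 * Vs * (1 + np.cos(theta - np.pi / 2))   # cold space lags 90 deg
        p = nR / (Vh / Th + Vc / Tc + Vdead / Tr)         # isothermal pressure

        V = Vh + Vc + Vdead
        W = np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(V))   # closed-loop p dV
        print("net work per cycle: %.2f J" % W)

    Increasing Vdead in this sketch lowers the pressure swing and hence the net work per cycle, which is the dead-volume effect the paper quantifies.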

  14. A structural design and analysis of a piping system including seismic load

    International Nuclear Information System (INIS)

    Hsieh, B.J.; Kot, C.A.

    1991-01-01

    The structural design/analysis of a piping system at a nuclear fuel facility is used to investigate some aspects of current design procedures. Specifically, the effect of using various stress measures including ASME Boiler and Pressure Vessel (B&PV) Code formulas is evaluated. It is found that large differences in local maximum stress values may be calculated depending on the stress criterion used. However, when the global stress maxima for the entire system are compared, the differences are much smaller, being nevertheless, for some load combinations, of the order of 50 percent. The effect of using an Equivalent Static Method (ESM) analysis is also evaluated by comparing its results with those obtained from a Response Spectrum Method (RSM) analysis with the modal responses combined by using the absolute summation (ABS), the square root of the sum of the squares (SRSS), and the 10 percent method (10PC). It is shown that a spectrum amplification factor (equivalent static coefficient greater than unity) of at least 1.32 must be used in the current application of the ESM analysis in order to obtain results which are conservative in all respects relative to an RSM analysis based on ABS. However, it appears that an adequate design would be obtained from the ESM approach even without the use of a spectrum amplification factor. 7 refs., 3 figs., 3 tabs
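    The modal-combination rules compared in the paper are simple to state; the sketch below applies ABS, SRSS and a 10 percent rule (cross terms added for modes whose frequencies lie within 10% of each other) to assumed modal responses. The numbers are illustrative only.

        import numpy as np

        R = np.array([12.0, 8.0, 3.5])    # peak modal responses (assumed)
        f = np.array([4.0, 4.3, 9.0])     # modal frequencies in Hz (assumed)

        abs_sum = np.sum(np.abs(R))                       # ABS combination
        srss = np.sqrt(np.sum(R ** 2))                    # SRSS combination

        cross = 0.0                                       # 10 percent method
        for i in range(len(R)):
            for j in range(i + 1, len(R)):
                if abs(f[i] - f[j]) <= 0.1 * min(f[i], f[j]):
                    cross += 2.0 * abs(R[i] * R[j])       # closely spaced modes
        ten_pct = np.sqrt(np.sum(R ** 2) + cross)

        print(abs_sum, ten_pct, srss)     # ABS >= 10PC >= SRSS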

  15. Sophisticated visualization algorithms for analysis of multidimensional experimental nuclear spectra

    International Nuclear Information System (INIS)

    Morhac, M.; Kliman, J.; Matousek, V.; Turzo, I.

    2004-01-01

    This paper describes graphical models for visualization of 2-, 3- and 4-dimensional scalar data used in the nuclear data acquisition, processing and visualization system developed at the Institute of Physics, Slovak Academy of Sciences. It focuses on the presentation of nuclear spectra (histograms); however, it can be successfully applied to the visualization of arrays of other data types. In the paper we present conventional as well as newly developed surface and volume rendering visualization techniques used. (Authors)

  16. Rapid, low-cost, image analysis through video processing

    International Nuclear Information System (INIS)

    Levinson, R.A.; Marrs, R.W.; Grantham, D.G.

    1976-01-01

    Remote sensing now provides the data necessary to solve many resource problems. However, many of the complex image processing and analysis functions used in analysis of remotely sensed data are accomplished using sophisticated image analysis equipment. The high cost of this equipment places many of these techniques beyond the means of most users. A new, more economical video system capable of performing complex image analysis has now been developed. This report describes the functions, components, and operation of that system. Processing capability of the new video image analysis system includes many of the tasks previously accomplished with optical projectors and digital computers. Video capabilities include: color separation, color addition/subtraction, contrast stretch, dark level adjustment, density analysis, edge enhancement, scale matching, image mixing (addition and subtraction), image ratioing, and construction of false-color composite images. Rapid input of non-digital image data, instantaneous processing and display, relatively low initial cost, and low operating cost give the video system a competitive advantage over digital equipment. Complex pre-processing, pattern recognition, and statistical analyses must still be handled through digital computer systems. The video system at the University of Wyoming has undergone extensive testing and comparison to other systems, and has been used successfully in practical applications ranging from analysis of x-rays and thin sections to production of color composite ratios of multispectral imagery. Potential applications are discussed, including uranium exploration, petroleum exploration, tectonic studies, geologic mapping, hydrology, sedimentology and petrography, anthropology, and studies on vegetation and wildlife habitat
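    Two of the video operations listed above have direct digital analogues that fit in a few lines; the sketch below applies a linear contrast stretch and a band ratio to synthetic single-band images, purely for illustration.

        import numpy as np

        rng = np.random.default_rng(4)
        band1 = rng.uniform(40, 90, size=(64, 64))    # synthetic image bands
        band2 = rng.uniform(30, 120, size=(64, 64))

        lo, hi = band1.min(), band1.max()             # linear contrast stretch:
        stretched = (band1 - lo) / (hi - lo) * 255.0  # map to 0..255 display range

        ratio = band1 / np.clip(band2, 1e-6, None)    # band ratio enhancement
        print(stretched.min(), stretched.max(), ratio.mean())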

  17. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  18. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step and shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
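    For orientation, the sketch below implements a minimal 1-D global gamma index of the kind these test images exercise; clinical software works on 2-D/3-D dose grids with interpolation and many refinements, so this is illustrative only.

        import numpy as np

        def gamma_1d(ref, evl, x, dd=0.03, dta=3.0):
            """Global gamma for doses ref/evl sampled at positions x (mm)."""
            dd_abs = dd * ref.max()                    # global dose criterion
            g = np.empty_like(ref)
            for i, (xi, di) in enumerate(zip(x, ref)):
                dist2 = ((x - xi) / dta) ** 2
                dose2 = ((evl - di) / dd_abs) ** 2
                g[i] = np.sqrt(np.min(dist2 + dose2))  # minimise over positions
            return g

        x = np.arange(0.0, 100.0, 1.0)
        ref = np.exp(-((x - 50) / 15) ** 2)
        evl = 1.02 * np.exp(-((x - 51) / 15) ** 2)     # 2% scaled, 1 mm shifted
        print("pass rate: %.1f%%" % (100 * np.mean(gamma_1d(ref, evl, x) <= 1)))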

  19. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees.

    Science.gov (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A

    2018-02-01

    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE-model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has been applied rarely within the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses that is not already explained by the risk adjusters in this RE-model. The interaction terms identified were used as additional risk adjusters in the RE-model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
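    The general idea, fitting a tree to what an additive risk-equalization model leaves unexplained so that splits expose candidate interactions, can be sketched as below; the data, variables and model are synthetic stand-ins, not the Dutch RE-model.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.tree import DecisionTreeRegressor, export_text

        rng = np.random.default_rng(5)
        n = 10_000
        age = rng.integers(0, 90, n).astype(float)
        chronic = rng.integers(0, 2, n).astype(float)
        X = np.c_[age, chronic]
        # True expenses contain an age x chronic interaction the additive
        # model cannot represent.
        y = 100 + 5 * age + 800 * chronic + 20 * age * chronic \
            + rng.normal(0, 300, n)

        residuals = y - LinearRegression().fit(X, y).predict(X)
        tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=200)
        tree.fit(X, residuals)                 # splits suggest interaction terms
        print(export_text(tree, feature_names=["age", "chronic"]))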

  20. CATDAT - A program for parametric and nonparametric categorical data analysis user's manual, Version 1.0

    International Nuclear Information System (INIS)

    Peterson, James R.; Haas, Timothy C.; Lee, Danny C.

    2000-01-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical responses data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network

  1. Analysis of high-throughput plant image data with the information system IAP

    Directory of Open Access Journals (Sweden)

    Klukas Christian

    2012-06-01

    Full Text Available This work presents a sophisticated information system, the Integrated Analysis Platform (IAP, an approach supporting large-scale image analysis for different species and imaging systems. In its current form, IAP supports the investigation of Maize, Barley and Arabidopsis plants based on images obtained in different spectra.

  2. HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.

    Science.gov (United States)

    Schroeder, Mark J; Perreault, Bill; Ewert, Daniel L; Koenig, Steven C

    2004-07-01

    A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ASCII or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework toward which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.
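    A conceptual Python counterpart to one HEART step, segmenting a pressure waveform into beats and extracting per-beat parameters, is sketched below on a synthetic waveform; HEART itself is a Matlab package and does considerably more.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 250.0                                 # sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        # Synthetic arterial-pressure-like signal at ~72 beats per minute.
        pressure = 90 + 25 * np.maximum(np.sin(2 * np.pi * 1.2 * t), 0) ** 2

        peaks, _ = find_peaks(pressure, distance=int(0.4 * fs), prominence=10)
        systolic = pressure[peaks]                 # per-beat systolic values
        rr = np.diff(peaks) / fs                   # beat-to-beat intervals (s)
        print("beats: %d, mean HR: %.1f bpm" % (len(peaks), 60 / rr.mean()))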

  3. Reliability analysis using network simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1985-01-01

    The models that can be used to provide estimates of the reliability of nuclear power systems operate at many different levels of sophistication. The least sophisticated models treat failure processes that entail only time-independent phenomena (such as demand failure). More advanced models treat processes that also include time-dependent phenomena such as run failure and possibly repair. However, many of these dynamic models are deficient in some respects, either because they disregard the time-dependent phenomena that cannot be expressed in closed-form analytic terms or because they treat these phenomena in quasi-static terms. The next level of modeling requires a dynamic approach that incorporates not only procedures for treating all significant time-dependent phenomena but also procedures for treating these phenomena when they are conditionally linked or characterized by arbitrarily selected probability distributions. The level of sophistication that is required is provided by a dynamic, Monte Carlo modeling approach. A computer code that uses a dynamic, Monte Carlo modeling approach is Q-GERT (Graphical Evaluation and Review Technique - with Queueing), and the present study has demonstrated the feasibility of using Q-GERT for modeling time-dependent, unconditionally and conditionally linked phenomena that are characterized by arbitrarily selected probability distributions

  4. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  5. Detection of feigned mental disorders on the personality assessment inventory: a discriminant analysis.

    Science.gov (United States)

    Rogers, R; Sewell, K W; Morey, L C; Ustad, K L

    1996-12-01

    Psychological assessment with multiscale inventories is largely dependent on the honesty and forthrightness of those persons evaluated. We investigated the effectiveness of the Personality Assessment Inventory (PAI) in detecting participants feigning three specific disorders: schizophrenia, major depression, and generalized anxiety disorder. With a simulation design, we tested the PAI validity scales on 166 naive (undergraduates with minimal preparation) and 80 sophisticated (doctoral psychology students with 1 week preparation) participants. We compared their results to persons with the designated disorders: schizophrenia (n = 45), major depression (n = 136), and generalized anxiety disorder (n = 40). Although moderately effective with naive simulators, the validity scales evidenced only modest positive predictive power with their sophisticated counterparts. Therefore, we performed a two-stage discriminant analysis that yielded a moderately high hit rate (> 80%) that was maintained in the cross-validation sample, irrespective of the feigned disorder or the sophistication of the simulators.
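    The classification machinery involved can be illustrated with ordinary linear discriminant analysis on simulated validity-scale scores, as below; this is not the authors' two-stage procedure, and all scores and group means are invented.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(6)
        honest = rng.normal([50, 50, 50], 10, size=(150, 3))    # validity scales
        feigners = rng.normal([65, 70, 60], 12, size=(150, 3))  # elevated profile
        X = np.vstack([honest, feigners])
        y = np.r_[np.zeros(150), np.ones(150)]

        lda = LinearDiscriminantAnalysis()
        print("cross-validated hit rate: %.2f"
              % cross_val_score(lda, X, y, cv=5).mean())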

  6. Necessary steps in factor analysis : Enhancing validation studies of educational instruments. The PHEEM applied to clerks as an example

    NARCIS (Netherlands)

    Schonrock-Adema, Johanna; Heijne-Penninga, Marjolein; van Hell, Elisabeth A.; Cohen-Schotanus, Janke

    2009-01-01

    Background: The validation of educational instruments, in particular the employment of factor analysis, can be improved in many instances. Aims: To demonstrate the superiority of a sophisticated method of factor analysis, implying an integration of recommendations described in the factor analysis

  7. Slideline verification for multilayer pressure vessel and piping analysis including tangential motion

    International Nuclear Information System (INIS)

    Van Gulick, L.A.

    1984-01-01

    Nonlinear finite element method (FEM) computer codes with slideline algorithm implementations should be useful for the analysis of prestressed multilayer pressure vessels and piping. This paper presents closed form solutions, including the effects of tangential motion, useful for verifying slideline implementations for this purpose. The solutions describe stresses and displacements of a long internally pressurized elastic-plastic cylinder initially separated from an elastic outer cylinder by a uniform gap. Comparison of closed form and FEM results evaluates the usefulness of the closed form solution and the validity of the slideline implementation used

  8. Enterprise Architecture-Based Risk and Security Modelling and Analysis

    NARCIS (Netherlands)

    Jonkers, Henk; Quartel, Dick; Kordy, Barbara; Ekstedt, Mathias; Seong Kim, Deng

    2016-01-01

    The growing complexity of organizations and the increasing number of sophisticated cyber attacks asks for a systematic and integral approach to Enterprise Risk and Security Management (ERSM). As enterprise architecture offers the necessary integral perspective, including the business and IT aspects

  9. Earthquake analysis of structures including structure-soil interaction by a substructure method

    International Nuclear Information System (INIS)

    Chopra, A.K.; Guttierrez, J.A.

    1977-01-01

    A general substructure method for analysis of the response of nuclear power plant structures to earthquake ground motion, including the effects of structure-soil interaction, is summarized. The method is applicable to complex structures idealized as finite element systems, with the soil region treated either as a continuum, for example as a viscoelastic halfspace, or idealized as a finite element system. The halfspace idealization permits reliable analysis for sites where essentially similar soils extend to large depths and there is no rigid boundary such as a soil-rock interface. For sites where layers of soft soil are underlain by rock at shallow depth, finite element idealization of the soil region is appropriate; in this case, the direct and substructure methods would lead to equivalent results, but the latter provides the better alternative. Treating the free field motion directly as the earthquake input in the substructure method eliminates the deconvolution calculations and the related assumptions, regarding the type and direction of earthquake waves, required in the direct method. (Auth.)

  10. Analysis of general and specific combining abilities of popcorn populations, including selfed parents

    Directory of Open Access Journals (Sweden)

    José Marcelo Soriano Viana

    2003-12-01

    Full Text Available Estimation of general and specific combining ability effects in a diallel analysis of cross-pollinating populations, including the selfed parents, is presented in this work. The restrictions considered satisfy the parametric values of the GCA and SCA effects. The method is extended to self-pollinating populations (suitable for other species), without the selfed parents. The analysis of changes in population means due to inbreeding (sensitivity to inbreeding) also makes it possible to assess the predominant direction of dominance deviations and the relative genetic variability in each parent population. The methodology was used to select popcorn populations for intra- and inter-population breeding programs and for hybrid production, developed at the Federal University of Viçosa, MG, Brazil. Two yellow pearl grain popcorn populations were selected.

  11. DaqProVis, a toolkit for acquisition, interactive analysis, processing and visualization of multidimensional data

    Energy Technology Data Exchange (ETDEWEB)

    Morhac, M. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia)]. E-mail: fyzimiro@savba.sk; Matousek, V. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia); Turzo, I. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia); Kliman, J. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia)

    2006-04-01

    A multidimensional data acquisition, processing and visualization system to analyze experimental data in nuclear physics is described. It includes a large number of sophisticated algorithms for multidimensional spectra processing, including background elimination, deconvolution, peak searching and fitting.
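
    As a generic illustration of one processing step of the kind listed in this record, the sketch below estimates a smooth background with a rolling minimum and then searches for peaks with SciPy; it is not the DaqProVis algorithms themselves, and the spectrum is synthetic.

      # Background estimate plus peak search on a synthetic 1-D spectrum.
      import numpy as np
      from scipy.ndimage import minimum_filter1d, uniform_filter1d
      from scipy.signal import find_peaks

      channels = np.arange(4096)
      spectrum = (1000 * np.exp(-0.001 * channels)                     # smooth background
                  + 500 * np.exp(-0.5 * ((channels - 1200) / 6) ** 2)  # photopeak
                  + np.random.default_rng(1).poisson(20, channels.size))

      # Crude background: rolling minimum followed by smoothing.
      background = uniform_filter1d(minimum_filter1d(spectrum, size=101), size=101)
      net = spectrum - background

      peaks, props = find_peaks(net, height=100, width=3)
      print("peak channels:", peaks)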

  12. Learning Strategic Sophistication

    NARCIS (Netherlands)

    Blume, A.; DeJong, D.V.; Maier, M.

    2005-01-01

    We experimentally investigate coordination games in which cognition plays an important role, i.e. where outcomes are affected by the agents' level of understanding of the game and the beliefs they form about each other's understanding. We ask whether and when repeated exposure permits agents to learn

  13. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  14. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Full Text Available Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects requires the extensive application of sophisticated analysis and interpretation techniques. Informative...

  15. Seismic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Halbritter, A.L.

    1984-01-01

    Nuclear Power Plants require exceptional safety guarantees, which are reflected in a rigorous control of the employed materials, advanced construction technology, sophisticated methods of analysis and the consideration of non-conventional load cases such as earthquake loading. In this paper, the current procedures used in the seismic analysis of Nuclear Power Plants are presented. The seismic analysis of the structures has two objectives: the determination of forces in the structure in order to design it against earthquakes, and the generation of floor response spectra to be used in the design of mechanical and electrical components and piping systems. (Author) [pt
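
    The second objective mentioned above, generating a response spectrum from an acceleration time history, can be sketched by sweeping single-degree-of-freedom oscillators over frequency; the ground motion below is synthetic and the damping value is only an assumed example.

      # Pseudo-acceleration response spectrum of a synthetic ground motion.
      import numpy as np
      from scipy.signal import lsim, TransferFunction

      dt = 0.005
      t = np.arange(0, 20, dt)
      rng = np.random.default_rng(2)
      ag = rng.normal(0, 1.0, t.size) * np.exp(-0.2 * t)   # synthetic acceleration

      zeta = 0.05                                          # assumed 5% damping
      freqs = np.logspace(-0.5, 1.5, 50)                   # ~0.3 to ~30 Hz
      spectrum = []
      for f in freqs:
          wn = 2 * np.pi * f
          # Relative displacement x: x'' + 2*zeta*wn*x' + wn^2*x = -ag
          sdof = TransferFunction([-1.0], [1.0, 2 * zeta * wn, wn ** 2])
          _, x, _ = lsim(sdof, U=ag, T=t)
          spectrum.append(wn ** 2 * np.max(np.abs(x)))     # pseudo-acceleration
      print(list(zip(freqs[:3], spectrum[:3])))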

  16. GCtool for fuel cell systems design and analysis : user documentation.

    Energy Technology Data Exchange (ETDEWEB)

    Ahluwalia, R.K.; Geyer, H.K.

    1999-01-15

    GCtool is a comprehensive system design and analysis tool for fuel cell and other power systems. A user can analyze any configuration of component modules and flows under steady-state or dynamic conditions. Component models can be arbitrarily complex in modeling sophistication and new models can be added easily by the user. GCtool also treats arbitrary system constraints over part or all of the system, including the specification of nonlinear objective functions to be minimized subject to nonlinear, equality or inequality constraints. This document describes the essential features of the interpreted language and the window-based GCtool environment. The system components incorporated into GCtool include a gas flow mixer, splitter, heater, compressor, gas turbine, heat exchanger, pump, pipe, diffuser, nozzle, steam drum, feed water heater, combustor, chemical reactor, condenser, fuel cells (proton exchange membrane, solid oxide, phosphoric acid, and molten carbonate), shaft, generator, motor, and methanol steam reformer. Several examples of system analysis at various levels of complexity are presented. Also given are instructions for generating two- and three-dimensional plots of data and the details of interfacing new models to GCtool.

  17. Universal spectrum data analysis program for microsoft windows

    International Nuclear Information System (INIS)

    Hao, F.; Cai, Z.; Wang, H.

    1993-01-01

    We have developed a universal spectrum analysis and characterization program for the Microsoft Windows environment. This sophisticated and easy-to-use software package can be employed in many areas for spectra data analysis, parametrization and line profile recognition. Spectra can, for example, be smoothed, calibrated and transformed from the laboratory frame to the projectile frame, and background can be subtracted by using cubic spline functions or exponential functions. Up to 10 peaks and 40 different parameters can be fitted simultaneously, either automatically by least squares routines or manually by system interactive devices. Line profiles include triangular, Gaussian, Lorentzian, Fano, Shore, post-collisional interaction functions etc., and can also be easily expanded to virtually any nonlinear fitting function. In addition, Fast Fourier Transform (FFT) routines allow users to convolute, deconvolute or Fourier analyze complex spectral patterns. Specifically, this program has been applied to high resolution electron- and photon-emission spectra following electron or ion collision with gaseous targets. Some examples of data evaluation will be presented.
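
    The least-squares line-profile fitting described in this record can be illustrated with a single Gaussian profile and SciPy's curve_fit; this is generic code on invented data, not the Windows program itself, which also supports Lorentzian, Fano, Shore and other profiles.

      # Fit one Gaussian line profile to synthetic spectrum data.
      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian(x, amp, center, sigma, offset):
          return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2) + offset

      x = np.linspace(0, 100, 500)
      rng = np.random.default_rng(3)
      y = gaussian(x, 50.0, 42.0, 3.0, 5.0) + rng.normal(0, 1.0, x.size)

      popt, pcov = curve_fit(gaussian, x, y, p0=[40, 40, 2, 0])
      perr = np.sqrt(np.diag(pcov))     # 1-sigma parameter uncertainties
      print("fitted [amp, center, sigma, offset]:", popt)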

  18. Analysis of transfer reactions: determination of spectroscopic factors

    Energy Technology Data Exchange (ETDEWEB)

    Keeley, N. [CEA Saclay, Dept. d'Astrophysique, de Physique des Particules, de Physique Nucleaire et de l'Instrumentation Associee (DSM/DAPNIA/SPhN), 91 - Gif-sur-Yvette (France); The Andrzej Sołtan Institute for Nuclear Studies, Dept. of Nuclear Reactions, Warsaw (Poland)

    2007-07-01

    An overview of the most popular models used for the analysis of direct reaction data is given, concentrating on practical aspects. The following four models (in order of increasing sophistication) are briefly described: the distorted wave Born approximation (DWBA), the adiabatic model, the coupled channels Born approximation, and the coupled reaction channels. As a concrete example, the 12C(d,p)13C reaction at an incident deuteron energy of 30 MeV is analysed with progressively more physically sophisticated models. The effect of the choice of the reaction model on the spectroscopic information extracted from the data is investigated and other sources of uncertainty in the derived spectroscopic factors are discussed. We have shown that the choice of the reaction model can significantly influence the nuclear structure information, particularly the spectroscopic factors or amplitudes but occasionally also the spin-parity, that we wish to extract from direct reaction data. We have also demonstrated that the DWBA can fail to give a satisfactory description of transfer data, but when the tenets of the theory are fulfilled DWBA can work very well and will yield the same results as the most sophisticated models. The use of global rather than fitted optical potentials can also lead to important differences in the extracted spectroscopic factors.

  19. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    Science.gov (United States)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  20. Constructing a sophistication index as a method of market ...

    African Journals Online (AJOL)

    segmentation method offers researchers and marketing practitioners a ... Pallant (2010) recommends a minimum value of 0.6 for a good analysis. ... a means of profiling segments, stock farmers are not classified as unsophisticated, ...

  1. Analysis of LOFT pressurizer spray and surge nozzles to include a 450°F step transient

    International Nuclear Information System (INIS)

    Nitzel, M.E.

    1978-01-01

    This report presents the analysis of the LOFT pressurizer spray and surge nozzles to include a 450°F step thermal transient. Previous analysis performed under subcontract by Basic Technology Incorporated was utilized where applicable. The SAASIII finite element computer program was used to determine stress distributions in the nozzles due to the step transient. Computer results were then incorporated in the necessary additional calculations to ascertain that stress limitations were not exceeded. The results of the analysis indicate that both the spray and surge nozzles will be within stress allowables prescribed by subsubarticle NB-3220 of the 1974 edition of the ASME Boiler and Pressure Vessel Code when subjected to currently known design, normal operating, upset, emergency, and faulted condition loads.

  2. CATDAT : A Program for Parametric and Nonparametric Categorical Data Analysis : User's Manual Version 1.0, 1998-1999 Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-12-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical responses. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.
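
    One of the four techniques listed, K-nearest neighbor classification of a categorical response, can be sketched as follows with scikit-learn rather than CATDAT itself; the environmental predictors and presence/absence response are synthetic assumptions.

      # K-nearest neighbor classification of a categorical (binary) response.
      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(4)
      X = rng.normal(size=(300, 4))                   # e.g., habitat measurements
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # presence/absence response

      knn = KNeighborsClassifier(n_neighbors=7)
      scores = cross_val_score(knn, X, y, cv=5)
      print("cross-validated accuracy:", scores.mean())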

  3. Decision analysis for dynamic accounting of nuclear material

    International Nuclear Information System (INIS)

    Shipley, J.P.

    1978-01-01

    Effective materials accounting for special nuclear material in modern fuel cycle facilities will depend heavily on sophisticated data analysis techniques. Decision analysis, which combines elements of estimation theory, decision theory, and systems analysis, is a framework well suited to the development and application of these techniques. Augmented by pattern-recognition tools such as the alarm-sequence chart, decision analysis can be used to reduce errors caused by subjective data evaluation and to condense large collections of data to a smaller set of more descriptive statistics. Application to data from a model plutonium nitrate-to-oxide conversion process illustrates the concepts

  4. Introduction to Economic Analysis

    OpenAIRE

    R. Preston McAfee

    2005-01-01

    This book presents introductory economics ("principles") material using standard mathematical tools, including calculus. It is designed for a relatively sophisticated undergraduate who has not taken a basic university course in economics. It also contains the standard intermediate microeconomics material and some material that ought to be standard but is not. The book can easily serve as an intermediate microeconomics text. The focus of this book is on the conceptual tools and not on fluff. M...

  5. Linearized potential flow analysis of a 40-chamber oscillating water column wave energy device

    DEFF Research Database (Denmark)

    Bingham, Harry B.; Read, Robert

    ... The calculations are compared to model-scale measurements in a slack-moored condition, and generally good agreement is found. Work is in progress to move the solution to the time domain and include a more sophisticated PTO model which includes nonlinear and air compressibility effects in the turbine.

  6. High Performance Liquid Chromatography of Some Analgesic Compounds: An Instrumental Analysis Experiment.

    Science.gov (United States)

    Haddad, Paul; And Others

    1983-01-01

    Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…

  7. Systematic review and meta-analysis of studies evaluating diagnostic test accuracy: A practical review for clinical researchers, Part II. General guidance and tips

    International Nuclear Information System (INIS)

    Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho; Lee, June Young

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it is required to simultaneously analyze a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies.
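
    The starting point of the bivariate model mentioned above can be sketched by converting per-study sensitivity and specificity to logits, whose (typically negative) correlation is what the model captures; the study counts below are invented for illustration.

      # Logit-transformed (sensitivity, specificity) pairs from 2x2 study tables.
      import numpy as np

      # Columns: TP, FN, TN, FP for a few hypothetical primary studies.
      studies = np.array([[45, 5, 80, 20],
                          [30, 10, 60, 15],
                          [50, 8, 90, 30],
                          [25, 12, 70, 10]], dtype=float)

      tp, fn, tn, fp = studies.T
      sens = tp / (tp + fn)
      spec = tn / (tn + fp)
      logit = lambda p: np.log(p / (1 - p))
      pairs = np.column_stack([logit(sens), logit(spec)])
      print("logit pairs:", pairs)
      print("between-study correlation:", np.corrcoef(pairs.T)[0, 1])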

  8. Optoelectronic Devices Advanced Simulation and Analysis

    CERN Document Server

    Piprek, Joachim

    2005-01-01

    Optoelectronic devices transform electrical signals into optical signals and vice versa by utilizing the sophisticated interaction of electrons and light within micro- and nano-scale semiconductor structures. Advanced software tools for design and analysis of such devices have been developed in recent years. However, the large variety of materials, devices, physical mechanisms, and modeling approaches often makes it difficult to select appropriate theoretical models or software packages. This book presents a review of devices and advanced simulation approaches written by leading researchers and software developers. It is intended for scientists and device engineers in optoelectronics, who are interested in using advanced software tools. Each chapter includes the theoretical background as well as practical simulation results that help to better understand internal device physics. The software packages used in the book are available to the public, on a commercial or noncommercial basis, so that the interested r...

  9. Handbook of soil analysis. Mineralogical, organic and inorganic methods

    Energy Technology Data Exchange (ETDEWEB)

    Pansu, M. [Centre IRD, 34 - Montpellier (France); Gautheyrou, J.

    2006-07-01

    This handbook is a reference guide for selecting and carrying out numerous methods of soil analysis. It is written in accordance with analytical standards and quality control approaches. It covers a large body of technical information including protocols, tables, formulae, spectrum models, chromatograms and additional analytical diagrams. The approaches are diverse, from the simplest tests to the most sophisticated determination methods in the physical chemistry of mineralogical and organic structures, available and total elements, soil exchange complex, pesticides and contaminants, trace elements and isotopes. As a basic reference, it will be particularly useful to scientists, engineers, technicians, professors and students, in the areas of soil science, agronomy, earth and environmental sciences as well as in related fields such as analytical chemistry, geology, hydrology, ecology, climatology, civil engineering and industrial activities associated with soil. (orig.)

  10. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.

    2015-01-01

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  11. Diversity of Pseudomonas Genomes, Including Populus-Associated Isolates, as Revealed by Comparative Genome Analysis.

    Science.gov (United States)

    Jun, Se-Ran; Wassenaar, Trudy M; Nookaew, Intawat; Hauser, Loren; Wanchai, Visanu; Land, Miriam; Timm, Collin M; Lu, Tse-Yuan S; Schadt, Christopher W; Doktycz, Mitchel J; Pelletier, Dale A; Ussery, David W

    2016-01-01

    The Pseudomonas genus contains a metabolically versatile group of organisms that are known to occupy numerous ecological niches, including the rhizosphere and endosphere of many plants. Their diversity influences the phylogenetic diversity and heterogeneity of these communities. On the basis of average amino acid identity, comparative genome analysis of >1,000 Pseudomonas genomes, including 21 Pseudomonas strains isolated from the roots of native Populus deltoides (eastern cottonwood) trees resulted in consistent and robust genomic clusters with phylogenetic homogeneity. All Pseudomonas aeruginosa genomes clustered together, and these were clearly distinct from other Pseudomonas species groups on the basis of pangenome and core genome analyses. In contrast, the genomes of Pseudomonas fluorescens were organized into 20 distinct genomic clusters, representing enormous diversity and heterogeneity. Most of our 21 Populus-associated isolates formed three distinct subgroups within the major P. fluorescens group, supported by pathway profile analysis, while two isolates were more closely related to Pseudomonas chlororaphis and Pseudomonas putida. Genes specific to Populus-associated subgroups were identified. Genes specific to subgroup 1 include several sensory systems that act in two-component signal transduction, a TonB-dependent receptor, and a phosphorelay sensor. Genes specific to subgroup 2 contain hypothetical genes, and genes specific to subgroup 3 were annotated with hydrolase activity. This study justifies the need to sequence multiple isolates, especially from P. fluorescens, which displays the most genetic variation, in order to study functional capabilities from a pangenomic perspective. This information will prove useful when choosing Pseudomonas strains for use to promote growth and increase disease resistance in plants. Copyright © 2015 Jun et al.

  12. Application of atomic absorption in molecular analysis (spectrophotometry)

    International Nuclear Information System (INIS)

    Baliza, S.V.; Soledade, L.E.B.

    1981-01-01

    The atomic absorption apparatus is regarded by experts in chemical analysis as one of the most important instruments currently in use in this field. Among its several applications, one should emphasize direct and indirect metal analyses using flame, graphite furnace, cold vapor generator,... Besides these known applications, the authors have developed at the R and D Center of CSN a patent-pending method for using this equipment for molecular analysis, in substitution for a sophisticated and specific apparatus. (Author) [pt

  13. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    Science.gov (United States)

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.
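
    The clustering step described above can be sketched with k-means on metric preference ratings; the attribute set, segment count, and rating distributions below are illustrative assumptions, not the study's survey data.

      # K-means segmentation of synthetic metric preference ratings.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(5)
      # Hypothetical ratings for three plan attributes (e.g., premium,
      # provider choice, coverage breadth), one row per respondent.
      ratings = np.vstack([rng.normal(loc, 0.7, size=(100, 3))
                           for loc in ([1, 4, 2], [4, 1, 3], [3, 3, 5])])

      X = StandardScaler().fit_transform(ratings)
      segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
      print("segment sizes:", np.bincount(segments))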

  14. Analysis of a proposed crucial test of quantum mechanics

    International Nuclear Information System (INIS)

    Collett, M.J.; Loudon, R.

    1987-01-01

    An experiment based on an extension of the Einstein-Podolsky-Rosen argument has been proposed by Popper as a crucial test of the Copenhagen interpretation of quantum mechanics. Here the authors show, by a slightly more complete version of Popper's analysis, although still at a relatively primitive level of sophistication, that the proposed experiment does not in fact provide such a test. (author)

  15. Analysis of natural circulation BWR dynamics with stochastic and deterministic methods

    International Nuclear Information System (INIS)

    VanderHagen, T.H.; Van Dam, H.; Hoogenboom, J.E.; Kleiss, E.B.J.; Nissen, W.H.M.; Oosterkamp, W.J.

    1986-01-01

    Reactor kinetic, thermal hydraulic and total plant stability of a natural convection cooled BWR was studied using noise analysis and by evaluation of process responses to control rod steps and to steamflow control valve steps. An estimate of the fuel thermal time constant and an impression of the recirculation flow response to power variations were obtained. A sophisticated noise analysis method resulted in more insight into the fluctuations of the coolant velocity.
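
    A basic noise-analysis step of the kind referred to above is the estimation of a frequency response between two fluctuating plant signals from their cross- and auto-spectra; the sketch below uses synthetic signals and an assumed first-order lag, not reactor data.

      # Empirical transfer function estimate between two noisy signals.
      import numpy as np
      from scipy.signal import welch, csd, lfilter

      fs = 100.0
      t = np.arange(0, 200, 1 / fs)
      rng = np.random.default_rng(6)
      power = rng.normal(0, 1, t.size)          # e.g., neutron flux noise
      # Assume the coolant velocity responds as a first-order lag (tau = 2 s).
      alpha = np.exp(-1 / (fs * 2.0))
      velocity = (lfilter([1 - alpha], [1, -alpha], power)
                  + 0.1 * rng.normal(0, 1, t.size))

      f, Pxy = csd(power, velocity, fs=fs, nperseg=1024)
      _, Pxx = welch(power, fs=fs, nperseg=1024)
      H = Pxy / Pxx                             # H1 transfer function estimate
      print("low-frequency gain:", np.abs(H[:3]))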

  16. Utility of lab-on-a-chip technology for high-throughput nucleic acid and protein analysis

    DEFF Research Database (Denmark)

    Hawtin, Paul; Hardern, Ian; Wittig, Rainer

    2005-01-01

    On-chip electrophoresis can provide size separations of nucleic acids and proteins similar to more traditional slab gel electrophoresis. Lab-on-a-chip (LoaC) systems utilize on-chip electrophoresis in conjunction with sizing calibration, sensitive detection schemes, and sophisticated data analysi...

  17. Line outage contingency analysis including the system islanding ...

    African Journals Online (AJOL)

    The optimally ordered sparse [Bʹ], [Bʺ] matrices for the integrated system are used for load flow analysis to determine modified values of the voltage phase angles [δ] and bus voltages [V], and thereby the overloading effect on the remaining lines due to the outage of a selected line contingency. In case of overloading in ...
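
    The contingency idea can be illustrated with a linearized (DC) load flow on a small test network: drop one line, re-solve the reduced susceptance system for the angles, and compare the resulting line flows. The paper itself works with the fast-decoupled [Bʹ], [Bʺ] formulation; the network below is an invented 4-bus example.

      # DC load flow before and after a single line outage (4-bus example).
      import numpy as np

      # lines: (from_bus, to_bus, susceptance)
      lines = [(0, 1, 5.0), (0, 2, 4.0), (1, 2, 3.0), (1, 3, 6.0), (2, 3, 2.0)]
      P = np.array([1.5, -0.5, -0.7])   # injections at buses 1..3 (bus 0 = slack)

      def dc_flows(active_lines):
          B = np.zeros((4, 4))
          for i, j, b in active_lines:
              B[i, i] += b; B[j, j] += b
              B[i, j] -= b; B[j, i] -= b
          theta = np.zeros(4)
          theta[1:] = np.linalg.solve(B[1:, 1:], P)   # slack angle fixed at 0
          return {(i, j): b * (theta[i] - theta[j]) for i, j, b in active_lines}

      base = dc_flows(lines)
      outage = dc_flows([l for l in lines if l[:2] != (1, 3)])  # drop line 1-3
      print({k: round(outage[k] - base[k], 3) for k in outage})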

  18. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu/~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  19. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  20. Space Weather opportunities from the Swarm mission including near real time applications

    DEFF Research Database (Denmark)

    Stolle, Claudia; Floberghagen, Rune; Luehr, Hermann

    2013-01-01

    Sophisticated space weather monitoring aims at nowcasting and predicting solar-terrestrial interactions because their effects on the ionosphere and upper atmosphere may seriously impact advanced technology. Operating alert infrastructures rely heavily on ground-based measurements and satellite ... these products in a timely manner will add significant value in monitoring present space weather and helping to predict the evolution of several magnetic and ionospheric events. Swarm will be a demonstrator mission for the valuable application of LEO satellite observations for space weather monitoring tools ...

  1. Cask crush pad analysis using detailed and simplified analysis methods

    International Nuclear Information System (INIS)

    Uldrich, E.D.; Hawkes, B.D.

    1997-01-01

    A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft. deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed by either hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach.

  2. Isogeometric analysis of free-form Timoshenko curved beams including the nonlinear effects of large deformations

    Science.gov (United States)

    Hosseini, Seyed Farhad; Hashemian, Ali; Moetakef-Imani, Behnam; Hadidimoud, Saied

    2018-03-01

    In the present paper, the isogeometric analysis (IGA) of free-form planar curved beams is formulated based on the nonlinear Timoshenko beam theory to investigate the large deformation of beams with variable curvature. Based on the isoparametric concept, the shape functions of the field variables (displacement and rotation) in a finite element analysis are considered to be the same as the non-uniform rational basis spline (NURBS) basis functions defining the geometry. The validity of the presented formulation is tested in five case studies covering a wide range of engineering curved structures including from straight and constant curvature to variable curvature beams. The nonlinear deformation results obtained by the presented method are compared to well-established benchmark examples and also compared to the results of linear and nonlinear finite element analyses. As the nonlinear load-deflection behavior of Timoshenko beams is the main topic of this article, the results strongly show the applicability of the IGA method to the large deformation analysis of free-form curved beams. Finally, it is interesting to notice that, until very recently, the large deformations analysis of free-form Timoshenko curved beams has not been considered in IGA by researchers.

  3. Nuclear Reactor Engineering Analysis Laboratory

    International Nuclear Information System (INIS)

    Carlos Chavez-Mercado; Jaime B. Morales-Sandoval; Benjamin E. Zayas-Perez

    1998-01-01

    The Nuclear Reactor Engineering Analysis Laboratory (NREAL) is a sophisticated computer system with state-of-the-art analytical tools and technology for analysis of light water reactors. Multiple application software tools can be activated to carry out different analyses and studies such as nuclear fuel reload evaluation, safety operation margin measurement, transient and severe accident analysis, nuclear reactor instability, operator training, normal and emergency procedures optimization, and human factors engineering studies. An advanced graphic interface, driven through touch-sensitive screens, provides the means to interact with specialized software and nuclear codes. The interface allows the visualization and control of all observable variables in a nuclear power plant (NPP), as well as a selected set of nonobservable or not directly controllable variables from conventional control panels

  4. Applications of neutron activation analysis in determination of natural and man-made radionuclides, including PA-231

    Science.gov (United States)

    Byrne, A. R.; Benedik, L.

    1999-01-01

    Neutron activation analysis (NAA), being essentially an isotopic and not an elemental method of analysis, is capable of determining a number of important radionuclides of radioecological interest by transformation into another, more easily quantifiable radionuclide. The nuclear characteristics which favour this technique may be summarized in an advantage factor relative to radiometric analysis of the original radioanalyte. Well-known or little-known examples include 235U, 238U, 232Th, 230Th, 129I, 99Tc, 237Np and 231Pa; a number of these are discussed and illustrated in the analysis of real samples of environmental and biological origin. In particular, determination of 231Pa by RNAA was performed using both postirradiation and preseparation methods. Application of INAA to enable the use of 238U and 232Th as endogenous (internal) radiotracers in alpha spectrometric analyses of uranium and thorium radioisotopes in radioecological studies is described, also allowing independent data sets to be obtained for quality control.

  5. Static, Lightweight Includes Resolution for PHP

    NARCIS (Netherlands)

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen)

    2014-01-01

    Dynamic languages include a number of features that are challenging to model properly in static analysis tools. In PHP, one of these features is the include expression, where an arbitrary expression provides the path of the file to include at runtime. In this paper we present two

  6. A multimethod analysis of shared decision-making in hospice interdisciplinary team meetings including family caregivers.

    Science.gov (United States)

    Washington, Karla T; Oliver, Debra Parker; Gage, L Ashley; Albright, David L; Demiris, George

    2016-03-01

    Much of the existing research on shared decision-making in hospice and palliative care focuses on the provider-patient dyad; little is known about shared decision-making that is inclusive of family members of patients with advanced disease. We sought to describe shared decision-making as it occurred in hospice interdisciplinary team meetings that included family caregivers as participants using video-conferencing technology. We conducted a multimethod study in which we used content and thematic analysis techniques to analyze video-recordings of hospice interdisciplinary team meetings (n = 100), individual interviews of family caregivers (n = 73) and hospice staff members (n = 78), and research field notes. Participants in the original studies from which data for this analysis were drawn were hospice family caregivers and staff members employed by one of five different community-based hospice agencies located in the Midwestern United States. Shared decision-making occurred infrequently in hospice interdisciplinary team meetings that included family caregivers. Barriers to shared decision-making included time constraints, communication skill deficits, unaddressed emotional needs, staff absences, and unclear role expectations. The hospice philosophy of care, current trends in healthcare delivery, the interdisciplinary nature of hospice teams, and the designation of a team leader/facilitator supported shared decision-making. The involvement of family caregivers in hospice interdisciplinary team meetings using video-conferencing technology creates a useful platform for shared decision-making; however, steps must be taken to transform family caregivers from meeting attendees to shared decision-makers. © The Author(s) 2015.

  7. Modern devices the simple physics of sophisticated technology

    CERN Document Server

    Joseph, Charles L

    2016-01-01

    This book discusses the principles of physics through applications of state-of-the-art technologies and advanced instruments. The authors use diagrams, sketches, and graphs coupled with equations and mathematical analysis to enhance the reader's understanding of modern devices. Readers will learn to identify common underlying physical principles that govern several types of devices, while gaining an understanding of the performance trade-off imposed by the physical limitations of various processing methods. The topics discussed in the book assume readers have taken an introductory physics course, college algebra, and have a basic understanding of calculus. * Describes the basic physics behind a large number of devices encountered in everyday life, from the air conditioner to Blu-ray discs * Covers state-of-the-art devices such as spectrographs, photoelectric image sensors, spacecraft systems, astronomical and planetary observatories, biomedical imaging instruments, particle accelerators, and jet engines * Inc...

  8. Information flows at OS level unmask sophisticated Android malware

    OpenAIRE

    Viet Triem Tong , Valérie; Trulla , Aurélien; Leslous , Mourad; Lalande , Jean-François

    2017-01-01

    The detection of new Android malware is far from being a relaxing job. Indeed, each day new Android malware appears in the market and it remains difficult to identify it quickly. Unfortunately, users still suffer from the lack of truly efficient tools able to detect zero-day malware that has no known signature. The difficulty is that most of the existing approaches rely on static analysis, which is defeated by the ability of malware to hide their malicious code. Thus, we believe that i...

  9. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
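
    The over-representation analysis mentioned in this record is commonly implemented with a hypergeometric test; the sketch below is that generic test under invented gene counts, not MONGKIE's own Java implementation.

      # Hypergeometric over-representation test for one pathway.
      from scipy.stats import hypergeom

      N = 20000   # genes in the background
      K = 150     # background genes annotated to the pathway
      n = 400     # genes in the user's cluster / module
      k = 12      # cluster genes that fall in the pathway

      # P(X >= k): chance of observing at least k pathway genes in the cluster.
      p_value = hypergeom.sf(k - 1, N, K, n)
      print(f"over-representation p = {p_value:.3g}")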

  10. Factor Analysis of Drawings: Application to college student models of the greenhouse effect

    Science.gov (United States)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-09-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.
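
    The pipeline described above (binary feature codes from content analysis, factored and then scored per drawing) can be sketched as follows; the data are synthetic, the 4-factor choice simply mirrors the paper, and a production analysis of binary codes would normally use tetrachoric correlations rather than the Gaussian model assumed here.

      # Exploratory factor analysis of binary drawing-feature codes.
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(7)
      # 200 drawings coded for 12 salient features (1 = present, 0 = absent).
      features = (rng.random((200, 12)) > 0.5).astype(float)

      fa = FactorAnalysis(n_components=4, random_state=0).fit(features)
      loadings = fa.components_          # (4, 12) feature loadings per factor
      scores = fa.transform(features)    # per-drawing alignment with each model
      print("factor scores shape:", scores.shape)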

  11. Cardiovascular imaging environment: will the future be cloud-based?

    Science.gov (United States)

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

    In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Not every institution can afford the various types of software or provide the expertise to perform such analysis. Areas covered: Various cloud services exist related to data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option for coping with the storage of large image datasets and for offering sophisticated cardiovascular image analysis to institutions of all sizes.

  12. Measurements and analysis of online social networks

    OpenAIRE

    González Sánchez, Roberto

    2014-01-01

    Online Social Networks (OSNs) have become the most used Internet applications, attracting hundreds of millions of active users every day. The large amount of valuable information in OSNs (never available before) has attracted the research community to design sophisticated techniques to collect, process, interpret and apply these data in a wide range of disciplines including Sociology, Marketing, Computer Science, etc. This thesis presents a series of ...

  13. Interface and thin film analysis: Comparison of methods, trends

    International Nuclear Information System (INIS)

    Werner, H.W.; Torrisi, A.

    1990-01-01

    Thin film properties are governed by a number of parameters such as: Surface and interface chemical composition, microstructure and the distribution of defects, dopants and impurities. For the determination of most of these aspects sophisticated analytical methods are needed. An overview of these analytical methods is given including: - Features and modes of analytical methods; - Main characteristics, advantages and disadvantages of the established methods [e.g. ESCA (Electron Spectroscopy for Chemical Analysis), AES (Auger Electron Spectroscopy), SIMS (Secondary Ion Mass Spectrometry), RBS (Rutherford Backscattering Spectrometry), SEM (Scanning Electron Microscopy), TEM (Transmission Electron Microscopy), illustrated with typical examples]; - Presentation of relatively new methods such as XRM (X-ray Microscopy) and SCAM (Scanning Acoustic Microscopy). Some features of ESCA (chemical information, insulator analysis, non-destructive depth profiling) have been selected for a more detailed presentation, viz. to illustrate the application of ESCA to practical problems. Trends in instrumental development and analytical applications of the techniques are discussed; the need for a multi-technique approach to solve complex analytical problems is emphasized. (orig.)

  14. Analysis of advanced European nuclear fuel cycle scenarios including transmutation and economical estimates

    International Nuclear Information System (INIS)

    Merino Rodriguez, I.; Alvarez-Velarde, F.; Martin-Fuertes, F.

    2013-01-01

    Four European fuel cycle scenarios involving transmutation options have been addressed from the point of view of resource utilization and economics. Scenarios include the current fleet using Light Water Reactor (LWR) technology and an open fuel cycle (as a reference scenario), a full replacement of the initial fleet with Fast Reactors (FR) burning U-Pu MOX fuel, and two fuel cycles with Minor Actinide (MA) transmutation in a fraction of the FR fleet or in dedicated Accelerator Driven Systems (ADS). Results reveal that all scenarios are feasible according to nuclear resources demand. Regarding the economic analysis, the estimations show an increase of LCOE, averaged over the whole period, with respect to the reference scenario of 20% for the Pu management scenario and around 35% for both transmutation scenarios, respectively.

  15. Analysis of advanced European nuclear fuel cycle scenarios including transmutation and economical estimates

    Energy Technology Data Exchange (ETDEWEB)

    Merino Rodriguez, I.; Alvarez-Velarde, F.; Martin-Fuertes, F.

    2013-07-01

    Four European fuel cycle scenarios involving transmutation options have been addressed from the point of view of resource utilization and economics. Scenarios include the current fleet using Light Water Reactor (LWR) technology and an open fuel cycle (as a reference scenario), a full replacement of the initial fleet with Fast Reactors (FR) burning U-Pu MOX fuel, and two fuel cycles with Minor Actinide (MA) transmutation in a fraction of the FR fleet or in dedicated Accelerator Driven Systems (ADS). Results reveal that all scenarios are feasible according to nuclear resources demand. Regarding the economic analysis, the estimations show an increase of LCOE, averaged over the whole period, with respect to the reference scenario of 20% for the Pu management scenario and around 35% for both transmutation scenarios, respectively.

  16. Analysis of plutonium gamma-ray spectra by small portable computers

    International Nuclear Information System (INIS)

    Ruhter, W.; Gunnink, R.; Camp, D.; DeCarolis, M.

    1985-01-01

    A sophisticated program for isotopic analysis of plutonium gamma-ray spectra using small computers has been developed. It is implemented on a DEC LSI-11/2 configured in a portable unit without a mass storage device for use by IAEA inspectors in the field. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are needed as input. Analysis is completed in 90 seconds by fitting isotopic component response functions to peak multiplets. 9 refs., 2 figs., 1 tab

  17. Development of calculation method for one-dimensional kinetic analysis in fission reactors, including feedback effects

    International Nuclear Information System (INIS)

    Paixao, S.B.; Marzo, M.A.S.; Alvim, A.C.M.

    1986-01-01

    The calculation method used in the WIGLE code is studied. Because no detailed exposition of this solution was available, the method is expounded in detail here. The method has been applied to the solution of the one-dimensional, two-group diffusion equations in slab geometry for axial analysis, including non-boiling heat transfer and accounting for feedback. A steady-state program (CITER-1D), written in FORTRAN 4, has been implemented, providing excellent results and confirming the quality of the work developed. (Author) [pt

  18. Elementary real and complex analysis

    CERN Document Server

    Shilov, Georgi E

    1996-01-01

    In this book the renowned Russian mathematician Georgi E. Shilov brings his unique perspective to real and complex analysis, an area of perennial interest in mathematics. Although there are many books available on the topic, the present work is specially designed for undergraduates in mathematics, science and engineering. A high level of mathematical sophistication is not required.The book begins with a systematic study of real numbers, understood to be a set of objects satisfying certain definite axioms. The concepts of a mathematical structure and an isomorphism are introduced in Chapter 2,

  19. Practical Analysis of the Dynamic Characteristics of JavaScript

    OpenAIRE

    Wei, Shiyi

    2015-01-01

    JavaScript is a dynamic object-oriented programming language, which is designed with flexible programming mechanisms. JavaScript is widely used in developing sophisticated software systems, especially web applications. Despite its popularity, there is a lack of software tools that support JavaScript for software engineering clients. Dataflow analysis approximates software behavior by analyzing the program code; it is the foundation for many software tools. However, several unique features...

  20. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Science.gov (United States)

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  1. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Full Text Available Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP) data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report on the need to include in these tools statistical methods designed for pathway-based analysis of SNP data, with the express aim of defining features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variation data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  2. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  3. Validity of segmental bioelectrical impedance analysis for estimating fat-free mass in children including overweight individuals.

    Science.gov (United States)

    Ohta, Megumi; Midorikawa, Taishi; Hikihara, Yuki; Masuo, Yoshihisa; Sakamoto, Shizuo; Torii, Suguru; Kawakami, Yasuo; Fukunaga, Tetsuo; Kanehisa, Hiroaki

    2017-02-01

    This study examined the validity of segmental bioelectrical impedance (BI) analysis for predicting the fat-free masses (FFMs) of whole-body and body segments in children including overweight individuals. The FFM and impedance (Z) values of arms, trunk, legs, and whole body were determined using dual-energy X-ray absorptiometry and segmental BI analyses, respectively, in 149 boys and girls aged 6 to 12 years, who were divided into model-development (n = 74), cross-validation (n = 35), and overweight (n = 40) groups. Simple regression analysis was applied to (length)²/Z (BI index) for each of the whole-body and 3 segments to develop the prediction equations of the measured FFM of the related body part. In the model-development group, the BI index of each of the 3 segments and whole body was significantly correlated to the measured FFM (R² = 0.867-0.932, standard error of estimation = 0.18-1.44 kg (5.9%-8.7%)). There was no significant difference between the measured and predicted FFM values without systematic error. The application of each equation derived in the model-development group to the cross-validation and overweight groups did not produce significant differences between the measured and predicted FFM values and systematic errors, with the exception that the arm FFM in the overweight group was overestimated. Segmental bioelectrical impedance analysis is useful for predicting the FFM of each of whole-body and body segments in children including overweight individuals, although the application for estimating arm FFM in overweight individuals requires a certain modification.
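
    The prediction step described above reduces to a simple linear regression of measured FFM on the BI index (length²/Z) for each segment; all numbers in the sketch below are synthetic stand-ins, not the study's measurements.

      # Linear regression of FFM on the bioelectrical impedance index.
      import numpy as np

      rng = np.random.default_rng(8)
      length_cm = rng.uniform(50, 80, 74)    # segment length
      Z_ohm = rng.uniform(200, 400, 74)      # segmental impedance
      bi_index = length_cm ** 2 / Z_ohm
      ffm_kg = 0.35 * bi_index + 1.2 + rng.normal(0, 0.8, 74)  # "measured" FFM

      slope, intercept = np.polyfit(bi_index, ffm_kg, 1)
      pred = slope * bi_index + intercept
      see = np.sqrt(np.sum((ffm_kg - pred) ** 2) / (ffm_kg.size - 2))
      print(f"FFM = {slope:.3f} * (L^2/Z) + {intercept:.3f}, SEE = {see:.2f} kg")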

  4. Human factors design of nuclear power plant control rooms including computer-based operator aids

    International Nuclear Information System (INIS)

    Bastl, W.; Felkel, L.; Becker, G.; Bohr, E.

    1983-01-01

    The scientific handling of human factors problems in control rooms began around 1970 on the basis of safety considerations. Some recent research work deals with the development of computerized systems like plant balance calculation, safety parameter display, alarm reduction and disturbance analysis. For disturbance analysis purposes it is necessary to homogenize the information presented to the operator according to the actual plant situation in order to supply the operator with the information he most urgently needs at the time. Different approaches for solving this problem are discussed, and an overview is given on what is being done. Other research projects concentrate on the detailed analysis of operators' diagnosis strategies in unexpected situations, in order to obtain a better understanding of their mental processes and the influences upon them when such situations occur. This project involves the use of a simulator and sophisticated recording and analysis methods. Control rooms are currently designed with the aid of mock-ups. They enable operators to contribute their experience to the optimization of the arrangement of displays and controls. Modern control rooms are characterized by increasing use of process computers and CRT (Cathode Ray Tube) displays. A general concept for the integration of the new computerized system and the conventional control panels is needed. The technical changes modify operators' tasks, and future ergonomic work in nuclear plants will need to consider the re-allocation of function between man and machine, the incorporation of task changes in training programmes, and the optimal design of information presentation using CRTs. Aspects of developments in control room design are detailed, typical research results are dealt with, and a brief forecast of the ergonomic contribution to be made in the Federal Republic of Germany is given

  5. Precise analysis of the metal package photomultiplier single photoelectron spectra

    International Nuclear Information System (INIS)

    Chirikov-Zorin, I.E.; Fedorko, I.; Sykora, I.; Tokar, S.; Menzione, A.

    2000-01-01

    A deconvolution method based on a sophisticated photomultiplier response function was used to analyse the compact metal package photomultiplier spectra taken in single photoelectron mode. The spectra taken by Hamamatsu R5600 and R5900 photomultipliers have been analysed. The detailed analysis shows that the method appropriately describes the process of charge multiplication in these photomultipliers over a wide range of working regimes, and the deconvoluted parameters are established with about 1% accuracy. The method can be used for a detailed analysis of photomultiplier noise and for calibration purposes
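
    A sketch in the spirit of the response function used above: a pedestal plus Poisson-weighted multi-photoelectron Gaussians, fitted by nonlinear least squares rather than the paper's full deconvolution (which also models background terms). All parameter values are illustrative.

        import math
        import numpy as np
        from scipy.optimize import curve_fit

        def spe_response(x, n0, q0, s0, q1, s1, mu):
            # Sum over n photoelectrons of Poisson(mu)-weighted Gaussians;
            # q0, s0: pedestal position/width; q1, s1: single-pe gain/width.
            y = np.zeros_like(x, dtype=float)
            for n in range(10):  # truncate the Poisson sum
                w = math.exp(-mu) * mu ** n / math.factorial(n)
                sigma = math.sqrt(s0 ** 2 + n * s1 ** 2)
                y += w * np.exp(-0.5 * ((x - q0 - n * q1) / sigma) ** 2) \
                     / (sigma * math.sqrt(2 * math.pi))
            return n0 * y

        # Toy charge spectrum generated from the model itself, then re-fitted:
        q = np.linspace(-2, 10, 240)
        counts = spe_response(q, 5e4, 0.0, 0.3, 2.0, 0.9, 1.2)
        counts += np.random.default_rng(0).normal(0.0, 5.0, q.size)
        popt, _ = curve_fit(spe_response, q, counts, p0=(4e4, 0.1, 0.4, 1.8, 1.0, 1.0))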

  6. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    Science.gov (United States)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphical user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  7. Analysis Tools for the Ion Cyclotron Emission Diagnostic on DIII-D

    Science.gov (United States)

    Del Castillo, C. A.; Thome, K. E.; Pinsker, R. I.; Meneghini, O.; Pace, D. C.

    2017-10-01

    Ion cyclotron emission (ICE) waves are excited by suprathermal particles such as neutral beam particles and fusion products. An ICE diagnostic is under consideration for use at ITER, where it could provide important passive measurements of fast-ion locations and losses, which are otherwise difficult to determine. Simple ICE data analysis codes had previously been developed, but more sophisticated codes are required to facilitate data analysis. Several terabytes of ICE data were collected on DIII-D during the 2015-2017 campaign. The ICE diagnostic consists of antenna straps and dedicated magnetic probes that are both digitized at 200 MHz. A suite of Python spectral analysis tools within the OMFIT framework is under development to perform the memory-intensive analysis of these data. A fast and optimized analysis allows ready access to data visualizations as spectrograms and as plots of both frequency and time cuts of the data. A database of processed ICE data is being constructed to understand the relationship between the frequency and intensity of ICE and a variety of experimental parameters, including neutral beam power and geometry, local and global plasma parameters, magnetic fields, and many others. Work supported in part by US DoE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698.
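
    A minimal sketch of the spectral step described above, assuming only the stated 200 MHz digitization rate; the synthetic test tone stands in for a digitized ICE probe channel, which in practice would be fetched through OMFIT.

        import numpy as np
        from scipy.signal import spectrogram

        fs = 200e6                            # 200 MHz digitization rate
        t = np.arange(0, 1e-3, 1 / fs)        # 1 ms of synthetic signal
        sig = np.sin(2 * np.pi * 15e6 * t)    # 15 MHz test tone
        sig += 0.1 * np.random.default_rng(1).standard_normal(t.size)

        f, tt, Sxx = spectrogram(sig, fs=fs, nperseg=4096, noverlap=2048)
        # Sxx[i, j] is the power at frequency f[i] and time tt[j]; a row is a
        # frequency cut and a column a time cut, matching the plots described above.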

  8. An artificial intelligence approach towards disturbance analysis

    International Nuclear Information System (INIS)

    Fiedler, U.; Lindner, A.; Baldeweg, F.; Klebau, J.

    1986-01-01

    Scale and degree of sophistication of technological plants, e.g. nuclear power plants, have been essentially increased during the last decades. Conventional disturbance analysis systems have proved to work successfully in well-known situations. But in cases of emergencies, the operator needs more advanced assistance in realizing diagnosis and therapy control. The significance of introducing artificial intelligence (AI) methods in nuclear power technology is emphasized. Main features of the on-line disturbance analysis system SAAP-2 are reported about. It is being developed for application to nuclear power plants. Problems related to man-machine communication will be gone into more detail, because their solution will influence end-user acceptance considerably. (author)

  9. Advanced Analysis Methods in High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  10. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.
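
    A sketch of the two-dimensional event storage and post-selection idea described above, with synthetic (time, energy) pairs standing in for recorded gamma counts; the MLE/metaheuristic fit itself is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        events = np.column_stack([
            rng.uniform(0.0, 600.0, n),        # arrival time (s)
            rng.normal(1332.5, 1.5, n),        # deposited energy (keV), a toy peak
        ])

        # Post-selection of an energy window and a time frame, done entirely in
        # software, without modifying the experimental setup:
        e_lo, e_hi, t_lo, t_hi = 1329.0, 1336.0, 0.0, 300.0
        sel = events[(events[:, 1] >= e_lo) & (events[:, 1] <= e_hi)
                     & (events[:, 0] >= t_lo) & (events[:, 0] <= t_hi)]

        # Decay-curve histogram of the selected events, the input that a
        # half-life fit (the MLE-plus-metaheuristics step) would work on:
        counts, edges = np.histogram(sel[:, 0], bins=60)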

  11. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  12. Earthquake analysis of structures including structure-soil interaction by a substructure method

    International Nuclear Information System (INIS)

    Chopra, A.K.; Guttierrez, J.A.

    1977-01-01

    A general substructure method for analysis of the response of nuclear power plant structures to earthquake ground motion, including the effects of structure-soil interaction, is summarized. The method is applicable to complex structures idealized as finite element systems, with the soil region treated either as a continuum, for example as a viscoelastic halfspace, or idealized as a finite element system. The halfspace idealization permits reliable analysis for sites where essentially similar soils extend to large depths and there is no rigid boundary such as a soil-rock interface. For sites where layers of soft soil are underlain by rock at shallow depth, finite element idealization of the soil region is appropriate; in this case, the direct and substructure methods would lead to equivalent results, but the latter provides the better alternative. Treating the free field motion directly as the earthquake input in the substructure method eliminates the deconvolution calculations and the related assumptions (regarding the type and direction of earthquake waves) required in the direct method. The substructure method is computationally efficient because the two substructures (the structure and the soil region) are analyzed separately; and, more importantly, it permits taking advantage of the important feature that the response to earthquake ground motion is essentially contained in the lower few natural modes of vibration of the structure on a fixed base. For sites where essentially similar soils extend to large depths and there is no obvious rigid boundary such as a soil-rock interface, numerical results for the earthquake response of a nuclear reactor structure are presented to demonstrate that the commonly used finite element method may lead to unacceptable errors; but the substructure method leads to reliable results

  13. Static aeroelastic analysis including geometric nonlinearities based on reduced order model

    Directory of Open Access Journals (Sweden)

    Changchuan Xie

    2017-04-01

    Full Text Available This paper describes a method proposed for modeling large deflections of aircraft in nonlinear aeroelastic analysis by developing a reduced order model (ROM). The method is applied to solve the static aeroelastic and static aeroelastic trim problems of flexible aircraft containing geometric nonlinearities; meanwhile, the non-planar effects of aerodynamics and the follower force effect are considered. ROMs are computationally inexpensive mathematical representations compared to the traditional nonlinear finite element method (FEM), especially in aeroelastic solutions. The approach to structure modeling presented here is based on the combined modal/finite element (MFE) method, which characterizes the stiffness nonlinearities, and we apply that structural model as the ROM in aeroelastic analysis. Moreover, the non-planar aerodynamic force is computed by the non-planar vortex lattice method (VLM). Structure and aerodynamics are coupled with the surface spline method. The results show that both the static aeroelastic analysis and the trim analysis of aircraft based on the structural ROM agree well with analyses based on the FEM and with experimental results.

  14. N-opcode Analysis for Android Malware Classification and Categorization

    OpenAIRE

    Kang, BooJoong; Yerima, Suleiman Y.; McLaughlin, Kieran; Sezer, Sakir

    2016-01-01

    Malware detection is a growing problem, particularly on the Android mobile platform, due to its increasing popularity and the accessibility of numerous third-party app markets. This has also been made worse by the increasingly sophisticated detection avoidance techniques employed by emerging malware families. This calls for more effective techniques for detection and classification of Android malware. Hence, in this paper we present an n-opcode analysis based approach that utilizes machine learning...
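
    A toy sketch of n-opcode features feeding a linear classifier, in the spirit of the approach above: opcode sequences are treated as text and n-grams (here n = 2) are counted. The two hand-written "apps" and their labels are invented; real features would come from disassembled APKs.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.svm import LinearSVC

        apps = [
            "move invoke return move add invoke",   # toy benign opcode sequence
            "invoke invoke goto move return",       # toy malicious opcode sequence
        ]
        labels = [0, 1]

        vec = CountVectorizer(analyzer="word", ngram_range=(2, 2))  # opcode 2-grams
        X = vec.fit_transform(apps)

        clf = LinearSVC().fit(X, labels)
        print(clf.predict(vec.transform(["goto move return move add"])))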

  15. Statistical mechanical analysis of LMFBR fuel cladding tubes

    International Nuclear Information System (INIS)

    Poncelet, J.-P.; Pay, A.

    1977-01-01

    The most important design requirement on fuel pin cladding for LMFBRs is its mechanical integrity. Disruptive factors include internal pressure from mixed oxide fuel fission gas release, thermal stresses and high temperature creep, neutron-induced differential void swelling as a source of stress in the cladding, irradiation creep of the stainless steel material, and corrosion by fission products. Under irradiation these load-restraining mechanisms are accentuated by stainless steel embrittlement and strength alterations. To account for the numerous uncertainties involved in the analysis by theoretical models and computer codes, statistical tools such as Monte Carlo simulation methods are unavoidable. Thanks to these techniques, uncertainties in nominal characteristics, material properties and environmental conditions can be linked together correctly and used for a more accurate conceptual design. First, a thermal creep damage index is set up through a sufficiently sophisticated physical analysis of the clad, including arbitrary time dependence of power and neutron flux as well as the effects of sodium temperature, burnup and steel mechanical behavior. Although this strain limit approach implies a more general but time-consuming model, the net output is improved in return, and clad temperature, stress and strain maxima, for example, may be easily assessed. A full spectrum of variables is statistically treated to account for their probability distributions. The creep damage probability may then be obtained and can contribute to a quantitative fuel failure probability estimation
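
    A minimal Monte Carlo sketch of the statistical treatment described above: uncertain inputs are sampled from assumed distributions and propagated through a placeholder damage index. The damage model and all distributions here are illustrative, not the codes' actual correlations.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        stress = rng.normal(120.0, 10.0, n)       # hoop stress (MPa), assumed
        temp = rng.normal(900.0, 25.0, n)         # clad temperature (K), assumed
        strength = rng.normal(200.0, 20.0, n)     # creep strength (MPa), assumed

        # Placeholder damage index: stress-to-strength ratio with a temperature penalty
        damage = (stress / strength) * np.exp((temp - 900.0) / 300.0)

        p_fail = np.mean(damage > 1.0)            # creep damage probability estimate
        print(f"creep damage probability ~ {p_fail:.4f}")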

  16. Analysis of Smart Composite Structures Including Debonding

    Science.gov (United States)

    Chattopadhyay, Aditi; Seeley, Charles E.

    1997-01-01

    Smart composite structures with distributed sensors and actuators have the capability to actively respond to a changing environment while offering significant weight savings and additional passive controllability through ply tailoring. Piezoelectric sensing and actuation of composite laminates is the most promising concept due to the static and dynamic control capabilities. Essential to the implementation of these smart composites are the development of accurate and efficient modeling techniques and experimental validation. This research addresses each of these important topics. A refined higher order theory is developed to model composite structures with surface bonded or embedded piezoelectric transducers. These transducers are used as both sensors and actuators for closed loop control. The theory accurately captures the transverse shear deformation through the thickness of the smart composite laminate while satisfying stress free boundary conditions on the free surfaces. The theory is extended to include the effect of debonding at the actuator-laminate interface. The developed analytical model is implemented using the finite element method utilizing an induced strain approach for computational efficiency. This allows general laminate geometries and boundary conditions to be analyzed. The state space control equations are developed to allow flexibility in the design of the control system. Circuit concepts are also discussed. Static and dynamic results of smart composite structures, obtained using the higher order theory, are correlated with available analytical data. Comparisons, including debonded laminates, are also made with a general purpose finite element code and available experimental data. Overall, very good agreement is observed. Convergence of the finite element implementation of the higher order theory is shown with exact solutions. Additional results demonstrate the utility of the developed theory to study piezoelectric actuation of composite

  17. SINEX: SCALE shielding analysis GUI for X-Windows

    International Nuclear Information System (INIS)

    Browman, S.M.; Barnett, D.L.

    1997-12-01

    SINEX (SCALE Interface Environment for X-windows) is an X-Windows graphical user interface (GUI) that is being developed for performing SCALE radiation shielding analyses. SINEX enables the user to generate input for the SAS4/MORSE and QADS/QAD-CGGP shielding analysis sequences in SCALE. The code features will facilitate the use of both analytical sequences with a minimum of additional user input. Included in SINEX is the capability to check the geometry model by generating two-dimensional (2-D) color plots of the geometry model using a new version of the SCALE module, PICTURE. The most sophisticated feature, however, is the 2-D visualization display that provides a graphical representation on screen as the user builds a geometry model. This capability to interactively build a model will significantly increase user productivity and reduce user errors. SINEX will perform extensive error checking and will allow users to execute SCALE directly from the GUI. The interface will also provide direct on-line access to the SCALE manual

  18. Performing data analysis using IBM SPSS

    CERN Document Server

    Meyers, Lawrence S; Guarino, A J

    2013-01-01

    This book is designed to be a user's guide for students and other interested readers to perform statistical data analysis with IBM SPSS, which is a major statistical software package used extensively in academic, government, and business settings. This book addresses the needs, level of sophistication, and interest in introductory statistical methodology on the part of undergraduate and graduate students in social and behavioral science, business, health-related, and education programs.  Each chapter covers a particular statistical procedure and has the following format: an example pr

  19. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

    Numerical experimental reactor research project is composed of the following studies: (1) nuclear fusion simulation research with a focus on specific physical phenomena of specific equipment, (2) research on advanced simulation method to increase predictability or expand its application range based on simulation, (3) visualization as the foundation of simulation research, (4) research for advanced computational science such as parallel computing technology, and (5) research aiming at elucidation of fundamental physical phenomena not limited to specific devices. Specifically, a wide range of researches with medium- to long-term perspectives are being developed: (1) virtual reality visualization, (2) upgrading of computational science such as multilayer simulation method, (3) kinetic behavior of plasma blob, (4) extended MHD theory and simulation, (5) basic plasma process such as particle acceleration due to interaction of wave and particle, and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space, (2) multilayer simulation of collisionless magnetic reconnection, (3) simulation of microscopic dynamics of plasma coherent structure, (4) Hall MHD simulation of LHD, (5) numerical analysis for extension of MHD equilibrium and stability theory, (6) extended MHD simulation of 2D RT instability, (7) simulation of laser plasma, (8) simulation of shock wave and particle acceleration, and (9) study on simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  20. Topic Modeling in Sentiment Analysis: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Toqir Ahmad Rana

    2016-06-01

    Full Text Available With the expansion and acceptance of the World Wide Web, sentiment analysis has become a progressively popular research area in information retrieval and web data analysis. Due to the huge amount of user-generated content on blogs, forums, social media, etc., sentiment analysis has attracted researchers both in academia and industry, since it deals with the extraction of opinions and sentiments. In this paper, we have presented a review of topic modeling, especially LDA-based techniques, in sentiment analysis. We have presented a detailed analysis of diverse approaches and techniques, and compared the accuracy of different systems among them. The results of different approaches have been summarized, analyzed and presented in a sophisticated fashion. This is a dedicated effort to explore different topic modeling techniques in the capacity of sentiment analysis and to impart a comprehensive comparison among them.
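
    A minimal LDA sketch of the family of techniques reviewed above, using scikit-learn on a toy corpus; real studies would run this over large collections of reviews or posts and couple the inferred topics to sentiment models.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = [
            "the battery life is great and the screen is sharp",
            "terrible battery and the charger broke after a week",
            "the plot of the movie was dull but the acting was fine",
        ]

        X = CountVectorizer(stop_words="english").fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
        doc_topics = lda.transform(X)   # per-document topic mixtures, the quantity
                                        # aspect-level sentiment methods build on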

  1. Event based neutron activation spectroscopy and analysis algorithm using MLE and meta-heuristics

    International Nuclear Information System (INIS)

    Wallace, B.

    2014-01-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes involved was used to create a statistical model. Maximum likelihood estimation was combined with meta-heuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research. (author)

  2. An artificial intelligence approach towards disturbance analysis in nuclear power plants

    International Nuclear Information System (INIS)

    Lindner, A.; Klebau, J.; Fielder, U.; Baldeweg, F.

    1987-01-01

    The scale and degree of sophistication of technological plants, e.g. nuclear power plants, have increased substantially during the last decades. Conventional disturbance analysis systems have proved to work successfully in well-known situations. But in cases of emergencies, the operating staff needs more advanced assistance in realizing diagnosis and therapy control. The significance of introducing artificial intelligence methods in nuclear power technology is emphasized. The main features of the on-line disturbance analysis system SAAP-2, which is being developed for application in nuclear power plants, are reported. 9 refs. (author)

  3. Flat-plate solar array project. Volume 8: Project analysis and integration

    Science.gov (United States)

    Mcguire, P.; Henry, P.

    1986-01-01

    Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and important analytical models developed by PA&I for its analytical and assessment activities are reviewed.

  4. Line outage contingency analysis including the system islanding scenario

    Energy Technology Data Exchange (ETDEWEB)

    Hazarika, D.; Bhuyan, S. [Assam Engineering College, Jalukbari, Guwahati 781013 (India); Chowdhury, S.P. [Jadavpur University, Jadavpur, Kolkata 700 032 (India)

    2006-05-15

    The paper describes an algorithm for determining the line outage contingency of a line, taking into account the overload effect on the remaining lines and the subsequent tripping of overloaded line(s) leading to a possible system split or islanding of a power system. The optimally ordered sparse [B'] and [B''] matrices for the integrated system are used for load flow analysis to determine the modified values of the voltage phase angles [δ] and bus voltages [V], and thereby the overloading effect on the remaining lines due to the outage of the selected line. In case of overloading in the remaining line(s), the overloaded lines are removed from the system and a topology processor is used to find the islands. A fast decoupled load flow (FDLF) analysis is carried out to find the system variables for the islanded (or single-island) system by incorporating appropriate modifications in the [B'] and [B''] matrices of the integrated system. Line outage indices based on line overload, loss of load, loss of generation and static voltage stability are computed to indicate the severity of a selected line outage. (author)
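
    A toy sketch of outage screening with a DC power flow, a simplified analogue of the fast decoupled [B']/[B''] analysis above; the 3-bus network and its data are invented for illustration.

        import numpy as np

        # lines: (from bus, to bus, susceptance); bus 0 is the slack
        lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]
        p = np.array([0.0, -1.0, -0.5])           # injections (p.u.); slack balances

        def line_flows(outaged=None):
            b = np.zeros((3, 3))                  # DC power flow B matrix
            for k, (i, j, y) in enumerate(lines):
                if k == outaged:
                    continue
                b[i, i] += y; b[j, j] += y
                b[i, j] -= y; b[j, i] -= y
            theta = np.zeros(3)
            theta[1:] = np.linalg.solve(b[1:, 1:], p[1:])   # slack row/column removed
            return [0.0 if k == outaged else y * (theta[i] - theta[j])
                    for k, (i, j, y) in enumerate(lines)]

        base, post = line_flows(), line_flows(outaged=2)    # outage of line 0-2
        # Comparing `post` with line ratings flags overloads; removing overloaded
        # lines and re-checking connectivity is the islanding test described above.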

  5. Design and performance analysis of solid-propellant rocket motors using a simplified computer program

    Science.gov (United States)

    Sforzini, R. H.

    1972-01-01

    An analysis and a computer program are presented which represent a compromise between the more sophisticated programs using precise burning geometric relations and the textbook type of solutions. The program requires approximately 900 computer cards including a set of 20 input data cards required for a typical problem. The computer operating time for a single configuration is approximately 1 minute and 30 seconds on the IBM 360 computer. About 1 minute and 15 seconds of the time is compilation time so that additional configurations input at the same time require approximately 15 seconds each. The program uses approximately 11,000 words on the IBM 360. The program is written in FORTRAN 4 and is readily adaptable for use on a number of different computers: IBM 7044, IBM 7094, and Univac 1108.

  6. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, application of newer more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrate sample preparation and analysis to enable on-line near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave energy enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration, that applies to semiconductor materials, will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of those methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  7. A systematic review of breath analysis and detection of volatile organic compounds in COPD

    DEFF Research Database (Denmark)

    Christiansen, Anders; Davidsen, Jesper Rømhild; Titlestad, Ingrid

    2016-01-01

    research area is breath analysis, with several published attempts to find exhaled compounds as diagnostic markers. The field is broad and no review of published COPD breath analysis studies exists yet. We have conducted a systematic review examining the state of the art and identified 12 suitable papers, which...... in breath sampling technologies, the selection of appropriate control groups, and a lack of sophisticated (and standardized) statistical data analysis methods. No cross-hospital/study comparisons have been published yet. We conclude that future efforts should (also) concentrate on making breath data...... analysis more comparable through standardization of sampling, data processing, and reporting....

  8. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    Science.gov (United States)

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and the relationships between the true local field and the estimated local field for REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP. The REV-SHARP result exhibited the highest correlation between the true local field and the estimated local field. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In the human experiment, no obvious errors due to artifacts were present in REV-SHARP. The proposed REV-SHARP is a new method that combines variable spherical kernel sizes with Tikhonov regularization. This technique might enable more accurate background field removal and help to achieve better accuracy in QSM. Copyright © 2016 Elsevier Inc. All rights reserved.
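
    A minimal sketch of the two ingredients REV-SHARP combines, the spherical mean value (SMV) operation and Tikhonov-regularized deconvolution, on a toy volume. The kernel radius and regularization weight are illustrative, and the brain-mask erosion and 2-14 mm kernel sweep of the actual method are omitted.

        import numpy as np

        phase = np.random.default_rng(2).normal(0.0, 0.01, (64, 64, 64))  # toy field map

        # Spherical kernel of radius r voxels, centred at index 0 with wrap-around:
        r = 4
        g = np.meshgrid(*[np.fft.fftfreq(n) * n for n in phase.shape], indexing="ij")
        sphere = (g[0] ** 2 + g[1] ** 2 + g[2] ** 2) <= r ** 2
        kern = sphere / sphere.sum()

        K = np.fft.fftn(kern)
        D = 1.0 - K                    # SMV high-pass filter spectrum
        smv = np.fft.fftn(phase) * D   # SMV removes the harmonic background field

        lam = 0.05                     # Tikhonov regularization weight
        local = np.real(np.fft.ifftn(smv * np.conj(D) / (np.abs(D) ** 2 + lam)))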

  9. Gravity Probe B data analysis: II. Science data and their handling prior to the final analysis

    International Nuclear Information System (INIS)

    Silbergleit, A S; Conklin, J W; Heifetz, M I; Holmes, T; Li, J; Mandel, I; Solomonik, V G; Stahl, K; P W Worden Jr; Everitt, C W F; Adams, M; Berberian, J E; Bencze, W; Clarke, B; Al-Jadaan, A; Keiser, G M; Kozaczuk, J A; Al-Meshari, M; Muhlfelder, B; Salomon, M

    2015-01-01

    The results of the Gravity Probe B relativity science mission published in Everitt et al (2011 Phys. Rev. Lett. 106 221101) required a rather sophisticated analysis of experimental data due to several unexpected complications discovered on-orbit. We give a detailed description of the Gravity Probe B data reduction. In the first paper (Silbergleit et al Class. Quantum Grav. 22 224018) we derived the measurement models, i.e., mathematical expressions for all the signals to analyze. In the third paper (Conklin et al Class. Quantum Grav. 22 224020) we explain the estimation algorithms and their program implementation, and discuss the experiment results obtained through data reduction. This paper deals with the science data preparation for the main analysis yielding the relativistic drift estimates. (paper)

  10. Impact analysis and testing of tritiated heavy water transportation packages including hydrodynamic effects

    International Nuclear Information System (INIS)

    Sauve, R.G.; Tulk, J.D.; Gavin, M.E.

    1989-01-01

    Ontario Hydro has recently designed a new Type B(M) Tritiated Heavy Water Transportation Package (THWTP) for the road transportation of tritiated heavy water from its operating nuclear stations to the Tritium Removal Facility in Ontario. These packages must demonstrate the ability to withstand severe shock and impact scenarios such as those prescribed by IAEA standards. The package, shown in figure 1, comprises an inner container filled with tritiated heavy water, and a 19 lb/ft³ polyurethane foam-filled overpack. The overpack is of sandwich construction with 304L stainless steel liners and 10.5 inch thick nominal foam walls. The outer shell is 0.75 inch thick and the inner shell is 0.25 inch thick. The primary containment boundary consists of the overpack inner liner, the containment lid and the outer containment seals in the lid region. The total weight of the container including the 12,000 lb. payload is 36,700 lb. The objective of the present study is to evaluate the hydrodynamic effect of the tritiated heavy water payload on the structural integrity of the THWTP during a flat end drop from a height of 9 m. The study consisted of three phases: (i) developing an analytical model to simulate the hydrodynamic effects of the heavy water payload during impact; (ii) performing an impact analysis for a 9 m flat end drop of the THWTP including fluid-structure interaction; and (iii) verifying the analytical models by experiment

  11. IEDA [Intelligent Eddy Current Data Analysis] helps make sense of eddy current data [steam generators

    International Nuclear Information System (INIS)

    Clark, R.

    1989-01-01

    The increasing sophistication of eddy current signal interpretation in steam generator tubing has improved capabilities, but has also made the process of analysis more complex and time consuming. Westinghouse has developed an intelligent computerised tool - the IEDA (Intelligent Eddy Current Data Analysis) system, to lighten the load on analysts. Since 1985, 44 plants have been inspected with IEDA, representing over 400,000 tubes. The system has provided a repeatability and a consistency not achieved by human operators. (U.K.)

  12. Laser induced breakdown spectroscopy of the uranium including calcium. Time resolved measurement spectroscopic analysis (Contract research)

    International Nuclear Information System (INIS)

    Akaoka, Katsuaki; Maruyama, Youichiro; Oba, Masaki; Miyabe, Masabumi; Otobe, Haruyoshi; Wakaida, Ikuo

    2010-05-01

    For the remote analysis of low decontamination factor (DF) transuranic (TRU) fuel, laser-induced breakdown spectroscopy (LIBS) was applied to uranium oxide containing a small amount of calcium oxide. Characteristics such as spectrum intensity and plasma excitation temperature were measured using time-resolved spectroscopy. As a result, it was found that, in order to obtain a stable intensity of the calcium spectrum relative to the uranium spectrum, the optimum observation delay time is 4 microseconds or more after laser irradiation. (author)

  13. Yucca Mountain transportation routes: Preliminary characterization and risk analysis

    International Nuclear Information System (INIS)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R.

    1991-01-01

    In this study, rail and highway routes which may be used for shipments of high-level nuclear waste to a proposed repository at Yucca Mountain, Nevada are characterized. This characterization facilitates three types of impact analysis: comparative study, limited worst-case assessment, and more sophisticated probabilistic risk assessment techniques. Data for relative and absolute impact measures are provided to support comparisons of routes based on selected characteristics. A worst-case scenario assessment is included to determine potentially critical and most likely places for accidents or incidents to occur. The assessment facilitated by the data in this study is limited because impact measures are restricted to the identification of potential areas or persons affected. No attempt is made to quantify the magnitude of these impacts. Most likely locations for accidents to occur are determined relative to other locations within the scope of this study. Independent factors and historical trends used to identify these likely locations are only proxies for accident probability

  14. COMPARATIVE ANALYSIS OF THE RECENT EVOLUTIONS OF ROMANIAN AND EUROPEAN UNION'S COMPETITIVENESS

    Directory of Open Access Journals (Sweden)

    Felea Adrian Ioan

    2011-07-01

    Full Text Available The main subject of this paper is an analysis of the recent trends and evolution of Romanian competitiveness compared with that of the European Union, and it is structured in four main parts. The first section introduces the evolution of the competitiveness process, recalling the three current models for evaluating the level of competitiveness. The second part presents the competitiveness indexes computed and published by the World Economic Forum, indicators structured on three main levels: the Global Competitiveness Index and its aggregate indicators, built on three categories of factors essential to the competitiveness process (Basic requirements, Efficiency enhancers, and Innovation and sophistication factors), and the indexes associated with the twelve pillars of competitiveness: Institutions, Infrastructure, Macroeconomic stability, Health and primary education, Higher education and training, Goods market efficiency, Labor market efficiency, Financial market sophistication, Technological readiness, Market size, Business sophistication, and Innovation. Based on the values obtained from the World Economic Forum reports, and regarding competitiveness from a global perspective, the third part presents a comparative analysis of the evolution of the Romanian competitiveness process and that of the EU25. The last part of the paper draws the conclusions of this analysis with respect to the values found. This paper is part of the doctoral thesis entitled "Increased Competitiveness in the Romanian Economy, in the Context of Sustainable Development", coordinated by Professor Michael Berinde, University of Oradea, Faculty of Economics. The doctoral research is supported by the Human Resources Development Operational Programme 2007-2013, Contract POSDRU/CPP107/DMI1.5/S/80272, "Doctoral programs to train researchers performing

  15. Human Gait Feature Extraction Including a Kinematic Analysis toward Robotic Power Assistance

    Directory of Open Access Journals (Sweden)

    Mario I. Chacon-Murguia

    2012-09-01

    Full Text Available The present work proposes a method for human gait and kinematic analysis. Gait analysis consists of the determination of hip, knee and ankle positions through video analysis. Gait kinematics for the thigh and knee are then generated from these data. Evaluations of the gait analysis method indicate an acceptable performance of 86.66% for hip and knee position estimation, and findings for gait kinematics comparable with other reported works. Coordinate systems are assigned according to the Denavit-Hartenberg (DH) algorithm and a direct kinematic model of the legs is obtained. The leg angles obtained from the video analysis are applied to the kinematic model in order to assess the application of this model to robotic legs in a power-assisted system.
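
    A sketch of the kinematic step, assuming a planar two-link chain (thigh and shank) described by standard Denavit-Hartenberg (DH) parameters; the link lengths and joint angles below are illustrative stand-ins for values obtained from the video analysis.

        import numpy as np

        def dh(theta, d, a, alpha):
            # Standard DH homogeneous transform for one joint
            ct, st = np.cos(theta), np.sin(theta)
            ca, sa = np.cos(alpha), np.sin(alpha)
            return np.array([[ct, -st * ca,  st * sa, a * ct],
                             [st,  ct * ca, -ct * sa, a * st],
                             [0.0,      sa,       ca,      d],
                             [0.0,     0.0,      0.0,    1.0]])

        thigh, shank = 0.45, 0.40                         # link lengths (m), assumed
        hip, knee = np.radians(20.0), np.radians(-35.0)   # joint angles from video

        T = dh(hip, 0.0, thigh, 0.0) @ dh(knee, 0.0, shank, 0.0)
        ankle_xy = T[:2, 3]    # ankle position in the hip frame (direct kinematics)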

  16. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  17. A meta-analysis including dose-response relationship between night shift work and the risk of colorectal cancer.

    Science.gov (United States)

    Wang, Xiao; Ji, Alin; Zhu, Yi; Liang, Zhen; Wu, Jian; Li, Shiqi; Meng, Shuai; Zheng, Xiangyi; Xie, Liping

    2015-09-22

    A meta-analysis was conducted to quantitatively evaluate the correlation between night shift work and the risk of colorectal cancer. We searched for publications up to March 2015 using PubMed, Web of Science, Cochrane Library, EMBASE and the Chinese National Knowledge Infrastructure databases, and the references of the retrieved articles and relevant reviews were also checked. ORs and 95% CIs were used to assess the degree of correlation between night shift work and the risk of colorectal cancer via fixed- or random-effects models. A dose-response meta-analysis was performed as well. The pooled OR estimates of the included studies illustrated that night shift work was correlated with an increased risk of colorectal cancer (OR = 1.318, 95% CI 1.121-1.551). No evidence of publication bias was detected. In the dose-response analysis, the risk of colorectal cancer increased by 11% for every 5 years of night shift work (OR = 1.11, 95% CI 1.03-1.20). In conclusion, this meta-analysis indicated that night shift work was associated with an increased risk of colorectal cancer. Further research should be conducted to confirm our findings and clarify the potential biological mechanisms.
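
    A minimal sketch of inverse-variance (fixed-effect) pooling of log odds ratios, the basic operation behind the pooled OR above; the study ORs and confidence intervals are invented placeholders, not the meta-analysis data.

        import numpy as np

        or_ = np.array([1.2, 1.5, 1.1])           # per-study odds ratios (toy)
        lo = np.array([0.9, 1.1, 0.8])            # lower 95% CI bounds (toy)
        hi = np.array([1.6, 2.0, 1.5])            # upper 95% CI bounds (toy)

        log_or = np.log(or_)
        se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from the 95% CI
        w = 1.0 / se ** 2                             # inverse-variance weights

        pooled = np.sum(w * log_or) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        ci = np.exp(pooled + np.array([-1.96, 1.96]) * pooled_se)
        print(f"pooled OR = {np.exp(pooled):.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")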

  18. Solving real-life problems: future mobile technology sophistication

    International Nuclear Information System (INIS)

    Shafiq, F.; Ahsan, K.; Nadeem, A.

    2016-01-01

    Almost all domains of real life are taking advantage of the latest technologies to enhance their processes, procedures and operations. This integration of technological innovations provides ease of access, flexibility, transparency, reliability and speed for the processes and procedures concerned. The rapid growth of ICT (Information and Communication Technology) and MT (Mobile Technology) provides an opportunity to redesign and re-engineer the processes and procedures of routine life activities. Technology integration and adoption in routine life activities may serve as a compensatory mechanism to assist the population in different ways, such as monitoring older adults and children at home, providing security assistance, monitoring and recording patients' vital signs automatically, controlling and monitoring equipment and devices, and providing assistance in shopping, banking and education as well. Disasters happen suddenly and destroy everything indiscriminately. Adoption and integration of the latest technologies, including ICT and MT, can enhance current disaster management processes, procedures and operations. This research study focuses on the impact of the latest and emerging technology trends on routine life activities and their potential to improve and enhance disaster management activities. MT provides a promising platform for facilitating people in enhancing their routine life activities. This research argues that the integration and adoption of mobile computing in the disaster management domain can enhance disaster management activities, promising minimized error, quick information assembly, quick technology-based response and the prioritizing of actions. (author)

  19. Solving Real-Life Problems: Future Mobile Technology Sophistication

    Directory of Open Access Journals (Sweden)

    FARHAN SHAFIQ

    2016-07-01

    Full Text Available Almost all domains of real life are taking advantage of the latest technologies to enhance their processes, procedures and operations. This integration of technological innovations provides ease of access, flexibility, transparency, reliability and speed for the processes and procedures concerned. The rapid growth of ICT (Information and Communication Technology) and MT (Mobile Technology) provides an opportunity to redesign and re-engineer the processes and procedures of routine life activities. Technology integration and adoption in routine life activities may serve as a compensatory mechanism to assist the population in different ways, such as monitoring older adults and children at home, providing security assistance, monitoring and recording patients' vital signs automatically, controlling and monitoring equipment and devices, and providing assistance in shopping, banking and education as well. Disasters happen suddenly and destroy everything indiscriminately. Adoption and integration of the latest technologies, including ICT and MT, can enhance current disaster management processes, procedures and operations. This research study focuses on the impact of the latest and emerging technology trends on routine life activities and their potential to improve and enhance disaster management activities. MT provides a promising platform for facilitating people in enhancing their routine life activities. This research argues that the integration and adoption of mobile computing in the disaster management domain can enhance disaster management activities, promising minimized error, quick information assembly, quick technology-based response and the prioritizing of actions.

  20. Data-acquisition systems for the present and the future

    International Nuclear Information System (INIS)

    Drobnis, D.D.

    1982-09-01

    Basic components of today's acquisition systems are surveyed. These include front-end tools such as microprocessors, programmable controllers, and CAMAC interfaces. Some key concepts in large central real-time systems are examined: hardware and software architecture, and data base structure. Some trends in present data acquisition system design are analyzed, including increasing distribution of system functions and expansion to hierarchical multi-processor networks. With the evolution of microprocessors, front-end intelligence is growing into front-end computing power. Real-time host systems are becoming increasingly sophisticated human interface and data base management tools, with increasingly complex operating systems, and increasing amounts of memory, mass storage, and computing power. And the ultimate analysis of plasma data is becoming increasingly sophisticated

  1. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
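
    A sketch of one of the measurements the program automates: the rate of change of Ca estimated by linear regression over a chosen window. The trace below is synthetic rather than a converted fluorescence record.

        import numpy as np

        t = np.linspace(0.0, 2.0, 2000)                       # time (s)
        ca = 100.0 + 400.0 * np.exp(-t / 0.3) * (t > 0.1)     # toy [Ca] transient (nM)

        win = (t >= 0.3) & (t <= 0.6)                         # analysis window
        slope, intercept = np.polyfit(t[win], ca[win], 1)     # linear regression
        print(f"rate of Ca decline ~ {slope:.1f} nM/s over the window")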

  2. Potential language and attentional networks revealed through factor analysis of rCBF data measured with SPECT

    DEFF Research Database (Denmark)

    McLaughlin, T; Steinberg, B; Christensen, B

    1992-01-01

    's area (left hemisphere), when subjects listened to narrative speech, compared to white noise (baseline). No significant rCBF differences were detected with this test during dichotic stimulation vs. white noise. A more sophisticated statistical method (factor analysis) disclosed patterns of functionally...... brain networks involved in (I) auditory/linguistic, (II) attentional, and (III) visual imaging activity....
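
    A minimal sketch of the factor-analysis step named above, applied to a random stand-in for a subjects-by-regions rCBF matrix; each factor's loadings are read as one putative functional network.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rcbf = np.random.default_rng(3).normal(50.0, 5.0, (20, 16))  # 20 scans x 16 ROIs
        fa = FactorAnalysis(n_components=3).fit(rcbf)
        loadings = fa.components_    # each row: one factor's pattern of covariation
                                     # across regions, i.e. a candidate network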

  3. From Aplysia to the constitution: evolution of concepts in behavior analysis / Da Aplysia à constituição: evolução de conceitos na análise do comportamento

    Directory of Open Access Journals (Sweden)

    João Claudio Todorov

    2004-01-01

    Full Text Available This work presents the evolution of concepts in behavior analysis, especially in the writings of Skinner, from the definition of operant behavior and reinforcement contingency to processes of selection by consequences, following the development of a conceptual language that covers both the behavior of molluscs and the metacontingencies included in the constitution of a country. Skinner's first papers used the terminology developed by Pavlov in his studies of conditioned reflexes to deal with behavior in general. From Skinner's doctoral dissertation to the papers published in the 1980s there was an evolution, followed by the field of behavior analysis, which today uses the same sophisticated language to deal with complex issues like cultural practices and the survival of cultures.

  4. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    International Nuclear Information System (INIS)

    Elkhoraibi, T.; Hashemi, A.; Ostadan, F.

    2014-01-01

    Soil-structure interaction (SSI) is a major step in the seismic design of the massive and stiff structures typical of nuclear facilities and of civil infrastructures such as tunnels, underground stations, dams and lock head structures. Currently most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), imposing significant design and equipment qualification costs, especially in the case of high-frequency-sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision making process, whether by the owners or the regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  5. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    Energy Technology Data Exchange (ETDEWEB)

    Elkhoraibi, T., E-mail: telkhora@bechtel.com; Hashemi, A.; Ostadan, F.

    2014-04-01

    Soil-structure interaction (SSI) is a major step in the seismic design of the massive and stiff structures typical of nuclear facilities and of civil infrastructures such as tunnels, underground stations, dams and lock head structures. Currently most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), imposing significant design and equipment qualification costs, especially in the case of high-frequency-sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision making process, whether by the owners or the regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  6. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyogram (EMG) is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With advances in artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), which integrate these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis.
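
    As a toy illustration of the hybrid idea reviewed above (the integration of two soft computing techniques), the sketch below tunes a single-layer neural classifier on synthetic EMG-like features using a simple evolutionary strategy; all signals, features and parameters are invented placeholders rather than methods taken from the review.

```python
# Toy hybrid soft computing sketch (hypothetical, not from the review):
# a single-layer neural classifier whose weights are tuned by a simple
# (mu + lambda) evolutionary strategy. The "EMG" windows are synthetic.
import numpy as np

rng = np.random.default_rng(0)

def features(window):
    """Two classic EMG features: root-mean-square and zero-crossing rate."""
    rms = np.sqrt(np.mean(window ** 2))
    zcr = np.sum(np.diff(np.sign(window)) != 0) / len(window)
    return np.array([rms, zcr])

# Synthetic two-class data: low-amplitude "rest" vs high-amplitude "contraction".
windows = [rng.normal(0.0, amp, 256) for amp in [0.2] * 40 + [1.0] * 40]
X = np.array([features(w) for w in windows])
y = np.array([0] * 40 + [1] * 40)

def accuracy(w):
    return np.mean(((X @ w[:2] + w[2]) > 0).astype(int) == y)

pop = rng.normal(0.0, 1.0, (20, 3))        # population of weight vectors
for _ in range(50):                        # evolutionary loop
    offspring = pop + rng.normal(0.0, 0.2, pop.shape)
    candidates = np.vstack([pop, offspring])
    fitness = np.array([accuracy(w) for w in candidates])
    pop = candidates[np.argsort(fitness)[-20:]]   # keep the fittest
print("best training accuracy:", accuracy(pop[-1]))
```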

  7. Cash in the Czech Republic: Trend Analysis 2003–2015

    OpenAIRE

    Zbyněk Revenda

    2017-01-01

    The electronization of banking services is a strong reason for the relative growth of cashless payments, and the importance of cash, i.e., banknotes and coins, for realizing transactions should therefore decrease. An analysis for the Czech Republic in the period 2003–2015 confirms this. The demand of nonbank entities for cash is associated mainly with liquidity, banks' credibility and technological sophistication. Illegal transactions also form part of the demand. Zero return on cash counteracts demand, but it has li...

  8. Neutron activation analysis of Lerna ceramics (Greece) at Early Bronze Age: local production and trade exchanges

    International Nuclear Information System (INIS)

    Attas, M.

    1980-01-01

    Neutron activation analysis is a powerful tool for determining the provenance of ancient ceramics. A sophisticated analytical system for gamma-ray spectrometry, designed specifically for the chemical analysis of ceramics by thermal neutron activation, was used to determine the concentrations of twenty elements in samples of ancient pottery. The measurements were made relative to the standard pottery of Perlman and Asaro. The purpose of the work was to study the production of fine pottery at the settlement of Lerna, in the Argolid of Greece, during the Early Bronze Age (third millennium BC). About half of the 50 samples analysed formed the major compositional group, which was attributed to Lerna. It included, besides the majority of the samples from the second phase of the Early Bronze Age (Lerna III), several samples from the third phase (Lerna IV); that is, from levels immediately succeeding the great destruction which marks the end of the Lerna III settlement. A small number of objects forms a second group of local origin and includes 4 of the 5 clay sealings sampled. Among the archaeologically unusual objects, several could be attributed to Lerna, while others were characterized as imports.

  9. A Sophisticated Architecture Is Indeed Necessary for the Implementation of Health in All Policies but not Enough Comment on "Understanding the Role of Public Administration in Implementing Action on the Social Determinants of Health and Health Inequities".

    Science.gov (United States)

    Breton, Eric

    2016-02-29

    In this commentary, I argue that beyond a sophisticated supportive architecture to facilitate implementation of actions on the social determinants of health (SDOH) and health inequities, the Health in All Policies (HiAP) project faces two main barriers: lack of awareness within policy networks on the social determinants of population health, and a tendency of health actors to neglect investing in other sectors' complex problems. © 2016 by Kerman University of Medical Sciences.

  10. Analysis of the Capital Budgeting Practices: Serbian Case

    Directory of Open Access Journals (Sweden)

    Lidija Barjaktarovic

    2016-09-01

    Full Text Available This paper addresses two major research questions: which techniques firms in Serbia use for project evaluation, and how Serbian companies calculate their cost of capital. The authors created a questionnaire and a sample consisting of the 65 companies (out of 392) that responded to the enquiry during 2015. The results showed that the payback criterion is the dominant capital budgeting technique used by firms in Serbia. Further, the results revealed that large firms as well as multinational firms are more inclined to use discounted cash flow capital budgeting techniques and other sophisticated techniques. Finally, the authors concluded that, over the sample as a whole, the CAPM is not the dominant method for calculating the cost of capital. Skilled human capital, adequate knowledge and developed procedures may lead a larger share of companies in Serbia to accept capital budgeting techniques such as discounted cash flow analysis and other sophisticated techniques. Lastly, the existence of a perfect financial market is a necessary precondition for implementing all these contemporary financial concepts, and its development has to be posed as one of the priorities in the years to come.
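
    As a hedged illustration of the two families of techniques compared above, the sketch below contrasts the undiscounted payback period with a discounted-cash-flow NPV for one made-up project; the cash flows and the 12% discount rate are invented numbers, not survey data.

```python
# Hedged illustration: payback period vs discounted-cash-flow NPV for a
# single made-up project (initial outlay, then equal annual inflows).
def payback_period(cash_flows):
    """First year in which the cumulative cash flow becomes non-negative."""
    cumulative = 0.0
    for year, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None  # never paid back

def npv(rate, cash_flows):
    """Net present value with annual discounting."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

project = [-1000, 300, 300, 300, 300, 300]   # invented cash flows
print("payback period:", payback_period(project), "years")   # -> 4
print("NPV at 12%:", round(npv(0.12, project), 2))           # -> ~81.43
```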

  11. Continuous nowhere differentiable functions the monsters of analysis

    CERN Document Server

    Jarnicki, Marek

    2015-01-01

    This book covers the construction, analysis, and theory of continuous nowhere differentiable functions, comprehensively and accessibly. After illuminating the significance of the subject through an overview of its history, the reader is introduced to the sophisticated toolkit of ideas and tricks used to study the explicit continuous nowhere differentiable functions of Weierstrass, Takagi–van der Waerden, Bolzano, and others. Modern tools of functional analysis, measure theory, and Fourier analysis are applied to examine the generic nature of continuous nowhere differentiable functions, as well as linear structures within the (nonlinear) space of continuous nowhere differentiable functions. To round out the presentation, advanced techniques from several areas of mathematics are brought together to give a state-of-the-art analysis of Riemann’s continuous, and purportedly nowhere differentiable, function. For the reader’s benefit, claims requiring elaboration, and open problems, are clearly indicated. An a...
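
    As a concrete instance of the explicit constructions the book examines, Weierstrass's classical function can be stated as follows (a standard formulation, not a quotation from the book):

```latex
% Weierstrass's continuous nowhere differentiable function:
\[
  W(x) = \sum_{n=0}^{\infty} a^{n} \cos\left(b^{n} \pi x\right),
  \qquad 0 < a < 1,\quad b \text{ an odd integer},\quad ab > 1 + \tfrac{3\pi}{2}.
\]
% Uniform convergence (the series is dominated by the geometric series
% \sum a^n) gives continuity; the condition ab > 1 + 3\pi/2 is
% Weierstrass's original sufficient condition for nowhere
% differentiability (later weakened by Hardy to ab >= 1).
```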

  12. Considerations of fluid-structure interaction effects in the design of high-level waste storage tanks

    International Nuclear Information System (INIS)

    Stuart, R.J.; Shipley, L.E.; Ghose, A.; Hiremath, M.S.

    1994-01-01

    For the seismic evaluation and design of the large number of underground high-level waste storage tanks (HLWST) at DOE sites, an important consideration is the adequate estimation of the fluid-structure interaction effects on the design forces. The DOE Tanks Seismic Experts Panel (TSEP) has developed seismic design and evaluation guidelines which include simplified methods for estimating hydrodynamic effects on tanks. For the practical analysis and design of HLWSTs, however, more sophisticated methods are often needed. The research presented in this paper demonstrates the effectiveness and reliability of finite element method based techniques, developed and utilized by ARES, to evaluate the fluid-structure interaction effects on underground HLWSTs. Analysis results for simple cylindrical tank configurations are first compared with previously published data, to benchmark the techniques. Next, for an actual HLWST configuration, correlations are established between these techniques and the TSEP guidelines, for the design parameters affected by fluid-structure interaction. Finally, practical design situations which may require a level of analysis sophistication that goes beyond the simplified TSEP guidelines are presented. This level of sophistication is frequently required when attempting to validate or upgrade the design qualifications of existing tanks

  13. Information management for global environmental change, including the Carbon Dioxide Information Analysis Center

    Energy Technology Data Exchange (ETDEWEB)

    Stoss, F.W. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center

    1994-06-01

    The issue of global change is international in scope. A body of international organizations oversees the worldwide coordination of research and policy initiatives. In the US, the National Science and Technology Council (NSTC) was established in November 1993 to provide coordination of science, space, and technology policies throughout the federal government. NSTC is organized into nine proposed committees. The Committee on Environment and Natural Resources (CENR) oversees the US Global Change Research Program (USGCRP). As part of the USGCRP, the US Department of Energy's Global Change Research Program aims to improve the understanding of Earth systems and to strengthen the scientific basis for the evaluation of policy and government action in response to potential global environmental changes. This paper examines the information and data management roles of several international and national programs, including Oak Ridge National Laboratory's (ORNL's) global change information programs. An emphasis is placed on the Carbon Dioxide Information Analysis Center (CDIAC), which also serves as the World Data Center-A for Atmospheric Trace Gases.

  14. Efficient Time-Domain Ray-Tracing Technique for the Analysis of Ultra-Wideband Indoor Environments including Lossy Materials and Multiple Effects

    Directory of Open Access Journals (Sweden)

    F. Saez de Adana

    2009-01-01

    Full Text Available This paper presents an efficient application of the Time-Domain Uniform Theory of Diffraction (TD-UTD) for the analysis of Ultra-Wideband (UWB) mobile communications in indoor environments. The classical TD-UTD formulation is modified to include the contribution of lossy materials and multiple-ray interactions with the environment. The electromagnetic analysis is combined with a ray-tracing acceleration technique to treat realistic and complex environments. The validity of this method is tested against measurements performed inside the Polytechnic building of the University of Alcala, which show good performance of the model for the analysis of UWB propagation.

  15. Design by analysis of composite pressure equipment

    International Nuclear Information System (INIS)

    Durand, S.; Mallard, H.

    2004-01-01

    Design by analysis has been particularly emphasized by the European pressure equipment directive: advanced mechanical analyses such as the finite element method are used instead of classical design by formulas or graphs, so the structural behaviour can be understood by the designer. Design by analysis of metallic pressure equipment is widely used, with material behaviour or limit analysis based on sophisticated approaches (e.g., elasto-plastic analysis). Design by analysis of composite pressure equipment, however, is not yet systematically used for industrial products; the difficulty comes from the amount of information to handle. The laws of mechanics are the same for composite materials as for steel. The authors want to show that, in design by analysis, the composite material approach is simply more complete than the metallic approach: the mechanics is more general, but not more complicated. A multi-material approach is a natural evolution of design by analysis of composite equipment. The presentation is illustrated by several industrial cases: composite vessels (analogy with metallic calculations); composite pipes and fittings; welding and bonding of thermoplastic equipment. (authors)

  16. Implementing the Bayesian paradigm in risk analysis

    International Nuclear Information System (INIS)

    Aven, T.; Kvaloey, J.T.

    2002-01-01

    The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet we see rather few examples of applications where the full Bayesian setting has been adopted, with specification of priors for unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and the associated uncertainty assessments. We conclude that a pragmatic view is needed in order to 'successfully' apply the Bayesian approach, such that some of the probabilities can be assigned without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate the ideas.
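
    As a minimal sketch of what the "full Bayesian setting" discussed above involves, the following example updates a prior on an unknown event frequency with observed data via a conjugate Gamma-Poisson model; all numbers are invented for illustration.

```python
# Minimal conjugate Gamma-Poisson update for an unknown event frequency
# (events per year). Prior and data are invented for illustration.
from scipy import stats

prior_shape, prior_rate = 1.0, 10.0      # prior belief: ~0.1 events/year
events, exposure_years = 3, 40.0         # hypothetical operating record

# Conjugacy: Gamma prior + Poisson likelihood -> Gamma posterior.
post_shape = prior_shape + events
post_rate = prior_rate + exposure_years

posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)
print("posterior mean frequency [1/yr]:", post_shape / post_rate)   # 0.08
print("90% credible interval:", posterior.interval(0.9))
```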

  17. Plato's patricide in the Sophist

    Directory of Open Access Journals (Sweden)

    Deretić Irina J.

    2012-01-01

    Full Text Available In this paper, the author attempts to elucidate the validity of Plato's criticism of Parmenides' simplified monistic ontology, as well as of his concept of non-being. In contrast to Parmenides, Plato introduces a more complex ontology of the megista gene and redefines Parmenides' concept of non-being as something absolutely different from being. According to Plato, not all things are in the same sense, i.e. they have different ontological statuses. Additionally, he redefines Parmenides' concept of absolute non-being as 'difference' or 'otherness.'

  18. Sophisticated fuel handling system evolved

    International Nuclear Information System (INIS)

    Ross, D.A.

    1988-01-01

    The control systems at the Sellafield fuel handling plant are described. The requirements called for built-in diagnostic features as well as the ability to handle a large sequencing application. Speed was also important: responses better than 50 ms were required. The control systems are used to automate operations within each of the three main process caves: two Magnox fuel decanners and an advanced gas-cooled reactor fuel dismantler. The fuel route within the fuel handling plant is illustrated and described. ASPIC (Automated Sequence Package for Industrial Control), which was developed as a controller for the plant processes, is described. (U.K.)

  19. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species of conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  20. A Web Services Data Analysis Grid

    Energy Technology Data Exchange (ETDEWEB)

    William A Watson III; Ian Bird; Jie Chen; Bryan Hess; Andy Kowalski; Ying Chen

    2002-07-01

    The trend in large-scale scientific data analysis is to exploit compute, storage and other resources located at multiple sites, and to make those resources accessible to the scientist as if they were a single, coherent system. Web technologies driven by the huge and rapidly growing electronic commerce industry provide valuable components to speed the deployment of such sophisticated systems. Jefferson Lab, where several hundred terabytes of experimental data are acquired each year, is in the process of developing a web-based distributed system for data analysis and management. The essential aspects of this system are a distributed data grid (site independent access to experiment, simulation and model data) and a distributed batch system, augmented with various supervisory and management capabilities, and integrated using Java and XML-based web services.

  1. A Web Services Data Analysis Grid

    International Nuclear Information System (INIS)

    William A Watson III; Ian Bird; Jie Chen; Bryan Hess; Andy Kowalski; Ying Chen

    2002-01-01

    The trend in large-scale scientific data analysis is to exploit compute, storage and other resources located at multiple sites, and to make those resources accessible to the scientist as if they were a single, coherent system. Web technologies driven by the huge and rapidly growing electronic commerce industry provide valuable components to speed the deployment of such sophisticated systems. Jefferson Lab, where several hundred terabytes of experimental data are acquired each year, is in the process of developing a web-based distributed system for data analysis and management. The essential aspects of this system are a distributed data grid (site independent access to experiment, simulation and model data) and a distributed batch system, augmented with various supervisory and management capabilities, and integrated using Java and XML-based web services

  2. Clinicopathological and Prognostic Significance of Cancer Antigen 15-3 and Carcinoembryonic Antigen in Breast Cancer: A Meta-Analysis including 12,993 Patients

    Directory of Open Access Journals (Sweden)

    Xuan Li

    2018-01-01

    Full Text Available Purpose. The prognostic role of serum cancer antigen 15-3 (CA15-3) and carcinoembryonic antigen (CEA) in breast cancer remains controversial. In this study, we conducted a meta-analysis to investigate the prognostic value of these two markers in breast cancer patients. Methods. After electronic databases were searched, 36 studies (31 with information on CA15-3 and 23 with information on CEA), with 12,993 subjects, were included. Based on data directly or indirectly available from these studies, the hazard ratios (HRs) and odds ratios (ORs) and their 95% confidence intervals (CIs) were pooled according to higher or lower marker levels. Results. Elevated CA15-3 or CEA was significantly associated with poorer DFS and OS in breast cancer (multivariate analysis of OS: HR = 2.03, 95% CI 1.76–2.33 for CA15-3; HR = 1.79, 95% CI 1.46–2.20 for CEA; multivariate analysis of DFS: HR = 1.56, 95% CI 1.06–1.55 for CA15-3; HR = 1.77, 95% CI 1.53–2.04 for CEA). Subgroup analysis showed that CA15-3 or CEA had significant predictive value across primary and metastatic settings, different cut-offs, sample sizes, and even publication years. Furthermore, elevated CA15-3 was associated with advanced histological grade and younger age, while elevated CEA was related to the non-triple-negative tumor type and older age. Both elevated markers were associated with a higher tumor burden. Conclusions. This meta-analysis showed that elevated serum CA15-3 or CEA was associated with poor DFS and OS in patients with breast cancer, and they should be tested whenever possible.
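
    To make the pooling step concrete, the sketch below applies the standard inverse-variance weighting of log hazard ratios that underlies such meta-analyses; the three studies are fabricated for illustration and are not taken from the paper.

```python
# Fixed-effect inverse-variance pooling of log hazard ratios; the three
# "studies" (HR with 95% CI) are fabricated, not taken from the paper.
import math

studies = [(2.1, 1.6, 2.8), (1.8, 1.3, 2.5), (2.3, 1.5, 3.5)]

weights, weighted_logs = [], []
for hr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
    w = 1.0 / se ** 2                                # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * math.log(hr))

pooled = sum(weighted_logs) / sum(weights)
se_pooled = (1.0 / sum(weights)) ** 0.5
print("pooled HR: %.2f" % math.exp(pooled))
print("95%% CI: %.2f-%.2f" % (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled)))
```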

  3. Technical support document: Energy conservation standards for consumer products: Dishwashers, clothes washers, and clothes dryers including: Environmental impacts; regulatory impact analysis

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Energy Policy and Conservation Act as amended (P.L. 94-163) establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. This Technical Support Document presents the methodology, data and results from the analysis of the energy and economic impacts of standards on dishwashers, clothes washers, and clothes dryers. The economic impact analysis is performed in several major areas: an Engineering Analysis, which establishes technical feasibility and product attributes, including the costs of design options to improve appliance efficiency; a Consumer Analysis at two levels, national aggregate impacts (forecasts of appliance sales, efficiencies, energy use, and consumer expenditures) and impacts on individuals, analyzed by Life-Cycle Cost (LCC), Payback Periods, and Cost of Conserved Energy (CCE), which evaluate the savings in operating expenses relative to increases in purchase price; a Manufacturer Analysis, which provides an estimate of manufacturers' response to the proposed standards, quantified by changes in several measures of financial performance for a firm; an Industry Impact Analysis, which shows financial and competitive impacts on the appliance industry; a Utility Analysis, which measures the impacts of the altered energy-consumption patterns on electric utilities; an Environmental Effects analysis, which estimates changes in emissions of carbon dioxide, sulfur oxides, and nitrogen oxides due to reduced energy consumption in the home and at the power plant; and a Regulatory Impact Analysis, which collects the results of all the analyses into the net benefits and costs from a national perspective. 47 figs., 171 tabs. (JF)
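
    As a hedged sketch of two of the consumer-level measures named above, Life-Cycle Cost and Cost of Conserved Energy, the example below annualizes an efficiency price premium against its energy savings; prices, energy figures, lifetime and discount rate are placeholder values, not figures from the document.

```python
# Placeholder numbers throughout: appliance prices, annual energy use,
# electricity price, 13-year lifetime and 7% discount rate are invented.
def crf(d, n):
    """Capital recovery factor: annualizes an up-front cost over n years."""
    return d / (1 - (1 + d) ** -n)

def lcc(price, annual_kwh, elec_price, d, n):
    """Life-Cycle Cost: purchase price + discounted lifetime operating cost."""
    annuity = (1 - (1 + d) ** -n) / d
    return price + annual_kwh * elec_price * annuity

base = lcc(price=400.0, annual_kwh=500.0, elec_price=0.12, d=0.07, n=13)
eff = lcc(price=450.0, annual_kwh=380.0, elec_price=0.12, d=0.07, n=13)
print("LCC base vs efficient design: %.2f vs %.2f" % (base, eff))

# Cost of Conserved Energy: annualized price premium per kWh saved/year.
cce = (450.0 - 400.0) * crf(0.07, 13) / (500.0 - 380.0)
print("CCE: %.3f $/kWh" % cce)   # ~0.050, cheaper than the $0.12 tariff
```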

  4. Usability Evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT).

    Science.gov (United States)

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2007-02-01

    Increasingly sophisticated technologies, such as On-Line Analytical Processing (OLAP) and Geospatial Information Systems (GIS), are being leveraged for conducting community health assessments (CHA). Little is known about the usability of OLAP and GIS interfaces with respect to CHA. We conducted an iterative usability evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT), a software application that combines OLAP and GIS. A total of nine graduate students and six community health researchers were asked to think aloud while completing five CHA questions using SOVAT. The sessions were analyzed after every three participants, and changes to the interface were made based on the findings. Measures included elapsed time, answers provided, erroneous actions, and satisfaction. Traditional OLAP interface features were poorly understood by participants, and combined OLAP-GIS features needed to be better emphasized. The results suggest that the changes made to the SOVAT interface resulted in increases in both usability and user satisfaction.

  5. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyogram (EMG) is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With advances in artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), which integrate these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  6. MetaComp: comprehensive analysis software for comparative meta-omics including comparative metagenomics.

    Science.gov (United States)

    Zhai, Peng; Yang, Longshu; Guo, Xiao; Wang, Zhe; Guo, Jiangtao; Wang, Xiaoqi; Zhu, Huaiqiu

    2017-10-02

    During the past decade, the development of high-throughput nucleic acid sequencing and mass spectrometry analysis techniques has enabled the characterization of microbial communities through metagenomics, metatranscriptomics, metaproteomics and metabolomics data. To reveal the diversity of microbial communities and the interactions between living conditions and microbes, it is necessary to introduce comparative analysis based upon the integration of all four types of data mentioned above. Comparative meta-omics, especially comparative metagenomics, has been established as a routine process to highlight significant differences in taxon composition and functional gene abundance among microbiota samples. Meanwhile, biologists are increasingly concerned about the correlations between meta-omics features and environmental factors, which may further decipher the adaptation strategy of a microbial community. We developed a graphical comprehensive analysis software package named MetaComp, comprising a series of statistical analysis approaches with visualized results for metagenomics and other meta-omics data comparison. The software can read files generated by a variety of upstream programs. After data loading, analyses such as multivariate statistics, hypothesis testing for two-sample, multi-sample and two-group designs, and a novel regression analysis against environmental factors are offered. Here, the regression analysis regards meta-omic features as independent variables and environmental factors as dependent variables. Moreover, MetaComp automatically chooses an appropriate two-group sample test based upon the traits of the input abundance profiles. We further evaluate the performance of this choice, and exhibit applications for metagenomics, metaproteomics and metabolomics samples. MetaComp, an integrative software package applicable to all meta-omics data, distills the influence of the living environment on a microbial community by regression analysis.
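
    A minimal sketch of the regression idea described above (meta-omic feature abundances as independent variables, an environmental factor as the dependent variable) is given below with random placeholder data; MetaComp's actual statistics are richer than this ordinary least squares toy.

```python
# Ordinary-least-squares toy: regress an environmental factor on taxon
# abundances. Data are random placeholders, not MetaComp output.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_taxa = 30, 5

abundances = rng.random((n_samples, n_taxa))           # taxon profiles
temperature = (10 + 8 * abundances[:, 0] - 5 * abundances[:, 2]
               + rng.normal(0, 0.5, n_samples))        # environmental factor

X = np.column_stack([np.ones(n_samples), abundances])  # add an intercept
coef, *_ = np.linalg.lstsq(X, temperature, rcond=None)
print("intercept:", round(coef[0], 2))
print("taxon coefficients:", np.round(coef[1:], 2))    # ~[8, 0, -5, 0, 0]
```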

  7. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles

    Science.gov (United States)

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F.; Perez, Danny

    2017-10-01

    Modern molecular-dynamics-based techniques are extremely powerful tools for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory, which contains thousands of unique states and tens of thousands of transitions.
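
    The spectral intuition behind Perron-cluster-style methods can be shown on a toy example: eigenvalues of a row-stochastic transition matrix that are close to 1 signal metastable groups of states, and the sign structure of the associated eigenvectors assigns states to clusters. The 4-state matrix below (two weakly coupled pairs) is invented, and this is a simplified caricature of PCCA, not the full algorithm.

```python
# Toy spectral clustering of a 4-state Markov chain with two weakly
# coupled pairs of states (matrix invented; a caricature of PCCA).
import numpy as np

T = np.array([[0.90, 0.08, 0.01, 0.01],
              [0.08, 0.90, 0.01, 0.01],
              [0.01, 0.01, 0.90, 0.08],
              [0.01, 0.01, 0.08, 0.90]])

eigvals, eigvecs = np.linalg.eig(T)
order = np.argsort(-eigvals.real)
print("leading eigenvalues:", np.round(eigvals.real[order][:3], 2))  # 1.0, 0.96, 0.82

# Eigenvalues near 1 indicate metastable sets; the sign pattern of the
# corresponding eigenvector splits the states into clusters.
second = eigvecs[:, order[1]].real
print("cluster labels:", (second > 0).astype(int))   # e.g. [1 1 0 0]
```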

  8. Fatigue evaluation including environmental effects for primary circuit components in nuclear power plants

    International Nuclear Information System (INIS)

    Seichter, Johannes; Reese, Sven H.; Klucke, Dietmar

    2013-01-01

    The influence of the LWR coolant environment on the lifetime of materials in nuclear power plants is under discussion internationally. Environmental phenomena have been investigated in laboratory tests and published in recent years. The discussion is mainly focused both on the transition from laboratory conditions to real plant components and on numerical calculation procedures. Since publication of the NUREG/CR-6909 report in 2007, the formulae for calculating the Fen factors have been modified several times. Various calculation procedures, such as the so-called 'Strain-integrated Method' and 'Simplified Approach', have been published, with each approach yielding different results. The recent revision of the calculation procedure, proposed by ANL in 2012, is presented and discussed with regard to possible variations in the results depending on the assumptions made. In the German KTA rules, the effect of environmentally assisted fatigue (EAF) is taken into account by means of so-called attention thresholds. If a threshold value is exceeded, further measures such as NDT, in-service inspections including fracture-mechanical evaluations, or detailed assessment procedures have to be performed. One way to handle these measures is to apply sophisticated procedures and to show that the calculated CUF is below the defined attention thresholds. On the basis of a practical example, methods and approaches are discussed and recommendations for avoiding over-conservatism and misinterpretation are presented.
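
    As a hedged sketch of how such environmental factors typically enter a fatigue evaluation, the example below forms an environmentally assisted cumulative usage factor as the Fen-weighted sum of partial usage factors over load pairs; the U_i and Fen_i values and the 0.4 threshold are placeholders, not values from NUREG/CR-6909 or the KTA rules.

```python
# Placeholder load pairs: (partial usage factor U_i, environmental Fen_i).
# Neither the values nor the 0.4 "attention threshold" are taken from
# NUREG/CR-6909 or the KTA rules.
load_pairs = [(0.012, 3.2), (0.004, 5.1), (0.020, 1.9)]

cuf_air = sum(u for u, _ in load_pairs)              # conventional CUF
cuf_en = sum(u * fen for u, fen in load_pairs)       # CUF_en = sum U_i*Fen_i

print("CUF (air):         %.4f" % cuf_air)
print("CUF (environment): %.4f" % cuf_en)
print("exceeds threshold" if cuf_en > 0.4 else "below threshold")
```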

  9. Fatigue evaluation including environmental effects for primary circuit components in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Seichter, Johannes [Siempelkamp Pruef- und Gutachter-Gesellschaft mbH, Dresden (Germany); Reese, Sven H.; Klucke, Dietmar [Component Technology Global Unit Generation, E.ON Kernkraft GmbH, Hannover (Germany)

    2013-05-15

    The influence of the LWR coolant environment on the lifetime of materials in nuclear power plants is under discussion internationally. Environmental phenomena have been investigated in laboratory tests and published in recent years. The discussion is mainly focused both on the transition from laboratory conditions to real plant components and on numerical calculation procedures. Since publication of the NUREG/CR-6909 report in 2007, the formulae for calculating the Fen factors have been modified several times. Various calculation procedures, such as the so-called 'Strain-integrated Method' and 'Simplified Approach', have been published, with each approach yielding different results. The recent revision of the calculation procedure, proposed by ANL in 2012, is presented and discussed with regard to possible variations in the results depending on the assumptions made. In the German KTA rules, the effect of environmentally assisted fatigue (EAF) is taken into account by means of so-called attention thresholds. If a threshold value is exceeded, further measures such as NDT, in-service inspections including fracture-mechanical evaluations, or detailed assessment procedures have to be performed. One way to handle these measures is to apply sophisticated procedures and to show that the calculated CUF is below the defined attention thresholds. On the basis of a practical example, methods and approaches are discussed and recommendations for avoiding over-conservatism and misinterpretation are presented.

  10. Including uncertainty in hazard analysis through fuzzy measures

    International Nuclear Information System (INIS)

    Bott, T.F.; Eisenhawer, S.W.

    1997-12-01

    This paper presents a method for capturing the uncertainty expressed by a Hazard Analysis (HA) expert team when estimating the frequencies and consequences of accident sequences, and provides a sound mathematical framework for propagating this uncertainty to the risk estimates for these accident sequences. The uncertainty is readily expressed as distributions that can visually aid the analyst in determining the extent and source of risk uncertainty in HA accident sequences. The results can also be expressed as single statistics of the distribution, in a manner analogous to expressing a probabilistic distribution as a point-value statistic such as a mean or median. The study discussed here used data collected during the elicitation portion of an HA on a high-level waste transfer process to demonstrate the techniques for capturing uncertainty. These data came from observations of the uncertainty that HA team members expressed in assigning frequencies and consequences to accident sequences during an actual HA. This uncertainty was captured and manipulated using ideas from possibility theory. The result of this study is a practical method for displaying and assessing the uncertainty in the HA team estimates of the frequency and consequences of accident sequences. This uncertainty provides potentially valuable information about accident sequences that typically is lost in the HA process.
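
    A minimal sketch of the possibility-theory idea follows, assuming frequency and consequence are elicited as triangular fuzzy numbers and propagated to a fuzzy risk estimate by alpha-cut interval arithmetic; all numbers are invented.

```python
# Triangular fuzzy numbers (low, mode, high) for frequency and
# consequence, propagated by alpha-cut interval arithmetic (all invented).
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    low, mode, high = tri
    return low + alpha * (mode - low), high - alpha * (high - mode)

freq = (1e-4, 1e-3, 5e-3)    # events per year, team's spread of estimates
cons = (10.0, 50.0, 200.0)   # consequence units

for alpha in (0.0, 0.5, 1.0):
    f_lo, f_hi = alpha_cut(freq, alpha)
    c_lo, c_hi = alpha_cut(cons, alpha)
    # risk = frequency x consequence; endpoints multiply for positive intervals
    print("alpha=%.1f: risk in [%.2e, %.2e]" % (alpha, f_lo * c_lo, f_hi * c_hi))
```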

  11. Statistical Analysis and Comparison of Harmonics Measured in Offshore Wind Farms

    DEFF Research Database (Denmark)

    Kocewiak, Lukasz Hubert; Hjerrild, Jesper; Bak, Claus Leth

    2011-01-01

    The paper shows a statistical analysis of harmonic components measured in different offshore wind farms. Harmonic analysis is a complex task that requires many aspects, such as measurements, data processing, modeling and validation, to be taken into consideration. The paper describes the measurement process and shows a sophisticated analysis of representative harmonic measurements from the Avedøre Holme, Gunfleet Sands and Burbo Bank wind farms. The nature of the generation and behavior of harmonic components in offshore wind farms is clearly presented and explained based on a probabilistic approach. Some issues regarding commonly applied standards are also put forward in the discussion. Based on the measurements and data analysis, it is shown that a general overview of wind farm harmonic behaviour cannot be obtained from single-value measurements alone, as suggested in the standards, but requires more descriptive...
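
    The point about single-value measurements can be made concrete with a small sketch: summarizing a measured harmonic component by its distribution, e.g. a high percentile, rather than one number. The measurement series below is synthetic, not data from the cited wind farms.

```python
# Synthetic week of 10-minute aggregates of a 5th-harmonic voltage
# magnitude (percent of fundamental); distribution shape is invented.
import numpy as np

rng = np.random.default_rng(42)
h5 = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=1008)

print("mean:            %.2f %%" % h5.mean())
print("95th percentile: %.2f %%" % np.percentile(h5, 95))
print("maximum:         %.2f %%" % h5.max())
# A single mean value would understate the levels actually reached,
# which is the argument for the probabilistic description above.
```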

  12. Operations and Modeling Analysis

    Science.gov (United States)

    Ebeling, Charles

    2005-01-01

    The Reliability and Maintainability Analysis Tool (RMAT) provides NASA the capability to estimate reliability and maintainability (R&M) parameters and operational support requirements for proposed space vehicles, based upon relationships established from both aircraft and Shuttle R&M data. RMAT has matured both in its underlying database and in its level of sophistication in extrapolating this historical data to satisfy proposed mission requirements, maintenance concepts and policies, and type of vehicle (i.e., ranging from aircraft-like to Shuttle-like). However, a companion analysis tool, the Logistics Cost Model (LCM), has not reached the same level of maturity as RMAT due, in large part, to nonexistent or outdated cost estimating relationships and underlying cost databases, and its almost exclusive dependence on Shuttle operations and logistics cost input parameters. As a result, the full capability of the RMAT/LCM suite of analysis tools to take a conceptual vehicle and derive its operations and support requirements, along with the resulting operating and support costs, has not been realized.

  13. Nontargeted nuclear magnetic resonance (NMR) analysis to detect hazardous substances including methanol in unrecorded alcohol from Novosibirsk, Russia

    OpenAIRE

    Hausler, Thomas; Okaru, Alex O.; Neufeld, Maria; Rehm, Jürgen; Kuballa, Thomas; Luy, Burkhard; Lachenmeier, Dirk W.

    2016-01-01

    Nuclear magnetic resonance (NMR) spectroscopy was applied to the analysis of alcoholic products in the context of health and safety control. A total of 86 samples of unrecorded alcohol were collected in Novosibirsk and nearby cities in Russia. Sampling was based on interviews with alcohol-dependent patients, and unrecorded alcohol thus defined included illegally or informally produced alcoholic products (e.g., counterfeit or home-made alcoholic beverages) or surrogate alcohol in the form of c...

  14. The Case for Including Adverse Childhood Experiences in Child Maltreatment Education: A Path Analysis.

    Science.gov (United States)

    Bachmann, Michael; Bachmann, Brittany A

    2018-03-16

    The lifelong, negative consequences of exposure to adverse childhood experiences (ACEs) for individuals and their families are well established. The objective was to demonstrate the importance of including ACE information in child maltreatment education curricula using path analysis. Survey data examined the impact of child maltreatment education programs and knowledge about ACEs on medical practitioners' reporting habits and ability to detect maltreatment. A path diagram distinguished between the direct impact of education programs on the outcome measures and the indirect effect that is mediated through knowledge of ACEs. The outcome measures were medical practitioners' ability to detect child maltreatment and their number of referrals to Child Protective Services (CPS). The optimized path diagram (Satorra–Bentler χ²(3) = 3.9, p = 0.27; RMSEA-SB = 0.017; R² = 0.21, where RMSEA is the root-mean-square error of approximation) revealed the mediating variable "knowledge about ACEs" as the strongest structural effect (SB-β = 0.34) on the number of CPS referrals, almost twice as high as the second strongest effect, that of formal education programs (SB-β = 0.19). For workplace training programs, the total effect, when including knowledge of ACEs, was almost twice as strong as the direct effect alone. Even when previous child maltreatment education was controlled for, practitioners familiar with the consequences of ACEs were significantly more likely to recognize and to report abuse to CPS. This study documented the importance of specialized training programs on ACEs and the essential role ACE knowledge plays in the effectiveness of provider education programs.
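
    The mediation logic can be sketched with the product-of-coefficients rule: the total effect is the direct path plus the indirect path through the mediator. Below, the two reported coefficients (0.34 and 0.19) are reused for illustration, while the education-to-knowledge path "a" is a hypothetical stand-in, not a value from the study.

```python
# Product-of-coefficients mediation sketch. The 0.19 and 0.34 paths are
# the coefficients reported above; the education -> ACE-knowledge path
# 'a' is a hypothetical stand-in.
direct = 0.19          # education -> CPS referrals (reported)
b = 0.34               # ACE knowledge -> CPS referrals (reported)
a = 0.45               # education -> ACE knowledge (hypothetical)

indirect = a * b       # effect mediated through ACE knowledge
total = direct + indirect
print("indirect effect: %.3f" % indirect)   # 0.153
print("total effect:    %.3f (direct alone: %.2f)" % (total, direct))
```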

  15. Nonlinear dynamic analysis of framed structures including soil-structure interaction effects

    International Nuclear Information System (INIS)

    Mahmood, M.N.; Ahmed, S.Y.

    2008-01-01

    The role of soil-structure interaction in the seismic behavior of reinforced concrete structures is investigated in this paper. A finite element approach has been adopted to model the interaction system, which consists of the reinforced concrete plane frame, the soil deposit, and an interface representing the frictional contact between the foundation of the structure and the subsoil. The analysis is based on the elasto-plastic behavior of the frame members (beams and columns), defined by the ultimate axial force-bending moment interaction curve, while the cap model is adopted to govern the elasto-plastic behavior of the soil material. The Mohr-Coulomb failure law is used to determine the initiation of slippage at the interface, while separation is assumed to occur when the stresses at the interface become tensile. Newmark's predictor-corrector algorithm is adopted for the nonlinear dynamic analysis. The main aim of the present work is to evaluate the sensitivity of structures to different behaviors of the soil and interface layer when subjected to an earthquake excitation. Predicted results of the dynamic analysis of the interaction system indicate that accounting for soil-structure interaction can have beneficial effects on the structural behavior when different soil models (elastic and elasto-plastic) and interface conditions (perfect bond and permitted slip) are considered. (author)
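
    For orientation, a minimal constant-average-acceleration Newmark scheme for a linear single-degree-of-freedom oscillator is sketched below; it is the linear core of such time integration, without the predictor-corrector iteration and the frame/soil/interface nonlinearities of the actual model, and all parameters are placeholders.

```python
# Linear SDOF oscillator integrated with Newmark's method
# (beta = 1/4, gamma = 1/2, constant average acceleration).
# Mass, damping, stiffness and the force pulse are placeholders.
import numpy as np

m, c, k = 1.0, 0.1, 100.0
dt, n = 0.01, 500
beta, gamma = 0.25, 0.5

u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
p = np.zeros(n); p[:50] = 1.0                     # short force pulse

a[0] = (p[0] - c * v[0] - k * u[0]) / m
k_eff = m / (beta * dt**2) + gamma * c / (beta * dt) + k

for i in range(n - 1):
    # Effective load assembled from the known state at step i.
    rhs = (p[i + 1]
           + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                  + (1 / (2 * beta) - 1) * a[i])
           + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                  + dt * (gamma / (2 * beta) - 1) * a[i]))
    u[i + 1] = rhs / k_eff
    a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
    v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])

print("peak displacement:", u.max())
```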

  16. Cost-Utility Analysis of Extending Public Health Insurance Coverage to Include Diabetic Retinopathy Screening by Optometrists.

    Science.gov (United States)

    van Katwyk, Sasha; Jin, Ya-Ping; Trope, Graham E; Buys, Yvonne; Masucci, Lisa; Wedge, Richard; Flanagan, John; Brent, Michael H; El-Defrawy, Sherif; Tu, Hong Anh; Thavorn, Kednapa

    2017-09-01

    Diabetic retinopathy (DR) is one of the leading causes of vision loss and blindness in Canada, and eye examinations play an important role in its early detection. However, DR screening by optometrists is not always universally covered by public or private health insurance plans. This study assessed whether expanding public health coverage to include diabetic eye examinations for retinopathy by optometrists is cost-effective from the perspective of the health care system. We conducted a cost-utility analysis of extending coverage for diabetic eye examinations in Prince Edward Island to include examinations by optometrists, which are not currently publicly covered. We used a Markov chain to simulate disease burden based on eye examination rates and DR progression over a 30-year time horizon. Results were presented as an incremental cost per quality-adjusted life year (QALY) gained, and a series of one-way and probabilistic sensitivity analyses were performed. Extending public health coverage to eye examinations by optometrists was associated with higher costs ($9,908,543.32) and improved QALYs (156,862.44) over 30 years, resulting in an incremental cost-effectiveness ratio of $1,668.43/QALY gained. Sensitivity analysis showed that the most influential determinants of the results were the cost of optometric screening and selected utility scores. At the commonly used threshold of $50,000/QALY, the probability that the new policy was cost-effective was 99.99%. Extending public health coverage to eye examinations by optometrists is therefore cost-effective based on a commonly used threshold of $50,000/QALY, and findings from this study can inform the decision to expand publicly insured optometric services for patients with diabetes. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
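
    A hedged sketch of the Markov-cohort logic used in such analyses follows: health states, annual transition probabilities, per-state costs and utilities, accumulated with discounting over a horizon to yield an ICER. All inputs below are invented, not the study's Prince Edward Island parameters.

```python
# Invented 3-state cohort model: "no DR" -> "DR" -> "blind" (absorbing).
import numpy as np

T_screen = np.array([[0.96, 0.035, 0.005],     # annual transitions with
                     [0.00, 0.95, 0.05],       # optometrist screening
                     [0.00, 0.00, 1.00]])
T_usual = np.array([[0.95, 0.04, 0.01],        # usual care
                    [0.00, 0.92, 0.08],
                    [0.00, 0.00, 1.00]])
cost = np.array([120.0, 900.0, 4000.0])        # $ per state-year
utility = np.array([0.90, 0.75, 0.40])         # QALY weight per state-year

def run(T, exam_cost=0.0, years=30, discount=0.03):
    dist, c_tot, q_tot = np.array([1.0, 0.0, 0.0]), 0.0, 0.0
    for t in range(years):
        df = 1.0 / (1 + discount) ** t         # discount factor
        c_tot += df * (dist @ cost + exam_cost)
        q_tot += df * (dist @ utility)
        dist = dist @ T                        # advance the cohort one year
    return c_tot, q_tot

c1, q1 = run(T_screen, exam_cost=60.0)         # screening arm pays for exams
c0, q0 = run(T_usual)
dc, dq = c1 - c0, q1 - q0
# A negative ICER here would mean screening dominates (cheaper and better).
print("incremental cost: %.0f $, incremental QALYs: %.3f" % (dc, dq))
print("ICER: %.0f $/QALY" % (dc / dq))
```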

  17. Use of portable X-ray fluorescence instrument for bulk alloy analysis on low corroded indoor bronzes

    International Nuclear Information System (INIS)

    Šatović, D.; Desnica, V.; Fazinić, S.

    2013-01-01

    One of the most frequently used non-destructive methods for elemental analysis in field measurements on bronze sculptures is X-ray fluorescence (XRF) analysis based on portable instrumentation. However, when performing routine in-situ XRF analysis on corroded objects, the results obtained are sometimes considerably influenced by the corrosion surface products. In this work the suitability of portable XRF for bulk analysis of low corroded bronzes, which were initially characterized precisely using sophisticated and reliable laboratory methods, was investigated, and some improvements in measuring technique and data processing are given. Artificially corroded bronze samples were analyzed by a portable XRF instrument using the same methodology and procedures as for in-situ analysis of real objects. The samples were first investigated using sophisticated complementary laboratory techniques (Scanning Electron Microscopy, Proton-Induced X-ray Emission Spectroscopy and Rutherford Backscattering Spectrometry) in order to gain precise information on the formation of the corrosion product layers and the in-depth elemental profile of the corrosion layers for different aging parameters. It has been shown that for corrosion layers of up to ca. 25 μm a portable XRF can yield very accurate quantification results. - Highlights: • XRF quantification is very accurate for bronze corrosion layers of up to ca. 25 μm. • Corrosion layer formation on bronze is described in two phases. • Corrosion layers were precisely characterized using PIXE, RBS and SEM. • Corrosion is approximated as CuO for layer-thickness calculations via X-ray attenuation. • An increasingly lighter corrosion matrix may cause SnLα radiation intensity inversion.
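
    The attenuation reasoning behind the ca. 25 μm figure can be sketched with the Beer-Lambert law, I = I0 exp(-(μ/ρ)ρt), treating the corrosion layer as CuO; the mass attenuation coefficient below is a round placeholder, not a tabulated value for any specific line energy.

```python
# Beer-Lambert attenuation of a characteristic X-ray line through a CuO
# corrosion layer; the mass attenuation coefficient is a placeholder.
import math

mu_over_rho = 50.0   # cm^2/g, placeholder (depends on the line energy)
rho = 6.3            # g/cm^3, approximate density of CuO

for t_um in (5, 10, 25, 50):
    t_cm = t_um * 1e-4
    transmitted = math.exp(-mu_over_rho * rho * t_cm)
    print("%3d um layer: %5.1f %% of the line transmitted"
          % (t_um, 100 * transmitted))
```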

  18. Use of portable X-ray fluorescence instrument for bulk alloy analysis on low corroded indoor bronzes

    Energy Technology Data Exchange (ETDEWEB)

    Šatović, D., E-mail: dsatovic@alu.hr [Department of Conservation and Restoration, Academy of Fine Arts, Ilica 85, 10000 Zagreb (Croatia); Desnica, V. [Department of Conservation and Restoration, Academy of Fine Arts, Ilica 85, 10000 Zagreb (Croatia); Fazinić, S. [Laboratory for Ion Beam Interactions, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia)

    2013-11-01

    One of the most frequently used non-destructive methods for elemental analysis in field measurements on bronze sculptures is X-ray fluorescence (XRF) analysis based on portable instrumentation. However, when performing routine in-situ XRF analysis on corroded objects, the results obtained are sometimes considerably influenced by the corrosion surface products. In this work the suitability of portable XRF for bulk analysis of low corroded bronzes, which were initially characterized precisely using sophisticated and reliable laboratory methods, was investigated, and some improvements in measuring technique and data processing are given. Artificially corroded bronze samples were analyzed by a portable XRF instrument using the same methodology and procedures as for in-situ analysis of real objects. The samples were first investigated using sophisticated complementary laboratory techniques (Scanning Electron Microscopy, Proton-Induced X-ray Emission Spectroscopy and Rutherford Backscattering Spectrometry) in order to gain precise information on the formation of the corrosion product layers and the in-depth elemental profile of the corrosion layers for different aging parameters. It has been shown that for corrosion layers of up to ca. 25 μm a portable XRF can yield very accurate quantification results. - Highlights: • XRF quantification is very accurate for bronze corrosion layers of up to ca. 25 μm. • Corrosion layer formation on bronze is described in two phases. • Corrosion layers were precisely characterized using PIXE, RBS and SEM. • Corrosion is approximated as CuO for layer-thickness calculations via X-ray attenuation. • An increasingly lighter corrosion matrix may cause SnLα radiation intensity inversion.

  19. Should Cost-Effectiveness Analysis Include the Cost of Consumption Activities? AN Empirical Investigation.

    Science.gov (United States)

    Adarkwah, Charles Christian; Sadoghi, Amirhossein; Gandjour, Afschin

    2016-02-01

    There has been a debate on whether cost-effectiveness analysis should consider the cost of consumption and leisure-time activities when using the quality-adjusted life year as a measure of health outcome under a societal perspective. The purpose of this study was to investigate whether the effects of ill health on consumptive activities are spontaneously considered in a health-state valuation exercise and how much this matters. The survey enrolled patients with inflammatory bowel disease in Germany (n = 104). Patients were randomized to explicit or no explicit instruction to consider consumption and leisure effects in a time trade-off (TTO) exercise. Explicit instruction to consider non-health-related utility in TTO exercises did not influence TTO scores. However, spontaneous consideration of non-health-related utility by patients without explicit instruction (60% of respondents) led to significantly lower TTO scores. The results suggest the inclusion of consumption costs in the numerator of the cost-effectiveness ratio, at least for those respondents who spontaneously consider non-health-related utility from treatment. The results also suggest that exercises eliciting health valuations from the general public may include a description of the impact of disease on consumptive activities. Copyright © 2015 John Wiley & Sons, Ltd.

  20. International Conference on Recent Advances in Mathematical Biology, Analysis and Applications

    CERN Document Server

    Saleem, M; Srivastava, H; Khan, Mumtaz; Merajuddin, M

    2016-01-01

    The book contains recent developments and contemporary research in mathematical analysis and in its applications to problems arising from the biological and physical sciences. The book is of interest to readers who wish to learn of new research in such topics as linear and nonlinear analysis, mathematical biology and ecology, dynamical systems, graph theory, variational analysis and inequalities, functional analysis, differential and difference equations, partial differential equations, approximation theory, and chaos. All papers were prepared by participants at the International Conference on Recent Advances in Mathematical Biology, Analysis and Applications (ICMBAA-2015) held during 4–6 June 2015 in Aligarh, India. A focal theme of the conference was the application of mathematics to the biological sciences and current research in areas of theoretical mathematical analysis that can be used as sophisticated tools for the study of scientific problems. The conference provided researchers, academicians and ...

  1. Review of neutron activation analysis in the standardization and study of reference materials, including its application to radionuclide reference materials

    International Nuclear Information System (INIS)

    Byrne, A.R.

    1993-01-01

    Neutron activation analysis (NAA) plays a very important role in the certification of reference materials (RMs) and their characterization, including homogeneity testing. The features of the method are briefly reviewed, particularly aspects relating to its completely independent nuclear basis, its virtual freedom from blank problems, and its capacity for self-verification. This last aspect, arising from the essentially isotopic character of NAA, can be exploited by using different nuclear reactions and induced nuclides, and by employing two modes, one instrumental (non-destructive), the other radiochemical (destructive). This enables the derivation of essentially independent analytical information and gives NAA a unique capacity for self-validation. The application of NAA to quantify natural or man-made radionuclides such as uranium, thorium, 237Np, 129I and 230Th is discussed, including its advantages over conventional radiometric methods and its usefulness in providing independent data for nuclides for which other confirmatory analyses are impossible, or are only recently becoming available through newer 'atom counting' techniques. Certain additional, prospective uses of NAA in the study of RMs and potential RMs are mentioned, including transmutation reactions, the creation of endogenously radiolabelled matrices for the production and study of RMs (such as dissolution and leaching tests, and use as incorporated radiotracers for chemical recovery correction), and the possibility of molecular activation analysis for speciation. (orig.)

  2. SoS Notebook: An Interactive Multi-Language Data Analysis Environment.

    Science.gov (United States)

    Peng, Bo; Wang, Gao; Ma, Jun; Leong, Man Chong; Wakefield, Chris; Melott, James; Chiu, Yulun; Du, Di; Weinstein, John N

    2018-05-22

    Complex bioinformatic data analysis workflows involving multiple scripts in different languages can be difficult to consolidate, share, and reproduce. An environment that streamlines the entire process of data collection, analysis, visualization and reporting for such multi-language analyses has been lacking. We developed Script of Scripts (SoS) Notebook, a web-based notebook environment that allows the use of multiple scripting languages in a single notebook, with data flowing freely within and across languages. SoS Notebook enables researchers to perform sophisticated bioinformatic analyses using the most suitable tools for different parts of the workflow, without the limitations of a particular language or the complications of cross-language communication. SoS Notebook is hosted at http://vatlab.github.io/SoS/ and is distributed under a BSD license. bpeng@mdanderson.org.

  3. [Materiality Analysis of Health Plans Based on Stakeholder Engagement and the Issues Included at ISO 26000:2010].

    Science.gov (United States)

    Moyano Santiago, Miguel Angel; Rivera Lirio, Juana María

    2017-01-18

    Health plans of the Spanish autonomous communities can incorporate sustainable development criteria in their development, yet no analyses or proposals about their development and indicators have been published. The goal is to contribute to building better health plans aimed at sustainable development and to help manage the economic, social and environmental impacts of health systems. We used a variation of the RAND/UCLA method, or modified Delphi technique. The process consisted of a bibliographical and contextual analysis of the matters and issues related to health and social responsibility included in ISO 26000:2010. A survey of a deliberately selected sample of 70 expert members of the identified stakeholders was carried out, and a discussion group was held to determine the consensus of the sample on the issues addressed in the survey. The research was conducted in 2015. From the literature review, 33 health-related issues included in ISO 26000:2010 were obtained. In the survey, 7 proved highly relevant with high consensus, 8 relevant with average consensus, and 18 less relevant with a high level of dissent. The expert group excluded 4 of the 18 subjects with the least consensus. Thus 29 of the 33 issues considered, divided into the 7 subjects contained in the ISO 26000 guide to social responsibility, were relevant to stakeholders for possible inclusion in health plans. Considering the direct relationship published by ISO (International Organization for Standardization) between the issues of ISO 26000 and the economic, social and environmental indicators of the GRI (Global Reporting Initiative) G4 guidelines, a panel of monitoring indicators related to the relevant issues was elaborated.

  4. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  5. High resolution transmission electron microscopy and microdiffraction for radiation damage analysis

    International Nuclear Information System (INIS)

    Sinclair, R.

    1982-01-01

    High resolution TEM techniques have developed to quite a sophisticated level over the past few years. In addition, TEM instruments with a scanning capability, which permit in particular the formation of a small electron probe at the specimen, have become available commercially. Thus direct-resolution and microdiffraction investigations of thin specimens are now possible, neither of which has been employed to any great extent in the analysis of radiation damage. Some recent advances thought to be relevant to this specific area of research are highlighted.

  6. Correction to: A sophisticated, differentiated Golgi in the ancestor of eukaryotes.

    Science.gov (United States)

    Barlow, Lael D; Nývltová, Eva; Aguilar, Maria; Tachezy, Jan; Dacks, Joel B

    2018-03-28

    Upon publication of the original article, Barlow et al. [1], the authors noticed that Fig. 4b contained an inaccuracy when additional data is taken into account. We inferred a loss of GRASP in the common ancestor of cryptophytes and archaeplastids, based on the absence of identified homologues in the data from taxa that we analyzed, which include Cyanidioschyzon merolae as the single representative of red algae.

  7. AAMQS: A non-linear QCD analysis of new HERA data at small-x including heavy quarks

    International Nuclear Information System (INIS)

    Albacete, Javier L.; Armesto, Nestor; Salgado, Carlos A.; Milhano, Jose Guilherme; Quiroga Arias, Paloma

    2011-01-01

    We present a global analysis of available data on inclusive structure functions and reduced cross sections measured in electron-proton scattering at small values of Bjorken-x, x<0.01, including the latest data from HERA on reduced cross sections. Our approach relies on the dipole formulation of DIS together with the use of the non-linear running coupling Balitsky-Kovchegov equation for the description of the small-x dynamics. We improve our previous studies by including the heavy quark (charm and beauty) contribution to the reduced cross sections, and also by considering a variable flavor scheme for the running of the coupling. We obtain a good description of the data, with the fit parameters remaining stable with respect to our previous analyses where only light quarks were considered. The inclusion of the heavy quark contributions resulted in a good description of available experimental data for the charm component of the structure function and reduced cross section provided the initial transverse distribution of heavy quarks was allowed to differ from (more specifically, to have a smaller radius than) that of the light flavors. (orig.)
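
    For orientation, the Balitsky-Kovchegov evolution of the dipole amplitude used in the fit can be written schematically as follows (kernel details, which carry the running-coupling improvement, are omitted):

```latex
% Schematic Balitsky-Kovchegov equation for the dipole amplitude N(r, Y),
% with rapidity Y = ln(x_0/x) and daughter dipoles r_1, r_2 = r - r_1:
\[
  \frac{\partial N(r,Y)}{\partial Y}
    = \int \mathrm{d}^2 r_1 \, K(r,r_1,r_2)\,
      \bigl[ N(r_1,Y) + N(r_2,Y) - N(r,Y) - N(r_1,Y)\,N(r_2,Y) \bigr].
\]
% The nonlinear term N(r_1)N(r_2) tames the growth of the amplitude at
% small x (saturation); the running-coupling kernel K is what the
% "running coupling BK" analysis above refers to.
```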

  8. AAMQS: A non-linear QCD analysis of new HERA data at small-x including heavy quarks

    Energy Technology Data Exchange (ETDEWEB)

    Albacete, Javier L. [CEA/Saclay, URA 2306, Unite de Recherche Associee au CNRS, Institut de Physique Theorique, Gif-sur-Yvette cedex (France); Armesto, Nestor; Salgado, Carlos A. [Universidade de Santiago de Compostela, Departamento de Fisica de Particulas and IGFAE, Santiago de Compostela (Spain); Milhano, Jose Guilherme [Instituto Superior Tecnico (IST), Universidade Tecnica de Lisboa, CENTRA, Lisboa (Portugal); Theory Unit, CERN, Physics Department, Geneve 23 (Switzerland); Quiroga Arias, Paloma [UPMC Univ. Paris 6 and CNRS UMR7589, LPTHE, Paris (France)

    2011-07-15

    We present a global analysis of available data on inclusive structure functions and reduced cross sections measured in electron-proton scattering at small values of Bjorken-x, x<0.01, including the latest data from HERA on reduced cross sections. Our approach relies on the dipole formulation of DIS together with the use of the non-linear running coupling Balitsky-Kovchegov equation for the description of the small-x dynamics. We improve our previous studies by including the heavy quark (charm and beauty) contribution to the reduced cross sections, and also by considering a variable flavor scheme for the running of the coupling. We obtain a good description of the data, with the fit parameters remaining stable with respect to our previous analyses where only light quarks were considered. The inclusion of the heavy quark contributions resulted in a good description of available experimental data for the charm component of the structure function and reduced cross section provided the initial transverse distribution of heavy quarks was allowed to differ from (more specifically, to have a smaller radius than) that of the light flavors. (orig.)

  9. Analysis of a suppressive subtractive hybridization library of Alternaria alternata resistant to 2-propenyl isothiocyanate

    Directory of Open Access Journals (Sweden)

    Heriberto García-Coronado

    2015-07-01

    Conclusions: The fungal response showed that natural compounds can induce tolerance/resistance mechanisms in organisms in the same manner as synthetic chemical products. The response of A. alternata to the toxicity of 2-pITC is a sophisticated phenomenon that includes the induction of signaling cascades targeting a broad set of cellular processes. Whole-transcriptome approaches are needed to fully elucidate the fungal response to 2-pITC.

  10. MiToBo - A Toolbox for Image Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Birgit Möller

    2016-04-01

    MiToBo is a toolbox and Java library for solving basic as well as advanced image processing and analysis tasks. It features a rich collection of fundamental, intermediate and high-level image processing operators and algorithms, as well as a number of sophisticated tools for specific biological and biomedical applications. These tools include operators for elucidating cellular morphology and locomotion, as well as operators for the characterization of certain intracellular particles and structures. MiToBo builds upon and integrates into the widely-used image analysis software packages ImageJ and Fiji [11, 10], and all of its operators can easily be run in ImageJ and Fiji via a generic operator runner plugin. Alternatively, MiToBo operators can be run directly from the command line, and its functionality can also be used as a library for developing one's own applications. Thanks to the Alida library [8], which forms the base of MiToBo, all operators share unified APIs fostering reusability, and graphical as well as command line user interfaces for operators are generated automatically. MiToBo is available from its website http://www.informatik.uni-halle.de/mitobo, on Github, and via an Apache Archiva Maven repository server, and it can easily be activated in Fiji via its own update site.

  11. The Case for Including Adverse Childhood Experiences in Child Maltreatment Education: A Path Analysis

    Science.gov (United States)

    Bachmann, Michael; Bachmann, Brittany A

    2018-01-01

    Context: The lifelong, negative consequences of exposure to adverse childhood experiences (ACEs) for individuals and their families are well established. Objective: To demonstrate the importance of including ACE information in child maltreatment education curricula using path analysis. Design: Survey data examined the impact of child maltreatment education programs and knowledge about ACEs on medical practitioners' reporting habits and ability to detect maltreatment. A path diagram distinguished between the direct impact of education programs on outcome measures and the indirect effect that is mediated through knowledge of ACEs. Main Outcome Measures: Medical practitioners' ability to detect child maltreatment and their number of referrals to Child Protective Services (CPS). Results: The optimized path diagram (χ²SB(3) = 3.9, p = 0.27; RMSEA-SB = 0.017; R² = 0.21, where SB denotes the Satorra-Bentler correction and RMSEA the root-mean-square error of approximation) revealed the mediating variable "knowledge about ACEs" as the strongest structural effect (SB-β = 0.34) on the number of CPS referrals, almost twice as high as the second strongest effect, that of formal education programs (SB-β = 0.19). For workplace training programs, the total effect including knowledge of ACEs was almost twice as strong as the direct effect alone. Even when previous child maltreatment education was controlled for, practitioners familiar with the consequences of ACEs were significantly more likely to recognize abuse and to report it to CPS. Conclusion: This study documented the importance of specialized training programs on ACEs, and the essential role ACE knowledge plays in the effectiveness of provider education programs. PMID:29616910

  12. Including health economic analysis in pilot studies: lessons learned from a cost-utility analysis within the PROSPECTIV pilot study

    Directory of Open Access Journals (Sweden)

    Richéal M. Burns

    2017-07-01

    Purpose: To assess feasibility and health economic benefits and costs as part of a pilot study of a nurse-led, psychoeducational intervention (NLPI) for prostate cancer, in order to understand the potential for cost effectiveness and to contribute to the design of a larger-scale trial. Methods: Men with stable prostate cancer post-treatment were recruited from two cancer centres in the UK. Eighty-three men were randomised to the NLPI plus usual care or to usual care alone (UCA) (42 NLPI and 41 UCA); the NLPI plus usual care was delivered in the primary-care setting (the intervention) and included an initial face-to-face consultation with a trained nurse, with follow-up tailored to individual needs. The study afforded the opportunity to undertake a short-term within-pilot analysis. The primary outcome measure for the economic evaluation was quality of life, as measured by the EuroQol five-dimension questionnaire (EQ-5D-5L) instrument. Costs (in 2014 GBP) assessed included health-service resource use, out-of-pocket expenses and losses from inability to undertake usual activities. Results: Total and incremental costs varied across the different scenarios assessed, with mean cost differences ranging from £173 to £346; the incremental effect, as measured by the change in utility scores over the duration of follow-up, exhibited wide confidence intervals, highlighting inconclusive effectiveness (95% CI: -0.0226 to 0.0438). The cost per patient of delivering the intervention would be reduced if it were rolled out to a larger patient cohort. Conclusions: The NLPI is potentially cost saving depending on the scale of delivery; however, the results presented are not considered generalisable.

  13. “Man is the measure of all things”: A critical analysis of the sophist's ...

    African Journals Online (AJOL)

    With every passing year, our experiences of the human nature have continued to teach us more about the very nature of man. Consequently, there has been the need to unlearn much of what has turned out to be prejudices and errors in our conception of man. This notwithstanding, the question “What is Man?

  14. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables analysts to intuitively comprehend an overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages over traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to generate a visual network with greater precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Units (CNT-BLU) were analyzed to verify the utility of the proposed method.
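
    The abstract does not give the authors' implementation, but the pipeline it describes (keyword extraction, keyword weighting, network construction) can be sketched with generic tools. The abstracts, TF-IDF weighting and linking threshold below are illustrative assumptions, not the paper's method:

        # Hypothetical sketch: weight keywords by TF-IDF and connect patents
        # whose weighted keyword vectors are similar. Toy data throughout.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity
        import networkx as nx

        abstracts = {
            "US-001": "carbon nanotube field emission backlight unit",
            "US-002": "field emission display with carbon nanotube cathode",
            "US-003": "liquid crystal display backlight driver circuit",
        }

        ids = list(abstracts)
        X = TfidfVectorizer(stop_words="english").fit_transform(
            [abstracts[i] for i in ids])          # keyword weights per patent
        sim = cosine_similarity(X)

        G = nx.Graph()
        G.add_nodes_from(ids)
        for a in range(len(ids)):
            for b in range(a + 1, len(ids)):
                if sim[a, b] > 0.2:               # arbitrary linking threshold
                    G.add_edge(ids[a], ids[b], weight=round(float(sim[a, b]), 3))

        print(G.edges(data=True))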

  15. Low-energy ion-beam deposition apparatus equipped with surface analysis system

    International Nuclear Information System (INIS)

    Ohno, Hideki; Aoki, Yasushi; Nagai, Siro.

    1994-10-01

    A sophisticated apparatus for low energy ion beam deposition (IBD) was installed at the Takasaki Radiation Chemistry Research Establishment of JAERI in March 1991. The apparatus is composed of an IBD system and a real-time/in-situ surface analysis system for diagnosing deposited thin films. The IBD system provides various kinds of low energy ions, down to 10 eV, with a current density of 10 μA/cm² and an irradiation area of 15 × 15 mm². The surface analysis system consists of RHEED, AES, ISS and SIMS. This report describes the characteristics and the operation procedure of the apparatus together with some experimental results on depositing thin carbon films. (author)

  16. Quantitative analysis of raw materials mining of Sverdlovsk region in Russia

    Science.gov (United States)

    Tarasyev, Alexander M.; Vasilev, Julian; Turygina, Victoria F.

    2016-06-01

    The purpose of this article is to show the application of some quantitative methods to the analysis of a raw materials dataset. The main approaches used are correlation analysis and forecasting with trend lines. It is shown that the future mining of particular ores can be predicted on the basis of mathematical modeling, and that there exist strong correlations between the mining volumes of some specific raw materials. Some of the revealed correlations have meaningful explanations, while for others more sophisticated interpretations must be sought. The applied approach can be used for forecasting raw materials exploitation in various regions of Russia and in other countries.
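
    As a concrete illustration of the two techniques named above, a minimal sketch with invented figures (not the Sverdlovsk data) might look like this:

        # Minimal sketch: pairwise correlation between two mined commodities
        # and a linear trend extrapolated forward. Numbers are placeholders.
        import numpy as np

        years = np.arange(2005, 2015)
        iron = np.array([52, 54, 55, 57, 56, 59, 61, 62, 64, 66], dtype=float)
        copper = np.array([11, 11, 12, 13, 12, 13, 14, 14, 15, 16], dtype=float)

        r = np.corrcoef(iron, copper)[0, 1]           # strength of co-movement
        slope, intercept = np.polyfit(years, iron, 1)  # least-squares trend line
        forecast_2020 = slope * 2020 + intercept

        print(f"correlation(iron, copper) = {r:.2f}")
        print(f"iron trend forecast for 2020: {forecast_2020:.1f}")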

  17. Prosthetic hand sensor placement: Analysis of touch perception during the grasp

    Directory of Open Access Journals (Sweden)

    Mirković Bojana

    2014-01-01

    Humans rely on their hands to perform everyday tasks. The hand is used as a tool, but also as the interface to "sense" the world. Current prosthetic hands are based on sophisticated multi-fingered structures and include many sensors which mirror natural proprioceptors and exteroceptors. The sensory information is used for control, but is not sent to the user of the hand (the amputee). Grasping without sensing is not good enough. This research is part of the development of a sensing interface for amputees, specifically addressing the analysis of human perception while grasping. The goal is to determine a small number of preferred positions for sensors on the prosthetic hand. This task has previously been approached by trying to replicate the natural sensory system characteristic of healthy humans, resulting in a multitude of redundant sensors and a basic inability to make the patient aware of the sensor readings on a subconscious level. We based our artificial perception system on the sensations reported by humans when grasping various objects without seeing them (obstructed visual feedback). Subjects with no known sensory deficits were asked to report on the touch sensation while grasping. The analysis included objects of various sizes, weights, textures and temperatures. Based on these data we formed a map of the preferred sensor positions appropriate for a five-fingered, human-like robotic hand. The final map was intentionally minimized in size (number of sensors).

  18. Introduction to statistical data analysis for the life sciences

    CERN Document Server

    Ekstrom, Claus Thorn

    2014-01-01

    This text provides a computational toolbox that enables students to analyze real datasets and gain the confidence and skills to undertake more sophisticated analyses. Although accessible with any statistical software, the text encourages a reliance on R. For those new to R, an introduction to the software is available in an appendix. The book also includes end-of-chapter exercises as well as an entire chapter of case exercises that help students apply their knowledge to larger datasets and learn more about approaches specific to the life sciences.

  19. Methodological challenges when doing research that includes ethnic minorities

    DEFF Research Database (Denmark)

    Morville, Anne-Le; Erlandsson, Lena-Karin

    2016-01-01

    minorities are included. Method: A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, Cinahl, Web of Science and PsychInfo. Analysis followed Arksey and O’Malley’s framework for scoping reviews, applying content analysis. Results: The results showed methodological...

  20. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures, which includes the effects of total dose radiation and photocurrent response, has been developed for MOS transistors, circuit blocks, and integrated circuits. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer-aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second-order effects in the MOS transistor model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits.

  1. Quantum algorithms for topological and geometric analysis of data

    Science.gov (United States)

    Lloyd, Seth; Garnerone, Silvano; Zanardi, Paolo

    2016-01-01

    Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis. PMID:26806491
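
    For reference, the quantity the quantum algorithm targets can be computed classically for small complexes from the ranks of the boundary maps, via β_k = dim C_k − rank ∂_k − rank ∂_{k+1}; a toy example for a hollow triangle (three vertices, three edges, no face):

        # Classical reference computation of Betti numbers for a toy
        # simplicial complex (a hollow triangle); d_0 = 0 by convention.
        import numpy as np

        # boundary map d_1: edges -> vertices for edges (0,1), (0,2), (1,2)
        d1 = np.array([[-1, -1,  0],
                       [ 1,  0, -1],
                       [ 0,  1,  1]], dtype=float)

        n_vertices, n_edges = d1.shape
        rank_d1 = np.linalg.matrix_rank(d1)
        rank_d2 = 0  # no 2-simplices in a hollow triangle

        beta0 = n_vertices - 0 - rank_d1      # connected components -> 1
        beta1 = n_edges - rank_d1 - rank_d2   # independent loops    -> 1
        print(beta0, beta1)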

  2. Automated Dynamic Analysis of Ransomware: Benefits, Limitations and use for Detection

    OpenAIRE

    Sgandurra, Daniele; Muñoz-González, Luis; Mohsen, Rabih; Lupu, Emil C.

    2016-01-01

    Recent statistics show that in 2015 more than 140 million new malware samples were found. Among these, a large portion is due to ransomware, the class of malware whose specific goal is to render the victim's system unusable, in particular by encrypting important files, and then to ask the user to pay a ransom to revert the damage. Several ransomware families include sophisticated packing techniques and are hence difficult to analyse statically. We present EldeRan, a machine learning approach for d...
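
    The record is truncated, so the details are open; as a generic sketch of the stated idea (dynamic-analysis features fed to a machine learning classifier), with invented toy features rather than EldeRan's actual feature set:

        # Generic sketch: represent each sample by binary dynamic-analysis
        # features (API calls, registry/file operations observed in a
        # sandbox) and train a regularized linear classifier. Toy data only.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n_samples, n_features = 200, 40

        X = rng.integers(0, 2, size=(n_samples, n_features)).astype(float)
        # pretend feature 0 ("calls CryptEncrypt") strongly marks ransomware
        y = (X[:, 0] + rng.random(n_samples) > 1.2).astype(int)

        clf = LogisticRegression(penalty="l2", C=1.0).fit(X, y)
        print("training accuracy:", clf.score(X, y))
        print("most indicative feature:", int(np.argmax(np.abs(clf.coef_))))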

  3. Upgraded safety analysis document including operations policies, operational safety limits and policy changes. Revision 2

    International Nuclear Information System (INIS)

    Batchelor, K.

    1996-03-01

    The National Synchrotron Light Source Safety Analysis Reports (1), (2), (3) (BNL reports #51584, #52205 and #52205 (addendum)) describe the basic Environmental Safety and Health issues associated with the department's operations. They include the operating envelope for the Storage Rings as well as for the rest of the facility. These documents contain the operational limits as perceived prior to or during construction of the facility, much of which is still appropriate for current operations. However, as the machine has matured, the experimental program has grown in size, requiring more supervision in that area. Also, machine studies have either verified or modified knowledge of beam loss modes and/or radiation loss patterns around the facility. This document is written to allow for these changes in procedure or standards resulting from the current mode of operation and shall be used in conjunction with the above reports. These changes have been reviewed by the NSLS and BNL ES&H committees and approved by BNL management.

  4. Comprehensive Feature-Based Landscape Analysis of Continuous and Constrained Optimization Problems Using the R-Package flacco

    OpenAIRE

    Kerschke, Pascal

    2017-01-01

    Choosing the best-performing optimizer(s) out of a portfolio of optimization algorithms is usually a difficult and complex task. It gets even worse if the underlying functions are unknown, i.e., so-called black-box problems, and function evaluations are considered to be expensive. In the case of continuous single-objective optimization problems, Exploratory Landscape Analysis (ELA) - a sophisticated and effective approach for characterizing the landscapes of such problems by means of numeric...

  5. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges in meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general-purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools that analyze incoming events for indications of particular situations and predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines

  6. Comparison of different application systems and CT- assisted treatment planning procedures in primary endometrium cancer: Is it technically possible to include the whole uterus volume in the volume treated by brachytherapy

    International Nuclear Information System (INIS)

    Mock, U.; Knocke, Th.; Fellner, C.; Poetter, R.

    1996-01-01

    Purpose: Brachytherapy is regarded as the definitive component of treatment for inoperable patients with endometrium cancer. In published series the whole uterus has been claimed to represent the target volume, independently of the individual tumor spread. The purpose of this work is to compare different planning and application procedures and to analyze the target volumes (whole uterus) and treatment volumes, and their respective relation, under the given conditions. Material and Methods: In ten patients with primary endometrium cancer the correlation between target and treatment volume was analysed based on standard one-channel applicators or individual Heyman applicators. A comparative analysis of target volumes resulting from two different planning procedures for Heyman applications was performed. CT was carried out after insertion of the Heyman ovoids. Target volume was estimated by measuring the uterus size at different cross-sections of the CT images. Dose calculation was performed with (PLATO system) or without (NPS system) transferring these data directly to the planning system. We report on the differences in treatment volumes resulting from the two application and planning systems. Results: The mean uterus volume was 180 cm³ (range 57 cm³ to 316 cm³). Four out of 10 patients had an asymmetric uterus configuration with a side difference (in the longitudinal or transversal direction) of more than 1 cm. On average 70% (range 48-95%) of the uterus volume was included in the treatment volume when Heyman applicators were used, compared to 45% (range 25-89%) with standard one-channel applicators, an improvement of 25% (range 11%-35%). By utilizing the more sophisticated planning procedure, more adequate coverage of the uterus volume was achieved in five out of ten patients; the treated volume increased on average by 20% (range 11%-32%). In three cases changes in the irradiation volume were less than 5%. In

  7. Coupled dynamic analysis of subsea pipe laying operations

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Danilo Machado Lawinscky da; Jacob, Breno Pinheiro [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Civil. Lab. of Computational Methods and Offshore Systems

    2009-12-19

    It is recognized that deep water offshore oil exploitation activities require the use of sophisticated computational tools to predict the behavior of floating offshore systems under the action of environmental loads. These computational tools should be able to perform coupled dynamic analyses, considering the non-linear interaction of the hydrodynamic behavior of the platform with the structural/hydrodynamic behavior of the mooring lines and risers, represented by finite element models. The use of such sophisticated computational tools becomes mandatory not only for the design of production platforms, but also for the simulation of offshore installation operations. For instance, in the installation of submarine pipelines, the wall thickness design may not be governed by the pressure containment requirements of the pipeline during operation, but by the installation process, specifically the combined action of bending, tension and hydrostatic pressure acting on a pipeline that is also subjected to the motions of the lay barge. Therefore, the objective of this work is to present the results of numerical simulations of S-lay installation procedures using a computational tool that performs dynamic analysis coupling the structural behavior of the pipe with the hydrodynamic behavior of the vessel motions under environmental conditions. This tool rigorously considers the contact between the pipeline and its supports (lay barge, stinger, seabed). The results are compared to traditional pipe laying simulations based on RAO motions. (author)

  8. INCLUDING RISK IN ECONOMIC FEASIBILITY ANALYSIS:A STOCHASTIC SIMULATION MODEL FOR BLUEBERRY INVESTMENT DECISIONS IN CHILE

    Directory of Open Access Journals (Sweden)

    GERMÁN LOBOS

    2015-12-01

    The traditional net present value (NPV) method for analyzing the economic profitability of an investment (based on a deterministic approach) does not adequately represent the implicit risk associated with different but correlated input variables. Using a stochastic simulation approach for evaluating the profitability of blueberry (Vaccinium corymbosum L.) production in Chile, the objective of this study is to illustrate the complexity of including risk in economic feasibility analysis when the project is subject to several correlated risks. The results of the simulation analysis suggest that ignoring the intratemporal correlation between input variables underestimates the risk associated with investment decisions. The methodological contribution of this study is to illustrate the complexity of the interrelationships between uncertain variables and their impact on the viability of this type of business in Chile. The steps of the economic viability analysis were as follows. First, adjusted probability distributions for the stochastic input variables (SIV) were simulated and validated. Second, the random values of the SIV were used to calculate random values of variables such as production, revenues, costs, depreciation, taxes and net cash flows. Third, the complete stochastic model was simulated with 10,000 iterations using the random values of the SIV; this provided the information needed to estimate the probability distributions of the stochastic output variables (SOV), such as the net present value, internal rate of return, value at risk, average cost of production, contribution margin and return on capital. Fourth, the results of the complete stochastic model simulation were used to analyze alternative scenarios and to provide decision makers with results in the form of probabilities, probability distributions, and probabilistic forecasts for the SOV. The main conclusion is that this project is a profitable alternative investment in fruit trees in
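
    A minimal sketch of the stochastic-NPV approach described above, with invented numbers and a plain multivariate-normal draw standing in for the study's fitted and validated distributions:

        # Monte Carlo NPV with correlated inputs: draw correlated price and
        # yield paths, push them through a cash-flow model, and read risk
        # off the simulated NPV distribution. All figures are illustrative.
        import numpy as np

        rng = np.random.default_rng(42)
        n_sims, horizon, rate = 10_000, 10, 0.08

        mean = [4.0, 9.0]                    # price (USD/kg), yield (t/ha)
        cov = [[0.25, -0.10],                # negative price-yield correlation
               [-0.10, 1.00]]
        investment, annual_cost = 25_000.0, 12_000.0

        draws = rng.multivariate_normal(mean, cov, size=(n_sims, horizon))
        price, yld = draws[..., 0], draws[..., 1]
        cash = price * yld * 1_000 - annual_cost      # per-ha net cash flow
        disc = (1 + rate) ** -np.arange(1, horizon + 1)
        npv = cash @ disc - investment

        print(f"mean NPV: {npv.mean():,.0f}")
        print(f"P(NPV < 0): {(npv < 0).mean():.1%}")
        print(f"5% value-at-risk: {np.percentile(npv, 5):,.0f}")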

  9. Nonlinear dynamic analysis of hydrodynamically-coupled stainless steel structures

    International Nuclear Information System (INIS)

    Zhao, Y.

    1996-01-01

    Spent nuclear fuel is usually stored temporarily on the site of nuclear power plants. The spent fuel storage racks are nuclear-safety-related stainless steel structures required to be analyzed for seismic loads. When the storage pool is subjected to three-dimensional (3-D) floor seismic excitations, rack modules, stored fuel bundles, adjacent racks, pool walls, and the surrounding water are hydrodynamically coupled. Hydrodynamic coupling (HC) significantly affects the dynamic responses of the racks, which are free-standing and submerged in water within the pool. A nonlinear time-history dynamic analysis is usually needed to describe the motion of the racks, which are both geometrically and materially nonlinear. The nonlinearities include the friction resistance between the rack supporting legs and the pool floor, and various potential impacts: fuel-rack, rack-rack, and rack-pool wall. The induced HC should be included in the nonlinear dynamic analysis using the added-hydrodynamic-mass concept based on potential theory, per the US Nuclear Regulatory Commission (USNRC) acceptance criteria. To this end, finite element analysis constitutes a feasible and effective tool. However, most analysts perform somewhat simplified 1-D, 2-D, or 3-D single-rack and 2-D multiple-rack analyses. These analyses are incomplete because a 3-D single-rack model behaves quite differently from a 2-D model. Furthermore, a 3-D whole-pool multi-rack model behaves differently from a 3-D single-rack model, especially when the strong HC effects are unsymmetrical. In this paper 3-D nonlinear dynamic time-history analyses were performed in a more quantitative manner using sophisticated finite element models developed for a single rack as well as for all twelve racks in the whole pool. Typical response results due to different HC effects are determined and discussed.

  10. SNP array analysis reveals novel genomic abnormalities including copy neutral loss of heterozygosity in anaplastic oligodendrogliomas.

    Directory of Open Access Journals (Sweden)

    Ahmed Idbaih

    Anaplastic oligodendrogliomas (AOD) are rare glial tumors in adults with relatively homogeneous clinical, radiological and histological features at the time of diagnosis but dramatically varying clinical courses. Studies have identified several molecular abnormalities with clinical or biological relevance to AOD (e.g. t(1;19)(q10;p10), IDH1, IDH2, CIC and FUBP1 mutations). To better characterize the clinical and biological behavior of this tumor type, the creation of a national multicentric network, named "Prise en charge des OLigodendrogliomes Anaplasiques" (POLA), has been supported by the Institut National du Cancer (InCA). Newly diagnosed and centrally validated AOD patients and their related biological material (tumor and blood samples) were prospectively included in the POLA clinical database and tissue bank, respectively. At the molecular level, we have conducted a high-resolution single nucleotide polymorphism array analysis, which included 83 patients. Despite a careful central pathological review, AOD were found to exhibit heterogeneous genomic features. A total of 82% of the tumors exhibited a 1p/19q co-deletion, while 18% harbored a distinct chromosome pattern. Novel focal abnormalities, including homozygously deleted, amplified and disrupted regions, have been identified. Recurring copy neutral losses of heterozygosity (CNLOH) inducing the modulation of gene expression have also been discovered. CNLOH in the CDKN2A locus was associated with protein silencing in one third of the cases. In addition, a FUBP1 homozygous deletion was detected in one case, suggesting a putative tumor suppressor role of FUBP1 in AOD. Our study shows that the genomic and pathological analyses of AOD are synergistic in detecting relevant clinical and biological subgroups of AOD.

  11. An Analysis of Earth Science Data Analytics Use Cases

    Science.gov (United States)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, the ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  12. Kaplan turbine tip vortex cavitation - analysis and prevention

    Science.gov (United States)

    Motycak, L.; Skotak, A.; Kupcik, R.

    2012-11-01

    The work is focused on one type of Kaplan turbine runner cavitation - tip vortex cavitation. For a detailed description of the tip vortex, CFD analysis is used. On the basis of this analysis it is possible to estimate the intensity of the cavitating vortex core and the danger of possible cavitation pitting of the blade surface and runner chamber. The paper describes ways to avoid the pitting effect of the tip vortex: changing the geometry of the runner blade, changing the dimension of the tip clearance, and installing anti-cavitation lips. Knowledge of the shape and intensity of the tip vortex helps in designing the anti-cavitation lips in a more sophisticated way. Finally, the results of model tests of the Kaplan runner with and without anti-cavitation lips are compared with the results of the CFD analysis.

  13. Use of computational fluid dynamics codes for safety analysis of nuclear reactor systems, including containment. Summary report of a technical meeting

    International Nuclear Information System (INIS)

    2003-11-01

    Safety analysis is an important tool for justifying the safety of nuclear power plants. Typically, this type of analysis is performed by means of system computer codes with one dimensional approximation for modelling real plant systems. However, in the nuclear area there are issues for which traditional treatment using one dimensional system codes is considered inadequate for modelling local flow and heat transfer phenomena. There is therefore increasing interest in the application of three dimensional computational fluid dynamics (CFD) codes as a supplement to or in combination with system codes. There are a number of both commercial (general purpose) CFD codes as well as special codes for nuclear safety applications available. With further progress in safety analysis techniques, the increasing use of CFD codes for nuclear applications is expected. At present, the main objective with respect to CFD codes is generally to improve confidence in the available analysis tools and to achieve a more reliable approach to safety relevant issues. An exchange of views and experience can facilitate and speed up progress in the implementation of this objective. Both the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA) believed that it would be advantageous to provide a forum for such an exchange. Therefore, within the framework of the Working Group on the Analysis and Management of Accidents of the NEA's Committee on the Safety of Nuclear Installations, the IAEA and the NEA agreed to jointly organize the Technical Meeting on the Use of Computational Fluid Dynamics Codes for Safety Analysis of Reactor Systems, including Containment. The meeting was held in Pisa, Italy, from 11 to 14 November 2002. The publication constitutes the report of the Technical Meeting. It includes short summaries of the presentations that were made and of the discussions as well as conclusions and

  14. IWGFR benchmark test on signal processing for boiling noise detection, stage 2: Analysis of data from BOR-60

    International Nuclear Information System (INIS)

    Rowley, R.; Waites, C.; Macleod, I.D.

    1989-01-01

    Data from boiling experiments in the BOR-60 reactor in the USSR have been supplied by the IAEA to enable analysis techniques to be compared. The signals have been analysed at RNL using two basic techniques, high-frequency RMS analysis and pulse counting analysis, and two more sophisticated methods, pattern recognition and pulse timing analysis. All methods indicated boiling successfully; pulse counting proved more sensitive than RMS for detecting the onset of boiling. Pattern recognition shows promise as a very reliable detector, provided the background can be defined. Data from an ionisation chamber were also supplied, and there was good correlation between the neutronic and acoustic signals. (author). 25 figs, 4 tabs
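
    The two basic techniques are simple to state; the sketch below applies high-frequency RMS and threshold pulse counting to a synthetic record (toy data, not the BOR-60 signals):

        # Windowed RMS and pulse counting on a toy acoustic record in which
        # the noise level rises halfway through, mimicking boiling onset.
        import numpy as np

        rng = np.random.default_rng(1)
        fs, seconds = 50_000, 2.0
        t = np.arange(int(fs * seconds)) / fs
        signal = rng.normal(0.0, 1.0, t.size)                    # background
        signal[t > 1.0] += rng.normal(0.0, 2.0, (t > 1.0).sum()) # "boiling"

        win = 1024
        n_win = signal.size // win
        frames = signal[: n_win * win].reshape(n_win, win)
        rms = np.sqrt((frames ** 2).mean(axis=1))                # RMS per window

        threshold = 4.0 * signal[t <= 1.0].std()                 # quiet period
        pulses = (np.abs(frames) > threshold).sum(axis=1)        # pulse counts

        half = n_win // 2
        print("RMS before/after onset:", rms[:half].mean(), rms[half:].mean())
        print("pulses before/after:", pulses[:half].sum(), pulses[half:].sum())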

  15. Including climate change in energy investment decisions

    International Nuclear Information System (INIS)

    Ybema, J.R.; Boonekamp, P.G.M.; Smit, J.T.J.

    1995-08-01

    To properly take climate change into account in the analysis of energy investment decisions, it is necessary to apply decision analysis methods capable of handling the specific characteristics of climate change (large uncertainties, long time horizon). Such decision analysis methods do exist. They can explicitly include evolving uncertainties, multi-stage decisions, cumulative effects and risk-averse attitudes. Various methods are considered in this report and two have been selected: hedging calculations and sensitivity analysis. These methods are applied to illustrative examples, and their limitations are discussed. The examples are (1a) space heating and hot water for new houses from a private investor perspective and (1b) the same from a government perspective, (2) electricity production with an integrated coal gasification combined cycle (ICGCC) with or without CO2 removal, and (3) a national energy strategy to hedge against climate change. 9 figs., 21 tabs., 42 refs., 1 appendix
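
    A toy hedging calculation of the kind the report describes might compare first-stage investment options against uncertain carbon-price scenarios; all options, costs and probabilities below are invented for illustration:

        # Expected cost and worst-case regret of heating-system options under
        # three carbon-price scenarios. Placeholder figures throughout.
        import numpy as np

        options = ["gas boiler", "heat pump", "hybrid"]
        capex = np.array([4_000.0, 12_000.0, 9_000.0])

        scenarios = np.array([0.30, 0.50, 0.20])        # P(low, mid, high)
        opex = np.array([[9_000, 13_000, 20_000],       # gas boiler
                         [6_000,  6_500,  7_000],       # heat pump
                         [7_000,  8_500, 11_000]])      # hybrid

        total = capex[:, None] + opex                   # cost per scenario
        expected = total @ scenarios
        regret = total - total.min(axis=0)              # vs best-in-scenario

        for name, e, r in zip(options, expected, regret.max(axis=1)):
            print(f"{name:10s} expected cost {e:8,.0f}  worst regret {r:8,.0f}")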

  16. HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Bauerdick, Lothar; et al.

    2018-04-09

    At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.

  17. Structural analysis of jewelry from the Moche tomb of the `lady of Cao' by X-ray digital radiography

    Science.gov (United States)

    Azeredo, S. R.; Cesareo, R.; Franco, R.; Fernandez, A.; Bustamante, A.; Lopes, R. T.

    2018-04-01

    Nose ornaments from the tomb of the 'Lady of Cao', a mummified woman representative of the Moche culture and dated to the third or fourth century AD, were analyzed by X-ray digital radiography. These spectacular gold and silver jewels are among the most sophisticated metalwork ever produced in ancient America. The Moche civilization flourished along the north coast of present-day Peru, between the Andes and the Pacific Ocean, approximately between 100 and 600 AD. The Moche were very sophisticated artisans and metalsmiths, and are considered the finest producers of jewels and artifacts of the region. A portable X-ray digital radiography (XDR) system, consisting of a flat panel detector with high-resolution imaging and a mini X-ray tube, was used for the structural analysis of the Moche jewels with the aim of inferring the different methods used to join the silver and gold sheets. The radiographic analysis showed some differences in the joints of the silver and gold sheets. The presence of filler material and adhesive for joining the sheets was visible, as were silver-gold junctions without filler material (or with a material invisible in radiography). Furthermore, the technique demonstrated the advantage of using a portable XDR microsystem when the sample cannot be brought to the laboratory.

  18. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis

    Science.gov (United States)

    Greensmith, David J.

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. It performs two fundamental processes: (1) it can prepare the raw signal by several methods, and (2) it can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. In addition, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, chiefly that it creates a simplified, self-contained analysis workflow. PMID:24125908
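
    The regression-based rate measurement described above can be sketched generically: fit a line to each sliding window of the Ca trace and read off the extreme slopes. The transient below is a toy signal, not the program's actual algorithm or data:

        # Windowed linear regression over a toy Ca transient; the steepest
        # positive and negative slopes give the maximum rates of rise/decay.
        import numpy as np

        fs = 1000.0                                   # samples per second
        t = np.arange(0, 1.0, 1 / fs)
        ca = 0.1 + 0.5 * (1 - np.exp(-t / 0.02)) * np.exp(-t / 0.2)  # uM

        win = 20                                      # 20 ms regression window
        slopes = np.array([
            np.polyfit(t[i : i + win], ca[i : i + win], 1)[0]
            for i in range(ca.size - win)
        ])

        print(f"max upstroke: {slopes.max():.2f} uM/s")
        print(f"max decay:    {slopes.min():.2f} uM/s")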

  19. IGSA: Individual Gene Sets Analysis, including Enrichment and Clustering.

    Science.gov (United States)

    Wu, Lingxiang; Chen, Xiujie; Zhang, Denan; Zhang, Wubing; Liu, Lei; Ma, Hongzhe; Yang, Jingbo; Xie, Hongbo; Liu, Bo; Jin, Qing

    2016-01-01

    Analysis of gene sets has been widely applied in various high-throughput biological studies. One weakness of the traditional methods is that they neglect the heterogeneity of gene expression across samples, which may lead to the omission of some specific and important gene sets. It is also difficult for them to reflect the severity of disease or to provide expression profiles of gene sets for individuals. We developed a software application called IGSA that provides powerful analytical capabilities for gene set enrichment and sample clustering. IGSA calculates a gene set expression score for each sample and uses an accumulating clustering strategy that groups samples according to the progression of disease from mild to severe. We evaluated the performance of IGSA on gastric, pancreatic and ovarian cancer data sets. We also compared the results of IGSA for KEGG pathway enrichment with those of DAVID, GSEA, SPIA and ssGSEA, and analyzed the results of IGSA clustering under different similarity measures. Notably, IGSA proved to be more sensitive and specific in finding significant pathways, and can indicate changes in pathways related to the severity of disease. In addition, IGSA provides a significant gene set profile for each sample.
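
    IGSA's exact scoring rule is not given in the abstract; a generic per-sample gene set score of the kind it clusters on (mean z-score of member genes, with toy data) can be sketched as:

        # Per-sample gene-set score: z-score each gene across samples, then
        # average over the set's member genes. Toy matrix and membership.
        import numpy as np

        rng = np.random.default_rng(7)
        n_genes, n_samples = 100, 12
        expr = rng.normal(size=(n_genes, n_samples))
        gene_set = [3, 17, 42, 56, 80]             # indices of member genes

        z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(
            axis=1, keepdims=True)
        set_scores = z[gene_set].mean(axis=0)      # one score per sample

        order = np.argsort(set_scores)             # mild-to-severe ordering
        print(np.round(set_scores[order], 2))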

  20. Applying phylogenetic analysis to viral livestock diseases: moving beyond molecular typing.

    Science.gov (United States)

    Olvera, Alex; Busquets, Núria; Cortey, Marti; de Deus, Nilsa; Ganges, Llilianne; Núñez, José Ignacio; Peralta, Bibiana; Toskano, Jennifer; Dolz, Roser

    2010-05-01

    Changes in livestock production systems in recent years have altered the presentation of many diseases resulting in the need for more sophisticated control measures. At the same time, new molecular assays have been developed to support the diagnosis of animal viral disease. Nucleotide sequences generated by these diagnostic techniques can be used in phylogenetic analysis to infer phenotypes by sequence homology and to perform molecular epidemiology studies. In this review, some key elements of phylogenetic analysis are highlighted, such as the selection of the appropriate neutral phylogenetic marker, the proper phylogenetic method and different techniques to test the reliability of the resulting tree. Examples are given of current and future applications of phylogenetic reconstructions in viral livestock diseases.

  1. ASAP: a web-based platform for the analysis and interactive visualization of single-cell RNA-seq data.

    Science.gov (United States)

    Gardeux, Vincent; David, Fabrice P A; Shajkofci, Adrian; Schwalie, Petra C; Deplancke, Bart

    2017-10-01

    Single-cell RNA-sequencing (scRNA-seq) allows whole transcriptome profiling of thousands of individual cells, enabling the molecular exploration of tissues at the cellular level. Such analytical capacity is of great interest to many research groups in the world, yet these groups often lack the expertise to handle complex scRNA-seq datasets. We developed a fully integrated, web-based platform aimed at the complete analysis of scRNA-seq data post genome alignment: from the parsing, filtering and normalization of the input count data files, to the visual representation of the data, identification of cell clusters, differentially expressed genes (including cluster-specific marker genes), and functional gene set enrichment. This Automated Single-cell Analysis Pipeline (ASAP) combines a wide range of commonly used algorithms with sophisticated visualization tools. Compared with existing scRNA-seq analysis platforms, researchers (including those lacking computational expertise) are able to interact with the data in a straightforward fashion and in real time. Furthermore, given the overlap between scRNA-seq and bulk RNA-seq analysis workflows, ASAP should conceptually be broadly applicable to any RNA-seq dataset. As a validation, we demonstrate how we can use ASAP to simply reproduce the results from a single-cell study of 91 mouse cells involving five distinct cell types. The tool is freely available at asap.epfl.ch and R/Python scripts are available at github.com/DeplanckeLab/ASAP.
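
    ASAP itself is web based, but the core steps it automates (normalization, dimensionality reduction, clustering) can be sketched with generic tools; the 91 x 500 toy count matrix and k = 5 below echo the validation study's cell and type counts but are otherwise invented:

        # Minimal scRNA-seq pipeline sketch: library-size normalization,
        # log transform, PCA embedding, k-means clustering. Toy data only.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        counts = rng.poisson(2.0, size=(91, 500)).astype(float)  # cells x genes

        norm = counts / counts.sum(axis=1, keepdims=True) * 1e4
        logged = np.log1p(norm)

        embedding = PCA(n_components=10).fit_transform(logged)
        clusters = KMeans(n_clusters=5, n_init=10,
                          random_state=0).fit_predict(embedding)

        print(np.bincount(clusters))   # cells per putative cell type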

  2. Purchasing portfolio usage and purchasing sophistication

    NARCIS (Netherlands)

    Gelderman, C.J.; Weele, van A.J.

    2005-01-01

    Purchasing portfolio models have caused considerable controversy in literature. Many advantages and disadvantages have been put forward, revealing a strong disagreement on the merits of portfolio models. This study addresses the question whether or not the use of purchasing portfolio models should

  3. Sophisticated digestive systems in early arthropods.

    Science.gov (United States)

    Vannier, Jean; Liu, Jianni; Lerosey-Aubril, Rudy; Vinther, Jakob; Daley, Allison C

    2014-05-02

    Understanding the way in which animals diversified and radiated during their early evolutionary history remains one of the most captivating of scientific challenges. Integral to this is the 'Cambrian explosion', which records the rapid emergence of most animal phyla, and for which the triggering and accelerating factors, whether environmental or biological, are still unclear. Here we describe exceptionally well-preserved complex digestive organs in early arthropods from the early Cambrian of China and Greenland with functional similarities to certain modern crustaceans and trace these structures through the early evolutionary lineage of fossil arthropods. These digestive structures are assumed to have allowed for more efficient digestion and metabolism, promoting carnivory and macrophagy in early arthropods via predation or scavenging. This key innovation may have been of critical importance in the radiation and ecological success of Arthropoda, which has been the most diverse and abundant invertebrate phylum since the Cambrian.

  4. Butler's sophisticated constructivism: A critical assessment

    NARCIS (Netherlands)

    Vasterling, V.L.M.

    1999-01-01

    This paper aims to investigate whether and in what respects the conceptions of the body and of agency that Judith Butler develops in Bodies That Matter are useful contributions to feminist theory. The discussion focuses on the clarification and critical assessment of the arguments Butler presents to

  5. UPDG: Utilities package for data analysis of Pooled DNA GWAS

    Directory of Open Access Journals (Sweden)

    Ho Daniel WH

    2012-01-01

    Background: Despite being a well-established strategy for cost reduction in disease gene mapping, the pooled DNA association study is much less popular than the individual DNA approach. This is especially true for pooled DNA genomewide association studies (GWAS), for which very few computing resources have been developed for data analysis. This motivated the development of UPDG (Utilities package for data analysis of Pooled DNA GWAS). Results: UPDG represents a generalized framework for the data analysis of pooled DNA GWAS, integrating Unix/Linux shell operations, Perl programs and R scripts. Starting from the raw intensity data of a GWAS, UPDG performs the following tasks in a stepwise manner: raw data manipulation, correction for allelic preferential amplification, normalization, nested analysis of variance for genetic association testing, and summarization of analysis results. Detailed instructions, procedures and commands are provided in the comprehensive user manual, which describes the whole process from preliminary software installation to final outcome acquisition. An example dataset (input files and sample output files) is also included in the package so that users can easily familiarize themselves with the data file formats, working procedures and expected output. UPDG is therefore especially useful for users with some computer knowledge, but without a sophisticated programming background. Conclusions: UPDG provides a free, simple and platform-independent one-stop service to scientists working on pooled DNA GWAS data analysis, but with less advanced programming knowledge. It is our vision and mission to reduce the hindrance to performing data analysis of pooled DNA GWAS through our contribution of UPDG. More importantly, we hope to promote the popularity of pooled DNA GWAS, which is a very useful research strategy.
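
    The correction step mentioned above is commonly done by estimating an amplification ratio k from individual heterozygotes and folding it into the pooled allele-frequency estimate; a sketch of that standard idea (not UPDG's actual code):

        # Pooled allele-frequency estimate corrected for preferential
        # amplification. k is the mean A/B intensity ratio across known
        # heterozygous individuals (1.0 for an unbiased assay). Toy values.
        import numpy as np

        k = 1.3

        # raw allele intensities for one SNP in several DNA pools
        A = np.array([5200.0, 4800.0, 6100.0])
        B = np.array([4700.0, 5600.0, 3900.0])

        p_hat = A / (A + k * B)    # corrected pooled frequency of allele A
        print(np.round(p_hat, 3))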

  6. Do sophisticated epistemic beliefs predict meaningful learning? Findings from a structural equation model of undergraduate biology learning

    Science.gov (United States)

    Lee, Silvia Wen-Yu; Liang, Jyh-Chong; Tsai, Chin-Chung

    2016-10-01

    This study investigated the relationships among college students' epistemic beliefs in biology (EBB), conceptions of learning biology (COLB), and strategies of learning biology (SLB). EBB includes four dimensions, namely 'multiple-source,' 'uncertainty,' 'development,' and 'justification.' COLB is further divided into 'constructivist' and 'reproductive' conceptions, while SLB represents deep strategies and surface learning strategies. Questionnaire responses were gathered from 303 college students. The results of the confirmatory factor analysis and structural equation modelling showed acceptable model fits. Mediation testing further revealed two paths with complete mediation. In sum, students' epistemic beliefs of 'uncertainty' and 'justification' in biology were statistically significant in explaining the constructivist and reproductive COLB, respectively; and 'uncertainty' was statistically significant in explaining the deep SLB as well. The results of mediation testing further revealed that 'uncertainty' predicted surface strategies through the mediation of 'reproductive' conceptions; and the relationship between 'justification' and deep strategies was mediated by 'constructivist' COLB. This study provides evidence for the essential roles some epistemic beliefs play in predicting students' learning.

  7. Utility of a Systematic Approach to Teaching Photographic Nasal Analysis to Otolaryngology Residents.

    Science.gov (United States)

    Robitschek, Jon; Dresner, Harley; Hilger, Peter

    2017-12-01

    Photographic nasal analysis constitutes a critical step along the path toward accurate diagnosis and precise surgical planning in rhinoplasty. The learned process by which one assesses photographs, analyzes relevant anatomical landmarks, and generates a global view of the nasal aesthetic is less widely described. The objectives were to discern the common pitfalls in performing photographic nasal analysis and to quantify the utility of a systematic approach model in teaching photographic nasal analysis to otolaryngology residents. This prospective observational study included 20 participants from a university-based otolaryngology residency program. The control and intervention groups underwent baseline graded assessment of 3 patients. The intervention group received instruction on a systematic approach model for nasal analysis, and both groups underwent postintervention testing at 10 weeks. Data were collected from October 1, 2015, through June 1, 2016. A 10-minute, 11-slide presentation provided instruction on a systematic approach to nasal analysis to the intervention group. The main outcome measure was graded photographic nasal analysis using a binary 18-point system. The 20 otolaryngology residents (15 men and 5 women; age range, 24-34 years) were adept at mentioning dorsal deviation and dorsal profile, with focused descriptions of tip angle and contour. Areas commonly omitted by residents included verification of the Frankfort plane, position of the lower lateral crura, radix position, and the ratio of the ala to the tip lobule. The intervention group demonstrated immediate improvement after instruction on the teaching model, its mean (SD) score rising from a baseline of 7.5 (2.7) to 10.3 (2.5) at postintervention testing (P …). Otolaryngology residents demonstrated proficiency at incorporating nasal deviation, tip angle, and dorsal profile contour into their nasal analysis. They often omitted verification of the Frankfort plane, position of lower lateral crura, radix depth, and ala-to-tip lobule

  8. Screening of Available Tools for Dynamic Mooring Analysis of Wave Energy Converters

    Directory of Open Access Journals (Sweden)

    Jonas Bjerg Thomsen

    2017-06-01

    The focus on alternative energy sources has increased significantly throughout the last few decades, leading to considerable development in the wave energy sector. In spite of this, the sector cannot yet be considered commercialized, and many challenges still exist, among them the mooring of floating wave energy converters. Different methods for the assessment and design of mooring systems have been described, covering simple quasi-static analysis as well as more advanced and sophisticated dynamic analysis. Design standards for mooring systems already exist, and new ones are being developed specifically for wave energy converter moorings, which places additional requirements on the chosen tools, since these have often been aimed at other offshore sectors. The present analysis assesses a number of relevant commercial software packages for full dynamic mooring analysis in order to highlight their advantages and drawbacks. The focus of the assessment is to ensure that the software packages are capable of fulfilling the modelling requirements defined in the design standards, thereby ensuring that the analysis can be used to obtain a certified mooring system. Based on the initial assessment, the two software packages DeepC and OrcaFlex were found to best suit the requirements. They were therefore used in a case study to evaluate motion and mooring load response, and the results are compared in order to provide guidelines on which software package to choose. In the present study, the OrcaFlex code was found to satisfy all requirements.

  9. Impactos da sofisticação logística de empresas industriais nas motivações para terceirização Impact of industrial companies' sophisticated logistics on outsourcing

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2004-12-01

    Full Text Available The objective of this research is to evaluate how the different dimensions of sophistication in the logistics organization of Brazilian industrial companies affect their motivations for outsourcing logistics activities. To this end, based on a review of the literature, variables relating to the sophistication of the logistics organization and to the main reasons behind the outsourcing decision were defined and operationalized. Questionnaires were mailed to 218 industrial companies listed in the ranking of Exame magazine. From the 93 questionnaires returned, it was possible to identify two distinct groups of companies with different motivations for outsourcing: (1) companies with higher levels of formal organization and low levels of information technology adoption, and (2) companies with lower levels of formal organization and intensive adoption of information technology. The findings are discussed from the standpoint of opportunities for logistics service providers to position themselves more adequately when offering their services to these two groups of companies.

  10. Rapid analysis of perchlorate in drinking water at parts per billion levels using microchip electrophoresis.

    Science.gov (United States)

    Gertsch, Jana C; Noblitt, Scott D; Cropek, Donald M; Henry, Charles S

    2010-05-01

    A microchip capillary electrophoresis (MCE) system has been developed for the determination of perchlorate in drinking water. The United States Environmental Protection Agency (USEPA) recently proposed a health advisory limit for perchlorate in drinking water of 15 parts per billion (ppb), a level requiring large, sophisticated instrumentation, such as ion chromatography coupled with mass spectrometry (IC-MS), for detection. An inexpensive, portable system is desired for routine online monitoring applications of perchlorate in drinking water. Here, we present an MCE method using contact conductivity detection for perchlorate determination. The method has several advantages, including reduced analysis times relative to IC, inherent portability, high selectivity, and minimal sample pretreatment. Resolution of perchlorate from more abundant ions was achieved using zwitterionic, sulfobetaine surfactants, N-hexadecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (HDAPS) and N-tetradecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (TDAPS). The system performance and the optimization of the separation chemistry, including the use of these surfactants to resolve perchlorate from other anions, are discussed in this work. The system is capable of detection limits of 3.4 +/- 1.8 ppb (n = 6) in standards and 5.6 +/- 1.7 ppb (n = 6) in drinking water.

  11. Towards understanding of magnetization reversal in Nd-Fe-B nanocomposites: analysis by high-throughput micromagnetic simulations

    Science.gov (United States)

    Erokhin, Sergey; Berkov, Dmitry; Ito, Masaaki; Kato, Akira; Yano, Masao; Michels, Andreas

    2018-03-01

    We demonstrate how micromagnetic simulations can be employed in order to characterize and analyze the magnetic microstructure of nanocomposites. For the example of nanocrystalline Nd-Fe-B, which is a potential material for future permanent-magnet applications, we have compared three different models for the micromagnetic analysis of this material class: (i) a description of the nanocomposite microstructure in terms of Stoner-Wohlfarth particles with and without the magnetodipolar interaction; (ii) a model based on the core-shell representation of the nanograins; (iii) the latter model including a contribution of superparamagnetic clusters. The relevant parameter spaces have been systematically scanned with the aim of establishing which micromagnetic approach most adequately describes the experimental data for this material. According to our results, only the last, most sophisticated model is able to provide an excellent agreement with the measured hysteresis loop. The presented methodology is generally applicable to multiphase magnetic nanocomposites, and it highlights the complex interrelationship between the microstructure, magnetic interactions, and the macroscopic magnetic properties.
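
    Model (i) above, non-interacting Stoner-Wohlfarth particles, can be illustrated with a minimal single-particle hysteresis computation: the local energy minimum is tracked as the applied field sweeps, which reproduces irreversible switching. The sketch below is a generic textbook version with hypothetical parameters, not the authors' high-throughput code.

    ```python
    import numpy as np

    def sw_hysteresis(psi_deg=45.0, n=200, relax_steps=500, lr=0.05):
        """Hysteresis loop of a single Stoner-Wohlfarth particle whose easy
        axis is tilted by psi relative to the applied field.  Reduced energy:
            e(theta) = sin^2(theta - psi) - 2*h*cos(theta),  h = H / H_K.
        Tracking the *local* minimum as h sweeps reproduces the switching."""
        psi = np.radians(psi_deg)
        h_sweep = np.concatenate([np.linspace(2, -2, n), np.linspace(-2, 2, n)])
        theta, m = psi, []
        for h in h_sweep:
            for _ in range(relax_steps):   # relax into the nearest minimum
                grad = np.sin(2 * (theta - psi)) + 2.0 * h * np.sin(theta)
                theta -= lr * grad
            m.append(np.cos(theta))        # magnetization along the field
        return h_sweep, np.array(m)

    h, m = sw_hysteresis()
    h_c = abs(h[np.argmax(m < 0)])         # first reversal on the down sweep
    print(f"switching field ~ {h_c:.2f} * H_K")   # ~0.5 for a 45-degree tilt
    ```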

  12. Advanced Simulation and Optimization Tools for Dynamic Aperture of Non-scaling FFAGs and Accelerators including Modern User Interfaces

    International Nuclear Information System (INIS)

    Mills, F.; Makino, K.; Berz, M.; Johnstone, C.

    2010-01-01

    With the U.S. experimental effort in HEP largely located at laboratories supporting the operations of large, highly specialized accelerators, colliding beam facilities, and detector facilities, the understanding and prediction of high energy particle accelerators becomes critical to the success, overall, of the DOE HEP program. One area in which small businesses can contribute to the ongoing success of the U.S. program in HEP is through innovations in computer techniques and sophistication in the modeling of high-energy accelerators. Accelerator modeling at these facilities is performed by experts with the product generally highly specific and representative only of in-house accelerators or special-interest accelerator problems. Development of new types of accelerators like FFAGs with their wide choices of parameter modifications, complicated fields, and the simultaneous need to efficiently handle very large emittance beams requires the availability of new simulation environments to assure predictability in operation. In this, ease of use and interfaces are critical to realizing a successful model, or optimization of a new design or working parameters of machines. In Phase I, various core modules for the design and analysis of FFAGs were developed and Graphical User Interfaces (GUI) have been investigated instead of the more general yet less easily manageable console-type output COSY provides.

  13. Advanced Simulation and Optimization Tools for Dynamic Aperture of Non-scaling FFAGs and Accelerators including Modern User Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Mills, F.; Makino, Kyoko; Berz, Martin; Johnstone, C.

    2010-09-01

    With the U.S. experimental effort in HEP largely located at laboratories supporting the operations of large, highly specialized accelerators, colliding beam facilities, and detector facilities, the understanding and prediction of high energy particle accelerators becomes critical to the success, overall, of the DOE HEP program. One area in which small businesses can contribute to the ongoing success of the U.S. program in HEP is through innovations in computer techniques and sophistication in the modeling of high-energy accelerators. Accelerator modeling at these facilities is performed by experts with the product generally highly specific and representative only of in-house accelerators or special-interest accelerator problems. Development of new types of accelerators like FFAGs with their wide choices of parameter modifications, complicated fields, and the simultaneous need to efficiently handle very large emittance beams requires the availability of new simulation environments to assure predictability in operation. In this, ease of use and interfaces are critical to realizing a successful model, or optimization of a new design or working parameters of machines. In Phase I, various core modules for the design and analysis of FFAGs were developed and Graphical User Interfaces (GUI) have been investigated instead of the more general yet less easily manageable console-type output COSY provides.

  14. Application of a new model for groundwater age distributions: Modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin, California

    Science.gov (United States)

    Ginn, T.R.; Woolfenden, L.

    2002-01-01

    A project for the modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin aquifer in California is discussed. The Rialto-Colton aquifer has been divided into four primary and significant flowpaths following the general direction of groundwater flow from NW to SE. The introductory investigation includes sophisticated chemical reaction modeling with highly simplified flow path simulation. A comprehensive reactive transport model with the established set of geochemical reactions over the whole aquifer will also be developed to treat both reactions and transport realistically. This will be accomplished using HBGC123D implemented with an isotopic calculation step to compute the carbon-14 (14C) and stable carbon-13 (13C) contents of the water. The computed carbon contents will also be calibrated against the measured carbon contents to assess the amount of imported recharge into the Linden pond.
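
    For context on the isotopic calculation step mentioned above: measured or computed 14C activities are conventionally converted to radiocarbon ages with the standard decay relation using the Libby mean life of 8033 years. The one-liner below shows only that relation; it is not the project's HBGC123D implementation, and groundwater dating in practice requires additional geochemical corrections.

    ```python
    import math

    def conventional_radiocarbon_age(a_sample, a_modern=1.0):
        """Conventional radiocarbon age t = -8033 * ln(A / A0) in years,
        where 8033 yr is the Libby mean life and A is the normalized
        14C activity of the sample (e.g. of dissolved inorganic carbon)."""
        return -8033.0 * math.log(a_sample / a_modern)

    print(conventional_radiocarbon_age(0.60))   # ~4100 yr for 60 pMC
    ```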

  15. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings, including lack of modeling power, inability to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems, and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
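
    As a toy illustration of the discrete-time idea described above, the sketch below discretizes each component's exponential failure time into time slices and enumerates a two-component redundant (AND-gate) system, in the spirit of a discrete-time BN node computed from its parents. All rates and slice counts are hypothetical.

    ```python
    import numpy as np

    def discrete_failure_pmf(rate_per_h, dt_h, n_slices):
        """P(component fails in slice i), i = 0..n-1, plus a final entry for
        'survives the whole horizon', from a discretized exponential."""
        edges = np.arange(n_slices + 1) * dt_h
        cdf = 1.0 - np.exp(-rate_per_h * edges)
        return np.append(np.diff(cdf), 1.0 - cdf[-1])

    def and_gate_unreliability(rate_a, rate_b, dt_h, n_slices):
        """Two independent components in parallel (AND gate): the system has
        failed within the horizon iff both components have.  Enumerate the
        joint pmf, BN-style (child distribution from its parents)."""
        pa = discrete_failure_pmf(rate_a, dt_h, n_slices)
        pb = discrete_failure_pmf(rate_b, dt_h, n_slices)
        p = 0.0
        for i in range(n_slices):        # slice in which A fails
            for j in range(n_slices):    # slice in which B fails
                p += pa[i] * pb[j]       # both failed within the horizon
        return p

    print(and_gate_unreliability(rate_a=1e-3, rate_b=2e-3,
                                 dt_h=100.0, n_slices=50))
    ```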

  16. Method in analysis of CdZnTe γ spectrum with artificial neural network

    International Nuclear Information System (INIS)

    Ai Xianyun; Wei Yixiang; Xiao Wuyun

    2005-01-01

    The analysis of gamma-ray spectra to identify lines and their intensities usually requires expert knowledge and time-consuming calculations with complex fitting functions. CdZnTe detectors often exhibit asymmetric peak shapes, particularly at high energies, making peak-fitting methods and sophisticated isotope identification programs difficult to use. This paper investigates the use of a neural network to process gamma spectra measured with a CdZnTe detector to verify nuclear materials. Results show that the neural network method offers advantages, in particular when large low-energy peak tailings are observed. (authors)
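
    A minimal sketch of the approach, assuming a feed-forward network trained on raw channel counts: synthetic CdZnTe-like spectra with a low-energy tail are generated and classified with scikit-learn's MLPClassifier. The spectral model, peak positions, and network size are hypothetical stand-ins, not the authors' setup.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    N_CH = 128                                     # spectrum channels

    def synth_spectrum(peak_ch):
        """Toy CdZnTe-like spectrum: Gaussian peak with a low-energy tail
        (hole trapping) on a falling background, plus Poisson noise."""
        ch = np.arange(N_CH)
        peak = np.exp(-0.5 * ((ch - peak_ch) / 3.0) ** 2)
        tail = np.where(ch < peak_ch, 0.3 * np.exp((ch - peak_ch) / 10.0), 0.0)
        background = 5.0 * np.exp(-ch / 60.0)
        counts = rng.poisson(200.0 * (peak + tail) + 50.0 * background)
        return counts / counts.max()               # crude normalization

    # Two hypothetical "isotopes" distinguished only by peak position.
    peak_positions = rng.choice([40, 90], size=600)
    X = np.array([synth_spectrum(p) for p in peak_positions])
    y = (peak_positions == 90).astype(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    ```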

  17. Kaplan turbine tip vortex cavitation – analysis and prevention

    International Nuclear Information System (INIS)

    Motycak, L; Skotak, A; Kupcik, R

    2012-01-01

    The work is focused on one type of Kaplan turbine runner cavitation - tip vortex cavitation. For a detailed description of the tip vortex, CFD analysis is used. On the basis of this analysis it is possible to estimate the intensity of the cavitating vortex core and the danger of possible cavitation pitting of the blade surface and runner chamber. In the paper, ways to avoid the pitting effect of the tip vortex are described. In order to protect the blade surface against pitting, the following possibilities are discussed: changing the geometry of the runner blade, changing the dimension of the tip clearance, and finally installing anti-cavitation lips. Knowledge of the shape and intensity of the tip vortex helps to design the anti-cavitation lips in a more sophisticated way. Finally, the results of model tests of the Kaplan runner with and without anti-cavitation lips are compared with the results of the CFD analysis.

  18. Truck Drivers And Risk Of STDs Including HIV

    Directory of Open Access Journals (Sweden)

    Bansal R.K

    1995-01-01

    Full Text Available Research Question: Are long-distance truck drivers at a higher risk of contracting and transmitting STDs, including HIV? Objectives: (i) To study the degree of knowledge of HIV and AIDS among long-distance truck drivers. (ii) Assess their sexual behaviour, including condom use. (iii) Explore their prevailing social influences and substance abuse patterns. (iv) Explore their treatment-seeking behaviour as regards STDs. (v) Deduce their risk of contracting and transmitting STDs, including HIV. Study Design: Cross-sectional interview. Setting: Transport Nagar, Indore (M.P.). Participants: 210 senior drivers (first drivers) and 210 junior drivers (second drivers). Study Variables: Extra-marital sexual intercourse, condom usage, past and present history of STDs, treatment and counseling, substance abuse, social-cultural milieu. Outcome Variables: Risk of contraction of STDs. Statistical Analysis: Univariate analysis. Results: 94% of the drivers were totally ignorant about AIDS. 82.9% and 43.8% of the senior and junior drivers, respectively, had a history of extra-marital sex, and of these only 2 regularly used condoms. 13.8% and 3.3% of the senior and junior drivers had a past or present history suggestive of STD infection. Alcohol and opium were regularly used by them. Conclusion: The studied drivers are at a high risk of contracting and transmitting STDs, including HIV.

  19. Evolutionary diversity of bile salts in reptiles and mammals, including analysis of ancient human and extinct giant ground sloth coprolites

    Science.gov (United States)

    2010-01-01

    Background Bile salts are the major end-metabolites of cholesterol and are also important in lipid and protein digestion and in influencing the intestinal microflora. We greatly extend prior surveys of bile salt diversity in both reptiles and mammals, including analysis of 8,000 year old human coprolites and coprolites from the extinct Shasta ground sloth (Nothrotherium shastense). Results While there is significant variation of bile salts across species, bile salt profiles are generally stable within families and often within orders of reptiles and mammals, and do not directly correlate with differences in diet. The variation of bile salts generally accords with current molecular phylogenies of reptiles and mammals, including more recent groupings of squamate reptiles. For mammals, the most unusual finding was that the Paenungulates (elephants, manatees, and the rock hyrax) have a very different bile salt profile from the Rufous sengi and South American aardvark, two other mammals classified with Paenungulates in the cohort Afrotheria in molecular phylogenies. Analyses of the approximately 8,000 year old human coprolites yielded a bile salt profile very similar to that found in modern human feces. Analysis of the Shasta ground sloth coprolites (approximately 12,000 years old) showed the predominant presence of glycine-conjugated bile acids, similar to analyses of bile and feces of living sloths, in addition to a complex mixture of plant sterols and stanols expected from an herbivorous diet. Conclusions The bile salt synthetic pathway has become longer and more complex throughout vertebrate evolution, with some bile salt modifications only found within single groups such as marsupials. Analysis of the evolution of bile salt structures in different species provides a potentially rich model system for the evolution of a complex biochemical pathway in vertebrates. Our results also demonstrate the stability of bile salts in coprolites preserved in arid climates.

  20. Evolutionary diversity of bile salts in reptiles and mammals, including analysis of ancient human and extinct giant ground sloth coprolites

    Directory of Open Access Journals (Sweden)

    Hofmann Alan F

    2010-05-01

    Full Text Available Abstract Background Bile salts are the major end-metabolites of cholesterol and are also important in lipid and protein digestion and in influencing the intestinal microflora. We greatly extend prior surveys of bile salt diversity in both reptiles and mammals, including analysis of 8,000 year old human coprolites and coprolites from the extinct Shasta ground sloth (Nothrotherium shastense). Results While there is significant variation of bile salts across species, bile salt profiles are generally stable within families and often within orders of reptiles and mammals, and do not directly correlate with differences in diet. The variation of bile salts generally accords with current molecular phylogenies of reptiles and mammals, including more recent groupings of squamate reptiles. For mammals, the most unusual finding was that the Paenungulates (elephants, manatees, and the rock hyrax) have a very different bile salt profile from the Rufous sengi and South American aardvark, two other mammals classified with Paenungulates in the cohort Afrotheria in molecular phylogenies. Analyses of the approximately 8,000 year old human coprolites yielded a bile salt profile very similar to that found in modern human feces. Analysis of the Shasta ground sloth coprolites (approximately 12,000 years old) showed the predominant presence of glycine-conjugated bile acids, similar to analyses of bile and feces of living sloths, in addition to a complex mixture of plant sterols and stanols expected from an herbivorous diet. Conclusions The bile salt synthetic pathway has become longer and more complex throughout vertebrate evolution, with some bile salt modifications only found within single groups such as marsupials. Analysis of the evolution of bile salt structures in different species provides a potentially rich model system for the evolution of a complex biochemical pathway in vertebrates. Our results also demonstrate the stability of bile salts in coprolites preserved in arid climates.

  1. Design and simulation of a fast Josephson junction on-chip gated clock for frequency and time analysis

    International Nuclear Information System (INIS)

    Ruby, R.C.

    1991-01-01

    This paper reports that as the sophistication and speed of digital communication systems increase, there is a corresponding demand for more sophisticated and faster measurement instruments. One such instrument new on the market is the HP 5371A Frequency and Time Interval Analyzer (FTIA). Such an instrument is analogous to a conventional oscilloscope: whereas the oscilloscope measures waveform amplitudes as a function of time, the FTIA measures phase, frequency, or timing events as functions of time. These applications are useful in such diverse areas as spread-spectrum radar, chirp filter designs, disk-head evaluation, and timing jitter analysis. The on-chip clock designed for this application uses a single Josephson junction as the clock and a resonator circuit to fix the frequency. A zero-crossing detector is used to start and stop the clock, an SFQ counter is used to count the pulses generated by the clock, and a reset circuit is used to reset the clock. Extensive simulations and modeling have been done based on measured values obtained from our Nb/Al2O3/Al/Nb process

  2. Survival analysis and classification methods for forest fire size.

    Science.gov (United States)

    Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G

    2018-01-01

    Factors affecting the wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather on the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of the initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on an a priori assessment of fire growth potential. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of the fire suppression effect under such circumstances.
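
    A minimal sketch of the two-stage analysis described above, on synthetic data: a logistic classifier for whether a fire grows beyond its initial-assessment size, followed by a Cox proportional hazards fit on the growth group, with final fire size playing the role of the 'time' variable. Column names, coefficients, and distributions are hypothetical; the sketch uses scikit-learn and the lifelines package rather than the authors' code.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "fire_weather_index": rng.gamma(2.0, 5.0, n),  # hypothetical covariates
        "fuel_grass": rng.integers(0, 2, n),           # 1 = grass fuel type
    })

    # Stage 1: classify whether the fire grows past its initial-assessment size.
    logit = 0.08 * df["fire_weather_index"] - 0.5 * df["fuel_grass"] - 1.0
    df["grew"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
    clf = LogisticRegression().fit(
        df[["fire_weather_index", "fuel_grass"]], df["grew"])

    # Stage 2: Cox PH on the growth group; final size plays the role of 'time',
    # with reaching the "being held" state as the (always observed) event.
    grow = df[df["grew"]].copy()
    grow["final_size_ha"] = rng.exponential(
        np.exp(0.03 * grow["fire_weather_index"].to_numpy()), size=len(grow))
    grow["held"] = 1
    cph = CoxPHFitter()
    cph.fit(grow[["fire_weather_index", "fuel_grass", "final_size_ha", "held"]],
            duration_col="final_size_ha", event_col="held")
    cph.print_summary()
    ```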

  3. Experience of the use of γ photon activation analysis for the determination of oxygen in sodium

    International Nuclear Information System (INIS)

    Hislop, J.S.; Wood, D.A.; Thompson, R.

    1981-01-01

    The use of γ photon activation analysis for determination of the oxygen content of sodium in an experimental rig used for evaluation of electrochemical oxygen meters is described. A sampling procedure has been developed, using a thin walled nickel tube to act both as the sample collector and irradiation container, which does not require the sophisticated sampling facilities necessary when using more conventional methods of analysis. Results have been obtained for oxygen content of sodium over the nominal temperature range 125-250 °C and the resulting oxygen solubility relationship compared with literature values. Good agreement has been obtained with previous UK vacuum distillation data. (orig.)

  4. Analysis of advanced European nuclear fuel cycle scenarios including transmutation and economic estimates

    International Nuclear Information System (INIS)

    Rodríguez, Iván Merino; Álvarez-Velarde, Francisco; Martín-Fuertes, Francisco

    2014-01-01

    Highlights: • Four fuel cycle scenarios have been analyzed in resource and economic terms. • Scenarios involve Once-Through, Pu burning, and MA transmutation strategies. • No restrictions were found in terms of uranium and plutonium availability. • The best-case cost and the impact of its uncertainties on the LCOE were analyzed. - Abstract: Four European fuel cycle scenarios involving transmutation options (in coherence with the PATEROS and CP-ESFR EU projects) have been addressed from the point of view of resource utilization and economic estimates. Scenarios include: (i) the current fleet using Light Water Reactor (LWR) technology and an open fuel cycle, (ii) full replacement of the initial fleet with Fast Reactors (FR) burning U–Pu MOX fuel, (iii) a closed fuel cycle with Minor Actinide (MA) transmutation in a fraction of the FR fleet, and (iv) a closed fuel cycle with MA transmutation in dedicated Accelerator Driven Systems (ADS). All scenarios consider an intermediate period of GEN-III+ LWR deployment and extend for 200 years, aiming at long-term equilibrium mass flows. The simulations were made using the TR_EVOL code, capable of assessing the management of the nuclear mass streams in the scenario as well as the economics for the estimation of the levelized cost of electricity (LCOE) and other costs. Results reveal that all scenarios are feasible with respect to nuclear resource demand (natural and depleted U, and Pu). Additionally, we found, as expected, that the FR scenario considerably reduces the Pu inventory in repositories compared to the reference scenario. The elimination of the LWR MA legacy requires a maximum of a 55% fraction (i.e., a peak value of 44 FR units) of the FR fleet dedicated to transmutation (MA in MOX fuel, homogeneous transmutation) or an average of 28 ADS plants (i.e., a peak value of 51 ADS units). Regarding the economic analysis, the main usefulness of the provided economic results is for the relative comparison of the scenarios.
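
    The levelized cost of electricity used for such comparisons is, in essence, the ratio of discounted lifetime costs to discounted lifetime electricity production. A minimal sketch of that computation follows; the cash flows and discount rate are hypothetical placeholders, not values from the study.

    ```python
    def lcoe(costs, energies, discount_rate):
        """Levelized cost of electricity: discounted lifetime costs divided
        by discounted lifetime production (same discounting for both)."""
        pv_cost = sum(c / (1.0 + discount_rate) ** t
                      for t, c in enumerate(costs))
        pv_energy = sum(e / (1.0 + discount_rate) ** t
                        for t, e in enumerate(energies))
        return pv_cost / pv_energy

    # Hypothetical plant: capital outlay in year 0, then 40 years of operation.
    costs = [4.0e9] + [2.0e8] * 40      # yearly costs in $
    energy = [0.0] + [7.9e6] * 40       # yearly net production in MWh
    print(f"LCOE = {lcoe(costs, energy, 0.05):.1f} $/MWh")
    ```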

  5. Explore Earth Science Datasets for STEM with the NASA GES DISC Online Visualization and Analysis Tool, Giovanni

    Science.gov (United States)

    Liu, Z.; Acker, J.; Kempler, S.

    2016-01-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is one of twelve NASA Science Mission Directorate (SMD) Data Centers that provide Earth science data, information, and services to users around the world, including research and application scientists, students, citizen scientists, etc. The GES DISC is the home (archive) of remote sensing datasets for NASA Precipitation and Hydrology, Atmospheric Composition and Dynamics, etc. To facilitate Earth science data access, the GES DISC has been developing user-friendly data services for users at different levels in different countries. Among them, the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni, http://giovanni.gsfc.nasa.gov) allows users to explore satellite-based datasets using sophisticated analyses and visualization without downloading data and software, which makes it particularly suitable for novices (such as students) using NASA datasets in STEM (science, technology, engineering and mathematics) activities. In this presentation, we will briefly introduce Giovanni along with examples for STEM activities.

  6. Wave Data Analysis

    DEFF Research Database (Denmark)

    Alikhani, Amir; Frigaard, Peter; Burcharth, Hans F.

    1998-01-01

    The data collected over the course of the experiment must be analysed and converted into a form suitable for its intended use. Types of analysis range from simple to sophisticated, depending on the particular experiment and the needs of the researcher. In this study, three main parts of irregular wa...
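
    One way to picture the simple-to-sophisticated range mentioned above is the step from time-domain statistics to a spectral estimate of an irregular wave record. The sketch below generates a synthetic surface elevation series, estimates its spectrum with Welch's method, and derives the significant wave height Hm0 = 4*sqrt(m0); all parameters are hypothetical.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid
    from scipy.signal import welch

    rng = np.random.default_rng(2)
    fs = 4.0                                  # sampling frequency [Hz]
    t = np.arange(0.0, 1800.0, 1.0 / fs)      # 30-minute record

    # Synthetic irregular sea: random-phase harmonics standing in for data.
    freqs = np.linspace(0.05, 0.40, 50)
    amps = 0.2 * np.exp(-((freqs - 0.12) / 0.05) ** 2)
    eta = sum(a * np.cos(2 * np.pi * f * t + rng.uniform(0.0, 2.0 * np.pi))
              for a, f in zip(amps, freqs))

    # Simple, time-domain: standard deviation of the surface elevation.
    print(f"sigma_eta = {eta.std():.3f} m")

    # More sophisticated, frequency-domain: Welch spectrum and its moments.
    f, S = welch(eta, fs=fs, nperseg=1024)
    m0 = trapezoid(S, f)                      # zeroth spectral moment
    print(f"Hm0 = 4*sqrt(m0) = {4.0 * np.sqrt(m0):.3f} m")
    ```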

  7. PIVOT: platform for interactive analysis and visualization of transcriptomics data.

    Science.gov (United States)

    Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong

    2018-01-05

    Many R packages have been developed for transcriptome analysis, but their use often requires familiarity with R, and integrating the results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets, such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management, allowing non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulation. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analyses can be saved, shared and reproduced. PIVOT will allow researchers with broad backgrounds to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.

  8. The heat transfer analysis of the first stage blade

    International Nuclear Information System (INIS)

    Hong, Yong Ju; Choi, Bum Seog; Park, Byung Gyu; Yoon, Eui Soo

    2001-01-01

    To achieve higher gas turbine efficiency, the designer needs a higher Turbine Inlet Temperature (TIT). Today, modern gas turbines with sophisticated cooling schemes have TITs above 1,700 °C. In Korea, many gas turbines with TITs above 1,300 °C have been imported and are being operated, but gas at such high temperatures damages the combustor liner, the turbine blades, and other components, so maintenance of the parts exposed to high temperature is performed frequently. In this study, a heat transfer analysis of the cooling air in the internal cooling channels (network analysis) and a temperature analysis of the blade (finite element analysis) in the first-stage rotor were conducted to develop an optimal cooling passage design procedure. The results of the network analysis and the FEM analysis of the blade show that high-temperature spots occur at the leading edge, the trailing edge near the tip, and the platform. Thus, to obtain more reliable gas turbine performance, more efficient cooling should be applied at the leading edge and tip sections, and the thermal barrier coating on the blade surface plays an important role in cooling the blade.
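
    The 'network analysis' of the internal cooling channels amounts to solving a thermal resistance network for the metal node temperatures. The sketch below solves a toy three-node version (leading edge, mid-chord, trailing edge) with hypothetical resistances; it illustrates the energy-balance bookkeeping, not the study's actual network.

    ```python
    import numpy as np

    # Toy cooled-blade thermal network: three metal nodes, each exchanging
    # heat with the hot gas and the internal cooling air, plus chordwise
    # conduction between neighbours.  All temperatures [deg C] and
    # resistances [K/W] are hypothetical.
    T_gas, T_cool = 1600.0, 700.0
    R_gas = np.array([0.05, 0.08, 0.06])    # gas-side film resistance per node
    R_cool = np.array([0.10, 0.07, 0.12])   # coolant-side resistance per node
    R_cond = 0.02                           # node-to-node conduction resistance

    n = 3
    A, b = np.zeros((n, n)), np.zeros(n)
    for i in range(n):                      # energy balance at each metal node
        A[i, i] += 1.0 / R_gas[i] + 1.0 / R_cool[i]
        b[i] += T_gas / R_gas[i] + T_cool / R_cool[i]
    for i, j in [(0, 1), (1, 2)]:           # conduction links
        A[i, i] += 1.0 / R_cond; A[j, j] += 1.0 / R_cond
        A[i, j] -= 1.0 / R_cond; A[j, i] -= 1.0 / R_cond

    T_metal = np.linalg.solve(A, b)         # steady-state node temperatures
    print("metal node temperatures [deg C]:", np.round(T_metal, 1))
    ```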

  9. Wind-induced response analysis of a wind turbine tower including the blade-tower coupling effect

    Institute of Scientific and Technical Information of China (English)

    Xiao-bo CHEN; Jing LI; Jian-yun CHEN

    2009-01-01

    To analyze the wind-induced response characteristics of a wind turbine tower more accurately, the blade-tower coupling effect was investigated. The mean wind velocity over the rotating blades and tower was simulated according to wind shear effects, and the fluctuating wind velocity time series of the wind turbine were simulated by a harmonic superposition method. A dynamic finite element method (FEM) was used to calculate the wind-induced response of the blades and tower. Wind-induced responses of the tower were calculated in two cases (one included the blade-tower coupling effect; the other only added the mass of the blades and the hub at the top of the tower), and the maximal displacements at the top of the tower in the two cases were compared with each other. As a result of the blade-tower coupling effect and the total base shear of the blades, the maximal displacement in the first case increased by nearly 300% compared to the second case. For a more precise analysis, the blade-tower coupling effect and the total base shear of the blades should be considered simultaneously in the design of wind turbine towers.
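
    The harmonic superposition method used above for the fluctuating wind velocity is the spectral-representation technique: a sum of cosines whose amplitudes follow a target turbulence spectrum and whose phases are random. A minimal single-point sketch follows, assuming a Kaimal-type spectrum with hypothetical parameters.

    ```python
    import numpy as np

    def kaimal_spectrum(f, U=15.0, z=80.0, u_star=1.0):
        """Kaimal-type along-wind turbulence spectrum S_u(f) [m^2/s]."""
        x = f * z / U
        return u_star**2 * 200.0 * x / (f * (1.0 + 50.0 * x) ** (5.0 / 3.0))

    def harmonic_superposition(T=600.0, dt=0.1, n_harm=400, seed=3):
        """u'(t) = sum_i sqrt(2*S(f_i)*df) * cos(2*pi*f_i*t + phi_i) with
        independent uniform random phases (single-point version)."""
        rng = np.random.default_rng(seed)
        t = np.arange(0.0, T, dt)
        f = np.linspace(1.0 / T, 1.0 / (2.0 * dt), n_harm)  # up to Nyquist
        df = f[1] - f[0]
        phases = rng.uniform(0.0, 2.0 * np.pi, n_harm)
        amp = np.sqrt(2.0 * kaimal_spectrum(f) * df)
        u = (amp[:, None]
             * np.cos(2.0 * np.pi * f[:, None] * t + phases[:, None])).sum(axis=0)
        return t, u

    t, u = harmonic_superposition()
    print(f"std of simulated fluctuation: {u.std():.2f} m/s")
    ```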

  10. BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.

    Science.gov (United States)

    Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun

    2012-09-01

    MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data is now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key research problem in life science research. However, conducting genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential in order to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: (1) it can be readily expanded as new computational tools become available; (2) it is easily modifiable by re-configuring graphic icons in the workflow; (3) on-demand cloud computing resources can be used on an "as needed" basis; (4) distributed orchestration supports complex and long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.

  11. Hair analysis for the detection of drug use-is there potential for evasion?

    Science.gov (United States)

    Marrinan, Shanna; Roman-Urrestarazu, Andres; Naughton, Declan; Levari, Emerlinda; Collins, John; Chilcott, Robert; Bersani, Giuseppe; Corazza, Ornella

    2017-05-01

    Hair analysis for illicit substances is widely used to detect chronic drug consumption or abstention from drugs. Those being tested are increasingly seeking ways to avoid detection by using a variety of untested adulterant products (e.g., shampoos, cleansers) widely sold online. This study aims to investigate the adulteration of hair samples and to assess the effectiveness of such methods. The literature on hair test evasion was searched on PubMed/MEDLINE, PsycINFO, and Google Scholar. Given the sparse nature of peer-reviewed data on this subject, results were integrated with a qualitative assessment of online sources, including user-orientated information or commercial websites, drug fora and "chat rooms". Over four million web sources were identified in a Google search using "beat hair drug test", and the first 86 were monitored on a regular basis and considered for further analysis. Attempts to influence hair test results are widespread. Various "shampoos" and "cleansers", among other products, were found for sale that claim to remove analytes, often advertised with aggressive marketing strategies, including discounts, testimonials, and unsupported claims of efficacy. However, these products may pose serious health hazards and are also potentially toxic. In addition, many anecdotal reports suggest that Novel Psychoactive Substances are also consumed as an evasion technique, as these are not easily detectable via standard drug tests. Recent legislative changes on Novel Psychoactive Substances, such as the New Psychoactive Substances Bill in the UK, might further challenge the testing process. Further research is needed by way of chemical analysis and trials of the adulterant products sold online and their effects, as well as the development of more sophisticated hair testing techniques. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Simple gambling or sophisticated gaming? : applying game analysis methods to modern video slot machine games

    OpenAIRE

    Leppäsalko, Tero

    2017-01-01

    Slot machine games have become the most popular form of gambling worldwide. In Finland, their pervasiveness in public spaces and their popularity make them one of the most common forms of gaming. However, in game studies, gambling games are often regarded as borderline games due to the player's lack of control. In this thesis I ask whether modern video slot machine games can be considered games and, if so, what similarities there are between them and contemporary video games. To find out if m...

  13. Scientific data analysis on data-parallel platforms.

    Energy Technology Data Exchange (ETDEWEB)

    Ulmer, Craig D.; Bayer, Gregory W.; Choe, Yung Ryn; Roe, Diana C.

    2010-09-01

    As scientific computing users migrate to petaflop platforms that promise to generate multi-terabyte datasets, there is a growing need in the community to be able to embed sophisticated analysis algorithms in the computing platforms' storage systems. Data Warehouse Appliances (DWAs) are attractive for this work, due to their ability to store and process massive datasets efficiently. While DWAs have been utilized effectively in data-mining and informatics applications, they remain largely unproven in scientific workloads. In this paper we present our experiences in adapting two mesh analysis algorithms to function on five different DWA architectures: two Netezza database appliances, an XtremeData dbX database, a LexisNexis DAS, and multiple Hadoop MapReduce clusters. The main contribution of this work is insight into the differences between these DWAs from a user's perspective. In addition, we present performance measurements for ten DWA systems to help understand the impact of different architectural trade-offs in these systems.

  14. Shunted Piezoelectric Vibration Damping Analysis Including Centrifugal Loading Effects

    Science.gov (United States)

    Min, James B.; Duffy, Kirsten P.; Provenza, Andrew J.

    2011-01-01

    Excessive vibration of turbomachinery blades causes high-cycle fatigue problems, which require damping treatments to mitigate vibration levels. One method is the use of piezoelectric materials as passive or active dampers. Based on the technical challenges and requirements learned from previous turbomachinery rotor blade research, an effort has been made to investigate the effectiveness of a shunted piezoelectric for turbomachinery rotor blade vibration control, specifically for a condition with centrifugal rotation. While ample research has been performed on the use of piezoelectric materials with electric circuits to control structural vibration damping, very little study has been done regarding rotational effects. The present study attempts to fill this void. Specifically, the objectives of this study are: (a) to create and analyze finite element models for harmonic forced-response vibration analysis coupled with shunted piezoelectric circuits for engine blade operational conditions, (b) to validate the experimental test approaches with numerical results and vice versa, and (c) to establish a numerical modeling capability for vibration control using shunted piezoelectric circuits under rotation. The study focused on resonant damping control using shunted piezoelectric patches on plate specimens. Tests and analyses were performed for both non-spinning and spinning conditions. The finite element (FE) shunted piezoelectric circuit damping simulations were performed using the ANSYS Multiphysics code for the resistive and inductive circuit piezoelectric simulations of both conditions. The FE results showed good correlation with the experimental test results. Tests and analyses of shunted piezoelectric damping control, demonstrated with plate specimens, show great potential for reducing blade vibrations under centrifugal loading.
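
    For the resistive-inductive shunt investigated above, a common starting point is the classical resonant-shunt rule: choose the inductance so that the electrical circuit (shunt inductor plus piezo capacitance) resonates at the targeted blade mode. The sketch below applies that textbook rule with hypothetical values; it is not the study's tuning procedure.

    ```python
    import math

    def resonant_shunt(f_mode_hz, C_p_farad):
        """Classical resonant RL shunt rule: pick L so the electrical circuit
        (inductor + piezo capacitance C_p) resonates at the structural mode,
        L = 1 / (omega^2 * C_p).  The resistance sets the damping bandwidth;
        only its impedance scale 1/(omega*C_p) is shown here."""
        omega = 2.0 * math.pi * f_mode_hz
        L = 1.0 / (omega**2 * C_p_farad)
        R_scale = 1.0 / (omega * C_p_farad)
        return L, R_scale

    # Hypothetical blade mode at 800 Hz with a 50 nF piezo patch.
    L, R = resonant_shunt(800.0, 50e-9)
    print(f"L = {L:.2f} H, resistance scale ~ {R / 1e3:.1f} kOhm")
    ```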

  15. Identification and analysis of functional elements in 1% of the human genome by the ENCODE pilot project.

    Science.gov (United States)

    Birney, Ewan; Stamatoyannopoulos, John A; Dutta, Anindya; Guigó, Roderic; Gingeras, Thomas R; Margulies, Elliott H; Weng, Zhiping; Snyder, Michael; Dermitzakis, Emmanouil T; Thurman, Robert E; Kuehn, Michael S; Taylor, Christopher M; Neph, Shane; Koch, Christoph M; Asthana, Saurabh; Malhotra, Ankit; Adzhubei, Ivan; Greenbaum, Jason A; Andrews, Robert M; Flicek, Paul; Boyle, Patrick J; Cao, Hua; Carter, Nigel P; Clelland, Gayle K; Davis, Sean; Day, Nathan; Dhami, Pawandeep; Dillon, Shane C; Dorschner, Michael O; Fiegler, Heike; Giresi, Paul G; Goldy, Jeff; Hawrylycz, Michael; Haydock, Andrew; Humbert, Richard; James, Keith D; Johnson, Brett E; Johnson, Ericka M; Frum, Tristan T; Rosenzweig, Elizabeth R; Karnani, Neerja; Lee, Kirsten; Lefebvre, Gregory C; Navas, Patrick A; Neri, Fidencio; Parker, Stephen C J; Sabo, Peter J; Sandstrom, Richard; Shafer, Anthony; Vetrie, David; Weaver, Molly; Wilcox, Sarah; Yu, Man; Collins, Francis S; Dekker, Job; Lieb, Jason D; Tullius, Thomas D; Crawford, Gregory E; Sunyaev, Shamil; Noble, William S; Dunham, Ian; Denoeud, France; Reymond, Alexandre; Kapranov, Philipp; Rozowsky, Joel; Zheng, Deyou; Castelo, Robert; Frankish, Adam; Harrow, Jennifer; Ghosh, Srinka; Sandelin, Albin; Hofacker, Ivo L; Baertsch, Robert; Keefe, Damian; Dike, Sujit; Cheng, Jill; Hirsch, Heather A; Sekinger, Edward A; Lagarde, Julien; Abril, Josep F; Shahab, Atif; Flamm, Christoph; Fried, Claudia; Hackermüller, Jörg; Hertel, Jana; Lindemeyer, Manja; Missal, Kristin; Tanzer, Andrea; Washietl, Stefan; Korbel, Jan; Emanuelsson, Olof; Pedersen, Jakob S; Holroyd, Nancy; Taylor, Ruth; Swarbreck, David; Matthews, Nicholas; Dickson, Mark C; Thomas, Daryl J; Weirauch, Matthew T; Gilbert, James; Drenkow, Jorg; Bell, Ian; Zhao, XiaoDong; Srinivasan, K G; Sung, Wing-Kin; Ooi, Hong Sain; Chiu, Kuo Ping; Foissac, Sylvain; Alioto, Tyler; Brent, Michael; Pachter, Lior; Tress, Michael L; Valencia, Alfonso; Choo, Siew Woh; Choo, Chiou Yu; Ucla, Catherine; Manzano, Caroline; Wyss, Carine; Cheung, Evelyn; Clark, Taane G; Brown, James B; Ganesh, Madhavan; Patel, Sandeep; Tammana, Hari; Chrast, Jacqueline; Henrichsen, Charlotte N; Kai, Chikatoshi; Kawai, Jun; Nagalakshmi, Ugrappa; Wu, Jiaqian; Lian, Zheng; Lian, Jin; Newburger, Peter; Zhang, Xueqing; Bickel, Peter; Mattick, John S; Carninci, Piero; Hayashizaki, Yoshihide; Weissman, Sherman; Hubbard, Tim; Myers, Richard M; Rogers, Jane; Stadler, Peter F; Lowe, Todd M; Wei, Chia-Lin; Ruan, Yijun; Struhl, Kevin; Gerstein, Mark; Antonarakis, Stylianos E; Fu, Yutao; Green, Eric D; Karaöz, Ulaş; Siepel, Adam; Taylor, James; Liefer, Laura A; Wetterstrand, Kris A; Good, Peter J; Feingold, Elise A; Guyer, Mark S; Cooper, Gregory M; Asimenos, George; Dewey, Colin N; Hou, Minmei; Nikolaev, Sergey; Montoya-Burgos, Juan I; Löytynoja, Ari; Whelan, Simon; Pardi, Fabio; Massingham, Tim; Huang, Haiyan; Zhang, Nancy R; Holmes, Ian; Mullikin, James C; Ureta-Vidal, Abel; Paten, Benedict; Seringhaus, Michael; Church, Deanna; Rosenbloom, Kate; Kent, W James; Stone, Eric A; Batzoglou, Serafim; Goldman, Nick; Hardison, Ross C; Haussler, David; Miller, Webb; Sidow, Arend; Trinklein, Nathan D; Zhang, Zhengdong D; Barrera, Leah; Stuart, Rhona; King, David C; Ameur, Adam; Enroth, Stefan; Bieda, Mark C; Kim, Jonghwan; Bhinge, Akshay A; Jiang, Nan; Liu, Jun; Yao, Fei; Vega, Vinsensius B; Lee, Charlie W H; Ng, Patrick; Shahab, Atif; Yang, Annie; Moqtaderi, Zarmik; Zhu, Zhou; Xu, Xiaoqin; Squazzo, Sharon; Oberley, Matthew J; Inman, David; Singer, Michael A; Richmond, Todd A; Munn, 
Kyle J; Rada-Iglesias, Alvaro; Wallerman, Ola; Komorowski, Jan; Fowler, Joanna C; Couttet, Phillippe; Bruce, Alexander W; Dovey, Oliver M; Ellis, Peter D; Langford, Cordelia F; Nix, David A; Euskirchen, Ghia; Hartman, Stephen; Urban, Alexander E; Kraus, Peter; Van Calcar, Sara; Heintzman, Nate; Kim, Tae Hoon; Wang, Kun; Qu, Chunxu; Hon, Gary; Luna, Rosa; Glass, Christopher K; Rosenfeld, M Geoff; Aldred, Shelley Force; Cooper, Sara J; Halees, Anason; Lin, Jane M; Shulha, Hennady P; Zhang, Xiaoling; Xu, Mousheng; Haidar, Jaafar N S; Yu, Yong; Ruan, Yijun; Iyer, Vishwanath R; Green, Roland D; Wadelius, Claes; Farnham, Peggy J; Ren, Bing; Harte, Rachel A; Hinrichs, Angie S; Trumbower, Heather; Clawson, Hiram; Hillman-Jackson, Jennifer; Zweig, Ann S; Smith, Kayla; Thakkapallayil, Archana; Barber, Galt; Kuhn, Robert M; Karolchik, Donna; Armengol, Lluis; Bird, Christine P; de Bakker, Paul I W; Kern, Andrew D; Lopez-Bigas, Nuria; Martin, Joel D; Stranger, Barbara E; Woodroffe, Abigail; Davydov, Eugene; Dimas, Antigone; Eyras, Eduardo; Hallgrímsdóttir, Ingileif B; Huppert, Julian; Zody, Michael C; Abecasis, Gonçalo R; Estivill, Xavier; Bouffard, Gerard G; Guan, Xiaobin; Hansen, Nancy F; Idol, Jacquelyn R; Maduro, Valerie V B; Maskeri, Baishali; McDowell, Jennifer C; Park, Morgan; Thomas, Pamela J; Young, Alice C; Blakesley, Robert W; Muzny, Donna M; Sodergren, Erica; Wheeler, David A; Worley, Kim C; Jiang, Huaiyang; Weinstock, George M; Gibbs, Richard A; Graves, Tina; Fulton, Robert; Mardis, Elaine R; Wilson, Richard K; Clamp, Michele; Cuff, James; Gnerre, Sante; Jaffe, David B; Chang, Jean L; Lindblad-Toh, Kerstin; Lander, Eric S; Koriabine, Maxim; Nefedov, Mikhail; Osoegawa, Kazutoyo; Yoshinaga, Yuko; Zhu, Baoli; de Jong, Pieter J

    2007-06-14

    We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.

  16. Multielement ultratrace analysis in tungsten using secondary ion mass spectrometry

    International Nuclear Information System (INIS)

    Wilhartitz, P.; Virag, A.; Friedbacher, G.; Grasserbauer, M.

    1987-01-01

    The ever-increasing demands on the properties of materials also create a trend towards ultrapure products. Characterization of these materials is only possible with modern, highly sophisticated analytical techniques such as activation analysis and mass spectrometry, particularly SSMS, SIMS and GDMS. Analytical strategies were developed for the determination of about 40 elements in a tungsten matrix with high-performance SIMS. Difficulties such as the elimination of interferences had to be overcome. Extrapolated detection limits were established in the range of pg/g (alkali metals, halides) to ng/g (e.g. Ta, Th). Depth profiling and ion imaging gave additional information about the lateral and depth distributions of the elements. (orig.)

  17. Protecting Accelerator Control Systems in the Face of Sophisticated Cyber Attacks

    International Nuclear Information System (INIS)

    Hartman, Steven M.

    2012-01-01

    Cyber security for industrial control systems has received significant attention in the past two years. The news coverage of the Stuxnet attack, believed to have targeted the control system of a uranium enrichment plant, brought the issue to the attention of news media and policy makers. This has led to increased scrutiny of control systems for critical infrastructure such as power generation and distribution, and for industrial systems such as chemical plants and petroleum refineries. The past two years have also seen targeted network attacks aimed at corporate and government entities, including US Department of Energy National Laboratories. Both of these developments have potential repercussions for the control systems of particle accelerators. The need to balance the risks from potential attacks with the operational needs of an accelerator presents a unique challenge for the system architecture and access model.

  18. Frontal sinus revision rate after nasal polyposis surgery including frontal recess clearance and middle turbinectomy: A long-term analysis.

    Science.gov (United States)

    Benkhatar, Hakim; Khettab, Idir; Sultanik, Philippe; Laccourreye, Ollivier; Bonfils, Pierre

    2018-08-01

    To determine the frontal sinus revision rate after nasal polyposis (NP) surgery including frontal recess clearance (FRC) and middle turbinectomy (MT), to search for predictive factors, and to analyse surgical management. Longitudinal analysis of 153 patients who consecutively underwent bilateral sphenoethmoidectomy with FRC and MT for NP, with a minimum follow-up of 7 years. The decision for revision surgery was made in case of medically refractory chronic frontal sinusitis or frontal mucocele. Univariate and multivariate analyses incorporating clinical and radiological variables were performed. The frontal sinus revision rate was 6.5% (10/153). The mean time between the initial procedure and revision surgery was 3 years, 10 months. Osteitis around the frontal sinus outflow tract (FSOT) was associated with a higher risk of frontal sinus revision surgery (p=0.01). Asthma and aspirin intolerance did not increase the risk, nor did frontal sinus ostium diameter or residual frontoethmoid cells. Among revised patients, 60% required multiple procedures and 70% required frontal sinus ostium enlargement. Our long-term study reports that NP surgery including FRC and MT is associated with a low frontal sinus revision rate (6.5%). Patients developing osteitis around the FSOT have a higher risk of frontal sinus revision surgery. As mucosal damage can lead to osteitis, the FSOT mucosa should be preserved during initial NP surgery. However, as multiple procedures are common among NP patients requiring frontal sinus revision, frontal sinus ostium enlargement should be considered during the first revision in the hope of reducing the need for further revisions. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Reliability evaluation of containments including soil-structure interaction

    International Nuclear Information System (INIS)

    Pires, J.; Hwang, H.; Reich, M.

    1985-12-01

    Soil-structure interaction effects on the reliability assessment of containment structures are examined. The probability-based method for reliability evaluation of nuclear structures developed at Brookhaven National Laboratory is extended to include soil-structure interaction effects. In this method, the reliability of structures is expressed in terms of limit state probabilities. Furthermore, random vibration theory is utilized to calculate limit state probabilities under random seismic loads. Earthquake ground motion is modeled by a segment of a zero-mean, stationary, filtered Gaussian white noise random process, represented by its power spectrum. All possible seismic hazards at a site, represented by a hazard curve, are also included in the analysis. The soil-foundation system is represented by a rigid surface foundation on an elastic halfspace. Random and other uncertainties in the strength properties of the structure and in the stiffness and internal damping of the soil are also included in the analysis. Finally, a realistic reinforced concrete containment is analyzed to demonstrate the application of the method. For this containment, the soil-structure interaction effects on (1) limit state probabilities, (2) structural fragility curves, (3) floor response spectra with probabilistic content, and (4) correlation coefficients for total acceleration response at specified structural locations are examined in detail. 25 refs., 21 figs., 12 tabs
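
    The ground-motion model described above, a segment of zero-mean stationary filtered Gaussian white noise defined by its power spectrum, is commonly represented with a Kanai-Tajimi spectrum and sampled by spectral representation. The sketch below does exactly that with hypothetical filter parameters; it is illustrative, not the BNL implementation.

    ```python
    import numpy as np

    def kanai_tajimi(omega, S0=0.02, wg=15.0, zg=0.6):
        """Kanai-Tajimi PSD of ground acceleration: white noise of intensity
        S0 filtered by a soil layer with frequency wg [rad/s] and damping zg."""
        r = (omega / wg) ** 2
        return S0 * (1.0 + 4.0 * zg**2 * r) / ((1.0 - r) ** 2 + 4.0 * zg**2 * r)

    def sample_ground_motion(T=20.0, dt=0.01, n_freq=500, seed=4):
        """One stationary zero-mean Gaussian sample via spectral representation."""
        rng = np.random.default_rng(seed)
        t = np.arange(0.0, T, dt)
        w = np.linspace(0.5, np.pi / dt, n_freq)   # frequency grid [rad/s]
        dw = w[1] - w[0]
        phases = rng.uniform(0.0, 2.0 * np.pi, n_freq)
        amp = np.sqrt(2.0 * kanai_tajimi(w) * dw)
        a_g = (amp[:, None] * np.cos(w[:, None] * t + phases[:, None])).sum(axis=0)
        return t, a_g

    t, a_g = sample_ground_motion()
    print(f"peak acceleration of this sample: {np.abs(a_g).max():.2f} m/s^2")
    ```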

  20. Multi trace element analysis of dry biological materials by neutron activation analysis including a chemical group separation

    International Nuclear Information System (INIS)

    Weers, C.A.

    1980-01-01

    The principles of activation analysis and the practical aspects of neutron activation analysis are outlined. The limits which are set to accuracy and precision are defined. The description of the evaporation process is summarised in terms of the half-volume. This quantity is then used to define the resolving power. The formulation is checked by radiotracer experiments. Dried animal blood is used as the testing material. The pretreatment of the samples and the development of the destruction-evaporation apparatus are described. Four successive devices were built and tested. The development of the successive adsorption steps with active charcoal, Al2O3 and coprecipitation with Fe(OH)3 is presented. Seven groups, of about 25 elements in total, can be determined this way. The results obtained for standard reference materials are summarized and compared with literature data. (Auth.)

  1. Hi-tech in space - Rosetta - a space sophisticate

    Science.gov (United States)

    2004-02-01

    The European Space Agency’s Rosetta mission will be getting under way in February 2004. The Rosetta spacecraft will be pairing up with Comet 67P/Churyumov-Gerasimenko and accompanying it on its journey, investigating the comet’s composition and the dynamic processes at work as it flies sunwards. The spacecraft will even deposit a lander on the comet. “This will be our first direct contact with the surface of a comet,” said Dr Manfred Warhaut, Operations Manager for the Rosetta mission at ESA's European Space Operations Centre (ESOC) in Darmstadt, Germany. The trip is certainly not short: Rosetta will need ten years just to reach the comet. This places extreme demands on its hardware; when the probe meets up with the comet, all instruments must be fully operational, especially since it will have been in “hibernation” for two and a half years of its journey. During this ‘big sleep’, all systems, scientific instruments included, are turned off; only the on-board computer remains active. Twelve cubic metres of technical wizardry: Rosetta’s hardware fits into a sort of aluminium box measuring just 12 cubic metres. The scientific payload is mounted in the upper part, while the subsystems - on-board computer, transmitter and propulsion system - are housed below. The lander is fixed to the opposite side of the probe from the steerable antenna. As the spacecraft orbits the comet, the scientific instruments will at all times be pointed towards its surface, while the antenna and solar panels will point towards the Earth and Sun respectively. For trajectory and attitude control and for the major braking manœuvres, Rosetta is equipped with 24 thrusters each delivering 10 N. That corresponds to the force needed here on Earth to hold a bag containing 10 apples. Rosetta sets off with 1650 kg of propellant on board, accounting for more than half its mass at lift-off. Just 20% of total mass is available for scientific purposes. So when developing the research instruments

  2. A systematic review including meta-analysis of work environment and depressive symptoms.

    Science.gov (United States)

    Theorell, Töres; Hammarström, Anne; Aronsson, Gunnar; Träskman Bendz, Lil; Grape, Tom; Hogstedt, Christer; Marteinsdottir, Ina; Skoog, Ingmar; Hall, Charlotte

    2015-08-01

    Depressive symptoms are potential outcomes of poorly functioning work environments. Such symptoms are frequent and cause considerable suffering for employees as well as financial loss for employers. Accordingly, good prospective studies of psychosocial working conditions and depressive symptoms are valuable. Scientific reviews of such studies have pointed at methodological difficulties but still established a few job risk factors. Those reviews were published some years ago; there is a need for an updated systematic review using the GRADE system. In addition, gender-related questions have been insufficiently reviewed. Inclusion criteria for the studies, published 1990 to June 2013, were: (1) European and English-speaking countries; (2) quantified results describing the relationship between exposure (psychosocial or physical/chemical) and outcome (standardized questionnaire assessment of depressive symptoms or interview-based clinical depression); (3) prospective or comparable case-control design with at least 100 participants; (4) assessments of exposure (working conditions) and outcome at baseline and of outcome (depressive symptoms) once again after a follow-up of 1-5 years; (5) adjustment for age and adjustment or stratification for gender. Studies fulfilling the inclusion criteria were subjected to assessment of (1) relevance and (2) quality, using predefined criteria. A systematic review of the evidence was made using the GRADE system. When applicable, a meta-analysis of the magnitude of associations was made. Consistency of findings was examined for a number of possible confounders, and publication bias was discussed. Fifty-nine articles of high or medium-high scientific quality were included. Moderately strong evidence (grade three out of four) was found for job strain (high psychological demands and low decision latitude), low decision latitude, and bullying having a significant impact on the development of depressive symptoms. Limited evidence (grade two) was shown for psychological

  3. Probabilistic production simulation including CHP plants

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.V.; Palsson, H.; Ravn, H.F.

    1997-04-01

    A probabilistic production simulation method is presented for an energy system containing combined heat and power plants. The method permits incorporation of stochastic failures (forced outages) of the plants and is well suited for analysis of the dimensioning of the system, that is, for finding the appropriate types and capacities of production plants in relation to expansion planning. The method is in the tradition of similar approaches for the analysis of power systems based on the load duration curve. The present method extends these by considering a two-dimensional load duration curve, where the two dimensions represent heat and power. The method permits the analysis of a combined heat and power system which includes all the basic relevant types of plants, viz., condensing plants, back pressure plants, extraction plants and heat plants. The focus of the method is on the situation where the heat side has priority. This implies that on the power side there may be imbalances between demand and production. The method permits quantification of the expected power overflow, the expected unserved power demand, and the expected unserved heat demand. It is shown that a discretization method as well as double Fourier series may be applied in algorithms based on the method. (au) 1 tab., 28 ills., 21 refs.
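
    For orientation, here is a minimal sketch of the classical one-dimensional ancestor of this approach: probabilistic production costing on a discretized load duration curve, where each unit's forced outage rate is convolved into the curve. The unit capacities, outage rates, and grid spacing below are invented for the example; the paper's method extends the idea to a two-dimensional heat/power curve.

```python
import numpy as np

# Sketch of classical probabilistic production costing: convolve each
# unit's forced outage rate (FOR) into the load duration curve F, where
# F[i] = P(load > i*dx). All numbers are illustrative.

def convolve_unit(F, capacity, q, dx):
    """Equivalent load duration curve after adding a unit of given
    capacity with forced outage rate q: F'(x) = (1-q)F(x) + qF(x-C)."""
    shift = int(round(capacity / dx))
    if shift == 0:
        return F.copy()
    F_shifted = np.concatenate([np.ones(shift), F[:-shift]])  # F(x)=1 for x<0
    return (1.0 - q) * F + q * F_shifted

dx = 10.0                                   # 10 MW steps
x = np.arange(0, 1001, dx)
F = np.clip(1.0 - x / 800.0, 0.0, 1.0)      # simple initial P(load > x)

# Two hypothetical units: 200 MW with 5% FOR, 150 MW with 8% FOR.
for cap, q in [(200.0, 0.05), (150.0, 0.08)]:
    F = convolve_unit(F, cap, q, dx)

installed = 350.0
lolp = F[int(installed / dx)]               # loss-of-load probability proxy
eens = F[int(installed / dx):].sum() * dx   # expected unserved energy proxy
print("LOLP ~", lolp, " EENS proxy ~", eens)
```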

  4. Neutron lifetimes behavior analysis considering the two-region kinetic model in the IPEN/MB-01 reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gonnelli, Eduardo; Diniz, Ricardo [Instituto de Pesquisas Energéticas e Nucleares - IPEN/CNEN-SP Travessa R-400, 05508-900, Cidade Universitária, São Paulo (Brazil)

    2014-11-11

    This is a complementary work on the behavior analysis of neutron lifetimes carried out at the IPEN/MB-01 nuclear reactor facility. The macroscopic neutron noise technique was employed experimentally, using detectors in pulse mode, for two stages of control rod insertion, covering a total of twenty subcriticality levels. The neutron reflector was also treated as an additional group of delayed neutrons, a sophisticated approach within the two-region kinetic theoretical model.

  5. Neutron lifetimes behavior analysis considering the two-region kinetic model in the IPEN/MB-01 reactor

    International Nuclear Information System (INIS)

    Gonnelli, Eduardo; Diniz, Ricardo

    2014-01-01

    This is a complementary work on the behavior analysis of neutron lifetimes carried out at the IPEN/MB-01 nuclear reactor facility. The macroscopic neutron noise technique was employed experimentally, using detectors in pulse mode, for two stages of control rod insertion, covering a total of twenty subcriticality levels. The neutron reflector was also treated as an additional group of delayed neutrons, a sophisticated approach within the two-region kinetic theoretical model.

  6. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    Science.gov (United States)

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processing occurs on a server, so the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages, that the client can add to and use to perform specific data operations. Functions from the library are inserted sequentially into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
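
    As a toy illustration of the function-chain idea (not the PAS library itself, whose function names are not given in the abstract), simple building-block operators can be composed sequentially into one data operator:

```python
from functools import reduce

# Hypothetical building-block operators on a time series; names and
# logic are invented for illustration only.

def detrend(series):
    mean = sum(series) / len(series)
    return [v - mean for v in series]

def moving_average(window):
    def op(series):
        return [sum(series[max(0, i - window + 1):i + 1]) /
                len(series[max(0, i - window + 1):i + 1])
                for i in range(len(series))]
    return op

def threshold(limit):
    def op(series):
        return [v for v in series if abs(v) > limit]
    return op

def chain(*ops):
    """Compose operators left to right into one sophisticated operator."""
    return lambda data: reduce(lambda acc, op: op(acc), ops, data)

heart_rate = [72, 75, 71, 120, 118, 74, 73, 70, 69, 130]
detector = chain(detrend, moving_average(3), threshold(15))
print(detector(heart_rate))   # samples far from the mean after smoothing
```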

  7. Forensic document analysis using scanning microscopy

    Science.gov (United States)

    Shaffer, Douglas K.

    2009-05-01

    The authentication and identification of the source of a printed document can be important in forensic investigations involving a wide range of fraudulent materials, including counterfeit currency, travel and identity documents, business and personal checks, money orders, prescription labels, travelers checks, medical records, financial documents and threatening correspondence. The physical and chemical characterization of document materials - including paper, writing inks and printed media - is becoming increasingly relevant for law enforcement agencies, given the availability of a wide variety of sophisticated commercial printers and copiers capable of producing fraudulent documents of extremely high print quality, rendering them difficult to distinguish from genuine documents. This paper describes various applications and analytical methodologies using scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM/EDS) and related technologies for the characterization of fraudulent documents, and illustrates how their morphological and chemical profiles can be compared to (1) authenticate forensic documents and (2) link them to a common source in their production history.

  8. Application of model-based spectral analysis to wind-profiler radar observations

    Energy Technology Data Exchange (ETDEWEB)

    Boyer, E. [ENS, Cachan (France). LESiR; Petitdidier, M.; Corneil, W. [CETP, Velizy (France); Adnet, C. [THALES Air Défense, Bagneux (France); Larzabal, P. [ENS, Cachan (France). LESiR; IUT, Cachan (France). CRIIP

    2001-08-01

    A classical way to reduce a radar's data is to compute the spectrum using the FFT and then to identify the different peak contributions. But when the different echoes (atmospheric echo, clutter, hydrometeor echo, etc.) overlap, Fourier-like techniques provide poor frequency resolution, and even sophisticated peak-identification may then fail to detect the different echoes. In order to improve the quantity of reduced data and their quality relative to Fourier spectrum analysis, three different methods are presented in this paper and applied to actual data. Their common approach is to estimate the main frequency components directly, which avoids the development of very sophisticated peak-identification algorithms. The first method is based on cepstrum properties, generally used to determine the shift between two close identical echoes. We show in this paper that this method cannot provide a better estimate than Fourier-like techniques in operational use. The second method consists of an autoregressive estimation of the spectrum. Since the tests were promising, this method was applied to reduce the radar data obtained during two thunderstorms. The autoregressive method, which is very simple to implement, improved the Doppler-frequency data reduction relative to FFT spectrum analysis. The third method exploits the MUSIC algorithm, one of the numerous subspace-based methods, which is well adapted to estimating spectra composed of pure lines. A statistical study of the performance of this method is presented, and points out the very good resolution of this estimator in comparison with Fourier-like techniques. Application to actual data confirms the good qualities of this estimator for reducing radar data. (orig.)
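
    To make the second method concrete, here is a minimal sketch of an autoregressive (Yule-Walker) spectral estimate of the kind described; the model order and test signal are invented, and no claim is made about the authors' actual implementation.

```python
import numpy as np

# Autoregressive (Yule-Walker) spectral estimation: resolves two close
# Doppler lines that a short FFT periodogram would smear together.

def yule_walker(x, order):
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])          # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])       # prediction error power
    return a, sigma2

def ar_psd(a, sigma2, nfreq=512):
    w = np.linspace(0, np.pi, nfreq)
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-1j * np.outer(w, k)) @ a) ** 2
    return w, sigma2 / denom

# Two close lines in noise (normalized frequencies 0.30 and 0.34 rad).
rng = np.random.default_rng(0)
t = np.arange(256)
x = np.sin(0.30 * t) + np.sin(0.34 * t) + 0.5 * rng.standard_normal(256)
a, s2 = yule_walker(x, order=12)
w, psd = ar_psd(a, s2)
print("strongest peak near normalized frequency:", w[np.argmax(psd)])
```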

  9. Engineering Analysis of Intermediate Loop and Process Heat Exchanger Requirements to Include Configuration Analysis and Materials Needs

    Energy Technology Data Exchange (ETDEWEB)

    T.M. Lillo; R.L. Williamson; T.R. Reed; C.B. Davis; D.M. Ginosar

    2005-09-01

    The need to locate advanced hydrogen production facilities a finite distance away from a nuclear power source necessitates an intermediate heat transport loop (IHTL). This IHTL must not only transport energy efficiently over distances of up to 500 meters but must also be capable of operating at high temperatures (>850 °C) for many years. High-temperature, long-term operation raises concerns about material strength, creep resistance and general material stability (corrosion resistance). IHTL design is currently in its initial stages, and many questions remain to be answered before intelligent design can begin. This report begins to look at some of the issues surrounding the main components of an IHTL. Specifically, a stress analysis of a compact heat exchanger design under expected operating conditions is reported. Also presented are the results of a thermal analysis performed on two IHTL pipe configurations for different heat transport fluids. The configurations consist of separate hot supply and cold return legs, as well as an annular design in which the hot fluid is carried in an inner pipe and the cold return fluid travels in the opposite direction in the annular space around the hot pipe. The effects of insulation configurations on pipe configuration performance are also reported. Finally, a simple analysis of two different process heat exchanger designs, one a tube-in-shell type and the other a compact or microchannel reactor, is presented in light of catalyst requirements. Important insights into the critical areas of research and development are gained from these analyses, guiding the direction of future research.

  10. Market analysis, energy savings potential, and future development requirements for Radiance. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The Department of Energy (DOE) Office of Conservation and Renewable Energy (CE), Building Equipment Division has funded the development of a sophisticated computer rendering program called Radiance at Lawrence Berkeley Laboratories (LBL). The project review study included: (1) surveys of the lighting profession to determine how designers would use an improved, user-friendly Radiance; (2) elucidation of features that could be incorporated into Radiance to facilitate its more widespread use, including ways it could be used to save energy; (3) an outline of a development plan and a determination of the costs the DOE might incur if it were to proceed with the development of an improved version; and (4) a weighing of the anticipated development costs against anticipated energy-saving benefits.

  11. Interactive analysis and evaluation of ERTS data for regional planning and urban development: A Los Angeles Basin case study

    Science.gov (United States)

    Raje, S.; Economy, R.; Willoughby, G.; Mcknight, J.

    1974-01-01

    The progression of the ERTS Data Use Experiment SR 124 in data quality, analysis sophistication and applications responsiveness is reviewed. The roles of the variety of ERTS products, including the supporting underflight aircraft imagery at various scales, are discussed in the context of this investigation. The versatility of interpretation techniques and outputs developed and implemented via the General Electric Multispectral Information Extraction Systems is described and exemplified by both system-expository and applications-explanatory products. The wide-ranging and in-depth applications studied in the course of this experiment can be characterized as community-oriented and agency-directed. In the former, generic category, which is primarily data-contextual, the problems analyzed dealt with agricultural systems, surface water bodies, snow cover, brush fire burns, forestry, grass growth, parks - golf courses - cemeteries, dust storms, grading sites, geological features and coastal water structure. The ERTS MSS band selectivity and measurement thresholds were of primary interest here. The agency-directed application areas have been user-evaluational in nature. Beginning with an overall urbanized regional analysis of land cover density and development intensity, residential areas were analyzed to ascertain whether housing types could be aggregated with any degree of reliability.

  12. RVA. 3-D Visualization and Analysis Software to Support Management of Oil and Gas Resources

    Energy Technology Data Exchange (ETDEWEB)

    Keefer, Donald A. [Univ. of Illinois, Champaign, IL (United States); Shaffer, Eric G. [Univ. of Illinois, Champaign, IL (United States); Storsved, Brynne [Univ. of Illinois, Champaign, IL (United States); Vanmoer, Mark [Univ. of Illinois, Champaign, IL (United States); Angrave, Lawrence [Univ. of Illinois, Champaign, IL (United States); Damico, James R. [Univ. of Illinois, Champaign, IL (United States); Grigsby, Nathan [Univ. of Illinois, Champaign, IL (United States)

    2015-12-01

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including

  13. Geometric Scaling Analysis of Deep Inelastic Scattering Data Including Heavy Quarks

    International Nuclear Information System (INIS)

    Wu Qing-Dong; Zeng Ji; Hu Yuan-Yuan; Li Quan-Bo; Xiang Wen-Chang; Zhou Dai-Cui

    2016-01-01

    An analytic massive total cross section for photon-proton scattering is derived, which exhibits geometric scaling. Geometric scaling is used to perform a global analysis of the deep inelastic scattering data on the inclusive structure function F_2 measured in lepton–hadron scattering experiments at small values of Bjorken x. It is shown that the descriptions of the inclusive structure function F_2 and the longitudinal structure function F_L are improved with the massive analytic structure function, which may imply that the gluon saturation effect dominates the parton evolution process at HERA. The inclusion of the heavy quarks prevents the divergence of the lepton–hadron cross section, which plays a significant role in the description of the photoproduction region. (paper)
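
    For reference, the geometric scaling ansatz is usually written in the GBW form below; the parameter values quoted are the commonly cited fit values, not necessarily those used in this paper.

```latex
% Geometric scaling: the photon-proton cross section depends on x and
% Q^2 only through the ratio tau (GBW form; generic parameter values).
\sigma^{\gamma^* p}(x, Q^2) \;=\; \sigma(\tau), \qquad
\tau \;=\; \frac{Q^2}{Q_s^2(x)}, \qquad
Q_s^2(x) \;=\; Q_0^2 \left(\frac{x_0}{x}\right)^{\lambda}
```

    Here Q_s is the saturation scale; typical fitted values are Q_0^2 = 1 GeV^2, x_0 of order 3e-4, and lambda of order 0.3.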

  14. Seismic security assessment of earth and rockfill dams located in epicentral regions

    Energy Technology Data Exchange (ETDEWEB)

    Oldecop, L.; Zabala, F.; Rodari, R. [San Juan National Univ., San Juan (Argentina). Instituto de Invest. Antisismicas

    2004-07-01

    The seismic safety of dams is of great interest in the midwest region of Argentina, the most seismically active area in the country. This paper examines the factors controlling the design of dams subjected to earthquake action, the criteria for safety verification and the analysis tools currently available. Data on dams, active faults and the epicenters of historic earthquakes in the region are provided. Paleoseismicity research was suggested as an important area of study, potentially enhancing the understanding of a region's seismic activity. It was concluded that the analysis tools currently used in engineering include simple models offering advantages in reliability and ease of result interpretation, but with shortcomings in their applicability. Care must be taken in the validation and interpretation of these models, particularly when the behaviour of a dam includes complex phenomena. The more sophisticated analysis tools currently available are difficult to apply, largely due to the complexity of the algorithms in the models. It was also concluded that, in order to overcome the difficulties of both simple and complex models, predictions should be contrasted with real behaviour data. Data from measurements of seismic behaviour are still relatively scarce, presenting an obstacle to the further use of more sophisticated analysis tools, as they have not yet been tested against measurements and observations of real cases. 15 refs., 2 tabs., 11 figs.

  15. ABC-VED analysis of expendable medical stores at a tertiary care hospital.

    Science.gov (United States)

    Kumar, Sushil; Chakravarty, A

    2015-01-01

    The modern system of medicine has evolved into a complex, sophisticated and expensive treatment modality in terms of the cost of medicines and consumables. In any hospital, approximately 33% of the total annual budget is spent on buying materials and supplies, including medicines. An ABC (Always Better Control)-VED (Vital, Essential, Desirable) analysis of the medical stores of a large teaching, tertiary care hospital of the Armed Forces was carried out to identify the categories of drugs needing focused managerial control. Annual consumption and expenditure data for expendable medical stores for one year were extracted from the drug expense book, followed by classification based on annual usage value. Subsequently, the factor of criticality was applied to arrive at a decision matrix for understanding the need for selective managerial control. The study revealed that out of the 1536 items considered, 6.77% (104), 19.27% (296) and 73.95% (1136) were A, B and C category items, respectively. VED analysis revealed that vital items (V) accounted for 13.14% (201), essential items (E) for 56.37% (866) and desirable items (D) for 30.49% (469). ABC-VED matrix analysis of the inventory reveals that only 322 (21%) items out of the inventory of 1536 drugs, those belonging to category I, will require maximum attention. Scientific inventory management tools need to be applied routinely for the efficient management of medical stores, as this contributes to the judicious use of limited resources and a resultant improvement in patient care.
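
    A minimal sketch of the ABC step and the ABC-VED decision matrix follows; the 70/20/10 cumulative-value cut-offs and the sample drug data are common conventions invented for the example, not figures from the study.

```python
# ABC classification: rank items by annual usage value and split by
# cumulative share of expenditure (illustrative 70%/90% cut-offs).

def abc_classify(items):
    """items: list of (name, annual_usage_value). Returns name -> 'A'/'B'/'C'."""
    ranked = sorted(items, key=lambda kv: kv[1], reverse=True)
    total = sum(v for _, v in ranked)
    labels, cumulative = {}, 0.0
    for name, value in ranked:
        cumulative += value
        share = cumulative / total
        labels[name] = 'A' if share <= 0.70 else 'B' if share <= 0.90 else 'C'
    return labels

# Coupling with a VED rating yields the decision matrix: category I
# (any A item or any Vital item) gets the tightest managerial control.
def abc_ved_category(abc, ved):
    if abc == 'A' or ved == 'V':
        return 'I'
    if abc == 'B' or ved == 'E':
        return 'II'
    return 'III'

drugs = [('drug1', 50000), ('drug2', 30000), ('drug3', 12000), ('drug4', 8000)]
abc = abc_classify(drugs)
print(abc)
print("drug4 + Vital ->", abc_ved_category(abc['drug4'], 'V'))
```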

  16. Scattering Analysis of a Compact Dipole Array with Series and Parallel Feed Network including Mutual Coupling Effect

    Directory of Open Access Journals (Sweden)

    H. L. Sneha

    2013-01-01

    The current focus in the defense arena is on stealth technology, with an emphasis on controlling the radar cross-section (RCS). The scattering from antennas mounted on the platform is of prime importance, especially for a low-observable aerospace vehicle. This paper presents an analysis of the scattering cross section of a uniformly spaced linear dipole array. Two types of feed networks, series and parallel, are considered. The total RCS of a phased array with either kind of feed network is obtained by following the signal as it enters through the aperture and travels through the feed network. The RCS estimation of the array includes the mutual coupling effect between the dipole elements in three configurations: side-by-side, collinear, and parallel-in-echelon. The results presented can be useful when designing a phased array with optimum performance towards low observability.

  17. Neutron activation analysis is 60 years old: Is it time for retirement?

    International Nuclear Information System (INIS)

    de Goeij, J.J.M.

    1996-01-01

    In the past 60 yr, NAA (neutron activation analysis) has become an outstanding analytical technique for quite a few elements. In earlier days, radiochemical NAA (RNAA) with postirradiation radiochemical separations was the dominant version. In the last 25 yr, after the introduction of high-resolution germanium semiconductor detectors in combination with sophisticated electronics and software, instrumental NAA (INAA) became the predominant version. The characteristic features of NAA are (a) multielement capability; (b) generally good selectivity; (c) low determination limits for quite a few elements; (d) no effect of the chemical state of the analyte; (e) relatively small matrix effects; (f) absence or minimization of the blank value; (g) simple calibration; (h) analysis of samples up to kilograms, making it a physically independent method compared to other trace-element techniques; and (i) easily definable sources of systematic or random errors.

  18. Analysis of Burst Observations by GLAST's LAT Detector

    International Nuclear Information System (INIS)

    Band, David L.; Digel, Seth W.

    2004-01-01

    Analyzing data from GLAST's Large Area Telescope (LAT) will require sophisticated techniques. The PSF and effective area are functions of both photon energy and the position in the field-of-view. During most of the mission the observatory will survey the sky continuously, and thus, the LAT will detect each count from a source at a different detector orientation; each count requires its own response function! The likelihood as a function of celestial position and photon energy will be the foundation of the standard analysis techniques. However, the 20 MeV-300 GeV emission at the time of the ∼ 100 keV burst emission (timescale of ∼ 10 s) can be isolated and analyzed because essentially no non-burst counts are expected within a PSF radius of the burst location during the burst. Both binned and unbinned (in energy) spectral fitting will be possible. Longer timescale afterglow emission will require the likelihood analysis that will be used for persistent sources
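
    As a toy version of the unbinned spectral fitting mentioned above, the sketch below fits a power-law photon index to a set of burst photon energies by maximizing the unbinned likelihood; the instrument response is deliberately ignored, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Unbinned maximum-likelihood fit of a power-law photon spectrum
# p(E) ~ E^-gamma on [E_min, E_max]; response function omitted.

rng = np.random.default_rng(1)
E_min, E_max = 0.02, 300.0            # GeV, roughly the 20 MeV-300 GeV band

def sample_power_law(n, gamma, e_lo, e_hi):
    """Inverse-transform sampling from a truncated power law."""
    u = rng.random(n)
    a = 1.0 - gamma
    return (u * (e_hi**a - e_lo**a) + e_lo**a) ** (1.0 / a)

def nll(params, E, e_lo, e_hi):
    """Negative log-likelihood of the normalized power-law density."""
    gamma = params[0]
    a = 1.0 - gamma
    norm = (e_hi**a - e_lo**a) / a    # integral of E^-gamma over the band
    return -np.sum(-gamma * np.log(E) - np.log(norm))

E = sample_power_law(40, 2.2, E_min, E_max)   # 40 burst photons
res = minimize(nll, x0=[2.0], args=(E, E_min, E_max), bounds=[(1.1, 4.0)])
print("fitted photon index:", res.x[0])
```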

  19. Development of several data bases related to reactor safety research including probabilistic safety assessment and incident analysis at JAERI

    International Nuclear Information System (INIS)

    Kobayashi, Kensuke; Oikawa, Tetsukuni; Watanabe, Norio; Izumi, Fumio; Higuchi, Suminori

    1986-01-01

    Presented are several databases developed at JAERI for reactor safety research, including probabilistic safety assessment and incident analysis. First described are recent developments of the databases, such as (1) the component failure rate database, (2) the OECD/NEA/IRS information retrieval system, (3) the nuclear power plant database, and so on. Several issues are then discussed, referring mostly to the operation of the databases (data input and transcoding) and to the retrieval and utilization of the information. Finally, emphasis is given to the increasing role which artificial intelligence techniques, such as natural language processing and expert systems, may play in improving the future capabilities of the databases. (author)

  20. Preparation and certification of the Polish reference material Virginia Tobacco Leaves (CTA-VTL-2) for inorganic trace analysis including microanalysis

    International Nuclear Information System (INIS)

    Dybczynski, R.; Polkowska-Motrenko, H.; Samczynski, Z.; Szopa, Z.

    1997-01-01

    A new Polish certified reference material, Virginia Tobacco Leaves (CTA-VTL-2), for inorganic trace analysis including microanalysis has been prepared. Certification of the candidate reference material was based on a world-wide interlaboratory comparison in which 60 laboratories from 18 countries participated, using various analytical methods and techniques. Data evaluation was performed by means of the new multifunctional software package SSQC. Recommended values were assigned for 33 elements and 'information' values for 10 elements. The validity of the 'certified' values was confirmed for several elements using 'very accurate' methods developed in this laboratory. (author)

  1. Nonlinear equilibrium in Tokamaks including convective terms and viscosity

    International Nuclear Information System (INIS)

    Martin, P.; Castro, E.; Puerta, J.

    2003-01-01

    MHD equilibrium in tokamaks becomes very complex when the non-linear convective term and viscosity are included in the momentum equation. In order to simplify the analysis, each new term has been separated into gradient-type terms and vorticity-dependent terms. For the special case in which the vorticity vanishes, an extended Grad-Shafranov type equation can be obtained. However, the magnetic surfaces are then no longer isobars or current surfaces, as they are in the usual Grad-Shafranov treatment. The non-linear convective term introduces gradients of Bernoulli-type kinetic terms. Montgomery and other authors have shown the importance of the viscosity terms in tokamaks [1,2]; here the treatment is carried out for the equilibrium condition, using the generalized tokamak coordinates recently described [3], which simplify the equilibrium analysis. Calculation of the new isobar surfaces is difficult, and some computations have been carried out elsewhere for particular cases [3]. Here, our analysis is extended by discussing how the toroidal current density, plasma pressure and toroidal field are modified across the midplane because of the new terms (convective and viscous). New calculations and computations are also presented. (Author)
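
    For reference, the classical Grad-Shafranov equation that the extended treatment generalizes is (standard axisymmetric notation; the extension adds Bernoulli-type and viscous terms):

```latex
% Classical Grad-Shafranov equation; psi is the poloidal flux function,
% p(psi) the pressure, and F(psi) = R B_phi.
\Delta^{*}\psi \;\equiv\;
R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\frac{\partial \psi}{\partial R}\right)
+ \frac{\partial^{2}\psi}{\partial Z^{2}}
\;=\; -\,\mu_{0} R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi}
```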

  2. PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.

    Science.gov (United States)

    Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G

    2018-02-06

    For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or to analyzing large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: (1) multiphase kinetic folding fitted to linked equations, (2) global fitting of multiple data sets, and (3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
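
    To illustrate the kind of model PyFolding automates, here is a plain-scipy sketch of a two-state equilibrium denaturation fit; it does not use PyFolding's own API, and the parameter names and data are generic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-state chemical denaturation: linear free-energy relationship
# dG = dG_H2O - m*[den], fraction denatured from Boltzmann statistics.

RT = 8.314e-3 * 298.0   # kJ/mol at 25 C

def two_state(den, dG_H2O, m, yN, yD):
    """Observed signal vs denaturant concentration for a two-state folder."""
    dG = dG_H2O - m * den
    fD = 1.0 / (1.0 + np.exp(dG / RT))   # fraction denatured
    return yN + (yD - yN) * fD

rng = np.random.default_rng(2)
den = np.linspace(0, 8, 25)
signal = two_state(den, 25.0, 5.0, 1.0, 0.1) + 0.02 * rng.standard_normal(den.size)

popt, pcov = curve_fit(two_state, den, signal, p0=[20.0, 4.0, 1.0, 0.0])
print("dG_H2O = %.1f kJ/mol, m = %.1f kJ/mol/M" % (popt[0], popt[1]))
```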

  3. Stress analysis of fuel claddings with axial fins including creep effects

    International Nuclear Information System (INIS)

    Krieg, R.

    1977-01-01

    For LMFBR fuel claddings with axial fins the stress and strain fields are calculated which may be caused by internal pressure, differential thermal expansion and irradiation induced differential swelling. To provide an appropriate description of the cladding material it is assumed that the total strain is the sum of a linear elastic and a creep term, where the latter one includes the thermal as well as the irradiation induced creep. First the linear elastic problem is treated by a semi-analytical method leading to a bipotential equation for Airys' stress function. Solving this equation analytically means that the field equations valid within the cladding are satisfied exactly. By applying a combined point matching- least square-method the boundary conditions could be satisfied approximately such that in most cases the remaining error is within the uncertainty range of the loading conditions. Then the nonlinear problem which includes creep is approximated by a sequence of linear elastic solutions with time as parameter. The accumulated creep strain is treated here as an imposed strain field. To study the influence of different effects such as fin shape, temperature region, irradiation induced creep and swelling or internal pressure, a total of eleven cases with various parameter variations are investigated. The results are presented graphically in the following forms: stress and strain distributions over the cladding cross section for end of life conditions and boundary stresses and strains versus time. (Auth.)

  4. Survival analysis and classification methods for forest fire size

    Science.gov (United States)

    2018-01-01

    Factors affecting the wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be growth in fire size (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather on the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on an a priori assessment of fire growth potential. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of the fire suppression effect under such circumstances. PMID:29320497
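
    A minimal sketch of the preferred model, a Cox proportional hazards fit, follows, using the lifelines package with a tiny made-up stand-in for the Alberta fire records (column names and values are invented for illustration).

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical fire records: a "duration" outcome, an event indicator,
# and two covariates loosely analogous to those in the study.
df = pd.DataFrame({
    "duration":     [2.0, 5.5, 1.2, 8.0, 3.3, 6.1, 4.4, 7.2],
    "held":         [1,   1,   0,   1,   1,   0,   1,   1],   # 1=event, 0=censored
    "fwi":          [12., 25., 8.,  30., 18., 22., 15., 27.], # fire weather index
    "fuel_conifer": [1,   0,   1,   1,   0,   0,   1,   0],   # fuel type indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="held")
cph.print_summary()   # hazard ratios for fire weather and fuel type
```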

  5. Including People with Dementia in Research: An Analysis of Australian Ethical and Legal Rules and Recommendations for Reform.

    Science.gov (United States)

    Ries, Nola M; Thompson, Katie A; Lowe, Michael

    2017-09-01

    Research is crucial to advancing knowledge about dementia, yet the burden of the disease currently outpaces research activity. Research often excludes people with dementia and other cognitive impairments because researchers and ethics committees are concerned about issues related to capacity, consent, and substitute decision-making. In Australia, participation in research by people with cognitive impairment is governed by a national ethics statement and a patchwork of state and territorial laws that have widely varying rules. We contend that this legislative variation precludes a consistent approach to research governance and participation and hinders research that seeks to include people with impaired capacity. In this paper, we present key ethical principles, provide a comprehensive review of applicable legal rules in Australian states and territories, and highlight significant differences and ambiguities. Our analysis includes recommendations for reform to improve clarity and consistency in the law and reduce barriers that may exclude persons with dementia from participating in ethically approved research. Our recommendations seek to advance the national decision-making principles recommended by the Australian Law Reform Commission, which emphasize the rights of all adults to make their own decisions and for those with impaired capacity to have access to appropriate supports to help them make decisions that affect their lives.

  6. A Study to Determine the Feasibility of Including the Direct Experiences of Microteaching and Team Teaching, and Interaction Analysis Training in the Pre-Service Training of Foreign Language Teachers.

    Science.gov (United States)

    Wolfe, David Edwin

    This study examines potentially significant factors in the training of foreign language teachers. Remarks on microteaching and interaction analysis precede a review and analysis of related literature. Included in this section are the Stanford University Summer Intern Program, Amidon's model of microteaching and interaction analysis, and…

  7. Space shuttle main engine vibration data base

    Science.gov (United States)

    Lewallen, Pat

    1986-01-01

    The Space Shuttle Main Engine Vibration Data Base is described. Included is a detailed description of the data base components, the data acquisition process, the more sophisticated software routines, and the future data acquisition methods. Several figures and plots are provided to illustrate the various output formats accessible to the user. The numerous vibration data recall and analysis capabilities available through automated data base techniques are revealed.

  8. 21st Century Environmental Challenges: The Need For a New Economics

    OpenAIRE

    Michie, Jonathan; Oughton, Christine

    2011-01-01

    Research on environmental economics and policy has been dominated by neoclassical theory. While there have been advances in this approach, including more sophisticated analysis of imperfect information and time, and the development of endogenous growth theory, neoclassical models contain a number of underlying characteristics that limit their relevance for modelling firm behaviour especially in relation to environmental issues, innovation and change. The limitations spring fundamentally from ...

  9. Vulnerability analysis of process plants subject to domino effects

    International Nuclear Information System (INIS)

    Khakzad, Nima; Reniers, Genserik; Abbassi, Rouzbeh; Khan, Faisal

    2016-01-01

    In the context of domino effects, vulnerability analysis of chemical and process plants aims to identify and protect installations which are relatively more susceptible to damage and thus contribute more to the initiation or propagation of domino effects. In the present study, we have developed a methodology based on graph theory for domino vulnerability analysis of hazardous installations within process plants, where, owing to the large number of installations or complex interdependencies, the application of sophisticated reasoning approaches such as Bayesian networks is limited. We have used a hypothetical chemical storage plant to develop the methodology and validated the results using a dynamic Bayesian network approach. The efficacy and out-performance of the developed methodology have been demonstrated via a real-life complex case study. - Highlights: • Graph theory is a reliable tool for vulnerability analysis of chemical plants as to domino effects. • All-closeness centrality score can be used to identify the most vulnerable installations. • As for complex chemical plants, the methodology outperforms Bayesian networks.
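
    As a sketch of the graph-theoretic idea, the snippet below scores installations of a hypothetical plant layout with standard closeness centrality from networkx, used here as a stand-in for the paper's all-closeness score.

```python
import networkx as nx

# Installations are nodes; a directed edge u -> v means an accident at
# u can escalate to v. The layout below is hypothetical.
G = nx.DiGraph()
G.add_edges_from([("tank1", "tank2"), ("tank2", "tank3"),
                  ("tank1", "tank3"), ("tank3", "tank4"),
                  ("tank2", "tank4")])

# networkx closeness uses incoming distances by default, i.e. how easily
# escalation effects reach a unit from the rest of the plant.
scores = nx.closeness_centrality(G)
print(scores)
print("most vulnerable installation:", max(scores, key=scores.get))
```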

  10. Structural Optimization based on the Concept of First Order Analysis

    International Nuclear Information System (INIS)

    Shinji, Nishiwaki; Hidekazu, Nishigaki; Yasuaki, Tsurumi; Yoshio, Kojima; Noboru, Kikuchi

    2002-01-01

    Computer Aided Engineering (CAE) has been successfully utilized in mechanical industries such as the automotive industry. It is, however, difficult for most mechanical design engineers to directly use CAE due to the sophisticated nature of the operations involved. In order to mitigate this problem, a new type of CAE, First Order Analysis (FOA) has been proposed. This paper presents the outcome of research concerning the development of a structural topology optimization methodology within FOA. This optimization method is constructed based on discrete and function-oriented elements such as beam and panel elements, and sequential convex programming. In addition, examples are provided to show the utility of the methodology presented here for mechanical design engineers

  11. Scenarios in society, society in scenarios: toward a social scientific analysis of storyline-driven environmental modeling

    International Nuclear Information System (INIS)

    Garb, Yaakov; Pulver, Simone; VanDeveer, Stacy D

    2008-01-01

    Scenario analysis, an approach to thinking about alternative futures based on storyline-driven modeling, has become increasingly common and important in attempts to understand and respond to the impacts of human activities on natural systems at a variety of scales. The construction of scenarios is a fundamentally social activity, yet social scientific perspectives have rarely been brought to bear on it. Indeed, there is a growing imbalance between the increasing technical sophistication of the modeling elements of scenarios and the continued simplicity of our understanding of the social origins, linkages, and implications of the narratives to which they are coupled. Drawing on conceptual and methodological tools from science and technology studies, sociology and political science, we offer an overview of what a social scientific analysis of scenarios might include. In particular, we explore both how scenarios intervene in social microscale and macroscale contexts and how aspects of such contexts are embedded in scenarios, often implicitly. Analyzing the social 'work' of scenarios (i) can enhance the understanding of scenario developers and modeling practitioners of the knowledge production processes in which they participate and (ii) can improve the utility of scenario products as decision-support tools to actual, rather than imagined, decision-makers.

  12. Statistical analysis of aging trend of mechanical properties in ethylene propylene rubber-insulated safety-related cables sampled from containments (Denryoko Chuo Kenkyusho Hokoku, December 2013 issue)

    International Nuclear Information System (INIS)

    Fuse, Norikazu; Kanegami, Masaki; Misaka, Hideki; Homma, Hiroya; Okamoto, Tatsuki

    2013-01-01

    For the polymeric insulation used in nuclear power plant safety cables, it is known that the present prediction model sometimes estimates the service life conservatively. In order to refine the model so that it reflects the aging actually occurring in containments, the disconnects between predictions and reality need to be clarified. In the present paper, a statistical analysis has been carried out on the aging status of various insulation samples removed from domestic containments. Aging in the operational environment is found to be slower than that expected from accelerated aging test results. The temperature dependence of the estimated lifetime on an Arrhenius plot also suggests that the dominant elementary chemical reaction differs between the two aging conditions, which results in an apparent difference in activation energies and pre-exponential factors. Two kinds of issues need to be clarified for model sophistication: the change with temperature of the predominant degradation chemistry, and the effect of the low-oxygen environment in boiling water reactor containments. (author)
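
    The comparison of activation energies rests on the standard Arrhenius lifetime model, sketched below with generic symbols (not values from the study):

```latex
% Arrhenius lifetime model behind cable-aging extrapolation; a change
% in the dominant degradation reaction appears as a change of slope
% (activation energy E_a) on the ln(t) versus 1/T plot.
t_{\mathrm{life}}(T) \;=\; A \exp\!\left(\frac{E_a}{k_B T}\right)
\quad\Longrightarrow\quad
\ln t_{\mathrm{life}} \;=\; \ln A + \frac{E_a}{k_B}\cdot\frac{1}{T}
```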

  13. Proteomic Analysis of the Human Olfactory Bulb.

    Science.gov (United States)

    Dammalli, Manjunath; Dey, Gourav; Madugundu, Anil K; Kumar, Manish; Rodrigues, Benvil; Gowda, Harsha; Siddaiah, Bychapur Gowrishankar; Mahadevan, Anita; Shankar, Susarla Krishna; Prasad, Thottethodi Subrahmanya Keshava

    2017-08-01

    The importance of olfaction to human health and disease is often underappreciated. Olfactory dysfunction has been reported in association with a host of common complex diseases, including neurological diseases such as Alzheimer's disease and Parkinson's disease. Olfaction, the sense of smell, is also important for the health of most mammals, enabling optimal engagement with their environment. Indeed, animals have developed sophisticated olfactory systems to detect and interpret the rich information presented to them to assist in day-to-day activities such as locating food sources, differentiating food from poisons, identifying mates, promoting reproduction, avoiding predators, and averting death. In this context, the olfactory bulb is a vital component of the olfactory system, receiving sensory information from the axons of the olfactory receptor neurons located in the nasal cavity and serving as the first place where olfactory information is processed. We report in this study original observations on the human olfactory bulb proteome in healthy subjects, using a high-resolution mass spectrometry-based proteomic approach. We identified 7750 nonredundant proteins from human olfactory bulbs. Bioinformatics analysis of these proteins showed their involvement in biological processes associated with signal transduction, metabolism, transport, and olfaction. These new observations provide a crucial baseline molecular profile of the human olfactory bulb proteome, and should assist the future discovery of biomarker proteins and novel diagnostics associated with diseases characterized by olfactory dysfunction.

  14. Analysis of advanced european nuclear fuel cycle scenarios including transmutation and economical estimates

    International Nuclear Information System (INIS)

    Merino Rodriguez, I.; Alvarez-Velarde, F.; Martin-Fuertes, F.

    2013-01-01

    In this work the transition from the existing Light Water Reactors (LWR) to advanced reactors is analyzed, including Generation III+ reactors, in a European framework. Four European fuel cycle scenarios involving transmutation options have been addressed. The first (reference) scenario is the current fleet using LWR technology and an open fuel cycle. The second scenario assumes a full replacement of the initial fleet with Fast Reactors (FR) burning U-Pu MOX fuel. The third scenario is a modification of the second, introducing Minor Actinide (MA) transmutation in a fraction of the FR fleet. Finally, in the fourth scenario, the LWR fleet is replaced using FR with MOX fuel as well as Accelerator Driven Systems (ADS) for MA transmutation. All scenarios consider an intermediate period of GEN-III+ LWR deployment and extend over a period of 200 years, looking for equilibrium mass flows. The simulations were made using the TR-EVOL code, a tool for fuel cycle studies developed by CIEMAT. The results reveal that all scenarios are feasible in terms of nuclear resource demand (U and Pu). Concerning the cases without transmutation, the second scenario considerably reduces the Pu inventory in repositories compared to the reference scenario, although the MA inventory increases. The transmutation scenarios show that eliminating the LWR MA legacy requires, on the one hand, at most a 33% fraction (i.e., a peak value of 26 FR units) of the FR fleet dedicated to transmutation (MA in MOX fuel, homogeneous transmutation). On the other hand, a maximum number of ADS plants accounting for 5% of electricity generation is predicted in the fourth scenario (i.e., 35 ADS units). Regarding the economic analysis, the estimates show an increase in the LCOE (levelized cost of electricity), averaged over the whole period, with respect to the reference scenario of 21% and 29% for the FR and FR-with-transmutation scenarios respectively, and 34% for the fourth scenario. (authors)
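
    For reference, the LCOE quoted above is conventionally defined by the discounted-cash-flow ratio below (generic notation; the paper's exact cost breakdown is not reproduced here):

```latex
% Levelized cost of electricity: discounted lifetime costs divided by
% discounted lifetime generation.
\mathrm{LCOE} \;=\;
\frac{\sum_{t=0}^{N} \dfrac{I_t + M_t + F_t}{(1+r)^t}}
     {\sum_{t=0}^{N} \dfrac{E_t}{(1+r)^t}}
```

    Here I_t is investment, M_t operation and maintenance, F_t fuel-cycle cost, E_t electricity generated in year t, and r the discount rate.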

  15. Environmental science applications with Rapid Integrated Mapping and analysis System (RIMS)

    Science.gov (United States)

    Shiklomanov, A.; Prusevich, A.; Gordov, E.; Okladnikov, I.; Titov, A.

    2016-11-01

    The Rapid Integrated Mapping and analysis System (RIMS) has been developed at the University of New Hampshire as an online instrument for multidisciplinary data visualization, analysis and manipulation with a focus on hydrological applications. Recently it was enriched with data and tools to allow more sophisticated analysis of interdisciplinary data. Three different examples of specific scientific applications with RIMS are demonstrated and discussed. Analysis of historical changes in major components of the Eurasian pan-Arctic water budget is based on historical discharge data, gridded observational meteorological fields, and remote sensing data for sea ice area. Express analysis of the extremely hot and dry summer of 2010 across European Russia is performed using a combination of near-real time and historical data to evaluate the intensity and spatial distribution of this event and its socioeconomic impacts. Integrative analysis of hydrological, water management, and population data for Central Asia over the last 30 years provides an assessment of regional water security due to changes in climate, water use and demography. The presented case studies demonstrate the capabilities of RIMS as a powerful instrument for hydrological and coupled human-natural systems research.

  16. Reduced dietary sodium intake increases heart rate. A meta-analysis of 63 randomized controlled trials including 72 study populations.

    Directory of Open Access Journals (Sweden)

    Niels Graudal

    2016-03-01

    Reduced dietary sodium intake (sodium reduction) increases heart rate in some studies of animals and humans. As heart rate is independently associated with the development of heart failure and an increased risk of premature death, a potential increase in heart rate could be a harmful side-effect of sodium reduction. The purpose of the present meta-analysis was to investigate the effect of sodium reduction on heart rate. Relevant studies were retrieved from an updated pool of 176 randomized controlled trials (RCTs) published in the period 1973–2014. 63 of the RCTs, including 72 study populations, reported data on heart rate. In a meta-analysis of these data, sodium reduction increased heart rate by 1.65 beats per minute [95% CI: 1.19, 2.11], p < 0.00001, corresponding to 2.4% of the baseline heart rate. This effect was independent of baseline blood pressure. In conclusion, sodium reduction increases heart rate by as much (2.4%) as it decreases blood pressure (2.5%). This side-effect, which may cause harmful health effects, contributes to the need for a revision of the present dietary guidelines.
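
    The pooling behind such a meta-analysis is conventionally the inverse-variance method sketched below, with y_i and v_i the effect (change in heart rate) and variance of trial i; this is generic notation, not necessarily the paper's exact model.

```latex
% Inverse-variance pooling (fixed effect); random-effects versions
% replace v_i by v_i + tau^2, with tau^2 the between-trial variance.
\hat{\theta} \;=\; \frac{\sum_i w_i\, y_i}{\sum_i w_i}, \qquad
w_i \;=\; \frac{1}{v_i}, \qquad
\mathrm{SE}\big(\hat{\theta}\big) \;=\; \Big(\sum_i w_i\Big)^{-1/2}
```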

  17. InterFace: A software package for face image warping, averaging, and principal components analysis.

    Science.gov (United States)

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
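
    A bare-bones sketch of the PCA step that InterFace wraps in a GUI follows, written in Python/numpy rather than InterFace's MATLAB code: each aligned face image is flattened to a vector, and the principal components ("face space" axes) come from an SVD of the mean-centred data. Shapes and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_faces, h, w = 40, 64, 64
faces = rng.random((n_faces, h * w))      # stand-in for aligned face images

mean_face = faces.mean(axis=0)
centred = faces - mean_face
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

components = Vt[:10]                      # first 10 "eigenfaces"
coords = centred @ components.T           # each face's face-space position
reconstruction = mean_face + coords @ components
print("explained variance ratio:", (S[:10] ** 2 / (S ** 2).sum()).round(3))
```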

  18. Challenges and Strategies for Proteome Analysis of the Interaction of Human Pathogenic Fungi with Host Immune Cells.

    Science.gov (United States)

    Krüger, Thomas; Luo, Ting; Schmidt, Hella; Shopova, Iordana; Kniemeyer, Olaf

    2015-12-14

    Opportunistic human pathogenic fungi, including the saprotrophic mold Aspergillus fumigatus and the human commensal Candida albicans, can cause severe fungal infections in immunocompromised or critically ill patients. The first line of defense against opportunistic fungal pathogens is the innate immune system. Phagocytes such as macrophages, neutrophils and dendritic cells are an important pillar of the innate immune response and have evolved versatile defense strategies against microbial pathogens. On the other hand, human-pathogenic fungi have sophisticated virulence strategies to counteract the innate immune defense. In this context, proteomic approaches can provide deeper insights into the molecular mechanisms of the interaction of host immune cells with fungal pathogens. This is crucial for the identification of both diagnostic biomarkers for fungal infections and therapeutic targets. Studying host-fungal interactions at the protein level is a challenging endeavor, yet few such studies have been undertaken. This review draws attention to proteomic techniques and their application to fungal pathogens, and to the challenges, difficulties, and limitations that may arise in the course of simultaneous dual proteome analysis of host immune cells interacting with diverse morphotypes of fungal pathogens. On this basis, we discuss strategies to overcome these multifaceted experimental and analytical challenges, including the viability of immune cells during co-cultivation, the increased and heterogeneous protein complexity of the host proteome dynamically interacting with the fungal proteome, and the demands on normalization strategies in terms of relative quantitative proteome analysis.

  19. 'Let's get wasted': A discourse analysis of teenagers' talk about binge drinking.

    Science.gov (United States)

    Chainey, Timothy A; Stephens, Christine

    2016-05-01

    Teenage binge drinking is a significant health issue. To explore teenagers' talk about binge drinking, four peer-group interviews were conducted with 20 teenagers, aged 16-18 years, with experience of excessive alcohol use. A discourse analysis showed that a 'drinking is cool' discourse constructed 'getting wasted' as an integral part of social life, while a 'drinking as a social lubricant' discourse described the behavioural functions of alcohol use. Participants also actively resisted an 'alcohol is bad' discourse, which acknowledges the risks of alcohol use. The findings illustrate how teenagers use these resources in sophisticated ways to position the teen drinker positively and negatively. © The Author(s) 2014.

  20. Dispersed Fringe Sensing Analysis - DFSA

    Science.gov (United States)

    Sigrist, Norbert; Shi, Fang; Redding, David C.; Basinger, Scott A.; Ohara, Catherine M.; Seo, Byoung-Joon; Bikkannavar, Siddarayappa A.; Spechler, Joshua A.

    2012-01-01

    Dispersed Fringe Sensing (DFS) is a technique for measuring and phasing segmented telescope mirrors using a dispersed broadband light image. DFS is capable of breaking the monochromatic light ambiguity, measuring absolute piston errors between segments of large segmented primary mirrors with an accuracy of tens of nanometers over a range of 100 micrometers or more. The DFSA software tool analyzes DFS images to extract DFS-encoded segment piston errors, which can be used to measure piston distances between primary mirror segments of ground and space telescopes. This information is necessary to control mirror segments to establish the smooth, continuous primary figure needed to achieve high optical quality. The DFSA tool is versatile, allowing precise piston measurements from a variety of different optical configurations. DFSA technology may be used for measuring wavefront pistons from sub-apertures defined by adjacent segments (such as the Keck Telescope), or from separated sub-apertures used for testing large optical systems (such as sub-aperture wavefront testing for large primary mirrors using auto-collimating flats). An experimental demonstration of the coarse-phasing technology with verification of DFSA was performed at the Keck Telescope. DFSA includes image processing, wavelength and source spectral calibration, fringe extraction line determination, dispersed fringe analysis, and wavefront piston sign determination. The code is robust against internal optical system aberrations and against spectral variations of the source. In addition to the DFSA tool, the software package contains a simple but sophisticated MATLAB model to generate dispersed fringe images of optical system configurations in order to quickly estimate the coarse phasing performance given the optical and operational design requirements. Combining MATLAB (a high-level language and interactive environment developed by MathWorks), MACOS (JPL's software package for Modeling and Analysis for Controlled Optical

  1. Preparation and certification of the Polish reference material Virginia Tobacco Leaves (CTA-VTL-2) for inorganic trace analysis including microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Dybczynski, R.; Polkowska-Motrenko, H.; Samczynski, Z.; Szopa, Z.

    1997-12-31

    A new Polish certified reference material, Virginia Tobacco Leaves (CTA-VTL-2), for inorganic trace analysis including microanalysis has been prepared. Certification of the candidate reference material was based on a world-wide interlaboratory comparison in which 60 laboratories from 18 countries participated, using various analytical methods and techniques. Data evaluation was performed by means of the new multifunctional software package SSQC. Recommended values were assigned for 33 elements and 'information' values for 10 elements. The validity of the 'certified' values was confirmed for several elements using 'very accurate' methods developed in this Laboratory. (author). 47 refs, 28 figs, 12 tabs.

  2. Energy principle with included boundary conditions

    International Nuclear Information System (INIS)

    Lehnert, B.

    1994-01-01

    Earlier comments by the author on the limitations of the classical form of the extended energy principle are supported by a complementary analysis of the potential energy change arising from free-boundary displacements of a magnetically confined plasma. In the final formulation of the extended principle, restricted displacements, satisfying pressure continuity by means of plasma volume currents in a thin boundary layer, are replaced by unrestricted (arbitrary) displacements which can give rise to induced surface currents. It is found that these currents contribute to the change in potential energy, and that their contribution is not taken into account by such a formulation. A general expression is further given for the surface currents induced by arbitrary displacements. The expression is used to reformulate the energy principle for the class of displacements which satisfy all necessary boundary conditions, including that of pressure balance. This makes possible a minimization procedure of the potential energy over the class of all physically relevant test functions which include the constraints imposed by the boundary conditions. Such a procedure is also consistent with a corresponding variational calculus. (Author)
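
    For context, the classical decomposition of the extended energy principle under discussion is (standard notation; the surface term is where the pressure-balance boundary condition and any induced surface currents enter):

```latex
% Extended energy principle: ideal-MHD stability requires a
% non-negative total potential-energy change for every admissible
% displacement field xi.
\delta W(\boldsymbol{\xi}) \;=\; \delta W_F + \delta W_S + \delta W_V \;\ge\; 0
```

    Here delta W_F is the plasma (fluid) contribution, delta W_S the free-boundary surface term, and delta W_V the vacuum magnetic energy.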

  3. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.
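
    The "traditional (Cornell-McGuire) PSHA method" referred to here rests on the standard hazard integral (standard notation):

```latex
% Annual rate of exceeding ground-motion level x, summed over seismic
% sources i with activity rates nu_i; f_M and f_R are magnitude and
% distance densities, and P(IM > x | m, r) comes from a ground-motion
% prediction model.
\lambda(IM > x) \;=\; \sum_i \nu_i \iint
P\big(IM > x \,\big|\, m, r\big)\, f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}m\, \mathrm{d}r
```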

  4. A Large-Scale Analysis of Variance in Written Language.

    Science.gov (United States)

    Johns, Brendan T; Jamieson, Randall K

    2018-01-22

    The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers, & Tenenbaum, 2007; Jones & Mewhort, 2007; Landauer & Dumais, 1997; Mikolov, Sutskever, Chen, Corrado, & Dean, 2013). The models treat knowledge as an interaction of processing mechanisms and the structure of language experience. But language experience is often treated agnostically. We report a distributional semantic analysis showing that written language in fiction books varies appreciably between books from different genres, between books from the same genre, and even between books written by the same author. Given that current theories assume that word knowledge reflects an interaction between processing mechanisms and the language environment, the analysis shows the need for the field to engage in a more deliberate consideration and curation of the corpora used in computational studies of natural language processing. Copyright © 2018 Cognitive Science Society, Inc.

  5. High content analysis of phagocytic activity and cell morphology with PuntoMorph.

    Science.gov (United States)

    Al-Ali, Hassan; Gao, Han; Dalby-Hansen, Camilla; Peters, Vanessa Ann; Shi, Yan; Brambilla, Roberta

    2017-11-01

    Phagocytosis is essential for maintenance of normal homeostasis and healthy tissue. As such, it is a therapeutic target for a wide range of clinical applications. The development of phenotypic screens targeting phagocytosis has lagged behind, however, due to the difficulties associated with image-based quantification of phagocytic activity. We present a robust algorithm and cell-based assay system for high content analysis of phagocytic activity. The method utilizes fluorescently labeled beads as a phagocytic substrate with defined physical properties. The algorithm employs statistical modeling to determine the mean fluorescence of individual beads within each image, and uses the information to conduct an accurate count of phagocytosed beads. In addition, the algorithm conducts detailed and sophisticated analysis of cellular morphology, making it a standalone tool for high content screening. We tested our assay system using microglial cultures. Our results recapitulated previous findings on the effects of microglial stimulation on cell morphology and phagocytic activity. Moreover, our cell-level analysis revealed that the two phenotypes associated with microglial activation, specifically cell body hypertrophy and increased phagocytic activity, are not highly correlated. This novel finding suggests the two phenotypes may be under the control of distinct signaling pathways. We demonstrate that our assay system outperforms preexisting methods for quantifying phagocytic activity in multiple dimensions including speed, accuracy, and resolution. We provide a framework to facilitate the development of high content assays suitable for drug screening. For convenience, we implemented our algorithm in a standalone software package, PuntoMorph. Copyright © 2017 Elsevier B.V. All rights reserved.
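
    As a rough illustration of the bead-counting idea described above, the following minimal sketch estimates a single-bead fluorescence level and converts per-cell integrated signal into an integer bead count (function and variable names are invented; this is not the published PuntoMorph code):

```python
import numpy as np

def estimate_bead_count(cell_intensities, bead_intensities):
    """Estimate phagocytosed beads per cell from integrated fluorescence.

    cell_intensities : integrated bead-channel fluorescence per cell
    bead_intensities : integrated fluorescence of isolated beads in the image
    """
    # Robust estimate of single-bead fluorescence (median resists clumps/debris)
    single_bead = np.median(bead_intensities)
    # Nearest-integer bead count per cell
    return np.rint(cell_intensities / single_bead).astype(int)

# Toy usage with simulated intensities
rng = np.random.default_rng(0)
beads = rng.normal(1000.0, 50.0, size=200)       # isolated bead intensities
cells = np.array([0.0, 2050.0, 2980.0, 5100.0])  # per-cell integrated signal
print(estimate_bead_count(cells, beads))         # -> [0 2 3 5]
```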

  6. Application of model-based spectral analysis to wind-profiler radar observations

    Directory of Open Access Journals (Sweden)

    E. Boyer

    A classical way to reduce a radar's data is to compute the spectrum using the FFT and then to identify the different peak contributions. But when the different echoes overlap (atmospheric echo, clutter, hydrometeor echo, etc.), Fourier-like techniques provide poor frequency resolution, and even sophisticated peak-identification may not be able to separate the different echoes. In order to improve the number of reduced data and their quality relative to Fourier spectrum analysis, three different methods are presented in this paper and applied to actual data. Their approach consists of predicting the main frequency components, which avoids the development of very sophisticated peak-identification algorithms. The first method is based on cepstrum properties, generally used to determine the shift between two close identical echoes. We show that this method cannot provide a better estimate than Fourier-like techniques in operational use. The second method consists of an autoregressive estimation of the spectrum. Since the tests were promising, this method was applied to reduce the radar data obtained during two thunderstorms. The autoregressive method, which is very simple to implement, improved the Doppler-frequency data reduction relative to FFT spectrum analysis. The third method exploits the MUSIC algorithm, one of the numerous subspace-based methods, which is well adapted to estimating spectra composed of pure lines. A statistical study of the performance of this method is presented, and points out the very good resolution of this estimator in comparison with Fourier-like techniques. Application to actual data confirms the good qualities of this estimator for reducing radar data.

    Key words. Meteorology and atmospheric dynamics (tropical meteorology) – Radio science (signal processing) – General (techniques applicable in three or more fields)
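
    Of the three approaches, the autoregressive estimator is the simplest to sketch; the following Yule-Walker implementation is a generic textbook version, not the authors' code:

```python
import numpy as np

def ar_spectrum(x, order, n_freq=512):
    """Autoregressive (Yule-Walker) power spectrum estimate of a 1-D signal."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Solve the Yule-Walker equations R a = -r[1:] for the AR coefficients
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, -r[1:])
    sigma2 = r[0] + np.dot(a, r[1:])  # driving-noise variance
    # Evaluate P(f) = sigma2 / |1 + sum_k a_k exp(-2i pi f k)|^2
    freqs = np.linspace(0.0, 0.5, n_freq)
    e = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
    return freqs, sigma2 / np.abs(1.0 + e @ a) ** 2

# Toy usage: two close "Doppler lines" buried in noise
rng = np.random.default_rng(1)
t = np.arange(1024)
sig = (np.sin(2 * np.pi * 0.12 * t) + 0.8 * np.sin(2 * np.pi * 0.14 * t)
       + rng.normal(0.0, 1.0, t.size))
f, P = ar_spectrum(sig, order=12)
print(f[np.argmax(P)])  # peak near 0.12-0.14 cycles/sample
```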

  7. StreptoBase: An Oral Streptococcus mitis Group Genomic Resource and Analysis Platform.

    Directory of Open Access Journals (Sweden)

    Wenning Zheng

    The oral streptococci are spherical Gram-positive bacteria categorized under the phylum Firmicutes which are among the most common causative agents of bacterial infective endocarditis (IE) and are also important agents in septicaemia in neutropenic patients. The Streptococcus mitis group is comprised of 13 species including some of the most common human oral colonizers such as S. mitis, S. oralis, S. sanguinis and S. gordonii as well as species such as S. tigurinus, S. oligofermentans and S. australis that have only recently been classified and are poorly understood at present. We present StreptoBase, which provides a specialized free resource focusing on the genomic analyses of oral species from the mitis group. It currently hosts 104 S. mitis group genomes including 27 novel mitis group strains that we sequenced using the high throughput Illumina HiSeq technology platform, and provides a comprehensive set of genome sequences for analyses, particularly comparative analyses and visualization of both cross-species and cross-strain characteristics of S. mitis group bacteria. StreptoBase incorporates sophisticated in-house designed bioinformatics web tools such as Pairwise Genome Comparison (PGC) tool and Pathogenomic Profiling Tool (PathoProT), which facilitate comparative pathogenomics analysis of Streptococcus strains. Examples are provided to demonstrate how StreptoBase can be employed to compare genome structure of different S. mitis group bacteria and putative virulence genes profile across multiple streptococcal strains. In conclusion, StreptoBase offers access to a range of streptococci genomic resources as well as analysis tools and will be an invaluable platform to accelerate research in streptococci. Database URL: http://streptococcus.um.edu.my.

  8. StreptoBase: An Oral Streptococcus mitis Group Genomic Resource and Analysis Platform.

    Science.gov (United States)

    Zheng, Wenning; Tan, Tze King; Paterson, Ian C; Mutha, Naresh V R; Siow, Cheuk Chuen; Tan, Shi Yang; Old, Lesley A; Jakubovics, Nicholas S; Choo, Siew Woh

    2016-01-01

    The oral streptococci are spherical Gram-positive bacteria categorized under the phylum Firmicutes which are among the most common causative agents of bacterial infective endocarditis (IE) and are also important agents in septicaemia in neutropenic patients. The Streptococcus mitis group is comprised of 13 species including some of the most common human oral colonizers such as S. mitis, S. oralis, S. sanguinis and S. gordonii as well as species such as S. tigurinus, S. oligofermentans and S. australis that have only recently been classified and are poorly understood at present. We present StreptoBase, which provides a specialized free resource focusing on the genomic analyses of oral species from the mitis group. It currently hosts 104 S. mitis group genomes including 27 novel mitis group strains that we sequenced using the high throughput Illumina HiSeq technology platform, and provides a comprehensive set of genome sequences for analyses, particularly comparative analyses and visualization of both cross-species and cross-strain characteristics of S. mitis group bacteria. StreptoBase incorporates sophisticated in-house designed bioinformatics web tools such as Pairwise Genome Comparison (PGC) tool and Pathogenomic Profiling Tool (PathoProT), which facilitate comparative pathogenomics analysis of Streptococcus strains. Examples are provided to demonstrate how StreptoBase can be employed to compare genome structure of different S. mitis group bacteria and putative virulence genes profile across multiple streptococcal strains. In conclusion, StreptoBase offers access to a range of streptococci genomic resources as well as analysis tools and will be an invaluable platform to accelerate research in streptococci. Database URL: http://streptococcus.um.edu.my.

  9. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review.

    Science.gov (United States)

    Lamb, Karen E; Thornton, Lukar E; Cerin, Ester; Ball, Kylie

    2015-01-01

    Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.
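
    A minimal sketch of one of the regression approaches named above, a negative binomial model of outlet counts against area disadvantage with a population offset (simulated data; variable names and coefficients are invented):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 200
disadvantage = rng.uniform(0.0, 1.0, n)      # neighbourhood disadvantage score
population = rng.uniform(1000.0, 5000.0, n)  # residents per neighbourhood

# Simulate outlet counts that rise with disadvantage (illustrative only)
mu = np.exp(0.5 + 1.2 * disadvantage + np.log(population / 1000.0))
outlets = rng.poisson(mu)

X = sm.add_constant(disadvantage)
# Negative binomial GLM with a population offset to model outlet *rates*
model = sm.GLM(outlets, X, family=sm.families.NegativeBinomial(alpha=0.5),
               offset=np.log(population / 1000.0))
result = model.fit()
print(result.params)  # slope estimates the disadvantage-access association
```

    As the review cautions, residuals from such models should still be checked for spatial autocorrelation before the association is interpreted.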

  10. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.

  11. Metabolite analysis of endophytic fungi from cultivars of Zingiber officinale Rosc. identifies myriad of bioactive compounds including tyrosol.

    Science.gov (United States)

    Anisha, C; Radhakrishnan, E K

    2017-06-01

    Endophytic fungi associated with rhizomes of four cultivars of Zingiber officinale were identified by molecular and morphological methods and evaluated for their activity against soft rot pathogen Pythium myriotylum and clinical pathogens. The volatile bioactive metabolites produced by these isolates were identified by GC-MS analysis of the fungal crude extracts. Understanding of the metabolites produced by endophytes is also important in the context of raw consumption of ginger as medicine and spice. A total of fifteen isolates were identified from the four varieties studied. The various genera identified were Acremonium sp., Gliocladiopsis sp., Fusarium sp., Colletotrichum sp., Aspergillus sp., Phlebia sp., Earliella sp., and Pseudolagarobasidium sp. The endophytic community was unique to each variety, which could be due to the varying host genotype. Fungi from phylum Basidiomycota were identified for the first time from ginger. Seven isolates showed activity against Pythium, while only two showed antibacterial activity. The bioactive metabolites identified in the fungal crude extracts include tyrosol, benzene acetic acid, ergone, dehydromevalonic lactone, N-aminopyrrolidine, and many bioactive fatty acids and their derivatives which included linoleic acid, oleic acid, myristic acid, n-hexadecanoic acid, palmitic acid methyl ester, and methyl linoleate. The presence of these varying bioactive endophytic fungi may be one of the reasons for the differences in the performance of the different ginger varieties.

  12. Lipidomic data analysis: Tutorial, practical guidelines and applications

    Energy Technology Data Exchange (ETDEWEB)

    Checa, Antonio [Department of Medical Biochemistry and Biophysics, Karolinska Institutet, Scheeles väg 2, SE-171 77 Stockholm (Sweden); Bedia, Carmen [Department of Environmental Chemistry, IDAEA-CSIC, Jordi Girona 18–26, Barcelona 08034 (Spain); Jaumot, Joaquim, E-mail: joaquim.jaumot@idaea.csic.es [Department of Environmental Chemistry, IDAEA-CSIC, Jordi Girona 18–26, Barcelona 08034 (Spain)

    2015-07-23

    Highlights: • An overview of chemometric methods applied to lipidomic data analysis is presented. • A lipidomic data set is analyzed showing the strengths of the introduced methods. • Practical guidelines for lipidomic data analysis are discussed. • Examples of applications of lipidomic data analysis in different fields are provided. - Abstract: Lipids are a broad group of biomolecules involved in diverse critical biological roles such as cellular membrane structure, energy storage or cell signaling and homeostasis. Lipidomics is the -omics science that pursues the comprehensive characterization of lipids present in a biological sample. Different analytical strategies such as nuclear magnetic resonance or mass spectrometry with or without previous chromatographic separation are currently used to analyze the lipid composition of a sample. However, current analytical techniques provide a vast amount of data which complicates the interpretation of results without the use of advanced data analysis tools. The choice of the appropriate chemometric method is essential to extract valuable information from the crude data as well as to interpret the lipidomic results in the biological context studied. The present work summarizes the diverse methods of analysis that can be used to study lipidomic data, from statistical inference tests to more sophisticated multivariate analysis methods. In addition to the theoretical description of the methods, application of various methods to a particular lipidomic data set as well as literature examples are presented.
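
    As a small illustration of the multivariate end of this toolbox, the following generic sketch applies a common pre-treatment (log transform and autoscaling) followed by principal component analysis to a simulated lipidomic peak-area matrix (invented data; not taken from the tutorial):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Hypothetical data: 30 samples x 120 lipid species (e.g., LC-MS peak areas)
X = rng.lognormal(mean=2.0, sigma=0.5, size=(30, 120))

# Log-transform and autoscale: common pre-treatment before multivariate analysis
Xs = StandardScaler().fit_transform(np.log(X))

pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)        # sample coordinates in PC space
print(pca.explained_variance_ratio_)  # variance captured by PC1 and PC2
```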

  13. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    Science.gov (United States)

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Casework samples with low DNA content that are subject to short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle, or brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, and sophisticated equipment; it also requires transport time and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample types amenable to rapid DNA analysis.

  14. STAF: A Powerful and Sophisticated CAI System.

    Science.gov (United States)

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  15. Rising Trend: Complex and sophisticated attack methods

    Indian Academy of Sciences (India)

    Stux, DuQu, Nitro, Luckycat, Exploit Kits, FLAME. ADSL/SoHo Router Compromise. Botnets of compromised ADSL/SoHo Routers; User Redirection via malicious DNS entry. Web Application attacks. SQL Injection, RFI etc. More and more Webshells. More utility to hackers; Increasing complexity and evading mechanisms.

  16. Endothelial microparticles: Sophisticated vesicles modulating vascular function

    Science.gov (United States)

    Curtis, Anne M; Edelberg, Jay; Jonas, Rebecca; Rogers, Wade T; Moore, Jonni S; Syed, Wajihuddin; Mohler, Emile R

    2015-01-01

    Endothelial microparticles (EMPs) belong to a family of extracellular vesicles that are dynamic, mobile, biological effectors capable of mediating vascular physiology and function. The release of EMPs can impart autocrine and paracrine effects on target cells through surface interaction, cellular fusion, and, possibly, the delivery of intra-vesicular cargo. A greater understanding of the formation, composition, and function of EMPs will broaden our understanding of endothelial communication and may expose new pathways amenable for therapeutic manipulation. PMID:23892447

  17. Rising Trend: Complex and sophisticated attack methods

    Indian Academy of Sciences (India)

    Increased frequency and intensity of DoS/DDoS. Few Gbps is now normal; Anonymous VPNs being used; Botnets being used as a vehicle for launching DDoS attacks. Large scale booking of domain names. Hundred thousands of domains registered in short duration via few registrars; Single registrant; Most of the domains ...

  18. A frequency domain linearized Navier-Stokes method including acoustic damping by eddy viscosity using RANS

    Science.gov (United States)

    Holmberg, Andreas; Kierkegaard, Axel; Weng, Chenyang

    2015-06-01

    In this paper, a method for including damping of acoustic energy in regions of strong turbulence is derived for a linearized Navier-Stokes method in the frequency domain. The proposed method is validated and analyzed in 2D only, although the formulation is fully presented in 3D. The result is applied in a study of the linear interaction between the acoustic and the hydrodynamic field in a 2D T-junction, subject to grazing flow at Mach 0.1. Part of the acoustic energy at the upstream edge of the junction is shed as harmonically oscillating disturbances, which are conveyed across the shear layer over the junction, where they interact with the acoustic field. As the acoustic waves travel in regions of strong shear, there is a need to include the interaction between the background turbulence and the acoustic field. For this purpose, the oscillation of the background turbulence Reynolds stress, due to the acoustic field, is modeled using an eddy Newtonian model assumption. The time-averaged flow is first solved for using RANS along with a k-ε turbulence model. The spatially varying turbulent eddy viscosity is then added to the spatially invariant kinematic viscosity in the acoustic set of equations. The response of the 2D T-junction to an incident acoustic field is analyzed via a plane wave scattering matrix model, and the result is compared to experimental data for a T-junction of rectangular ducts. A strong improvement in the agreement between calculation and experimental data is found when the modification proposed in this paper is implemented. Discrepancies remaining are likely due to inaccuracies in the selected turbulence model, which is known to produce large errors, e.g., for flows with significant rotation, which the grazing flow across the T-junction certainly is. A natural next step is therefore to test the proposed methodology together with more sophisticated turbulence models.
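
    The proposed modification amounts to augmenting the molecular viscosity in the linearized equations with the RANS-derived turbulent eddy viscosity; in standard k-ε notation (a textbook relation, not the paper's full formulation):

```latex
\nu_{\mathrm{eff}}(\mathbf{x}) \;=\; \nu + \nu_t(\mathbf{x}), \qquad
\nu_t = C_\mu \frac{k^2}{\varepsilon}, \quad C_\mu \approx 0.09
```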

  19. Modal testing and analysis of NOVA laser structures

    International Nuclear Information System (INIS)

    Burdick, R.B.; Weaver, H.J.; Pastrnak, J.W.

    1984-09-01

    NOVA, currently the world's most powerful laser system, is an ongoing project at the Lawrence Livermore National Laboratory in California. The project seeks to develop a feasible method of achieving a controlled fusion reaction, initiated by multiple laser beams targeted on a tiny fuel pellet. The NOVA system consists of several large steel-framed structures, the largest of which is the Target Chamber Tower. In conjunction with design engineers, the tower was first modelled and analyzed by sophisticated finite element techniques. A modal test was then conducted on the tower structure to evaluate its vibrational characteristics and seismic integrity, as well as for general comparison with the finite element results. This paper discusses the procedure used in the experimental modal analysis and the results obtained from that test.

  20. Bioconductor workflow for microbiome data analysis: from raw reads to community analyses [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Ben J. Callahan

    2016-06-01

    High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or microbial composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, including both parametric and nonparametric methods. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests, partial least squares and linear models as well as nonparametric testing using community networks and the ggnetwork package.

  1. Building Software Tools for Combat Modeling and Analysis

    National Research Council Canada - National Science Library

    Yuanxin, Chen

    2004-01-01

    ... (Meta-Language for Combat Simulations) and its associated parser and C++ code generator were designed to reduce the amount of time and developmental efforts needed to build sophisticated real world combat simulations. A C++...

  2. Developments and needs in nuclear analysis of fusion technology

    Energy Technology Data Exchange (ETDEWEB)

    Pampin, R., E-mail: raul.pampin@f4e.europa.eu [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Davis, A. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Izquierdo, J. [F4E Fusion For Energy, Josep Pla 2, Torres Diagonal Litoral B3, Barcelona 08019 (Spain); Leichtle, D. [Karlsruhe Institute of Technology (KIT), Hermann-von-Helmholtz Platz 1, D-76344 Karlsruhe (Germany); Loughlin, M.J. [ITER Organisation, Route de Vinon sur Verdon, 13115 Saint Paul lez Durance (France); Sanz, J. [UNED, Departamento de Ingenieria Energetica, Juan del Rosal 12, 28040 Madrid (Spain); Turner, A. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Villari, R. [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy); Wilson, P.P.H. [University of Wisconsin, Nuclear Engineering Department, Madison, WI (United States)

    2013-10-15

    Highlights: • Complex fusion nuclear analyses require detailed models, sophisticated acceleration and coupling of cumbersome tools. • Progress on development of tools and methods to meet specific needs of fusion nuclear analysis reported. • Advances in production of reference models and in preparation and QA of acceleration and coupling algorithms shown. • Evaluation and adaptation studies of alternative transport codes presented. • Discussion made of the importance of efforts in these and other areas, considering some of the more pressing needs. -- Abstract: Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case of fusion technology in current experiments, ITER, next-step devices and power plant studies. Calculations are intricate and computer-intensive, typically requiring detailed geometry models, sophisticated acceleration algorithms, high-performance parallel computations, and coupling of large and complex transport and activation codes and databases. This paper reports progress on some key areas in the development of tools and methods to meet the specific needs of fusion nuclear analyses. In particular, advances in the production and modernisation of reference models, in the preparation and quality assurance of acceleration algorithms and coupling schemes, and in the evaluation and adaptation of alternative transport codes are presented. Emphasis is given to ITER-relevant activities, which are the main driver of advances in the field. Discussion is made of the importance of efforts in these and other areas, considering some of the more pressing needs and requirements. In some cases, they call for a more efficient and coordinated use of the scarce resources available.

  3. Environmental impact assessment including indirect effects--a case study using input-output analysis

    International Nuclear Information System (INIS)

    Lenzen, Manfred; Murray, Shauna A.; Korte, Britta; Dey, Christopher J.

    2003-01-01

    Environmental impact assessment (EIA) is a process covered by several international standards, dictating that as many environmental aspects as possible should be identified in a project appraisal. While the ISO 14011 standard stipulates a broad-ranging study, off-site, indirect impacts are not specifically required for an Environmental Impact Statement (EIS). The reasons for this may relate to the perceived difficulty of measuring off-site impacts, or the assumption that these are a relatively insignificant component of the total impact. In this work, we describe a method that uses input-output analysis to calculate the indirect effects of a development proposal in terms of several indicator variables. The results of our case study of a Second Sydney Airport show that the total impacts are considerably higher than the on-site impacts for the indicators land disturbance, greenhouse gas emissions, water use, emissions of NOx and SO2, and employment. We conclude that employing input-output analysis enhances conventional EIA, as it allows for national and international effects to be taken into account in the decision-making process.
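
    The indirect effects follow from the standard Leontief formulation underlying input-output analysis (generic notation, not specific to this study):

```latex
x = (I - A)^{-1} y, \qquad
E_{\mathrm{tot}} = \mathbf{e}^{\top} (I - A)^{-1} y
```

    where $A$ is the technical coefficient matrix, $y$ the final-demand vector representing the project, $x$ the total (direct plus indirect) output it induces, and $\mathbf{e}$ a vector of direct impact intensities (e.g., greenhouse gas emissions per unit output).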

  4. An Excel‐based implementation of the spectral method of action potential alternans analysis

    Science.gov (United States)

    Pearman, Charles M.

    2014-01-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
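
    The core of the spectral method can be sketched in a few lines (a generic Python illustration of the same idea; the published tool itself is written in Visual Basic for Applications using Excel as a shell):

```python
import numpy as np

def alternans_k_score(apd, noise_band=(0.40, 0.46)):
    """Spectral alternans measure from a beat-aligned series (e.g., APD per beat)."""
    apd = np.asarray(apd, dtype=float)
    apd = apd - apd.mean()
    spec = np.abs(np.fft.rfft(apd)) ** 2 / len(apd)
    freqs = np.fft.rfftfreq(len(apd), d=1.0)  # cycles/beat, up to 0.5
    noise = spec[(freqs >= noise_band[0]) & (freqs < noise_band[1])]
    alt_power = spec[np.argmin(np.abs(freqs - 0.5))]
    # k-score: alternans power relative to the spectral noise floor
    return (alt_power - noise.mean()) / noise.std()

# Toy series: 2:1 alternation of +/- 1 ms on a 200 ms APD, plus noise
rng = np.random.default_rng(2)
beats = 200.0 + np.tile([1.0, -1.0], 64) + rng.normal(0.0, 0.3, 128)
print(alternans_k_score(beats))  # large k-score -> significant alternans
```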

  5. Reliability analysis of the auxiliary feedwater system of Angra-1 including common cause failures using the multiple greek letter model

    International Nuclear Information System (INIS)

    Lapa, Celso Marcelo Franklin.

    1996-05-01

    The use of redundancy to increase the reliability of industrial systems makes them subject to the occurrence of common cause events. Industrial experience and the results of safety analysis studies have indicated that common cause failures are the main contributors to the unreliability of plants that have redundant systems, especially nuclear power plants. In this thesis, procedures are developed to include the impact of common cause failures in the calculation of the top event occurrence probability of the Auxiliary Feedwater System in a typical two-loop nuclear power plant (PWR). For this purpose the Multiple Greek Letter model is used. (author). 14 refs., 10 figs., 11 tabs
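
    For a three-train system such as an auxiliary feedwater system, the MGL model expresses the probabilities of one, two, or all three trains failing in terms of the total component failure probability Q_t and the Greek-letter parameters (standard expressions in the NUREG/CR-4780 convention; the thesis's exact parameterization may differ):

```latex
Q_1 = (1-\beta)\,Q_t, \qquad
Q_2 = \tfrac{1}{2}\,\beta\,(1-\gamma)\,Q_t, \qquad
Q_3 = \beta\,\gamma\,Q_t
```

    Here $\beta$ is the conditional probability that a component failure is shared with at least one other train, and $\gamma$ the conditional probability that, given such a shared failure, all three trains fail.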

  6. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  7. HPTAM, a two-dimensional Heat Pipe Transient Analysis Model, including the startup from a frozen state

    Science.gov (United States)

    Tournier, Jean-Michel; El-Genk, Mohamed S.

    1995-01-01

    A two-dimensional Heat Pipe Transient Analysis Model, 'HPTAM,' was developed to simulate the transient operation of fully-thawed heat pipes and the startup of heat pipes from a frozen state. The model incorporates: (a) sublimation and resolidification of working fluid; (b) melting and freezing of the working fluid in the porous wick; (c) evaporation of thawed working fluid and condensation as a thin liquid film on a frozen substrate; (d) free-molecule, transition, and continuum vapor flow regimes, using the Dusty Gas Model; (e) liquid flow and heat transfer in the porous wick; and (f) thermal and hydrodynamic couplings of phases at their respective interfaces. HPTAM predicts the radius of curvature of the liquid meniscus at the liquid-vapor interface and the radial location of the working fluid level (liquid or solid) in the wick. It also includes the transverse momentum jump condition (capillary relationship of Pascal) at the liquid-vapor interface and geometrically relates the radius of curvature of the liquid meniscus to the volume fraction of vapor in the wick. The present model predicts the capillary limit and partial liquid recess (dryout) in the evaporator wick, and incorporates a liquid pooling submodel, which simulates accumulation of the excess liquid in the vapor core at the condenser end.
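
    The transverse momentum jump condition referred to above is the familiar capillary pressure relation across the liquid-vapor meniscus (generic form, not the paper's full model):

```latex
P_v - P_l \;=\; \frac{2\sigma}{R_m}
```

    where $\sigma$ is the surface tension and $R_m$ the radius of curvature of the liquid meniscus; the capillary limit is approached as $R_m$ shrinks toward the effective pore radius of the wick.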

  8. Microstructure of cheese: Processing, technological and microbiological considerations

    OpenAIRE

    Pereira, Cláudia I.; Gomes, Ana M. P.; Malcata, F. Xavier

    2009-01-01

    Cheese is a classical dairy product, which is strongly judged by its appearance and texture; hence, a renewed interest in its microstructure has been on the rise, as sophisticated techniques of analysis become more and more informative and widely available. Processing parameters that affect microstructure play a dominant role upon the features exhibited by the final product as perceived by the consumer; rational relationships between microstructure (which includes biochem...

  9. Modeling accelerator structures and RF components

    International Nuclear Information System (INIS)

    Ko, K., Ng, C.K.; Herrmannsfeldt, W.B.

    1993-03-01

    Computer modeling has become an integral part of the design and analysis of accelerator structures and RF components. Sophisticated 3D codes, powerful workstations and timely theory support all contributed to this development. We will describe our modeling experience with these resources and discuss their impact on ongoing work at SLAC. Specific examples from R&D on a future linear collider and a proposed e+e- storage ring will be included

  10. Biomimetic Envelopes

    OpenAIRE

    Ilaria Mazzoleni

    2010-01-01

    This article presents a design learning experience on how to translate lessons learned from the analysis and observation of the animal world. Skin is a complex and incredibly sophisticated organ that performs various functions, including protection, sensation, and heat and water regulation. In a similar way, building envelopes serve multiple roles, as they are the interface between the building inhabitants and environmental elements. The resulting architectural building envelopes prot...

  11. Analysis of the jet pipe electro-hydraulic servo valve with finite element methods

    Directory of Open Access Journals (Sweden)

    Kaiyu Zhao

    2018-01-01

    Analysis of the dynamic characteristics of the jet pipe electro-hydraulic servo valve based on experience and mathematical derivation has been difficult and imprecise. We therefore analysed the armature feedback components, the torque motor and the jet pipe receiver of the electro-hydraulic servo valve with sophisticated finite element analysis tools and obtained physically meaningful data for these parts. The data were then fitted in Matlab and the mathematical relationships among them were calculated. Using these relationships, we performed a dynamic multi-physics Simulink co-simulation and obtained the input-output relationship of the overall valve, its frequency response and its step response. This approach represents the actual working conditions accurately. The materials and the impact of the critical design dimensions were also considered in the finite element analysis. This work provides new ideas for the overall design of jet pipe electro-hydraulic servo valves.

  12. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders eEklund

    2014-03-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm3 brain template in 4-6 seconds, and run a second-level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/).
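
    A second-level permutation test of the kind BROCCOLI accelerates can be sketched serially as follows (a generic illustration with simulated data; BROCCOLI's actual implementation parallelizes the permutation loop in OpenCL):

```python
import numpy as np

def permutation_test(group_a, group_b, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means."""
    rng = np.random.default_rng(seed)
    data = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    observed = group_a.mean() - group_b.mean()
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(data)  # relabel subjects at random
        null[i] = perm[:n_a].mean() - perm[n_a:].mean()
    # Two-sided p-value from the empirical null distribution
    return np.mean(np.abs(null) >= abs(observed))

a = np.random.default_rng(1).normal(0.5, 1.0, 20)  # e.g., patient contrasts
b = np.random.default_rng(2).normal(0.0, 1.0, 20)  # e.g., control contrasts
print(permutation_test(a, b))
```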

  13. Philosophical rhetoric and sophistical dialectic: some implications of Plato’s critique of rhetoric in the Phaedrus and the Sophist

    NARCIS (Netherlands)

    Wagemans, J.H.M.; Blair, J.A.; Farr, D.; Hansen, H.V.; Johnson, R.H.; Tindale, C.W.

    2003-01-01

    My PhD research concentrates on the philosophical backgrounds of the relationship between dialectic and rhetoric. In order to pinpoint the discord between both disciplines, I studied their genesis and early history. In this paper, some characteristics of both disciplines will be outlined by

  14. Automated analysis of off-line measured gamma-spectra using UniSampo gamma-ray spectrum analysis software including criteria for alarming systems

    International Nuclear Information System (INIS)

    Nikkinen, M.T.

    2005-01-01

    In many laboratories the number of routinely measured gamma-spectra can be significant, and reviewing all the data is a time-consuming and expensive task. In many cases a routine sample does not contain radiation above the detectable level, yet the spectrum still has to be reviewed. By introducing simple rules for alarming conditions, the review work can be significantly reduced. In one case the need to review environmental measurement spectra was reduced to less than 1% compared to the original need, which in turn made the review personnel available for more useful functions. Using the UniSampo analysis system, the analysis results of spectra that cause alarming conditions can be transmitted via e-mail to any address. Some systems are even equipped with the capability to forward these results to hand-portable telephones or pagers. This is a very practical solution for automated environmental monitoring, when the sample spectra are collected automatically and transmitted to a central computer for further analysis. The set-up of an automatic analysis system, rules for the alarming conditions, technical solutions for an automated alarming system, and a generic hypothesis test for the alarming system developed for the UniSampo analysis software are described. (author)

  15. Remote Access to Instrumental Analysis for Distance Education in Science

    Directory of Open Access Journals (Sweden)

    Dietmar Kennepohl

    2005-11-01

    Remote access to experiments offers distance educators another tool to integrate a strong laboratory component within a science course. Since virtually all modern chemical instrumental analyses in industry now use devices operated by a computer interface, remote control of instrumentation is not only relatively facile, it enhances students' opportunity to learn the subject matter and be exposed to "real world" content. Northern Alberta Institute of Technology (NAIT) and Athabasca University are developing teaching laboratories based on the control of analytical instruments in real-time via an Internet connection. Students perform real-time analysis using equipment, methods, and skills that are common to modern analytical laboratories (or sophisticated teaching laboratories). Students obtain real results using real substances to arrive at real conclusions, just as they would if they were in a physical laboratory with the equipment; this approach allows students to conduct instrumental science experiments remotely, providing them with an advantageous route to upgrade their laboratory skills while learning at a distance.

  16. Compendium of computer codes for the safety analysis of LMFBR's

    International Nuclear Information System (INIS)

    1975-06-01

    A high level of mathematical sophistication is required in the safety analysis of LMFBR's to adequately meet the demands for realism and confidence in all areas of accident consequence evaluation. The numerical solution procedures associated with these analyses are generally so complex and time-consuming as to necessitate their programming into computer codes. These computer codes have become extremely powerful tools for safety analysis, combining unique advantages in accuracy, speed and cost. The number, diversity and complexity of LMFBR safety codes in the U.S. has grown rapidly in recent years. It is estimated that over 100 such codes exist in various stages of development throughout the country. It is inevitable that such a large assortment of codes will require rigorous cataloguing and abstracting to aid individuals in identifying what is available. It is the purpose of this compendium to provide such a service through the compilation of code summaries which describe and clarify the status of domestic LMFBR safety codes. (U.S.)

  17. Genome and transcriptome analysis of the food-yeast Candida utilis.

    Directory of Open Access Journals (Sweden)

    Yasuyuki Tomita

    The industrially important food-yeast Candida utilis is a Crabtree effect-negative yeast used to produce valuable chemicals and recombinant proteins. In the present study, we conducted whole genome sequencing and phylogenetic analysis of C. utilis, which showed that this yeast diverged long before the formation of the CUG and Saccharomyces/Kluyveromyces clades. In addition, we performed comparative genome and transcriptome analyses using next-generation sequencing, which resulted in the identification of genes important for characteristic phenotypes of C. utilis, such as those involved in nitrate assimilation, in addition to the gene encoding the functional hexose transporter. We also found that an antisense transcript of the alcohol dehydrogenase gene, which in silico analysis had not predicted to be functional, was transcribed in the stationary phase, suggesting a novel system of repression of ethanol production. These findings should facilitate the development of more sophisticated systems for the production of useful reagents using C. utilis.

  18. Large Deployable Reflector (LDR) feasibility study update

    Science.gov (United States)

    Alff, W. H.; Banderman, L. W.

    1983-01-01

    In 1982 a workshop was held to refine the science rationale for large deployable reflectors (LDR) and develop technology requirements that support the science rationale. At the end of the workshop, a set of LDR consensus systems requirements was established. The subject study was undertaken to update the initial LDR study using the new systems requirements. The study included mirror materials selection and configuration, thermal analysis, structural concept definition and analysis, dynamic control analysis and recommendations for further study. The primary emphasis was on the dynamic controls requirements and the sophistication of the controls system needed to meet LDR performance goals.

  19. Lessons from hot spot analysis for fragment-based drug discovery

    Science.gov (United States)

    Hall, David R.; Vajda, Sandor

    2015-01-01

    Analysis of binding energy hot spots at protein surfaces can provide crucial insights into the prospects for successful application of fragment-based drug discovery (FBDD), and whether a fragment hit can be advanced into a high affinity, druglike ligand. The key factor is the strength of the top ranking hot spot, and how well a given fragment complements it. We show that published data are sufficient to provide a sophisticated and quantitative understanding of how hot spots derive from protein three-dimensional structure, and how their strength, number and spatial arrangement govern the potential for a surface site to bind to fragment-sized and larger ligands. This improved understanding provides important guidance for the effective application of FBDD in drug discovery. PMID:26538314

  20. The evolution of the dorsal thalamus of jawed vertebrates, including mammals: cladistic analysis and a new hypothesis.

    Science.gov (United States)

    Butler, A B

    1994-01-01

    The evolution of the dorsal thalamus in various vertebrate lineages of jawed vertebrates has been an enigma, partly due to two prevalent misconceptions: the belief that the multitude of nuclei in the dorsal thalamus of mammals could be meaningfully compared neither with the relatively few nuclei in the dorsal thalamus of anamniotes nor with the intermediate number of dorsal thalamic nuclei of other amniotes and a definition of the dorsal thalamus that too narrowly focused on the features of the dorsal thalamus of mammals. The cladistic analysis carried out here allows us to recognize which features are plesiomorphic and which apomorphic for the dorsal thalamus of jawed vertebrates and to then reconstruct the major changes that have occurred in the dorsal thalamus over evolution. Embryological data examined in the context of Von Baerian theory (embryos of later-descendant species resemble the embryos of earlier-descendant species to the point of their divergence) supports a new 'Dual Elaboration Hypothesis' of dorsal thalamic evolution generated from this cladistic analysis. From the morphotype for an early stage in the embryological development of the dorsal thalamus of jawed vertebrates, the divergent, sequential stages of the development of the dorsal thalamus are derived for each major radiation and compared. The new hypothesis holds that the dorsal thalamus comprises two basic divisions--the collothalamus and the lemnothalamus--that receive their predominant input from the midbrain roof and (plesiomorphically) from lemniscal pathways, including the optic tract, respectively. Where present, the collothalamic, midbrain-sensory relay nuclei are homologous to each other in all vertebrate radiations as discrete nuclei. Within the lemnothalamus, the dorsal lateral geniculate nucleus of mammals and the dorsal lateral optic nucleus of non-synapsid amniotes (diapsid reptiles, birds and turtles) are homologous as discrete nuclei; most or all of the ventral nuclear group

  1. Price transmission in the Swiss wheat market: does sophisticated border protection make the difference?

    OpenAIRE

    Esposti, Roberto; Listorti, Giulia

    2014-01-01

    This study deals with horizontal wheat price transmission from international markets to the domestic Swiss market. The analysis takes into account trade policies implemented at the border that might shelter the domestic market from international market fluctuations, as well as the presence of explosive behavior in some of the price series. Furthermore, the Swiss case is peculiar due to the presence of different border policies for wheat according to its domestic use, food or feed. The p...

  2. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    Science.gov (United States)

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily
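
    The "arrays of values associated with genomic positions" data model described above can be illustrated generically; the sketch below is not the plastid API (names are invented for illustration) and only shows the idiom of position-indexed count vectors with strand-aware, feature-centric retrieval:

```python
import numpy as np

class PositionArray:
    """Toy position-indexed count store (illustrative; not the plastid API)."""

    def __init__(self, chrom_lengths):
        # One counts vector per chromosome, indexed by 0-based position
        self.counts = {c: np.zeros(n, dtype=int) for c, n in chrom_lengths.items()}

    def add_read(self, chrom, pos):
        self.counts[chrom][pos] += 1

    def get_counts(self, chrom, start, end, strand="+"):
        vec = self.counts[chrom][start:end]
        # Feature-centric coordinates: reverse for minus-strand features
        return vec[::-1] if strand == "-" else vec

ga = PositionArray({"chrI": 1000})
for p in (10, 10, 12):
    ga.add_read("chrI", p)
print(ga.get_counts("chrI", 8, 14))       # [0 0 2 0 1 0]
print(ga.get_counts("chrI", 8, 14, "-"))  # [0 1 0 2 0 0]
```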

  3. Growth curve analysis for plasma profiles using smoothing splines. Final report, January 1993--January 1995

    International Nuclear Information System (INIS)

    Imre, K.

    1995-07-01

    In this project, we parameterize the shape and magnitude of the temperature and density profiles on JET and the temperature profiles on TFTR. The key control variables for the profiles were tabulated and the response functions were estimated. A sophisticated statistical analysis code was developed to fit the plasma profiles. Our analysis indicates that the JET density shape depends primarily on n̄/B_t for Ohmic heating, on n̄ for L-mode, and on I_p for H-mode. The temperature profiles for JET are mainly determined by q_95 for the case of Ohmic heating, and by B_t and P/n̄ for the L-mode. For the H-mode, the shape depends on the type of auxiliary heating, Z_eff, n̄, q_95, and P.
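
    The profile fitting itself can be illustrated with a generic smoothing-spline fit (simulated data and invented parameter values; not the project's analysis code):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
rho = np.linspace(0.0, 1.0, 50)  # normalized minor radius
# Simulated noisy temperature profile, roughly parabolic to a power (keV)
Te = 3.0 * (1.0 - rho**2) ** 1.5 + rng.normal(0.0, 0.1, rho.size)

# The smoothing parameter s trades data fidelity against smoothness
spline = UnivariateSpline(rho, Te, k=3, s=rho.size * 0.01)
print(spline(0.0), spline(0.5))  # fitted central and mid-radius temperatures
```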

  4. Process simulation and uncertainty analysis of plasma arc mixed waste treatment

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Welch, T.D.

    1994-01-01

    Innovative mixed waste treatment subsystems have been analyzed for performance, risk, and life-cycle cost as part of the U.S. Department of Energy's (DOE's) Mixed Waste Integrated Program (MWIP) treatment alternatives development and evaluation process. This paper concerns the analysis of mixed waste treatment system performance. Performance systems analysis includes approximate material and energy balances and assessments of operability, effectiveness, and reliability. Preliminary material and energy balances of innovative processes have been analyzed using FLOW, an object-oriented process simulator for waste management systems under development at Oak Ridge National Laboratory. The preliminary models developed for FLOW provide rough order-of-magnitude calculations useful for sensitivity analysis. The insight gained from modeling these technologies approximately at an early stage will ease the transition to more sophisticated simulators as adequate performance and property data become available. Such models are being developed in ASPEN by DOE's Mixed Waste Treatment Project (MWTP) for baseline and alternative flow sheets based on commercial technologies. One alternative to the baseline developed by the MWIP support groups is plasma arc treatment. This process offers a noticeable reduction in the number of process operations compared to the baseline process, because a plasma arc melter is capable of accepting a wide variety of waste streams as direct inputs (without sorting or preprocessing). This innovative process for treating mixed waste replaces several units of the baseline process and thus promises an economic advantage. The performance of the plasma arc furnace will directly affect the quality of the waste form and the requirements of the off-gas treatment units. The ultimate objective of MWIP is to reduce the amount of final waste produced, the cost, and the environmental impact

  5. Mechanical break junctions: enormous information in a nanoscale package.

    Science.gov (United States)

    Natelson, Douglas

    2012-04-24

    Mechanical break junctions, particularly those in which a metal tip is repeatedly moved in and out of contact with a metal film, have provided many insights into electronic conduction at the atomic and molecular scale, most often by averaging over many possible junction configurations. This averaging throws away a great deal of information, and Makk et al. in this issue of ACS Nano demonstrate that, with both simulated and real experimental data, more sophisticated two-dimensional analysis methods can reveal information otherwise obscured in simple histograms. As additional measured quantities come into play in break junction experiments, including thermopower, noise, and optical response, these more sophisticated analytic approaches are likely to become even more powerful. While break junctions are not directly practical for useful electronic devices, they are incredibly valuable tools for unraveling the electronic transport physics relevant for ultrascaled nanoelectronics.
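
    As a concrete illustration of the two-dimensional analysis mentioned above, the following sketch builds a 2D conductance-displacement histogram from many (synthetic) break-junction traces; the trace model and parameters are invented, not taken from Makk et al.

```python
# Sketch: 2D conductance-displacement histogram from break-junction traces.
# Traces are synthetic; real data would come from the experiment.
import numpy as np

rng = np.random.default_rng(1)
n_traces, n_points = 500, 200
z = np.linspace(0.0, 1.0, n_points)              # electrode displacement (nm)

traces = []
for _ in range(n_traces):
    plateau = 0.2 + 0.2 * rng.random()           # plateau length varies per trace
    g = np.where(z < plateau, 1.0, 1e-4 * np.exp(-(z - plateau) / 0.1))
    g *= np.exp(rng.normal(0.0, 0.3, n_points))  # multiplicative noise
    traces.append(g)

# Bin all (displacement, log-conductance) pairs into one 2D histogram;
# correlations that are washed out in 1D histograms survive here.
logG = np.log10(np.concatenate(traces))
Z = np.tile(z, n_traces)
hist2d, zedges, gedges = np.histogram2d(Z, logG, bins=(100, 100))
print(hist2d.shape, hist2d.max())
```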

  6. Toward sophisticated basal ganglia neuromodulation: Review on basal ganglia deep brain stimulation.

    Science.gov (United States)

    Da Cunha, Claudio; Boschen, Suelen L; Gómez-A, Alexander; Ross, Erika K; Gibson, William S J; Min, Hoon-Ki; Lee, Kendall H; Blaha, Charles D

    2015-11-01

    This review presents state-of-the-art knowledge about the roles of the basal ganglia (BG) in action-selection, cognition, and motivation, and how this knowledge has been used to improve deep brain stimulation (DBS) treatment of neurological and psychiatric disorders. Such pathological conditions include Parkinson's disease, Huntington's disease, Tourette syndrome, depression, and obsessive-compulsive disorder. The first section presents evidence supporting current hypotheses of how the cortico-BG circuitry works to select motor and emotional actions, and how defects in this circuitry can cause symptoms of the BG diseases. Emphasis is given to the role of striatal dopamine on motor performance, motivated behaviors and learning of procedural memories. Next, the use of cutting-edge electrochemical techniques in animal and human studies of BG functioning under normal and disease conditions is discussed. Finally, functional neuroimaging studies are reviewed; these works have shown the relationship between cortico-BG structures activated during DBS and improvement of disease symptoms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Interpreting and Presenting Data to Management. Air Professional File Number 36.

    Science.gov (United States)

    Clagett, Craig A.

    Guidelines are offered to institutional researchers and planning analysts for presenting research results in formats and levels of sophistication that are accessible to top management. Fundamental principles include: (1) know what is needed; (2) know when the information is needed; (3) match format to analytical sophistication and learning…

  8. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose a FCA methodology that uses standard cost and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standards performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
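
    A minimal numeric sketch of the standard-cost-times-actual-quantity idea and the resulting variance analysis follows; all figures are invented for illustration and are not from the study.

```python
# Flexible-budget variance per waste stream: standard cost x actual quantity
# versus actual total cost. All numbers are hypothetical.
standard_cost_per_tonne = {"separated": 95.0, "undifferentiated": 60.0}  # EUR/t
actual_quantity = {"separated": 1200.0, "undifferentiated": 3400.0}      # t/year
actual_total_cost = {"separated": 123500.0, "undifferentiated": 198000.0}

for stream, qty in actual_quantity.items():
    flexible_budget = standard_cost_per_tonne[stream] * qty
    variance = actual_total_cost[stream] - flexible_budget
    status = "off-standard" if variance > 0 else "favourable"
    print(f"{stream}: budget {flexible_budget:.0f} EUR, "
          f"variance {variance:+.0f} EUR ({status})")
```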

  9. PWR core safety analysis with 3-dimensional methods

    International Nuclear Information System (INIS)

    Gensler, A.; Kühnel, K.; Kuch, S.

    2015-01-01

    Highlights: • An overview of AREVA’s safety analysis codes and their coupling is provided. • The validation base and licensing applications of these codes are summarized. • Coupled codes and methods provide improved margins by avoiding excess conservatism. • Examples for REA and inadvertent opening of the pressurizer safety valve are given. - Abstract: The main focus of safety analysis is to demonstrate the required safety level of the reactor core. Because of the demanding requirements, the quality of the safety analysis strongly affects the confidence in the operational safety of a reactor. To ensure the highest quality, it is essential that the methodology consists of appropriate analysis tools, an extensive validation base, and last but not least highly educated engineers applying the methodology. The sophisticated 3-dimensional core models applied by AREVA ensure that all physical effects relevant for safety are treated and the results are reliable and conservative. Presently AREVA employs SCIENCE, CASMO/NEMO and CASCADE-3D for pressurized water reactors. These codes are currently being consolidated into the next generation 3D code system ARCADIA®. AREVA continuously extends the validation base, including measurement campaigns in test facilities and comparisons of the predictions with steady state and transient measured data gathered from plants during many years of operation. Thus, the core models provide reliable and comprehensive results for a wide range of applications. For the application of these powerful tools, AREVA takes advantage of its interdisciplinary know-how and international teamwork. Experienced engineers of different technical backgrounds work together to ensure an appropriate interpretation of the calculation results and uncertainty analyses, while continuously maintaining and enhancing the quality of the analysis methodologies. In this paper, an overview of AREVA’s broad application experience as well as the broad validation base is presented.

  10. Alpha-adducin Gly460Trp polymorphism and hypertension risk: a meta-analysis of 22 studies including 14303 cases and 15961 controls.

    Directory of Open Access Journals (Sweden)

    Kuo Liu

    Full Text Available BACKGROUND: No clear consensus has been reached on the alpha-adducin polymorphism (Gly460Trp) and essential hypertension risk. We performed a meta-analysis in an effort to systematically summarize the possible association. METHODOLOGY/PRINCIPAL FINDINGS: Studies were identified by searching the MEDLINE and EMBASE databases, complemented with perusal of bibliographies of retrieved articles and correspondence with original authors. The fixed-effects model and the random-effects model were applied for dichotomous outcomes to combine the results of the individual studies. We selected 22 studies that met the inclusion criteria, including a total of 14303 hypertensive patients and 15961 normotensive controls. Overall, the 460Trp allele showed no statistically significant association with hypertension risk compared to the Gly460 allele (P = 0.69, OR = 1.02, 95% CI 0.94-1.10, P(heterogeneity) < 0.0001) in all subjects. Meta-analysis under other genetic contrasts still did not reveal any significant association in all subjects, Caucasians, East Asians and others. The results were similar, but heterogeneity did not persist, when sensitivity analyses were limited to these studies. CONCLUSIONS/SIGNIFICANCE: Our meta-analysis failed to provide evidence for the genetic association of the α-adducin gene Gly460Trp polymorphism with hypertension. Further studies investigating the effect of genetic networks, environmental factors, individual biological characteristics and their mutual interactions are needed to elucidate the possible mechanism for hypertension in humans.
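
    For readers unfamiliar with the pooling step, the following is a hedged sketch of inverse-variance fixed-effect pooling of odds ratios with a DerSimonian-Laird random-effects variant; the per-study numbers are toy values, not the 22 studies analyzed in the paper.

```python
# Inverse-variance pooling of log odds ratios: fixed effect, then a
# DerSimonian-Laird random-effects estimate. Toy inputs only.
import numpy as np

or_i = np.array([1.10, 0.95, 1.25, 0.88, 1.05])    # per-study odds ratios
se_log = np.array([0.10, 0.08, 0.15, 0.12, 0.09])  # SE of log(OR)

y, v = np.log(or_i), se_log**2
w = 1.0 / v                                        # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)

# DerSimonian-Laird between-study variance tau^2
Q = np.sum(w * (y - y_fixed) ** 2)
tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

print(f"fixed-effect OR = {np.exp(y_fixed):.2f}")
print(f"random-effects OR = {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f})")
```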

  11. Modeling and analysis of real-time and embedded systems with UML and MARTE developing cyber-physical systems

    CERN Document Server

    Selic, Bran

    2013-01-01

    Modeling and Analysis of Real-Time and Embedded Systems with UML and MARTE explains how to apply the complex MARTE standard in practical situations. This approachable reference provides a handy user guide, illustrating with numerous examples how you can use MARTE to design and develop real-time and embedded systems and software. Expert co-authors Bran Selic and Sébastien Gérard lead the team that drafted and maintain the standard and give you the tools you need to apply MARTE to overcome the limitations of cyber-physical systems. The functional sophistication required of modern cyber-physical systems demands a modeling language of correspondingly greater expressive power, and MARTE was designed to provide it.

  12. MARS: Microarray analysis, retrieval, and storage system

    Directory of Open Access Journals (Sweden)

    Scheideler Marcel

    2005-04-01

    Full Text Available Abstract Background Microarray analysis has become a widely used technique for the study of gene-expression patterns on a genomic scale. As more and more laboratories are adopting microarray technology, there is a need for powerful and easy to use microarray databases facilitating array fabrication, labeling, hybridization, and data analysis. The wealth of data generated by this high throughput approach renders adequate database and analysis tools crucial for the pursuit of insights into the transcriptomic behavior of cells. Results MARS (Microarray Analysis and Retrieval System) provides a comprehensive MIAME-supportive suite for storing, retrieving, and analyzing multicolor microarray data. The system comprises a laboratory information management system (LIMS), quality control management, as well as a sophisticated user management system. MARS is fully integrated into an analytical pipeline of microarray image analysis, normalization, gene expression clustering, and mapping of gene expression data onto biological pathways. The incorporation of ontologies and the use of MAGE-ML enable an export of studies stored in MARS to public repositories and other databases accepting these documents. Conclusion We have developed an integrated system tailored to serve the specific needs of microarray based research projects using a unique fusion of Web based and standalone applications connected to the latest J2EE application server technology. The presented system is freely available for academic and non-profit institutions. More information can be found at http://genome.tugraz.at.

13. Dipole model analysis of highest precision HERA data, including very low Q²'s

    International Nuclear Information System (INIS)

    Luszczak, A.; Kowalski, H.

    2016-12-01

    We analyse, within a dipole model, the final, inclusive HERA DIS cross section data in the low x region, using fully correlated errors. We show that these highest precision data are very well described within the dipole model framework starting from Q² values of 3.5 GeV² up to the highest values of Q² = 250 GeV². To analyze the saturation effects we evaluated the data including also the very low Q² region, down to Q² = 0.35 GeV². The fits including this region show a preference for the saturation ansatz.

  14. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections.

    Science.gov (United States)

    Stacul, Stefano; Squeglia, Nunziante

    2018-02-15

    A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil, by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction, by increasing the stiffness of shallow portions of soil, modeled using the Modified Kovacs model; and the pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile group analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing its results with data from full-scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed of reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.
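
    As a small illustration of the hyperbolic modulus-reduction idea named above, the following sketch degrades a secant stiffness with mobilized strain; the small-strain modulus and reference strain are invented parameters, not values from the paper.

```python
# Hyperbolic modulus reduction: secant stiffness decays with strain.
# E0 and gamma_ref are hypothetical values for illustration only.
import numpy as np

E0, gamma_ref = 50.0e6, 1.0e-3   # small-strain modulus (Pa), reference strain

def secant_modulus(gamma):
    """Hyperbolic degradation: E(gamma) = E0 / (1 + |gamma|/gamma_ref)."""
    return E0 / (1.0 + np.abs(gamma) / gamma_ref)

for g in (1e-5, 1e-4, 1e-3, 1e-2):
    print(f"strain {g:.0e}: E/E0 = {secant_modulus(g)/E0:.3f}")
```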

  15. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections

    Directory of Open Access Journals (Sweden)

    Stefano Stacul

    2018-02-01

    Full Text Available A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil, by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction, by increasing the stiffness of shallow portions of soil, modeled using the Modified Kovacs model; and the pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile group analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing its results with data from full-scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed of reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.

  16. Introducing the Resources and Energy Analysis Programme (REAP)

    Energy Technology Data Exchange (ETDEWEB)

    Paul, Alistair; Wiedmann, Thomas; Barrett, John; Minx, Jan; Scott, Kate; Dawkins, Elena; Owen, Anne; Briggs, Julian; Gray, Ian

    2010-02-15

    REAP is a highly sophisticated model that helps policy makers to understand and measure the environmental pressures associated with human consumption. It can be used at the local, regional and national levels and generates indicators on: - Carbon dioxide and greenhouse gas emissions measured in tonnes per capita; - The Ecological Footprint required to sustain an area in global hectares per capita; - The Material Flows of products and services through an area measured in thousands of tonnes. REAP contains several unique features and has applications in a wide range of policy areas including transport, housing and planning. The programme's powerful scenario tool models the impacts of policy and creates plausible scenarios of the future. These scenarios can be set against targets or compared to alternative futures based on different trends or assumptions.

  17. Unsaturated Seepage Analysis of Cracked Soil including Development Process of Cracks

    Directory of Open Access Journals (Sweden)

    Ling Cao

    2016-01-01

    Full Text Available Cracks in soil provide preferential pathways for water flow and their morphological parameters significantly affect the hydraulic conductivity of the soil. To study the hydraulic properties of cracks, the dynamic development of cracks in the expansive soil during drying and wetting has been measured in the laboratory. The test results enable the development of relationships between the crack morphological parameters and the water content. In this study, the fractal model has been used to predict the soil-water characteristic curve (SWCC) of the cracked soil, including the developmental process of the cracks. The cracked expansive soil has been considered as a crack-pore medium. A dual media flow model has been developed to simulate the seepage characteristics of the cracked expansive soil. The variations in pore water pressure at different parts of the model are quite different due to the impact of the cracks. This study shows that seepage characteristics can be better predicted if the impact of cracks is taken into account.

  18. A Generalized Email Classification System for Workflow Analysis

    NARCIS (Netherlands)

    P. Chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); C-L. Chang (Chia-Lin); M.J. McAleer (Michael)

    2017-01-01

    One of the most powerful internet communication channels is email. As employees and their clients communicate primarily via email, much crucial business data is conveyed via email content. Where businesses are understandably concerned, they need a sophisticated workflow management system.
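
    The paper's own classifier is not described in this record; as a generic stand-in, the following sketch shows a bag-of-words email classifier with scikit-learn. The categories and messages are invented, and this is not the system proposed by the authors.

```python
# Generic email-classification sketch (bag-of-words + naive Bayes).
# Labels and texts are hypothetical illustration data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "please find attached the invoice for March",
    "meeting moved to 3pm tomorrow",
    "your invoice payment is overdue",
    "agenda for tomorrow's project meeting",
]
labels = ["billing", "scheduling", "billing", "scheduling"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)
print(model.predict(["second invoice reminder", "can we meet on Friday?"]))
```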

  19. A pragmatic approach to including complex natural modes of vibration in aeroelastic analysis

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2015-09-01

    Full Text Available Conference presentation: "A pragmatic approach to including complex natural modes of vibration in aeroelastic analysis", Louw van Zyl, International Aerospace Symposium of South Africa, 14 to 16 September 2015, Stellenbosch, South Africa. [Recovered from slides] Structural dynamics background: the equations of motion are [M]{x''} + [C]{x'} + [K]{x} = {f}; the undamped free-vibration problem [M]{x}s^2 + [K]{x} = 0 yields eigenvalues (the squares of the angular frequencies in radians per second) whose corresponding eigenvectors are real...
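
    As a worked illustration of the free-vibration eigenproblem recovered above, the following sketch solves ([K] - w^2 [M]){x} = 0 for a toy 3-DOF chain; the matrices are invented for illustration, not from the presentation.

```python
# Undamped natural modes of a toy 3-DOF mass-spring chain via the
# generalized symmetric eigenproblem. Matrices are hypothetical.
import numpy as np
from scipy.linalg import eigh

M = np.diag([2.0, 1.5, 1.0])                  # lumped masses (kg)
k = 1.0e4                                     # spring stiffness (N/m)
K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])

eigvals, modes = eigh(K, M)                   # K x = w^2 M x
omegas = np.sqrt(eigvals)                     # angular frequencies (rad/s)
print("natural frequencies (Hz):", omegas / (2.0 * np.pi))
print("first mode shape:", modes[:, 0] / np.max(np.abs(modes[:, 0])))
```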

  20. Economic analysis of including an MRS facility in the waste management system

    International Nuclear Information System (INIS)

    Williams, J.W.; Conner, C.; Leiter, A.J.; Ching, E.

    1992-01-01

    The MRS System Study Summary Report (System Study) in June 1989 concluded that an MRS facility would provide early spent fuel acceptance as well as flexibility for the waste management system. However, these advantages would be offset by an increase in the total system cost (i.e., total cost to the ratepayer) ranging from $1.3 billion to about $2.8 billion depending on the configuration of the waste management system. This paper discusses a new investigation which shows that, in addition to the advantages of an MRS facility described above, a basic (i.e., store-only) MRS facility may result in a cost savings to the total system, primarily due to the inclusion in the analysis of additional at-reactor operating costs for maintaining shutdown reactor sites.

  1. Viscous-Inviscid Methods in Unsteady Aerodynamic Analysis of Bio-Inspired Morphing Wings

    Science.gov (United States)

    Dhruv, Akash V.

    Flight has been one of the greatest realizations of human imagination, revolutionizing communication and transportation over the years. This has greatly influenced the growth of technology itself, enabling researchers to communicate and share their ideas more effectively, extending the human potential to create more sophisticated systems. While the end product of a sophisticated technology makes our lives easier, its development process presents an array of challenges in itself. In the last decade, scientists and engineers have turned towards bio-inspiration to design more efficient and robust aerodynamic systems to enhance the ability of Unmanned Aerial Vehicles (UAVs) to be operated in cluttered environments, where tight maneuverability and controllability are necessary. Effective use of UAVs in domestic airspace will mark the beginning of a new age in communication and transportation. The design of such complex systems necessitates faster and more effective tools to perform preliminary investigations in design, thereby streamlining the design process. This thesis explores the implementation of numerical panel methods for aerodynamic analysis of bio-inspired morphing wings. Numerical panel methods have been one of the earliest forms of computational methods for aerodynamic analysis to be developed. Although the early editions of this method performed only inviscid analysis, the algorithm has matured over the years as a result of contributions made by prominent aerodynamicists. The method discussed in this thesis is influenced by recent advancements in panel methods and incorporates both viscous and inviscid analysis of multi-flap wings. The surface calculation of aerodynamic coefficients makes this method less computationally expensive than traditional Computational Fluid Dynamics (CFD) solvers available, and thus is effective when both speed and accuracy are desired. The morphing wing design, which consists of sequential feather-like flaps installed along the wing, is analyzed with this method.
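
    To make the panel-method family concrete, here is a minimal inviscid sketch: the classic lumped-vortex method for a flat plate, which recovers the thin-airfoil result Cl = 2*pi*sin(alpha). It illustrates the class of methods discussed, not the thesis code itself.

```python
# Lumped-vortex panel method for a flat plate at incidence.
# Vortex at each panel quarter point, control point at three-quarter point.
import numpy as np

N, c, U, alpha = 20, 1.0, 1.0, np.radians(5.0)
dx = c / N
x_v = (np.arange(N) + 0.25) * dx        # vortex locations
x_c = (np.arange(N) + 0.75) * dx        # control points

# Normal velocity induced at control point i by a unit vortex at x_v[j],
# plus the freestream normal component U*sin(alpha), must vanish.
A = 1.0 / (2.0 * np.pi * (x_c[:, None] - x_v[None, :]))
gamma = np.linalg.solve(A, -U * np.sin(alpha) * np.ones(N))

Cl = -2.0 * gamma.sum() / (U * c)       # Kutta-Joukowski, aero sign convention
print(f"Cl = {Cl:.4f}, thin-airfoil theory = {2*np.pi*np.sin(alpha):.4f}")
```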

  2. Microfluidic System Simulation Including the Electro-Viscous Effect

    Science.gov (United States)

    Rojas, Eileen; Chen, C. P.; Majumdar, Alok

    2007-01-01

    This paper describes a practical approach using a general purpose lumped-parameter computer program, GFSSP (Generalized Fluid System Simulation Program), for calculating flow distribution in a network of micro-channels including electro-viscous effects due to the existence of an electrical double layer (EDL). In this study, an empirical formulation for calculating an effective viscosity of ionic solutions based on dimensional analysis is described to account for surface charge and bulk fluid conductivity, which give rise to the electro-viscous effect in microfluidic networks. Two-dimensional slit microflow data were used to determine the model coefficients. Geometry effects are then included through a Poiseuille number correlation in GFSSP. The bi-power model was used to calculate the flow distribution of isotropically etched straight-channel and T-junction microflows involving ionic solutions. Performance of the proposed model is assessed against experimental test data.

  3. Error analysis to improve the speech recognition accuracy on ...

    Indian Academy of Sciences (India)

    dictionary plays a key role in the speech recognition accuracy. ... A sophisticated microphone is used for recording the speech corpus in a noise-free environment. ... values, word error rate (WER) and error rate will be calculated as follows:
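
    The record is truncated before the formula; for reference, the standard definition is WER = (substitutions + deletions + insertions) / number of reference words, which the following self-contained sketch computes via Levenshtein alignment (the example sentences are invented).

```python
# Word error rate via edit-distance alignment of word sequences.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between first i ref words and first j hyp words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution or match
    return d[len(ref)][len(hyp)] / max(1, len(ref))

print(wer("the cat sat on the mat", "the cat sit on mat"))  # 2 errors / 6 words
```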

  4. Dynamics Analysis of Origami-Folded Deployable Space Structures with Elastic Hinges

    Data.gov (United States)

    National Aeronautics and Space Administration — The future of space exploration needs highly sophisticated deployable space structure technology in order to achieve the ambitious goals being set today. Several...

  5. Linear mixed-effects modeling approach to FMRI group analysis.

    Science.gov (United States)

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity.
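
    As a hedged sketch of the modeling idea (not the authors' implementation, which is distributed with their software), the following fits an LME with random intercepts per subject and a between-subject covariate using statsmodels; the data are simulated, and in a real FMRI application `y` would hold per-run effect estimates.

```python
# LME group analysis sketch: random subject intercepts + a covariate.
# All data simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
subjects = np.repeat(np.arange(20), 3)              # 20 subjects x 3 runs
age = np.repeat(rng.uniform(20, 40, 20), 3)         # between-subject covariate
subj_effect = np.repeat(rng.normal(0, 0.5, 20), 3)  # true random intercepts
y = 1.0 + 0.05 * age + subj_effect + rng.normal(0, 0.3, subjects.size)

df = pd.DataFrame({"y": y, "age": age, "subject": subjects})
model = smf.mixedlm("y ~ age", df, groups=df["subject"]).fit()
print(model.summary())
```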

  6. Modelling optimization involving different types of elements in finite element analysis

    International Nuclear Information System (INIS)

    Wai, C M; Rivai, Ahmad; Bapokutty, Omar

    2013-01-01

    Finite elements are used to express the mechanical behaviour of a structure in finite element analysis. Therefore, the selection of the elements determines the quality of the analysis. The aim of this paper is to compare and contrast 1D, 2D, and 3D elements used in finite element analysis. A simple case study was carried out on a standard W460x74 I-beam. The I-beam was modelled and analyzed statically with 1D, 2D and 3D elements. The results for the three separate finite element models were compared in terms of stresses, deformation and displacement of the I-beam. All three finite element models yield satisfactory results with acceptable errors. The advantages and limitations of these elements are discussed. 1D elements offer simplicity, although they lack the ability to model complicated geometry. 2D and 3D elements provide more detailed and sophisticated results, which require more time and computer memory in the modelling process. It is also found that the choice of element in finite element analysis is influenced by a few factors, such as the geometry of the structure, the desired analysis results, and the capability of the computer.
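
    To make the element-assembly idea concrete in the simplest 1D case, here is a toy sketch assembling two-node bar elements for an axially loaded member and checking the tip displacement against the exact solution u = PL/(EA); the material and load values are invented and this is not the paper's I-beam model.

```python
# 1D bar-element assembly: clamp one end, load the other, solve K u = f.
import numpy as np

E, A, L, P, n_el = 200e9, 1e-3, 2.0, 1e4, 4   # hypothetical steel bar
le = L / n_el
k_e = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])

K = np.zeros((n_el + 1, n_el + 1))
for e in range(n_el):                         # assemble global stiffness
    K[e:e + 2, e:e + 2] += k_e

f = np.zeros(n_el + 1)
f[-1] = P                                     # point load at the free end
u = np.zeros(n_el + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])     # node 0 clamped
print("FE tip displacement:", u[-1], "exact:", P * L / (E * A))
```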

  7. An Excel-based implementation of the spectral method of action potential alternans analysis.

    Science.gov (United States)

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
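
    The tool itself is written in VBA; as a language-neutral illustration of the spectral method it implements, the following NumPy sketch Fourier-transforms a beat series of APD values and compares the power at the alternans frequency (0.5 cycles/beat) with a nearby noise band. The APD values, noise band, and k-score threshold are illustrative assumptions.

```python
# Spectral alternans detection on a synthetic beat series (not the VBA tool).
import numpy as np

rng = np.random.default_rng(3)
n_beats = 128
apd = 200.0 + rng.normal(0.0, 1.0, n_beats)     # baseline APD (ms) + noise
apd += 2.0 * (-1.0) ** np.arange(n_beats)       # superimpose 2:1 alternans

x = apd - apd.mean()
power = np.abs(np.fft.rfft(x)) ** 2 / n_beats
freqs = np.fft.rfftfreq(n_beats, d=1.0)         # in cycles/beat

alt_power = power[-1]                           # bin at 0.5 cycles/beat
noise = power[(freqs > 0.40) & (freqs < 0.46)]  # nearby noise band (assumed)
k_score = (alt_power - noise.mean()) / noise.std()
print(f"alternans power {alt_power:.1f}, k-score {k_score:.1f}")
```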

  8. Lessons from Hot Spot Analysis for Fragment-Based Drug Discovery.

    Science.gov (United States)

    Hall, David R; Kozakov, Dima; Whitty, Adrian; Vajda, Sandor

    2015-11-01

    Analysis of binding energy hot spots at protein surfaces can provide crucial insights into the prospects for successful application of fragment-based drug discovery (FBDD), and whether a fragment hit can be advanced into a high-affinity, drug-like ligand. The key factor is the strength of the top ranking hot spot, and how well a given fragment complements it. We show that published data are sufficient to provide a sophisticated and quantitative understanding of how hot spots derive from a protein 3D structure, and how their strength, number, and spatial arrangement govern the potential for a surface site to bind to fragment-sized and larger ligands. This improved understanding provides important guidance for the effective application of FBDD in drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. TMVA - Toolkit for Multivariate Data Analysis with ROOT Users guide

    CERN Document Server

    Höcker, A; Tegenfeldt, F; Voss, H; Voss, K; Christov, A; Henrot-Versillé, S; Jachowski, M; Krasznahorkay, A; Mahalalel, Y; Prudent, X; Speckmayer, P

    2007-01-01

    Multivariate machine learning techniques for the classification of data from high-energy physics (HEP) experiments have become standard tools in most HEP analyses. The multivariate classifiers themselves have significantly evolved in recent years, also driven by developments in other areas inside and outside science. TMVA is a toolkit integrated in ROOT which hosts a large variety of multivariate classification algorithms. They range from rectangular cut optimisation (using a genetic algorithm) and likelihood estimators, through linear and non-linear discriminants (neural networks), to sophisticated recent developments like boosted decision trees and rule ensemble fitting. TMVA organises the simultaneous training, testing, and performance evaluation of all these classifiers with a user-friendly interface, and expedites the application of the trained classifiers to the analysis of data sets with unknown sample composition.
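
    TMVA itself is a C++ toolkit inside ROOT; as a deliberately swapped-in stand-in for the same train/test/evaluate workflow, here is a boosted-decision-tree classifier on a synthetic two-class ("signal" vs "background") dataset using scikit-learn.

```python
# Analogous multivariate classification workflow in scikit-learn (not TMVA).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=8, n_informative=5,
                           random_state=0)     # stand-in for signal/background
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)
scores = bdt.predict_proba(X_te)[:, 1]
print("ROC AUC on the held-out sample:", roc_auc_score(y_te, scores))
```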

  10. Comprehensive processing of high-throughput small RNA sequencing data including quality checking, normalization, and differential expression analysis using the UEA sRNA Workbench.

    Science.gov (United States)

    Beckers, Matthew; Mohorianu, Irina; Stocks, Matthew; Applegate, Christopher; Dalmay, Tamas; Moulton, Vincent

    2017-06-01

    Recently, high-throughput sequencing (HTS) has revealed compelling details about the small RNA (sRNA) population in eukaryotes. These 20 to 25 nt noncoding RNAs can influence gene expression by acting as guides for the sequence-specific regulatory mechanism known as RNA silencing. The increase in sequencing depth and number of samples per project enables a better understanding of the role sRNAs play by facilitating the study of expression patterns. However, the intricacy of the biological hypotheses coupled with a lack of appropriate tools often leads to inadequate mining of the available data and thus, an incomplete description of the biological mechanisms involved. To enable a comprehensive study of differential expression in sRNA data sets, we present a new interactive pipeline that guides researchers through the various stages of data preprocessing and analysis. This includes various tools, some of which we specifically developed for sRNA analysis, for quality checking and normalization of sRNA samples as well as tools for the detection of differentially expressed sRNAs and identification of the resulting expression patterns. The pipeline is available within the UEA sRNA Workbench, a user-friendly software package for the processing of sRNA data sets. We demonstrate the use of the pipeline on an H. sapiens data set; additional examples on a B. terrestris data set and on an A. thaliana data set are described in the Supplemental Information. A comparison with existing approaches is also included, which exemplifies some of the issues that need to be addressed for sRNA analysis and how the new pipeline may be used to do this. © 2017 Beckers et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
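
    Two of the pipeline stages named above, normalization and differential-expression screening, can be illustrated in miniature as follows; the counts are invented toy data, and the Workbench implements these steps with considerably more care.

```python
# Toy sketch: reads-per-million normalization of sRNA counts, then a
# log2 fold-change screen between two conditions (2 samples each).
import numpy as np

counts = np.array([[150, 160,  10, 900],   # sample 1 (control)
                   [140, 170,  12, 880],   # sample 2 (control)
                   [300, 155,  95, 870],   # sample 3 (treated)
                   [320, 150, 110, 860]])  # sample 4 (treated); sRNAs in columns

rpm = counts / counts.sum(axis=1, keepdims=True) * 1e6  # per-sample scaling
ctrl, trt = rpm[:2].mean(axis=0), rpm[2:].mean(axis=0)
log2fc = np.log2((trt + 1.0) / (ctrl + 1.0))            # pseudocount of 1
print("log2 fold changes:", np.round(log2fc, 2))
print("candidates (|log2FC| > 1):", np.where(np.abs(log2fc) > 1)[0])
```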

  11. Preliminary Nuclear Analysis for the HANARO Fuel Element with Burnable Absorber

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chul Gyo; Kim, So Young; In, Won Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Burnable absorber is used to reduce reactivity swing and power peaking in high performance research reactors. Development of the HANARO fuel element with burnable absorber was started in the U-Mo fuel development program at HANARO, but a detailed full core analysis was not performed because the current HANARO fuel management system cannot reliably analyze the HANARO core with burnable absorber. A sophisticated reactor physics system is required to analyze the core. The McCARD code was selected, and detailed McCARD core models, in which the basic HANARO core model was developed by one of the McCARD developers, are used in this study. The development of nuclear fuel requires a long time and a correct development direction, especially from the nuclear analysis. This paper presents a preliminary nuclear analysis to promote the fuel development. Based on the developed fuel, further nuclear analysis will improve reactor performance and safety. Basic nuclear analyses for the HANARO and the AHR were performed to identify suitable fuel elements with burnable absorber. Addition of 0.3-0.4% Cd to the fuel meat is promising for the current HANARO fuel element. A small addition of burnable absorber may not change any fuel characteristics of the HANARO fuel element, but various basic tests and irradiation tests at the HANARO core are required.

  12. InTILF Method for Analysis of Polished Mirror Surfaces, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Numerical simulation of the performance of new x-ray mirror performed by NASA and those under upgrade requires sophisticated and reliable information about the...

  13. A combined multibody and finite element approach for dynamic interaction analysis of high-speed train and railway structure including post-derailment behavior during an earthquake

    International Nuclear Information System (INIS)

    Tanabe, M; Wakui, H; Sogabe, M; Matsumoto, N; Tanabe, Y

    2010-01-01

    A combined multibody and finite element approach is presented to solve, effectively, the dynamic interaction of a Shinkansen train (high-speed train in Japan) and the railway structure, including post-derailment behavior, during an earthquake. The motion of the train is expressed in multibody dynamics. Efficient mechanical models are given to express interactions between wheel and track structure, including post-derailment. Rail and track elements expressed in multibody dynamics and FEM are given to solve contact problems between wheels and long railway components effectively. The motion of a railway structure is modeled with various finite elements and rail and track elements. A computer program has been developed for the dynamic interaction analysis of a Shinkansen train and railway structure, including post-derailment, during an earthquake. Numerical examples are demonstrated.

  14. Shadow analysis via the C+K Visioline: A technical note.

    Science.gov (United States)

    Houser, T; Zerweck, C; Grove, G; Wickett, R

    2017-11-01

    This research investigated the ability of shadow analysis (via the Courage + Khazaka Visioline and Image Pro Premiere 9.0 software) to accurately assess the differences in skin topography associated with photoaging. Analyses were performed on impressions collected from a microfinish comparator scale (GAR Electroforming) as well as a series of impressions collected from the crow's feet region of 9 women representing each point on the Zerweck Crow's Feet classification scale. Analyses were performed using a Courage + Khazaka Visioline VL 650 as well as Image Pro Premiere 9.0 software. Shadow analysis accurately measured groove depth on impressions collected from grooves of known depth. Several shadow analysis parameters correlated with the expert grader ratings of crow's feet when measurements taken from the North and South directions were averaged. The Max Depth parameter in particular showed a strong correlation with the expert grader's ratings, which improved when a more sophisticated analysis was performed using Image Pro Premiere. When used properly, shadow analysis is effective at accurately measuring skin surface impressions for differences in skin topography. Shadow analysis is shown to accurately assess the differences across a range of crow's feet severity, correlating with a 0-8 grader scale. The Visioline VL 650 is a good tool for this measurement, with room for improvement in analysis that can be achieved through third-party image analysis software. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Sophisticated lessons from simple organisms: appreciating the value of curiosity-driven research

    Directory of Open Access Journals (Sweden)

    Robert J. Duronio

    2017-12-01

    Full Text Available For hundreds of years, biologists have studied accessible organisms such as garden peas, sea urchins collected at low tide, newt eggs, and flies circling rotten fruit. These organisms help us to understand the world around us, attracting and inspiring each new generation of biologists with the promise of mystery and discovery. Time and time again, what we learn from such simple organisms has emphasized our common biological origins by proving to be applicable to more complex organisms, including humans. Yet, biologists are increasingly being tasked with developing applications from the known, rather than being allowed to follow a path to discovery of the as yet unknown. Here, we provide examples of important lessons learned from research using selected non-vertebrate organisms. We argue that, for the purpose of understanding human disease, simple organisms cannot and should not be replaced solely by human cell-based culture systems. Rather, these organisms serve as powerful discovery tools for new knowledge that could subsequently be tested for conservation in human cell-based culture systems. In this way, curiosity-driven biological research in simple organisms has and will continue to pay huge dividends in both the short and long run for improving the human condition.

  16. Modeling and flow analysis of pure nylon polymer for injection molding process

    International Nuclear Information System (INIS)

    Nuruzzaman, D M; Kusaseh, N; Basri, S; Hamedon, Z; Oumer, A N

    2016-01-01

    In the production of complex plastic parts, injection molding is one of the most popular industrial processes. This paper addresses the modeling and analysis of the flow process of nylon (polyamide) polymer for the injection molding process. To determine the best molding conditions, a series of simulations are carried out using Autodesk Moldflow Insight software and the processing parameters are adjusted. This commercial mold-filling software simulates the cavity filling pattern along with temperature and pressure distributions in the mold cavity. In the modeling, during the plastic's flow inside the mold cavity, different flow parameters such as fill time, pressure, temperature, shear rate and warp at different locations in the cavity are analyzed. Overall, Moldflow is able to perform a relatively sophisticated analysis of the flow process of pure nylon. The prediction of mold cavity filling is thus very important and useful before a nylon plastic part is manufactured. (paper)

  17. APPLICATION OF IMPRECISE MODELS IN ANALYSIS OF RISK MANAGEMENT OF SOFTWARE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-11-01

    Full Text Available The analysis of functional completeness of existing detection systems was conducted. It made it possible to identify information systems with a similar feature set, to assess their degree of similarity to, and coverage of, a "standard" model of a risk management system that follows the recommended ICAO practices and standards on aviation safety, and to justify the creation of a decision-making support system that uses imprecise models and imprecise logic for risk analysis in aviation activities. Imprecise models have a number of advantages: they can take into account experts' intuition and experience, model flight safety management processes more adequately, and obtain accurate decisions that correlate with the initial data; they support the rapid development of a safety management system whose functionality can subsequently grow in complexity; and their hardware and software implementation in control and decision-making systems is less sophisticated than that of classical algorithms.

  18. Modeling and flow analysis of pure nylon polymer for injection molding process

    Science.gov (United States)

    Nuruzzaman, D. M.; Kusaseh, N.; Basri, S.; Oumer, A. N.; Hamedon, Z.

    2016-02-01

    In the production of complex plastic parts, injection molding is one of the most popular industrial processes. This paper addresses the modeling and analysis of the flow process of nylon (polyamide) polymer for the injection molding process. To determine the best molding conditions, a series of simulations are carried out using Autodesk Moldflow Insight software and the processing parameters are adjusted. This commercial mold-filling software simulates the cavity filling pattern along with temperature and pressure distributions in the mold cavity. In the modeling, during the plastic's flow inside the mold cavity, different flow parameters such as fill time, pressure, temperature, shear rate and warp at different locations in the cavity are analyzed. Overall, Moldflow is able to perform a relatively sophisticated analysis of the flow process of pure nylon. The prediction of mold cavity filling is thus very important and useful before a nylon plastic part is manufactured.

  19. Predictive value of diffusion-weighted imaging without and with including contrast-enhanced magnetic resonance imaging in image analysis of head and neck squamous cell carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Noij, Daniel P., E-mail: d.noij@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Pouwels, Petra J.W., E-mail: pjw.pouwels@vumc.nl [Department of Physics and Medical Technology, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Ljumanovic, Redina, E-mail: rljumanovic@adventh.org [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Knol, Dirk L., E-mail: dirklknol@gmail.com [Department of Epidemiology and Biostatistics, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Doornaert, Patricia, E-mail: p.doornaert@vumc.nl [Department of Radiation Oncology, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Bree, Remco de, E-mail: r.debree@vumc.nl [Department of Otolaryngology – Head and Neck Surgery, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Castelijns, Jonas A., E-mail: j.castelijns@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands); Graaf, Pim de, E-mail: p.degraaf@vumc.nl [Department of Radiology and Nuclear Medicine, VU University Medical Center, De Boelelaan 1117, Amsterdam, Noord-Holland (Netherlands)

    2015-01-15

    Highlights: • Primary tumor volume and lymph node ADC1000 are predictors of survival. • CE-T1WI does not improve the prognostic capacity of DWI. • Using CE-T1WI for ROI placement results in lower interobserver agreement. - Abstract: Objectives: To assess disease-free survival (DFS) in head and neck squamous cell carcinoma (HNSCC) treated with (chemo)radiotherapy ([C]RT). Methods: Pretreatment MR-images of 78 patients were retrospectively studied. Apparent diffusion coefficients (ADC) were calculated with two sets of two b-values: 0-750 s/mm² (ADC750) and 0-1000 s/mm² (ADC1000). One observer assessed tumor volume on T1-WI. Two independent observers assessed ADC-values of the primary tumor and largest lymph node in two sessions (i.e. without and with including CE-T1WI in image analysis). Interobserver and intersession agreement were assessed with intraclass correlation coefficients (ICC) separately for ADC750 and ADC1000. Lesion volumes and ADC-values were related to DFS using Cox regression analysis. Results: Median follow-up was 18 months. Interobserver ICC was better without than with CE-T1WI (primary tumor: 0.92 and 0.75-0.83, respectively; lymph node: 0.81-0.83 and 0.61-0.64, respectively). Intersession ICC ranged from 0.84 to 0.89. With CE-T1WI, mean ADC-values of primary tumor and lymph node were higher at both b-values than without CE-T1WI (P < 0.001). Tumor volume (sensitivity: 73%; specificity: 57%) and lymph node ADC1000 (sensitivity: 71-79%; specificity: 77-79%) were independent significant predictors of DFS without and with including CE-T1WI (P < 0.05). Conclusions: Pretreatment primary tumor volume and lymph node ADC1000 were significant independent predictors of DFS in HNSCC treated with (C)RT. DFS could be predicted from ADC-values acquired without and with including CE-T1WI in image analysis. The inclusion of CE-T1WI did not result in significant improvements in the predictive value of DWI.

  20. Clinical efficacy of including capecitabine in neoadjuvant chemotherapy for breast cancer: a systematic review and meta-analysis of randomized controlled trials.

    Directory of Open Access Journals (Sweden)

    Qiuyun Li

    Full Text Available BACKGROUND: Capecitabine has proven effective as a chemotherapy for metastatic breast cancer. Though several Phase II/III studies of capecitabine as neoadjuvant chemotherapy have been conducted, the results still remain inconsistent. Therefore, we performed a meta-analysis to obtain a more precise understanding of the role of capecitabine in neoadjuvant chemotherapy for breast cancer patients. METHODS: The electronic database PubMed and online abstracts from ASCO and SABCS were searched to identify randomized clinical trials comparing neoadjuvant chemotherapy with or without capecitabine in early/operable breast cancer patients without distant metastasis. Risk ratios were used to estimate the association between capecitabine in neoadjuvant chemotherapy and various efficacy outcomes. Fixed- or random-effect models were adopted to pool data in RevMan 5.1. RESULTS: Five studies were included in the meta-analysis. Neoadjuvant use of capecitabine with anthracycline and/or taxane based therapy was not associated with significant improvement in clinical outcomes including: pathologic complete response in breast (pCR; RR = 1.10, 95% CI 0.87-1.40, p = 0.43), pCR in breast tumor and nodes (tnpCR; RR = 0.99, 95% CI 0.83-1.18, p = 0.90), overall response rate (ORR; RR = 1.00, 95% CI 0.94-1.07, p = 0.93), or breast-conserving surgery (BCS; RR = 0.98, 95% CI 0.93-1.04, p = 0.49). CONCLUSIONS: Neoadjuvant treatment of breast cancer involving capecitabine did not significantly improve pCR, tnpCR, BCS or ORR. Thus adding capecitabine to neoadjuvant chemotherapy regimens is unlikely to improve outcomes in breast cancer patients without distant metastasis. Further research is required to establish the conditions under which capecitabine may be useful in breast cancer neoadjuvant chemotherapy.

  1. Critical confrontation of standard and more sophisticated methods for modelling the dispersion in air of heavy gas clouds; evaluation and illustration of the intrinsic limitations of both categories

    International Nuclear Information System (INIS)

    Riethmuller, M.L.

    1983-01-01

    Mathematical models of gas dispersion have evolved drastically since the 1930's. For a long time, the most widely used approaches were the so-called Gaussian model, as described in practical terms by Turner, and box models, which have shown relative merits. In the field of heavy gas dispersion, such approaches appeared somewhat limited, and therefore new models have been proposed. Some of these new generation models make use of the latest progress in turbulence modelling, as derived from laboratory work as well as numerical advances. The advent of faster and larger computers made possible the development of three-dimensional codes that compute both the flow field and gas dispersion, taking into account details of the ground obstacles, heat exchange and possibly phase changes as well. The description of these new types of models makes them appear as a considerable improvement over the simpler approaches. However, recent comparisons between many of these have led to the conclusion that the scatter between predictions attained with sophisticated models was just as large as with the other ones. It seems, therefore, that current researchers might have fallen into the trap of confusing mathematical precision with accuracy. It is therefore felt necessary to clarify this question by an investigation which, rather than comparing individual models, analyses the key features of both approaches and shows their relative merits and degree of realism when actually applied

  2. 40 CFR 60.2901 - What should I include in my waste management plan?

    Science.gov (United States)

    2010-07-01

    ... Analysis Waste Management Plan § 60.2901 What should I include in my waste management plan? A waste management plan must include consideration of the reduction or separation of waste-stream elements such as... must identify any additional waste management measures and implement those measures the source...

  3. Comparative analysis of collective doses received since 1976 at the Ardennes nuclear power plant

    International Nuclear Information System (INIS)

    Aye, Louis

    1980-01-01

    The analysis of the collective doses at the Centrale nucleaire des Ardennes provides valuable data about the origin of exposures in PWR reactors and their evolution as circuit activity increases after more than 10 years of operation. The investigation of most of the work carried out since 1976 reveals that, in some cases, the use of sophisticated implements combined with modifications of equipment and procedures may bring appreciable savings in doses during normal operation as well as during maintenance and refueling shutdowns. The study also gives the 'reasonably achievable' limits that can be aimed at in an operating plant, the doses resulting mainly from problems that should be taken into account at the design stage. [fr]

  4. Long term volcanic hazard analysis in the Canary Islands

    Science.gov (United States)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps for Lanzarote and Tenerife islands, especially for land use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed a large amount of data that, as in the case of the Canary Islands, is sometimes unavailable. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25.000 scale geologic maps, (2) 1:25.000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information needed to assess long term volcanic hazard. HADA will permit long term volcanic hazard analysis in the Canary Islands.

  5. An analysis of perceived prominent decision making areas in ...

    African Journals Online (AJOL)

    The development of the highly individualistic commercial springbuck (Antidorcas marsupialis) production systems and the resulting growth in the commercial value of the springbuck has opened a new realm of game management decision making. These relatively undomesticated production systems demand sophisticated ...

  6. Data breach locations, types, and associated characteristics among US hospitals.

    Science.gov (United States)

    Gabriel, Meghan Hufstader; Noblin, Alice; Rutherford, Ashley; Walden, Amanda; Cortelyou-Ward, Kendall

    2018-02-01

    The objectives of this study were to describe the locations in hospitals where data are breached, the types of breaches that occur most often at hospitals, and hospital characteristics, including health information technology (IT) sophistication and biometric security capabilities, that may be predictive factors of large data breaches that affect 500 or more patients. The Office of Civil Rights breach data from healthcare providers regarding breaches that affected 500 or more individuals from 2009 to 2016 were linked with hospital characteristics from the Health Information Management Systems Society and the American Hospital Association Health IT Supplement databases. Descriptive statistics were used to characterize hospitals with and without breaches, data breach type, and location/mode of data breaches in hospitals. Multivariate logistic regression analysis explored hospital characteristics that were predictive factors of a data breach affecting at least 500 patients, including area characteristics, region, health system membership, size, type, biometric security use, health IT sophistication, and ownership. Of all types of healthcare providers, hospitals accounted for approximately one-third of all data breaches and hospital breaches affected the largest number of individuals. Paper and films were the most frequent location of breached data, occurring in 65 hospitals during the study period, whereas network servers were the least common location but their breaches affected the most patients overall. Adjusted multivariate results showed significant associations between data breach occurrences and some hospital characteristics, including type and size, but not others, including health IT sophistication or biometric use for security. Hospitals should conduct routine audits to allow them to see their vulnerabilities before a breach occurs. Additionally, information security systems should be implemented concurrently with health information technologies. Improving
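
    As a hedged illustration of the multivariate logistic regression described above, the following sketch fits a logit model on simulated hospital data; the variable names, effect sizes, and sample size are all invented, not the study's data.

```python
# Logistic regression sketch on simulated hospital-level data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 800
beds = rng.integers(25, 900, n)                 # hospital size (hypothetical)
teaching = rng.integers(0, 2, n)                # hospital type indicator
biometric = rng.integers(0, 2, n)               # biometric security in use

logit_p = -4.0 + 0.004 * beds + 0.8 * teaching  # assumed true model
breach = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

df = pd.DataFrame({"breach": breach.astype(int), "beds": beds,
                   "teaching": teaching, "biometric": biometric})
fit = smf.logit("breach ~ beds + teaching + biometric", df).fit(disp=False)
print(np.exp(fit.params))                       # odds ratios per predictor
```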

  7. Thinking Ahead on Deep Brain Stimulation: An Analysis of the Ethical Implications of a Developing Technology.

    Science.gov (United States)

    Johansson, Veronica; Garwicz, Martin; Kanje, Martin; Halldenius, Lena; Schouenborg, Jens

    2014-01-01

    Deep brain stimulation (DBS) is a developing technology. New generations of DBS technology are already in the pipeline, yet this particular fact has been largely ignored among ethicists interested in DBS. Focusing only on ethical concerns raised by the current DBS technology is, albeit necessary, not sufficient. Since current bioethical concerns raised by a specific technology could be quite different from the concerns it will raise a couple of years ahead, an ethical analysis should be sensitive to such alterations, or it could end up with results that soon become dated. The goal of this analysis is to address these changing bioethical concerns, to think ahead on upcoming and future DBS concerns both in terms of a changing technology and changing moral attitudes. By employing the distinction between inherent and noninherent bioethical concerns we identify and make explicit the particular limits and potentials for change within each category, respectively, including how present and upcoming bioethical concerns regarding DBS emerge and become obsolete. Many of the currently identified ethical problems with DBS, such as stimulation-induced mania, are a result of suboptimal technology. These challenges could be addressed by technical advances, while for instance perceptions of an altered body image caused by the mere awareness of having an implant may not. Other concerns will not emerge until the technology has become sophisticated enough for new uses to be realized, such as concerns on DBS for enhancement purposes. As a part of the present analysis, concerns regarding authenticity are used as an example.

  8. Advances in Intelligence and Security Informatics

    CERN Document Server

    Mao, Wenji

    2012-01-01

    The Intelligent Systems Series comprises titles that present state-of-the-art knowledge and the latest advances in intelligent systems. Its scope includes theoretical studies, design methods, and real-world implementations and applications. Traditionally, Intelligence and Security Informatics (ISI) research and applications have focused on information sharing and data mining, social network analysis, infrastructure protection and emergency responses for security informatics. With the continuous advance of IT technologies and the increasing sophistication of national and international security…

  9. Smart Farming: Including Rights Holders for Responsible Agricultural Innovation

    OpenAIRE

    Kelly Bronson

    2018-01-01

    This article draws on the literature of responsible innovation to suggest concrete processes for including rights holders in the “smart” agricultural revolution. It first draws upon historical agricultural research in Canada to highlight how productivist values drove seed innovations with particular consequences for the distribution of power in the food system. Next, the article uses document analysis to suggest that a similar value framework is motivating public investment in smart farming i...

  10. Analysis of Low Probability of Intercept (LPI) Radar Signals Using Cyclostationary Processing

    National Research Council Canada - National Science Library

    Lime, Antonio

    2002-01-01

    ... problem in the battle space. To detect these types of radar, new digital receivers that use sophisticated signal processing techniques are required. This thesis investigates the use of cyclostationary...
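
    As the record is truncated, a minimal sketch of the statistic underlying cyclostationary detectors, the cyclic autocorrelation, may help: the symbol rate of a man-made signal appears as a nonzero cycle frequency, while stationary noise does not. The BPSK-like waveform and all parameters below are invented, not the thesis's actual receiver.

```python
# Hedged sketch: cyclic autocorrelation of a synthetic BPSK-like signal, the
# basic statistic behind cyclostationary detectors. All parameters invented.
import numpy as np

rng = np.random.default_rng(0)
sps, nsym = 8, 512                              # samples per symbol, symbols
bits = rng.integers(0, 2, nsym) * 2 - 1         # random +/-1 symbols
x = np.repeat(bits, sps).astype(complex)        # rectangular-pulse BPSK baseband
x += 0.5 * (rng.standard_normal(x.size) + 1j * rng.standard_normal(x.size))

def cyclic_autocorr(x, alpha, tau):
    """Estimate R_x^alpha(tau) = <x[n+tau] x*[n] e^{-j 2 pi alpha n}>."""
    n = np.arange(x.size - tau)
    return np.mean(x[n + tau] * np.conj(x[n]) * np.exp(-2j * np.pi * alpha * n))

# The symbol rate 1/sps shows up as a cycle frequency; at a non-cycle frequency
# (e.g. 0.5/sps) the estimate collapses toward zero, which a detector exploits.
for alpha in (0.0, 1.0 / sps, 0.5 / sps):
    print(f"alpha = {alpha:.4f}   |R| = {abs(cyclic_autocorr(x, alpha, 1)):.3f}")
```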

  11. Betting patterns for sports and races: a longitudinal analysis of online wagering in Australia.

    Science.gov (United States)

    Gainsbury, Sally M; Russell, Alex

    2015-03-01

    Online wagering is increasing in popularity, as it is easily accessible through websites which market these services widely. However, few studies have examined online betting based on actual behavioural data. This paper describes the results of an analysis of 2,522,299 bets placed with an Australian online wagering operator over a 1-year period. The majority of bets placed were for a win (45.31 %) and were placed on races (86.74 %) or sports (11.29 %). Sports betting was dominated by ball sports, reflecting popular interest in these events. More than three-quarters (77.63 %) of the bets were losses, and there was large variation in bet size between bet types and events bet on, although average bets were higher than in previously reported studies. The most popular bets, those placed to win, had a relatively high rate of losses and the lowest average returns, which may reflect less sophisticated betting behaviour. More specific handicap and total bets were placed by fewer customers, but were larger and had the greatest returns. Similarly, bets placed on less popular sporting events had greater average returns, potentially reflecting greater customer sophistication and knowledge and raising the possibility that a proportion of bettors are more 'skilled'. As the first paper to analyze the types of bets placed on events and outcomes, this study supports the notion that wagering is an entertainment activity and that the majority of customers are motivated by factors other than simply winning money.

  12. Comparison of global sensitivity analysis methods – Application to fuel behavior modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, Timo, E-mail: timo.ikonen@vtt.fi

    2016-02-15

    Highlights: • Several global sensitivity analysis methods are compared. • The methods’ applicability to nuclear fuel performance simulations is assessed. • The implications of large input uncertainties and complex models are discussed. • Alternative strategies to perform sensitivity analyses are proposed. - Abstract: Fuel performance codes have two characteristics that make their sensitivity analysis challenging: large uncertainties in input parameters and the complex, non-linear and non-additive structure of the models. The complex structure of the code leads to interactions between inputs that show up as cross terms in the sensitivity analysis. Due to the large uncertainties of the inputs, these interactions are significant, sometimes even dominating the sensitivity analysis. For the same reason, standard linearization techniques do not usually perform well in the analysis of fuel performance codes, and more sophisticated methods are typically needed. To this end, we compare the performance of several sensitivity analysis methods in the analysis of a steady-state FRAPCON simulation. The comparison of importance rankings obtained with the various methods shows that even the simplest methods can be sufficient for the analysis of fuel maximum temperature. However, the analysis of the gap conductance requires more powerful methods that take into account the interactions of the inputs. In some cases, moment-independent methods are needed. We also investigate the computational cost of the various methods and present recommendations as to which methods to use in the analysis.
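
    To make the idea of interaction-aware global sensitivity analysis concrete, here is a minimal sketch of the pick-freeze (Saltelli-type) estimator of first-order Sobol indices on a toy model. The model and input ranges are invented stand-ins for a fuel performance code, not FRAPCON itself.

```python
# Hedged sketch: pick-freeze estimation of first-order Sobol indices on a toy
# nonlinear, non-additive model (invented stand-in, not FRAPCON).
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Interaction (x0*x1) and nonlinearity mimic the structure described above.
    return x[:, 0] * x[:, 1] + np.sin(3.0 * x[:, 2]) + 0.5 * x[:, 0] ** 2

d, n = 3, 200_000
A = rng.uniform(-1.0, 1.0, size=(n, d))     # two independent sample matrices
B = rng.uniform(-1.0, 1.0, size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]                    # vary only the i-th input
    # Saltelli (2010) estimator of the first-order index S_i
    S_i = np.mean(yB * (model(AB_i) - yA)) / var_y
    print(f"S_{i + 1} ≈ {S_i:.3f}")
```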

  13. The Scientific Image in Behavior Analysis.

    Science.gov (United States)

    Keenan, Mickey

    2016-05-01

    Throughout the history of science, the scientific image has played a significant role in communication. With recent developments in computing technology, there has been an increase in the kinds of opportunities now available for scientists to communicate in more sophisticated ways. Within behavior analysis, though, we are only just beginning to appreciate the importance of going beyond the printing press to elucidate basic principles of behavior. The aim of this manuscript is to stimulate appreciation of both the role of the scientific image and the opportunities provided by a quick response code (QR code) for enhancing the functionality of the printed page. I discuss the limitations of imagery in behavior analysis ("Introduction"), and I show examples of what can be done with animations and multimedia for teaching philosophical issues that arise when teaching about private events ("Private Events 1 and 2"). Animations are also useful for bypassing ethical issues when showing examples of challenging behavior ("Challenging Behavior"). Each of these topics can be accessed only by scanning the QR code provided. This contingency has been arranged to help the reader embrace this new technology. In so doing, I hope to show its potential for going beyond the limitations of the printing press.
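
    For readers who want to reproduce the QR-code mechanism for their own pages, a minimal sketch follows, assuming the third-party Python package qrcode; the URL is a placeholder, not one of the manuscript's actual links.

```python
# Hedged sketch, assuming the third-party "qrcode" package
# (pip install "qrcode[pil]"); the URL below is a placeholder.
import qrcode

img = qrcode.make("https://example.org/private-events-animation")
img.save("private_events_qr.png")  # image ready to embed in the printed page
```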

  14. Analysis of an Anti-Phishing Lab Activity

    Science.gov (United States)

    Werner, Laurie A.; Courte, Jill

    2010-01-01

    Despite advances in spam detection software, anti-spam laws, and increasingly sophisticated users, the number of successful phishing scams continues to grow. In addition to monetary losses attributable to phishing, there is also a loss of confidence that stifles use of online services. Using in-class activities in an introductory computer course…

  15. Model - including thermal creep effects - for the analysis of three-dimensional concrete structures

    International Nuclear Information System (INIS)

    Rodriguez, C.; Rebora, B.; Favrod, J.D.

    1979-01-01

    This article presents the most recent developments and results of research carried out by IPEN to establish a mathematical model for the non-linear rheological three-dimensional analysis of massive prestressed concrete structures. The main point of these latest developments is the simulation of the creep of concrete subjected to high temperatures over a long period of time. This research, financed by the Swiss National Science Foundation, has taken on increased importance with the advent of nuclear reactor vessels of the HHT type and new concepts concerning the cooling of their concrete (replacement of the thermal insulation by a zone of hot concrete). (orig.)

  16. Life cycle analysis. The ISO-standards elaborated in a practical manual; Levenscyclusanalyse. De ISO-normen uitgewerkt in een praktijkgerichte Handleiding

    Energy Technology Data Exchange (ETDEWEB)

    Guinee, J.B. (ed.) (and others)

    2002-04-01

    The general aim of the new LCA (life cycle analysis) Guide is to provide a 'cookbook' with operational guidelines for conducting a step-by-step LCA study, justified by a scientific background document based on the ISO Standards for LCA. The different ISO elements and requirements are made operational according to what we judged the 'best available practice' for each step. The new Guide gives guidelines for two levels of sophistication of LCA: a simplified and a detailed level; in addition, optional extensions to the detailed level are provided. The simplified level has been introduced to enable faster and cheaper LCAs than at the detailed level, and it may be good enough for certain applications. The guidelines for detailed LCA fully comply with the ISO standards, while the guidelines for simplified LCA do not. The new LCA Guide consists of three parts: (1) 'LCA in perspective' provides a general introduction to LCA and includes a discussion of the possibilities and limitations of LCA; (2) consists of two parts, 2a ('Guide') and 2b ('Operational annex'). Part 2a provides an introduction to the procedural design of an LCA project, and guidelines on the best available practice for each of the steps involved in an LCA study, at the two levels of LCA sophistication. Part 2b operationalises most of the guidelines provided in Part 2a and provides, as a separate document, the most up-to-date operational models and data associated with the best available practice for the two levels of sophistication. This has been done to facilitate updating of these operational elements, most of which are likely to change regularly. Part 3 provides the scientific background to the study, as well as a reasoned justification of all the choices made in designing a best available practice for each phase of an LCA. In addition to these books, on-line support is supplied through specific web-sites. For example, a spreadsheet
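
    The computational core that such operational guidelines (and their spreadsheets) typically reduce to is the standard LCA matrix formulation: solve the technology matrix for a scaling vector, then apply the intervention matrix. A minimal sketch with invented numbers follows; whether the Guide's own operational annex uses exactly this notation is an assumption here.

```python
# Hedged sketch: the matrix computation at the core of an LCA inventory step
# (technology matrix A, intervention matrix B). All numbers are invented.
import numpy as np

A = np.array([[ 1.0, -0.2],    # unit process flows: products made (+) / used (-)
              [-0.5,  1.0]])
f = np.array([1.0, 0.0])       # functional unit: one unit of the first product
B = np.array([[0.3, 0.8]])     # e.g. kg CO2 emitted per unit of each process

s = np.linalg.solve(A, f)      # scaling vector: how much each process must run
g = B @ s                      # life cycle inventory result
print(f"total CO2 per functional unit: {g[0]:.3f} kg")
```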

  17. SUSTAINABLE ENVIRONMENTAL TECHNOLOGIES INCLUDING WATER RECOVERY FOR REUSE FROM TANNERY AND INDUSTRIAL WASTEWATER – INDIAN AND ASIAN SCENARIO

    Directory of Open Access Journals (Sweden)

    Dr. S. RAJAMANI

    2017-05-01

    Full Text Available The world leather sector generates 600 million m3 of wastewater per annum. Asian tanneries contribute more than 350 million m3 of wastewater from the processing of 8 to 10 million tons of hides and skins. Owing to environmental challenges from the depletion of quality water resources and increasing salinity, it has become necessary to control Total Dissolved Solids (TDS) in the treated effluent, with water recovery wherever feasible. Special membrane systems have been engineered in many individual and Common Effluent Treatment Plants (CETPs) in India, China and other leather producing countries. The sustainability of saline reject management is one of the major challenges. Conventional tannery wastewater treatment systems include physicochemical and biological treatment to reduce chromium, BOD, COD and suspended solids. To treat effluent with TDS in the range of 10,000 to 30,000 mg/l, multiple-stage high-pressure membrane units have been designed and implemented for the recovery of water. To reduce chemical usage and sludge generation in the tertiary treatment, a Membrane Bio-Reactor (MBR) has been adopted, which replaces the secondary clarifier and sophisticated tertiary treatment units such as the reactive clarifier, ultra-filtration (UF), etc. Commercial-scale high-tech membrane systems have been implemented in many locations for capacities ranging from 500 to 10,000 m3/day. Recent applied R&D on environmental protection techniques, focusing on water recovery for reuse, salt recovery, marine disposal of saline reject with a proper bio-control system, etc., is dealt with in this technical paper.

  18. Finite Element Modeling and Analysis of Nonlinear Impact and Frictional Motion Responses Including Fluid—Structure Coupling Effects

    Directory of Open Access Journals (Sweden)

    Yong Zhao

    1997-01-01

    Full Text Available A nonlinear three-dimensional (3D) single-rack model and a nonlinear 3D whole-pool multi-rack model are developed for the spent fuel storage racks of a nuclear power plant (NPP) to determine impact and frictional motion responses when subjected to 3D excitations from the supporting building floor. The submerged free-standing rack system and the surrounding water are coupled through hydrodynamic fluid-structure interaction (FSI) using potential theory. The models developed have features that allow consideration of geometric and material nonlinearities, including (1) the impacts of fuel assemblies against rack cells, of a rack against adjacent racks or pool walls, and of rack support legs against the pool floor; (2) the hydrodynamic coupling of fuel assemblies with their storing racks, and of a rack with adjacent racks, pool walls, and the pool floor; and (3) the dynamic motion behavior of rocking, twisting, and frictional sliding of rack modules. Using these models, 3D nonlinear time history dynamic analyses are performed per the U.S. Nuclear Regulatory Commission (USNRC) criteria. Since few such models, analyses, and results using both 3D single-rack and whole-pool multi-rack models are available in the literature, this paper emphasizes the description of the modeling and analysis techniques using the SOLVIA general-purpose nonlinear finite element code. Typical response results with different Coulomb friction coefficients are presented and discussed.
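
    A minimal 1D sketch of Coulomb stick-slip integration under base excitation may help illustrate the frictional sliding behavior analyzed above; the explicit scheme and all parameters are illustrative assumptions, a caricature rather than the SOLVIA models.

```python
# Hedged sketch: 1D Coulomb stick-slip of a rigid block under horizontal base
# excitation -- a caricature of rack sliding, not the SOLVIA models above.
import numpy as np

m, grav, mu = 1.0e4, 9.81, 0.3                # mass [kg], gravity, friction coeff.
dt, T = 1.0e-3, 10.0                          # time step and duration [s]
t = np.arange(0.0, T, dt)
a_base = 4.0 * np.sin(2.0 * np.pi * 1.5 * t)  # base acceleration history [m/s^2]

v, x = 0.0, 0.0                               # velocity/displacement relative to base
for k in range(t.size):
    f_drive = -m * a_base[k]                  # inertial pseudo-force in base frame
    f_cap = mu * m * grav                     # friction capacity
    if abs(v) < 1e-9 and abs(f_drive) <= f_cap:
        v = 0.0                               # stick: friction balances the load
    else:                                     # slip: friction opposes motion
        direction = np.sign(v) if abs(v) > 1e-9 else np.sign(f_drive)
        v += dt * (f_drive - direction * f_cap) / m
    x += dt * v
print(f"residual sliding displacement: {x:.4f} m")
```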

  19. Analysis of electronic models for solar cells including energy resolved defect densities

    Energy Technology Data Exchange (ETDEWEB)

    Glitzky, Annegret

    2010-07-01

    We introduce an electronic model for solar cells including energy-resolved defect densities. The resulting drift-diffusion model corresponds to a generalized van Roosbroeck system with additional source terms, coupled with ODEs for all defect densities that contain space and energy as parameters. The system has to be considered in heterostructures and with mixed boundary conditions from device simulation. We give a weak formulation of the problem. If the boundary data and the sources are compatible with thermodynamic equilibrium, the free energy along solutions decays monotonically. In other cases it may increase, but we estimate its growth. We establish boundedness and uniqueness results and prove the existence of a weak solution. This is done by considering a regularized problem, showing its solvability and the boundedness of its solutions independently of the regularization level. (orig.)
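
    For orientation, here is a sketch of the classical van Roosbroeck system (in scaled units) that the abstract generalizes; the recombination term R, the defect charge, and the right-hand side S of the defect ODE are indicated only schematically and are assumptions of this sketch, not the paper's exact equations.

```latex
% Hedged sketch: classical van Roosbroeck system (scaled units) plus a
% schematic energy-resolved defect ODE; S is indicated only schematically.
\begin{align}
  -\nabla\cdot(\varepsilon\nabla\varphi) &= p - n + C + \rho_{\mathrm{def}},\\
  \partial_t n - \nabla\cdot J_n &= -R, \qquad J_n = \mu_n\,(\nabla n - n\,\nabla\varphi),\\
  \partial_t p + \nabla\cdot J_p &= -R, \qquad J_p = -\mu_p\,(\nabla p + p\,\nabla\varphi),\\
  \partial_t \rho_{\mathrm{def}}(x,E,t) &= S\bigl(n,p,\rho_{\mathrm{def}}\bigr)
    \quad \text{(an ODE in $t$, with $x$ and $E$ as parameters).}
\end{align}
```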

  20. Selection Component Analysis of Natural Polymorphisms using Population Samples Including Mother-Offspring Combinations, II

    DEFF Research Database (Denmark)

    Jarmer, Hanne Østergaard; Christiansen, Freddy Bugge

    1981-01-01

    Population samples including mother-offspring combinations provide information on the selection components: zygotic selection, sexual selection, gametic selection and fecundity selection, on the mating pattern, and on the deviation from linkage equilibrium among the loci studied. The theory...