WorldWideScience

Sample records for non-traditional signatures uncertainties

  1. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing the responses of different catchments, for understanding catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures; these can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water-resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitudes and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments: the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculating the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
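    The Monte Carlo approach described in this abstract can be sketched in a few lines: perturb the observed data with sampled measurement errors, recompute the signature for each realisation, and summarise the resulting distribution. The error magnitudes, the multiplicative error model and the runoff-ratio signature below are illustrative assumptions, not values or choices from the study itself.

    ```python
    import random
    import statistics

    def runoff_ratio(flow_mm, rain_mm):
        """Hydrological signature: total runoff divided by total rainfall."""
        return sum(flow_mm) / sum(rain_mm)

    def monte_carlo_signature(flow, rain, n=2000, flow_err=0.10, rain_err=0.05, seed=42):
        """Propagate multiplicative data uncertainties into the signature.

        flow_err / rain_err are illustrative relative standard errors; a real
        application would sample rating-curve and gauge-network errors instead.
        """
        rng = random.Random(seed)
        samples = []
        for _ in range(n):
            f_mult = rng.gauss(1.0, flow_err)   # perturb the whole flow series
            r_mult = rng.gauss(1.0, rain_err)   # perturb the catchment rainfall
            samples.append(runoff_ratio([q * f_mult for q in flow],
                                        [p * r_mult for p in rain]))
        samples.sort()
        return {"median": statistics.median(samples),
                "p05": samples[int(0.05 * n)],   # 90% uncertainty interval
                "p95": samples[int(0.95 * n)]}

    # Toy daily series (mm/day), purely illustrative
    flow = [1.2, 0.8, 2.5, 1.1, 0.6]
    rain = [3.0, 2.0, 6.0, 2.5, 1.5]
    result = monte_carlo_signature(flow, rain)
    ```

    The same loop works for any signature function; only `runoff_ratio` needs to be swapped out.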

  2. Uncertainty in hydrological signatures for gauged and ungauged catchments

    Science.gov (United States)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in the prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge-data uncertainty for 43 UK catchments and propagated these uncertainties through signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed us to assess the roles and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
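    A weighted-pooling-group regionalization of the general kind described can be sketched as follows: donor catchments are weighted by similarity in attribute space, and signature samples (each carrying the donor's discharge-data uncertainty) are pooled under those weights, so both uncertainty sources propagate into the prediction. The inverse-distance weighting and the toy numbers are illustrative assumptions, not the paper's actual procedure.

    ```python
    import math
    import random

    def similarity_weights(target_attrs, donor_attrs):
        """Inverse-distance weights in (normalised) catchment-attribute space."""
        weights = []
        for attrs in donor_attrs:
            d = math.sqrt(sum((a - b) ** 2 for a, b in zip(target_attrs, attrs)))
            weights.append(1.0 / (d + 1e-6))
        total = sum(weights)
        return [w / total for w in weights]

    def regionalise(target_attrs, donor_attrs, donor_signature_samples, n=5000, seed=1):
        """Pool the donors' gauged signature distributions, weighted by similarity.

        donor_signature_samples[i] is a Monte Carlo sample of donor i's signature
        (e.g. derived from sampled rating curves).
        """
        rng = random.Random(seed)
        weights = similarity_weights(target_attrs, donor_attrs)
        pooled = []
        for _ in range(n):
            i = rng.choices(range(len(weights)), weights=weights)[0]  # pick a donor
            pooled.append(rng.choice(donor_signature_samples[i]))     # pick a sample
        return pooled

    # Two toy donors with attribute vectors and signature samples
    donor_attrs = [(0.0, 0.0), (1.0, 1.0)]
    donor_samples = [[0.28, 0.30, 0.32], [0.68, 0.70, 0.72]]
    # An "ungauged" target very similar to donor 0
    pooled = regionalise((0.05, 0.05), donor_attrs, donor_samples)
    ```

    Because the target's attributes sit close to donor 0, nearly all weight falls on that donor and the pooled distribution concentrates around its signature values.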

  3. Analysis of Non-contact Acousto Thermal Signature Data (Postprint)

    Science.gov (United States)

    2016-02-01

    AFRL-RX-WP-JA-2016-0321, Amanda K. Criner (AFRL/RX), October 2014 - 16 September 2015. The non-contact acousto-thermal signature (NCATS) is a nondestructive evaluation technique with potential to detect fatigue in materials such as

  4. Estimation of the Impacts of Non-Oil Traditional and NonTraditional Export Sectors on Non-Oil Export of Azerbaijan

    Directory of Open Access Journals (Sweden)

    Nicat Hagverdiyev

    2016-12-01

    Full Text Available The significant share of the oil sector in Azerbaijan's export portfolio necessitates promotion of non-oil exports. This study analyzes whether the commodities that account for the main share (more than 70%) of non-oil exports are traditional or non-traditional, using the so-called commodity-specific cumulative export experience function over the 1995-2015 time frame. The impact of traditional and non-traditional exports on non-oil GDP was then investigated using an econometric model. The results, based on 16 non-oil commodities, show that cotton, tobacco, and production of mechanical devices are traditional sectors in non-oil export. The estimation results of the model indicate that both traditional and non-traditional non-oil export sectors have an economically and statistically significant impact on non-oil GDP.

  5. Search for non-standard SUSY signatures in CMS

    International Nuclear Information System (INIS)

    Teyssier, Daniel

    2008-01-01

    New studies of the CMS collaboration are presented on the sensitivity of searches for non-standard signatures of particular SUSY scenarios. These signatures include non-pointing photons as well as pairs of prompt photons, as expected in GMSB SUSY models, heavy stable charged particles produced in split-supersymmetry models, long-lived staus from GMSB SUSY, and long-lived stops in other SUSY scenarios. Detailed detector simulation is used for the study, and all relevant Standard Model backgrounds and detector effects that can mimic these special signatures are included. It is shown that already with less than 100 pb⁻¹ CMS will probe an interesting, as yet unexplored, parameter range of these models.

  6. CONSUMERS’ BRAND EQUITY PERCEPTIONS OF TRADITIONAL AND NON-TRADITIONAL BRANDS

    OpenAIRE

    Catli, Ozlem; Ermec Sertoglu, Aysegul; Ors, Husniye

    2017-01-01

    This study aims to compare consumers' brand perceptions of traditional brands with their brand perceptions of non-traditional brands. Consumers living in Ankara constitute the universe of the study, and data were gathered in face-to-face interviews using the survey method. A questionnaire covering the demographic characteristics of the participants was prepared with the aim of evaluating and comparing the brand equity of one traditional brand and one non-traditional brand as perceived by the participants. According to...

  7. Practice Location Characteristics of Non-Traditional Dental Practices.

    Science.gov (United States)

    Solomon, Eric S; Jones, Daniel L

    2016-04-01

    Current and future dental school graduates are increasingly likely to choose a non-traditional dental practice (a group practice managed by a dental service organization or a corporate practice with employed dentists) for their initial practice experience. In addition, the growth of non-traditional practices, which are located primarily in major urban areas, could accelerate the movement of dentists to those areas and contribute to geographic disparities in the distribution of dental services. To help the profession understand the implications of these developments, the aim of this study was to compare the location characteristics of non-traditional practices and traditional dental practices. After identifying non-traditional practices across the United States, the authors located those practices and traditional dental practices geographically by zip code. Non-traditional dental practices were found to represent about 3.1% of all dental practices, but they had a greater impact on the marketplace with almost twice the average number of staff and annual revenue. Virtually all non-traditional dental practices were located in zip codes that also had a traditional dental practice. Zip codes with non-traditional practices had significant differences from zip codes with only a traditional dental practice: the populations in areas with non-traditional practices had higher income levels and higher education and were slightly younger and proportionally more Hispanic; those practices also had a much higher likelihood of being located in a major metropolitan area. Dental educators and leaders need to understand the impact of these trends in the practice environment in order to both prepare graduates for practice and make decisions about planning for the workforce of the future.

  8. On psychoanalytic supervision as signature pedagogy.

    Science.gov (United States)

    Watkins, C Edward

    2014-04-01

    What is signature pedagogy in psychoanalytic education? This paper examines that question, considering why psychoanalytic supervision best deserves that designation. In focusing on supervision as signature pedagogy, I accentuate its role in building psychoanalytic habits of mind, habits of hand, and habits of heart, and transforming theory and self-knowledge into practical product. Other facets of supervision as signature pedagogy addressed in this paper include its features of engagement, uncertainty, formation, and pervasiveness, as well as levels of surface, deep, and implicit structure. Epistemological, ontological, and axiological in nature, psychoanalytic supervision engages trainees in learning to do, think, and value what psychoanalytic practitioners in the field do, think, and value: It is, most fundamentally, professional preparation for competent, "good work." In this paper, effort is made to shine a light on and celebrate the pivotal role of supervision in "making" or developing budding psychoanalysts and psychoanalytic psychotherapists. Now over a century old, psychoanalytic supervision remains unparalleled in (1) connecting and integrating conceptualization and practice, (2) transforming psychoanalytic theory and self-knowledge into an informed analyzing instrument, and (3) teaching, transmitting, and perpetuating the traditions, practice, and culture of psychoanalytic treatment.

  9. Traditional and non-traditional educational outcomes : Trade-off or complementarity?

    NARCIS (Netherlands)

    van der Wal, Marieke; Waslander, Sietske

    2007-01-01

    Recently, schools have increasingly been charged with enhancing non-traditional academic competencies, in addition to traditional academic competencies. This article raises the question whether schools can implement these new educational goals in their curricula and simultaneously realise the

  10. Non-Traditional Wraps

    Science.gov (United States)

    Owens, Buffy

    2009-01-01

    This article presents a recipe for non-traditional wraps. The author describes how adults and children can help with the recipe and the skills involved. The bigger the role children play in making the item, the more apt they are to try new things and to appreciate the texture and taste.

  11. Social Capital of Non-Traditional Students at a German University. Do Traditional and Non-Traditional Students Access Different Social Resources?

    Science.gov (United States)

    Brändle, Tobias; Häuberer, Julia

    2015-01-01

    Social capital is of particular value for the acquisition of education. Not only does it prevent scholars from dropping out but it improves the educational achievement. The paper focuses on access to social resources by traditional and non-traditional students at a German university and asks if there are group differences considering this…

  12. Testing Algorithmic Skills in Traditional and Non-Traditional Programming Environments

    Science.gov (United States)

    Csernoch, Mária; Biró, Piroska; Máth, János; Abari, Kálmán

    2015-01-01

    The Testing Algorithmic and Application Skills (TAaAS) project was launched in the 2011/2012 academic year to test first year students of Informatics, focusing on their algorithmic skills in traditional and non-traditional programming environments, and on the transference of their knowledge of Informatics from secondary to tertiary education. The…

  13. The impact of gender ideologies on men's and women's desire for a traditional or non-traditional partner

    OpenAIRE

    Thomae, M.; Houston, Diane

    2016-01-01

    Two studies examine preferences for a long-term partner who conforms to traditional or non-traditional gender roles. The studies both demonstrate a link between benevolent sexism and preference for a traditional partner. However, Study 1 also demonstrates a strong preference among women for a non-traditional partner. We measured ambivalent sexist ideologies before introducing participants to either a stereotypically traditional or stereotypically non-traditional character of the opp...

  14. Traditional and non-traditional treatments for autism spectrum disorder with seizures: an on-line survey.

    Science.gov (United States)

    Frye, Richard E; Sreenivasula, Swapna; Adams, James B

    2011-05-18

    Despite the high prevalence of seizures, epilepsy and abnormal electroencephalograms in individuals with autism spectrum disorder (ASD), there is little information regarding the relative effectiveness of treatments for seizures in the ASD population. In order to determine the effectiveness of traditional and non-traditional treatments for improving seizures and influencing other clinical factors relevant to ASD, we developed a comprehensive on-line seizure survey. Announcements (by email and websites) by ASD support groups asked parents of children with ASD to complete the on-line surveys. Survey responders chose one of two surveys to complete: a survey about treatments for individuals with ASD and clinical or subclinical seizures or abnormal electroencephalograms, or a control survey for individuals with ASD without clinical or subclinical seizures or abnormal electroencephalograms. Survey responders rated the perceived effect of traditional antiepileptic drugs (AEDs), non-AED seizure treatments and non-traditional ASD treatments on seizures and other clinical factors (sleep, communication, behavior, attention and mood), and listed up to three treatment side effects. Responses were obtained concerning 733 children with seizures and 290 controls. In general, AEDs were perceived to improve seizures but worsen other clinical factors for children with clinical seizures. Valproic acid, lamotrigine, levetiracetam and ethosuximide were perceived to improve seizures the most and worsen other clinical factors the least of all AEDs in children with clinical seizures. Traditional non-AED seizure treatments and non-traditional treatments, as a group, were perceived to improve other clinical factors and seizures, but the perceived improvement in seizures was significantly less than that reported for AEDs. Certain traditional non-AED treatments, particularly the ketogenic diet, were perceived to improve both seizures and other clinical factors. For ASD individuals with reported

  15. Communicating spatial uncertainty to non-experts using R

    Science.gov (United States)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty in the resulting information. A challenge arises when trying to communicate the uncertainty information effectively to non-experts (non-statisticians) in a wide range of cases. Given the growing popularity and applicability of the open-source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey of a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as a DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R
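    The quantities shown in the adjacent maps (ensemble mean, standard deviation and prediction intervals) are simple per-cell summaries of the Monte Carlo ensemble. A minimal stdlib sketch, with the grid layout and the 90% interval level as illustrative assumptions:

    ```python
    import statistics

    def summarise_ensemble(realisations):
        """Per-cell summary of a Monte Carlo ensemble of maps.

        realisations: list of maps, each a list of rows of cell values.
        Returns mean, standard-deviation and 90% prediction-interval maps --
        the quantities displayed side by side as "adjacent maps".
        """
        n_rows, n_cols = len(realisations[0]), len(realisations[0][0])
        mean = [[0.0] * n_cols for _ in range(n_rows)]
        sd = [[0.0] * n_cols for _ in range(n_rows)]
        lo = [[0.0] * n_cols for _ in range(n_rows)]
        hi = [[0.0] * n_cols for _ in range(n_rows)]
        for r in range(n_rows):
            for c in range(n_cols):
                vals = sorted(m[r][c] for m in realisations)
                mean[r][c] = statistics.fmean(vals)
                sd[r][c] = statistics.stdev(vals)
                lo[r][c] = vals[int(0.05 * len(vals))]   # 5th percentile
                hi[r][c] = vals[int(0.95 * len(vals))]   # 95th percentile
        return mean, sd, lo, hi

    # Toy ensemble: 100 realisations of a 1x2 elevation grid
    realisations = [[[100 + i, 200 - i]] for i in range(100)]
    mean, sd, lo, hi = summarise_ensemble(realisations)
    ```

    In practice each summary grid would then be rendered as its own map panel next to the others.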

  16. Electronic health records: what does your signature signify?

    Directory of Open Access Journals (Sweden)

    Victoroff, Michael S., MD

    2012-08-01

    Full Text Available Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges of historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation, can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information.

  17. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist in the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed: evidence-theory-based, interval-analysis-based, fuzzy-interval-analysis-based, possibility-theory-based (posbist reliability) and uncertainty-theory-based (belief reliability) metrics. It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for selecting the appropriate reliability metric.
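    The duality axiom mentioned above requires that a reliability metric M satisfy M(A) + M(not A) = 1. A small sketch illustrates why a raw possibility measure fails this test while a probability measure passes; the events and numbers are illustrative, not drawn from the review:

    ```python
    def possibility(event, dist):
        """Possibility measure: Pos(A) = max of the possibility distribution over A."""
        return max(dist[x] for x in event) if event else 0.0

    def probability(event, dist):
        """Ordinary probability measure: P(A) = sum of probabilities over A."""
        return sum(dist[x] for x in event)

    universe = {"works", "fails"}
    pos_dist = {"works": 1.0, "fails": 0.7}    # normalised possibility distribution
    prob_dist = {"works": 0.8, "fails": 0.2}   # probability distribution

    A = {"works"}
    Ac = universe - A

    # Possibility violates duality: Pos(A) + Pos(not A) = 1.0 + 0.7 > 1
    pos_sum = possibility(A, pos_dist) + possibility(Ac, pos_dist)
    # Probability satisfies it: P(A) + P(not A) = 1 exactly
    prob_sum = probability(A, prob_dist) + probability(Ac, prob_dist)
    ```

    This is why possibility-based metrics are typically paired with a dual necessity measure, and why metrics such as belief reliability are constructed to satisfy duality by design.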

  18. Controlling quantum memory-assisted entropic uncertainty in non-Markovian environments

    Science.gov (United States)

    Zhang, Yanliang; Fang, Maofa; Kang, Guodong; Zhou, Qingping

    2018-03-01

    The quantum-memory-assisted entropic uncertainty relation (QMA EUR) shows that the lower bound of Maassen and Uffink's entropic uncertainty relation (without quantum memory) can be broken. In this paper, we investigated the dynamical features of the QMA EUR in Markovian and non-Markovian dissipative environments. It is found that the dynamics of the QMA EUR are oscillatory in a non-Markovian environment, and that strong interaction is favorable for suppressing the amount of entropic uncertainty. Furthermore, we presented two schemes, based on prior weak measurement and posterior weak measurement reversal, to control the amount of entropic uncertainty of Pauli observables in dissipative environments. The numerical results show that the prior weak measurement can effectively reduce the peak values of the QMA EUR dynamics in a non-Markovian environment over long periods of time, but it is ineffectual on the minima of the dynamics. The posterior weak measurement reversal, however, has the opposite effect on the dynamics. Moreover, the success probability depends entirely on the quantum measurement strength. We hope that our proposal can be verified experimentally and might have future applications in quantum information processing.
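    The memoryless Maassen-Uffink bound that quantum memory can beat is H(Q) + H(R) ≥ -log2 c, where c is the maximum squared overlap between eigenvectors of the two observables. A minimal sketch evaluating the bound for the Pauli observables Z and X; the eigenvector representation is standard, everything else is a toy calculation:

    ```python
    import math

    def overlap_c(basis_q, basis_r):
        """c = max |<q_i|r_j>|^2 over all pairs of eigenvectors."""
        best = 0.0
        for q in basis_q:
            for r in basis_r:
                inner = sum(a.conjugate() * b for a, b in zip(q, r))
                best = max(best, abs(inner) ** 2)
        return best

    def maassen_uffink_bound(basis_q, basis_r):
        """Lower bound on H(Q) + H(R) in bits, without quantum memory."""
        return -math.log2(overlap_c(basis_q, basis_r))

    s = 1 / math.sqrt(2)
    Z = [(1, 0), (0, 1)]     # eigenvectors of Pauli Z
    X = [(s, s), (s, -s)]    # eigenvectors of Pauli X

    # For mutually unbiased bases such as Z and X, c = 1/2, so the bound is 1 bit
    bound = maassen_uffink_bound(Z, X)
    ```

    The QMA EUR adds a conditional-entropy term S(A|B) to this bound, which can be negative for entangled memory and thereby pushes the total below the memoryless value.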

  19. The Malay Enclave of Kampong Bharu as a Living Tradition: A place of uncertainty

    Directory of Open Access Journals (Sweden)

    Norsidah Ujang

    2016-06-01

    Full Text Available In Asian cities, poor redevelopment processes have often resulted in the loss of historic urban fabric. Kampong Bharu, a traditional Malay settlement in the heart of Kuala Lumpur, presents a unique case of a struggle to preserve its local identity. This paper reviews the situation of the enclave in light of the current redevelopment proposal. Reviews of the literature and analysis of recent reports indicate that the future of the enclave is in a state of uncertainty. People-oriented planning, based upon a deep understanding of culture and tradition, could bring about a natural approach towards definitive redevelopment initiatives.

  20. Searching for signatures of non-standard physics in cosmological surveys

    International Nuclear Information System (INIS)

    Brunier, Tristan

    2006-01-01

    This report focuses on the origin of the large-scale structures of the universe and the corresponding observable signatures. We examine the classical evolution of perturbations before studying the quantum mechanism that gave rise to them. The classical evolution of inhomogeneities is described with the theory of cosmological perturbations. Special attention is paid to second-order non-linearities, which can reveal tiny effects, either present initially or induced by the evolution. These tools are used to connect the structure of inhomogeneities to that of the anisotropies of temperature and polarization of the cosmic microwave background. They allow a new physical interpretation of the power spectra behaviors, including high-order effects. The effect induced by a dipolar anisotropy, of statistical or experimental nature, is examined, as well as the behavior of non-Gaussian temperature fluctuations of primordial origin. We then explore the properties of quantum fields in curved spacetime, focusing on de Sitter space. In particular, we characterize the consequences of renormalization for the field masses. We finally apply these concepts to quantum fluctuations during inflation. We study the specific signatures of some inflationary models, show how non-Gaussianities are characterized, and check that these models survive radiative corrections to the field masses. (author) [fr

  1. Minimal length uncertainty and generalized non-commutative geometry

    International Nuclear Information System (INIS)

    Farmany, A.; Abbasi, S.; Darvishi, M.T.; Khani, F.; Naghipour, A.

    2009-01-01

    A generalized formulation of non-commutative geometry for the Bargmann-Fock space of quantum field theory is presented. The analysis is related to the symmetry of the symplectic space and a minimal length uncertainty.

  2. Exploring the molecular mechanisms of Traditional Chinese Medicine components using gene expression signatures and connectivity map.

    Science.gov (United States)

    Yoo, Minjae; Shin, Jimin; Kim, Hyunmin; Kim, Jihye; Kang, Jaewoo; Tan, Aik Choon

    2018-04-04

    Traditional Chinese Medicine (TCM) has been practiced over thousands of years in China and other Asian countries for treating various symptoms and diseases. However, the underlying molecular mechanisms of TCM are poorly understood, partly due to the "multi-component, multi-target" nature of TCM. To uncover the molecular mechanisms of TCM, we performed comprehensive gene expression analysis using a connectivity map. We interrogated gene expression signatures obtained for 102 TCM components using the next-generation Connectivity Map (CMap) resource. We performed systematic data mining and analysis of the mechanisms of action (MoAs) of these TCM components based on the CMap results. We clustered the 102 TCM components into four groups based on their MoAs using the next-generation CMap resource. We performed gene set enrichment analysis on these components to provide additional support for explaining these molecular mechanisms. We also provided literature evidence to validate the MoAs identified through this bioinformatics analysis. Finally, we developed the Traditional Chinese Medicine Drug Repurposing Hub (TCMHub), a connectivity map resource to facilitate the elucidation of TCM MoA for drug repurposing research. TCMHub is freely available at http://tanlab.ucdenver.edu/TCMHub. Molecular mechanisms of TCM can be uncovered using gene expression signatures and connectivity maps. Through this analysis, we found that many of the TCM components possess diverse MoAs, which may explain the applications of TCM in treating various symptoms and diseases. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Energy and non-traditional security (NTS) in Asia

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Anthony, Mely [Nanyang Technological Univ., Singapore (SG). Centre for Non-Traditional Security (NTS) Studies; Chang, Youngho [Nanyang Technological Univ., Singapore (Singapore). Division of Economics; Putra, Nur Azha (eds.) [National Univ. of Singapore (Singapore). Energy Security Division

    2012-07-01

    Traditional notions of security are premised on the primacy of state security. In relation to energy security, traditional policy thinking has focused on ensuring supply without much emphasis on socioeconomic and environmental impacts. Non-traditional security (NTS) scholars argue that threats to human security have become increasingly prominent since the end of the Cold War, and that it is thus critical to adopt a holistic and multidisciplinary approach in addressing rising energy needs. This volume represents the perspectives of scholars from across Asia, looking at diverse aspects of energy security through a non-traditional security lens. The issues covered include environmental and socioeconomic impacts, the role of the market, the role of civil society, energy sustainability and policy trends in the ASEAN region.

  4. Non-traditional Stable Isotope Systematics of Seafloor Hydrothermal Systems

    Science.gov (United States)

    Rouxel, O. J.

    2009-05-01

    Seafloor hydrothermal activity at mid-ocean ridges is one of the fundamental processes controlling the chemistry of the oceans and the altered oceanic crust. Past studies have demonstrated the complexity and diversity of seafloor hydrothermal systems and have highlighted the importance of subsurface environments in controlling the composition of hydrothermal fluids and mineralization types. Traditionally, the behavior of metals in seafloor hydrothermal systems has been investigated by integrating results from laboratory studies, theoretical models, mineralogy, and fluid and mineral chemistry. Isotope ratios of various metals and metalloids, such as Fe, Cu, Zn, Se, Cd and Sb, have recently provided new approaches for the study of seafloor hydrothermal systems. Despite these initial investigations, the causes of the isotopic variability of these elements remain poorly constrained. We have little understanding of the isotope variations between vent types (black or white smokers) as well as the influence of source-rock composition (basalt, felsic or ultrabasic rocks) and alteration types. Here, I will review and present new results of metal isotope systematics of seafloor hydrothermal systems, in particular: (1) determination of empirical isotope fractionation factors for Zn-, Fe- and Cu-isotopes through isotopic analysis of mono-mineralic sulfide grains lining the internal chimney wall in contact with hydrothermal fluid; (2) comparison of Fe- and Cu-isotope signatures of vent fluids from mid-oceanic and back-arc hydrothermal fields, spanning wide ranges of pH, temperature, metal concentrations and contributions of magmatic fluids enriched in SO2. Ultimately, the use of complementary non-traditional stable isotope systems may help identify and constrain the complex interactions between fluids, minerals, and organisms in seafloor hydrothermal systems.

  5. Signature molecular descriptor : advanced applications.

    Energy Technology Data Exchange (ETDEWEB)

    Visco, Donald Patrick, Jr. (Tennessee Technological University, Cookeville, TN)

    2010-04-01

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, and serendipity was the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge in a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure. This report

  6. An uncertainty inclusive un-mixing model to identify tracer non-conservativeness

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Franks, Stewart; Fenton, Owen; Jordan, Phil; hUallacháin, Daire Ó.

    2015-04-01

    Sediment fingerprinting is increasingly recognised as an essential tool for catchment soil and water management. Selected physico-chemical properties (tracers) of soils and river sediments are used in a statistically-based 'un-mixing' model to apportion sediment delivered to the catchment outlet (target) to its upstream sediment sources. Development of uncertainty-inclusive approaches, taking into account uncertainties in the sampling, measurement and statistical un-mixing, is improving the robustness of results. However, methodological challenges remain, including issues of particle size and organic matter selectivity and non-conservative behaviour of tracers relating to biogeochemical transformations along the transport pathway. This study builds on our earlier uncertainty-inclusive approach (FR2000) to detect and assess the impact of tracer non-conservativeness using synthetic data, before applying these lessons to new field data from Ireland. Un-mixing was conducted on 'pristine' and 'corrupted' synthetic datasets containing three to fifty tracers (in the corrupted dataset one target tracer value was manually corrupted to replicate non-conservative behaviour). Additionally, a smaller corrupted dataset was un-mixed using a permutation version of the algorithm. Field data were collected in an 11 km2 river catchment in Ireland. Source samples were collected from topsoils, subsoils, channel banks, open field drains, damaged road verges and farm tracks. Target samples were collected using time-integrated suspended sediment samplers at the catchment outlet at 6-12 week intervals from July 2012 to June 2013. Samples were dried (affected whereas uncertainty was only marginally impacted by the corrupted tracer. Improvement of uncertainty resulted from increasing the number of tracers in both the pristine and corrupted datasets. FR2000 was capable of detecting non-conservative tracer behaviour within the range of mean source values, therefore it provided a more

  7. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge, and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects are thoroughly discussed for the rectangular representation of uncertainty by the uniform probability distribution and the interval, respectively. Triangular representations are also dealt with and compared. Calculation of monotonic as well as non-monotonic functions of variables represented by the two approaches contributes to the understanding of their similarities and differences, as well as to practical applications.

  8. Student learning or the student experience: the shift from traditional to non-traditional faculty in higher education

    Directory of Open Access Journals (Sweden)

    Carlos Tasso Eira de Aquino

    2016-10-01

    Trends in higher education indicate transformations from teachers to facilitators, mentors, or coaches. New classroom management requires diverse teaching methods for a changing population. Non-traditional students require non-traditional faculty. Higher education operates similarly to a traditional corporation, but competes for students, faculty, and funding to sustain daily operations and improve academic ranking among peers (Pak, 2013). This growing phenomenon suggests the need for faculty to transform the existing educational culture, ensuring the ability to attract and retain students. Transitions from student learning to the student experience and increasing student satisfaction scores are influencing facilitation in the classroom. On-line facilitation methods are evolving to include teamwork, interactive tutorials, and media, and to extend beyond group discussion. Faculty should be required to provide more facilitation, coaching, and mentoring, with the shifting roles resulting in transitions from traditional faculty to faculty-coach and faculty-mentor. The non-traditional adult student may require a more hands-on guidance approach and may not be as self-directed as adult learning theory proposes. This topic is important to individuals who support the creation of new knowledge related to non-traditional adult learning models.

  9. An improved non-Markovian degradation model with long-term dependency and item-to-item uncertainty

    Science.gov (United States)

    Xi, Xiaopeng; Chen, Maoyin; Zhang, Hanwen; Zhou, Donghua

    2018-05-01

    It is widely noted in the literature that degradation should be simplified into a memoryless Markovian process for the purpose of predicting the remaining useful life (RUL). However, there actually exists long-term dependency in the degradation processes of some industrial systems, including electromechanical equipment, oil tankers, and large blast furnaces. This implies that the new degradation state depends not only on the current state, but also on the historical states. Such dynamic systems cannot be accurately described by traditional Markovian models. Here we present an improved non-Markovian degradation model with both long-term dependency and item-to-item uncertainty. As a typical non-stationary process with dependent increments, fractional Brownian motion (FBM) is utilized to simulate the fractal diffusion of practical degradations. The uncertainty among multiple items can be represented by a random variable of the drift. Based on this model, the unknown parameters are estimated through the maximum likelihood (ML) algorithm, while a closed-form solution to the RUL distribution is further derived using a weak convergence theorem. The practicability of the proposed model is fully verified by two real-world examples. The results demonstrate that the proposed method can effectively reduce the prediction error.
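The model class described above, an item-specific drift plus fractional Brownian motion, is easy to sketch numerically. The following is a minimal illustration, not the authors' implementation: FBM is simulated by Cholesky factorization of its covariance, Cov(B_H(s), B_H(t)) = ½(s^{2H} + t^{2H} − |t − s|^{2H}), and item-to-item uncertainty enters as a normally distributed drift. All function names and parameter values are illustrative.

```python
import numpy as np

def fbm_covariance(t, H):
    """Covariance matrix of fractional Brownian motion at times t > 0."""
    s, u = t[:, None], t[None, :]
    return 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))

def simulate_degradation(t, H=0.7, drift_mean=1.0, drift_std=0.1,
                         sigma=0.2, rng=None):
    """One degradation path: random drift (item-to-item uncertainty)
    plus sigma * FBM (long-term dependency)."""
    rng = np.random.default_rng(rng)
    lam = rng.normal(drift_mean, drift_std)          # item-specific drift
    # Cholesky factor turns i.i.d. normals into correlated FBM samples
    L = np.linalg.cholesky(fbm_covariance(t, H) + 1e-12 * np.eye(len(t)))
    return lam * t + sigma * (L @ rng.standard_normal(len(t)))
```

A quick sanity check: at H = 0.5 the covariance reduces to min(s, t), i.e., ordinary (memoryless) Brownian motion, while H > 0.5 gives the positively correlated increments that carry the long-term dependency.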

  10. Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos

    Science.gov (United States)

    West, Thomas K., IV; Gumbert, Clyde

    2017-01-01

    The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design under uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems, as well as the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed the multifidelity modeling approach was able to predict the output uncertainty predicted by the high-fidelity model at a significant reduction in computational cost.
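The point-collocation form of non-intrusive polynomial chaos can be sketched for a single standard-normal input: sample the input, evaluate the model at those points, and solve an oversampled least-squares system for the coefficients of probabilists' Hermite polynomials. A minimal single-fidelity sketch (function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

def nipc_point_collocation(model, degree=4, oversample=2, seed=0):
    """Fit PCE coefficients of model(xi), xi ~ N(0,1), by least-squares
    point collocation with `oversample` times as many samples as terms."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(oversample * (degree + 1))
    A = hermevander(xi, degree)              # He_0..He_degree at the samples
    coeffs, *_ = np.linalg.lstsq(A, model(xi), rcond=None)
    return coeffs

def pce_stats(coeffs):
    """Mean and variance from probabilists' Hermite PCE coefficients:
    E[He_k^2] = k!, so variance = sum_{k>=1} k! * c_k^2."""
    var = sum(factorial(k) * c**2 for k, c in enumerate(coeffs) if k > 0)
    return coeffs[0], var
```

For example, fitting the linear model 3 + 2ξ recovers mean 3 and variance 4, since Var(2ξ) = 4 for ξ ~ N(0, 1).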

  11. Comparing performance of standard and iterative linear unmixing methods for hyperspectral signatures

    Science.gov (United States)

    Gault, Travis R.; Jansen, Melissa E.; DeCoster, Mallory E.; Jansing, E. David; Rodriguez, Benjamin M.

    2016-05-01

    Linear unmixing is a method of decomposing a mixed signature to determine the component materials that are present in a sensor's field of view, along with the abundances at which they occur. Linear unmixing assumes that energy from the materials in the field of view is mixed in a linear fashion across the spectrum of interest. Traditional unmixing methods can take advantage of adjacent pixels in the decomposition algorithm, but this is not the case for point sensors. This paper explores several iterative and non-iterative methods for linear unmixing, and examines their effectiveness at identifying the individual signatures that make up simulated single-pixel mixed signatures, along with their corresponding abundances. The major hurdle addressed in the proposed method is that no neighboring pixel information is available for the spectral signature of interest. Testing is performed using two collections of spectral signatures from the Johns Hopkins University Applied Physics Laboratory's Signatures Database software (SigDB): a hand-selected small dataset of 25 distinct signatures drawn from a larger dataset of approximately 1600 pure visible/near-infrared/short-wave-infrared (VIS/NIR/SWIR) spectra. Simulated spectra are created with three- and four-material mixtures randomly drawn from a dataset originating from SigDB, where the abundance of one material is swept in 10% increments from 10% to 90% with the abundances of the other materials equally divided amongst the remainder. For the smaller dataset of 25 signatures, all combinations of three or four materials are used to create simulated spectra, from which the accuracy of materials returned, as well as the correctness of the abundances, is compared to the inputs. The experiment is expanded to include the signatures from the larger dataset of almost 1600 signatures evaluated using a Monte Carlo scheme with 5000 draws of three or four materials to create the simulated mixed signatures.
The spectral similarity of the inputs to the
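The core computation, recovering abundances for a single mixed pixel with no neighbor information, is commonly posed as a non-negative least-squares problem. A minimal sketch using SciPy (illustrative only; the paper's iterative and non-iterative methods differ in detail):

```python
import numpy as np
from scipy.optimize import nnls

def unmix(endmembers, mixed):
    """Non-negative least-squares unmixing of a single mixed pixel.

    endmembers: (n_bands, n_materials) pure-material spectra as columns.
    mixed:      (n_bands,) observed mixed signature.
    Returns abundances normalized to sum to one (sum-to-one is imposed
    after the fit here, for simplicity)."""
    abundances, _residual = nnls(endmembers, mixed)
    total = abundances.sum()
    return abundances / total if total > 0 else abundances
```

On noiseless simulated mixtures with linearly independent endmembers, this recovers the true abundances exactly, which mirrors the simulated-spectra evaluation described above.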

  12. Sonic Boom Pressure Signature Uncertainty Calculation and Propagation to Ground Noise

    Science.gov (United States)

    West, Thomas K., IV; Bretl, Katherine N.; Walker, Eric L.; Pinier, Jeremy T.

    2015-01-01

    The objective of this study was to outline an approach for the quantification of uncertainty in sonic boom measurements and to investigate the effect of various near-field uncertainty representation approaches on ground noise predictions. These approaches included a symmetric versus asymmetric uncertainty band representation and a dispersion technique based on a partial-sum Fourier series that allows for the inclusion of random error sources in the uncertainty. The near-field uncertainty was propagated to the ground level, along with additional uncertainty in the propagation modeling. Estimates of perceived loudness were obtained for the various types of uncertainty representation in the near-field. Analyses were performed on three configurations of interest to the sonic boom community: the SEEB-ALR, the 69° Delta Wing, and the LM 1021-01. Results showed that the representation of the near-field uncertainty plays a key role in ground noise predictions. Using a Fourier series based dispersion approach can double the amount of uncertainty in the ground noise compared to a pure bias representation. Compared to previous computational fluid dynamics results, uncertainty in ground noise predictions was greater when considering the near-field experimental uncertainty.
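The dispersion idea can be illustrated schematically: sum a handful of random-amplitude, random-phase sine modes and rescale the result so the perturbed signature stays inside the uncertainty band. This is a hedged reading of the partial-sum Fourier approach, not the study's code; the function name, mode count, and scaling rule are all illustrative.

```python
import numpy as np

def disperse_signature(p, band, n_modes=8, rng=None):
    """Perturb a nominal near-field signature p within +/- band using a
    random-phase partial Fourier sum (a sketch of the dispersion idea)."""
    rng = np.random.default_rng(rng)
    t = np.linspace(0.0, 1.0, p.size)
    pert = np.zeros_like(p)
    for k in range(1, n_modes + 1):          # partial sum of k sine modes
        pert += rng.uniform(-1, 1) * np.sin(2 * np.pi * k * t
                                            + rng.uniform(0, 2 * np.pi))
    # rescale so the random perturbation never leaves the uncertainty band
    pert *= band / max(np.max(np.abs(pert)), 1e-12)
    return p + pert
```

Each call produces one dispersed realization; propagating an ensemble of such realizations to the ground gives a spread in loudness rather than a single biased shift.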

  13. X-RAY SIGNATURES OF NON-EQUILIBRIUM IONIZATION EFFECTS IN GALAXY CLUSTER ACCRETION SHOCK REGIONS

    International Nuclear Information System (INIS)

    Wong, Ka-Wah; Sarazin, Craig L.; Ji Li

    2011-01-01

    The densities in the outer regions of clusters of galaxies are very low, and the collisional timescales are very long. As a result, heavy elements will be under-ionized after they have passed through the accretion shock. We have studied systematically the effects of non-equilibrium ionization for relaxed clusters in the ΛCDM cosmology using one-dimensional hydrodynamic simulations. We found that non-equilibrium ionization effects do not depend on cluster mass, but depend strongly on redshift which can be understood by self-similar scaling arguments. The effects are stronger for clusters at lower redshifts. We present X-ray signatures such as surface brightness profiles and emission lines in detail for a massive cluster at low redshift. In general, soft emission (0.3-1.0 keV) is enhanced significantly by under-ionization, and the enhancement can be nearly an order of magnitude near the shock radius. The most prominent non-equilibrium ionization signature we found is the O VII and O VIII line ratio. The ratios for non-equilibrium ionization and collisional ionization equilibrium models are different by more than an order of magnitude at radii beyond half of the shock radius. These non-equilibrium ionization signatures are equally strong for models with different non-adiabatic shock electron heating efficiencies. We have also calculated the detectability of the O VII and O VIII lines with the future International X-ray Observatory (IXO). Depending on the line ratio measured, we conclude that an exposure of ∼130-380 ks on a moderate-redshift, massive regular cluster with the X-ray Microcalorimeter Spectrometer (XMS) on the IXO will be sufficient to provide a strong test for the non-equilibrium ionization model.

  14. Non-traditional inheritance

    International Nuclear Information System (INIS)

    Hall, J.G.

    1992-01-01

    In the last few years, several non-traditional forms of inheritance have been recognized. These include mosaicism, cytoplasmic inheritance, uniparental disomy, imprinting, amplification/anticipation, and somatic recombination. Genomic imprinting (GI) is the dependence of the phenotype on the sex of the transmitting parent. GI in humans seems to involve growth, behaviour, and survival in utero. The detailed mechanism of genomic imprinting is not known, but it seems that some process is involved in turning a gene off; this probably involves two genes, one of which produces a product that turns a gene off, and the gene that is itself turned off. The process of imprinting (turning off) may be associated with methylation. Erasure of imprinting can occur, and seems to be associated with meiosis. 10 refs

  15. An overview of non-traditional nuclear threats

    International Nuclear Information System (INIS)

    Geelhood, B.D.; Wogman, N.A.

    2005-01-01

    In view of the terrorist threats to the United States, the country needs to consider new vectors and weapons related to nuclear and radiological threats against our homeland. The traditional threat vectors, missiles and bombers, have expanded to include threats arriving through the flow of commerce. The new commerce-related vectors include: sea cargo, truck cargo, rail cargo, air cargo, and passenger transport. The types of weapons have also expanded beyond nuclear warheads to include radiation dispersal devices (RDD) or 'dirty' bombs. The consequences of these nuclear and radiological threats are both economic and life threatening. The defense against undesirable materials entering our borders involves extensive radiation monitoring at ports of entry. The radiation and other signatures of potential nuclear and radiological threats are examined along with potential sensors to discover undesirable items in the flow of commerce. Techniques to improve radiation detection are considered. A strategy of primary and secondary screening is proposed to rapidly clear most cargo and carefully examine suspect cargo. (author)

  16. Andragogical Teaching Methods to Enhance Non-Traditional Student Classroom Engagement

    Science.gov (United States)

    Allen, Pamela; Withey, Paul; Lawton, Deb; Aquino, Carlos Tasso

    2016-01-01

    The aim of this study was to provide a reflection of current trends in higher education, identify some of the changes in student behavior, and potential identification of non-traditional classroom facilitation with the purpose of strengthening active learning and use of technology in the classroom. Non-traditional teaching is emerging in the form…

  17. Learning How to Learn: Implications for Non Traditional Adult Students

    Science.gov (United States)

    Tovar, Lynn A.

    2008-01-01

    In this article, learning how to learn for non traditional adult students is discussed with a focus on police officers and firefighters. Learning how to learn is particularly relevant for all returning non-traditional adults; however in the era of terrorism it is critical for the public safety officers returning to college after years of absence…

  18. Development Of International Non-Governmental Organizations And Legal Traditions Of Russia

    OpenAIRE

    Alexandra A. Dorskaya

    2015-01-01

    The article examines the role of international non-governmental organizations in the maintenance and creation of a positive attitude to national legal traditions. It outlines the basic stages of development of international non-governmental organizations and analyzes their advantages and disadvantages. It also considers how the legal traditions of Russian society are reflected in the activities of legal entities and individuals who are members of international non-governmental organizations.

  19. Exploring Non-Traditional Learning Methods in Virtual and Real-World Environments

    Science.gov (United States)

    Lukman, Rebeka; Krajnc, Majda

    2012-01-01

    This paper identifies the commonalities and differences within non-traditional learning methods regarding virtual and real-world environments. The non-traditional learning methods in real-world have been introduced within the following courses: Process Balances, Process Calculation, and Process Synthesis, and within the virtual environment through…

  20. Development Of International Non-Governmental Organizations And Legal Traditions Of Russia

    Directory of Open Access Journals (Sweden)

    Alexandra A. Dorskaya

    2015-06-01

    The article examines the role of international non-governmental organizations in the maintenance and creation of a positive attitude to national legal traditions. It outlines the basic stages of development of international non-governmental organizations and analyzes their advantages and disadvantages. It also considers how the legal traditions of Russian society are reflected in the activities of legal entities and individuals who are members of international non-governmental organizations.

  1. Student Media Usage Patterns and Non-Traditional Learning in Higher Education

    Directory of Open Access Journals (Sweden)

    Olaf Zawacki-Richter

    2015-04-01

    A total of 2,338 students at German universities participated in a survey that investigated the media usage patterns of so-called traditional and non-traditional students (Schuetze & Wolter, 2003). The students provided information on the digital devices that they own or have access to, and on their usage of media and e-learning tools and services for their learning. A distinction was made between external, formal and internal, informal tools and services. Based on the students’ responses, a typology of media usage patterns was established by means of a latent class analysis (LCA). Four types or profiles of media usage patterns were identified. These types were labeled entertainment users, peripheral users, advanced users and instrumental users. Among non-traditional students, the proportion of instrumental users was rather high. Based on the usage patterns of traditional and non-traditional students, implications for media selection in the instructional design process are outlined in the paper.

  2. Owner Ethnicity: the Determinant of Relationship Between Environment Uncertainty and Performance

    OpenAIRE

    Andriany, Lussia Mariesti

    2017-01-01

    The aim of this research is to reveal the effect of environmental uncertainty, seen through competition level and customers, on performance. It examines how the relationship between environmental uncertainty and performance differs between Chinese- and non-Chinese-owned businesses. Data were collected through a direct survey of traditional retail owners using questionnaires and interviews, and were then analyzed in two stages: confirmatory factor analysis and simple regression with a dummy variable. This res...

  3. Privacy in wireless sensor networks using ring signature

    Directory of Open Access Journals (Sweden)

    Ashmita Debnath

    2014-07-01

    The veracity of a message from a sensor node must be verified in order to avoid a false reaction by the sink. This verification requires the authentication of the source node. The authentication process must also preserve privacy, such that the node and the sensed object are not endangered. In this work, a ring signature was proposed to authenticate the source node while preserving its spatial privacy. However, the other nodes acting as signers, and their number, must be chosen to preclude the possibility of a traffic analysis attack by an adversary. The spatial uncertainty increases with the number of signers but requires a larger memory size and communication overhead, and this requirement can breach the privacy of the sensed object. To determine the effectiveness of the proposed scheme, the location estimate of a sensor node by an adversary and the enhancement in location uncertainty with a ring signature were evaluated. Using simulation studies, the ring signature was estimated to require approximately four members from the same neighbor region of the source node to sustain the privacy of the node. Furthermore, the ring signature was also determined to have a small overhead and not to adversely affect the performance of the sensor network.

  4. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices.
    Highlights:
    • Modeling-induced uncertainties are often mishandled or ignored in the literature.
    • Modeling-induced uncertainties are epistemic in nature.
    • Probabilistic representations of modeling-induced uncertainties are restrictive.
    • Evidence theory and Bayesian model averaging are integrated.
    • Developed approach is applicable for simulation-based modeling problems.
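One ingredient named above, Bayesian model averaging, has a simple closed form for combining predictions across candidate models: the averaged variance is the weighted within-model variance plus the between-model spread of the means. A generic sketch of that identity (not tied to the paper's Dempster–Shafer extension; the function name is illustrative):

```python
def bma_combine(means, variances, weights):
    """Bayesian-model-averaged predictive mean and variance.

    weights are posterior model probabilities (must sum to 1);
    total variance = E[var] + var[mean] across models."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    mean = sum(w * m for w, m in zip(weights, means))
    var = sum(w * (v + m * m)
              for w, v, m in zip(weights, variances, means)) - mean * mean
    return mean, var
```

For two equally weighted models with means 1 and 3 and variance 0.5 each, the averaged mean is 2 and the variance is 1.5: the within-model 0.5 plus a between-model spread of 1.0, which is exactly the modeling-induced contribution a parametric-only analysis would miss.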

  5. Non-European traditional herbal medicines in Europe: a community herbal monograph perspective.

    Science.gov (United States)

    Qu, Liping; Zou, Wenjun; Zhou, Zhenxiang; Zhang, Tingmo; Greef, JanVander; Wang, Mei

    2014-10-28

    The European Directive 2004/24/EC, introducing a simplified registration procedure for traditional herbal medicinal products, plays an important role in harmonising the current legislative framework for all herbal medicinal products in the European Union (EU). Although substantial achievements have been made under the new scheme, only a limited number of herbal medicinal products from non-European traditions commonly used in Europe have been registered. Therefore, identification of the obstacles, and determination of appropriate means to overcome the major challenges in the registration of non-European traditional herbal medicinal products, are of critical importance for the EU herbal medicinal product market. The primary aims of this study were to understand the key issues and obstacles to registration of non-European traditional herbal medicinal products within the EU. The findings may identify the need for more attention on the Community herbal monographs elaborated by the Herbal Medicinal Products Committee (HMPC), as well as further evidence-based scientific research on non-European herbal substances/preparations by the scientific community. A systematic evaluation of the herbal substances and preparations included in Community herbal monographs and public statements has been carried out, with a focus on herbal substances and preparations derived from non-European traditions. Of the 109 adopted Community herbal monographs, 10 concern herbal substances used in traditional Chinese medicine. Where the HMPC issued a public statement because it was unable to elaborate a monograph, more than half involved herbal substances/preparations from non-European traditions. The main reason herbal substances/preparations from non-European traditions were not accepted for inclusion in the Community herbal monographs was failure to fulfil the requirements of Directive 2004/24/EC, most commonly the lack of evidence to demonstrate a 15-year minimum

  6. A group signature scheme based on quantum teleportation

    International Nuclear Information System (INIS)

    Wen Xiaojun; Tian Yuan; Ji Liping; Niu Xiamu

    2010-01-01

    In this paper, we present a group signature scheme using quantum teleportation. Different from classical group signature schemes and current quantum signature schemes, which can deliver either group signature properties or unconditional security but not both, our scheme guarantees both by adopting quantum key preparation, a quantum encryption algorithm and quantum teleportation. Security analysis proved that our scheme has the characteristics of a group signature: non-counterfeitability, non-disavowal, blindness and traceability. Our quantum group signature scheme has foreseeable applications in e-payment systems, e-government, e-business, etc.

  7. A group signature scheme based on quantum teleportation

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xiaojun; Tian Yuan; Ji Liping; Niu Xiamu, E-mail: wxjun36@gmail.co [Information Countermeasure Technique Research Institute, Harbin Institute of Technology, Harbin 150001 (China)

    2010-05-01

    In this paper, we present a group signature scheme using quantum teleportation. Different from classical group signature schemes and current quantum signature schemes, which can deliver either group signature properties or unconditional security but not both, our scheme guarantees both by adopting quantum key preparation, a quantum encryption algorithm and quantum teleportation. Security analysis proved that our scheme has the characteristics of a group signature: non-counterfeitability, non-disavowal, blindness and traceability. Our quantum group signature scheme has foreseeable applications in e-payment systems, e-government, e-business, etc.

  8. Five Guidelines for Selecting Hydrological Signatures

    Science.gov (United States)

    McMillan, H. K.; Westerberg, I.; Branger, F.

    2017-12-01

    Hydrological signatures are index values derived from observed or modeled series of hydrological data such as rainfall, flow or soil moisture. They are designed to extract relevant information about hydrological behavior, such as to identify dominant processes, and to determine the strength, speed and spatiotemporal variability of the rainfall-runoff response. Hydrological signatures play an important role in model evaluation. They allow us to test whether particular model structures or parameter sets accurately reproduce the runoff generation processes within the watershed of interest. Most modeling studies use a selection of different signatures to capture different aspects of the catchment response, for example evaluating overall flow distribution as well as high and low flow extremes and flow timing. Such studies often choose their own set of signatures, or may borrow subsets of signatures used in multiple other works. The link between signature values and hydrological processes is not always straightforward, leading to uncertainty and variability in hydrologists' signature choices. In this presentation, we aim to encourage a more rigorous approach to hydrological signature selection, which considers the ability of signatures to represent hydrological behavior and underlying processes for the catchment and application in question. To this end, we propose a set of guidelines for selecting hydrological signatures. We describe five criteria that any hydrological signature should conform to: Identifiability, Robustness, Consistency, Representativeness, and Discriminatory Power. We describe an example of the design process for a signature, assessing possible signature designs against the guidelines above. Due to their ubiquity, we chose a signature related to the Flow Duration Curve, selecting the FDC mid-section slope as a proposed signature to quantify catchment overall behavior and flashiness. 
We demonstrate how assessment against each guideline could be used to
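As an illustration of the kind of signature discussed, the mid-section slope of the flow duration curve is often computed as the slope of log flow between two mid-range exceedance percentiles. A minimal sketch using one common definition (the presenters' exact formulation may differ; the 33%/66% bounds are a conventional choice):

```python
import numpy as np

def fdc_midslope(flows, lo=0.33, hi=0.66):
    """Mid-section slope of the flow duration curve: slope of log(flow)
    between the lo and hi flow-exceedance fractions (steeper = flashier)."""
    q = np.asarray(flows, dtype=float)
    q_lo = np.quantile(q, 1 - lo)   # flow exceeded 33% of the time
    q_hi = np.quantile(q, 1 - hi)   # flow exceeded 66% of the time
    return (np.log(q_lo) - np.log(q_hi)) / (hi - lo)
```

A signature like this can then be checked against the five guidelines, e.g., its robustness to rating-curve uncertainty in the flow record, or its discriminatory power between flashy and damped catchments.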

  9. Adoption of agricultural innovations through non-traditional financial ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Adoption of agricultural innovations through non-traditional financial services ... donors, banks, and financial institutions to explore new kinds of financial services to ... enterprises, and others in the production process to connect with markets.

  10. Traditional and non-traditional treatments for autism spectrum disorder with seizures: an on-line survey

    OpenAIRE

    Frye, Richard E; Sreenivasula, Swapna; Adams, James B

    2011-01-01

    Background: Despite the high prevalence of seizures, epilepsy and abnormal electroencephalograms in individuals with autism spectrum disorder (ASD), there is little information regarding the relative effectiveness of treatments for seizures in the ASD population. In order to determine the effectiveness of traditional and non-traditional treatments for improving seizures and influencing other clinical factors relevant to ASD, we developed a comprehensive on-line seizure survey. Methods: A...

  11. Developing a Signature Based Safeguards Approach for the Electrorefiner and Salt Cleanup Unit Operations in Pyroprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Chantell Lynne-Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-27

    Traditional nuclear materials accounting does not work well for safeguards when applied to pyroprocessing. Alternate methods such as Signature Based Safeguards (SBS) are being investigated. The goal of SBS is real-time/near-real-time detection of anomalous events in the pyroprocessing facility as they could indicate loss of special nuclear material. In high-throughput reprocessing facilities, metric tons of separated material are processed that must be accounted for. Even with very low uncertainties of accountancy measurements (<0.1%) the uncertainty of the material balances is still greater than the desired level. Novel contributions of this work are as follows: (1) significant enhancement of SBS development for the salt cleanup process by creating a new gas sparging process model, selecting sensors to monitor normal operation, identifying safeguards-significant off-normal scenarios, and simulating those off-normal events and generating sensor output; (2) further enhancement of SBS development for the electrorefiner by simulating off-normal events caused by changes in salt concentration and identifying which conditions lead to Pu and Cm not tracking throughout the rest of the system; and (3) new contribution in applying statistical techniques to analyze the signatures gained from these two models to help draw real-time conclusions on anomalous events.

  12. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forests are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
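A common way to approximate prediction uncertainty for an ensemble regressor is to use the spread of the individual members' predictions. The sketch below uses a pure-Python bootstrap ensemble of one-split regression stumps as a stand-in for a full random forest; it illustrates the idea, not the authors' specific method:

```python
import random
import statistics

def fit_stump(xs, ys):
    # One-split regression stump: choose the split minimising squared
    # error and predict the mean response on each side.
    best = None
    for s in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= s]
        right = [y for x, y in zip(xs, ys) if x > s]
        if not left or not right:
            continue
        ml, mr = statistics.mean(left), statistics.mean(right)
        err = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or err < best[0]:
            best = (err, s, ml, mr)
    if best is None:                      # degenerate sample: constant x
        m = statistics.mean(ys)
        return lambda x: m
    _, s, ml, mr = best
    return lambda x: ml if x <= s else mr

def bootstrap_forest(xs, ys, n_trees=50, seed=1):
    # Each tree sees a bootstrap resample, so trees disagree most
    # where the data constrain the response least.
    rng = random.Random(seed)
    n = len(xs)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        trees.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return trees

def predict_with_uncertainty(trees, x):
    # Ensemble mean is the prediction; spread across trees is the
    # uncertainty proxy.
    preds = [t(x) for t in trees]
    return statistics.mean(preds), statistics.stdev(preds)
```

On a step-function dataset, the per-tree spread is near zero far from the step and large at the step itself, which is exactly the behavior an uncertainty map should show.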

  13. ADAGE signature analysis: differential expression analysis with data-defined gene sets.

    Science.gov (United States)

    Tan, Jie; Huyck, Matthew; Hu, Dongbo; Zelaya, René A; Hogan, Deborah A; Greene, Casey S

    2017-11-22

    ADAGE signature analysis performs gene set analysis using data-defined functional gene signatures. This approach addresses an important gap for biologists studying non-traditional model organisms and those without extensive curated resources available. We built both an R package and a web server to provide ADAGE signature analysis to the community.

  14. Conceptualisation of learning satisfaction experienced by non-traditional learners in Singapore

    OpenAIRE

    Khiat, Henry

    2013-01-01

    This study uncovered the different factors that make up the learning satisfaction of non-traditional learners in Singapore. Data were collected from a component of the student evaluation exercise in a Singapore university in 2011. A mixed-methods approach was adopted in the analysis. The study found that non-traditional learners’ learning satisfaction can be generally grouped into four main categories: a) Desirable Learning Deliverables; b) Directed Learning Related Factors; c) Lecturer/Tutor...

  15. ANALYSIS OF DIGITAL SIGNATURE RULE IN THE ELECTRONIC COMMERCE LAW OF INDONESIA

    Directory of Open Access Journals (Sweden)

    Irna Nurhayati

    2015-02-01

    The Indonesian legislature enacted the Information and Electronic Transaction Bill (the Bill) in 2003, in which digital signature, as a technology-specific means of securing electronic (e-)commerce, has been promoted. The promotion of digital signature is still problematic, since there is uncertainty over whether Indonesia can develop the viable technology that digital signature requires. Moreover, there is a gap between the actual use of digital signature and the projections of its future utilization by interested parties in e-commerce. This paper will discuss the reasons the Indonesian legislature promotes digital signature. It will then analyse the gap between the actual use of digital signature and the projections of its future utilization by interested parties in e-commerce. This paper will finally argue whether the Bill's promotion of digital signature is useful for the growth of e-commerce in Indonesia.

  16. An Investigation of the Perceptions of Business Students Regarding Non-Traditional Business Education Formats.

    Science.gov (United States)

    Barnes, John W.; Hadjimarcou, John

    1999-01-01

    A survey of 118 undergraduate business students at a major southwestern university found that most consider non-traditional education a viable alternative to traditional education. However, respondents also identified disadvantages of non-traditional programs, such as cost, the external validity of degrees, and an impersonal learning environment.…

  17. Non-codified traditional medicine practices from Belgaum Region in Southern India: present scenario

    Science.gov (United States)

    2014-01-01

    Background Traditional medicine in India can be classified into codified (Ayurveda, Unani, Siddha, Homeopathy) and non-codified (folk medicine) systems. Both systems contribute equally to primary healthcare in India. The present study aims to understand the current scenario of medicinal practices of the non-codified system of traditional medicine in the Belgaum region, India. Methods The study was conducted as a basic survey of identified non-codified traditional practitioners, selected by convenience sampling, with semi-structured, open-ended interviews and discussions. Data on the learning process, disease diagnosis, treatment, remuneration, sharing of knowledge and socio-demographics were collected, analysed and discussed. Results One hundred and forty traditional practitioners were identified and interviewed for the present study. These practitioners are locally known as “Vaidya”. The study revealed that the non-codified healthcare tradition is practised mainly by elderly persons aged 61 years and above (40%). 73% of the practitioners learnt the tradition from their forefathers, and 19% developed their own practices through experimentation, reading and learning. 20% of the practitioners follow the distinctive “Nadi Pariksha” (pulse examination) for disease diagnosis, while others rely on bodily symptoms and complaints. 29% of the traditional practitioners do not charge anything, while 59% receive money as remuneration. Plant and animal materials are used as sources of medicines, with a variety of preparation methods. A preference ranking test revealed that higher education and migration from villages are the main reasons for decreasing interest among the younger generation, while deforestation emerged as the main cause of medicinal plant depletion. Conclusion Patrilineal transfer of knowledge to the younger generation was observed in the Belgaum region. The observed resemblance in disease diagnosis, plant collection and

  18. Comparison between conservative perturbation and sampling based methods for propagation of Non-Neutronic uncertainties

    International Nuclear Information System (INIS)

    Campolina, Daniel de A.M.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2013-01-01

    For every physical component of a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for best-estimate calculations, which have been replacing conservative model calculations as computational power increases. Propagating uncertainty through a simulation using sampling-based methods is a recent practice because of the huge computational effort required. In this work a sample space of MCNP calculations was used as a black-box model to propagate the uncertainty of system parameters. The efficiency of the method was compared to that of a conservative method. The input uncertainties considered were non-neutronic, including geometry dimensions and density. The effect of the uncertainties on the effective multiplication factor of the system was analyzed with respect to the possibility of including many uncertain parameters in the same input. If a case includes more than 46 uncertain parameters in the same input, the sampling-based method proves more efficient than the conservative method.
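The contrast between the two approaches can be sketched with a toy stand-in for an MCNP run: Monte Carlo sampling propagates input distributions through the black box, while the conservative method stacks every parameter at its worst case simultaneously. The model, parameter values and sigmas below are hypothetical, chosen only to illustrate the comparison:

```python
import random
import statistics

def k_eff_model(radius_cm, density_gcc):
    # Hypothetical stand-in for a black-box MCNP calculation: k_eff
    # grows linearly with fuel radius and density (illustrative only).
    return 0.5 + 0.02 * radius_cm + 0.04 * density_gcc

def monte_carlo_propagation(n_samples=1000, seed=42):
    # Sampling-based propagation: draw inputs from their distributions
    # and collect the spread of the output.
    rng = random.Random(seed)
    keffs = []
    for _ in range(n_samples):
        r = rng.gauss(10.0, 0.05)   # radius: mean 10 cm (hypothetical)
        d = rng.gauss(19.0, 0.1)    # density: mean 19 g/cc (hypothetical)
        keffs.append(k_eff_model(r, d))
    return statistics.mean(keffs), statistics.stdev(keffs)

def conservative_bound():
    # Conservative method: every parameter at its 3-sigma worst case
    # at once, ignoring that such coincidences are improbable.
    return k_eff_model(10.0 + 3 * 0.05, 19.0 + 3 * 0.1)

mean_k, sigma_k = monte_carlo_propagation()
```

With many uncertain parameters, the conservative bound exceeds the sampled mean + 3-sigma envelope, because independent uncertainties rarely align in the worst direction; this is the efficiency gap the abstract quantifies.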

  19. The secret of René Guénon: A critical review of Guénon's traditionalism

    OpenAIRE

    Poznanović Željko

    2014-01-01

    This paper aims to present the basic principles of the doctrine of René Guénon (1886-1951), a French thinker of broad and comprehensive scope when it comes to Hinduism, Western Occult Tradition, Freemasonry, Taoism, symbolism and certain aspects of metaphysics. His teaching did not leave a mark in the mainstream of philosophy, yet it founded a whole syncretic movement known as Traditionalism or Perennialism. While Guénon's doctrine is commonly either ignored or accepted as unquestionably true,...

  20. The quantum entropic uncertainty relation and entanglement witness in the two-atom system coupling with the non-Markovian environments

    International Nuclear Information System (INIS)

    Zou, Hong-Mei; Fang, Mao-Fa; Yang, Bai-Yuan; Guo, You-Neng; He, Wei; Zhang, Shi-Yang

    2014-01-01

    The quantum entropic uncertainty relation and entanglement witness in a two-atom system coupled to non-Markovian environments are studied using the time-convolutionless master-equation approach. The influence of the non-Markovian effect and detuning on the lower bound of the quantum entropic uncertainty relation and on the entanglement witness is discussed in detail. The results show that, only if the two non-Markovian reservoirs are identical, can increasing the detuning and the non-Markovian effect reduce the lower bound of the entropic uncertainty relation, lengthen the time region during which entanglement can be witnessed, and effectively protect the entanglement region witnessed by the lower bound of the entropic uncertainty relation. The results can be applied in quantum measurement, quantum cryptography tasks and quantum information processing.
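The lower bound in question is, in standard treatments, the quantum-memory-assisted entropic uncertainty relation; the abstract does not spell out the exact form used, so the following is the commonly cited version:

```latex
% Entropic uncertainty with quantum memory B: for measurements of
% observables Q or R on system A, an observer holding B satisfies
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j} \bigl| \langle \psi_i | \phi_j \rangle \bigr|^2 ,
% where |psi_i> and |phi_j> are the eigenstates of Q and R.
```

A negative conditional entropy S(A|B) both tightens the bound and certifies entanglement, which is why the same quantity serves as an entanglement witness in this setting.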

  1. An Arbitrated Quantum Signature Scheme without Entanglement*

    International Nuclear Information System (INIS)

    Li Hui-Ran; Luo Ming-Xing; Peng Dai-Yuan; Wang Xiao-Jun

    2017-01-01

    Several quantum signature schemes have recently been proposed to realize secure signatures of quantum or classical messages. Arbitrated quantum signature, as one nontrivial scheme, has attracted great interest because of its usefulness and efficiency. Unfortunately, previous schemes cannot resist Trojan horse and DoS attacks, and lack unforgeability and non-repudiation. In this paper, we propose an improved arbitrated quantum signature that addresses these security issues with an honest arbitrator. Our scheme uses qubit states rather than entangled states. More importantly, the qubit scheme achieves unforgeability and non-repudiation. Our scheme is also secure against other known quantum attacks.

  2. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems
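The belief/plausibility pair that evidence theory assigns to a proposition can be computed directly from a basic probability assignment over focal sets: belief sums the mass of focal sets wholly contained in the proposition, plausibility the mass of those merely intersecting it. The failure-load frame below is a hypothetical example, not one of the Sandia challenge problems:

```python
def belief(masses, a):
    # Bel(A): total mass of focal elements wholly contained in A.
    return sum(m for fs, m in masses.items() if fs <= a)

def plausibility(masses, a):
    # Pl(A): total mass of focal elements intersecting A.
    return sum(m for fs, m in masses.items() if fs & a)

# Hypothetical body of evidence over the frame {low, mid, high}:
# an expert commits mass to sets, not points, expressing epistemic
# (not aleatory) uncertainty about a failure load.
masses = {
    frozenset({"low"}): 0.2,
    frozenset({"low", "mid"}): 0.5,
    frozenset({"low", "mid", "high"}): 0.3,
}
a = frozenset({"low", "mid"})
```

Here Bel(A) = 0.7 and Pl(A) = 1.0; the [Bel, Pl] interval is the discontinuous uncertain measure that, as the abstract notes, must be smoothed by surrogates before gradient-based optimization.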

  3. Research on Signature Verification Method Based on Discrete Fréchet Distance

    Science.gov (United States)

    Fang, J. L.; Wu, W.

    2018-05-01

    This paper proposes a multi-feature signature template based on discrete Fréchet distance, which breaks through the limitation of traditional signature authentication using a single signature feature. It reduces the computational workload of global feature template extraction in online handwritten signature authentication and addresses the problem of unreasonable signature feature selection. In this experiment, the false acceptance rate (FAR) and false rejection rate (FRR) of the signatures are computed and the average equal error rate (AEER) is calculated. The feasibility of the combined template scheme is verified by comparing the average equal error rates of the combined template and the original template.
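For reference, the discrete Fréchet distance itself admits a compact dynamic-programming implementation (the Eiter–Mannila recursion over index pairs). The sketch below is a generic 2-D version for comparing pen trajectories, not the paper's multi-feature template code:

```python
from math import hypot

def discrete_frechet(p, q):
    # Eiter–Mannila dynamic programme: ca[i][j] is the discrete
    # Fréchet distance between prefixes p[:i+1] and q[:j+1].
    n, m = len(p), len(q)
    def d(i, j):
        return hypot(p[i][0] - q[j][0], p[i][1] - q[j][1])
    ca = [[0.0] * m for _ in range(n)]
    ca[0][0] = d(0, 0)
    for i in range(1, n):                       # first column
        ca[i][0] = max(ca[i - 1][0], d(i, 0))
    for j in range(1, m):                       # first row
        ca[0][j] = max(ca[0][j - 1], d(0, j))
    for i in range(1, n):
        for j in range(1, m):
            ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]),
                           d(i, j))
    return ca[n - 1][m - 1]
```

Unlike a pointwise distance, this tolerates the uneven pen speeds of genuine signatures, which is why Fréchet-style matching suits online signature verification.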

  4. Rethinking energy security in Asia. A non-traditional view of human security

    Energy Technology Data Exchange (ETDEWEB)

    Caballero-Anthony, Mely [Nanyang Technological Univ., Singapore (SG). Centre for Non-Traditional Security (NTS) Studies; Chang, Youngho [Nanyang Technological Univ., Singapore (Singapore). Division of Economics; Putra, Nur Azha (eds.) [National Univ. of Singapore (Singapore). Energy Security Division

    2012-07-01

    Traditional notions of security are premised on the primacy of state security. In relation to energy security, traditional policy thinking has focused on ensuring supply without much emphasis on socioeconomic and environmental impacts. Non-traditional security (NTS) scholars argue that threats to human security have become increasingly prominent since the end of the Cold War, and that it is thus critical to adopt a holistic and multidisciplinary approach in addressing rising energy needs. This volume represents the perspectives of scholars from across Asia, looking at diverse aspects of energy security through a non-traditional security lens. The issues covered include environmental and socioeconomic impacts, the role of the market, the role of civil society, energy sustainability and policy trends in the ASEAN region.

  5. [Individualized clinical treatment from the prospective of hepatotoxicity of non-toxic traditional Chinese medicine].

    Science.gov (United States)

    Yang, Nan; Chen, Juan; Hou, Xue-Feng; Song, Jie; Feng, Liang; Jia, Xiao-Bin

    2017-04-01

    Traditional Chinese medicine has a long history in clinical application, and has been proved to be safe and effective. In recent years, the toxicity and side-effects caused by western medicine have attracted much attention. As a result, increasing numbers of people have shifted their attention to traditional Chinese medicine. Nonetheless, owing to the natural origin of traditional Chinese medicines and a lack of basic knowledge about them, many people mistakenly assume that traditional Chinese medicines are absolutely safe, except for well-known toxic ones such as arsenic. However, according to clinical practice and recent studies, great importance should be attached to the toxicity of nominally non-toxic traditional Chinese medicines, in particular their hepatotoxicity. Relevant studies indicate that the toxicity of non-toxic traditional Chinese medicine is closely correlated with individual gene polymorphism and constitution. By discussing the causes and mechanisms of the hepatotoxicity induced by non-toxic traditional Chinese medicine in clinical practice, this article aims to provide new ideas for individualized clinical therapy with traditional Chinese medicine and to give guidance for its rational and safe use. Copyright© by the Chinese Pharmaceutical Association.

  6. Access to and Use of Export Market Information by Non- Traditional ...

    African Journals Online (AJOL)

    Ghana has traditionally depended on a number of export commodities such as cocoa, timber, gold and diamonds for its economic and social development. Recent economic policies of government have aimed to expand the country's exports to include non-traditional exports such as horticultural products, textiles, fishery ...

  7. Perceived constraints by non-traditional users on the Mt. Baker-Snoqualmie National Forest

    Science.gov (United States)

    Elizabeth A. Covelli; Robert C. Burns; Alan Graefe

    2007-01-01

    The purpose of this study was to investigate the constraints that non-traditional users face, along with the negotiation strategies that are employed in order to start, continue, or increase participation in recreation on a national forest. Non-traditional users were defined as respondents who were not Caucasian. Additionally, both constraints and negotiation...

  8. The perfect storm of information: combining traditional and non-traditional data sources for public health situational awareness during hurricane response.

    Science.gov (United States)

    Bennett, Kelly J; Olsen, Jennifer M; Harris, Sara; Mekaru, Sumiko; Livinski, Alicia A; Brownstein, John S

    2013-12-16

    Hurricane Isaac made landfall in southeastern Louisiana in late August 2012, resulting in extensive storm surge and inland flooding. As the lead federal agency responsible for medical and public health response and recovery coordination, the Department of Health and Human Services (HHS) must have situational awareness to prepare for and address state and local requests for assistance following hurricanes. Both traditional and non-traditional data have been used to improve situational awareness in fields like disease surveillance and seismology. This study investigated whether non-traditional data (i.e., tweets and news reports) fill a void in traditional data reporting during hurricane response, as well as whether non-traditional data improve the timeliness for reporting identified HHS Essential Elements of Information (EEI). HHS EEIs provided the information collection guidance, and when the information indicated there was a potential public health threat, an event was identified and categorized within the larger scope of overall Hurricane Isaac situational awareness. Tweets, news reports, press releases, and federal situation reports during Hurricane Isaac response were analyzed for information about EEIs. Data that pertained to the same EEI were linked together and given a unique event identification number to enable more detailed analysis of source content. Reports of sixteen unique events were examined for types of data sources reporting on the event and timeliness of the reports. Of these sixteen unique events identified, six were reported by only a single data source, four were reported by two data sources, four were reported by three data sources, and two were reported by four or more data sources. For five of the events where news tweets were one of multiple sources of information about an event, the tweet occurred prior to the news report, press release, local government/emergency management tweet, and federal situation report. In all circumstances where

  9. Do Ghanaian non-traditional exporters understand the importance of ...

    African Journals Online (AJOL)

    Do Ghanaian non-traditional exporters understand the importance of sales ... The older the firm in export business, the more likely it was for management to put in ... taking into consideration other factors like internet use and planning of sales ...

  10. Nuclear forensics of a non-traditional sample: Neptunium

    International Nuclear Information System (INIS)

    Doyle, Jamie L.; Schwartz, Daniel; Tandon, Lav

    2016-01-01

    Recent nuclear forensics cases have focused primarily on plutonium (Pu) and uranium (U) materials. By definition however, nuclear forensics can apply to any diverted nuclear material. This includes neptunium (Np), an internationally safeguarded material like Pu and U, that could offer a nuclear security concern if significant quantities were found outside of regulatory control. This case study couples scanning electron microscopy (SEM) with quantitative analysis using newly developed specialized software, to evaluate a non-traditional nuclear forensic sample of Np. Here, the results of the morphological analyses were compared with another Np sample of known pedigree, as well as other traditional actinide materials in order to determine potential processing and point-of-origin

  11. Uncertainty and Sensitivity Analysis of Filtration Models for Non-Fickian transport and Hyperexponential deposition

    DEFF Research Database (Denmark)

    Yuan, Hao; Sin, Gürkan

    2011-01-01

    Uncertainty and sensitivity analyses are carried out to investigate the predictive accuracy of filtration models for describing non-Fickian transport and hyperexponential deposition. Five different modeling approaches, involving the elliptic equation with different types of distributed filtration coefficients and the CTRW equation expressed in Laplace space, are selected to simulate eight experiments. These experiments involve both porous media and colloid-medium interactions of different heterogeneity degrees. The uncertainty of elliptic equation predictions with distributed filtration coefficients is larger than that with a single filtration coefficient. The uncertainties of model predictions from the elliptic equation and the CTRW equation in Laplace space are minimal for solute transport. Higher uncertainties of parameter estimation and model outputs are observed in the cases with the porous...

  12. Basis for the implementation of digital signature in Argentine's health environment

    International Nuclear Information System (INIS)

    Escobar, P P; Formica, M

    2007-01-01

    The growth of telemedical applications and electronic transactions in health environments is paced by constant technology evolution. This implies a big cultural change in traditional medicine and among users of hospital information systems, whose arrival is delayed, basically, by the lack of solid laws and a well-defined role-based infrastructure. The use of digital signature as a means of identification, authentication, confidentiality and non-repudiation is the most suitable tool for securing electronic transactions and protecting patients' data. The implementation of a Public Key Infrastructure (PKI) in a health environment allows for authentication, encryption and the use of digital signature to ensure confidentiality and control over the movement of sensitive information. This work defines the minimum technological, legal and procedural basis for a successful PKI implementation and establishes the roles of the different actors in the chain of trust in the public health environment of Argentina.
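The sign/verify asymmetry at the heart of such a PKI can be illustrated with a deliberately toy RSA keypair (61 × 53, utterly insecure, for exposition only; a real deployment would use a vetted cryptographic library with 2048-bit or larger keys):

```python
import hashlib

# Toy RSA keypair: N = 61 * 53, public exponent E, private exponent D,
# with E * D = 1 (mod lcm(60, 52)). Illustration only; no security.
N, E, D = 3233, 17, 2753

def sign(message: bytes) -> int:
    # The signer hashes the message and exponentiates the digest with
    # the private key D; only the key holder can produce this value.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(digest, D, N)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public key (N, E) can check the signature,
    # which yields authentication, integrity and non-repudiation.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == digest
```

What a PKI adds to this primitive is the certificate chain binding (N, E) to a verified identity, which is exactly the role-based chain of trust the article argues must be defined before deployment.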

  13. Renewable energy sources. Non-traditional actors on the international market

    International Nuclear Information System (INIS)

    1999-01-01

    Five of Sweden's technical attachés have investigated the activity of non-traditional actors within the field of renewable energy sources. The countries studied are the USA, Japan, France, Germany and Great Britain.

  14. Student Media Usage Patterns and Non-Traditional Learning in Higher Education

    Science.gov (United States)

    Zawacki-Richter, Olaf; Müskens, Wolfgang; Krause, Ulrike; Alturki, Uthman; Aldraiweesh, Ahmed

    2015-01-01

    A total of 2,338 students at German universities participated in a survey, which investigated media usage patterns of so-called traditional and non-traditional students (Schuetze & Wolter, 2003). The students provided information on the digital devices that they own or have access to, and on their usage of media and e-learning tools and…

  15. Blind Quantum Signature with Blind Quantum Computation

    Science.gov (United States)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2017-04-01

    Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a laconic structure is designed. Different from traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. Inputs of blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information over imperfect channels, whereas the receiver can verify the validity of the signature using the quantum matching algorithm. The security is guaranteed by the entanglement of the quantum system used for blind quantum computation. It provides a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  16. Smuggling, non-fundamental uncertainty, and parallel market exchange rate volatility

    OpenAIRE

    Richard Clay Barnett

    2003-01-01

    We explore a model where smuggling and a parallel currency market arise, owing to government restrictions that prevent agents from legally holding foreign exchange. Despite such restrictions, agents are able to diversify their savings, holding both domestic and parallel foreign cash, basing their portfolio allocation on current and prospective parallel exchange rates. We attribute movements in parallel rates to non-fundamental uncertainty. The model generates equilibria with both positive and...

  17. A small set of succinct signature patterns distinguishes Chinese and non-Chinese HIV-1 genomes.

    Directory of Open Access Journals (Sweden)

    Yan Wang

    The epidemiology of HIV-1 in China has unique features that may have led to unique viral strains. We therefore tested the hypothesis that it is possible to find distinctive patterns in HIV-1 genomes sampled in China. Using a rule inference algorithm we could indeed extract from sequences of the third variable loop (V3) of HIV-1 gp120 a set of 14 signature patterns that distinguished Chinese from non-Chinese sequences with 89% accuracy. These patterns were found to be specific to HIV-1 subtype, i.e. sequences complying with pattern 1 were of subtype B, pattern 2 almost exclusively covered sequences of subtype 01_AE, etc. We then analyzed the first of these signature patterns in depth, namely that L and W at two V3 positions occur specifically in Chinese sequences of subtype B/B' (3% false positives). This pattern was found to be in agreement with the phylogeny of HIV-1 of subtype B inside and outside of China. We could neither reject nor convincingly confirm that the pattern is stabilized by immune escape. For further interpretation of the signature pattern we used the recently developed measure of Direct Information, and in this way discovered evidence for physical interactions between V2 and V3. We conclude with a discussion of the limitations of signature patterns and the applicability of the approach to other genomic regions and other countries.
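A signature pattern of the kind described (fixed residues at fixed V3 positions) reduces to a simple predicate over an aligned sequence. The positions and residues below are hypothetical placeholders, not the paper's actual rule set:

```python
def matches_signature(v3_seq, pattern):
    # `pattern` maps 0-based aligned V3 positions to required residues,
    # e.g. {3: "L", 7: "W"} for an "L and W at two positions" rule.
    # Positions past the end of the sequence never match.
    return all(len(v3_seq) > pos and v3_seq[pos] == aa
               for pos, aa in pattern.items())

rule = {3: "L", 7: "W"}   # hypothetical positions, for illustration
```

A rule-inference algorithm of the sort cited searches over such (position, residue) conjunctions, scoring each candidate rule by how cleanly it separates the two sequence classes.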

  18. Analysis of an indirect neutron signature for enhanced UF6 cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA-NT). HEVA-NT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA-NT in terms of the individual contributions from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings for the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA-NT signature to manipulation by the nearby placement of neutron-conversion materials.

  19. A Secure and Efficient Certificateless Short Signature Scheme

    Directory of Open Access Journals (Sweden)

    Lin Cheng

    2013-07-01

    Certificateless public key cryptography combines the advantages of traditional public key cryptography and identity-based public key cryptography, as it avoids the use of certificates and resolves the key escrow problem. In 2007, Huang et al. classified adversaries against certificateless signatures according to their attack power into normal, strong and super adversaries (ordered by increasing attack power). In this paper, we propose a new certificateless short signature scheme and prove that it is secure against both super type I and super type II adversaries. Our new scheme not only achieves the strongest security level but also has the shortest signature length (one group element). Compared with other short certificateless signature schemes of a similar security level, our new scheme has a lower operation cost.

  20. Drifting Apart or Converging? Grades among Non-Traditional and Traditional Students over the Course of Their Studies: A Case Study from Germany

    Science.gov (United States)

    Brändle, Tobias; Lengfeld, Holger

    2017-01-01

    Since 2009, German universities have been opened by law to freshmen who do not possess the traditional graduation certificate required for entry into university, but who are vocationally qualified. In this article, we track the grades of these so-called non-traditional students and compare them with those of traditional students using a…

  1. Affordable non-traditional source data mining for context assessment to improve distributed fusion system robustness

    Science.gov (United States)

    Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael

    2013-05-01

    This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize model quality. These methods can also measure the conformity of non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources, and can therefore improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data, enabling operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture effectively supports assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and 'big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and a demonstration of context assessment of non-traditional data compared against an intelligence, surveillance and reconnaissance fusion product based upon an IED POIs workflow.

  2. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  3. The Impact of Intrinsic and Extrinsic Motivation on the Academic Achievement of Non-Traditional Undergraduate Students

    Science.gov (United States)

    Arce, Alma Lorenia

    2017-01-01

    Non-traditional students have become a growing component of the student population in today's college systems. Research has shown that non-traditional students are less likely to achieve academically and complete their degree programs compared to traditional students. The purpose of this quantitative, correlational study was to investigate the…

  4. Study of the uncertainty in estimation of the exposure of non-human biota to ionising radiation.

    Science.gov (United States)

    Avila, R; Beresford, N A; Agüero, A; Broed, R; Brown, J; Iospje, M; Robles, B; Suañez, A

    2004-12-01

    Uncertainty in estimations of the exposure of non-human biota to ionising radiation may arise from a number of sources including values of the model parameters, empirical data, measurement errors and biases in the sampling. The significance of the overall uncertainty of an exposure assessment will depend on how the estimated dose compares with reference doses used for risk characterisation. In this paper, we present the results of a study of the uncertainty in estimation of the exposure of non-human biota using some of the models and parameters recommended in the FASSET methodology. The study was carried out for semi-natural terrestrial, agricultural and marine ecosystems, and for four radionuclides (137Cs, 239Pu, 129I and 237Np). The parameters of the radionuclide transfer models showed the highest sensitivity and contributed the most to the uncertainty in the predictions of doses to biota. The most important ones were related to the bioavailability and mobility of radionuclides in the environment, for example soil-to-plant transfer factors, the bioaccumulation factors for marine biota and the gut uptake fraction for terrestrial mammals. In contrast, the dose conversion coefficients showed low sensitivity and contributed little to the overall uncertainty. Radiobiological effectiveness contributed to the overall uncertainty of the dose estimations for alpha emitters although to a lesser degree than a number of transfer model parameters.
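
    The dominant role of transfer-parameter uncertainty described above can be illustrated with a minimal Monte Carlo propagation. The sketch below (Python) assumes a hypothetical lognormal soil-to-plant transfer factor and a fixed dose conversion coefficient; all numerical values are illustrative stand-ins, not FASSET parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical lognormal soil-to-plant transfer factor (TF); the dose
# conversion coefficient (DCC) is treated as well known, mirroring the
# paper's finding that DCCs contribute little to overall uncertainty.
soil_activity = 1000.0                                      # Bq/kg (assumed known)
tf = rng.lognormal(mean=np.log(0.02), sigma=0.7, size=N)    # soil-to-plant TF
dcc = 5e-4                                                  # uGy/h per Bq/kg (fixed)

plant_activity = soil_activity * tf     # Bq/kg in the plant
dose_rate = plant_activity * dcc        # uGy/h in a toy exposure model

print(f"median dose rate: {np.median(dose_rate):.4f} uGy/h")
print(f"95th percentile : {np.percentile(dose_rate, 95):.4f} uGy/h")
```

The skewed percentile spread comes entirely from the transfer factor, which is the pattern the study reports for the transfer-model parameters.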

  5. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. By considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
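
    A one-way ANOVA of the kind described can be sketched as follows; the colony counts and the analyst bias below are synthetic illustrations, not the paper's data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Synthetic CFU counts read from comparable plates by three analysts;
# analyst B reads systematically high, standing in for the
# "individual reading and interpreting errors" factor.
analyst_a = rng.poisson(lam=50, size=20)
analyst_b = rng.poisson(lam=65, size=20)
analyst_c = rng.poisson(lam=50, size=20)

stat, p = f_oneway(analyst_a, analyst_b, analyst_c)
print(f"F = {stat:.2f}, p = {p:.2e}")  # small p: the analyst factor is significant
```

A full factorial ANOVA (microorganism x product x analyst) would additionally expose the interactions the abstract mentions.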

  6. Analysis of an indirect neutron signature for enhanced UF6 cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA-NT). HEVA-NT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA-NT in terms of the individual contributions to HEVA-NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA-NT signature to manipulation by the nearby placement of neutron-conversion materials.

  7. The Pleasures and Pitfalls of a Non-traditional Occupation.

    Science.gov (United States)

    Scott, Robert E.

    Both men and women who engage in non-traditional occupations (occupations in which 80 percent or more of the participants are of the opposite sex) are generally happy with their occupational choice, according to interviews with seventy such women and ten men. The women, however, experienced more discrimination and sexual harassment, while the men…

  8. Large scale applicability of a Fully Adaptive Non-Intrusive Spectral Projection technique: Sensitivity and uncertainty analysis of a transient

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Lathouwers, Danny; Kloosterman, Jan Leen; Hagen, Tim van der

    2014-01-01

    Highlights: • Grid and basis adaptive Polynomial Chaos techniques are presented for S and U analysis. • Dimensionality reduction and incremental polynomial order reduce computational costs. • An unprotected loss of flow transient is investigated in a Gas Cooled Fast Reactor. • S and U analysis is performed with MC and adaptive PC methods, for 42 input parameters. • PC accurately estimates means, variances, PDFs, sensitivities and uncertainties. - Abstract: Since the early years of reactor physics the most prominent sensitivity and uncertainty (S and U) analysis methods in the nuclear community have been adjoint based techniques. While these are very effective for pure neutronics problems due to the linearity of the transport equation, they become complicated when coupled non-linear systems are involved. With the continuous increase in computational power such complicated multi-physics problems are becoming progressively tractable, hence affordable and easily applicable S and U analysis tools also have to be developed in parallel. For reactor physics problems for which adjoint methods are prohibitive Polynomial Chaos (PC) techniques offer an attractive alternative to traditional random sampling based approaches. At TU Delft such PC methods have been studied for a number of years and this paper presents a large scale application of our Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm for performing the sensitivity and uncertainty analysis of a Gas Cooled Fast Reactor (GFR) Unprotected Loss Of Flow (ULOF) transient. The transient was simulated using the Cathare 2 code system and a fully detailed model of the GFR2400 reactor design that was investigated in the European FP7 GoFastR project. Several sources of uncertainty were taken into account amounting to an unusually high number of stochastic input parameters (42) and numerous output quantities were investigated. The results show consistently good performance of the applied adaptive PC
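
    The random-sampling baseline that adaptive PC methods are compared against can be sketched in a few lines. The toy two-input model below is a stand-in for the coupled transient code; the brute-force double-loop estimate of a first-order Sobol sensitivity index shown here is exactly the kind of computation whose cost explodes with 42 stochastic inputs.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x1, x2):
    # Toy stand-in for the coupled transient code: Y = 2*x1 + x2,
    # with x1, x2 ~ N(0,1). Analytically, S1 = 4/5 = 0.8.
    return 2.0 * x1 + x2

# Brute-force first-order Sobol index S1 = Var(E[Y|x1]) / Var(Y):
# outer loop over x1, inner Monte Carlo average over the other input.
n_outer, n_inner = 2000, 500
x1 = rng.standard_normal(n_outer)
cond_means = np.array(
    [model(v, rng.standard_normal(n_inner)).mean() for v in x1]
)

total_var = model(rng.standard_normal(200_000),
                  rng.standard_normal(200_000)).var()
S1 = cond_means.var() / total_var
print(f"S1 ~= {S1:.2f} (analytic value 0.80)")
```

With a million model evaluations for a single index of a two-input toy, the appeal of a spectral surrogate for a 42-input reactor transient is clear.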

  9. Uncertainty relations and topological-band insulator transitions in 2D gapped Dirac materials

    International Nuclear Information System (INIS)

    Romera, E; Calixto, M

    2015-01-01

    Uncertainty relations are studied for a characterization of topological-band insulator transitions in 2D gapped Dirac materials isostructural with graphene. We show that the relative or Kullback–Leibler entropy in position and momentum spaces, and the standard variance-based uncertainty relation give sharp signatures of topological phase transitions in these systems. (paper)
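
    The relative (Kullback–Leibler) entropy used as a transition marker above is, in the discrete case, a one-line computation. The sketch below is a generic illustration with toy distributions, not the actual position/momentum densities of the Dirac materials.

```python
import numpy as np

def kl_entropy(p, q):
    """Relative (Kullback-Leibler) entropy D(p||q) of discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy example: a distribution that sharpens (localizes) has a growing
# relative entropy with respect to a uniform reference -- the kind of
# quantity that jumps sharply across a topological phase transition.
uniform = np.full(4, 0.25)
broad   = np.array([0.3, 0.25, 0.25, 0.2])
sharp   = np.array([0.7, 0.2, 0.05, 0.05])

print(kl_entropy(broad, uniform))   # small: close to the reference
print(kl_entropy(sharp, uniform))   # larger: strongly localized
```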

  10. The Revival of Non-Traditional State Actors' Interests in Africa

    DEFF Research Database (Denmark)

    Kragelund, Peter

    2012-01-01

    credit ratings make external finance available for African governments. This article examines how non-traditional state actors affect the possibility of African governments setting and funding their own development priorities. It argues that while the current situation may increase the policy autonomy...

  11. Non-fragile observer design for discrete-time genetic regulatory networks with randomly occurring uncertainties

    International Nuclear Information System (INIS)

    Banu, L Jarina; Balasubramaniam, P

    2015-01-01

    This paper investigates the problem of non-fragile observer design for a class of discrete-time genetic regulatory networks (DGRNs) with time-varying delays and randomly occurring uncertainties. A non-fragile observer is designed, for estimating the true concentration of mRNAs and proteins from available measurement outputs. One important feature of the results obtained that are reported here is that the parameter uncertainties are assumed to be random and their probabilities of occurrence are known a priori. On the basis of the Lyapunov–Krasovskii functional approach and using a convex combination technique, a delay-dependent estimation criterion is established for DGRNs in terms of linear matrix inequalities (LMIs) that can be efficiently solved using any available LMI solver. Finally numerical examples are provided to substantiate the theoretical results. (paper)

  12. Review of traditional and non-traditional medicinal genetic resources in the USDA, ARS, PGRCU collection evaluated for flavonoid concentrations and anthocyanin indexes

    Science.gov (United States)

    Non-traditional medicinal species include velvetleaf (Abutilon theophrasti Medik.), Desmodium species, and Teramnus labialis (L.f.) Spreng., while the traditional species is roselle (Hibiscus sabdariffa L.). There is a need to identify plant sources of flavonoids and anthocyanins since they have s...

  13. Quantum signature scheme based on a quantum search algorithm

    International Nuclear Information System (INIS)

    Yoon, Chun Seok; Kang, Min Sung; Lim, Jong In; Yang, Hyung Jin

    2015-01-01

    We present a quantum signature scheme based on a two-qubit quantum search algorithm. For secure transmission of signatures, we use a quantum search algorithm that has not been used in previous quantum signature schemes. A two-step protocol secures the quantum channel, and a trusted center guarantees non-repudiation that is similar to other quantum signature schemes. We discuss the security of our protocol. (paper)

  14. The secret of René Guénon: A critical review of Guénon's traditionalism

    Directory of Open Access Journals (Sweden)

    Poznanović Željko

    2014-01-01

    This paper aims to present the basic principles of the doctrine of René Guénon (1886-1951), a French thinker of broad and comprehensive scope when it comes to Hinduism, the Western occult tradition, Freemasonry, Taoism, symbolism and certain aspects of metaphysics. His teaching did not leave a mark on the mainstream of philosophy, yet it founded a whole syncretic movement known as Traditionalism or Perennialism. While Guénon's doctrine is commonly either ignored or accepted as unquestionably true, a critical approach to it is very rare, especially criticism of the logical coherence of the system itself. Although Guénon was not a transparently systematic thinker, his own beliefs, just like any other, cannot be self-contradictory. Special attention is given to the contradiction of his doctrine with Islam, despite the fact that Guénon was a declared Muslim. In this paper we have particularly shown the socio-political implications of Guénon's doctrine, supporting our claims with both Guénon's own statements and the statements of his supporters as well as his critics, ranging from moderate to the most severe. We hope we have shown that in his doctrine there is a hidden coherent system that leads to conclusions and objectives that are precisely determined, regardless of apparent contradictions.

  15. Developing an efficient decision support system for non-traditional machine selection: an application of MOORA and MOOSRA

    Directory of Open Access Journals (Sweden)

    Asis Sarkar

    2015-01-01

    The purpose of this paper is to find an efficient decision support method for non-traditional machine selection. It analyzes potential non-traditional machine selection attributes with a relatively new MCDM approach, the MOORA and MOOSRA methods. The MOORA and MOOSRA methods are adopted to tackle the subjective evaluation of information collected from an expert group. An example case study is shown here for better understanding of the selection module, which can be effectively applied to any other decision-making scenario. The method is not only computationally simple, easily comprehensible and robust, but can also accommodate numerous subjective attributes. The rankings are expected to provide good guidance to the managers of an organization in selecting a feasible non-traditional machine. They should also provide good insight for non-traditional machine manufacturers, encouraging research work concerning non-traditional machine selection.

  16. Reaching the Non-Traditional Stopout Population: A Segmentation Approach

    Science.gov (United States)

    Schatzel, Kim; Callahan, Thomas; Scott, Crystal J.; Davis, Timothy

    2011-01-01

    An estimated 21% of 25-34-year-olds in the United States, about eight million individuals, have attended college and quit before completing a degree. These non-traditional students may or may not return to college. Those who return to college are referred to as stopouts, whereas those who do not return are referred to as stayouts. In the face of…

  17. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  18. Export contracts for non-traditional products: Chayote from Costa Rica

    NARCIS (Netherlands)

    Saénz, F.; Ruben, R.

    2004-01-01

    This paper focuses on the determinants of market and contract choice for non-traditional crops and the possibilities for involving local producers in global agro-food chains through delivery relationships with packers and brokers. Main attention is given to the importance of quality for entering the

  19. Methods and apparatus for multi-parameter acoustic signature inspection

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Aaron A [Richland, WA; Samuel, Todd J [Pasco, WA; Valencia, Juan D [Kennewick, WA; Gervais, Kevin L [Richland, WA; Tucker, Brian J [Pasco, WA; Kirihara, Leslie J [Richland, WA; Skorpik, James R [Kennewick, WA; Reid, Larry D [Benton City, WA; Munley, John T [Benton City, WA; Pappas, Richard A [Richland, WA; Wright, Bob W [West Richland, WA; Panetta, Paul D [Richland, WA; Thompson, Jason S [Richland, WA

    2007-07-24

    A multiparameter acoustic signature inspection device and method are described for non-invasive inspection of containers. Dual acoustic signatures discriminate among various fluids and materials, enabling their identification.

  20. Chemical signature study of tupiguarani ceramic tradition from Central region of the Rio Grande do Sul state, Brazil

    International Nuclear Information System (INIS)

    Bona, Irene Akemy Tomiyoshi

    2006-01-01

    In this work, a model based on experimental results using chemical composition data of pottery sherds, applied to Spearman's non-parametric test, principal component analysis and discriminant analysis, was developed. The samples are soils and Tupiguarani Tradition pottery sherds from the central area of the Rio Grande do Sul State. The chemical elements Al, Ba, Ca, Cr, Fe, K, Mn, Pb, Rb, S, Si, Sr, Ti, V and Zn were determined by energy-dispersive X-ray fluorescence (EDXRF), while Ce, Cu, Gd, La, Nd, Pr, Sm, Th and Y were determined by high-resolution inductively coupled plasma mass spectrometry (HR-ICP-MS). Relationships among the pottery characteristics, the studied sites and the sherd dispersion across the several sites were proposed. Indications of a chemical signature distinguishing small pottery intended or not intended for use over fire were observed. The largest dispersion is that of small pottery with non-corrugated surface treatment. The chemical fingerprints of the potteries from the Ijui River, Ibicui-Vacacai Mirim River and Jacui River were verified. (author)

  1. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes

    International Nuclear Information System (INIS)

    Pestana, Rafael Cardoso Baptistini

    2013-01-01

    Environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996 and is routinely used as a complementary measure to strengthen traditional nuclear safeguards procedures. The aim is to verify that states signatory to the safeguards agreements are not diverting their peaceful nuclear activities to undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples for identification of nuclear signatures that may be related to the nuclear activities developed in the inspected facility. A real uranium conversion plant of the nuclear fuel cycle at IPEN was used as a case study. The proposed strategy uses different analytical techniques, such as an alpha radiation meter, SEM-EDX and ICP-MS, to identify signatures of uranium adhered to the swipe samples. In the swipe sample analysis, it was possible to identify particles of UO2F2 and UF4 through morphological comparison and the semi-quantitative analyses performed by the SEM-EDX technique. Methods yielding the average isotopic composition of the sample were used, in which the enrichment ranged from (1.453 ± 0.023)% to (18.24 ± 0.15)% in the 235U isotope. Through these external collections, a non-intrusive form of sampling, it was possible to identify the handling of enriched material with enrichments from (1.453 ± 0.023)% to (6.331 ± 0.055)% in the 235U isotope, as well as the use of reprocessed material, through identification of the 236U isotope. The uncertainties obtained for the n(235U)/n(238U) ratio varied from 0.40% to 0.86% for the internal swipe samples. (author)
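
    As a small aside on the reported quantities, an n(235U)/n(238U) ratio converts to an approximate enrichment as follows. The helper below neglects the minor isotopes 234U and 236U and works in atom fractions; it is an illustrative simplification, not the method used in the thesis.

```python
def enrichment_percent(r_235_238):
    """Approximate 235U enrichment (atom %) from the n(235U)/n(238U) ratio,
    neglecting the minor isotopes 234U and 236U."""
    return 100.0 * r_235_238 / (1.0 + r_235_238)

# Natural uranium, n(235U)/n(238U) ~= 0.00725, comes out near 0.72 atom %:
print(round(enrichment_percent(0.00725), 3))
```

For small ratios the relative uncertainty of the enrichment is essentially that of the ratio itself, which is why the 0.40-0.86% ratio uncertainties translate directly into the enrichment error bars quoted above.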

  2. Uncertainty quantification for criticality problems using non-intrusive and adaptive Polynomial Chaos techniques

    International Nuclear Information System (INIS)

    Gilli, L.; Lathouwers, D.; Kloosterman, J.L.; Hagen, T.H.J.J. van der; Koning, A.J.; Rochman, D.

    2013-01-01

    Highlights: ► Non-intrusive spectral techniques are applied to perform UQ of criticality problems. ► A new adaptive algorithm based on the definition of sparse grid is derived. ► The method is applied to two reference criticality problems. - Abstract: In this paper we present the implementation and the application of non-intrusive spectral techniques for uncertainty analysis of criticality problems. Spectral techniques can be used to reconstruct stochastic quantities of interest by means of a Fourier-like expansion. Their application to uncertainty propagation problems can be performed in a non-intrusive fashion by evaluating a set of projection integrals that are used to reconstruct the spectral expansion. This can be done either by using standard Monte Carlo integration approaches or by adopting numerical quadrature rules. We present the derivation of a new adaptive quadrature algorithm, based on the definition of a sparse grid, which can be used to reduce the computational cost associated with non-intrusive spectral techniques. This new adaptive algorithm and the Monte Carlo integration alternative are then applied to two reference problems. First, a stochastic multigroup diffusion problem is introduced by considering the microscopic cross-sections of the system to be random quantities. Then a criticality benchmark is defined for which a set of resonance parameters in the resolved region are assumed to be stochastic
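
    The projection step the abstract refers to, evaluating integrals of the response against an orthogonal polynomial basis, can be sketched for a single Gaussian input using Gauss-Hermite quadrature. The exponential response below is a toy stand-in for a criticality calculation, chosen because its mean and variance are known exactly.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def pc_coefficients(f, order, nquad):
    """Non-intrusive projection of f(xi), xi ~ N(0,1), onto probabilists'
    Hermite polynomials He_k via Gauss-Hermite quadrature."""
    x, w = hermegauss(nquad)          # nodes/weights for weight exp(-x^2/2)
    w = w / sqrt(2.0 * pi)            # normalize: weights now sum to 1
    fx = f(x)
    coeffs = []
    for k in range(order + 1):
        one_hot = np.zeros(k + 1); one_hot[k] = 1.0
        hk = hermeval(x, one_hot)     # He_k evaluated at the nodes
        coeffs.append(np.sum(w * fx * hk) / factorial(k))   # <He_k, He_k> = k!
    return np.array(coeffs)

# Toy response f(xi) = exp(xi): exact mean e^0.5, exact variance e^2 - e.
c = pc_coefficients(np.exp, order=8, nquad=20)
mean = c[0]
var = sum(c[k] ** 2 * factorial(k) for k in range(1, len(c)))
print(f"PC mean {mean:.4f} (exact 1.6487), variance {var:.4f} (exact 4.6708)")
```

The adaptive sparse-grid algorithm of the paper replaces the full tensor quadrature used here, which is what makes the approach affordable in many dimensions.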

  3. Application of quantile functions for the analysis and comparison of gas pressure balance uncertainties

    Directory of Open Access Journals (Sweden)

    Ramnath Vishal

    2017-01-01

    Traditionally in the field of pressure metrology, uncertainty quantification was performed with the use of the Guide to the Expression of Uncertainty in Measurement (GUM); however, with the introduction of GUM Supplement 1 (GS1), the use of Monte Carlo simulations has become an accepted practice for uncertainty analysis in metrology for mathematical models in which the underlying assumptions of the GUM are not valid. Consequently, the use of quantile functions was developed as a means to easily summarize and report uncertainty results that were based on Monte Carlo simulations. In this paper, we consider the case of a piston-cylinder operated pressure balance where the effective area is modelled in terms of a combination of explicit/implicit and linear/non-linear models, and show how quantile functions may be applied to analyse results and compare uncertainties from a mixture of GUM and GS1 methodologies.
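
    A GS1-style Monte Carlo run summarized through quantiles can be sketched as below; the effective-area model and all input distributions are illustrative assumptions, not the paper's pressure-balance data.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 200_000

# Toy effective-area model A = A0 * (1 + b*p): zero-pressure area A0 and
# distortion coefficient b are uncertain inputs (values are illustrative).
A0 = rng.normal(1.96e-5, 2e-10, M)      # m^2
b  = rng.normal(4.0e-12, 5e-13, M)      # 1/Pa
p  = 5.0e6                              # Pa, nominal operating pressure

A = A0 * (1.0 + b * p)                  # Monte Carlo sample of the output

# The quantile (inverse CDF) function summarizes the MC output compactly;
# Q(0.025) and Q(0.975) bound a 95% coverage interval.
probs = [0.025, 0.5, 0.975]
q = np.quantile(A, probs)
for pr, v in zip(probs, q):
    print(f"Q({pr}) = {v:.6e} m^2")
```

Because the quantiles are read directly off the empirical distribution, no Gaussian assumption is needed, which is precisely the case GS1 addresses.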

  4. Quantum signature scheme for known quantum messages

    International Nuclear Information System (INIS)

    Kim, Taewan; Lee, Hyang-Sook

    2015-01-01

    When we want to sign a quantum message that we create, we can use arbitrated quantum signature schemes, which can sign not only known quantum messages but also unknown ones. However, since arbitrated quantum signature schemes need the help of a trusted arbitrator in each verification of the signature, they are known to be inconvenient in practical use. If we consider only known quantum messages, as in the situation above, a quantum signature scheme with a more efficient structure can exist. In this paper, we present a new quantum signature scheme for known quantum messages without the help of an arbitrator. Differing from arbitrated quantum signature schemes based on the quantum one-time pad with a symmetric key, our scheme is based on quantum public-key cryptosystems, so the validity of the signature can be verified by a receiver without the help of an arbitrator. Moreover, we show that our scheme provides the functions of quantum message integrity, user authentication and non-repudiation of origin, as in digital signature schemes. (paper)

  5. Isotopic source signatures: Impact of regional variability on the δ13CH4 trend and spatial distribution

    Science.gov (United States)

    Feinberg, Aryeh I.; Coulon, Ancelin; Stenke, Andrea; Schwietzke, Stefan; Peter, Thomas

    2018-02-01

    The atmospheric methane growth rate has fluctuated over the past three decades, signifying variations in methane sources and sinks. Methane isotopic ratios (δ13CH4) differ between emission categories, and can therefore be used to distinguish which methane sources have changed. However, isotopic modelling studies have mainly focused on uncertainties in methane emissions rather than uncertainties in isotopic source signatures. We simulated atmospheric δ13CH4 for the period 1990-2010 using the global chemistry-climate model SOCOL. Empirically-derived regional variability in the isotopic signatures was introduced in a suite of sensitivity simulations. These simulations were compared to a baseline simulation with commonly used global mean isotopic signatures. We investigated coal, natural gas/oil, wetland, livestock, and biomass burning source signatures to determine whether regional variations impact the observed isotopic trend and spatial distribution. Based on recently published source signature datasets, our calculated global mean isotopic signatures are in general lighter than the commonly used values. Trends in several isotopic signatures were also apparent during the period 1990-2010. Tropical livestock emissions grew during the 2000s, introducing isotopically heavier livestock emissions since tropical livestock consume more C4 vegetation than midlatitude livestock. Chinese coal emissions, which are isotopically heavy compared to other coals, increase during the 2000s leading to higher global values of δ13CH4 for coal emissions. EDGAR v4.2 emissions disagree with the observed atmospheric isotopic trend for almost all simulations, confirming past doubts about this emissions inventory. The agreement between the modelled and observed δ13CH4 interhemispheric differences improves when regional source signatures are used. Even though the simulated results are highly dependent on the choice of methane emission inventories, they emphasize that the commonly used
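
    The sensitivity of the global mean signature to category weights can be seen in a simple flux-weighted average; the emissions and δ13C values below are illustrative round numbers, not the datasets used in the paper.

```python
# Emission-weighted global mean delta13C source signature.
# Emissions in Tg CH4/yr and signatures in per mil are illustrative only.
sources = {
    #             emission, delta13C
    "wetlands":   (180.0, -61.0),
    "livestock":  (100.0, -65.0),
    "fossil":     (110.0, -44.0),
    "biomass_b":  ( 30.0, -22.0),
}

total = sum(e for e, d in sources.values())
mean_signature = sum(e * d for e, d in sources.values()) / total
print(f"flux-weighted mean source signature: {mean_signature:.1f} per mil")
```

Shifting even a modest flux between categories with very different signatures (e.g. fossil vs. wetlands) moves this weighted mean, which is why regional signature variability matters for the modelled trend.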

  6. Commercialized non-Camellia tea: traditional function and molecular identification

    Directory of Open Access Journals (Sweden)

    Ping Long

    2014-06-01

    Non-Camellia tea is a part of the colorful Chinese tea culture, and is also widely used in folk practice as a beverage and medicine for disease prevention and treatment. In this study, 37 samples were collected, including 33 kinds of non-Camellia teas and 4 kinds of teas (Camellia). The traditional functions of non-Camellia teas were investigated. Furthermore, the original plants of non-Camellia teas were characterized and identified by molecular methods. Four candidate regions (rbcL, matK, ITS2, psbA-trnH) were amplified by polymerase chain reaction. In addition, DNA barcodes were used for the first time to discriminate commercial non-Camellia teas from their adulterants, and to evaluate their safety. This study showed that BLASTN and the relevant phylogenetic trees are efficient tools for identification of commercial non-Camellia teas and their adulterants. However, some sequences from the original plants have not been found, and the number of original-plant sequences in GenBank is limited. Submitting more original-plant sequences to GenBank will be helpful for evaluating the safety of non-Camellia teas.

  7. Comparison of a traditional and non-traditional residential care facility for persons living with dementia and the impact of the environment on occupational engagement.

    Science.gov (United States)

    Richards, Kieva; D'Cruz, Rachel; Harman, Suzanne; Stagnitti, Karen

    2015-12-01

    Dementia residential facilities can be described as traditional or non-traditional facilities. Non-traditional facilities aim to utilise principles of environmental design to create a milieu that supports persons experiencing cognitive decline. This study aimed to compare these two environments in rural Australia, and their influence on residents' occupational engagement. The Residential Environment Impact Survey (REIS) was used and consists of: a walk-through of the facility; activity observation; interviews with residents and employees. Thirteen residents were observed and four employees interviewed. Resident interviews did not occur given the population diagnosis of moderate to severe dementia. Descriptive data from the walk-through and activity observation were analysed for potential opportunities of occupational engagement. Interviews were thematically analysed to discern perception of occupational engagement of residents within their facility. Both facilities provided opportunities for occupational engagement. However, the non-traditional facility provided additional opportunities through employee interactions and features of the physical environment. Interviews revealed six themes: Comfortable environment; roles and responsibilities; getting to know the resident; more stimulation can elicit increased engagement; the home-like experience and environmental layout. These themes coupled with the features of the environment provided insight into the complexity of occupational engagement within this population. This study emphasises the influence of the physical and social environment on occupational engagement opportunities. A non-traditional dementia facility maximises these opportunities and can support development of best-practice guidelines within this population. © 2015 Occupational Therapy Australia.

  8. Non-Traditional Authorship Attribution Studies of William Shakespeare’s Canon: Some Caveats

    Directory of Open Access Journals (Sweden)

    Joseph Rudman

    2016-03-01

    The paper looks at the problems in conducting non-traditional authorship attribution studies on the canon of William Shakespeare. After a short introduction, the case is put forth that these studies are ‘scientific’ and must adhere to the tenets of the scientific method. By showing that a complete and valid experimental plan is necessary and pointing out the many and varied pitfalls (e.g., the text, the control groups, the treatment of errors), it becomes clear what a valid study of Shakespearean non-traditional authorship attribution demands. I then come to the conclusion that such a valid study is not attainable within the limits of present-day knowledge.

  9. Using Options to Manage Dynamic Uncertainty in Acquisition Projects

    National Research Council Canada - National Science Library

    Ceylan, B. K; Ford, David N

    2002-01-01

    Uncertainty in acquisition projects and environments can degrade performance. Traditional project planning, management tools, and methods can effectively deal with uncertainties in relatively stable environments...

  10. A novel biometric signature: multi-site, remote (> 100 m) photo-plethysmography using ambient light

    NARCIS (Netherlands)

    Verkruijsse, W.; Bodlaender, M.P.

    2010-01-01

    We propose a novel biometric signature based on the principle of multi-site remote photo-plethysmography (remote PPG). This signature can be measured using video cameras. Traditional PPG uses a contact sensor to measure slight skin color changes that occur with every heartbeat. Recently we have

  11. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values are calculated of the model residuals and stored in a 'three-dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the larger public is not interested in in-depth information on the uncertainty in the predicted water levels, but is only interested in information on the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.) each with their advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is by probabilistic flood mapping. These maps give a representation of the
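The residual-classification scheme summarised in this abstract can be sketched in a few lines. This is an illustrative simplification: bin edges, the percentile choices and the nearest-bin lookup below are invented for the example, and the paper itself uses full 3D interpolation in the error matrix rather than a plain lookup.

```python
# Sketch: historical forecast residuals are binned by forecasted water
# level and lead time, percentiles are computed per bin, and a new
# forecast gets an uncertainty band from its bin (nearest-bin lookup
# here; the cited work interpolates in the full 3D error matrix).
from bisect import bisect_right
from statistics import quantiles

def build_error_matrix(records, level_edges, lead_edges, n=20):
    """records: iterable of (forecast_level, lead_time, residual)."""
    bins = {}
    for level, lead, res in records:
        key = (bisect_right(level_edges, level), bisect_right(lead_edges, lead))
        bins.setdefault(key, []).append(res)
    matrix = {}
    for key, residuals in bins.items():
        qs = quantiles(residuals, n=n)            # n-quantile cut points
        matrix[key] = (qs[0], qs[n // 2 - 1], qs[-1])   # ~5th/50th/95th pct
    return matrix

def uncertainty_band(matrix, level_edges, lead_edges, level, lead):
    key = (bisect_right(level_edges, level), bisect_right(lead_edges, lead))
    lo, med, hi = matrix[key]
    return level + lo, level + med, level + hi    # forecast plus residual band
```

With symmetric residuals the median of the band falls on the forecast itself, while the outer percentiles widen with the historical spread in that class.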

  12. Impact of Uncertainties in the Cosmological Parameters on the Measurement of Primordial non-Gaussianity

    CERN Document Server

    Liguori, M

    2008-01-01

    We study the impact of cosmological parameters' uncertainties on estimates of the primordial NG parameter f_NL in local and equilateral models of non-Gaussianity. We show that propagating these errors increases the f_NL relative uncertainty by 16% for WMAP and 5% for Planck in the local case, whereas for equilateral configurations the correction terms are 14% and 4%, respectively. If we assume for local f_NL a central value of order 60, according to recent WMAP 5-year estimates, we obtain for Planck a final correction \Delta f_NL = 3. Although not dramatic, this correction is at the level of the expected estimator uncertainty for Planck, and should then be taken into account when quoting the significance of an eventual future detection. In current estimates of f_NL the cosmological parameters are held fixed at their best-fit values. We finally note that the impact of uncertainties in the cosmological parameters on the final f_NL error bar would become totally negligible if the parameters were allowed to vary...

  13. Equivalence of the Traditional and Non-Standard Definitions of Concepts from Real Analysis

    Directory of Open Access Journals (Sweden)

    John Cowles

    2014-06-01

    Full Text Available ACL2(r) is a variant of ACL2 that supports the irrational real and complex numbers. Its logical foundation is based on internal set theory (IST), an axiomatic formalization of non-standard analysis (NSA). Familiar ideas from analysis, such as continuity, differentiability, and integrability, are defined quite differently in NSA–some would argue the NSA definitions are more intuitive. In previous work, we have adopted the NSA definitions in ACL2(r), and simply taken as granted that these are equivalent to the traditional analysis notions, e.g., to the familiar epsilon-delta definitions. However, we argue in this paper that there are circumstances when the more traditional definitions are advantageous in the setting of ACL2(r), precisely because the traditional notions are classical, so they are unencumbered by IST limitations on inference rules such as induction or the use of pseudo-lambda terms in functional instantiation. To address this concern, we describe a formal proof in ACL2(r) of the equivalence of the traditional and non-standard definitions of these notions.

  14. Electroencephalographic Evidence of Abnormal Anticipatory Uncertainty Processing in Gambling Disorder Patients.

    Science.gov (United States)

    Megías, Alberto; Navas, Juan F; Perandrés-Gómez, Ana; Maldonado, Antonio; Catena, Andrés; Perales, José C

    2018-06-01

    Putting money at stake produces anticipatory uncertainty, a process that has been linked to key features of gambling. Here we examined how learning and individual differences modulate the stimulus preceding negativity (SPN, an electroencephalographic signature of perceived uncertainty of valued outcomes) in gambling disorder patients (GDPs) and healthy controls (HCs), during a non-gambling contingency learning task. Twenty-four GDPs and 26 HCs performed a causal learning task under conditions of high and medium uncertainty (HU, MU; null and positive cue-outcome contingency, respectively). Participants were asked to predict the outcome trial-by-trial, and to regularly judge the strength of the cue-outcome contingency. A pre-outcome SPN was extracted from simultaneous electroencephalographic recordings for each participant, uncertainty level, and task block. The two groups similarly learnt to predict the occurrence of the outcome in the presence/absence of the cue. In HCs, SPN amplitude decreased as the outcome became predictable in the MU condition, a decrement that was absent in the HU condition, where the outcome remained unpredictable during the task. Most importantly, GDPs' SPN remained high and insensitive to task type and block. In GDPs, the SPN amplitude was linked to gambling preferences. When both groups were considered together, SPN amplitude was also related to impulsivity. GDPs thus showed an abnormal electrophysiological response to outcome uncertainty, not attributable to faulty contingency learning. Differences with controls were larger in frequent players of passive games, and smaller in players of more active games. Potential psychological mechanisms underlying this set of effects are discussed.

  15. Construction of Graduation Certificate Issuing System Based on Digital Signature Technique

    Directory of Open Access Journals (Sweden)

    Mohammed Issam Younis

    2015-06-01

    Full Text Available With the development of computer architecture and its technologies in recent years, applications like e-commerce, e-government, e-governance and e-finance are widely used, and they act as active research areas. In addition, in order to increase the quality and quantity of the ordinary everyday transactions, it is desired to migrate from the paper-based environment to a digital-based computerized environment. Such migration increases efficiency, saves time, eliminates paperwork, increases safety and reduces the cost in an organization. Digital signatures are playing an essential role in many electronic and automatic based systems and facilitate this migration. The digital signatures are used to provide many services and solutions that would not have been possible by the conventional hand-written signature. In the educational environment, the process of issuing the graduation certificates can no longer be restricted to the traditional methods. Hence, a computerized system for issuing certificates of graduation in an electronic form is needed and desired. This paper proposes a Graduation Certificates Issuing System (GCIS) based on digital signature technology. In doing so, this research highlights the state-of-the-art and the art-of-the-practice for some existing digital signature-based systems in the literature. In addition, eight intertwined elected services are identified, namely: message authentication, entity authentication, integrity, non-repudiation, time stamping, distinguished signing authorities, delegating signing capability and supporting workflow systems. Moreover, this research examines nine existing systems, showing their merits and demerits in terms of these elected services. Furthermore, the research describes the architectural design using the Unified Modeling Language (UML) and provides the concrete implementation of the proposed GCIS. The GCIS is implemented using Visual Basic.Net programming language and SQL Server database.
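The core signing/verification service such a certificate-issuing system relies on can be illustrated with a toy example. This is not the GCIS implementation (which uses Visual Basic.Net and SQL Server); the tiny RSA key and the field-canonicalisation rule below are purely for demonstration, and a production system must use a vetted cryptography library with full-size keys.

```python
# Toy digital-signature service for certificate records: the registrar
# signs a hash of the canonicalised certificate fields with its private
# exponent; anyone can verify with the public exponent. Demo-size RSA
# key only (p=61, q=53) -- never use such parameters in practice.
import hashlib

N, E, D = 3233, 17, 2753   # n = p*q, public exponent e, private exponent d

def cert_digest(cert: dict) -> int:
    canonical = "|".join(f"{k}={cert[k]}" for k in sorted(cert))
    return int.from_bytes(hashlib.sha256(canonical.encode()).digest(), "big") % N

def sign(cert: dict) -> int:
    return pow(cert_digest(cert), D, N)              # registrar-only operation

def verify(cert: dict, signature: int) -> bool:
    return pow(signature, E, N) == cert_digest(cert) # public operation
```

Any change to a signed field (name, degree, year) alters the digest, so the stored signature no longer verifies, which is what gives the issuing system its integrity and non-repudiation properties.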

  16. Lighting the Gym: A Guide to Illuminating Non-Traditional Spaces.

    Science.gov (United States)

    Womack, Jennifer; Nelson, Steve

    2000-01-01

    Covers all the steps needed to light an open, non-traditional performance space--everything from where to locate lights, support towers, and power sources, to cable and dimmer requirements. Covers safety issues, equipment costs, what students should and should not be allowed to do, and how to deal with electricians and rental companies. (SC)

  17. Non-Participatory Intervention in a Traditional Participatory Organization

    DEFF Research Database (Denmark)

    Jønsson, Thomas; Jeppesen, Hans Jeppe

    2009-01-01

    The aim of the present study is to investigate employee attitudes to non-participatory (top-down) changes in an organizational environment that has hitherto been participatory. Until now, research has traditionally investigated the effects of increased organizational influence on employee attitudes and behaviour. This study takes the opposite approach by looking at a decrease in influence. The study was undertaken in a production company with 480 employees. The work was organized in production lines and semi-autonomous working groups. Data was compiled via interviews with selected employees from three kinds of production areas: areas that had implemented 1) all of the planned changes; 2) some of the changes; or 3) only a few of the changes. The results show that the employees’ reactions to the non-participatory change process addressed the decrease of influence and the consequences thereof; i...

  18. NON-TRADITIONAL SPORTS AT SCHOOL. BENEFITS FOR PHYSICAL AND MOTOR DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    AMADOR J. LARA-SÁNCHEZ

    2010-12-01

    Full Text Available Physical Education teachers have been using some very classic team sports, like football, basketball, handball, volleyball, etc. for many years in order to develop their education work at school. As a consequence of this, the benefits of this kind of activities on Physical Education lessons have not been as notable as we might have expected, since, even if they are increasing, their development and application are still low. There are many and very varied new non-traditional sports that have emerged and extended across Spain in recent years. To mention an example, we could refer to a newly created non-traditional sport such as kin-ball. This sport was created for the purpose of achieving a way to combine several factors such as health, team-work and competitiveness. Three teams of four players each participate. This way, every player can participate to a great extent in all the moves of the match, for each of them must defend one area of their half in order to achieve a common objective. Besides, kin-ball helps to develop motor skills at school in an easy way; that is, coordination, balance and perception. There is a large variety of non-traditional games and sports that are similar to kin-ball, such as floorball, intercrosse, mazaball, tchoukball, ultimate, indiaca, shuttleball... All of them show many physical, psychic and social advantages, and can help us to make the Physical Education teaching-learning process more motivating, acquiring the recreational component that it showed some years ago and which has now disappeared.

  19. An exploration of on-line access by non-traditional students in higher education: a case study.

    Science.gov (United States)

    Dearnley, Chris; Dunn, Ginny; Watson, Sue

    2006-07-01

    The nature of Higher Education (HE) has seen many changes throughout the last decade. The agenda for widening participation in HE has led to an increase in the number of students with a broader range of educational backgrounds. At the same time there has been a surge in the development of digitalisation and the convergence of computing and telecommunications technologies available for use in education. This paper discusses the outcomes of a case study, conducted in a School of Health Studies within a northern English University, which identified the extent to which 'non-traditional' students access on-line learning facilities, such as virtual learning environments and library networks, and what factors enhanced or formed barriers to access. 'Non-traditional' students, for the purpose of this study, were defined as mature students who were returning to higher education after a considerable break. The outcomes indicated that skill deficit is a major obstacle for many 'non-traditional' students. The paper explores this issue in depth and suggests potential ways forward for the delivery of technology supported learning for 'non-traditional' students in Higher Education.

  20. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

    Full Text Available Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  1. A new non-specificity measure in evidence theory based on belief intervals

    Institute of Scientific and Technical Information of China (English)

    Yang Yi; Han Deqiang; Jean Dezert

    2016-01-01

    In the theory of belief functions, the measure of uncertainty is an important concept, which is used for representing some types of uncertainty incorporated in bodies of evidence such as the discord and the non-specificity. For the non-specificity part, some traditional measures use for reference the Hartley measure in classical set theory; other traditional measures use the simple and heuristic function for joint use of mass assignments and the cardinality of focal elements. In this paper, a new non-specificity measure is proposed using lengths of belief intervals, which represent the degree of imprecision. Therefore, it has more intuitive physical meaning. It can be proved that our new measure can be rewritten in a general form for the non-specificity. Our new measure is also proved to be a strict non-specificity measure with some desired properties. Numerical examples, simulations, the related analyses and proofs are provided to show the characteristics and good properties of the new non-specificity definition. An example of an application of the new non-specificity measure is also presented.
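The belief-interval idea can be made concrete with a small sketch. Note the hedge: the aggregation rule below (averaging singleton interval lengths) is one plausible interval-based index chosen for illustration, not necessarily the measure defined in the paper.

```python
# Sketch of an interval-based non-specificity index: for a mass function
# over subsets of the frame, the belief-plausibility interval
# [Bel({x}), Pl({x})] of each singleton expresses imprecision; averaging
# the interval lengths gives a simple index (illustrative aggregation).
def bel(mass, a):
    return sum(m for b, m in mass.items() if b <= a)      # b subset of a

def pl(mass, a):
    return sum(m for b, m in mass.items() if b & a)       # b intersects a

def interval_nonspecificity(mass, frame):
    return sum(pl(mass, frozenset({x})) - bel(mass, frozenset({x}))
               for x in frame) / len(frame)
```

The vacuous body of evidence (all mass on the whole frame) gives the maximal index 1, while a Bayesian body of evidence (all mass on singletons) gives 0, matching the intuition that belief intervals of zero length carry no imprecision.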

  2. Patterns of gender-role behaviour in children attending traditional and non-traditional day-care centres.

    Science.gov (United States)

    Cole, H J; Zucker, K J; Bradley, S J

    1982-08-01

    Using a sex-typed free-play task and the Draw-a-Person test, the gender-role behaviour of children attending a day-care centre whose staff adhered to a "non-sexist" child-rearing philosophy was compared to the gender-role behaviour of children attending a more traditional day-care centre. Parental provision of sex-typed and neutral toys and approval of cross-sex role behaviour was also assessed. On both measures, the two groups of children showed culturally typical patterns of gender-role behaviour. The parents of the two groups of children were generally similar in terms of the kinds of toys they provided and in their attitudes toward the expression of cross-sex role behaviour. Potential explanations for the inability to demonstrate effects of the "non-sexist" child-rearing philosophy were discussed.

  3. Uncertainty propagation in life cycle assessment of biodiesel versus diesel: global warming and non-renewable energy.

    Science.gov (United States)

    Hong, Jinglan

    2012-06-01

    Uncertainty information is essential for the proper use of life cycle assessment and environmental assessments in decision making. To investigate the uncertainties of biodiesel and determine the level of confidence in the assertion that biodiesel is more environmentally friendly than diesel, an explicit analytical approach based on the Taylor series expansion for lognormal distribution was applied in the present study. A biodiesel case study demonstrates the probability that biodiesel has a lower global warming and non-renewable energy score than diesel, that is 92.3% and 93.1%, respectively. The results indicate the level of confidence in the assertion that biodiesel is more environmentally friendly than diesel based on the global warming and non-renewable energy scores. Copyright © 2011 Elsevier Ltd. All rights reserved.
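The kind of probability statement quoted in this abstract ("biodiesel has a lower score than diesel with 92.3% probability") has a closed form once the scores are treated as lognormal, as in the Taylor-series approach mentioned. The sketch below uses that standard lognormal-comparison result; the parameter names and any numbers fed to it are illustrative, not the study's data.

```python
# P(X_b < X_d) for two independent, approximately lognormal impact
# scores: ln(X_d/X_b) is normal, so the probability is a normal CDF
# evaluated at the log-ratio of medians over the combined log-spread.
from math import erf, log, sqrt

def prob_lower(median_b, gsd_b, median_d, gsd_d):
    """Probability that score b is below score d, given medians and
    geometric standard deviations (GSD > 1) of the two lognormals."""
    mu = log(median_d) - log(median_b)               # mean of ln(X_d/X_b)
    sigma = sqrt(log(gsd_b) ** 2 + log(gsd_d) ** 2)  # its std deviation
    return 0.5 * (1 + erf(mu / (sigma * sqrt(2))))   # normal CDF at mu/sigma
```

Equal medians give exactly 0.5; a biodiesel median at half the diesel median with modest spread pushes the probability well above 90%, the regime the abstract reports.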

  4. Magnetic signature surveillance of nuclear fuel

    International Nuclear Information System (INIS)

    Bernatowicz, H.; Schoenig, F.C.

    1981-01-01

    Typical nuclear fuel material contains tramp ferromagnetic particles of random size and distribution. Also, selected amounts of paramagnetic or ferromagnetic material can be added at random or at known positions in the fuel material. The fuel material in its non-magnetic container is scanned along its length by magnetic susceptibility detecting apparatus whereby susceptibility changes along its length are obtained and provide a unique signal waveform of the container of fuel material as a signature thereof. The output signature is stored. At subsequent times in its life the container is again scanned and respective signatures obtained which are compared with the initially obtained signature, any differences indicating alteration or tampering with the fuel material. If the fuel material includes a paramagnetic additive by taking two measurements along the container the effects thereof can be cancelled out. (author)
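The surveillance step described above (re-scan, then compare against the stored signature) can be sketched as a waveform comparison. The correlation statistic and the 0.95 threshold below are illustrative choices, not the patent's actual detection criterion.

```python
# Sketch: the stored susceptibility scan is the container's signature;
# a later re-scan is compared via Pearson correlation, and a low score
# flags possible alteration or tampering (threshold illustrative).
from math import sqrt

def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def tampered(stored_scan, new_scan, threshold=0.95):
    return correlation(stored_scan, new_scan) < threshold
```

An unchanged container reproduces its signature almost exactly, while removing or moving one of the ferromagnetic peaks drops the correlation sharply.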

  5. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory

    2009-01-01

    The security risk associated with malevolent acts such as those of terrorism are often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results which are important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.

  6. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    Directory of Open Access Journals (Sweden)

    Rígel Licier

    2016-10-01

    Full Text Available The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, broncho-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  8. Secure Mobile Agent from Leakage-Resilient Proxy Signatures

    Directory of Open Access Journals (Sweden)

    Fei Tang

    2015-01-01

    Full Text Available A mobile agent can sign a message in a remote server on behalf of a customer without exposing its secret key; it can be used not only to search for special products or services, but also to make a contract with a remote server. Hence a mobile agent system can be used for electronic commerce as an important key technology. In order to realize such a system, Lee et al. showed that a secure mobile agent can be constructed using proxy signatures. Intuitively, a proxy signature permits an entity (delegator) to delegate its signing right to another entity (proxy) to sign some specified messages on behalf of the delegator. However, the proxy signatures are often used in scenarios where the signing is done in an insecure environment, for example, the remote server of a mobile agent system. In such a setting, an adversary could launch side-channel attacks to exploit some leakage information about the proxy key or even other secret states. The proxy signatures which are secure in the traditional security models obviously cannot provide such security. Based on this consideration, in this paper, we design a leakage-resilient proxy signature scheme for the secure mobile agent systems.

  9. Digital "Testimonio" as a Signature Pedagogy for Latin@ Studies

    Science.gov (United States)

    Benmayor, Rina

    2012-01-01

    This article proposes the curricular integration of digital "testimonio" as a "signature" pedagogy in Latin@ Studies. The "testimonio" tradition of urgent narratives and the creative multimedia languages of digital storytelling--text, voice, image, and sound--invite historically marginalized subjects, especially younger generations, to author and…

  10. What is the uncertainty principle of non-relativistic quantum mechanics?

    Science.gov (United States)

    Riggs, Peter J.

    2018-05-01

    After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.

  11. Barriers to Blended Digital Distance Vocational Learning for Non-Traditional Students

    Science.gov (United States)

    Safford, Kimberly; Stinton, Julia

    2016-01-01

    This research identifies and examines the challenges of blending digital distance and vocational learning for non-traditional and low-socio-economic status students who are new to university education. A survey of students in vocational primary education and early years qualifications in a distance university is illuminated by interviews with…

  12. The effects of different representations on static structure analysis of computer malware signatures.

    Science.gov (United States)

    Narayanan, Ajit; Chen, Yi; Pang, Shaoning; Tao, Ban

    2013-01-01

    The continuous growth of malware presents a problem for internet computing due to increasingly sophisticated techniques for disguising malicious code through mutation and the time required to identify signatures for use by antiviral software systems (AVS). Malware modelling has focused primarily on semantics due to the intended actions and behaviours of viral and worm code. The aim of this paper is to evaluate a static structure approach to malware modelling using the growing malware signature databases now available. We show that, if malware signatures are represented as artificial protein sequences, it is possible to apply standard sequence alignment techniques in bioinformatics to improve accuracy of distinguishing between worm and virus signatures. Moreover, aligned signature sequences can be mined through traditional data mining techniques to extract metasignatures that help to distinguish between viral and worm signatures. All bioinformatics and data mining analyses were performed using publicly available tools, including Weka.
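The representation step (byte signature to protein-like sequence) can be sketched as follows. The nibble-to-letter mapping is a simplification invented for this example, and `difflib.SequenceMatcher` merely stands in for the real bioinformatics alignment tools the paper applies.

```python
# Sketch: re-encode a byte-level malware signature over an
# amino-acid-style alphabet (one letter per 4-bit nibble -- an
# illustrative mapping), then compare sequences with a generic
# similarity measure as a stand-in for proper sequence alignment.
from difflib import SequenceMatcher

ALPHABET = "ACDEFGHIKLMNPQRS"   # 16 amino-acid letters, one per nibble value

def to_protein(signature: bytes) -> str:
    return "".join(ALPHABET[b >> 4] + ALPHABET[b & 0xF] for b in signature)

def similarity(sig_a: bytes, sig_b: bytes) -> float:
    return SequenceMatcher(None, to_protein(sig_a), to_protein(sig_b)).ratio()
```

Once signatures live in a residue alphabet, alignment scores (and columns of aligned sequences) become features that ordinary data-mining tools such as Weka can consume, which is the pipeline the abstract describes.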

  13. Active visual search in non-stationary scenes: coping with temporal variability and uncertainty

    Science.gov (United States)

    Ušćumlić, Marija; Blankertz, Benjamin

    2016-02-01

    Objective. State-of-the-art experiments for studying neural processes underlying visual cognition often constrain sensory inputs (e.g., static images) and our behavior (e.g., fixed eye-gaze, long eye fixations), isolating or simplifying the interaction of neural processes. Motivated by the non-stationarity of our natural visual environment, we investigated the electroencephalography (EEG) correlates of visual recognition while participants overtly performed visual search in non-stationary scenes. We hypothesized that visual effects (such as those typically used in human-computer interfaces) may increase temporal uncertainty (with reference to fixation onset) of cognition-related EEG activity in an active search task and therefore require novel techniques for single-trial detection. Approach. We addressed fixation-related EEG activity in an active search task with respect to stimulus-appearance styles and dynamics. Alongside popping-up stimuli, our experimental study embraces two composite appearance styles based on fading-in, enlarging, and motion effects. Additionally, we explored whether the knowledge obtained in the pop-up experimental setting can be exploited to boost the EEG-based intention-decoding performance when facing transitional changes of visual content. Main results. The results confirmed our initial hypothesis that the dynamic of visual content can increase temporal uncertainty of the cognition-related EEG activity in active search with respect to fixation onset. This temporal uncertainty challenges the pivotal aim to keep the decoding performance constant irrespective of visual effects. Importantly, the proposed approach for EEG decoding based on knowledge transfer between the different experimental settings gave a promising performance. Significance. Our study demonstrates that the non-stationarity of visual scenes is an important factor in the evolution of cognitive processes, as well as in the dynamic of ocular behavior (i.e., dwell time and

  14. Does Twitter trigger bursts in signature collections?

    Science.gov (United States)

    Yamaguchi, Rui; Imoto, Seiya; Kami, Masahiro; Watanabe, Kenji; Miyano, Satoru; Yuji, Koichiro

    2013-01-01

    The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit to exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo," increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. In order to estimate the contributions of social media, we developed a statistical model with state-space modeling framework that distinguishes the contributions of multiple social media in time-series of collected public opinions. We applied the model to the time-series of signature counts of the campaign and quantified contributions of two social media, i.e., Twitter and an Internet forum, by the estimation. We found that a considerable portion (78%) of the signatures was affected by either of the social media throughout the campaign and the Twitter effect (26%) was smaller than the Forum effect (52%) in total, although Twitter probably triggered the initial two bursts of signatures. Comparisons of the estimated profiles of both effects suggested distinctions between the social media in terms of sustainable impact of messages or tweets. Twitter shows messages on various topics on a time-line; newer messages push out older ones. Twitter may diminish the impact of messages that are tweeted intermittently. The quantification of social media impacts is beneficial to better understand people's tendency and may promote developing strategies to engage public opinions effectively. Our proposed method is a promising tool to explore information hidden in social phenomena.
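The paper's attribution is done with a state-space model; as a much simpler illustration of the same idea, one can regress daily signature counts on tweet and forum-post counts and read the fitted coefficients as per-medium contributions. Everything below (the no-intercept model, the data) is an invented simplification, not the authors' method.

```python
# Minimal two-predictor ordinary least squares via the normal equations:
# y ~ b1*x1 + b2*x2 (no intercept), where y is daily signature counts
# and x1, x2 are daily tweet and forum-post counts.
def ols_two_predictors(y, x1, x2):
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, y))
    s2y = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    return b1, b2
```

Shares analogous to the abstract's 26%/52% split would then be `b1*sum(x1)/sum(y)` and `b2*sum(x2)/sum(y)`; the state-space formulation additionally handles time-varying effects and observation noise, which this sketch ignores.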

  15. Does Twitter trigger bursts in signature collections?

    Directory of Open Access Journals (Sweden)

    Rui Yamaguchi

    Full Text Available INTRODUCTION: The quantification of social media impacts on societal and political events is a difficult undertaking. The Japanese Society of Oriental Medicine started a signature-collecting campaign to oppose a medical policy of the Government Revitalization Unit that would exclude a traditional Japanese medicine, "Kampo," from the public insurance system. The signature count showed a series of aberrant bursts from November 26 to 29, 2009. In the same interval, the number of messages on Twitter including the keywords "Signature" and "Kampo" increased abruptly. Moreover, the number of messages on an Internet forum that discussed the policy and called for signatures showed a train of spikes. METHODS AND FINDINGS: In order to estimate the contributions of social media, we developed a statistical model within a state-space modeling framework that distinguishes the contributions of multiple social media in a time series of collected public opinions. We applied the model to the time series of the campaign's signature counts and estimated the contributions of two social media, Twitter and an Internet forum. We found that a considerable portion (78%) of the signatures was influenced by one of the two social media throughout the campaign, and that the Twitter effect (26%) was smaller overall than the forum effect (52%), although Twitter probably triggered the initial two bursts of signatures. Comparison of the estimated profiles of the two effects suggested a distinction between the social media in terms of the sustained impact of messages or tweets. Twitter displays messages on many topics on a timeline, where newer messages push out older ones; this may diminish the impact of messages that are tweeted only intermittently. CONCLUSIONS: Quantifying social media impacts helps us better understand people's tendencies and may support the development of strategies to engage public opinion effectively. Our proposed method is a promising tool to explore information hidden in social phenomena.

  16. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact of sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a high reduction of computing time by factors on the order of 100. (authors)
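
    The two-series idea can be sketched in a toy setting (not XSUSA/KENO): because the aleatoric Monte Carlo noise in two independently seeded runs of the same epistemic sample is uncorrelated, the sample covariance between the two series estimates the epistemic variance alone, even when each run uses few histories. All numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 20000          # epistemic samples (e.g. sampled nuclear data sets)
sigma_epistemic = 2.0      # spread of the "true" responses across samples
sigma_aleatoric = 5.0      # large MC noise from using few histories per run

# "True" response of each epistemic sample (unknown in practice)
theta = rng.normal(10.0, sigma_epistemic, n_samples)

# Two independently seeded Monte Carlo runs per epistemic sample
y1 = theta + rng.normal(0.0, sigma_aleatoric, n_samples)
y2 = theta + rng.normal(0.0, sigma_aleatoric, n_samples)

var_total = np.var(y1, ddof=1)                 # epistemic + aleatoric variance
var_epistemic = np.cov(y1, y2, ddof=1)[0, 1]   # aleatoric parts cancel on average
```

    Here the covariance estimate recovers the epistemic variance (4.0 in this toy) despite each "run" carrying aleatoric noise more than six times larger in variance.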

  17. New and non-traditional mineral raw materials deposits, perspectives of use

    International Nuclear Information System (INIS)

    Beyseev, O.; Beyseev, A.; Baichigasov, I.; Sergev, E.; Shakirova, G.

    1996-01-01

    Deposits of new and non-traditional kinds of mineral raw materials have been revealed, explored and prepared for industrial recovery in Kazakstan; these can be used within the conversion process to create new materials with unique properties, to lay the basis for the elaboration of new technologies, and to achieve appreciable economic benefit. These deposits are mostly located in geographic and economic conditions of advanced infrastructure and an established network of mining works, favorable for recovery. On the test results, the following are of greatest interest: RHODUCITE, NEMALITE-CONTAINING CHRYSOTILE-ASBESTOS, NICKEL-CONTAINING SILICATE-ASBOLAN ORES, MEDICINAL MINERALS, SHUNGITES, FULLERENES, and RAW QUARTZ MINERALS - a group of deposits containing 5 million tons of high-quality quartz suitable for the manufacture of cut-glass and fibre-optic articles has been explored in detail. There are also deposits of other kinds of non-traditional strategic mineral raw materials in the Republic of Kazakstan - natural fillers that can be used in the national economy and bring considerable economic benefit: chrysotile-asbestos, amphibole-asbestos, talc, wollastonite, tremolite, actinolite, vermiculite, zeolite, etc.

  18. Quantum uncertainty relation based on the mean deviation

    OpenAIRE

    Sharma, Gautam; Mukhopadhyay, Chiranjib; Sazim, Sk; Pati, Arun Kumar

    2018-01-01

    Traditional forms of quantum uncertainty relations are invariably based on the standard deviation. This can be understood in the historical context of the simultaneous development of quantum theory and mathematical statistics. Here, we present alternative forms of uncertainty relations, in both state dependent and state independent forms, based on the mean deviation. We illustrate the robustness of this formulation in situations where the standard deviation based uncertainty relation is inapplicable...
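
    For context, the standard-deviation-based relation the paper departs from is the textbook Robertson bound; the mean deviation replaces the quadratic spread measure with an absolute one. The sketch below gives only these standard definitions, not the paper's new bounds:

```latex
\[
\Delta A\,\Delta B \;\ge\; \tfrac{1}{2}\bigl|\langle[A,B]\rangle\bigr|,
\qquad
\Delta A = \sqrt{\langle A^{2}\rangle - \langle A\rangle^{2}},
\]
whereas the mean deviation replaces this quadratic measure of spread by
\[
D(A) \;=\; \bigl\langle\,\lvert A - \langle A\rangle\rvert\,\bigr\rangle .
\]
```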

  19. Calculation of the detection limit in radiation measurements with systematic uncertainties

    International Nuclear Information System (INIS)

    Kirkpatrick, J.M.; Russ, W.; Venkataraman, R.; Young, B.M.

    2015-01-01

    The detection limit (L_D) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed-form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case.
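
    A minimal numerical sketch of the statistical part of such a calculation, assuming Poisson counting with a known background and ignoring systematic terms (which is the simplification the paper goes beyond, not the authors' full model):

```python
import math

def poisson_cdf(k, mu):
    """P(N <= k) for N ~ Poisson(mu), accumulated iteratively to avoid overflow."""
    term = math.exp(-mu)   # P(N = 0)
    total = term
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def detection_limit(background, alpha=0.05, beta=0.05, step=0.25):
    """Smallest mean net signal detectable above a known Poisson background."""
    # Critical level: smallest integer count with false-alarm rate <= alpha
    c = 0
    while 1.0 - poisson_cdf(c, background) > alpha:
        c += 1
    # Detection limit: smallest mean signal keeping the miss rate <= beta
    s = 0.0
    while poisson_cdf(c, background + s) > beta:
        s += step
    return s

B = 100.0
k = 1.645  # one-sided 95% Gaussian quantile
ld_gaussian = k * k + 2 * k * math.sqrt(B)  # known-background Gaussian (Currie-style) form
ld_numeric = detection_limit(B)
```

    For a well-populated background like this one the Gaussian closed form and the numeric Poisson result agree closely; the numeric route is the one that generalizes to low counts and, with a fuller counting model, to systematic uncertainties.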

  20. Uncertainties in the estimation of specific absorption rate during radiofrequency alternating magnetic field induced non-adiabatic heating of ferrofluids

    Science.gov (United States)

    Lahiri, B. B.; Ranoo, Surojit; Philip, John

    2017-11-01

    Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology where the alternating magnetic field induced heating of a magnetic fluid is utilized for ablating the cancerous cells or making them more susceptible to the conventional treatments. The heating efficiency in MFH is quantified in terms of the specific absorption rate (SAR), which is defined as the heating power generated per unit mass. In the majority of experimental studies, SAR is evaluated from temperature rise curves obtained under non-adiabatic experimental conditions, which are prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and their remedies is a prerequisite for obtaining accurate and reproducible SAR. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample and spatial variation in the temperature profile within the sample. Using first order approximations, an adiabatic reconstruction protocol for the measured temperature rise curves is developed for SAR estimation, which is found to be in good agreement with those obtained from the computationally intense slope corrected method. Our experimental findings clearly show that the peripheral and delayed heating are due to radiation heat transfer from the heating coils and the slower response time of the sensor, respectively. Our results suggest that the peripheral heating is linearly proportional to the sample area to volume ratio and the coil temperature. It is also observed that peripheral heating decreases in the presence of a non-magnetic insulating shield. The delayed heating is found to contribute up to ~25% uncertainty in SAR values. As the SAR values are very sensitive to the initial slope determination method, explicit mention of the range of the linear regression analysis is appropriate to reproduce the results. The effect of the sample volume to area ratio on the linear heat loss rate is systematically studied.

  1. Uncertainties in the estimation of specific absorption rate during radiofrequency alternating magnetic field induced non-adiabatic heating of ferrofluids

    International Nuclear Information System (INIS)

    Lahiri, B B; Ranoo, Surojit; Philip, John

    2017-01-01

    Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology where the alternating magnetic field induced heating of a magnetic fluid is utilized for ablating the cancerous cells or making them more susceptible to the conventional treatments. The heating efficiency in MFH is quantified in terms of the specific absorption rate (SAR), which is defined as the heating power generated per unit mass. In the majority of experimental studies, SAR is evaluated from temperature rise curves obtained under non-adiabatic experimental conditions, which are prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and their remedies is a prerequisite for obtaining accurate and reproducible SAR. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample and spatial variation in the temperature profile within the sample. Using first order approximations, an adiabatic reconstruction protocol for the measured temperature rise curves is developed for SAR estimation, which is found to be in good agreement with those obtained from the computationally intense slope corrected method. Our experimental findings clearly show that the peripheral and delayed heating are due to radiation heat transfer from the heating coils and the slower response time of the sensor, respectively. Our results suggest that the peripheral heating is linearly proportional to the sample area to volume ratio and the coil temperature. It is also observed that peripheral heating decreases in the presence of a non-magnetic insulating shield. The delayed heating is found to contribute up to ∼25% uncertainty in SAR values. As the SAR values are very sensitive to the initial slope determination method, explicit mention of the range of the linear regression analysis is appropriate to reproduce the results. The effect of the sample volume to area ratio on the linear heat loss rate is systematically studied.
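
    The abstract's point about the initial slope can be illustrated with a synthetic non-adiabatic heating curve of Box-Lucas form: the fitted slope, and hence the SAR, depends on the chosen regression window, which is why stating that window matters for reproducibility. The curve parameters, specific heat, and magnetic mass fraction below are all assumptions for illustration, not the paper's values.

```python
import math

def linear_slope(ts, temps):
    """Ordinary least-squares slope of temperature against time."""
    n = len(ts)
    mt = sum(ts) / n
    mT = sum(temps) / n
    num = sum((t - mt) * (T - mT) for t, T in zip(ts, temps))
    den = sum((t - mt) ** 2 for t in ts)
    return num / den

# Synthetic heating curve (Box-Lucas form): T(t) = T_sat * (1 - exp(-t / tau))
T_SAT, TAU = 20.0, 100.0   # K rise at saturation, time constant in s (assumed)

def temperature_rise(t):
    return T_SAT * (1.0 - math.exp(-t / TAU))

def sar_estimate(window_s, c_p=4186.0, phi=0.01):
    """SAR in W per kg of magnetic material from the initial slope.

    c_p (carrier specific heat, J/kg/K) and phi (magnetic mass fraction)
    are illustrative values, not taken from the paper."""
    ts = [0.5 * i for i in range(int(window_s / 0.5) + 1)]
    temps = [temperature_rise(t) for t in ts]
    return c_p * linear_slope(ts, temps) / phi

sar_short = sar_estimate(10.0)  # short fit window: close to the true initial slope
sar_long = sar_estimate(60.0)   # long fit window: heat loss biases the slope low
```

    With these parameters the true initial slope is 0.2 K/s; the 10 s window underestimates it by a few percent, while the 60 s window loses roughly a quarter of the signal to the curvature that heat loss introduces.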

  2. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
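
    The shortcut the abstract describes can be sketched for a hypothetical prepared standard: absolute uncertainties combine in quadrature for sums and differences, relative uncertainties for products and quotients, so no partial derivatives are needed. The masses, volumes, and uncertainty values below are illustrative assumptions.

```python
import math

def combine_relative(*rel):
    """Relative uncertainty of a product/quotient: quadrature of relative terms."""
    return math.sqrt(sum(r * r for r in rel))

def combine_absolute(*abs_u):
    """Absolute uncertainty of a sum/difference: quadrature of absolute terms."""
    return math.sqrt(sum(u * u for u in abs_u))

# Hypothetical prepared standard: concentration c = m / V
m, u_m = 1.0000, 0.0020   # mass in g, with absolute uncertainty
V, u_V = 1.0000, 0.0030   # volume in L, with absolute uncertainty
c = m / V
u_c = c * combine_relative(u_m / m, u_V / V)   # absolute uncertainty of c, in g/L
```

    This is exactly the spreadsheet-friendly calculation the paper advocates: each input contributes one squared term, and the square root of the sum gives the combined standard uncertainty.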

  3. The difference between traditional experiments and CFD validation benchmark experiments

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Barton L., E-mail: barton.smith@usu.edu

    2017-02-15

    Computational Fluid Dynamics provides attractive features for design, and perhaps licensing, of nuclear power plants. The most important of these features is low cost compared to experiments. However, uncertainty of CFD calculations must accompany these calculations in order for the results to be useful for important decision making. In order to properly assess the uncertainty of a CFD calculation, it must be “validated” against experimental data. Unfortunately, traditional “discovery” experiments are normally ill-suited to provide all of the information necessary for the validation exercise. Traditionally, experiments are performed to discover new physics, determine model parameters, or to test designs. This article will describe a new type of experiment; one that is designed and carried out with the specific purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. We will demonstrate that the goals of traditional experiments and validation experiments are often in conflict, making use of traditional experimental results problematic and leading directly to larger predictive uncertainty of the CFD model.

  4. The difference between traditional experiments and CFD validation benchmark experiments

    International Nuclear Information System (INIS)

    Smith, Barton L.

    2017-01-01

    Computational Fluid Dynamics provides attractive features for design, and perhaps licensing, of nuclear power plants. The most important of these features is low cost compared to experiments. However, uncertainty of CFD calculations must accompany these calculations in order for the results to be useful for important decision making. In order to properly assess the uncertainty of a CFD calculation, it must be “validated” against experimental data. Unfortunately, traditional “discovery” experiments are normally ill-suited to provide all of the information necessary for the validation exercise. Traditionally, experiments are performed to discover new physics, determine model parameters, or to test designs. This article will describe a new type of experiment; one that is designed and carried out with the specific purpose of providing Computational Fluid Dynamics (CFD) validation benchmark data. We will demonstrate that the goals of traditional experiments and validation experiments are often in conflict, making use of traditional experimental results problematic and leading directly to larger predictive uncertainty of the CFD model.

  5. The quest for knowledge transfer efficacy: blended teaching, online and in-class, with consideration of learning typologies for non-traditional and traditional students

    Science.gov (United States)

    Van Doorn, Judy R.; Van Doorn, John D.

    2014-01-01

    The pedagogical paradigm shift in higher education to 24-h learning environments composed of teaching delivery methods of online courses, blended/hybrid formats, and face-to-face (f2f) classes is increasing access to global, lifelong learning. Online degrees have been offered at 62.4% of 2800 colleges and universities. Students can now design flexible, life-balanced course schedules. Higher knowledge transfer rates may exist with blended course formats with online quizzes and valuable class time set for Socratic, quality discussions and creative team presentations. Research indicates that younger, traditional students exhibit heightened performance goal orientations and prefer entertaining professors who are funny, whereas non-traditional students exhibit mastery profiles and prefer courses taught by flexible, yet organized, professors. A 5-year study found that amongst 51,000 students taking both f2f and online courses, higher online failure rates occurred. Competing life roles for non-traditional students and reading and writing needs for at-risk students suggest that performance may be better if programs are started in f2f courses. Models on effective knowledge transfer consider the planning process, delivery methods, and workplace application, but a gap exists for identifying the diversity of learner needs. Higher education enrollments are being compromised with lower online retention rates. Therefore, the main purpose of this review is to delineate disparate learning styles and present a typology for the learning needs of traditional and non-traditional students. Secondly, psychology as a science may need more rigorous curriculum markers like mapping APA guidelines to knowledge objectives, critical assignments, and student learning outcomes (SLOs) (e.g., online rubric assessments for scoring APA style critical thinking essays on selected New York Times books). Efficacious knowledge transfer to diverse, 21st century students should be the Academy's focus. 

  6. The quest for knowledge transfer efficacy: blended teaching, online and in-class, with consideration of learning typologies for non-traditional and traditional students

    Directory of Open Access Journals (Sweden)

    Judy Rouse Van Doorn

    2014-04-01

    Full Text Available The pedagogical paradigm shift in higher education to 24-hour learning environments composed of teaching delivery methods of online courses, blended/hybrid formats, and face-to-face (f2f) classes is increasing access to global, lifelong learning. Online degrees have been offered at 62.4% of 2,800 colleges and universities. Students can now design flexible, life-balanced course schedules. Higher knowledge transfer rates may exist with blended course formats with online quizzes and valuable class time set for Socratic, quality discussions and creative team presentations. Research indicates that younger, traditional students exhibit heightened performance goal orientations and prefer entertaining professors who are funny, whereas non-traditional students exhibit mastery profiles and prefer courses taught by flexible, yet organized, professors. A 5-year study found that amongst 51,000 students taking both f2f and online courses, higher online failure rates occurred. Competing life roles for non-traditional students and reading and writing needs for at-risk students suggest that performance may be better if programs are started in f2f courses. Models on effective knowledge transfer consider the planning process, delivery methods, and workplace application, but a gap exists for identifying the diversity of learner needs. Higher education enrollments are being compromised with lower online retention rates. Therefore, the main purpose of this review is to delineate disparate learning styles and present a typology for the learning needs of traditional and non-traditional students. Secondly, psychology as a science may need more rigorous curriculum markers like mapping APA guidelines to knowledge objectives, critical assignments, and student learning outcomes (SLOs) (e.g., online rubric assessments for scoring APA style critical thinking essays on selected New York Times books). Efficacious knowledge transfer to diverse, 21st century students should be the Academy's focus.

  7. The quest for knowledge transfer efficacy: blended teaching, online and in-class, with consideration of learning typologies for non-traditional and traditional students.

    Science.gov (United States)

    Van Doorn, Judy R; Van Doorn, John D

    2014-01-01

    The pedagogical paradigm shift in higher education to 24-h learning environments composed of teaching delivery methods of online courses, blended/hybrid formats, and face-to-face (f2f) classes is increasing access to global, lifelong learning. Online degrees have been offered at 62.4% of 2800 colleges and universities. Students can now design flexible, life-balanced course schedules. Higher knowledge transfer rates may exist with blended course formats with online quizzes and valuable class time set for Socratic, quality discussions and creative team presentations. Research indicates that younger, traditional students exhibit heightened performance goal orientations and prefer entertaining professors who are funny, whereas non-traditional students exhibit mastery profiles and prefer courses taught by flexible, yet organized, professors. A 5-year study found that amongst 51,000 students taking both f2f and online courses, higher online failure rates occurred. Competing life roles for non-traditional students and reading and writing needs for at-risk students suggest that performance may be better if programs are started in f2f courses. Models on effective knowledge transfer consider the planning process, delivery methods, and workplace application, but a gap exists for identifying the diversity of learner needs. Higher education enrollments are being compromised with lower online retention rates. Therefore, the main purpose of this review is to delineate disparate learning styles and present a typology for the learning needs of traditional and non-traditional students. Secondly, psychology as a science may need more rigorous curriculum markers like mapping APA guidelines to knowledge objectives, critical assignments, and student learning outcomes (SLOs) (e.g., online rubric assessments for scoring APA style critical thinking essays on selected New York Times books). Efficacious knowledge transfer to diverse, 21st century students should be the Academy's focus.

  8. Stronger Schrödinger-like uncertainty relations

    International Nuclear Information System (INIS)

    Song, Qiu-Cheng; Qiao, Cong-Feng

    2016-01-01

    Highlights: • A stronger Schrödinger-like uncertainty relation in the sum of variances of two observables is obtained. • An improved Schrödinger-like uncertainty relation in the product of variances of two observables is obtained. • A stronger uncertainty relation in the sum of variances of three observables is proposed. - Abstract: The uncertainty relation is one of the fundamental building blocks of quantum theory. Nevertheless, the traditional uncertainty relations do not fully capture the concept of incompatible observables. Here we present a stronger Schrödinger-like uncertainty relation, which is stronger than the relation recently derived by Maccone and Pati (2014) [11]. Furthermore, we give an additive uncertainty relation which holds for three incompatible observables, which is stronger than the relation newly obtained by Kechrimparis and Weigert (2014) [12] and the simple extension of the Schrödinger uncertainty relation.
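
    For reference, the baseline relations being strengthened are the Robertson-Schrödinger product bound and the Maccone-Pati sum bound (with |ψ⊥⟩ any state orthogonal to |ψ⟩); the paper's stronger versions refine these:

```latex
\[
\sigma_A^{2}\,\sigma_B^{2} \;\ge\;
\Bigl|\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\Bigr|^{2}
+ \Bigl|\tfrac{1}{2i}\langle[A,B]\rangle\Bigr|^{2},
\]
\[
\sigma_A^{2} + \sigma_B^{2} \;\ge\;
\pm\, i\,\langle[A,B]\rangle
+ \bigl|\langle\psi|\,A \pm iB\,|\psi^{\perp}\rangle\bigr|^{2}.
\]
```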

  9. Export contracts for non-traditional products: Chayote from Costa Rica

    OpenAIRE

    Saénz, F.; Ruben, R.

    2004-01-01

    This paper focuses on the determinants of market and contract choice for non-traditional crops and the possibilities for involving local producers in global agro-food chains through delivery relationships with packers and brokers. Main attention is given to the importance of quality for entering the export market and the impact of contractual arrangements on loyal behaviour. Core stipulations in the contract regarding the frequency of delivery and the provision of technical assistance are med...

  10. Non-linear Calibration Leads to Improved Correspondence between Uncertainties

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    2007-01-01

    limit theorem, an excellent correspondence was obtained between predicted uncertainties and measured uncertainties. In order to validate the method, experiments were applied of flame atomic absorption spectrometry (FAAS) for the analysis of Co and Pt, and experiments of electrothermal atomic absorption...

  11. Starry messages: Searching for signatures of interstellar archaeology

    Energy Technology Data Exchange (ETDEWEB)

    Carrigan, Richard A., Jr.; /Fermilab

    2009-12-01

    Searching for signatures of cosmic-scale archaeological artifacts such as Dyson spheres or Kardashev civilizations is an interesting alternative to conventional SETI. Uncovering such an artifact does not require the intentional transmission of a signal on the part of the original civilization. This type of search is called interstellar archaeology or sometimes cosmic archaeology. The detection of intelligence elsewhere in the Universe with interstellar archaeology or SETI would have broad implications for science. For example, the constraints of the anthropic principle would have to be loosened if a different type of intelligence were discovered elsewhere. A variety of interstellar archaeology signatures are discussed, including non-natural planetary atmospheric constituents, stellar doping with isotopes of nuclear wastes, Dyson spheres, as well as signatures of stellar and galactic-scale engineering. The concept of a Fermi bubble due to interstellar migration is introduced in the discussion of galactic signatures. These potential interstellar archaeological signatures are classified using the Kardashev scale. A modified Drake equation is used to evaluate the relative challenges of finding various sources. With few exceptions, interstellar archaeological signatures are clouded and beyond current technological capabilities. However, SETI for so-called cultural transmissions, and planetary atmosphere signatures, are within reach.
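
    The classic Drake equation that the modified version builds on is a chain of a formation rate, fractions, and a lifetime (the specific modification for artifact searches is described in the source, not reproduced here):

```latex
\[
N \;=\; R_{*}\cdot f_{p}\cdot n_{e}\cdot f_{l}\cdot f_{i}\cdot f_{c}\cdot L,
\]
where $R_{*}$ is the star formation rate, $f_{p}$ the fraction of stars with
planets, $n_{e}$ the number of habitable planets per system, $f_{l}$, $f_{i}$,
$f_{c}$ the fractions developing life, intelligence, and detectable technology,
and $L$ the lifetime of the detectable phase.
```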

  12. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates that have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  13. Molecular Signature in HCV-Positive Lymphomas

    Directory of Open Access Journals (Sweden)

    Valli De Re

    2012-01-01

    Full Text Available Hepatitis C virus (HCV) is a positive, single-stranded RNA virus, which has been associated with different subtypes of B-cell non-Hodgkin lymphoma (B-NHL). Cumulative evidence suggests an HCV-related, antigen-driven process in B-NHL development. The underlying molecular signature associated with HCV-related B-NHL has to date remained obscure. In this review, we discuss the recent developments in this field, with special mention of different sets of genes whose expression is associated with BCR coupled to BLyS signaling, which in turn was found to be linked to B-cell maturation stages and the NF-κB transcription factor. Even if recent progress on the HCV-B-NHL signature has been made, the precise relationship between HCV and lymphoma development, and the phenotype signature, remain to be clarified.

  14. Perseveration induces dissociative uncertainty in obsessive-compulsive disorder.

    Science.gov (United States)

    Giele, Catharina L; van den Hout, Marcel A; Engelhard, Iris M; Dek, Eliane C P; Toffolo, Marieke B J; Cath, Danielle C

    2016-09-01

    Obsessive compulsive (OC)-like perseveration paradoxically increases feelings of uncertainty. We studied whether the underlying mechanism between perseveration and uncertainty is a reduced accessibility of meaning ('semantic satiation'). OCD patients (n = 24) and matched non-clinical controls (n = 24) repeated words 2 (non-perseveration) or 20 times (perseveration). They decided whether this word was related to another target word. Speed of relatedness judgments and feelings of dissociative uncertainty were measured. The effects of real-life perseveration on dissociative uncertainty were tested in a smaller subsample of the OCD group (n = 9). Speed of relatedness judgments was not affected by perseveration. However, both groups reported more dissociative uncertainty after perseveration compared to non-perseveration, and this effect was stronger in OCD patients. Patients also reported more dissociative uncertainty after 'clinical' perseveration compared to non-perseveration. Both parts of this study are limited by some methodological issues and a small sample size. Although the mechanism behind 'perseveration → uncertainty' is still unclear, the results suggest that the effects of perseveration are counterproductive. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. The angle-angular momentum and entropic uncertainty relations for quantum scattering

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    1999-01-01

    Recently, entropic uncertainty relations have been obtained in a more general form by using Tsallis-like entropies for quantum scattering. Using the Riesz theorem, the state-independent entropic angle-angular momentum uncertainty relations are proved for the Tsallis-like scattering entropies of spinless particles. The generalized entropic inequalities for the Tsallis-like entropies are presented. The two upper bounds are optimal bounds and can be obtained via Lagrange multipliers by extremizing the Tsallis-like entropies subject to the normalization constraints, respectively. The proof of the lower bound is provided by considering the condition that the angular distribution of probability, P(x), has everywhere a finite magnitude. Next, by using the Riesz theorem, a general result was obtained, appearing as inequalities valid for the case of hadron-hadron scattering. An important entropic uncertainty relation for the scattering of spinless particles was thus obtained. For σ el and dσ/dΩ fixed from experiment, we proved that the optimal scattering entropies are the maximum possible entropies in the scattering process. In a previous paper it was shown that the experimental values of the entropies for the pion-nucleus scatterings are systematically described by the optimal entropies, at all available pion kinetic energies. In this sense the obtained results can also be considered as new experimental signatures for the validity of the principle of minimum distance in the space of scattering states. The extension of the optimal state analysis to the generalized non-extensive statistics case, as well as a test of the entropic inequalities, can be obtained in a similar way by using non-extensive optimal entropies. Since this kind of analysis is more involved, numerical examples will be given in a following, more extended paper. Finally, we believe that the results obtained here are encouraging for further investigations of the entropic uncertainty relations as well.
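
    For reference, the Tsallis entropy underlying these "Tsallis-like" scattering entropies generalizes the Shannon form with a non-extensivity parameter q, recovering the usual entropy in the q → 1 limit:

```latex
\[
S_q \;=\; \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q \;=\; -\sum_i p_i \ln p_i ,
\]
with the analogous continuous form over an angular distribution $P(x)$
obtained by replacing the sum with an integral of $P(x)^{q}$.
```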

  16. Knowledge, decision making, and uncertainty

    International Nuclear Information System (INIS)

    Fox, J.

    1986-01-01

    Artificial intelligence (AI) systems depend heavily upon the ability to make decisions. Decisions require knowledge, yet there is no knowledge-based theory of decision making. To the extent that AI uses a theory of decision making, it adopts components of the traditional statistical view in which choices are made by maximizing some function of the probabilities of the decision options. A knowledge-based scheme for reasoning about uncertainty is proposed, which extends the traditional framework but is compatible with it

  17. A Third-Party E-Payment Protocol Based on Quantum Group Blind Signature

    Science.gov (United States)

    Zhang, Jian-Zhong; Yang, Yuan-Yuan; Xie, Shu-Cui

    2017-09-01

    A third-party E-payment protocol based on quantum group blind signature is proposed in this paper. Our E-payment protocol protects user anonymity, as traditional E-payment systems do, and also provides unconditional security, which classical E-payment systems cannot offer. To achieve this, quantum key distribution, one-time pad, and quantum group blind signature are adopted in our scheme. Furthermore, if there were a dispute, the manager Trent could identify who is lying.

  18. Non-intrusive uncertainty quantification in structural-acoustic systems using polynomial chaos expansion method

    Directory of Open Access Journals (Sweden)

    Wang Mingjie

    2017-01-01

    Full Text Available A framework of the non-intrusive polynomial chaos expansion (PC) method was proposed to investigate the statistical characteristics of the response of structural-acoustic systems containing random uncertainty. The PC method does not need to reformulate the model equations, and the statistics of the response can be evaluated directly. The results show that, compared to the direct Monte Carlo method (MCM) based on the original numerical model, the PC method is effective and more efficient.
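
    The non-intrusive projection step can be sketched for a single standard-normal input: the model is treated as a black box and the PC coefficients are recovered by Gauss quadrature. The structural-acoustic model itself is not available, so the response function `u`, the polynomial order, and the quadrature size below are illustrative assumptions, not the paper's implementation.

```python
import math

import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval


def nisp_pce(u, order=4, nquad=8):
    """Non-intrusive spectral projection of u(X), X ~ N(0, 1), onto
    probabilists' Hermite polynomials He_k: c_k = E[u(X) He_k(X)] / k!."""
    x, w = hermegauss(nquad)            # nodes/weights for weight exp(-x^2/2)
    w = w / math.sqrt(2.0 * math.pi)    # normalise to the standard-normal pdf
    coeffs = []
    for k in range(order + 1):
        basis = np.zeros(k + 1)
        basis[k] = 1.0                  # coefficient vector selecting He_k
        ck = float(np.sum(w * u(x) * hermeval(x, basis))) / math.factorial(k)
        coeffs.append(ck)
    # Orthogonality (E[He_k^2] = k!) gives the statistics directly.
    mean = coeffs[0]
    var = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, var
```

    For u(ξ) = (1 + ξ)², the recovered mean is 2 and variance is 6, matching the exact standard-normal moments; the statistics are read straight off the expansion coefficients, without the many repeated model evaluations a direct Monte Carlo estimate would need.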

  19. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl; Kloosterman, Jan Leen, E-mail: J.L.Kloosterman@tudelft.nl

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and on adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These tests show consistently good performance

  20. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-01-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and on adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These tests show consistently good performance both

  1. Characterization of the non-uniqueness of used nuclear fuel burnup signatures through a Mesh-Adaptive Direct Search

    Energy Technology Data Exchange (ETDEWEB)

    Skutnik, Steven E., E-mail: sskutnik@utk.edu; Davis, David R.

    2016-05-01

    The use of passive gamma and neutron signatures from fission indicators is a common means of estimating used fuel burnup, enrichment, and cooling time. However, while characteristic fission product signatures such as {sup 134}Cs, {sup 137}Cs, {sup 154}Eu, and others are generally reliable estimators for used fuel burnup within the context where the assembly initial enrichment and the discharge time are known, in the absence of initial enrichment and/or cooling time information (such as when applying NDA measurements in a safeguards/verification context), these fission product indicators no longer yield a unique solution for assembly enrichment, burnup, and cooling time after discharge. Through the use of a new Mesh-Adaptive Direct Search (MADS) algorithm, it is possible to directly probe the shape of this “degeneracy space” characteristic of individual nuclides (and combinations thereof), both as a function of constrained parameters (such as the assembly irradiation history) and unconstrained parameters (e.g., the cooling time before measurement and the measurement precision for particular indicator nuclides). In doing so, this affords the identification of potential means of narrowing the uncertainty space of potential assembly enrichment, burnup, and cooling time combinations, thereby bounding estimates of assembly plutonium content. In particular, combinations of gamma-emitting nuclides with distinct half-lives (e.g., {sup 134}Cs with {sup 137}Cs and {sup 154}Eu) in conjunction with gross neutron counting (via {sup 244}Cm) are able to reasonably constrain the degeneracy space of possible solutions to a space small enough to perform useful discrimination and verification of fuel assemblies based on their irradiation history.
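
    As a rough illustration of the direct-search idea (not the authors' implementation: true MADS polls randomized positive spanning direction sets, whereas this sketch polls only coordinate directions), a minimal mesh-refining search might look like the following. In the paper's setting the objective would be the mismatch between measured nuclide signatures and the values predicted for a candidate enrichment/burnup/cooling-time combination; the quadratic used in the test is a stand-in.

```python
def mesh_search(f, x0, delta=1.0, tol=1e-6, max_iter=1000):
    """Simplified mesh-based direct search: poll +/- delta along each
    coordinate; on a failed poll, refine the mesh by halving delta."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if delta <= tol:
            break
        improved = False
        for i in range(len(x)):
            for step in (delta, -delta):
                y = list(x)
                y[i] += step
                fy = f(y)
                if fy < fx:          # accept the first improving poll point
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            delta *= 0.5             # mesh refinement on a failed poll
    return x, fx
```

    Running the same search from many starting points, and recording every parameter combination whose objective falls below the measurement precision, is one way to trace out the "degeneracy space" the abstract describes.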

  2. Differences Do Make a Difference: Recruitment Strategies for the Non-Traditional Student.

    Science.gov (United States)

    Zamanou, Sonia

    Many colleges and universities lack a comprehensive, fully integrated marketing plan to combat high attrition rates in programs offered to non-traditional students. A clear understanding of the needs of the marketplace is crucial to an effective marketing program. Research suggests that life transitions are what motivate adults to pursue…

  3. Injury survey of a non-traditional 'soft-edged' trampoline designed to lower equipment hazards.

    Science.gov (United States)

    Eager, David B; Scarrott, Carl; Nixon, Jim; Alexander, Keith

    2013-01-01

    In Australia trampolines contribute one quarter of all childhood play equipment injuries. The objective of this study was to gather and evaluate injury data from a non-traditional, 'soft-edged', consumer trampoline, where the design aimed to minimise injuries from the equipment and from falling off. The manufacturer of the non-traditional trampoline provided the University of Technology Sydney with their Australian customer database. The study involved surveys in Queensland and New South Wales, between May 2007 and March 2010. Initially injury data was gathered by a phone interview pilot study, then in the full study, through an email survey. The 3817 respondents were the carers of child users of the 'soft-edge' trampolines. Responses were compared with Australian and US emergency department data. In both countries the proportion of injuries caused by the equipment and falling off was compared with the proportion caused by the jumpers to themselves or each other. The comparisons showed a significantly lower proportion resulted from falling-off or hitting the equipment for this design when compared to traditional trampolines, both in Australia and the US. This research concludes that equipment-induced and falling-off injuries, the more severe injuries on traditional trampolines, can be significantly reduced with appropriate trampoline design.

  4. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in the nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and, furthermore, motivate developers to revisit the treatment of measurement uncertainty.

  5. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in the nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and, furthermore, motivate developers to revisit the treatment of measurement uncertainty.

  6. Tightly Secure Signatures From Lossy Identification Schemes

    OpenAIRE

    Abdalla , Michel; Fouque , Pierre-Alain; Lyubashevsky , Vadim; Tibouchi , Mehdi

    2015-01-01

    In this paper, we present three digital signature schemes with tight security reductions in the random oracle model. Our first signature scheme is a particularly efficient version of the short exponent discrete log-based scheme of Girault et al. (J Cryptol 19(4):463–487, 2006). Our scheme has a tight reduction to the decisional short discrete logarithm problem, while still maintaining the non-tight reduction to the computational version of the problem upon which the or...

  7. Usefulness of traditionally defined herbal properties for distinguishing prescriptions of traditional Chinese medicine from non-prescription recipes.

    Science.gov (United States)

    Ung, C Y; Li, H; Kong, C Y; Wang, J F; Chen, Y Z

    2007-01-03

    Traditional Chinese medicine (TCM) has been widely practiced and is considered an attractive alternative to conventional medicine. Multi-herb recipes have been routinely used in TCM. These have been formulated by using TCM-defined herbal properties (TCM-HPs), the scientific basis of which is unclear. The usefulness of TCM-HPs was evaluated by analyzing the distribution pattern of TCM-HPs of the constituent herbs in 1161 classical TCM prescriptions, which shows patterns of multi-herb correlation. Two artificial intelligence (AI) methods were used to examine whether TCM-HPs are capable of distinguishing TCM prescriptions from non-TCM recipes. The two AI systems were trained and tested by using 1161 TCM prescriptions, 11,202 non-TCM recipes, and two separate evaluation methods. These systems correctly classified 83.1-97.3% of the TCM prescriptions and 90.8-92.3% of the non-TCM recipes. These results suggest that TCM-HPs are capable of separating TCM prescriptions from non-TCM recipes, are useful for formulating TCM prescriptions, and are consistent with the expected correlation between TCM-HPs and the physicochemical properties of the herbal ingredients responsible for producing the collective pharmacological and other effects of specific TCM prescriptions.

  8. Prevalence of chronic kidney disease of non-traditional causes in patients on hemodialysis in southwest Guatemala.

    Science.gov (United States)

    Laux, Timothy S; Barnoya, Joaquin; Cipriano, Ever; Herrera, Erick; Lopez, Noemi; Polo, Vicente Sanchez; Rothstein, Marcos

    2016-04-01

    Objective To document the prevalence of patients on hemodialysis in southwestern Guatemala who have chronic kidney disease (CKD) of non-traditional causes (CKDnt). Methods This cross-sectional descriptive study interviewed patients on hemodialysis at the Instituto Guatemalteco de Seguridad Social on their health and occupational history. Laboratory serum, urine and vital sign data at the initiation of hemodialysis were obtained from chart reviews. Patients were classified according to whether they had hypertension or obesity or neither. The proportion of patients with and without these traditional CKD risk factors was recorded, and the association between demographic and occupational factors and a lack of traditional CKD risk factors was analyzed using multivariate logistic regression. Results Of 242 total patients (including 171 non-diabetics) enrolled in hemodialysis in southwestern Guatemala, 45 (18.6% of total patients and 26.3% of non-diabetics) lacked traditional CKD risk factors. While agricultural work history was common, only travel time greater than 30 minutes and age less than 50 years old were significantly associated with CKD in the absence of traditional risk factors. Individuals without such risk factors lived throughout southwestern Guatemala's five departments. Conclusions The prevalence of CKDnt appears to be much lower in this sample of patients receiving hemodialysis in southwestern Guatemala than in hospitalized patients in El Salvador. It has yet to be determined whether the prevalence is higher in the general population and in patients on peritoneal dialysis.

  9. Prevalence of chronic kidney disease of non-traditional causes in patients on hemodialysis in southwest Guatemala

    Directory of Open Access Journals (Sweden)

    Timothy S. Laux

    Full Text Available Objective To document the prevalence of patients on hemodialysis in southwestern Guatemala who have chronic kidney disease (CKD) of non-traditional causes (CKDnt). Methods This cross-sectional descriptive study interviewed patients on hemodialysis at the Instituto Guatemalteco de Seguridad Social on their health and occupational history. Laboratory serum, urine and vital sign data at the initiation of hemodialysis were obtained from chart reviews. Patients were classified according to whether they had hypertension or obesity or neither. The proportion of patients with and without these traditional CKD risk factors was recorded and the association between demographic and occupational factors and a lack of traditional CKD risk factors analyzed using multivariate logistic regression. Results Of 242 total patients (including 171 non-diabetics) enrolled in hemodialysis in southwestern Guatemala, 45 (18.6% of total patients and 26.3% of non-diabetics) lacked traditional CKD risk factors. While agricultural work history was common, only travel time greater than 30 minutes and age less than 50 years old were significantly associated with CKD in the absence of traditional risk factors. Individuals without such risk factors lived throughout southwestern Guatemala's five departments. Conclusions The prevalence of CKDnt appears to be much lower in this sample of patients receiving hemodialysis in southwestern Guatemala than in hospitalized patients in El Salvador. It has yet to be determined whether the prevalence is higher in the general population and in patients on peritoneal dialysis.

  10. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    Full Text Available The signature of a person is one of the most popular and legally accepted behavioral biometrics that provides a secure means for verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of the signature verification system is to classify between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features; it contains diacritics, ligatures, and overlapping. Because of the lack of any form of dynamic information during the Arabic signature's writing process, it is more difficult to obtain higher verification accuracy. This paper addresses the above difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt the fuzzy set to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
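
    The abstract's notions of a fuzzy degree of similarity and the FAR/FRR error rates can be sketched as follows; the feature representation, the triangular membership function, and all names here are illustrative assumptions, not the paper's actual model:

```python
def fuzzy_similarity(feat_a, feat_b, spread=1.0):
    """Degree of similarity in [0, 1]: triangular fuzzy membership of each
    per-feature difference, averaged over the feature vector."""
    degrees = [max(0.0, 1.0 - abs(a - b) / spread)
               for a, b in zip(feat_a, feat_b)]
    return sum(degrees) / len(degrees)


def far_frr(genuine_scores, forged_scores, threshold):
    """FAR = fraction of forgeries accepted (score >= threshold);
    FRR = fraction of genuine signatures rejected (score < threshold)."""
    far = sum(s >= threshold for s in forged_scores) / len(forged_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr
```

    Sweeping the threshold over the similarity scores of labeled genuine and forged samples traces the FAR/FRR trade-off a verification system is tuned on.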

  11. Motif signatures of transcribed enhancers

    KAUST Repository

    Kleftogiannis, Dimitrios

    2017-09-14

    In mammalian cells, transcribed enhancers (TrEn) play important roles in the initiation of gene expression and in the maintenance of gene expression levels in a spatiotemporal manner. One of the most challenging questions in biology today is how the genomic characteristics of enhancers relate to enhancer activities. This is particularly critical, as several recent studies have linked enhancer sequence motifs to specific functional roles. To date, only a limited number of enhancer sequence characteristics have been investigated, leaving space for exploring the enhancers' genomic code in a more systematic way. To address this problem, we developed a novel computational method, TELS, aimed at identifying predictive cell type/tissue specific motif signatures. We used TELS to compile a comprehensive catalog of motif signatures for all known TrEn identified by the FANTOM5 consortium across 112 human primary cells and tissues. Our results confirm that distinct cell type/tissue specific motif signatures characterize TrEn. These signatures allow successful discrimination of a) TrEn from random controls, a proxy of non-enhancer activity, and b) cell type/tissue specific TrEn from enhancers expressed and transcribed in different cell types/tissues. TELS codes and datasets are publicly available at http://www.cbrc.kaust.edu.sa/TELS.

  12. A Geoscience Workforce Model for Non-Geoscience and Non-Traditional STEM Students

    Science.gov (United States)

    Liou-Mark, J.; Blake, R.; Norouzi, H.; Vladutescu, D. V.; Yuen-Lau, L.

    2016-12-01

    The Summit on the Future of Geoscience Undergraduate Education has recently identified key professional skills, competencies, and conceptual understanding necessary in the development of undergraduate geoscience students (American Geosciences Institute, 2015). Through a comprehensive study involving a diverse range of the geoscience academic and employer community, the following professional scientist skills were rated highly important: 1) critical thinking/problem solving skills; 2) effective communication; 3) ability to access and integrate information; 4) strong quantitative skills; and 5) ability to work in interdisciplinary/cross cultural teams. Based on the findings of the study above, the New York City College of Technology (City Tech) has created a one-year intensive training program that focuses on the development of technical and non-technical geoscience skills for non-geoscience, non-traditional STEM students. Although City Tech does not offer geoscience degrees, the primary goal of the program is to create an unconventional pathway for under-represented minority STEM students to enter, participate, and compete in the geoscience workforce. The selected cohort of STEM students engage in year-round activities that include a geoscience course, enrichment training workshops, networking sessions, leadership development, research experiences, and summer internships at federal, local, and private geoscience facilities. These carefully designed programmatic elements provide both the geoscience knowledge and the non-technical professional skills that are essential for the geoscience workforce. Moreover, by executing this alternate, robust geoscience workforce model that attracts and prepares underrepresented minorities for geoscience careers, this unique pathway opens another corridor that helps to ameliorate the dire plight of the geoscience workforce shortage. This project is supported by NSF IUSE GEOPATH Grant # 1540721.

  13. An Indirect Simulation-Optimization Model for Determining Optimal TMDL Allocation under Uncertainty

    Directory of Open Access Journals (Sweden)

    Feng Zhou

    2015-11-01

    Full Text Available An indirect simulation-optimization model framework with enhanced computational efficiency and risk-based decision-making capability was developed to determine optimal total maximum daily load (TMDL) allocation under uncertainty. To convert the traditional direct simulation-optimization model into our indirect equivalent model framework, we proposed a two-step strategy: (1) application of interval regression equations derived by a Bayesian recursive regression tree (BRRT v2) algorithm, which approximates the original hydrodynamic and water-quality simulation models and accurately quantifies the inherent nonlinear relationship between nutrient load reductions and the credible interval of algal biomass with a given confidence interval; and (2) incorporation of the calibrated interval regression equations into an uncertain optimization framework, which is further converted to our indirect equivalent framework by the enhanced-interval linear programming (EILP) method and provides approximate-optimal solutions at various risk levels. The proposed strategy was applied to the Swift Creek Reservoir's nutrient TMDL allocation (Chesterfield County, VA) to identify the minimum nutrient load allocations required from eight sub-watersheds to ensure compliance with user-specified chlorophyll criteria. Our results indicated that the BRRT-EILP model could identify critical sub-watersheds faster than the traditional approach and requires lower reductions of nutrient loadings compared to traditional stochastic simulation and trial-and-error (TAE) approaches. This suggests that our proposed framework performs better in optimal TMDL development compared to the traditional simulation-optimization models and provides extreme and non-extreme tradeoff analysis under uncertainty for risk-based decision making.

  14. Disaggregating measurement uncertainty from population variability and Bayesian treatment of uncensored results

    International Nuclear Information System (INIS)

    Strom, Daniel J.; Joyce, Kevin E.; Maclellan, Jay A.; Watson, David J.; Lynch, Timothy P.; Antonio, Cheryl L.; Birchall, Alan; Anderson, Kevin K.; Zharov, Peter

    2012-01-01

    In making low-level radioactivity measurements of populations, it is commonly observed that a substantial portion of net results are negative. Furthermore, the observed variance of the measurement results arises from a combination of measurement uncertainty and population variability. This paper presents a method for disaggregating measurement uncertainty from population variability to produce a probability density function (PDF) of possibly true results. To do this, simple, justifiable, and reasonable assumptions are made about the relationship of the measurements to the measurands (the 'true values'). The measurements are assumed to be unbiased, that is, that their average value is the average of the measurands. Using traditional estimates of each measurement's uncertainty to disaggregate population variability from measurement uncertainty, a PDF of measurands for the population is produced. Then, using Bayes's theorem, the same assumptions, and all the data from the population of individuals, a prior PDF is computed for each individual's measurand. These PDFs are non-negative, and their average is equal to the average of the measurement results for the population. The uncertainty in these Bayesian posterior PDFs is all Berkson with no remaining classical component. The methods are applied to baseline bioassay data from the Hanford site. The data include 90Sr urinalysis measurements on 128 people, 137Cs in vivo measurements on 5,337 people, and 239Pu urinalysis measurements on 3,270 people. The method produces excellent results for the 90Sr and 137Cs measurements, since there are nonzero concentrations of these global fallout radionuclides in people who have not been occupationally exposed. The method does not work for the 239Pu measurements in non-occupationally exposed people because the population average is essentially zero.
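
    A minimal Gaussian sketch of the two stages described above, under the paper's unbiasedness assumption: first subtract the average measurement variance from the observed variance to estimate population variability, then apply a conjugate-normal Bayesian update for one individual. Note that the actual method produces non-negative posterior PDFs, a constraint this normal-normal toy omits, and the function names are illustrative.

```python
def disaggregate(results, meas_sd):
    """Split the observed variance of net results into measurement variance
    and population (measurand) variance, assuming unbiased measurements."""
    n = len(results)
    mean = sum(results) / n                               # population average
    obs_var = sum((x - mean) ** 2 for x in results) / (n - 1)
    meas_var = sum(s * s for s in meas_sd) / n            # avg measurement variance
    pop_var = max(obs_var - meas_var, 0.0)                # variability of true values
    return mean, pop_var


def posterior(x, s, prior_mean, prior_var):
    """Conjugate normal update of one individual's measurand given the
    population prior and that individual's measurement x +/- s."""
    w = prior_var / (prior_var + s * s)                   # shrinkage weight
    return w * x + (1 - w) * prior_mean, w * s * s
```

    As in the paper, the posterior means average to the population mean of the measurement results, and each individual's estimate is shrunk toward that mean in proportion to its measurement uncertainty.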

  15. Signature-based User Authentication

    OpenAIRE

    Hámorník, Juraj

    2015-01-01

    This work addresses the absence of handwritten signature authentication in Windows. The result of this work is standalone software that allows users to log into Windows by writing their signature. We focus on the security of signature authentication and the best overall user experience. We implemented a signature authentication service that accepts a signature and returns a user access token if the signature is genuine. Signature authentication is done by comparing the given signature to stored signature patterns by their similarity. Si...

  16. Accounting for uncertainty in marine reserve design.

    Science.gov (United States)

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.
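
    Advantage (1), quantifying the risk attached to a spacing decision under assumed uncertainty, can be sketched with a small Monte Carlo estimate; the Gaussian dispersal-distance model and all parameter names here are illustrative assumptions, not the authors' models:

```python
import random


def connectivity_risk(spacing_km, disp_mean, disp_sd, n=20000, seed=1):
    """Monte Carlo risk estimate: probability that a sampled larval
    dispersal distance falls short of the chosen reserve spacing,
    i.e. that adjacent reserves fail to be connected."""
    rng = random.Random(seed)
    failures = sum(
        max(rng.gauss(disp_mean, disp_sd), 0.0) < spacing_km
        for _ in range(n)
    )
    return failures / n
```

    Comparing the risk at candidate spacings (say 20 km vs 200 km) under wider and wider dispersal uncertainty also illustrates advantage (2): the reduction in decision risk that better dispersal data would buy can itself be quantified.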

  17. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis establishes, by a suitable method, the response relations between input parameter uncertainties and output uncertainties. Applying parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy while providing reasonable safety assurance. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method and the traditional error propagation analysis method can all provide a quite large peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve plant economy. Additionally, the random sampling statistical analysis method, which applies mathematical statistics theory, yields the largest safety margin because it reduces conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods can only provide 50-60 K. (authors)
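The random sampling statistical method referenced above is commonly implemented with Wilks' order-statistic formula, which fixes the minimum number of code runs needed so that the largest computed PCT bounds the true 95th percentile with 95% confidence. A minimal sketch (the paper does not state which variant it used; the first-order, one-sided formula is assumed here):

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n independent runs bounds the
    `coverage` quantile with probability `confidence`.
    First-order, one-sided Wilks criterion: 1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())  # → 59, the classic 95/95 run count
```

Higher-order variants (using the second-largest run, etc.) trade more code runs for a less conservative bounding value, which is one source of the margin differences the abstract reports.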

  18. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults

    OpenAIRE

    Sharkey, Joseph R; Johnson, Cassandra M; Dean, Wesley R; Horel, Scott A

    2011-01-01

    Abstract Objective The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Methods Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environmen...

  19. Pupil-linked arousal is driven by decision uncertainty and alters serial choice bias

    Science.gov (United States)

    Urai, Anne E.; Braun, Anke; Donner, Tobias H.

    2017-03-01

    While judging their sensory environments, decision-makers seem to use the uncertainty about their choices to guide adjustments of their subsequent behaviour. One possible source of these behavioural adjustments is arousal: decision uncertainty might drive the brain's arousal systems, which control global brain state and might thereby shape subsequent decision-making. Here, we measure pupil diameter, a proxy for central arousal state, in human observers performing a perceptual choice task of varying difficulty. Pupil dilation, after choice but before external feedback, reflects three hallmark signatures of decision uncertainty derived from a computational model. This increase in pupil-linked arousal boosts observers' tendency to alternate their choice on the subsequent trial. We conclude that decision uncertainty drives rapid changes in pupil-linked arousal state, which shape the serial correlation structure of ongoing choice behaviour.

  20. Gauging User Interest in Non-Traditional Library Resources

    Energy Technology Data Exchange (ETDEWEB)

    Sandberg, Tami; Abbott, Jennifer

    2015-06-23

    The National Renewable Energy Laboratory (NREL) is a government funded research laboratory based in Golden, Colorado. In addition to collecting traditional library resources such as journals, conference proceedings, and print and electronic books, the library also spends a significant portion of its collection development funds on resources not often found in many libraries: technical industry standards (e.g., ISO, IEC, ASTM, IEEE) and energy-related market reports. Assessing user needs for these resources is difficult for a number of reasons, particularly because standardized usage statistics are lacking or non-existent. Standards and market reports are generally costly and include fairly restrictive license agreements, which increase the importance of making informed collection development decisions. This presentation will discuss the NREL Library's current collection assessment and development practices as they relate to these unique resources.

  1. Diagnostic and prognostic signatures from the small non-coding RNA transcriptome in prostate cancer

    DEFF Research Database (Denmark)

    Martens-Uzunova, E S; Jalava, S E; Dits, N F

    2011-01-01

    Prostate cancer (PCa) is the most frequent male malignancy and the second most common cause of cancer-related death in Western countries. Current clinical and pathological methods are limited in the prediction of postoperative outcome. It is becoming increasingly evident that small non-coding RNA...... signatures of 102 fresh-frozen patient samples during PCa progression by miRNA microarrays. Both platforms were cross-validated by quantitative reverse transcriptase-PCR. Besides the altered expression of several miRNAs, our deep sequencing analyses revealed strong differential expression of small nucleolar...... RNAs (snoRNAs) and transfer RNAs (tRNAs). From microarray analysis, we derived a miRNA diagnostic classifier that accurately distinguishes normal from cancer samples. Furthermore, we were able to construct a PCa prognostic predictor that independently forecasts postoperative outcome. Importantly...

  2. Women into Non-Traditional Sectors: Addressing Gender Segregation in the Northern Ireland Workplace

    Science.gov (United States)

    Potter, Michael; Hill, Myrtle

    2009-01-01

    The horizontal segregation of the workforce along gender lines tends to assign women to lower paid, lower status employment. Consequently, schemes to address segregation have focused on preparing women to enter non-traditional occupations through training and development processes. This article examines models to encourage women into…

  3. KEA-71 Smart Current Signature Sensor (SCSS)

    Science.gov (United States)

    Perotti, Jose M.

    2010-01-01

    This slide presentation reviews the development and uses of the Smart Current Signature Sensor (SCSS), also known as the Valve Health Monitor (VHM) system. SCSS provides a way not only to monitor the valve's operation in real time and in a non-invasive manner, but also to monitor its health (Fault Detection and Isolation) and identify potential faults and/or degradation in the near future (Prediction/Prognosis). This technology approach is not only applicable to solenoid valves; it could also be extrapolated to other electrical components with repeatable electrical current signatures, such as motors.

  4. Real Traceable Signatures

    Science.gov (United States)

    Chow, Sherman S. M.

    A traceable signature scheme extends a group signature scheme with an enhanced anonymity management mechanism. The group manager can compute a tracing trapdoor which enables anyone to test if a signature is signed by a given misbehaving user, while the only way to do so for group signatures requires revealing the signer of all signatures. Nevertheless, this is not tracing in a strict sense. In all existing schemes, T tracing agents need to recollect all N' signatures ever produced and perform RN' “checks” for R revoked users. This involves a high volume of transfer and computation. Increasing T increases the degree of parallelism for tracing, but also the probability of “missing” some signatures in case some of the agents are dishonest.

  5. Signature Balancing

    NARCIS (Netherlands)

    Noordkamp, H.W.; Brink, M. van den

    2006-01-01

    Signatures are an important part of the design of a ship. In an ideal situation, signatures should be as low as possible. However, due to budget constraints it is most unlikely that this ideal can be reached. The question that arises is which levels of signatures are optimal given the different scenarios

  6. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults

    Directory of Open Access Journals (Sweden)

    Horel Scott A

    2011-05-01

    Objective The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Methods Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environment Project were linked with individual participants (n = 1409) who completed the nutrition module in the 2006 Brazos Valley Community Health Assessment. Results Increased age, poverty, increased distance to the nearest fast food, and increased number of different traditional fast-food restaurants, non-traditional fast-food outlets, or fast-food opportunities were associated with less frequent weekly consumption of fast-food meals. The interaction of gender and proximity (distance) or coverage (number) indicated that the association of proximity to or coverage of fast-food locations on fast-food consumption was greater among women and opposite of independent effects. Conclusions Results provide impetus for identifying and understanding the complex relationship between access to all fast-food opportunities, rather than to traditional fast-food restaurants alone, and fast-food consumption. The results indicate the importance of further examining the complex interaction of gender and distance in rural areas and particularly in fast-food consumption. Furthermore, this study emphasizes the need for health promotion and policy efforts to consider all sources of fast-food as part of promoting healthful food choices.

  7. Association between proximity to and coverage of traditional fast-food restaurants and non-traditional fast-food outlets and fast-food consumption among rural adults

    Science.gov (United States)

    2011-01-01

    Objective The objective of this study is to examine the relationship between residential exposure to fast-food entrées, using two measures of potential spatial access: proximity (distance to the nearest location) and coverage (number of different locations), and weekly consumption of fast-food meals. Methods Traditional fast-food restaurants and non-traditional fast-food outlets, such as convenience stores, supermarkets, and grocery stores, from the 2006 Brazos Valley Food Environment Project were linked with individual participants (n = 1409) who completed the nutrition module in the 2006 Brazos Valley Community Health Assessment. Results Increased age, poverty, increased distance to the nearest fast food, and increased number of different traditional fast-food restaurants, non-traditional fast-food outlets, or fast-food opportunities were associated with less frequent weekly consumption of fast-food meals. The interaction of gender and proximity (distance) or coverage (number) indicated that the association of proximity to or coverage of fast-food locations on fast-food consumption was greater among women and opposite of independent effects. Conclusions Results provide impetus for identifying and understanding the complex relationship between access to all fast-food opportunities, rather than to traditional fast-food restaurants alone, and fast-food consumption. The results indicate the importance of further examining the complex interaction of gender and distance in rural areas and particularly in fast-food consumption. Furthermore, this study emphasizes the need for health promotion and policy efforts to consider all sources of fast-food as part of promoting healthful food choices. PMID:21599955

  8. Quantifying measurement uncertainties in ADCP measurements in non-steady, inhomogeneous flow

    Science.gov (United States)

    Schäfer, Stefan

    2017-04-01

    The author presents a laboratory study of fixed-platform four-beam ADCP and three-beam ADV measurements in the tailrace of a micro hydropower setup with a 35 kW Kaplan turbine and 2.5 m head. The datasets discussed quantify measurement uncertainties of the ADCP measurement technique arising from non-steady, inhomogeneous flow. For a constant discharge of 1.5 m3/s, two different flow scenarios were investigated: the first being the regular tailrace flow downstream of the draft tube, and the second a straightened, less inhomogeneous flow generated by a flow-straightening device: a rack of 40 mm diameter pipe sections mounted directly behind the draft tube. ADCP measurements (sampling rate 1.35 Hz) were conducted at three distances behind the draft tube and compared bin-wise to measurements from three simultaneously measuring ADV probes (sampling rate 64 Hz). The ADV probes were aligned horizontally, and the ADV bins were placed in the centers of two facing ADCP bins and in the vertical under the ADCP probe at the corresponding depth. Rotating the ADV probes by 90° allowed measurement of the other two facing ADCP bins. Because of mutual probe interaction, ADCP and ADV measurements were not conducted at the same time. The datasets were evaluated using mean and fluctuation velocities. Turbulence parameters were calculated and compared as far as applicable. Uncertainties coming from non-steady flow were estimated with the normalized mean square error and evaluated by comparing long-term measurements of 60 minutes to shorter measurement intervals. Uncertainties coming from inhomogeneous flow were evaluated by comparing ADCP with ADV data along the ADCP beams, where ADCP data were effectively measured, and in the vertical under the ADCP probe, where the velocities of the ADCP measurements were displayed.
Errors coming from non-steady flow could be compensated through sufficiently long measurement intervals with high enough sampling rates depending on the

  9. Arabic CWR Based on Correlation of Normalized Signatures of Words Images

    Directory of Open Access Journals (Sweden)

    Hala S. Zaghloul

    2007-12-01

    The traditional methods for Arabic OCR (AOCR) are based on segmentation of each word into a set of characters. The Arabic language is of a cursive nature, and a character's shape depends on its position in the word. About 100 character shapes have to be classified, and some of them may overlap. Our approach uses a normalized signature of the time signal of the pulse-coupled neural network (PCNN), supported by some shape primitives representing the number of the word's complementary marks and their positions within the image of the word. A lookup dictionary of words with their signatures was constructed and structured into groups using a decision tree. A test signature is routed through the tree to the nearest group, and the word whose signature has the highest correlation within the selected group is taken as the classification. This method overcomes many difficulties that arise in cursive word recognition (CWR) for printed script with different font types and sizes; it also shows high accuracy for the classification process, 96%.
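The matching step described above can be sketched as follows. This is a hedged illustration, not the authors' code: the PCNN signature extraction is assumed to have already produced fixed-length normalized vectors, the decision-tree routing is collapsed into a flat scan of one group, and the toy lexicon entries are invented:

```python
def correlation(a, b):
    """Pearson correlation of two equal-length signature vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def classify(signature, group):
    """Return the dictionary word whose stored signature correlates best
    with the test signature (the group would normally be pre-selected by
    routing the signature down the decision tree)."""
    return max(group, key=lambda word: correlation(signature, group[word]))

# toy group of normalized word signatures (hypothetical values)
lexicon = {"kitab": [0.1, 0.9, 0.4, 0.7], "qalam": [0.8, 0.2, 0.6, 0.1]}
print(classify([0.15, 0.85, 0.45, 0.65], lexicon))  # → kitab
```

In the paper, the tree routing narrows the search so that only one group's signatures are scanned for the maximum correlation, which keeps the lookup cheap as the dictionary grows.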

  10. Higher-Order Squeezing of Quantum Field and the Generalized Uncertainty Relations in Non-Degenerate Four-Wave Mixing

    Science.gov (United States)

    Li, Xi-Zeng; Su, Bao-Xia

    1996-01-01

    It is found that the field of the combined mode of the probe wave and the phase-conjugate wave in the process of non-degenerate four-wave mixing exhibits higher-order squeezing to all even orders. And the generalized uncertainty relations in this process are also presented.

  11. What Makes a Student Non-Traditional? A Comparison of Students over and under Age 25 in Online, Accelerated Psychology Courses

    Science.gov (United States)

    Tilley, Brian P.

    2014-01-01

    The growing proportion of non-traditional students, very commonly defined as students over the age of 25 (though other features vary from study to study) necessitates more studies with this increasingly relevant group participating. Recently, the growth of non-traditional universities such as those offering predominantly online, accelerated…

  12. Technical note: Design flood under hydrological uncertainty

    Science.gov (United States)

    Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco

    2017-07-01

    Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis, and neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
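The effect UNCODE corrects for can be illustrated with a parametric-bootstrap sketch. This is not the paper's procedure; the Gumbel parent, record length and return period below are arbitrary choices. With a record of only n = 30 years, the estimated 100-year flood scatters widely around the uncertainty-free value, which is why a correction coefficient depending on sample length and return period is applied to that single value:

```python
import math
import random

def gumbel_quantile(mu, beta, T):
    """T-year return level of a Gumbel flood-frequency curve."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def fit_gumbel_moments(sample):
    """Method-of-moments Gumbel fit (returns location mu and scale beta)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    return mean - 0.5772 * beta, beta

random.seed(1)
mu_true, beta_true, T, n = 100.0, 25.0, 100, 30
q_free = gumbel_quantile(mu_true, beta_true, T)  # uncertainty-free design value

# parametric bootstrap: refit short synthetic records and re-estimate the
# 100-year flood; the scatter is the hydrologic uncertainty UNCODE manages
estimates = []
for _ in range(2000):
    record = [mu_true - beta_true * math.log(-math.log(random.random()))
              for _ in range(n)]
    estimates.append(gumbel_quantile(*fit_gumbel_moments(record), T))

mean_est = sum(estimates) / len(estimates)
std_est = (sum((e - mean_est) ** 2 for e in estimates) / len(estimates)) ** 0.5
print(round(q_free, 1), round(std_est, 1))  # design value and its sampling spread
```

A longer record shrinks the bootstrap spread, which mirrors the paper's finding that the correction factor, and thus construction cost, decreases with sample length at a fixed safety level.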

  13. An Investigation of Women Engineers in Non-Traditional Occupations in the Thai Construction Industry

    Directory of Open Access Journals (Sweden)

    Nuanthip Kaewsri

    2011-06-01

    For over a decade, the public and private sectors have carried out research aimed at attracting women engineers to the construction industry and retaining them. However, there have not been many studies on women engineers working in other types of construction-related businesses apart from contractor companies, such as consultancies, developers, etc. This paper aims to examine the experiences of women engineers in non-traditional careers and the implications for their turnover. A literature search on women’s careers in construction was performed in conjunction with semi-structured interviews with a sample of 141 individuals. Results from three viewpoints, viz. those of professional men and women engineers in contractor companies and of women engineers in non-contractor companies, were found to differ in many respects, including opinions about career advancement, career path and the difficulties involved. It was also found that women engineers in contractor companies were much more affected by problems such as sexual harassment, work-life conflict and equal-opportunity issues than women engineers in non-contractor companies. Turnover rates of women engineers and their reasons for leaving were examined. Women engineers, particularly those in contractor companies, had to confront more barriers in non-traditional careers than their male counterparts. Nonetheless, working in non-contractor companies provides a viable alternative for women engineers who want successful careers in the Thai construction industry.

  14. Detection of proteolytic signatures for Parkinson's disease

    DEFF Research Database (Denmark)

    Jordal, Peter Lüttge; Dyrlund, Thomas F.; Winge, Kristian

    2016-01-01

    Aim: To investigate if idiopathic Parkinson's disease (IPD) is associated with distinct proteolytic signatures relative to non-neurodegenerative controls (NND) and patients with multiple system atrophy (MSA). Materials & methods: A subtiligase-based N-terminomics screening method was exploited...

  15. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and backgrounds. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, given a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions made during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process.
    The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for
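Dempster-Shafer evidence from multiple experts is typically fused with Dempster's rule of combination. A minimal sketch (the frame of discernment, focal elements and mass values below are invented for illustration):

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozensets of
    hypotheses to masses) with Dempster's rule, renormalizing by the
    mass assigned to conflicting (disjoint) focal elements."""
    combined = {}
    conflict = 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

theta = frozenset({"high", "low"})  # frame of discernment: technology impact
expert1 = {frozenset({"high"}): 0.6, theta: 0.4}  # 0.4 left uncommitted
expert2 = {frozenset({"high"}): 0.7, theta: 0.3}
m = dempster_combine(expert1, expert2)
print(m[frozenset({"high"})])  # fused mass committed to "high" impact
```

Mass left on the whole frame (theta) expresses an expert's ignorance directly, which is exactly what forcing a probability distribution would obscure.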

  16. Book review: OF OTHER THOUGHTS: NON-TRADITIONAL WAYS TO THE

    Directory of Open Access Journals (Sweden)

    Johan Verbeke

    2014-12-01

    Research paradigms in the fields of architecture and the arts have been developing and changing during the last decade. Part of this development is a shift to include design work and artistic work in the knowledge processes of doctoral work. This work evidently also needs supervision. At the same time, doctoral degrees have been developing in relation to indigenous ways of thinking. The book Of Other Thoughts: Non-Traditional Ways to the Doctorate discusses the challenges one faces, either as a PhD student or as a supervisor, when doing or supervising a PhD in a less established field.

  17. ALPs effective field theory and collider signatures

    DEFF Research Database (Denmark)

    Brivio, I.; Gavela, M. B.; Merlo, L.

    2017-01-01

    We study the leading effective interactions between the Standard Model fields and a generic singlet CP-odd (pseudo-) Goldstone boson. Two possible frameworks for electroweak symmetry breaking are considered: linear and non-linear. For the latter case, the basis of leading effective operators is d...... final states are most promising signals expected in both frameworks. In addition, non-standard Higgs decays and mono-Higgs signatures are especially prominent and expected to be dominant in non-linear realisations....

  18. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
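The flavor of a gradient-based sensitivity index, and why parameter correlation changes the ranking, can be shown on a two-species competitive Langmuir isotherm. This is a toy illustration, not the paper's method: the equilibrium constants, concentrations and correlation value are invented, and the correlated index simply adds the chain-rule contribution of the correlated partner parameter:

```python
def coverage_A(K_A, K_B, c_A=0.5, c_B=0.5):
    """Fractional surface coverage of species A under competitive
    Langmuir adsorption: theta_A = K_A c_A / (1 + K_A c_A + K_B c_B)."""
    return K_A * c_A / (1.0 + K_A * c_A + K_B * c_B)

def deriv(f, x, h=1e-6):
    """Central finite-difference derivative."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

K_A, K_B, rho = 2.0, 1.0, 0.8  # rho: assumed correlation between K_A and K_B
dA = deriv(lambda k: coverage_A(k, K_B), K_A)  # index treating K_A as independent
dB = deriv(lambda k: coverage_A(K_A, k), K_B)
d_corr = dA + rho * dB  # correlation-aware index: K_B co-varies with K_A

print(coverage_A(K_A, K_B))  # → 0.4
```

Here dA is positive and dB negative, so accounting for the positive correlation shrinks the effective sensitivity to K_A (d_corr < dA), illustrating how correlations can reorder the influential parameters.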

  19. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  20. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    This traditional probabilistic approach can produce relatively accurate results. However, it requires a long time because of the repetitive computation the MC method entails. In addition, when informative data for statistical analysis are insufficient, or when some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express by probability distributions. In order to reduce the computation time, and to quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express by probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident following the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time, while covering the results obtained by probabilistic uncertainty propagation.
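Fuzzy propagation of this kind is commonly implemented with alpha-cut interval arithmetic on triangular fuzzy probabilities. A minimal sketch for a two-event OR gate (the gate structure and membership numbers are invented; monotonicity of the gate formula in each input is assumed, so evaluating at the interval endpoints suffices):

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def or_gate(probs):
    """Top-event probability of an OR gate over independent basic events."""
    product = 1.0
    for p in probs:
        product *= 1.0 - p
    return 1.0 - product

p1 = (0.01, 0.02, 0.05)   # fuzzy basic-event probabilities (low, mode, high)
p2 = (0.001, 0.003, 0.01)
for alpha in (0.0, 0.5, 1.0):
    (l1, u1), (l2, u2) = alpha_cut(p1, alpha), alpha_cut(p2, alpha)
    # or_gate is monotone increasing in each input, so the top-event
    # interval at this alpha level comes from the endpoint evaluations
    print(alpha, or_gate([l1, l2]), or_gate([u1, u2]))
```

At alpha = 1 the cut collapses to the crisp modal value, and decreasing alpha widens the top-event interval — a single sweep over alpha levels replaces the repeated sampling of an MC run, which is where the speedup reported above comes from.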

  1. Using snowflake surface-area-to-volume ratio to model and interpret snowfall triple-frequency radar signatures

    Science.gov (United States)

    Gergely, Mathias; Cooper, Steven J.; Garrett, Timothy J.

    2017-10-01

    The snowflake microstructure determines the microwave scattering properties of individual snowflakes and has a strong impact on snowfall radar signatures. In this study, individual snowflakes are represented by collections of randomly distributed ice spheres where the size and number of the constituent ice spheres are specified by the snowflake mass and surface-area-to-volume ratio (SAV) and the bounding volume of each ice sphere collection is given by the snowflake maximum dimension. Radar backscatter cross sections for the ice sphere collections are calculated at X-, Ku-, Ka-, and W-band frequencies and then used to model triple-frequency radar signatures for exponential snowflake size distributions (SSDs). Additionally, snowflake complexity values obtained from high-resolution multi-view snowflake images are used as an indicator of snowflake SAV to derive snowfall triple-frequency radar signatures. The modeled snowfall triple-frequency radar signatures cover a wide range of triple-frequency signatures that were previously determined from radar reflectivity measurements and illustrate characteristic differences related to snow type, quantified through snowflake SAV, and snowflake size. The results show high sensitivity to snowflake SAV and SSD maximum size but are generally less affected by uncertainties in the parameterization of snowflake mass, indicating the importance of snowflake SAV for the interpretation of snowfall triple-frequency radar signatures.

  2. Enhanced anti-counterfeiting measures for additive manufacturing: coupling lanthanide nanomaterial chemical signatures with blockchain technology

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, Zachary C.; Stephenson, David E.; Christ, Josef F.; Pope, Timothy R.; Arey, Bruce W.; Barrett, Christopher A.; Warner, Marvin G.

    2017-08-18

    The significant rise of additive manufacturing (AM) in recent years is in part due to the open sourced nature of the printing processes and reduced cost and capital barriers relative to traditional manufacturing. However, this democratization of manufacturing spurs an increased demand for producers and end-users to verify the authenticity and quality of individual parts. To this end, we introduce an anti-counterfeiting method composed of first embedding engineered nanomaterials into features of a 3D-printed part followed by non-destructive interrogation of these features to quantify a chemical signature profile. The part specific chemical signature data is then linked to a securitized, distributed, and time-stamped blockchain ledger entry. To demonstrate the utility of this approach, lanthanide-aspartic acid nanoscale coordination polymers (Ln3+- Asp NCs) / poly(lactic) acid (PLA) composites were formulated and transformed into a filament feedstock for fused deposition modeling (FDM) 3D printing. In the present case, a quick-response (QR) code containing the doped Ln3+-Asp NCs was printed using a dual-extruder FDM printer into pure PLA parts. The QR code provides a searchable reference to an Ethereum-based blockchain entry. The QR code physical features also serve as defined areas to probe the signatures arising from the embedded Ln3+-Asp NCs. Visible fluorescence emission with UV-excitation was quantified in terms of color using a smartphone camera and incorporated into blockchain entries. Ultimately, linking unique chemical signature data to blockchain databases is anticipated to make the costs of counterfeiting AM materials significantly more prohibitive and transactions between those in the supply chain more trustworthy.

  3. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    It has become more common for hydrologists to use multiple data types and sources within a single study. This may be driven by complex water-management questions that integrate water quantity, quality and ecology, or by recognition of the value of auxiliary data for understanding hydrological processes. We briefly discuss the impact of data uncertainty on the increasingly popular use of diagnostic signatures for hydrological process understanding and model development.

  4. Molecular signatures database (MSigDB) 3.0.

    Science.gov (United States)

    Liberzon, Arthur; Subramanian, Aravind; Pinchback, Reid; Thorvaldsdóttir, Helga; Tamayo, Pablo; Mesirov, Jill P

    2011-06-15

    Well-annotated gene sets representing the universe of the biological processes are critical for meaningful and insightful interpretation of large-scale genomic data. The Molecular Signatures Database (MSigDB) is one of the most widely used repositories of such sets. We report the availability of a new version of the database, MSigDB 3.0, with over 6700 gene sets, a complete revision of the collection of canonical pathways and experimental signatures from publications, enhanced annotations and upgrades to the web site. MSigDB is freely available for non-commercial use at http://www.broadinstitute.org/msigdb.
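MSigDB distributes its gene sets in the tab-separated GMT format, where each line holds a set name, a description (often a URL), and then the member genes. A minimal parser sketch; the example set name below is made up:

```python
def parse_gmt(lines):
    """Parse gene sets from GMT-format lines (MSigDB's distribution format):
    each line is tab-separated: set name, description, then member genes."""
    gene_sets = {}
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 3:
            continue  # skip malformed lines (a set needs at least one gene)
        name, _description, genes = fields[0], fields[1], fields[2:]
        gene_sets[name] = set(genes)
    return gene_sets

demo = ["HALLMARK_EXAMPLE\thttp://example.org\tTP53\tBRCA1\tEGFR"]
print(sorted(parse_gmt(demo)["HALLMARK_EXAMPLE"]))  # → ['BRCA1', 'EGFR', 'TP53']
```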

  5. Māori identity signatures: A latent profile analysis of the types of Māori identity.

    Science.gov (United States)

    Greaves, Lara M; Houkamau, Carla; Sibley, Chris G

    2015-10-01

    Māori are the indigenous peoples of New Zealand. However, the term 'Māori' can refer to a wide range of people of varying ethnic compositions and cultural identity. We present a statistical model identifying 6 distinct types, or 'Māori Identity Signatures,' and estimate their proportion in the Māori population. The model is tested using a Latent Profile Analysis of a national probability sample of 686 Māori drawn from the New Zealand Attitudes and Values Study. We identify 6 distinct signatures: Traditional Essentialists (22.6%), Traditional Inclusives (16%), High Moderates (31.7%), Low Moderates (18.7%), Spiritually Orientated (4.1%), and Disassociated (6.9%). These distinct Identity Signatures predicted variation in deprivation, age, mixed-ethnic affiliation, and religion. This research presents the first formal statistical model assessing how people's identity as Māori is psychologically structured, documents the relative proportion of these different patterns of structures, and shows that these patterns reliably predict differences in core demographics. We identify a range of patterns of Māori identity far more diverse than has been previously proposed based on qualitative data, and also show that the majority of Māori fit a moderate or traditional identity pattern. The application of our model for studying Māori health and identity development is discussed. (c) 2015 APA, all rights reserved.

  6. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  7. Two-Dimensional Resonance Raman Signatures of Vibronic Coherence Transfer in Chemical Reactions.

    Science.gov (United States)

    Guo, Zhenkun; Molesky, Brian P; Cheshire, Thomas P; Moran, Andrew M

    2017-11-02

    Two-dimensional resonance Raman (2DRR) spectroscopy has been developed for studies of photochemical reaction mechanisms and structural heterogeneity in condensed phase systems. 2DRR spectroscopy is motivated by knowledge of non-equilibrium effects that cannot be detected with traditional resonance Raman spectroscopy. For example, 2DRR spectra may reveal correlated distributions of reactant and product geometries in systems that undergo chemical reactions on the femtosecond time scale. Structural heterogeneity in an ensemble may also be reflected in the 2D spectroscopic line shapes of both reactive and non-reactive systems. In this chapter, these capabilities of 2DRR spectroscopy are discussed in the context of recent applications to the photodissociation reactions of triiodide. We show that signatures of "vibronic coherence transfer" in the photodissociation process can be targeted with particular 2DRR pulse sequences. Key differences between the signal generation mechanisms for 2DRR and off-resonant 2D Raman spectroscopy techniques are also addressed. Overall, recent experimental developments and applications of the 2DRR method suggest that it will be a valuable tool for elucidating ultrafast chemical reaction mechanisms.

  8. Some Proxy Signature and Designated verifier Signature Schemes over Braid Groups

    OpenAIRE

    Lal, Sunder; Verma, Vandani

    2009-01-01

    Braid groups provide an alternative to number-theoretic public-key cryptography and can be implemented quite efficiently. The paper proposes five signature schemes based on braid groups: Proxy Signature, Designated Verifier, Bi-Designated Verifier, Designated Verifier Proxy Signature, and Bi-Designated Verifier Proxy Signature. We also discuss the security aspects of each of the proposed schemes.

  9. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
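A concrete instance of the entropic relations the paper generalizes is the standard Maassen-Uffink relation, which bounds the joint Shannon uncertainty of two observables $X$ and $Z$ measured on the same state:

```latex
% Maassen-Uffink entropic uncertainty relation
H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c},
\qquad
c = \max_{x,z}\,\lvert\langle x \vert z \rangle\rvert^{2}
```

Here $c$ is the maximal overlap between the eigenbases of $X$ and $Z$; for mutually unbiased bases in dimension $d$, $c = 1/d$ and the bound becomes $H(X)+H(Z)\ge\log_2 d$.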

  10. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result
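The 'downward' propagation discussed above can be illustrated by Monte Carlo sampling of an uncertain dynamic model. The sketch below assumes a hypothetical first-order sensor H(s) = k/(tau*s + 1) with illustrative calibration uncertainties in the gain and time constant, and propagates them to the frequency-response amplitude of the measurand:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Hypothetical calibration result: uncertain gain k and time constant tau
# of a first-order sensor H(s) = k / (tau*s + 1). Values are illustrative.
k = rng.normal(1.00, 0.02, n)      # gain: 1.00 +/- 0.02
tau = rng.normal(0.010, 0.001, n)  # time constant [s]: 10 +/- 1 ms

omega = 2 * np.pi * 20.0  # excitation frequency: 20 Hz

# Propagate each parameter draw through the model's amplitude response.
amp = k / np.sqrt(1 + (omega * tau) ** 2)

mean, std = amp.mean(), amp.std(ddof=1)
print(f"amplitude = {mean:.3f} +/- {std:.3f}")
```

The spread of `amp` is the dynamic measurement uncertainty at this frequency; a systematic model error would be added to this distribution rather than to the calibration residuals, as the abstract describes.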

  11. Does model fit decrease the uncertainty of the data in comparison with a general non-model least squares fit?

    International Nuclear Information System (INIS)

    Pronyaev, V.G.

    2003-01-01

    The information entropy is taken as a measure of knowledge about the object and the reduced univariate variance as a common measure of uncertainty. Covariances in the model versus non-model least-squares fits are discussed

  12. Should fatty acid signature proportions sum to 1 for diet estimation?

    Science.gov (United States)

    Bromaghin, Jeffrey F.; Budge, Suzanne M.; Thiemann, Gregory W.

    2016-01-01

    Knowledge of predator diets, including how diets might change through time or differ among predators, provides essential insights into their ecology. Diet estimation therefore remains an active area of research within quantitative ecology. Quantitative fatty acid signature analysis (QFASA) is an increasingly common method of diet estimation. QFASA is based on a data library of prey signatures, which are vectors of proportions summarizing the fatty acid composition of lipids, and diet is estimated as the mixture of prey signatures that most closely approximates a predator’s signature. Diets are typically estimated using proportions from a subset of all fatty acids that are known to be solely or largely influenced by diet. Given the subset of fatty acids selected, the current practice is to scale their proportions to sum to 1.0. However, scaling signature proportions has the potential to distort the structural relationships within a prey library and between predators and prey. To investigate that possibility, we compared the practice of scaling proportions with two alternatives and found that the traditional scaling can meaningfully bias diet estimators under some conditions. Two aspects of the prey types that contributed to a predator’s diet influenced the magnitude of the bias: the degree to which the sums of unscaled proportions differed among prey types and the identifiability of prey types within the prey library. We caution investigators against the routine scaling of signature proportions in QFASA.
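The distortion the authors warn about can be seen in a small numeric sketch (prey values invented): when the unscaled subset sums differ between prey types, scaling each prey signature to sum to 1 before mixing does not reproduce the scaled signature of the actual mixture.

```python
import numpy as np

# Hypothetical fatty-acid subset proportions. They do not sum to 1 because
# they are a subset of the full signature: prey A's subset sums to 0.9,
# prey B's to only 0.3.
prey_a = np.array([0.50, 0.30, 0.10])
prey_b = np.array([0.05, 0.10, 0.15])

diet = np.array([0.5, 0.5])  # true diet: equal parts A and B

# The predator signature forms in unscaled space, then is scaled for analysis.
predator_unscaled = diet[0] * prey_a + diet[1] * prey_b
predator_scaled = predator_unscaled / predator_unscaled.sum()

# Conventional practice: scale each prey signature first, then mix.
mix_of_scaled = diet[0] * prey_a / prey_a.sum() + diet[1] * prey_b / prey_b.sum()

print(np.round(predator_scaled, 3))
print(np.round(mix_of_scaled, 3))  # differs from the line above
```

Because the two vectors differ, a QFASA estimator fitted in scaled space would not recover the true 50/50 diet, which is the bias the study quantifies.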

  13. Decolonizing Qualitative Research: Non-traditional Reporting Forms in the Academy

    Directory of Open Access Journals (Sweden)

    Elsa M. González y González

    2006-09-01

    Full Text Available Qualitative researchers have assumed that cross-cultural work requires deep understanding of the culture being reported on. Even earlier, cross-cultural work focused on "receiving contexts" and on end-users who were primarily Western. The utility of such studies is severely limited, however, in a globalized world, and studies undertaken now must serve the interests not only of Western scholars but also the needs of nationals and locals (or indigenous peoples). Research conducted in different languages, non-Western contexts and different cultures becomes more problematic, and understanding its intrinsic issues more urgent, with the increasing number of reports (such as dissertations) produced by international scholars, which thus bear potential for decolonizing the academy. Conducting and reporting cross-cultural qualitative data requires attention to at least five major ideas: working with bilingual data, considering non-Western cultural traditions, multiple perspectives, multi-vocal & multi-lingual texts, and technical issues to ensure accessibility. URN: urn:nbn:de:0114-fqs060418

  14. Using qubits to reveal quantum signatures of an oscillator

    Science.gov (United States)

    Agarwal, Shantanu

    In this thesis, we seek to study the qubit-oscillator system with the aim to identify and quantify inherent quantum features of the oscillator. We show that the quantum signatures of the oscillator get imprinted on the dynamics of the joint system. The two key features which we explore are the quantized energy spectrum of the oscillator and the non-classicality of the oscillator's wave function. To investigate the consequences of the oscillator's discrete energy spectrum, we consider the qubit to be coupled to the oscillator through the Rabi Hamiltonian. Recent developments in fabrication technology have opened up the possibility to explore parameter regimes which were conventionally inaccessible. Motivated by these advancements, we investigate in this thesis a parameter space where the qubit frequency is much smaller than the oscillator frequency and the Rabi frequency is allowed to be an appreciable fraction of the bare frequency of the oscillator. We use the adiabatic approximation to understand the dynamics in this quasi-degenerate qubit regime. By deriving a dressed master equation, we systematically investigate the effects of the environment on the system dynamics. We develop a spectroscopic technique, using which one can probe the steady state response of the driven and damped system. The spectroscopic signal clearly reveals the quantized nature of the oscillator's energy spectrum. We extend the adiabatic approximation, earlier developed only for the single qubit case, to a scenario where multiple qubits interact with the oscillator. Using the extended adiabatic approximation, we study the collapse and revival of multi-qubit observables. We develop analytic expressions for the revival signals which are in good agreement with the numerically evaluated results. 
Within the quantum restriction imposed by Heisenberg's uncertainty principle, the uncertainty in the position and momentum of an oscillator is minimum and shared equally when the oscillator is prepared

  15. High-resolution characterization of sequence signatures due to non-random cleavage of cell-free DNA.

    Science.gov (United States)

    Chandrananda, Dineika; Thorne, Natalie P; Bahlo, Melanie

    2015-06-17

    High-throughput sequencing of cell-free DNA fragments found in human plasma has been used to non-invasively detect fetal aneuploidy, monitor organ transplants and investigate tumor DNA. However, many biological properties of this extracellular genetic material remain unknown. Research that further characterizes circulating DNA could substantially increase its diagnostic value by allowing the application of more sophisticated bioinformatics tools that lead to an improved signal-to-noise ratio in the sequencing data. In this study, we investigate various features of cell-free DNA in plasma using deep-sequencing data from two pregnant women (>70X, >50X) and compare them with matched cellular DNA. We utilize a descriptive approach to examine how the biological cleavage of cell-free DNA affects different sequence signatures such as fragment lengths, sequence motifs at fragment ends and the distribution of cleavage sites along the genome. We show that the size distributions of these cell-free DNA molecules are dependent on their autosomal and mitochondrial origin as well as the genomic location within chromosomes. DNA mapping to particular microsatellites and alpha repeat elements displays unique size signatures. By correlating the mapping locations of these fragments with ENCODE annotation of chromatin organization, we show that cell-free fragments occur in clusters along the genome, localize to nucleosomal arrays and are preferentially cleaved at linker regions. Our work further demonstrates that cell-free autosomal DNA cleavage is sequence dependent. The region spanning up to 10 positions on either side of the DNA cleavage site shows a consistent pattern of preference for specific nucleotides. This sequence motif is present in cleavage sites localized to nucleosomal cores and linker regions but is absent in nucleosome-free mitochondrial DNA. These background signals in cell-free DNA sequencing data stem from the non-random biological cleavage of these fragments.
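The position-specific nucleotide preference around cleavage sites can be tabulated with a simple frequency matrix. In a real analysis the windows would be reference sequence centred on each aligned fragment's 5' end; the toy strings below stand in for that:

```python
from collections import Counter

def cleavage_profile(windows, flank=10):
    """Position-specific nucleotide frequencies around cleavage sites.

    `windows` holds (2*flank)-nt stretches of reference sequence centred on
    each fragment's 5' cleavage position (hypothetical input here).
    """
    counts = [Counter() for _ in range(2 * flank)]
    for window in windows:
        for pos, base in enumerate(window[: 2 * flank]):
            counts[pos][base] += 1
    total = len(windows)
    return [{b: c / total for b, c in pc.items()} for pc in counts]

windows = ["ACGTACGTACCCGTACGTAC", "ACGTACGTACCCGTACGTAA"]
profile = cleavage_profile(windows)
print(profile[10])  # frequencies just 3' of the cleavage site → {'C': 1.0}
```

A consistent skew at particular positions in such a matrix is exactly the sequence-dependent cleavage signal the study reports.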

  16. Contribution of non-traditional lipid profiles to reduced glomerular filtration rate in H-type hypertension population of rural China.

    Science.gov (United States)

    Wang, Haoyu; Li, Zhao; Guo, Xiaofan; Chen, Yintao; Chen, Shuang; Tian, Yichen; Sun, Yingxian

    2018-05-01

    Despite current interest in the unfavourable impact of non-traditional lipid profiles on cardiovascular disease, information regarding their relation to reduced glomerular filtration rate (GFR) in the H-type hypertension population has not been systematically elucidated. Analyses were based upon a cross-sectional study of 3259 participants with H-type hypertension who underwent assessment of biochemical, anthropometric and blood pressure values. Reduced GFR was considered if meeting estimated GFR <60 ml/min/1.73 m2. A stepwise multivariate regression analysis indicated that non-traditional lipid parameters remained as independent determinants of estimated GFR (all p < .001). In multivariable models, we observed a 50%, 51%, 31%, and 24% higher risk for decreased GFR with each SD increment in TC/HDL-C, TG/HDL-C, LDL-C/HDL-C ratios and non-HDL-C levels, respectively. The highest quartile of TC/HDL-C, TG/HDL-C and LDL-C/HDL-C ratios carried reduced GFR odds (confidence intervals) of 5.50 (2.50 to 12.09), 6.63 (2.58 to 17.05) and 2.22 (1.15 to 4.29), respectively. The relative independent contribution of non-traditional lipid profiles, as indexed by TC/HDL-C, TG/HDL-C, LDL-C/HDL-C ratios and non-HDL-C, towards reduced GFR puts research evidence on lipoprotein-mediated renal injury at the heart of a clinical and public health recommendation for reducing the burden of chronic kidney disease. KEY MESSAGES Non-traditional lipid profiles have been linked with the occurrence of cardiovascular disease, but the effect of non-traditional lipid profiles on reduced GFR risk in the H-type hypertension population has not been specifically established. A particular strength of this study resides in the intrinsic value of TC/HDL-C, TG/HDL-C, LDL-C/HDL-C ratios and non-HDL-C, which integrate atherogenic and anti-atherogenic lipid molecules, to predict the risk of reduced GFR among the H-type hypertension population and provide

  17. Act-Frequency Signatures of the Big Five.

    Science.gov (United States)

    Chapman, Benjamin P; Goldberg, Lewis R

    2017-10-01

    The traditional focus of work on personality and behavior has tended toward "major outcomes" such as health or antisocial behavior, or small sets of behaviors observable over short periods in laboratories or in convenience samples. In a community sample, we examined a wide set (400) of mundane, incidental or "every day" behavioral acts, the frequencies of which were reported over the past year. Using an exploratory methodology similar to genomic approaches (relying on the False Discovery Rate) revealed 26 prototypical acts for Intellect, 24 acts for Extraversion, 13 for Emotional Stability, nine for Conscientiousness, and six for Agreeableness. Many links were consistent with general intuition; for instance, low Conscientiousness with work and procrastination. Some of the most robust associations, however, were for acts too specific for a priori hypotheses. For instance, Extraversion was strongly associated with telling dirty jokes, Intellect with "loung[ing] around [the] house without clothes on", and Agreeableness with singing in the shower. Frequency categories for these acts changed with marked non-linearity across Big Five Z-scores. Findings may help ground trait scores in emblematic acts, and enrich understanding of mundane or common behavioral signatures of the Big Five.
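The False Discovery Rate screening mentioned above is most commonly the Benjamini-Hochberg step-up procedure; a self-contained sketch of it (the p-values below are invented):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Return indices of hypotheses rejected at false-discovery rate q,
    using the Benjamini-Hochberg step-up procedure."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        # Compare the rank-th smallest p-value to its step-up threshold.
        if pvalues[i] <= q * rank / m:
            k_max = rank
    return sorted(order[:k_max])  # reject all hypotheses up to the largest passing rank

pvals = [0.01, 0.20, 0.03, 0.02, 0.50]
print(benjamini_hochberg(pvals))  # → [0, 2, 3]
```

Applied to hundreds of act-trait correlations, this controls the expected fraction of false "prototypical acts" rather than the chance of any single false positive, which is why it suits genomics-style screens.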

  18. Risk assessment under deep uncertainty: A methodological comparison

    International Nuclear Information System (INIS)

    Shortridge, Julie; Aven, Terje; Guikema, Seth

    2017-01-01

    Probabilistic Risk Assessment (PRA) has proven to be an invaluable tool for evaluating risks in complex engineered systems. However, there is increasing concern that PRA may not be adequate in situations with little underlying knowledge to support probabilistic representation of uncertainties. As analysts and policy makers turn their attention to deeply uncertain hazards such as climate change, a number of alternatives to traditional PRA have been proposed. This paper systematically compares three diverse approaches for risk analysis under deep uncertainty (qualitative uncertainty factors, probability bounds, and robust decision making) in terms of their representation of uncertain quantities, analytical output, and implications for risk management. A simple example problem is used to highlight differences in the way that each method relates to the traditional risk assessment process and fundamental issues associated with risk assessment and description. We find that the implications for decision making are not necessarily consistent between approaches, and that differences in the representation of uncertain quantities and analytical output suggest contexts in which each method may be most appropriate. Finally, each methodology demonstrates how risk assessment can inform decision making in deeply uncertain contexts, informing more effective responses to risk problems characterized by deep uncertainty. - Highlights: • We compare three diverse approaches to risk assessment under deep uncertainty. • A simple example problem highlights differences in analytical process and results. • Results demonstrate how methodological choices can impact risk assessment results.
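Of the three approaches compared, probability bounds can be illustrated with a minimal interval calculation (all numbers illustrative): instead of a single probability, the analyst carries bounds, and expected loss inherits bounds rather than a point value.

```python
def interval_risk(p_low, p_high, c_low, c_high):
    """Propagate interval bounds on probability and consequence to bounds
    on expected loss. Because the product is monotone in both arguments,
    evaluating the endpoints suffices."""
    assert 0 <= p_low <= p_high <= 1 and c_low <= c_high
    return (p_low * c_low, p_high * c_high)

# Deeply uncertain hazard: probability somewhere in [0.01, 0.10],
# consequence somewhere in [5, 20] loss units.
lo, hi = interval_risk(0.01, 0.10, 5.0, 20.0)
print(f"expected loss in [{lo:.2f}, {hi:.2f}]")
```

The wide output interval is the point of the method: it refuses to manufacture precision that the underlying knowledge cannot support, which is where its decision implications diverge from a single-number PRA.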

  19. A bibliographical survey of bruxism with special emphasis on non-traditional treatment modalities.

    Science.gov (United States)

    Nissani, M

    2001-06-01

    After proposing a common-sense definition of bruxism, this partial review distills its various symptoms and consequences from the literature. That literature suggests that the splint, the most popular treatment modality, falls short in some respects. The research literature is even less sanguine about the efficacy of such other traditional therapies as sound alarms and stress reduction. Given the limited success of traditional approaches, and given, moreover, the high incidence of bruxism and its harmful consequences, clinicians may occasionally be interested in experimenting with non-intrusive, safe, less widely known treatment modalities. To meet this need, this review, unlike other reviews of the subject, focuses on such comparatively unpopular or recent approaches.

  20. Long non-coding RNAs as novel expression signatures modulate DNA damage and repair in cadmium toxicology

    Science.gov (United States)

    Zhou, Zhiheng; Liu, Haibai; Wang, Caixia; Lu, Qian; Huang, Qinhai; Zheng, Chanjiao; Lei, Yixiong

    2015-10-01

    Increasing evidence suggests that long non-coding RNAs (lncRNAs) are involved in a variety of physiological and pathophysiological processes. Our study investigated whether lncRNAs, as novel expression signatures, are able to modulate DNA damage and repair in cadmium (Cd) toxicity. There were aberrant expression profiles of lncRNAs in 35th Cd-induced cells as compared to untreated 16HBE cells. siRNA-mediated knockdown of ENST00000414355 inhibited the growth of DNA-damaged cells and decreased the expression of DNA-damage-related genes (ATM, ATR and ATRIP), while it increased the expression of DNA-repair-related genes (DDB1, DDB2, OGG1, ERCC1, MSH2, RAD50, XRCC1 and BARD1). Cadmium increased ENST00000414355 expression in the lung of Cd-exposed rats in a dose-dependent manner. A significant positive correlation was observed between blood ENST00000414355 expression and urinary/blood Cd concentrations, and there were significant correlations of lncRNA-ENST00000414355 expression with the expression of target genes in the lung of Cd-exposed rats and the blood of Cd-exposed workers. These results indicate that some lncRNAs are aberrantly expressed in Cd-treated 16HBE cells. lncRNA-ENST00000414355 may serve as a signature for DNA damage and repair related to the epigenetic mechanisms underlying cadmium toxicity and become a novel biomarker of cadmium toxicity.

  1. Regime-dependent forecast uncertainty of convective precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Keil, Christian; Craig, George C. [Muenchen Univ. (Germany). Meteorologisches Inst.

    2011-04-15

    Forecast uncertainty of convective precipitation is influenced by all scales, but in different ways in different meteorological situations. Forecasts of the high-resolution ensemble prediction system COSMO-DE-EPS of Deutscher Wetterdienst (DWD) are used to examine the dominant sources of uncertainty of convective precipitation. A validation with radar data using traditional as well as spatial verification measures highlights differences in precipitation forecast performance in differing weather regimes. When the forecast uncertainty is primarily associated with local, small-scale processes, individual members run with the same variation of the physical parameterisation but driven by different global models outperform all other ensemble members. In contrast, when the precipitation is governed by the large-scale flow, all ensemble members perform similarly. Application of the convective adjustment time scale confirms this separation and shows a regime-dependent forecast uncertainty of convective precipitation. (orig.)

  2. A simple but highly effective approach to evaluate the prognostic performance of gene expression signatures.

    Directory of Open Access Journals (Sweden)

    Maud H W Starmans

    Full Text Available BACKGROUND: Highly parallel analysis of gene expression has recently been used to identify gene sets or 'signatures' to improve patient diagnosis and risk stratification. Once a signature is generated, traditional statistical testing is used to evaluate its prognostic performance. However, due to the dimensionality of microarrays, this can lead to false interpretation of these signatures. PRINCIPAL FINDINGS: A method was developed to test batches of a user-specified number of randomly chosen signatures in patient microarray datasets. The percentage of randomly generated signatures yielding prognostic value was assessed using ROC analysis by calculating the area under the curve (AUC in six publicly available cancer patient microarray datasets. We found that a signature consisting of randomly selected genes has an average 10% chance of reaching significance when assessed in a single dataset, but can range from 1% to ∼40% depending on the dataset in question. Increasing the number of validation datasets markedly reduces this number. CONCLUSIONS: We have shown that the use of an arbitrary cut-off value for evaluation of signature significance is not suitable for this type of research, but should be defined for each dataset separately. Our method can be used to establish and evaluate signature performance of any derived gene signature in a dataset by comparing its performance to thousands of randomly generated signatures. It will be of most interest for cases where few data are available and testing in multiple datasets is limited.
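The random-signature benchmark described above can be sketched in a few lines. Everything here is synthetic and illustrative: the expression matrix is random noise, the signature score is a naive gene-mean, and the 0.60 AUC cut-off is arbitrary; a real analysis would substitute patient data and the study's own scoring and significance rules.

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, labels):
    """Rank-based AUC (equivalent to the Mann-Whitney U statistic)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical dataset: 200 patients x 5000 genes, random binary outcomes.
expr = rng.normal(size=(200, 5000))
outcome = rng.integers(0, 2, size=200)

n_random, sig_size = 1000, 50
hits = 0
for _ in range(n_random):
    genes = rng.choice(expr.shape[1], size=sig_size, replace=False)
    score = expr[:, genes].mean(axis=1)  # naive signature score per patient
    a = auc(score, outcome)
    if max(a, 1 - a) > 0.60:             # arbitrary "prognostic" cut-off
        hits += 1
print(f"{100 * hits / n_random:.1f}% of random signatures pass the cut-off")
```

The fraction of passing random signatures is the dataset-specific null rate the authors argue should replace a fixed significance cut-off.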

  3. Sparse grid-based polynomial chaos expansion for aerodynamics of an airfoil with uncertainties

    Directory of Open Access Journals (Sweden)

    Xiaojing WU

    2018-05-01

    Full Text Available Uncertainties generate fluctuations in aerodynamic characteristics. Uncertainty Quantification (UQ) is applied to compute their impact on the aerodynamic characteristics. In addition, the contribution of each uncertainty to the aerodynamic characteristics should be computed by uncertainty sensitivity analysis. Non-Intrusive Polynomial Chaos (NIPC) has been successfully applied to uncertainty quantification and uncertainty sensitivity analysis. However, the non-intrusive polynomial chaos method becomes inefficient as the number of random variables adopted to describe uncertainties increases. This deficiency becomes significant in stochastic aerodynamic analysis considering geometric uncertainty, because the description of geometric uncertainty generally needs many parameters. To address this deficiency, a Sparse Grid-based Polynomial Chaos (SGPC) expansion is used to perform uncertainty quantification and sensitivity analysis for stochastic aerodynamic analysis considering geometric and operational uncertainties. It is proved that the method is more efficient than the non-intrusive polynomial chaos and Monte Carlo Simulation (MCS) methods for stochastic aerodynamic analysis. By uncertainty quantification, it can be learnt that the flow characteristics of shock wave and boundary layer separation are sensitive to the geometric uncertainty in the transonic region. The uncertainty sensitivity analysis reveals the individual and coupled effects among the uncertainty parameters. Keywords: Non-intrusive polynomial chaos, Sparse grid, Stochastic aerodynamic analysis, Uncertainty sensitivity analysis, Uncertainty quantification
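The essence of non-intrusive methods is that the deterministic model is only evaluated at chosen sample points. A one-dimensional sketch using probabilists' Gauss-Hermite quadrature for a standard-normal input (a real study would use a sparse grid over many variables and call a CFD solver instead of the toy model `f`):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def moments(f, n_nodes=8):
    """Non-intrusive mean/variance of y = f(x) for x ~ N(0, 1):
    the model f is evaluated only at the quadrature nodes."""
    x, w = hermegauss(n_nodes)   # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2 * np.pi)   # normalize weights to the Gaussian density
    y = f(x)
    mean = np.sum(w * y)
    var = np.sum(w * (y - mean) ** 2)
    return mean, var

# Toy model standing in for an aerodynamic response surface.
mean, var = moments(lambda x: x**2)
print(mean, var)  # exact values for x^2 under N(0,1): mean 1, variance 2
```

With 8 nodes the quadrature is exact for polynomials up to degree 15, which is why only a handful of model runs recover the moments exactly here; sparse grids extend the same idea to many random variables at far fewer points than a full tensor grid.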

  4. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition

  5. Non-traditional neutron activation analysis by use of a nuclear reactor

    International Nuclear Information System (INIS)

    Mukhammedov, S.

    2003-01-01

    Full text: Traditional reactor neutron activation analysis (NAA), based on the (n, γ) thermal neutron capture reaction, has developed into a reliable and powerful analytical method for trace element analysis, allowing the determination of over 60 chemical elements with good accuracy and low detection limits. Considering all possibilities of activation and radiochemical separation of the indicator radionuclide, the majority of the elements of this group can be determined at the ppm concentration level and below. However, for a number of analytical problems the NAA technique is not well suited or cannot be used at all. An important limitation is that the light elements and some medium and heavy elements cannot be determined even at the ppm concentration level by this method, for example H, Be, Li, B, C, N, O, Ti, Nb and Pb. Accurate determination of lithium, oxygen and other light elements at the sub-microgram level is important in geochemical and material studies. Such examples are numerous. In such instances, several non-traditional reactor activation techniques can be used, which have increasingly been developed and applied in the semiconductor industry, biology and geology in recent years. The purpose of this presentation is to review the modern status of non-traditional nuclear reactor activation analysis based on nuclear reactions excited by flows of secondary charged particles, which are produced by two methods. In the first method, a triton flow is produced by the thermal neutron flux, which excites the nuclear reaction 6 Li(n, α)T on lithium. Neutron activation analysis based on the two consecutive reactions 6 Li(n, α)T + 16 O(T, n) 18 F has been established to determine trace amounts of either lithium or oxygen in different geological, ecological and technological samples. In addition, the triton flow can be used for the determination of other light elements, for instance B, N, S and Mg. This nuclear reactor triton activation

  6. An Innovative System for the Efficient and Effective Treatment of Non-Traditional Waters for Reuse in Thermoelectric Power Generation

    Energy Technology Data Exchange (ETDEWEB)

    John Rodgers; James Castle

    2008-08-31

    This study assessed opportunities for improving water quality associated with coal-fired power generation including the use of non-traditional waters for cooling, innovative technology for recovering and reusing water within power plants, novel approaches for the removal of trace inorganic compounds from ash pond effluents, and novel approaches for removing biocides from cooling tower blowdown. This research evaluated specifically designed pilot-scale constructed wetland systems for treatment of targeted constituents in non-traditional waters for reuse in thermoelectric power generation and other purposes. The overall objective of this project was to decrease targeted constituents in non-traditional waters to achieve reuse criteria or discharge limitations established by the National Pollutant Discharge Elimination System (NPDES) and Clean Water Act (CWA). The six original project objectives were completed, and results are presented in this final technical report. These objectives included identification of targeted constituents for treatment in four non-traditional water sources, determination of reuse or discharge criteria for treatment, design of constructed wetland treatment systems for these non-traditional waters, and measurement of treatment of targeted constituents in non-traditional waters, as well as determination of the suitability of the treated non-traditional waters for reuse or discharge to receiving aquatic systems. The four non-traditional waters used to accomplish these objectives were ash basin water, cooling water, flue gas desulfurization (FGD) water, and produced water. The contaminants of concern identified in ash basin waters were arsenic, chromium, copper, mercury, selenium, and zinc. Contaminants of concern in cooling waters included free oxidants (chlorine, bromine, and peroxides), copper, lead, zinc, pH, and total dissolved solids. FGD waters contained contaminants of concern including arsenic, boron, chlorides, selenium, mercury

  7. SIGNATURE: A workbench for gene expression signature analysis

    Directory of Open Access Journals (Sweden)

    Chang Jeffrey T

    2011-11-01

    Full Text Available Abstract Background The biological phenotype of a cell, such as a characteristic visual image or behavior, reflects activities derived from the expression of collections of genes. As such, an ability to measure the expression of these genes provides an opportunity to develop more precise and varied sets of phenotypes. However, to use this approach requires computational methods that are difficult to implement and apply, and thus there is a critical need for intelligent software tools that can reduce the technical burden of the analysis. Tools for gene expression analyses are unusually difficult to implement in a user-friendly way because their application requires a combination of biological data curation, statistical computational methods, and database expertise. Results We have developed SIGNATURE, a web-based resource that simplifies gene expression signature analysis by providing software, data, and protocols to perform the analysis successfully. This resource uses Bayesian methods for processing gene expression data coupled with a curated database of gene expression signatures, all carried out within a GenePattern web interface for easy use and access. Conclusions SIGNATURE is available for public use at http://genepattern.genome.duke.edu/signature/.

  8. A Sensitive and Specific Neural Signature for Picture-Induced Negative Affect.

    Directory of Open Access Journals (Sweden)

    Luke J Chang

    2015-06-01

    Full Text Available Neuroimaging has identified many correlates of emotion but has not yet yielded brain representations predictive of the intensity of emotional experiences in individuals. We used machine learning to identify a sensitive and specific signature of emotional responses to aversive images. This signature predicted the intensity of negative emotion in individual participants in cross-validation (n = 121) and test (n = 61) samples (high versus low emotion = 93.5% accuracy). It was unresponsive to physical pain (emotion versus pain = 92% discriminative accuracy), demonstrating that it is not a representation of generalized arousal or salience. The signature comprised mesoscale patterns spanning multiple cortical and subcortical systems, with no single system necessary or sufficient for predicting experience. Furthermore, it was not reducible to activity in traditional "emotion-related" regions (e.g., amygdala, insula) or resting-state networks (e.g., "salience", "default mode"). Overall, this work identifies differentiable neural components of negative emotion and pain, providing a basis for new, brain-based taxonomies of affective processes.

  9. 1.5 °C carbon budget dependent on carbon cycle uncertainty and future non-CO2 forcing.

    Science.gov (United States)

    Mengis, Nadine; Partanen, Antti-Ilari; Jalbert, Jonathan; Matthews, H Damon

    2018-04-11

    Estimates of the 1.5 °C carbon budget vary widely among recent studies, emphasizing the need to better understand and quantify key sources of uncertainty. Here we quantify the impact of carbon cycle uncertainty and non-CO2 forcing on the 1.5 °C carbon budget in the context of a prescribed 1.5 °C temperature stabilization scenario. We use Bayes' theorem to weight members of a perturbed parameter ensemble with varying land and ocean carbon uptake, to derive an estimate for the fossil fuel (FF) carbon budget of 469 PgC since 1850, with a 95% likelihood range of 411-528 PgC. CO2 emissions from land-use change (LUC) add about 230 PgC. Our best estimate of the total (FF + LUC) carbon budget for 1.5 °C is therefore 699 PgC, which corresponds to about 11 years of current emissions. Non-CO2 greenhouse gas and aerosol emissions represent equivalent cumulative CO2 emissions of about 510 PgC and -180 PgC for 1.5 °C, respectively. The increased LUC, high non-CO2 emissions and decreased aerosols in our scenario cause the long-term FF carbon budget to decrease following temperature stabilization. In this scenario, negative emissions would be required to compensate not only for the increasing non-CO2 climate forcing, but also for the declining natural carbon sinks.
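    The Bayesian weighting step described in this abstract can be sketched in a few lines: ensemble members are weighted by how well their simulated historical warming matches an observational constraint, and the weights then give a posterior estimate of the carbon budget. All numbers below are illustrative placeholders, not values from the study.

```python
import math

# Hypothetical ensemble: (simulated historical warming in K, FF carbon budget in PgC).
ensemble = [(0.9, 420), (1.0, 450), (1.1, 470), (1.2, 500), (1.3, 530)]

obs, sigma = 1.1, 0.1  # assumed observed warming and its standard uncertainty

# Bayes' theorem with a flat prior: weight each member by the Gaussian
# likelihood of the observation given that member's simulated warming.
weights = [math.exp(-0.5 * ((t - obs) / sigma) ** 2) for t, _ in ensemble]
total = sum(weights)
weights = [w / total for w in weights]

# Posterior-weighted best estimate of the carbon budget.
budget = sum(w * b for w, (_, b) in zip(weights, ensemble))
```

    The same weights would also yield the 95% likelihood range, e.g. from the weighted empirical distribution of the ensemble budgets.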

  10. Comparative Analysis of Music Recordings from Western and Non-Western traditions by Automatic Tonal Feature Extraction

    Directory of Open Access Journals (Sweden)

    Emilia Gómez

    2008-09-01

    Full Text Available The automatic analysis of large musical corpora by means of computational models overcomes some limitations of manual analysis, and the unavailability of scores for most existing music makes it necessary to work with audio recordings. Until now, research in this area has focused on music from the Western tradition. Nevertheless, we might ask whether the available methods are suitable for analyzing music from other cultures. We present an empirical approach to the comparative analysis of audio recordings, focusing on tonal features and data mining techniques. Tonal features are related to the pitch class distribution, pitch range, and the employed scale, gamut and tuning system. We provide our initial but promising results obtained when trying to automatically distinguish music from Western and non-Western traditions; we analyze which descriptors are most relevant and study their distribution over 1500 pieces from different traditions and styles. As a result, some feature distributions differ for Western and non-Western music, and the obtained classification accuracy is higher than 80% for different classification algorithms and an independent test set. These results show that automatic description of audio signals together with data mining techniques provides the means to characterize huge music collections from different traditions and complements musicological manual analyses.
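    As a toy illustration of one such tonal feature, a pitch-class distribution can be computed as a normalized 12-bin histogram. This is a minimal sketch over symbolic MIDI note numbers; the study's actual descriptors are extracted from audio.

```python
from collections import Counter

def pitch_class_distribution(midi_notes):
    """Normalized 12-bin pitch-class histogram (C=0 ... B=11)."""
    counts = Counter(n % 12 for n in midi_notes)
    total = len(midi_notes)
    return [counts.get(pc, 0) / total for pc in range(12)]

# A C-major triad across two octaves: energy concentrates on
# pitch classes 0 (C), 4 (E), and 7 (G).
dist = pitch_class_distribution([60, 64, 67, 72, 76, 79])
```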

  11. Motivational Orientations of Non-Traditional Adult Students to Enroll in a Degree-Seeking Program

    Science.gov (United States)

    Francois, Emmanuel Jean

    2014-01-01

    The purpose of this research was to investigate the motivational orientations of non-traditional adult students to enroll in a degree-seeking program based on their academic goal. The Education Participation Scale (EPS) was used to measure the motivational orientations of participants. Professional advancement, cognitive interest, and educational…

  12. The Long and Winding Road: Grades, Psychological Disengagement and Motivation among Female Students in (Non-)Traditional Career Paths

    Science.gov (United States)

    Rinfret, Natalie; Tougas, Francine; Beaton, Ann M.; Laplante, Joelle; Ngo Manguelle, Christiane; Lagacé, Marie Claude

    2014-01-01

    The purpose of this study was to evaluate the links between grades, psychological disengagement mechanisms (discounting evaluative feedback and devaluing school), and motivation among female students in traditional and non-traditional career paths. We predicted that the association between grades and discounting is affected by the importance of…

  13. Differences in traditional and non-traditional risk factors with special reference to nutritional factors in patients with coronary artery disease with or without diabetes mellitus

    Directory of Open Access Journals (Sweden)

    Namita P Mahalle

    2013-01-01

    Full Text Available Introduction: There is an increasing awareness of the role of nutritional factors in chronic non-communicable diseases. We therefore conducted this study with the aim of assessing the relationship between nutritional factors (vitamin B12 and homocysteine [Hcy]) and their association with insulin resistance and inflammatory markers, and the differences in traditional and non-traditional risk factors among diabetics and non-diabetics in known cases of coronary artery disease (CAD). Materials and Methods: Three hundred consecutive patients with known coronary disease on coronary angiography, who were >25 years old, were included in this study. All cases were interviewed using a questionnaire. Blood samples were analyzed for insulin, vitamin B12, Hcy and inflammatory markers (highly sensitive C-reactive protein [hsCRP], interleukin-6 [IL-6], tumor necrosis factor-alpha [TNF-α]). Insulin resistance was calculated with the homeostasis model assessment of insulin resistance (HOMA-IR). Results: Mean age of the patients was 60.95 ± 12.3 years. Body mass index and waist-hip ratio were comparable in both groups. Triglyceride, very low-density lipoprotein and HbA1C were significantly higher, and high-density lipoprotein (HDL) significantly lower, in patients with diabetes. Patients with diabetes had significantly higher levels of IL-6, hsCRP and TNF-α compared with non-diabetic patients. Insulin resistance was twofold higher in diabetic patients. Serum vitamin B12 levels were significantly lower and Hcy significantly higher in the diabetic group compared with the non-diabetic patients. HbA1C, HOMA-IR and Hcy levels were positively correlated with inflammatory markers in the total study population and in the non-diabetic patients; in diabetic patients, only HbA1C and Hcy showed this relation. Conclusions: Vitamin B12 deficiency is common in the diabetic population. Hcy levels were higher in diabetics compared with non-diabetics, and were related to glycemic level and

  14. Mitigating Provider Uncertainty in Service Provision Contracts

    Science.gov (United States)

    Smith, Chris; van Moorsel, Aad

    Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in his ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.

  15. The uncertainties calculation of acoustic method for measurement of dissipative properties of heterogeneous non-metallic materials

    Directory of Open Access Journals (Sweden)

    Мaryna O. Golofeyeva

    2015-12-01

    Full Text Available The effective use of heterogeneous non-metallic materials and structures requires reliable measured values of their dissipation characteristics, as well as of the main factors governing their change under loading. Aim: The aim of this study is to compile the measurement uncertainty budget for the dissipative properties of composite materials. Materials and Methods: The method for studying vibrational energy dissipation, based on the coupling between the vibration damping decrement and the acoustic velocity in a non-metallic heterogeneous material, is reviewed. The method allows the dependence of damping on vibration amplitude and on the frequency of the strain-stress state of the material to be determined. Results: The accuracy of the measurement method was investigated for the determination of the vibration damping decrement in synthegran. The international approach to evaluating measurement quality was used, comprising the internationally accepted rules for expressing and combining uncertainties; these rules serve as an internationally acknowledged measure of confidence in measurement results, including testing. The uncertainty budget of the acoustic method for measuring the dissipative properties of materials was compiled. Conclusions: Two groups of error sources in the measurement of dissipative properties were identified. The first group comprises variation of the calibrated impact parameters within tolerance limits, displacement of the sensor on repeated placement at the measurement point, variation of the contact-agent layer thickness caused by uneven pressing of the transducers against the control surface, reading inaccuracy, etc. The second group is linked with errors in measuring the density and Poisson's ratio, the distance between the sensors, and the time difference between the signals of the vibroacoustic sensors.
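    The internationally accepted combination rule mentioned above (the GUM approach) sums uncorrelated standard uncertainty contributions in quadrature and then applies a coverage factor. A minimal sketch, with component magnitudes that are purely illustrative:

```python
import math

# Hypothetical standard uncertainty contributions (same unit), one per error
# source named in the abstract; values are illustrative only.
components = {
    "impact parameter variation": 0.8,
    "sensor repositioning":       0.5,
    "contact-layer thickness":    0.6,
    "reading inaccuracy":         0.3,
    "density measurement":        0.4,
}

# GUM: combined standard uncertainty is the root-sum-of-squares of
# uncorrelated contributions; expanded uncertainty uses k = 2 (~95% coverage).
u_combined = math.sqrt(sum(u * u for u in components.values()))
U_expanded = 2 * u_combined
```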

  16. Trace element ink spiking for signature authentication

    International Nuclear Information System (INIS)

    Hatzistavros, V.S.; Kallithrakas-Kontos, N.G.

    2008-01-01

    Signature authentication is a critical question in forensic document examination. In recent years the evolution of personal computers has made signature copying a quite easy task, so the development of new ways to authenticate signatures is crucial. In the present work a commercial ink was spiked with many trace elements in various concentrations. Inorganic and organometallic ink-soluble compounds were used as spiking agents, while the ink retained its initial properties. The spiked inks were used for writing on paper, and the documents were analyzed by a non-destructive method, energy dispersive X-ray fluorescence. The thin-target model was shown to be valid for quantitative analysis, and a very good linear relationship of intensity (X-ray signal) against concentration was obtained for all elements used. Intensity ratios between different elements in the same ink gave very stable results, independent of writing alterations. The impact of time on both the written documents and the prepared inks was also investigated. (author)
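    A small numerical sketch of why intensity ratios are stable under the thin-target model: each element's X-ray signal scales with both its concentration and the amount of ink deposited, so the deposition factor cancels in the ratio. Sensitivities and concentrations below are hypothetical.

```python
def intensity(k, conc, mass_factor):
    # Thin-target model (sketch): X-ray intensity is proportional to the
    # element concentration and to the amount of ink deposited.
    return k * conc * mass_factor

# Two spiking elements in the same ink (hypothetical sensitivity k, concentration c).
k_a, c_a = 2.5, 100.0   # element A
k_b, c_b = 1.2, 250.0   # element B

# Writing pressure / line thickness changes the deposited ink mass, scaling
# both signals equally, so the A/B intensity ratio is unchanged.
ratios = [intensity(k_a, c_a, m) / intensity(k_b, c_b, m) for m in (0.5, 1.0, 2.0)]
```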

  17. Using snowflake surface-area-to-volume ratio to model and interpret snowfall triple-frequency radar signatures

    Directory of Open Access Journals (Sweden)

    M. Gergely

    2017-10-01

    Full Text Available The snowflake microstructure determines the microwave scattering properties of individual snowflakes and has a strong impact on snowfall radar signatures. In this study, individual snowflakes are represented by collections of randomly distributed ice spheres where the size and number of the constituent ice spheres are specified by the snowflake mass and surface-area-to-volume ratio (SAV), and the bounding volume of each ice sphere collection is given by the snowflake maximum dimension. Radar backscatter cross sections for the ice sphere collections are calculated at X-, Ku-, Ka-, and W-band frequencies and then used to model triple-frequency radar signatures for exponential snowflake size distributions (SSDs). Additionally, snowflake complexity values obtained from high-resolution multi-view snowflake images are used as an indicator of snowflake SAV to derive snowfall triple-frequency radar signatures. The modeled snowfall triple-frequency radar signatures cover a wide range of triple-frequency signatures that were previously determined from radar reflectivity measurements and illustrate characteristic differences related to snow type, quantified through snowflake SAV, and snowflake size. The results show high sensitivity to snowflake SAV and SSD maximum size but are generally less affected by uncertainties in the parameterization of snowflake mass, indicating the importance of snowflake SAV for the interpretation of snowfall triple-frequency radar signatures.

  18. lncRNA Gene Signatures for Prediction of Breast Cancer Intrinsic Subtypes and Prognosis

    Directory of Open Access Journals (Sweden)

    Silu Zhang

    2018-01-01

    Full Text Available Background: Breast cancer is intrinsically heterogeneous and is commonly classified into four main subtypes associated with distinct biological features and clinical outcomes. However, currently available data resources and methods for molecular subtyping are limited to protein-coding genes, and little is known about the roles of long non-coding RNAs (lncRNAs), which are transcribed from the non-coding portion occupying 98% of the whole genome. lncRNAs may also play important roles in subgrouping cancer patients and are associated with clinical phenotypes. Methods: The purpose of this project was to identify lncRNA gene signatures that are associated with breast cancer subtypes and clinical outcomes. We identified lncRNA gene signatures from The Cancer Genome Atlas (TCGA) RNA-seq data that are associated with breast cancer subtypes by an optimized 1-Norm SVM feature selection algorithm. We evaluated the prognostic performance of these gene signatures with a semi-supervised principal component (superPC) method. Results: Although lncRNAs can independently predict breast cancer subtypes with satisfactory accuracy, a combined gene signature including both coding and non-coding genes gives the best clinically relevant prediction performance. We highlighted eight potential biomarkers (three from coding genes and five from non-coding genes) that are significantly associated with survival outcomes. Conclusion: Our proposed methods are a novel means of identifying subtype-specific coding and non-coding potential biomarkers that are both clinically relevant and biologically significant.

  19. Compromise decision support problems for hierarchical design involving uncertainty

    Science.gov (United States)

    Vadde, S.; Allen, J. K.; Mistree, F.

    1994-08-01

    In this paper an extension to the traditional compromise Decision Support Problem (DSP) formulation is presented. Bayesian statistics is used in the formulation to model uncertainties associated with the information being used. In an earlier paper a compromise DSP that accounts for uncertainty using fuzzy set theory was introduced. The Bayesian Decision Support Problem is described in this paper. The method for hierarchical design is demonstrated by using this formulation to design a portal frame. The results are discussed and comparisons are made with those obtained using the fuzzy DSP. Finally, the efficacy of incorporating Bayesian statistics into the traditional compromise DSP formulation is discussed and some pending research issues are described. Our emphasis in this paper is on the method rather than the results per se.

  20. Cardiometabolic Risks in Polycystic Ovary Syndrome: Non-Traditional Risk Factors and the Impact of Obesity.

    Science.gov (United States)

    Chiu, Wei-Ling; Boyle, Jacqueline; Vincent, Amanda; Teede, Helena; Moran, Lisa J

    2017-01-01

    Polycystic ovary syndrome (PCOS) is a common and complex endocrinopathy with reproductive, metabolic, and psychological features and significantly increased cardiometabolic risks. PCOS is underpinned by inherent insulin resistance and hyperandrogenism. Obesity, more common in PCOS, plays an important role in the pathophysiology, exacerbating hyperinsulinaemia and hyperandrogenism, leading to recommended first-line lifestyle intervention. Significant traditional and non-traditional risk factors are implicated in PCOS in addition to obesity-exacerbated cardiometabolic risks and are explored in this review to promote the understanding of this common metabolic and reproductive condition. © 2016 S. Karger AG, Basel.

  1. Non-Traditional Students and Critical Pedagogy: Transformative Practice and the Teaching of Criminal Law

    Science.gov (United States)

    Menis, Susanna

    2017-01-01

    This article explores the practical implication of adopting critical pedagogy, and more specifically critical legal pedagogy, in the teaching of non-traditional students in higher education context. It is based on the teaching of criminal law at Birkbeck School of Law, addressing learning tasks which have been designed to enhance students'…

  2. Using Virtual Reality for Task-Based Exercises in Teaching Non-Traditional Students of German

    Science.gov (United States)

    Libbon, Stephanie

    2004-01-01

    Using task-based exercises that required web searches and online activities, this course introduced non-traditional students to the sights and sounds of the German culture and language and simultaneously to computer technology. Through partner work that required negotiation of the net as well as of the language, these adult beginning German…

  3. A bit of both science and economics: a non-traditional STEM identity narrative

    Science.gov (United States)

    Mark, Sheron L.

    2017-10-01

    Black males, as one non-dominant population, remain underrepresented and less successful in science, technology, engineering, and mathematics (STEM). Researchers focused on non-dominant populations are advised against generalizations and to examine cultural intersections (i.e. race, ethnicity, gender, and more) and also to explore cases of success, in addition to cases of under-achievement and underrepresentation. This study has focused on one African American male, Randy, who expressed high-achieving STEM career goals in computer science and engineering. Furthermore, recognizing that culture and identity development underlie STEM engagement and persistence, this long-term case study focused on how Randy developed a STEM identity during the course of the study and the implications of that process for his STEM career exploration. Étienne Wenger's (1999) communities-of-practice (CoP) was employed as a theoretical framework and, in doing so, (1) the informal STEM program in which Randy participated was characterized as a STEM-for-social-justice CoP and (2) Randy participated in ways that consistently utilized an "economics" lens from beyond the boundaries of the CoP. In doing so, Randy functioned as a broker within the CoP and developed a non-traditional STEM identity-in-practice which integrated STEM, "economics", and community engagement. Randy's STEM identity-in-practice is discussed in terms of the contextual factors that support scientific identity development (Hazari et al. in J Res Sci Teach 47:978-1003, 2010), the importance of recognizing and supporting the development of holistic and non-traditional STEM identities, especially for diverse populations in STEM, and the implications of this new understanding of Randy's STEM identity for his long-term STEM career exploration.

  4. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    Science.gov (United States)

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
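    The BMA-CC idea of evaluating design reliability across several weighted geological models can be sketched with a small Monte Carlo simulation. The model weights and per-model success probabilities below are invented for illustration; note how averaging over models lowers the reliability relative to trusting the single best model, which is the overestimation effect the study reports for traditional CC programming.

```python
import random

random.seed(0)

# Hypothetical BMA: three hydrostratigraphic models with posterior weights,
# each implying a different probability that the hydraulic barrier design
# meets its constraint (e.g., keeps salt water out of the supply wells).
models = [
    {"weight": 0.5, "success_prob": 0.95},  # "best" model
    {"weight": 0.3, "success_prob": 0.85},
    {"weight": 0.2, "success_prob": 0.70},
]

def bma_reliability(models, n=100_000):
    """Model-averaged reliability: draw a model by its BMA weight, then
    simulate success/failure of the design under that model."""
    successes = 0
    for _ in range(n):
        r, acc = random.random(), 0.0
        for m in models:
            acc += m["weight"]
            if r < acc:
                break
        # m is the drawn model (falls through to the last model on round-off)
        successes += random.random() < m["success_prob"]
    return successes / n

rel = bma_reliability(models)
# Analytic value: 0.5*0.95 + 0.3*0.85 + 0.2*0.70 = 0.87, below the 0.95
# that the single best model alone would report.
```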

  5. A genomic copy number signature predicts radiation exposure in post-Chernobyl breast cancer.

    Science.gov (United States)

    Wilke, Christina M; Braselmann, Herbert; Hess, Julia; Klymenko, Sergiy V; Chumak, Vadim V; Zakhartseva, Liubov M; Bakhanova, Elena V; Walch, Axel K; Selmansberger, Martin; Samaga, Daniel; Weber, Peter; Schneider, Ludmila; Fend, Falko; Bösmüller, Hans C; Zitzelsberger, Horst; Unger, Kristian

    2018-04-16

    Breast cancer is the second leading cause of cancer death among women worldwide and besides life style, age and genetic risk factors, exposure to ionizing radiation is known to increase the risk for breast cancer. Further, DNA copy number alterations (CNAs), which can result from radiation-induced double-strand breaks, are frequently occurring in breast cancer cells. We set out to identify a signature of CNAs discriminating breast cancers from radiation-exposed and non-exposed female patients. We analyzed resected breast cancer tissues from 68 exposed female Chernobyl clean-up workers and evacuees and 68 matched non-exposed control patients for CNAs by array comparative genomic hybridization analysis (aCGH). Using a stepwise forward-backward selection approach a non-complex CNA signature, that is, less than ten features, was identified in the training data set, which could be subsequently validated in the validation data set (p value < 0.05). The signature consisted of nine copy number regions located on chromosomal bands 7q11.22-11.23, 7q21.3, 16q24.3, 17q21.31, 20p11.23-11.21, 1p21.1, 2q35, 2q35, 6p22.2. The signature was independent of any clinical characteristics of the patients. In all, we identified a CNA signature that has the potential to allow identification of radiation-associated breast cancer at the individual level. © 2018 UICC.

  6. Predictive uncertainty in auditory sequence processing

    Directory of Open Access Journals (Sweden)

    Niels Chr. Hansen

    2014-09-01

    Full Text Available Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty - a property of listeners’ prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners’ perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.

  7. Predictive uncertainty in auditory sequence processing.

    Science.gov (United States)

    Hansen, Niels Chr; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty-a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.
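    Shannon entropy as a measure of predictive uncertainty over the next note can be computed directly from a probability distribution. A minimal sketch with made-up next-note distributions: a low-entropy context strongly implies one continuation, while a uniform distribution leaves the listener maximally uncertain.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a next-event probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative distributions over four candidate continuations.
low_entropy_context  = [0.85, 0.05, 0.05, 0.05]  # one note strongly expected
high_entropy_context = [0.25, 0.25, 0.25, 0.25]  # uniform: maximal uncertainty
```

    For a uniform distribution over n events the entropy is log2(n) bits, so the high-entropy context above yields 2 bits.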

  8. S-Denying of the Signature Conditions Expands General Relativity's Space

    Directory of Open Access Journals (Sweden)

    Rabounski D.

    2006-07-01

    Full Text Available We apply the S-denying procedure to signature conditions in a four-dimensional pseudo-Riemannian space — i.e. we change one (or even all) of the conditions to be partially true and partially false. We obtain five kinds of expanded space-time for General Relativity. Kind I permits the space-time to be in collapse. Kind II permits the space-time to change its own signature. Kind III has peculiarities, linked to the third signature condition. Kind IV permits regions where the metric fully degenerates: there may be non-quantum teleportation, and a home for virtual photons. Kind V is common for kinds I, II, III, and IV.

  9. A comparison of numerical solutions of partial differential equations with probabilistic and possibilistic parameters for the quantification of uncertainty in subsurface solute transport.

    Science.gov (United States)

    Zhang, Kejiang; Achari, Gopal; Li, Hua

    2009-11-03

    Traditionally, uncertainties in parameters are represented as probability distributions and incorporated into groundwater flow and contaminant transport models. With the advent of newer uncertainty theories, it is now understood that stochastic methods cannot properly represent non-random uncertainties. In the groundwater flow and contaminant transport equations, uncertainty in some parameters may be random, whereas that in others may be non-random. The objective of this paper is to develop a fuzzy-stochastic partial differential equation (FSPDE) model to simulate conditions where both random and non-random uncertainties are involved in groundwater flow and solute transport. Three potential solution techniques are proposed and compared: (a) transforming a probability distribution into a possibility distribution (Method I), so that the FSPDE becomes a fuzzy partial differential equation (FPDE); (b) transforming a possibility distribution into a probability distribution (Method II), so that the FSPDE becomes a stochastic partial differential equation (SPDE); and (c) combining Monte Carlo methods with FPDE solution techniques (Method III). The effects of these three methods on the predictive results are investigated in two case studies. The results show that the predictions obtained from Method II are a specific case of those obtained from Method I. When an exact probabilistic result is needed, Method II is suggested. Because the loss or gain of information during a probability-possibility (or vice versa) transformation cannot be quantified, its influence on the predictive results is not known. Thus, Method III should probably be preferred for risk assessments.
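
    As a concrete illustration of the Method I direction, the sketch below applies one standard probability-to-possibility transformation (the Dubois–Prade transformation). The abstract does not say which transformation the authors used, so this particular choice is an assumption.

    ```python
    def prob_to_poss(probs):
        """Dubois-Prade transformation of a discrete probability
        distribution (assumed to sum to 1) into a possibility distribution:
        pi_i is the total probability of all outcomes no more probable than
        outcome i, so the most probable outcome gets possibility 1."""
        order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
        poss = [0.0] * len(probs)
        running = 1.0
        prev_p, prev_poss = None, None
        for i in order:
            p = probs[i]
            if p == prev_p:
                poss[i] = prev_poss  # tied probabilities share one possibility
            else:
                poss[i] = running
                prev_p, prev_poss = p, running
            running -= p
        return poss
    ```

    Under this transformation the fuzzy computation can proceed on possibility levels (alpha-cuts) instead of probability densities, which is what turns the FSPDE into an FPDE.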

  10. Extremum uncertainty product and sum states

    Energy Technology Data Exchange (ETDEWEB)

    Mehta, C L; Kumar, S [Indian Inst. of Tech., New Delhi. Dept. of Physics

    1978-01-01

    The extremum product states and sum states of the uncertainties in non-commuting observables have been examined. These are illustrated by two specific examples: the harmonic oscillator and the angular momentum states. It is shown that the coherent states of the harmonic oscillator are characterized by the minimum uncertainty sum ⟨(Δq)²⟩ + ⟨(Δp)²⟩. The extremum values of the sums and products of the uncertainties of the components of the angular momentum are also obtained.

  11. Unconditionally Secure Quantum Signatures

    Directory of Open Access Journals (Sweden)

    Ryan Amiri

    2015-08-01

    Full Text Available Signature schemes, proposed in 1976 by Diffie and Hellman, have become ubiquitous across modern communications. They allow for the exchange of messages from one sender to multiple recipients, with the guarantees that messages cannot be forged or tampered with and that messages also can be forwarded from one recipient to another without compromising their validity. Signatures are different from, but no less important than encryption, which ensures the privacy of a message. Commonly used signature protocols—signatures based on the Rivest–Shamir–Adleman (RSA) algorithm, the digital signature algorithm (DSA), and the elliptic curve digital signature algorithm (ECDSA)—are only computationally secure, similar to public key encryption methods. In fact, since these rely on the difficulty of finding discrete logarithms or factoring large primes, it is known that they will become completely insecure with the emergence of quantum computers. We may therefore see a shift towards signature protocols that will remain secure even in a post-quantum world. Ideally, such schemes would provide unconditional or information-theoretic security. In this paper, we aim to provide an accessible and comprehensive review of existing unconditionally secure signature schemes for signing classical messages, with a focus on unconditionally secure quantum signature schemes.
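
    The post-quantum motivation above is often introduced through hash-based schemes. Below is a minimal Lamport one-time signature sketch; note the hedge that, unlike the unconditionally secure schemes this review covers, it is only computationally secure (it rests on hash preimage resistance), and each keypair may sign at most one message.

    ```python
    import hashlib
    import secrets

    BITS = 256  # sign the SHA-256 digest of the message, bit by bit

    def keygen():
        """One random 32-byte secret per (bit position, bit value); the
        public key is the hash of each secret."""
        sk = [[secrets.token_bytes(32), secrets.token_bytes(32)]
              for _ in range(BITS)]
        pk = [[hashlib.sha256(s).digest() for s in pair] for pair in sk]
        return sk, pk

    def _bits(message: bytes):
        digest = hashlib.sha256(message).digest()
        return [(digest[i // 8] >> (i % 8)) & 1 for i in range(BITS)]

    def sign(message: bytes, sk):
        # Reveal the secret corresponding to each digest bit
        return [sk[i][b] for i, b in enumerate(_bits(message))]

    def verify(message: bytes, sig, pk) -> bool:
        return all(hashlib.sha256(s).digest() == pk[i][b]
                   for i, (s, b) in enumerate(zip(sig, _bits(message))))
    ```

    Because security rests only on the hash function, the scheme survives quantum attacks on factoring and discrete logarithms, but it does not reach the information-theoretic guarantee the reviewed quantum schemes aim for.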

  12. Trabecular morphometry by fractal signature analysis is a novel marker of osteoarthritis progression.

    Science.gov (United States)

    Kraus, Virginia Byers; Feng, Sheng; Wang, ShengChu; White, Scott; Ainslie, Maureen; Brett, Alan; Holmes, Anthony; Charles, H Cecil

    2009-12-01

    To evaluate the effectiveness of using subchondral bone texture observed on a radiograph taken at baseline to predict progression of knee osteoarthritis (OA) over a 3-year period. A total of 138 participants in the Prediction of Osteoarthritis Progression study were evaluated at baseline and after 3 years. Fractal signature analysis (FSA) of the medial subchondral tibial plateau was performed on fixed flexion radiographs of 248 nonreplaced knees, using a commercially available software tool. OA progression was defined as a change in joint space narrowing (JSN) or osteophyte formation of 1 grade according to a standardized knee atlas. Statistical analysis of fractal signatures was performed using a new model based on correlating the overall shape of a fractal dimension curve with radius. Fractal signature of the medial tibial plateau at baseline was predictive of medial knee JSN progression (area under the receiver operating characteristic curve [AUC] 0.75) but was not predictive of osteophyte formation or progression of JSN in the lateral compartment. Traditional covariates (age, sex, body mass index, knee pain), general bone mineral content, and joint space width at baseline were no more effective than random variables for predicting OA progression (AUC 0.52-0.58). The predictive model with maximum effectiveness combined fractal signature at baseline, knee alignment, traditional covariates, and bone mineral content (AUC 0.79). We identified a prognostic marker of OA that is readily extracted from a plain radiograph using FSA. Although the method needs to be validated in a second cohort, our results indicate that the global shape approach to analyzing these data is a potentially efficient means of identifying individuals at risk of knee OA progression.
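
    The AUC statistic used to score the fractal-signature predictor has a simple rank interpretation, sketched below; the scores are invented for illustration.

    ```python
    def roc_auc(scores_pos, scores_neg):
        """Area under the ROC curve via the Mann-Whitney U statistic:
        the probability that a randomly chosen progressor scores higher
        than a randomly chosen non-progressor (ties count one half)."""
        wins = 0.0
        for sp in scores_pos:
            for sn in scores_neg:
                if sp > sn:
                    wins += 1.0
                elif sp == sn:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))
    ```

    An AUC of 0.5 (the 0.52-0.58 band reported for the traditional covariates) is chance-level ranking, which is why those covariates fared no better than random variables.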

  13. The Search for Hydrologic Signatures: The Effect of Data Transformations on Bayesian Model Calibration

    Science.gov (United States)

    Sadegh, M.; Vrugt, J. A.

    2011-12-01

    In the past few years, several contributions have begun to appear in the hydrologic literature that introduced and analyzed the benefits of using a signature-based approach to watershed analysis. This signature-based approach abandons the standard single-criterion model-data fitting paradigm in favor of a diagnostic approach that better extracts the available information from the available data. Despite the prospects of this new viewpoint, rather ad-hoc criteria have hitherto been proposed to improve watershed modeling. Here, we aim to provide a proper mathematical foundation to signature-based analysis. We analyze the information content of different data transformations by analyzing their convergence speed with Markov Chain Monte Carlo (MCMC) simulation using the Generalized Likelihood function of Schoups and Vrugt (2010). We compare the information content of the original discharge data against a simple square root and Box-Cox transformation of the streamflow data. We benchmark these results against wavelet and flow duration curve transformations that temporally disaggregate the discharge data. Our results conclusively demonstrate that wavelet transformations and flow duration curves significantly reduce the information content of the streamflow data and consequently unnecessarily increase the uncertainty of the HYMOD model parameters. Hydrologic signatures thus need to be found in the original data, without temporal disaggregation.
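
    The square-root and Box-Cox transformations compared here belong to a single family; a minimal sketch follows, with the λ values in the comments chosen for illustration rather than taken from the paper.

    ```python
    import math

    def box_cox(discharge, lam):
        """Box-Cox transformation of a strictly positive discharge series.
        lam = 1 leaves the data essentially untransformed (shifted by 1),
        lam = 0.5 is close to the square-root transform, and the lam -> 0
        limit is the natural logarithm."""
        if lam == 0.0:
            return [math.log(q) for q in discharge]
        return [(q ** lam - 1.0) / lam for q in discharge]
    ```

    Because the transform is monotone and invertible, it reweights rather than discards information, in contrast to the disaggregating wavelet and flow-duration-curve transformations the study cautions against.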

  14. A Fast lattice-based polynomial digital signature system for m-commerce

    Science.gov (United States)

    Wei, Xinzhou; Leung, Lin; Anshel, Michael

    2003-01-01

    Privacy and data integrity are not guaranteed in current wireless communications due to the security hole inside the Wireless Application Protocol (WAP) version 1.2 gateway. One remedy is to provide end-to-end security in m-commerce by applying application-level security on top of current WAP 1.2. Traditional security technologies like RSA and ECC, as applied on an enterprise's server, are not practical for wireless devices because wireless devices have relatively weak computational power and limited memory compared with servers. In this paper, we developed a lattice-based polynomial digital signature system based on NTRU's Polynomial Authentication and Signature Scheme (PASS), which makes it feasible to apply high-level security on both the server and wireless device sides.

  15. STUDY ON MODELING AND VISUALIZING THE POSITIONAL UNCERTAINTY OF REMOTE SENSING IMAGE

    Directory of Open Access Journals (Sweden)

    W. Jiao

    2016-06-01

    Full Text Available It is inevitable that uncertainty arises during the process of data acquisition. The traditional way to evaluate geometric positioning accuracy is by statistical methods, represented by the root mean square errors (RMSEs) of control points. Such estimates are pointwise and discontinuous, so it is difficult to describe the spatial distribution of error. In this paper the error uncertainty of each control point is deduced, and an uncertainty spatial distribution model for arbitrary points is established. The error model is proposed to evaluate the geometric accuracy of remote sensing imagery. Several visualization methods are then studied to represent the discrete and continuous data of geometric uncertainties. The experiments show that the proposed error-distribution model obtains results similar to the traditional RMSE method, but without requiring the user to collect control points as checkpoints, and the error distribution information calculated by the model can be provided to users along with the geometric image data. Additionally, the visualization methods described in this paper effectively and objectively represent image geometric quality, and can also help users probe the causes of image uncertainty to some extent.
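
    The traditional control-point summary that the paper's distributed error model improves upon is simply the RMSE of checkpoint residuals; a minimal sketch (residual values invented):

    ```python
    import math

    def rmse(residuals):
        """Root mean square error of control-point residuals: the
        traditional scalar summary of geometric positioning accuracy."""
        return math.sqrt(sum(r * r for r in residuals) / len(residuals))

    def planimetric_rmse(dx, dy):
        """Combined planimetric RMSE from per-point errors in x and y."""
        return math.sqrt(rmse(dx) ** 2 + rmse(dy) ** 2)
    ```

    As the abstract notes, this single number is pointwise and discontinuous, whereas the proposed model assigns an uncertainty to every image location.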

  16. Stochastic variational approach to minimum uncertainty states

    Energy Technology Data Exchange (ETDEWEB)

    Illuminati, F.; Viola, L. [Dipartimento di Fisica, Padova Univ. (Italy)

    1995-05-21

    We introduce a new variational characterization of Gaussian diffusion processes as minimum uncertainty states. We then define a variational method constrained by kinematics of diffusions and Schroedinger dynamics to seek states of local minimum uncertainty for general non-harmonic potentials. (author)

  17. Hyperspectral signature analysis of skin parameters

    Science.gov (United States)

    Vyas, Saurabh; Banerjee, Amit; Garza, Luis; Kang, Sewon; Burlina, Philippe

    2013-02-01

    The temporal analysis of changes in biological skin parameters, including melanosome concentration, collagen concentration and blood oxygenation, may serve as a valuable tool in diagnosing the progression of malignant skin cancers and in understanding the pathophysiology of cancerous tumors. Quantitative knowledge of these parameters can also be useful in applications such as wound assessment, and point-of-care diagnostics, amongst others. We propose an approach to estimate in vivo skin parameters using a forward computational model based on Kubelka-Munk theory and the Fresnel Equations. We use this model to map the skin parameters to their corresponding hyperspectral signature. We then use machine learning based regression to develop an inverse map from hyperspectral signatures to skin parameters. In particular, we employ support vector machine based regression to estimate the in vivo skin parameters given their corresponding hyperspectral signature. We build on our work from SPIE 2012, and validate our methodology on an in vivo dataset. This dataset consists of 241 signatures collected from in vivo hyperspectral imaging of patients of both genders and Caucasian, Asian and African American ethnicities. In addition, we also extend our methodology past the visible region and through the short-wave infrared region of the electromagnetic spectrum. We find promising results when comparing the estimated skin parameters to the ground truth, demonstrating good agreement with well-established physiological precepts. This methodology can have potential use in non-invasive skin anomaly detection and for developing minimally invasive pre-screening tools.
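
    The inverse map from hyperspectral signature to skin parameters can be illustrated with a lightweight kernel smoother. This Nadaraya–Watson sketch is only a stand-in for the support-vector regression the paper actually uses, and the two-band "spectra" and parameter values are invented.

    ```python
    import math

    def kernel_regress(train_spectra, train_params, query, bandwidth=0.5):
        """Nadaraya-Watson kernel regression: estimate a skin parameter for
        a query spectrum by Gaussian-weighted averaging over training
        spectra. Assumes the query is not far from all training points
        (otherwise all weights underflow to zero)."""
        def sqdist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        weights = [math.exp(-sqdist(s, query) / (2.0 * bandwidth ** 2))
                   for s in train_spectra]
        total = sum(weights)
        return sum(w * p for w, p in zip(weights, train_params)) / total
    ```

    The training pairs here play the role of the forward Kubelka-Munk model's output: simulated (parameter, signature) pairs that the learned regressor inverts.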

  18. Traditions and Management Perspectives of Community and Non-Profit Organizations in Lithuania

    Directory of Open Access Journals (Sweden)

    Andrius Stasiukynas

    2015-02-01

    Full Text Available Purpose – To overview the traditions and management perspectives of community and non-profit organizations in Lithuania. Methodology – For the purpose of this research, a literature analysis of community and non-profit organization case studies was conducted. Case studies describing stories of success were singled out and the leaders of these organizations were interviewed. Findings – The research has shown growth in the number of community and non-profit organizations during the last twenty years, and difficulties in collecting statistical data. The study makes it possible to identify tendencies in the management of community and non-profit organizations, including: increasing use of social networks for communication; proliferation of strategic planning; and greater emphasis on educating and empowering a new generation of leaders. Research implications – Prior studies in this area in Lithuania have not exhaustively analyzed the components of human resource management in non-profit organizations. An important follow-up to this study would be to analyze human resource management in community organizations. Practical implications – This study covered management aspects important for improving how community and non-profit organizations work. Originality/Value – This study expands knowledge on Lithuanian community and non-profit organization development and management. Research type – literature review, research paper.

  19. Massive Black Hole Binaries: Dynamical Evolution and Observational Signatures

    Directory of Open Access Journals (Sweden)

    M. Dotti

    2012-01-01

    Full Text Available The study of the dynamical evolution of massive black hole pairs in mergers is crucial in the context of a hierarchical galaxy formation scenario. The timescales for the formation and coalescence of black hole binaries are still poorly constrained, resulting in large uncertainties in the expected rate of massive black hole binaries detectable in the electromagnetic and gravitational wave spectra. Here, we review the current theoretical understanding of black hole pairing in galaxy mergers, with particular attention to recent developments and open issues. We conclude with a review of the expected observational signatures of massive binaries and of the candidates discussed in the literature to date.

  20. Radiation signatures

    International Nuclear Information System (INIS)

    McGlynn, S.P.; Varma, M.N.

    1992-01-01

    A new concept for modelling radiation risk is proposed. This concept is based on the proposal that the spectrum of molecular lesions, which we dub "the radiation signature", can be used to identify the quality of the causal radiation. If the proposal concerning radiation signatures can be established then, in principle, both prospective and retrospective risk determination can be assessed on an individual basis. A major goal of biophysical modelling is to relate physical events such as ionization, excitation, etc. to the production of radiation carcinogenesis. A description of the physical events is provided by track structure. The track structure is determined by radiation quality, and it can be considered to be the "physical signature" of the radiation. Unfortunately, the uniqueness characteristics of this signature are dissipated in biological systems in ∼10⁻⁹ s. Nonetheless, it is our contention that this physical disturbance of the biological system eventuates later, at ∼10⁰ s, in molecular lesion spectra which also characterize the causal radiation. (author)

  1. Comparison of Fingerprint and Iris Biometric Authentication for Control of Digital Signatures

    Science.gov (United States)

    Zuckerman, Alan E.; Moon, Kenneth A.; Eaddy, Kenneth

    2002-01-01

    Biometric authentication systems can be used to control digital signing of medical documents. This pilot study evaluated the use of two different fingerprint technologies and one iris technology to control creation of digital signatures on a central server using public/private key pairs stored on the server. Documents and signatures were stored in XML for portability. Key pairs and authentication certificates were generated during biometric enrollment. Usability and user acceptance were guarded, and limitations of the biometric systems prevented use of the system with all test subjects. The system detected alterations in the data content and provided for future signer re-authentication for non-repudiation.
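
    The tamper-detection property described above can be illustrated with a keyed hash over the document bytes. This HMAC sketch is a deliberate simplification: the pilot system actually used public/private key pairs held on the central server, not a shared secret.

    ```python
    import hashlib
    import hmac

    def sign_document(document: bytes, server_key: bytes) -> str:
        """Keyed-hash tag standing in for the server-side signature."""
        return hmac.new(server_key, document, hashlib.sha256).hexdigest()

    def verify_document(document: bytes, tag: str, server_key: bytes) -> bool:
        """Recompute the tag; any alteration of the content changes it."""
        return hmac.compare_digest(sign_document(document, server_key), tag)
    ```

    Verification fails on any altered byte of the XML payload, which is the property the study relied on to detect changes in data content.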

  2. Monthly water quality forecasting and uncertainty assessment via bootstrapped wavelet neural networks under missing data for Harbin, China.

    Science.gov (United States)

    Wang, Yi; Zheng, Tong; Zhao, Ying; Jiang, Jiping; Wang, Yuanyuan; Guo, Liang; Wang, Peng

    2013-12-01

    In this paper, a bootstrapped wavelet neural network (BWNN) was developed for predicting monthly ammonia nitrogen (NH₄⁺-N) and dissolved oxygen (DO) in the Harbin region, northeast China. The Morlet wavelet basis function (WBF) was employed as the nonlinear activation function of a traditional three-layer artificial neural network (ANN) structure. Prediction intervals (PI) were constructed from the uncertainties calculated for the model structure and the data noise. Performance of the BWNN model was also compared with four different models: traditional ANN, WNN, bootstrapped ANN, and an autoregressive integrated moving average model. The results showed that BWNN could handle the severely fluctuating and non-seasonal water quality time series, and it produced better performance than the other four models. The uncertainty from data noise was smaller than that from the model structure for NH₄⁺-N; conversely, the uncertainty from data noise was larger for the DO series. Moreover, total uncertainties in the low-flow period were the largest, due to complicated processes during the freeze-up period of the Songhua River. Further, a data missing-refilling scheme was designed, and BWNN performed better for structural data missing (SD) than for incidental data missing (ID). For both ID and SD, the temporal method was satisfactory for filling the NH₄⁺-N series, whereas spatial imputation fit the DO series. This filling-and-forecasting BWNN method was applied to other areas suffering "real" data missing, and the results demonstrated its efficiency. Thus, the methods introduced here will help managers make informed decisions.
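
    The prediction-interval construction can be sketched with a percentile bootstrap over model residuals. The residuals below are invented, and the real BWNN combines separate model-structure and data-noise terms rather than this single resampling step.

    ```python
    import random

    def bootstrap_interval(residuals, point_forecast,
                           n_boot=2000, alpha=0.05, seed=42):
        """Percentile-bootstrap prediction interval: resample historical
        residuals, add them to the point forecast, and read off the
        alpha/2 and 1 - alpha/2 quantiles."""
        rng = random.Random(seed)
        sims = sorted(point_forecast + rng.choice(residuals)
                      for _ in range(n_boot))
        lo = sims[int(alpha / 2 * n_boot)]
        hi = sims[int((1 - alpha / 2) * n_boot) - 1]
        return lo, hi
    ```

    Wider residual spreads (as in the freeze-up, low-flow period noted above) directly translate into wider intervals.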

  3. A gene signature to determine metastatic behavior in thymomas.

    Directory of Open Access Journals (Sweden)

    Yesim Gökmen-Polar

    Full Text Available PURPOSE: Thymoma represents one of the rarest of all malignancies. Stage and completeness of resection have been used to ascertain postoperative therapeutic strategies, albeit with limited prognostic accuracy. A molecular classifier would be useful to improve the assessment of metastatic behaviour and optimize patient management. METHODS: A qRT-PCR assay for 23 genes (19 test and four reference genes) was performed on multi-institutional archival primary thymomas (n = 36). Gene expression levels were used to compute a signature, classifying tumors into classes 1 and 2, corresponding to low or high likelihood for metastases. The signature was validated in an independent multi-institutional cohort of patients (n = 75). RESULTS: A nine-gene signature that can predict metastatic behavior of thymomas was developed and validated. Using radial basis machine modeling in the training set, 5-year and 10-year metastasis-free survival rates were 77% and 26% for predicted low (class 1) and high (class 2) risk of metastasis (P = 0.0047, log-rank), respectively. For the validation set, 5-year metastasis-free survival rates were 97% and 30% for predicted low- and high-risk patients (P = 0.0004, log-rank), respectively. The 5-year metastasis-free survival rates for the validation set were 49% and 41% for Masaoka stages I/II and III/IV (P = 0.0537, log-rank), respectively. In univariate and multivariate Cox models evaluating common prognostic factors for thymoma metastasis, the nine-gene signature was the only independent indicator of metastases (P = 0.036). CONCLUSION: A nine-gene signature was established and validated which predicts the likelihood of metastasis more accurately than traditional staging. This further underscores the biologic determinants of the clinical course of thymoma and may improve patient management.

  4. Causal uncertainty, claimed and behavioural self-handicapping.

    Science.gov (United States)

    Thompson, Ted; Hepburn, Jonathan

    2003-06-01

    Causal uncertainty beliefs involve doubts about the causes of events, and arise as a consequence of non-contingent evaluative feedback: feedback that leaves the individual uncertain about the causes of his or her achievement outcomes. Individuals high in causal uncertainty are frequently unable to confidently attribute their achievement outcomes, experience anxiety in achievement situations and as a consequence are likely to engage in self-handicapping behaviour. Accordingly, we sought to establish links between trait causal uncertainty, claimed and behavioural self-handicapping. Participants were N=72 undergraduate students divided equally between high and low causally uncertain groups. We used a 2 (causal uncertainty status: high, low) x 3 (performance feedback condition: success, non-contingent success, non-contingent failure) between-subjects factorial design to examine the effects of causal uncertainty on achievement behaviour. Following performance feedback, participants completed 20 single-solution anagrams and 12 remote associate tasks serving as performance measures, and 16 unicursal tasks to assess practice effort. Participants also completed measures of claimed handicaps, state anxiety and attributions. Relative to low causally uncertain participants, high causally uncertain participants claimed more handicaps prior to performance on the anagrams and remote associates, reported higher anxiety, attributed their failure to internal, stable factors, and reduced practice effort on the unicursal tasks, evident in fewer unicursal tasks solved. These findings confirm links between trait causal uncertainty and claimed and behavioural self-handicapping, highlighting the need for educators to facilitate means by which students can achieve surety in the manner in which they attribute the causes of their achievement outcomes.

  5. Five years of lesson modification to implement non-traditional learning sessions in a traditional-delivery curriculum: A retrospective assessment using applied implementation variables.

    Science.gov (United States)

    Gleason, Shaun E; McNair, Bryan; Kiser, Tyree H; Franson, Kari L

    Non-traditional learning (NTL), including aspects of self-directed learning (SDL), may address self-awareness development needs. Many factors can impact successful implementation of NTL. To share our multi-year experience with modifications that aim to improve NTL sessions in a traditional curriculum. To improve understanding of applied implementation variables (some of which were based on successful SDL implementation components) that impact NTL. We delivered a single lesson in a traditional-delivery curriculum once annually for five years, varying delivery annually in response to student learning and reaction-to-learning results. At year 5, we compared student learning and reaction-to-learning to applied implementation factors using logistic regression. Higher instructor involvement and overall NTL levels predicted correct exam responses (p = 0.0007 and p < 0.05, respectively) in the most traditional and highest overall NTL deliveries. Students rated instructor presentation skills and teaching methods higher with greater instructor involvement (p < 0.05); teaching methods were rated most effective with lower student involvement and higher technology levels (p < 0.05). In a traditional-delivery curriculum, instructor involvement appears essential, while the impact of student involvement and educational technology levels varies. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical survival right-censored data from about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on one hand, and to compare two basic surgery techniques in the context of risk of mortality on the other. The colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic operating techniques are used for colectomy: either traditional (open) or minimally invasive (laparoscopic). The basic question arising at colectomy is which type of operation to choose to guarantee longer overall survival time. Two non-parametric approaches have been used to quantify the probability of mortality with uncertainties. In fact, the complement of the probability to one, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from the Kaplan–Meier estimator of the survival function in connection with Greenwood's formula, and from the Nelson–Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of a predictive survival function instead of the population survival function. The traditional log-rank test on one hand and the nonparametric predictive comparison of two groups of lifetime data on the other have been compared to evaluate the risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion: the minimally invasive operating technique guarantees the patient significantly longer survival time in comparison with the traditional operating technique.
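
    The first approach (Kaplan–Meier with Greenwood's formula) is compact enough to sketch. The toy follow-up times below are invented; the sketch also assumes the risk set is never fully exhausted at a death time, so the Greenwood term stays finite.

    ```python
    import math

    def kaplan_meier(times, events):
        """Kaplan-Meier survival curve with Greenwood standard errors.
        times: follow-up times; events: 1 = death observed, 0 = censored.
        Returns (time, survival, greenwood_se) at each death time."""
        data = sorted(zip(times, events))
        at_risk = len(data)
        surv, var_sum = 1.0, 0.0
        curve = []
        i = 0
        while i < len(data):
            t = data[i][0]
            deaths = sum(e for tt, e in data if tt == t)
            total = sum(1 for tt, _ in data if tt == t)
            if deaths > 0 and at_risk > deaths:
                surv *= (at_risk - deaths) / at_risk
                var_sum += deaths / (at_risk * (at_risk - deaths))
                curve.append((t, surv, surv * math.sqrt(var_sum)))
            at_risk -= total
            i += total
        return curve
    ```

    The NPI alternative replaces this single curve and its symmetric confidence band with lower and upper predictive survival probabilities.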

  7. [Current Status and Development of Traditional Chemotherapy in Non-small Cell Lung Cancer under the Background of Targeted Therapy].

    Science.gov (United States)

    Zhang, Guowei; Wang, Huijuan; Zhang, Mina; Li, Peng; Ma, Zhiyong

    2015-09-20

    In recent years, along with the rapid development of targeted therapy in non-small cell lung cancer, traditional chemotherapy has received less and less attention. Yet the question of how to position and use traditional chemotherapy so that patients derive maximum benefit cannot be ignored. For this purpose, through literature review and analysis, we point out that traditional chemotherapy remains irreplaceable in many settings, whatever the patient's driver gene status, and that new treatment modalities of traditional chemotherapy have been developed to further improve patients' survival. At the same time, through a discussion of the development of predictive biomarkers for chemotherapy, we point out that the future of traditional chemotherapy must be a form of "targeted therapy".

  8. Enteric Pathogen Bacteria in Non-Broiler Chicken Egg Shells from Traditional Market and Supermarket, Jatinangor Subdistrict, West Java

    Directory of Open Access Journals (Sweden)

    Kavita Arumugam

    2015-09-01

    Full Text Available Background: Around 1.5 million children die annually due to diarrhea. Contaminated food is one of the sources of diarrhea incidence (food-borne disease). Eggs are one of the least expensive forms of protein, affordable by the community and easy to find in either traditional markets or modern markets/supermarkets. The objective of this study was to identify enteropathogenic bacterial contamination on non-broiler (ayam kampung) egg shells and to compare the findings between eggs sold in traditional and modern markets. Methods: This was a descriptive study performed at the Microbiology Laboratory of the Faculty of Medicine, Universitas Padjadjaran. A total of 40 eggs were used, 20 from two traditional markets and 20 from two modern markets. The eggs were swabbed using saline, dipped in tryptic soy broth and streaked on MacConkey agar. The collected data were analyzed and presented in tables. Results: Out of 40 samples, there were 19 positive cultures from the traditional markets and 16 from the modern markets. There were 30 pink colonies, indicating lactose fermentation; 5 transparent colonies, indicating non-lactose fermentation; 4 samples showed no colony growth, and 1 grew an unidentified colony. The bacteria found most often were Klebsiella sp. and Enterobacter sp. in both markets. Conclusions: Egg shells from both traditional and modern markets are contaminated with enteropathogenic microbes.

  9. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    Science.gov (United States)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models, incorporating different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats well the uncertainty in the extreme flows of hydrological models' simulations. This study proposes a Bayesian modularization approach to uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis–Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian models: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash–Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
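
    The MH algorithm underlying both calibration variants fits in a few lines. This random-walk sketch targets a generic one-dimensional log-posterior rather than the WASMOD parameters, which are an assumption outside this record.

    ```python
    import math
    import random

    def metropolis_hastings(log_post, x0, n_iter=5000, step=0.5, seed=1):
        """Random-walk Metropolis-Hastings: propose x' = x + N(0, step^2),
        accept with probability min(1, post(x') / post(x))."""
        rng = random.Random(seed)
        x, lp = x0, log_post(x0)
        samples = []
        for _ in range(n_iter):
            prop = x + rng.gauss(0.0, step)
            lp_prop = log_post(prop)
            if math.log(rng.random()) < lp_prop - lp:
                x, lp = prop, lp_prop
            samples.append(x)
        return samples
    ```

    In the modularized variant, the likelihood evaluated inside `log_post` is decomposed so that the extreme-flow component no longer dominates the parameter posterior.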

  10. Clinical value of prognosis gene expression signatures in colorectal cancer: a systematic review.

    Directory of Open Access Journals (Sweden)

    Rebeca Sanz-Pamplona

    Full Text Available INTRODUCTION: The traditional staging system is inadequate to identify those patients with stage II colorectal cancer (CRC at high risk of recurrence or with stage III CRC at low risk. A number of gene expression signatures to predict CRC prognosis have been proposed, but none is routinely used in the clinic. The aim of this work was to assess the prediction ability and potential clinical usefulness of these signatures in a series of independent datasets. METHODS: A literature review identified 31 gene expression signatures that used gene expression data to predict prognosis in CRC tissue. The search was based on the PubMed database and was restricted to papers published from January 2004 to December 2011. Eleven CRC gene expression datasets with outcome information were identified and downloaded from public repositories. Random Forest classifier was used to build predictors from the gene lists. Matthews correlation coefficient was chosen as a measure of classification accuracy and its associated p-value was used to assess association with prognosis. For clinical usefulness evaluation, positive and negative post-tests probabilities were computed in stage II and III samples. RESULTS: Five gene signatures showed significant association with prognosis and provided reasonable prediction accuracy in their own training datasets. Nevertheless, all signatures showed low reproducibility in independent data. Stratified analyses by stage or microsatellite instability status showed significant association but limited discrimination ability, especially in stage II tumors. From a clinical perspective, the most predictive signatures showed a minor but significant improvement over the classical staging system. CONCLUSIONS: The published signatures show low prediction accuracy but moderate clinical usefulness. 
Although gene expression data may inform prognosis, better strategies for signature validation are needed to encourage their widespread use in the clinic.
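
    As context for the accuracy measure used in the review above, a minimal sketch of the Matthews correlation coefficient computed from a 2x2 confusion matrix might look as follows (the counts are illustrative, not from the reviewed datasets):

```python
import math

# Matthews correlation coefficient from a 2x2 confusion matrix:
# MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)), bounded in [-1, 1]
def matthews_corrcoef(tp, tn, fp, fn):
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0  # convention: MCC = 0 when a margin is empty

# illustrative counts for a recurrence / no-recurrence classifier
print(matthews_corrcoef(40, 45, 5, 10))  # ~0.70
```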

  11. Some applications of uncertainty relations in quantum information

    Science.gov (United States)

    Majumdar, A. S.; Pramanik, T.

    2016-08-01

    We discuss some applications of various versions of uncertainty relations for both discrete and continuous variables in the context of quantum information theory. The Heisenberg uncertainty relation enables demonstration of the Einstein-Podolsky-Rosen (EPR) paradox. Entropic uncertainty relations (EURs) are used to reveal quantum steering for non-Gaussian continuous-variable states. EURs for discrete variables are studied in the context of quantum memory, where fine-graining yields the optimum lower bound of uncertainty. The fine-grained uncertainty relation is used to obtain connections between uncertainty and the nonlocality of retrieval games for bipartite and tripartite systems. The Robertson-Schrödinger (RS) uncertainty relation is applied to distinguish pure and mixed states of discrete variables.
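
    For reference, standard textbook forms of the relations named above (not transcribed from the paper) are:

```latex
\begin{align}
  \Delta x\,\Delta p &\ge \tfrac{\hbar}{2}
    && \text{(Heisenberg)}\\
  H(X) + H(Z) &\ge \log_2 \tfrac{1}{c},
    \quad c = \max_{x,z} |\langle x|z\rangle|^2
    && \text{(Maassen--Uffink EUR)}\\
  (\Delta A)^2 (\Delta B)^2 &\ge
    \Bigl|\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\Bigr|^2
    + \Bigl|\tfrac{1}{2i}\langle[A,B]\rangle\Bigr|^2
    && \text{(Robertson--Schr\"odinger)}
\end{align}
```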

  12. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data, and demonstrated reliable uncertainty prediction coverage. 
This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
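
    The propagation step described above follows the usual first-order (Taylor-series) uncertainty combination; a generic sketch, with hypothetical sensitivity coefficients rather than the paper's actual calibration terms:

```python
import math

# First-order (Taylor-series) uncertainty propagation for f(x1..xn),
# assuming uncorrelated inputs:
#   sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2
def propagate_uncertainty(sensitivities, sigmas):
    return math.sqrt(sum((c * s) ** 2 for c, s in zip(sensitivities, sigmas)))

# e.g. an out-of-plane velocity component reconstructed from two camera
# projections, with hypothetical geometric sensitivities and planar uncertainties
sigma_w = propagate_uncertainty([0.7, 0.7], [0.10, 0.12])
print(sigma_w)  # ~0.109
```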

  13. Cathedral outreach: student-led workshops for school curriculum enhancement in non-traditional environments

    Science.gov (United States)

    Posner, Matthew T.; Jantzen, Alexander; van Putten, Lieke D.; Ravagli, Andrea; Donko, Andrei L.; Soper, Nathan; Wong, Nicholas H. L.; John, Pearl V.

    2017-08-01

    Universities in the United Kingdom have been driven to work with a larger pool of potential students than just the more traditional student (middle-class white male), in order to tackle the widely-accepted skills-shortage in the fields of science, technology, engineering and mathematics (STEM), whilst honoring their commitment to fair access to higher education. Student-led outreach programs have contributed significantly to this drive. Two such programs run by postgraduate students at the University of Southampton are the Lightwave Roadshow and Southampton Accelerate!, which focus on photonics and particle physics, respectively. The program ambassadors have developed activities to enhance areas of the national curriculum through presenting fundamental physical sciences and their applications to optics and photonics research. The activities have benefitted significantly from investment from international organizations, such as SPIE, OSA and the IEEE Photonics Society, and UK research councils, in conjunction with university recruitment and outreach strategies. New partnerships have been formed to expand outreach programs to work in non-traditional environments to challenge stereotypes of scientists. This paper presents two case studies of collaboration with education learning centers at Salisbury Cathedral and Winchester Cathedral. The paper outlines workshops and shows developed for pupils aged 6-14 years (UK key stages 2-4) on the electromagnetic spectrum, particle physics, telecommunications and the human eye using a combination of readily obtainable items, hand-built kits and elements from the EYEST Photonics Explorer kit. The activities are interactive to stimulate learning through active participation, complement the UK national curriculum and link the themes of science with the non-traditional setting of a cathedral. We present methods to evaluate the impact of the activity and tools to obtain qualitative feedback for continual program improvement. 
We also

  14. Algorithms for Hyperspectral Endmember Extraction and Signature Classification with Morphological Dendritic Networks

    Science.gov (United States)

    Schmalz, M.; Ritter, G.

    Accurate multispectral or hyperspectral signature classification is key to the nonimaging detection and recognition of space objects. Additionally, signature classification accuracy depends on accurate spectral endmember determination [1]. Previous approaches to endmember computation and signature classification were based on linear operators or neural networks (NNs) expressed in terms of the algebra (R, +, x) [1,2]. Unfortunately, class separation in these methods tends to be suboptimal, and the number of signatures that can be accurately classified often depends linearly on the number of NN inputs. This can lead to poor endmember distinction, as well as potentially significant classification errors in the presence of noise or densely interleaved signatures. In contrast to traditional NNs, autoassociative morphological memories (AMMs) are a construct similar to Hopfield autoassociative memories defined on the (R, +, ∨, ∧) lattice algebra [3]. Unlimited storage and perfect recall of noiseless real-valued patterns have been proven for AMMs [4]. However, AMMs suffer from sensitivity to specific noise models, which can be characterized as erosive and dilative noise. On the other hand, the prior definition of a set of endmembers corresponds to material spectra lying on vertices of the minimum convex region covering the image data. These vertices can be characterized as morphologically independent patterns. It has further been shown that AMMs can be based on dendritic computation [3,6]. These techniques yield improved accuracy and class segmentation/separation ability in the presence of highly interleaved signature data. In this paper, we present a procedure for endmember determination based on AMM noise sensitivity, which employs morphological dendritic computation. We show that detected endmembers can be exploited by AMM-based classification techniques to achieve accurate signature classification in the presence of noise, closely spaced or interleaved signatures, and
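
    A minimal sketch of an autoassociative morphological memory in the (R, +, ∨, ∧) lattice algebra may help fix ideas; it illustrates the perfect-recall property for noiseless stored patterns, not the paper's dendritic procedure:

```python
# Minimal autoassociative morphological memory (AMM) sketch in the
# lattice algebra (R, +, max, min); illustrative only.
def build_W(patterns):
    n = len(patterns[0])
    # memory entries: w_ij = min over stored patterns of (x_i - x_j)
    return [[min(p[i] - p[j] for p in patterns) for j in range(n)]
            for i in range(n)]

def recall(W, x):
    # max-plus product: y_i = max_j (w_ij + x_j)
    return [max(W[i][j] + x[j] for j in range(len(x))) for i in range(len(x))]

patterns = [[1.0, 2.0, 3.0], [4.0, 0.0, 1.0]]
W = build_W(patterns)
print(recall(W, patterns[0]))  # perfect recall of a noiseless stored pattern
```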

  15. Non-traditional stable isotope behaviors in immiscible silica-melts in a mafic magma chamber.

    Science.gov (United States)

    Zhu, Dan; Bao, Huiming; Liu, Yun

    2015-12-01

    Non-traditional stable isotopes have increasingly been applied to studies of igneous processes, including planetary differentiation. Equilibrium isotope fractionation of these elements in silicates is expected to be negligible at magmatic temperatures (δ(57)Fe differences often less than 0.2 per mil). However, a growing body of data has revealed a puzzling observation: the δ(57)Fe for silicic magmas ranges from 0‰ up to 0.6‰, with the most positive δ(57)Fe values found almost exclusively in A-type granitoids. Several interpretations have been proposed by different research groups, but these have so far failed to explain some aspects of the observations. Here we propose a dynamic, diffusion-induced isotope fractionation model that assumes Si-rich melts grow and ascend immiscibly in an Fe-rich bulk magma chamber. Our model offers predictions on the behavior of non-traditional stable isotopes such as Fe, Mg, Si, and Li that are consistent with observations from many A-type granitoids, especially those associated with layered intrusions. Diffusion-induced isotope fractionation may be more commonly preserved in magmatic rocks than originally predicted.

  16. Non-traditional shape GFRP rebars for concrete reinforcement

    Science.gov (United States)

    Claure, Guillermo G.

    The use of glass-fiber-reinforced-polymer (GFRP) composites as internal reinforcement (rebars) for concrete structures has proven to be an alternative to traditional steel reinforcement due to significant advantages such as magnetic transparency and, most importantly, corrosion resistance, which translates into durability and structural life extension. In recent years, the number of projects specifying GFRP reinforcement has increased dramatically, leading the construction industry towards more sustainable practices. Typically, GFRP rebars are similar to their steel counterparts, having external deformations or surface enhancements designed to develop bond to concrete, as well as solid circular cross-sections; lately, however, the worldwide composites industry has taken advantage of the pultrusion process to develop GFRP rebars with non-traditional cross-sectional shapes intended to optimize their mechanical, physical, and environmental attributes. Recently, circular GFRP rebars with a hollow core have also become available. They offer advantages such as a larger surface area for improved bond and more efficient use of the cross-section, since fibers at the center of a solid cross-section are generally not fully engaged in carrying load. For a complete understanding of GFRP rebar physical properties, a material characterization study comprising a quantitative cross-sectional area analysis of different GFRP rebars was undertaken, with a sample population of 190 GFRP specimens ranging in rebar denomination from #2 to #6 and with different cross-sectional shapes and surface deformations, manufactured by five pultruders from around the world. The water displacement method was applied as a feasible and reliable way to conduct the investigation. In addition to developing a repeatable protocol for measuring cross-sectional area, the objectives included establishing critical statistical information related to the test methodology and recommending improvements to

  17. Enhancing Critical Thinking Skills and Writing Skills through the Variation in Non-Traditional Writing Task

    Science.gov (United States)

    Sinaga, Parlindungan; Feranie, Shelly

    2017-01-01

    The research aims to identify the impacts of embedding non-traditional writing tasks within a modern physics course taken by students of the Physics Education and Physics Study Programs. It employed a quasi-experimental method with a pretest-posttest control group design. The instruments used were tests on conceptual mastery, tests on…

  18. Searches for supersymmetry in R-parity violating and long-lived signatures with the ATLAS detector

    CERN Document Server

    Magerl, Veronika; The ATLAS collaboration

    2018-01-01

    R-parity violation introduces many viable signatures into the search for supersymmetry at the LHC. The decay of supersymmetric particles can produce leptons or jets while removing the missing-transverse-momentum signal common to traditional supersymmetry searches. Several supersymmetric models also predict massive long-lived supersymmetric particles. Such particles may be detected through anomalous specific energy loss, appearing or disappearing tracks, displaced vertices, long time-of-flight or late calorimetric energy deposits. The talk presents recent results from searches for supersymmetry in these unusual signatures of R-parity violation and long-lived particles with the ATLAS detector.

  19. Early false-belief understanding in traditional non-Western societies.

    Science.gov (United States)

    Barrett, H Clark; Broesch, Tanya; Scott, Rose M; He, Zijing; Baillargeon, Renée; Wu, Di; Bolz, Matthias; Henrich, Joseph; Setoh, Peipei; Wang, Jianxin; Laurence, Stephen

    2013-03-22

    The psychological capacity to recognize that others may hold and act on false beliefs has been proposed to reflect an evolved, species-typical adaptation for social reasoning in humans; however, controversy surrounds the developmental timing and universality of this trait. Cross-cultural studies using elicited-response tasks indicate that the age at which children begin to understand false beliefs ranges from 4 to 7 years across societies, whereas studies using spontaneous-response tasks with Western children indicate that false-belief understanding emerges much earlier, consistent with the hypothesis that false-belief understanding is a psychological adaptation that is universally present in early childhood. To evaluate this hypothesis, we used three spontaneous-response tasks that have revealed early false-belief understanding in the West to test young children in three traditional, non-Western societies: Salar (China), Shuar/Colono (Ecuador) and Yasawan (Fiji). Results were comparable with those from the West, supporting the hypothesis that false-belief understanding reflects an adaptation that is universally present early in development.

  20. Electronic Signature Policy

    Science.gov (United States)

    Establishes the United States Environmental Protection Agency's approach to adopting electronic signature technology and best practices to ensure that electronic signatures applied to official Agency documents are legally valid and enforceable.

  1. Gene expression signatures for colorectal cancer microsatellite status and HNPCC

    DEFF Research Database (Denmark)

    Kruhøffer, M; Jensen, J L; Laiho, P

    2005-01-01

    The majority of microsatellite instable (MSI) colorectal cancers are sporadic, but a subset belongs to the syndrome hereditary non-polyposis colorectal cancer (HNPCC). Microsatellite instability is caused by dysfunction of the mismatch repair (MMR) system that leads to a mutator phenotype, and MSI is correlated to prognosis and response to chemotherapy. Gene expression signatures as predictive markers are being developed for many cancers, and the identification of a signature for MMR deficiency would be of interest both clinically and biologically. To address this issue, we profiled the gene expression of 101 stage II and III colorectal cancers (34 MSI, 67 microsatellite stable (MSS)) using high-density oligonucleotide microarrays. From these data, we constructed a nine-gene signature capable of separating the mismatch repair proficient and deficient tumours. Subsequently, we demonstrated…

  2. Blinding for unanticipated signatures

    NARCIS (Netherlands)

    D. Chaum (David)

    1987-01-01

    Previously known blind signature systems require an amount of computation at least proportional to the number of signature types, and also that the number of such types be fixed in advance. These requirements are not practical in some applications. Here, a new blind signature technique
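
    The idea behind blind signatures can be illustrated with the textbook RSA construction (a hedged sketch with toy, insecure parameters; the paper's own technique is not reproduced here):

```python
import math

# Textbook RSA blind signature (after Chaum) -- educational sketch only:
# tiny, insecure parameters; real deployments need large keys and padding.
p, q, e = 61, 53, 17                   # toy RSA primes and public exponent
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

m = 42                                 # message (hashed/encoded in practice)
r = 7                                  # requester's blinding factor
assert math.gcd(r, n) == 1

blinded = (m * pow(r, e, n)) % n       # requester blinds m
blind_sig = pow(blinded, d, n)         # signer signs without learning m
sig = (blind_sig * pow(r, -1, n)) % n  # requester unblinds

print(pow(sig, e, n) == m)             # True: valid RSA signature on m
```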

  3. Wave function of the Universe, preferred reference frame effects and metric signature transition

    International Nuclear Information System (INIS)

    Ghaffarnejad, Hossein

    2015-01-01

    A gravitational model of a non-minimally coupled Brans-Dicke (BD) scalar field ϕ with a dynamical unit time-like four-vector field is used to study flat Robertson-Walker (RW) cosmology in the presence of the variable cosmological parameter V(ϕ) = Λϕ. The aim of the paper is to seek cosmological models which exhibit metric signature transition. The problem is studied in both the classical and quantum cosmological approaches with large values of the BD parameter ω >> 1. The scale factor of the RW metric is obtained in a form that describes a nonsingular inflationary universe in the Lorentzian signature sector. The Euclidean signature sector of the solution describes a re-collapsing universe and is obtained from analytic continuation of the Lorentzian sector. The dynamical vector field together with the BD scalar field are treated as a fluid with a time-dependent barotropic index; they exhibit regular (dark) matter dominance in the Euclidean (Lorentzian) sector. We solved the Wheeler-DeWitt (WD) quantum wave equation of the cosmological system. Assuming a discrete non-zero ADM mass, we obtained solutions of the WD equation as simple harmonic quantum oscillator eigenfunctionals described by Hermite polynomials. Absolute values of these eigenfunctionals are nonzero on the hypersurface on which the metric field has signature degeneracy. The eigenfunctionals describe a nonzero probability for space-time with Lorentzian (Euclidean) signature. Maximal probability corresponds to the ground state j = 0. (paper)
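
    For reference, the textbook simple-harmonic-oscillator eigenfunctions built from Hermite polynomials, to which such WD eigenfunctionals are analogous, take the form (dimensionless coordinate ξ; general form, not the paper's specific solution):

```latex
\[
  \psi_j(\xi) \;=\; \frac{1}{\sqrt{2^j\, j!}}\;\pi^{-1/4}\, e^{-\xi^2/2}\, H_j(\xi),
  \qquad j = 0, 1, 2, \dots
\]
```

where $H_j$ denotes the Hermite polynomials, with $H_0(\xi) = 1$ and $H_1(\xi) = 2\xi$.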

  4. A non-traditional multinational approach to construction inspection program

    International Nuclear Information System (INIS)

    Ram, Srinivasan; Smith, M.E.; Walker, T.F.

    2007-01-01

    The next generation of nuclear plants will be fabricated, constructed and licensed in markedly different ways than the present light water reactors. Non-traditional commercial nuclear industry suppliers, shipyards in the USA, and international fabricators would be sources of major components and subsystems. The codes of construction may vary depending upon the prevailing codes and standards used by the respective supplier; such codes and standards need to be reconciled with the applicable regulations (e.g., 10 CFR 52). A Construction Inspection Program is an integral part of the quality assurance measures required during the construction phase of the power plant. In order to achieve the stated cost and schedule goals of the new-build plants, a non-traditional multinational approach would be required. In lieu of the traditional approach, in which an individual utility inspects the quality of fabrication and construction, a multi-utility team approach is discussed. Likewise, a multinational cooperative licensing approach is suggested, taking advantage of inspectors of the regulatory authority where the component would be built. The multinational approach proposed here is based on the principle of forming teaming agreements between the utilities, vendors and the regulators. For instance, rather than sending Country A's inspectors all over the world, inspectors of the regulator in Country B, where a particular component is being fabricated, would perform the required inspections for Country A's regulator. Similar teaming arrangements could be set up between utilities and vendors in different countries. The required oversight for the utility or the vendor could be performed by their counterparts in the country where a particular item is being fabricated.

  5. 1 CFR 18.7 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Signature. 18.7 Section 18.7 General Provisions... PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.7 Signature. The original and each duplicate original... stamped beneath the signature. Initialed or impressed signatures will not be accepted. Documents submitted...

  6. Attribute-Based Digital Signature System

    NARCIS (Netherlands)

    Ibraimi, L.; Asim, Muhammad; Petkovic, M.

    2011-01-01

    An attribute-based digital signature system comprises a signature generation unit (1) for signing a message (m) by generating a signature (s) based on a user secret key (SK) associated with a set of user attributes, wherein the signature generation unit (1) is arranged for combining the user secret

  7. Reward uncertainty enhances incentive salience attribution as sign-tracking

    Science.gov (United States)

    Anselme, Patrick; Robinson, Mike J. F.; Berridge, Kent C.

    2014-01-01

    Conditioned stimuli (CSs) come to act as motivational magnets following repeated association with unconditioned stimuli (UCSs) such as sucrose rewards. On traditional views, the more reliably predictive a Pavlovian CS-UCS association is, the more attractive the CS becomes. However, in some cases, less predictability might equal more motivation. Here we examined the effect of introducing uncertainty into the CS-UCS association on CS strength as an attractive motivational magnet. Experiment 1 assessed the effects of Pavlovian predictability versus uncertainty about reward probability and/or reward magnitude on the acquisition and expression of sign-tracking (ST) and goal-tracking (GT) responses in an autoshaping procedure. Results suggested that uncertainty produced the strongest incentive salience, expressed as sign-tracking. Experiment 2 examined whether a within-individual temporal shift from certainty to uncertainty conditions could produce a stronger CS motivational magnet when uncertainty began, and found that sign-tracking still increased after the shift. Overall, our results support earlier reports that ST responses become more pronounced in the presence of uncertainty regarding CS-UCS associations, especially when uncertainty combines both probability and magnitude. These results suggest that Pavlovian uncertainty, although diluting predictability, is still able to enhance the incentive motivational power of particular CSs. PMID:23078951

  8. PROSPECTS OF INTRODUCTION OF NON-TRADITIONAL FRUIT BERRY AND VEGETABLE CROPS IN THE CONDITIONS OF DAGESTAN

    Directory of Open Access Journals (Sweden)

    M. S. Gins

    2014-01-01

    Full Text Available On June 9-13, 2014, Makhachkala hosted the XI International scientific-methodical conference «Introduction, conservation and use of biological diversity of cultivated plants», organized by FGBNU VNIISSOK, the Dagestan Research Institute for Agriculture and GBS DSC RAS. The conference was attended by scientists from Russia, the CIS and foreign countries. The conference highlighted Dagestan as a prime location for the cultivation of both traditional and non-traditional plants with a high content of biologically active substances, as well as a testing ground for resistance trials, owing to its combination of mountain and plain zones.

  9. Connecting Bourdieu, Winnicott, and Honneth: Understanding the Experiences of Non-Traditional Learners through an Interdisciplinary Lens

    Science.gov (United States)

    West, Linden; Fleming, Ted; Finnegan, Fergal

    2013-01-01

    This paper connects Bourdieu's concepts of habitus, dispositions and capital with a psychosocial analysis of how Winnicott's psychoanalysis and Honneth's recognition theory can be of importance in understanding how and why non-traditional students remain in higher education. Understanding power relations in an interdisciplinary way makes…

  10. Fair quantum blind signatures

    International Nuclear Information System (INIS)

    Tian-Yin, Wang; Qiao-Yan, Wen

    2010-01-01

    We present a new fair blind signature scheme based on the fundamental properties of quantum mechanics. In addition, we analyse the security of this scheme, and show that it is not possible to forge valid blind signatures. Moreover, comparisons between this scheme and public key blind signature schemes are also discussed. (general)

  11. Cosmological transitions with changes in the signature of the metric

    International Nuclear Information System (INIS)

    Sakharov, A.D.

    1984-01-01

    It is conjectured that there exist states of the physical continuum which include regions with different signatures of the metric and that the observed Universe and an infinite number of other Universes arose as a result of quantum transitions with a change in the signature of the metric. The Lagrangian in such a theory must satisfy conditions of non-negativity in the regions with even signature. Signature here means the number of time coordinates. The induced gravitational Lagrangian in a conformally invariant theory of Kaluza-Klein type evidently satisfies this requirement and leads to effective equations of the gravitational theory of macroscopic space identical to the equations of the general theory of relativity. It is suggested that in our Universe there exist in addition to the observable (macroscopic) time dimension two or some other even number of compactified time dimensions. It is suggested that the formation of a Euclidean region in the center of a black hole or in the cosmological contraction of the Universe (if it is predetermined by the dynamics) is a possible outcome of gravitational collapse

  12. Xenon adsorption on geological media and implications for radionuclide signatures.

    Science.gov (United States)

    Paul, M J; Biegalski, S R; Haas, D A; Jiang, H; Daigle, H; Lowrey, J D

    2018-07-01

    The detection of radioactive noble gases is a primary technology for verifying compliance with the pending Comprehensive Nuclear-Test-Ban Treaty. A fundamental challenge in applying this technology for detecting underground nuclear explosions is estimating the timing and magnitude of the radionuclide signatures. While the primary mechanism for transport is advective transport, either through barometric pumping or thermally driven advection, diffusive transport in the surrounding matrix also plays a secondary role. From the study of primordial noble gas signatures, it is known that xenon has a strong physical adsorption affinity in shale formations. Given the unselective nature of physical adsorption, isotherm measurements reported here show that non-trivial amounts of xenon adsorb on a variety of media, in addition to shale. A dual-porosity model is then discussed demonstrating that sorption amplifies the diffusive uptake of an adsorbing matrix from a fracture. This effect may reduce the radioxenon signature down to approximately one-tenth, similar to primordial xenon isotopic signatures. Copyright © 2018 Elsevier Ltd. All rights reserved.
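
    As an illustration of the kind of isotherm such adsorption measurements produce, a single-site Langmuir form is sketched below (a common textbook choice; the paper's actual isotherm model and parameters are not specified here):

```python
# Langmuir isotherm: adsorbed amount q = q_max * K * P / (1 + K * P),
# where q_max (monolayer capacity) and K (equilibrium constant) are
# hypothetical fit parameters for illustration.
def langmuir(pressure, q_max, K):
    return q_max * K * pressure / (1.0 + K * pressure)

# adsorbed amount rises with pressure and saturates toward q_max
for P in (0.1, 1.0, 10.0):
    print(P, langmuir(P, q_max=2.0, K=0.5))
```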

  13. A Traditional Turkish Fermented Non-Alcoholic Grape-Based Beverage, “Hardaliye”

    Directory of Open Access Journals (Sweden)

    Fatma Coskun

    2017-01-01

    Full Text Available Hardaliye is a non-alcoholic fermented beverage produced in a traditional way in Thrace, the European part of Turkey. The nutritional value of hardaliye derives from the grapes and the fermentation process; its health benefits are also related to the etheric oils present in mustard seeds. Hardaliye is a lactic acid fermented traditional beverage produced from grape juice and crushed grapes with the addition of different concentrations of whole/ground or heat-treated mustard seeds and sour cherry leaves. The color of hardaliye reflects the original color of the grapes, and the beverage has a characteristic aroma; dark red grapes are preferred. Benzoic acid is used as a preservative during production; it inhibits or decreases alcohol production by affecting the yeast. Fermentation occurs at room temperature for 7-10 days; if the ambient temperature is low, the fermentation process can be extended to 20 days. Once fermented, hardaliye is stored at 4 °C for three to four months. Hardaliye is consumed either fresh or aged; if aged, it may contain alcohol. Industrial production is currently only small-scale and needs development. More studies are required to determine the characteristic properties of hardaliye; identification of these properties will support industrial production.

  14. Coping with the energy crisis: Impact assessment and potentials of non-traditional renewable energy in rural Kyrgyzstan

    International Nuclear Information System (INIS)

    Liu, Melisande F.M.; Pistorius, Till

    2012-01-01

    The Kyrgyz energy sector is characterised by a dramatic energy crisis that has deprived a substantial part of the population of access to energy. Non-traditional renewable energy sources have emerged as a promising alternative for providing basic energy services to the rural poor. Based on qualitative interview data from local households and project planners, this study sets out to assess the impacts, limitations and barriers of non-traditional renewable energy projects in rural areas of Kyrgyzstan. This study argues that recent renewable energy efforts by multilateral international agencies, the private sector, and non-governmental organisations exhibit great potential for creating tangible benefits and improving basic energy services, but have so far been inefficient in establishing and replicating sustainable, long-term energy solutions. Existing practices need to be improved by attaching greater importance to the capacities and real needs of the rural poor. The guidance of integrated programmes and policies, along with alternative financing schemes and awareness-raising, is urgently needed to leverage local success stories and to facilitate sustainable energy development in rural Kyrgyzstan. - Highlights: ► We examine 11 rural households and 5 project planners in rural Kyrgyzstan. ► We assess impacts of non-traditional renewable energies compared with conventional fuels. ► Renewable energies exhibit a range of tangible benefits for rural users. ► Limitations concern performance, durability, repair, acceptance, finance and policy. ► Renewable energy is a promising alternative for rural households in Kyrgyzstan.

  15. Trace elements and naturally occurring radioactive materials in 'Non-traditional fertilizers' used in Ghana

    International Nuclear Information System (INIS)

    Assibey, E. O.

    2013-07-01

    Fertilizers have been implicated for being contaminated with toxic trace elements and naturally occurring radioactive materials (NORMs), even though they are an indispensable component of our agriculture. This phenomenon of contamination has been investigated and established world-wide in various forms of fertilizers (i.e., the granular or 'traditional' type and the liquid/powder or 'non-traditional' type). In Ghana, the crop sub-sector has seen a gradual rise in the importation and use of 'non-traditional fertilizers', which are applied to both the foliar parts and roots of plants. This notwithstanding, research on fertilizers has been largely skewed towards the 'traditional' types, focusing principally on the subjects of yield, effects of application and their quality. This study was, therefore, undertaken to bridge the knowledge gap by investigating the levels of trace elements and NORMs found in the 'non-traditional' fertilizers used in Ghana. The principal objective of the study was to investigate the suitability of the 'non-traditional fertilizers' for agricultural purposes with respect to trace element and NORM contamination. Atomic Absorption Spectrometry and Instrumental Neutron Activation Analysis were employed to determine the trace element (Cu, Zn, Fe, Na, Al, Br, Ni, Cd, As, Hg, Co, Pb, La, Mn, Si, Ca, Cl, S, K, Ba and V) and NORM (238U, 232Th and 40K) concentrations in thirty-nine (39) fertilizer samples taken from two major agro-input hubs in the country (Kumasi-Kejetia and Accra). Multivariate statistical analyses (cluster analysis, principal component analysis and Pearson's correlation) were applied to the data obtained in order to identify possible sources of contamination, to investigate sample/parameter affinities and groupings, and for fingerprinting. The toxic trace element concentrations determined in all samples were found to be in the order Fe > Cu > Co > Cd > Cr > Ni > Pb > As > Hg.
The study found most of the trace elements determined to be within limits set
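The multivariate fingerprinting step mentioned above (principal component analysis over element concentrations) can be sketched generically; the sample data below are invented, and this is a textbook PCA, not the study's actual workflow:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the principal components of their element
    concentrations. X: (n_samples, n_elements). Returns (scores, ratio)."""
    Xc = X - X.mean(axis=0)                 # centre each element column
    cov = np.cov(Xc, rowvar=False)          # element-element covariance
    eigvals, eigvecs = np.linalg.eigh(cov)  # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]       # descending explained variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = Xc @ eigvecs[:, :n_components]
    ratio = eigvals[:n_components] / eigvals.sum()
    return scores, ratio

# Hypothetical data: 6 fertilizer samples x 3 elements (e.g. Fe, Cu, Cd)
rng = np.random.default_rng(0)
X = rng.normal(loc=[100.0, 10.0, 1.0], scale=[20.0, 2.0, 0.1], size=(6, 3))
scores, ratio = pca_scores(X)
```

Samples that cluster together in the score space share a contamination fingerprint, which is how PCA supports source identification.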

  16. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach to model the parameter and model uncertainties affecting the results of risk indicators used in the applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory is proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available in the PRA model. The second step propagates the parameter uncertainties through the risk model to characterise the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended, firstly, to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and, secondly, to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified PRA model application. (author)
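As a rough illustration of the first two steps (belief/plausibility modelling and propagation), the sketch below pushes Dempster-Shafer focal intervals with masses through a monotone series-system risk function. The two-event model and all numbers are invented for illustration, not taken from the thesis:

```python
from itertools import product

def propagate(focal_sets, risk_fn):
    """Propagate focal intervals through a risk model that is monotone
    non-decreasing in each parameter: the output interval for each
    combination is [risk(lows), risk(highs)], with the product mass."""
    out = []
    for combo in product(*focal_sets):
        lo = risk_fn(*(c[0] for c in combo))
        hi = risk_fn(*(c[1] for c in combo))
        mass = 1.0
        for c in combo:
            mass *= c[2]
        out.append((lo, hi, mass))
    return out

def belief_leq(out, t):
    """Bel(risk <= t): mass of focal intervals entirely below t."""
    return sum(m for lo, hi, m in out if hi <= t)

def plausibility_leq(out, t):
    """Pl(risk <= t): mass of focal intervals intersecting [0, t]."""
    return sum(m for lo, hi, m in out if lo <= t)

# Two basic-event failure probabilities, each with two focal intervals
p1 = [(1e-3, 2e-3, 0.6), (2e-3, 5e-3, 0.4)]
p2 = [(1e-4, 5e-4, 0.7), (5e-4, 1e-3, 0.3)]
series_risk = lambda a, b: 1 - (1 - a) * (1 - b)  # union of two events

out = propagate([p1, p2], series_risk)
bel = belief_leq(out, 3e-3)   # lower probability that risk <= 3e-3
pl = plausibility_leq(out, 3e-3)  # upper probability of the same event
```

The gap between `bel` and `pl` is exactly the epistemic imprecision that a single probability distribution would hide.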

  17. Molecular signatures associated with HCV-induced hepatocellular carcinoma and liver metastasis.

    Directory of Open Access Journals (Sweden)

    Valeria De Giorgi

    Full Text Available Hepatocellular carcinomas (HCCs) are a heterogeneous group of tumors that differ in risk factors and genetic alterations. In Italy, particularly Southern Italy, chronic hepatitis C virus (HCV) infection represents the main cause of HCC. Using high-density oligoarrays, we identified consistent differences in gene expression between HCC and normal liver tissue. Expression patterns in HCC were also readily distinguishable from those associated with liver metastases. To characterize molecular events relevant to hepatocarcinogenesis and identify biomarkers for early HCC detection, gene expression profiles of 71 liver biopsies from HCV-related primary HCC and corresponding HCV-positive non-HCC hepatic tissue, as well as gastrointestinal liver metastases paired with the apparently normal peri-tumoral liver tissue, were compared to 6 liver biopsies from healthy individuals. Characteristic gene signatures were identified when normal tissue was compared with HCV-related primary HCC, corresponding HCV-positive non-HCC tissue, and gastrointestinal liver metastases. Pathway analysis classified the cellular and biological functions of the differentially expressed genes as related to regulation of gene expression and post-translational modification in HCV-related primary HCC; cellular growth and proliferation, and cell-to-cell signaling and interaction in HCV-related non-HCC samples; and cellular growth and proliferation, and cell cycle in metastasis. Characteristic gene signatures of HCV-HCC progression were also identified for early HCC diagnosis. A diagnostic molecular signature complementing conventional pathologic assessment was identified.

  18. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    International Nuclear Information System (INIS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-01-01

    Even though a method to perfectly sign quantum messages is not known, the arbitrated quantum signature scheme has been considered one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider a situation, slightly different from the forgery problem, in which we check whether at least one quantum message with signature can be forged in a given scheme, even if not all messages can be forged. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, and so our situation does not directly imply that we check whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such forgeable messages to a receiver, and in such a case an attacker can forge the messages if the attacker knows them. Thus it is important and necessary to look into forgeable quantum messages. We show here that such a forgeable quantum message-signature pair always exists for every known scheme with quantum encryption and rotation, and numerically show that no forgeable quantum message-signature pairs exist in an arbitrated quantum signature scheme. (paper)

  19. Estimation of the uncertainty in wind power forecasting

    International Nuclear Information System (INIS)

    Pinson, P.

    2006-03-01

    WIND POWER is experiencing a tremendous development of its installed capacity in Europe. However, the intermittence of wind generation causes difficulties in the management of power systems. In the context of deregulated electricity markets, wind energy is also penalized by its intermittent nature. It is recognized today that forecasting wind power for horizons up to 2-3 days ahead eases the integration of wind generation. Wind power forecasts are traditionally provided in the form of point predictions, which correspond to the most likely power production for a given horizon. That information alone is not sufficient for developing optimal management or trading strategies. Therefore, we investigate possible ways of estimating the uncertainty of wind power forecasts. The characteristics of the prediction uncertainty are described through a thorough study of the performance of some state-of-the-art approaches, and by underlining the influence of certain variables (e.g. the level of predicted power) on the distributions of prediction errors. Then, a generic method for the estimation of prediction intervals is introduced. This statistical method is non-parametric and utilizes fuzzy logic concepts to integrate expertise on the characteristics of the prediction uncertainty. By estimating several prediction intervals at once, one obtains predictive distributions of wind power output. The proposed method is evaluated in terms of its reliability, sharpness and resolution. In parallel, we explore the potential use of ensemble predictions for skill forecasting. Wind power ensemble forecasts are obtained either by converting meteorological ensembles (from ECMWF and NCEP) to power or by applying a poor man's temporal approach. A proposal for the definition of prediction risk indices is given, reflecting the disagreement between ensemble members over a set of successive look-ahead times.
Such prediction risk indices may comprise a more comprehensive signal on the expected level
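The core idea behind level-conditional prediction intervals can be illustrated with a purely empirical sketch on synthetic data: bin historical point forecasts by predicted power level and take error quantiles per bin. The thesis's actual method additionally integrates fuzzy-logic expertise, which is omitted here:

```python
import numpy as np

def interval_by_level(forecasts, errors, n_bins=4, coverage=0.90):
    """Level-conditional prediction intervals from past forecast errors.

    Bins point forecasts into equally populated level classes and returns
    empirical error quantiles (lo, hi) per bin for the target coverage."""
    edges = np.quantile(forecasts, np.linspace(0, 1, n_bins + 1))
    alpha = (1 - coverage) / 2
    bounds = []
    for k in range(n_bins):
        mask = (forecasts >= edges[k]) & (forecasts <= edges[k + 1])
        lo, hi = np.quantile(errors[mask], [alpha, 1 - alpha])
        bounds.append((lo, hi))
    return edges, bounds

# Synthetic history: errors are widest at mid power levels, a pattern the
# thesis also highlights (level of predicted power matters)
rng = np.random.default_rng(1)
fc = rng.uniform(0, 1, 5000)                      # predicted power (p.u.)
err = rng.normal(0, 0.05 + 0.1 * fc * (1 - fc))   # level-dependent spread
edges, bounds = interval_by_level(fc, err)
```

A new point forecast is then dressed with the interval of whichever level bin it falls into.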

  20. Error and Uncertainty in the Accuracy Assessment of Land Cover Maps

    Science.gov (United States)

    Sarmento, Pedro Alexandre Reis

    Traditionally, the accuracy assessment of land cover maps is performed by comparing these maps with a reference database intended to represent the "real" land cover, and this comparison is reported through thematic accuracy measures derived from confusion matrices. However, these reference databases are themselves only a representation of reality: they contain errors due to human uncertainty in assigning the land cover class that best characterizes a certain area, which biases the thematic accuracy measures reported to the end users of these maps. The main goal of this dissertation is to develop a methodology that allows the integration of the human uncertainty present in reference databases into the accuracy assessment of land cover maps, and to analyse the impacts that this uncertainty may have on the thematic accuracy measures reported to end users. The utility of including human uncertainty in the accuracy assessment of land cover maps is investigated. Specifically, we studied the utility of fuzzy set theory, more precisely fuzzy arithmetic, for a better understanding of the human uncertainty associated with the elaboration of reference databases and of its impact on the thematic accuracy measures derived from confusion matrices. For this purpose, linguistic values transformed into fuzzy intervals that express the uncertainty in the elaboration of reference databases were used to compute fuzzy confusion matrices. The proposed methodology is illustrated with a case study assessing the accuracy of a land cover map of Continental Portugal derived from the Medium Resolution Imaging Spectrometer (MERIS). The results obtained demonstrate that including human uncertainty in reference databases provides much more information about the quality of land cover maps than the traditional approach to accuracy assessment.
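The effect of reference-label uncertainty on accuracy measures can be illustrated with plain interval arithmetic, a simplified stand-in for the dissertation's fuzzy-interval approach (the counts below are hypothetical):

```python
def interval_overall_accuracy(matrix_lo, matrix_hi):
    """Overall accuracy interval from an interval-valued confusion matrix.

    Each cell [lo, hi] encodes labeller uncertainty in the reference data.
    Accuracy d/(d+o) increases with the diagonal d and decreases with the
    off-diagonal o, so the bounds follow from opposite corner choices."""
    n = len(matrix_lo)
    def accuracy(diag_src, off_src):
        d = sum(diag_src[i][i] for i in range(n))
        o = sum(off_src[i][j] for i in range(n) for j in range(n) if i != j)
        return d / (d + o)
    return accuracy(matrix_lo, matrix_hi), accuracy(matrix_hi, matrix_lo)

# Hypothetical 2-class confusion matrix with +/-5 count uncertainty per cell
crisp = [[80, 10], [5, 60]]
lo_m = [[max(c - 5, 0) for c in row] for row in crisp]
hi_m = [[c + 5 for c in row] for row in crisp]
acc_lo, acc_hi = interval_overall_accuracy(lo_m, hi_m)
```

Instead of a single accuracy figure (here 140/155), the end user receives a range that makes the reference-database uncertainty visible.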

  1. The Decline of Traditional Banking Activities

    Directory of Open Access Journals (Sweden)

    Gabriela Cornelia Piciu

    2011-05-01

    Full Text Available The decline of traditional banking activities raises the issue of the efficiency of financial stability, in both quantitative and qualitative terms: the increasing danger of banking failures, as well as greater susceptibility due to the increased propensity of banking institutions to assume additional risks, either in the form of riskier loan offers or by engaging in other "non-traditional" financial activities which promise greater profitability but also carry higher risks. Non-traditional banking activities, such as dealing in financial products (financial derivatives), generate increasing risks and vulnerabilities in the form of moral hazard issues. That is the reason why these activities should be regulated, just as the traditional activities are. The challenge posed by the decline of traditional banking activities is twofold: the stability of the banking system must be maintained, while the banking system needs to be restructured to achieve financial stability in the long run. One possible way is an appropriate regulatory framework to encourage a transition period of changing the structure of banking activity (reduction of traditional activities and expansion of non-traditional activities), to enable banking institutions to perform a deep methodical analysis of non-traditional activities oriented to financial banking efficiency.

  2. Medication non-adherence and uncertainty: Information-seeking and processing in the Danish LIFESTAT survey.

    Science.gov (United States)

    Kriegbaum, Margit; Lau, Sofie Rosenlund

    2017-09-23

    Statins are widely prescribed to lower cardiovascular morbidity and mortality. However, statin non-adherence is very high. The aim of this paper was to investigate reasons for stopping statin treatment in the general population and to study how aspects of information-seeking and processing are associated with statin non-adherence. This study used a population survey of 3050 Danish residents aged 45-65 years. Reasons for statin discontinuation were studied among previous statin users. The associations between information seeking and processing and statin discontinuation were analysed using multivariate logistic regression models. Experience of side effects and fear of side effects played an important role in the discontinuation of statin treatment. Feelings of uncertainty and confusion regarding information on statins predicted statin discontinuation; this applied to information both from mass media and from general practitioners. There was no clear pattern linking information seeking and statin non-adherence. The article points to the impact of information-seeking on the decision to take cholesterol-lowering medication, including contributions from information disseminated by media outlets. Side effects and fear of side effects should be addressed in clinical practice, and health care professionals should pay attention to the emotional aspects of how information is disseminated and perceived by statin users. Copyright © 2017. Published by Elsevier Inc.

  3. Uncertainty propagation for statistical impact prediction of space debris

    Science.gov (United States)

    Hoogendoorn, R.; Mooij, E.; Geul, J.

    2018-01-01

    Predictions of the impact time and location of space debris in a decaying trajectory are highly influenced by uncertainties. The traditional Monte Carlo (MC) method can be used to perform accurate statistical impact predictions, but requires a large computational effort. A method is investigated that directly propagates a Probability Density Function (PDF) in time, which has the potential to obtain more accurate results with less computational effort. The decaying trajectory of Delta-K rocket stages was used to test the methods using a six degrees-of-freedom state model. The PDF of the state of the body was propagated in time to obtain impact-time distributions. This Direct PDF Propagation (DPP) method results in a multi-dimensional scattered dataset of the PDF of the state, which is highly challenging to process. No accurate results could be obtained, because of the structure of the DPP data and the high dimensionality. Therefore, the DPP method is less suitable for practical uncontrolled entry problems and the traditional MC method remains superior. Additionally, the MC method was used with two improved uncertainty models to obtain impact-time distributions, which were validated using observations of true impacts. For one of the two uncertainty models, statistically more valid impact-time distributions were obtained than in previous research.
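The Monte Carlo side of such a comparison is easy to sketch with a deliberately simplified one-dimensional fall model; the paper itself uses a six degrees-of-freedom state model, and all parameter values below are invented:

```python
import numpy as np

def impact_time(h0, bc, dt=1.0, g=9.81, rho0=1.225, H=8500.0):
    """Toy point-mass fall with exponential-atmosphere drag. bc is a
    ballistic coefficient [kg/m^2]; returns time to reach the ground."""
    h, v, t = h0, 0.0, 0.0
    while h > 0.0:
        rho = rho0 * np.exp(-h / H)          # density ~ exp(-h/H)
        a = g - rho * v * v / (2.0 * bc)     # gravity minus drag deceleration
        v += a * dt
        h -= v * dt
        t += dt
    return t

# MC propagation: sample the uncertain inputs, collect impact times
rng = np.random.default_rng(7)
n = 500
h0 = rng.normal(80e3, 1e3, n)               # uncertain start altitude [m]
bc = rng.lognormal(np.log(500.0), 0.15, n)  # uncertain ballistic coefficient
times = np.array([impact_time(h, b) for h, b in zip(h0, bc)])
p5, p50, p95 = np.percentile(times, [5, 50, 95])
```

The percentiles of `times` form the impact-time distribution that, in the paper, is validated against observed impacts.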

  4. The Gritty: Grit and Non-traditional Doctoral Student Success

    Directory of Open Access Journals (Sweden)

    Ted M. Cross

    2014-07-01

    Full Text Available As higher education changes to reach larger numbers of students via online modalities, the issue of student attrition and other measures of student success become increasingly important. While research has focused largely on undergraduate online students, less has been done on the success of online non-traditional doctoral students, particularly from the student-trait perspective. The concept of grit (passion and persistence for long-term goals) has been identified as an important element in the successful attainment of long-term goals. As doctoral education is a long-term goal, the purpose of this study was to examine the impact of doctoral students' grit scores on student success. Success was measured by examining current student GPA and other factors. Significant relationships were found between grit and current student GPA, between grit and the average number of hours students spent on their program of study weekly, and between grit and age. The results of this research may be important for informing how doctoral education is structured and how students might be better prepared for doctoral work.

  5. Diffusion of non-traditional cookstoves across western Honduras: A social network analysis

    International Nuclear Information System (INIS)

    Ramirez, Sebastian; Dwivedi, Puneet; Ghilardi, Adrian; Bailis, Robert

    2014-01-01

    A third of the world's population uses inefficient biomass stoves, contributing to severe health problems, forest degradation, and climate change. Clean-burning, fuel-efficient, non-traditional cookstoves (NTCS) are a promising solution; however, numerous projects fail during the diffusion process. We use social network analysis to reveal patterns driving a successful stove intervention in western Honduras. The intervention lacks formal marketing, but has spread across a wide area in just a few years. To understand the process, we map the social network of active community members (ACMs) who drove diffusion across a large swath of the country. We find that most ACMs heard about the stoves twice before sharing information about them with others and introducing the stove into their own communities. On average, the social distance between ACMs and the project team is 3 degrees of separation. Both men and women are critical to the diffusion process, but men tend to communicate over longer distances, while women principally communicate over shorter distances. Government officials are also crucial to diffusion. Understanding how information moves through social networks and across geographic space allows us to theorize how knowledge about beneficial technologies spreads in the absence of formal marketing, and informs policies for NTCS deployment worldwide. - Highlights: • We build a chain of referrals to track the spread of information about non-traditional cookstoves. • We find differences among genders and occupations that should inform policy. • People hear about the stoves twice before becoming suppliers of information. • Government officials play a substantial role in the diffusion. • Males play the leading role in diffusion over long distances, females over short distances
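The "3 degrees of separation" figure is a shortest-path statistic over the referral network; a minimal breadth-first-search sketch on an invented network (names and edges are hypothetical):

```python
from collections import deque

def degrees_of_separation(graph, source):
    """Breadth-first search: shortest path length (in referral hops)
    from `source` to every reachable node of an undirected network."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, ()):
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

# Hypothetical referral chain: project team -> promoters -> villagers
edges = [("team", "ana"), ("ana", "berta"), ("berta", "carlos"),
         ("ana", "diego"), ("diego", "elena"), ("elena", "fidel")]
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

dist = degrees_of_separation(graph, "team")
mean_distance = sum(d for n, d in dist.items() if n != "team") / (len(dist) - 1)
```

Averaging `dist` over the ACM nodes gives the kind of mean social distance the study reports.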

  6. Quantifying Uncertainty in Instantaneous Orbital Data Products of TRMM over Indian Subcontinent

    Science.gov (United States)

    Jayaluxmi, I.; Nagesh, D.

    2013-12-01

    In the last 20 years, microwave radiometers have taken satellite images of Earth's weather, proving to be a valuable tool for the quantitative estimation of precipitation from space. However, along with the widespread acceptance of microwave-based precipitation products, it has also been recognized that they contain large uncertainties. While most uncertainty evaluation studies focus on the accuracy of rainfall accumulated over time (e.g., a season or year), evaluations of instantaneous rainfall intensities from satellite orbital data products are relatively rare. These instantaneous products are known to potentially cause large uncertainties during real-time flood forecasting studies at the watershed scale, especially over land regions, where the highly varying land surface emissivity offers a myriad of complications hindering accurate rainfall estimation. The error components of orbital data products also tend to interact nonlinearly with hydrologic modeling uncertainty. Keeping these in mind, the present study develops an uncertainty analysis using instantaneous satellite orbital data products (version 7 of 1B11, 2A25, 2A23) derived from the passive and active sensors onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the TRMM Microwave Imager (TMI) and the Precipitation Radar (PR). The study utilizes 11 years of orbital data from 2002 to 2012 over the Indian subcontinent and examines the influence of various error sources on the convective and stratiform precipitation types. The analysis, conducted over the land regions of India, investigates three sources of uncertainty in detail: 1) errors due to improper delineation of the rainfall signature within the microwave footprint (rain/no-rain classification), 2) uncertainty in the transfer function linking rainfall with the TMI low-frequency channels, and 3) sampling errors owing to the narrow swath and infrequent visits of the TRMM sensors. Case study results obtained during the Indian summer

  7. Performance of maximum likelihood mixture models to estimate nursery habitat contributions to fish stocks: a case study on sea bream Sparus aurata

    Directory of Open Access Journals (Sweden)

    Edwin J. Niklitschek

    2016-10-01

    Full Text Available Background Mixture models (MM) can be used to describe mixed stocks considering three sets of parameters: the total number of contributing sources, their chemical baseline signatures and their mixing proportions. When all nursery sources have been previously identified and sampled for juvenile fish to produce baseline nursery-signatures, mixing proportions are the only unknown set of parameters to be estimated from the mixed-stock data. Otherwise, the number of sources, as well as some/all nursery-signatures, may need to be also estimated from the mixed-stock data. Our goal was to assess bias and uncertainty in these MM parameters when estimated using unconditional maximum likelihood approaches (ML-MM), under several incomplete sampling and nursery-signature separation scenarios. Methods We used a comprehensive dataset containing otolith elemental signatures of 301 juvenile Sparus aurata, sampled in three contrasting years (2008, 2010, 2011), from four distinct nursery habitats (Mediterranean lagoons). Artificial nursery-source and mixed-stock datasets were produced considering: five different sampling scenarios where 0–4 lagoons were excluded from the nursery-source dataset and six nursery-signature separation scenarios that simulated data separated 0.5, 1.5, 2.5, 3.5, 4.5 and 5.5 standard deviations among nursery-signature centroids. Bias (BI) and uncertainty (SE) were computed to assess reliability for each of the three sets of MM parameters. Results Both bias and uncertainty in mixing proportion estimates were low (BI ≤ 0.14, SE ≤ 0.06) when all nursery-sources were sampled but exhibited large variability among cohorts and increased with the number of non-sampled sources up to BI = 0.24 and SE = 0.11. Bias and variability in baseline signature estimates also increased with the number of non-sampled sources, but tended to be less biased, and more uncertain than mixing proportion ones, across all sampling scenarios (BI < 0.13, SE < 0
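When the baseline signatures are known, maximum-likelihood estimation of the mixing proportions alone reduces to EM updates on the proportions. A one-dimensional Gaussian sketch with synthetic data (the study uses multivariate otolith signatures, and the means, SDs and proportions below are invented):

```python
import numpy as np

def em_mixing_proportions(x, means, sds, n_iter=200):
    """ML estimate of mixing proportions when the per-source baseline
    signatures (1-D Gaussians here) are fixed and known: EM iterates
    only on the proportion vector pi."""
    k = len(means)
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities P(source j | observation i), shape (k, n)
        dens = np.array([pi[j] * np.exp(-0.5 * ((x - means[j]) / sds[j]) ** 2)
                         / sds[j] for j in range(k)])
        resp = dens / dens.sum(axis=0)
        # M-step: proportions are the mean responsibilities
        pi = resp.mean(axis=1)
    return pi

rng = np.random.default_rng(3)
means, sds = np.array([0.0, 3.0, 6.0]), np.array([1.0, 1.0, 1.0])
true_pi = np.array([0.5, 0.3, 0.2])
counts = (true_pi * 3000).astype(int)
x = np.concatenate([rng.normal(m, s, c) for m, s, c in zip(means, sds, counts)])
pi_hat = em_mixing_proportions(x, means, sds)
```

Shrinking the gaps between the `means` mimics the study's signature-separation scenarios: the closer the baselines, the more biased and uncertain `pi_hat` becomes.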

  8. Performance of maximum likelihood mixture models to estimate nursery habitat contributions to fish stocks: a case study on sea bream Sparus aurata

    Science.gov (United States)

    Darnaude, Audrey M.

    2016-01-01

    Background Mixture models (MM) can be used to describe mixed stocks considering three sets of parameters: the total number of contributing sources, their chemical baseline signatures and their mixing proportions. When all nursery sources have been previously identified and sampled for juvenile fish to produce baseline nursery-signatures, mixing proportions are the only unknown set of parameters to be estimated from the mixed-stock data. Otherwise, the number of sources, as well as some/all nursery-signatures, may need to be also estimated from the mixed-stock data. Our goal was to assess bias and uncertainty in these MM parameters when estimated using unconditional maximum likelihood approaches (ML-MM), under several incomplete sampling and nursery-signature separation scenarios. Methods We used a comprehensive dataset containing otolith elemental signatures of 301 juvenile Sparus aurata, sampled in three contrasting years (2008, 2010, 2011), from four distinct nursery habitats (Mediterranean lagoons). Artificial nursery-source and mixed-stock datasets were produced considering: five different sampling scenarios where 0–4 lagoons were excluded from the nursery-source dataset and six nursery-signature separation scenarios that simulated data separated 0.5, 1.5, 2.5, 3.5, 4.5 and 5.5 standard deviations among nursery-signature centroids. Bias (BI) and uncertainty (SE) were computed to assess reliability for each of the three sets of MM parameters. Results Both bias and uncertainty in mixing proportion estimates were low (BI ≤ 0.14, SE ≤ 0.06) when all nursery-sources were sampled but exhibited large variability among cohorts and increased with the number of non-sampled sources up to BI = 0.24 and SE = 0.11. Bias and variability in baseline signature estimates also increased with the number of non-sampled sources, but tended to be less biased, and more uncertain than mixing proportion ones, across all sampling scenarios (BI nursery signatures improved reliability

  9. The Chiral Index of the Fermionic Signature Operator

    OpenAIRE

    Finster, Felix

    2014-01-01

    We define an index of the fermionic signature operator on even-dimensional globally hyperbolic spin manifolds of finite lifetime. The invariance of the index under homotopies is studied. The definition is generalized to causal fermion systems with a chiral grading. We give examples of space-times and Dirac operators thereon for which our index is non-trivial.

  10. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented in Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of the parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
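The linear (FOSM) analyses mentioned above rest on a standard Bayes-linear covariance update. A generic numpy sketch of that core equation (this is not pyEMU's API, and the Jacobian and covariances below are hypothetical):

```python
import numpy as np

def fosm_posterior_cov(J, prior_cov, obs_cov):
    """First-order, second-moment posterior parameter covariance:
        C_post = (J^T C_obs^-1 J + C_prior^-1)^-1
    where J is the Jacobian of observations w.r.t. parameters."""
    Jt_Co_inv = J.T @ np.linalg.inv(obs_cov)
    return np.linalg.inv(Jt_Co_inv @ J + np.linalg.inv(prior_cov))

# Hypothetical linearized model: 3 observations, 2 parameters
J = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
prior = np.diag([1.0, 1.0])   # prior parameter variances
obs = np.diag([0.1, 0.1, 0.1])  # observation-noise variances
post = fosm_posterior_cov(J, prior, obs)
```

Because the update is linear algebra on a Jacobian, it can be evaluated before any parameter estimation run, which is exactly why FOSM results can inform parameterization decisions up front.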

  11. New particles and their experimental signatures

    International Nuclear Information System (INIS)

    Ellis, J.; Gelmini, G.; Kowalski, H.

    1984-08-01

    This report summarizes work done by our theoretical working group on exotic particles before, during and since the Lausanne meeting. We discuss the motivations, rates and experimental signatures for new physics and new particles in the 1 TeV mass range. Section 2 reviews some of the motivations for expecting new physics in this range; of particular interest is the physics of gauge symmetry breaking. In section 3 we discuss the rates and experimental signatures of new particles predicted by theoretical models of gauge symmetry breaking, notably the Higgs boson, supersymmetry and technicolour. Among the signatures we discuss are multiple W± and/or Z0 events (for the Higgs), missing transverse energy (for supersymmetry) and multiple anti-tt events (for the Higgs and technicolour). We provide many examples of final-state differential distributions in rapidity and pT, particularly for Higgses and for supersymmetry. We also analyse some physics backgrounds to the new particle production processes which interest us. Examples include W+W-, Z0Z0, W(anti-tt) and (anti-tt) production as backgrounds to Higgs production. However, we do not consider in detail non-physics backgrounds such as the jet-fluctuation background to missing-energy signals for supersymmetry production. Section 4 summarizes our preliminary conclusions on the observability at a high energy hadron collider of the new particles studied in this report. (orig./HSI)

  12. A qualitative signature for early diagnosis of hepatocellular carcinoma based on relative expression orderings.

    Science.gov (United States)

    Ao, Lu; Zhang, Zimei; Guan, Qingzhou; Guo, Yating; Guo, You; Zhang, Jiahui; Lv, Xingwei; Huang, Haiyan; Zhang, Huarong; Wang, Xianlong; Guo, Zheng

    2018-04-23

    Currently, using biopsy specimens to confirm suspicious liver lesions of early hepatocellular carcinoma is not entirely reliable because of insufficient sampling amounts and inaccurate sampling locations. It is necessary to develop a signature to aid early hepatocellular carcinoma diagnosis using biopsy specimens even when the sampling location is inaccurate. Based on the within-sample relative expression orderings of gene pairs, we identified a simple qualitative signature to distinguish both hepatocellular carcinoma and adjacent non-tumour tissues from the cirrhosis tissues of non-hepatocellular carcinoma patients. A signature consisting of 19 gene pairs was identified in the training data sets and validated in 2 large collections of samples from biopsy and surgical resection specimens. For biopsy specimens, 95.7% of 141 hepatocellular carcinoma tissues and all (100%) of 108 cirrhosis tissues of non-hepatocellular carcinoma patients were correctly classified. Especially, all (100%) of 60 hepatocellular carcinoma adjacent normal tissues and 77.5% of 80 hepatocellular carcinoma adjacent cirrhosis tissues were classified as hepatocellular carcinoma. For surgical resection specimens, 99.7% of 733 hepatocellular carcinoma specimens were correctly classified as hepatocellular carcinoma, while 96.1% of 254 hepatocellular carcinoma adjacent cirrhosis tissues and 95.9% of 538 hepatocellular carcinoma adjacent normal tissues were classified as hepatocellular carcinoma. In contrast, 17.0% of 47 cirrhosis samples from non-hepatocellular carcinoma patients waiting for liver transplantation were classified as hepatocellular carcinoma, indicating that some patients with long-lasting cirrhosis could have already gained hepatocellular carcinoma characteristics. The signature can distinguish both hepatocellular carcinoma tissues and tumour-adjacent tissues from cirrhosis tissues of non-hepatocellular carcinoma patients even using inaccurately sampled biopsy specimens, which can aid early
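A within-sample relative-expression-ordering rule is rank-based, so it needs no cross-sample normalisation. A toy sketch of the idea with an invented 3-pair signature (the published signature consists of 19 specific gene pairs not reproduced here):

```python
def reo_classify(expr, pairs, threshold=0.5):
    """Majority vote over relative expression orderings within one sample:
    each pair (a, b) votes 'HCC' when expr[a] < expr[b]. Because only
    within-sample ranks matter, the call is invariant to any monotone
    rescaling of the sample's expression values."""
    votes = sum(1 for a, b in pairs if expr[a] < expr[b])
    return "HCC" if votes / len(pairs) > threshold else "non-HCC"

# Hypothetical 3-pair signature over gene identifiers g1..g6
pairs = [("g1", "g2"), ("g3", "g4"), ("g5", "g6")]
sample = {"g1": 2.1, "g2": 5.0, "g3": 1.0, "g4": 0.4, "g5": 3.3, "g6": 7.8}
label = reo_classify(sample, pairs)
```

This scale-invariance is what makes such qualitative signatures robust to the measurement variation of small, imperfectly located biopsy specimens.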

  13. Is the investment-uncertainty relationship nonlinear? An empirical analysis for the Netherlands

    NARCIS (Netherlands)

    Bo, H; Lensin, R

    We examine the investment-uncertainty relationship for a panel of Dutch non-financial firms. The system generalized method of moments (GMM) estimates suggest that the effect of uncertainty on investment is nonlinear: for low levels of uncertainty, an increase in uncertainty has a positive effect on investment.

  14. Gamma signatures of the C-BORD Tagged Neutron Inspection System

    Directory of Open Access Journals (Sweden)

    Sardet A.

    2018-01-01

    In the frame of the C-BORD project (H2020 programme of the EU), a Rapidly relocatable Tagged Neutron Inspection System (RRTNIS) is being developed to non-intrusively detect explosives, chemical threats, and other illicit goods in cargo containers. Material identification is performed through gamma spectroscopy, using twenty NaI detectors and four LaBr3 detectors, to determine the different elements composing the inspected item from their specific gamma signatures induced by fast neutrons. An unfolding algorithm decomposes the energy spectrum of a suspect item, selected by X-ray radiography and on which the RRTNIS inspection is focused, against a database of pure-element gamma signatures. This paper reports on simulated signatures for the NaI and LaBr3 detectors, constructed using the MCNP6 code. First experimental spectra of a few elements of interest are also presented.

  15. Radon measurements: the sources of uncertainties

    International Nuclear Information System (INIS)

    Zhukovsky, Michael; Onischenko, Alexandra; Bastrikov, Vladislav

    2008-01-01

    Radon measurement is a complicated process, and the correct estimation of uncertainties is very important. The sources of uncertainty for grab sampling, short-term measurements (charcoal canisters), long-term measurements (track detectors) and retrospective measurements (surface traps) are analyzed. The main sources of uncertainty for grab-sampling measurements are: systematic bias of the reference equipment; random Poisson and non-Poisson errors during calibration; and random Poisson and non-Poisson errors during measurements. These sources are also common to short-term measurements (charcoal canisters) and long-term measurements (track detectors). Usually, high radon concentrations (1-5 kBq/m³) are used during calibration, and the Poisson random error rarely exceeds a few percent. Nevertheless, the dispersion of measured values even during calibration usually exceeds the Poisson dispersion expected on the basis of counting statistics. The origins of such non-Poisson random errors during calibration differ between kinds of instrumental measurements, and at present not all sources of non-Poisson random error have been reliably identified. The initial calibration uncertainty of working devices rarely exceeds 20%. Real radon concentrations are usually in the range from some tens to some hundreds of becquerels per cubic metre, and at low radon levels the Poisson random error can reach 20%. The random non-Poisson errors and residual systematic biases depend on the measurement technique and the environmental conditions during the radon measurements. For charcoal canisters there are additional sources of measurement error due to the influence of air humidity and variations of the radon concentration during the canister exposure. The accuracy of long-term measurements by track detectors depends on the quality of chemical etching after exposure and the influence of seasonal radon variations. The main sources of
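    The Poisson contribution described in this record follows directly from counting statistics: the relative standard uncertainty of a count N is 1/√N. A minimal sketch with illustrative count numbers (not taken from the study):

    ```python
    import math

    def poisson_relative_uncertainty(counts):
        """Relative standard uncertainty (1 sigma) of a Poisson-distributed count."""
        return 1.0 / math.sqrt(counts)

    # At calibration-level radon concentrations many decays are registered,
    # so the Poisson term is small; at low indoor levels it can dominate.
    high = poisson_relative_uncertainty(10_000)  # ~1% for 10,000 counts
    low = poisson_relative_uncertainty(25)       # 20% for only 25 counts
    print(f"high-count: {high:.1%}, low-count: {low:.1%}")
    ```

    The 25-count case reproduces the "up to 20%" Poisson error quoted for low radon levels; the actual counts behind that figure are an assumption here.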

  16. Milk-based traditional Turkish desserts

    Directory of Open Access Journals (Sweden)

    Tulay Ozcan

    2009-12-01

    Traditional foods are a reflection of cultural inheritance and affect lifestyle habits. Culture can be viewed as a system of socially transmitted patterns of behaviour that characterises a particular group. Despite globalisation, these are key elements for accurately estimating a population's dietary patterns and how they have been shaped over time. In Turkey, a meal with family or friends traditionally ends with a dessert, a testimony to the hosts' hospitality or to the housewife's love and affection for her husband and children, since sweets and desserts are important elements of Turkish cuisine. However, growing consciousness of nutrition and healthy eating, due to rapid changes in lifestyle and dietary patterns, has contributed to increased interest in traditional foods with potential health benefits, along with increased uncertainty about dessert consumption. Dairy desserts are extensively consumed owing to their nutritive and sensory characteristics. Some traditional dairy desserts are Mustafakemalpasa, Gullac, Kazandibi, Hosmerim and Tavukgogsu, which are mainly made from milk or fresh cheese; the current paper discusses their manufacturing processes and composition.

  17. Application of a new importance measure for parametric uncertainty in PSA

    International Nuclear Information System (INIS)

    Poern, K.

    1997-04-01

    The traditional approach to uncertainty analysis in PSA, with propagation of basic event uncertainties through the PSA model, generates as an end product the uncertainty distribution of the top event frequency. This distribution, however, is not of much value for the decision maker. Most decisions are made under uncertainty. What the decision maker needs, to enhance decision-making quality, is an adequate uncertainty importance measure that indicates for which basic parameters it would be most valuable - as to the quality of the decision making in the specific situation - to procure more information. This paper describes an application of a new measure of uncertainty importance that has been developed in the ongoing joint Nordic project NKS/RAK-1:3. The measure is called ''decision oriented'' because it is defined within a decision-theoretic framework: it is the expected value of certain additional information about each basic parameter, and it utilizes both the system structure and the complete uncertainty distributions of the basic parameters. The measure provides the analyst and the decision maker with diagnostic information pointing to the parameters on which more information would be most valuable to procure in order to enhance decision-making quality. This uncertainty importance measure must not be confused with the more well-known, traditional importance measures of various kinds that are used to depict the contributions of each basic event or parameter (represented by point values) to the top event frequency. In this study the new measure is demonstrated through a real application on the top event: water overflow through steam generator safety valves caused by steam generator tube rupture. This application object is one of the event sequences that the aforementioned Nordic project has analysed with an integrated approach. The project has been funded by the Swedish Nuclear Power
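    A measure defined as "the expected value of additional information" belongs to the same family as the expected value of perfect information (EVPI). The record does not give its exact formula, so the following is only a generic Monte Carlo sketch of EVPI with a toy two-action utility function, not the NKS/RAK-1:3 measure itself:

    ```python
    import random

    random.seed(1)

    def evpi(samples, actions, utility):
        """Expected value of perfect information about an uncertain parameter x.

        EVPI = E_x[max_a U(a, x)] - max_a E_x[U(a, x)]:
        the expected gain from learning x before acting, versus committing
        to the single best action under the current uncertainty.
        """
        with_info = sum(max(utility(a, x) for a in actions) for x in samples) / len(samples)
        without_info = max(sum(utility(a, x) for x in samples) / len(samples)
                           for a in actions)
        return with_info - without_info

    # Toy example: mitigation costs 1 unit; doing nothing costs up to 10 units
    # if the uncertain failure frequency x turns out to be high.
    xs = [random.lognormvariate(-3, 1) for _ in range(20_000)]
    U = lambda a, x: -1.0 if a == "mitigate" else -10.0 * min(10 * x, 1.0)
    print("EVPI:", evpi(xs, ["mitigate", "nothing"], U))
    ```

    Computed per basic parameter, such a quantity plays the diagnostic role the abstract describes: parameters with the largest value-of-information are the ones worth measuring better.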

  18. Signatures of selection in tilapia revealed by whole genome resequencing.

    Science.gov (United States)

    Xia, Jun Hong; Bai, Zhiyi; Meng, Zining; Zhang, Yong; Wang, Le; Liu, Feng; Jing, Wu; Wan, Zi Yi; Li, Jiale; Lin, Haoran; Yue, Gen Hua

    2015-09-16

    Natural selection and selective breeding for genetic improvement have left detectable signatures within the genome of a species. Identification of selection signatures is important in evolutionary biology and for detecting genes that can accelerate genetic improvement. However, selection signatures, from both artificial and natural selection, have been identified at the whole-genome level in only a few genetically improved fish species. Tilapia is one of the most important genetically improved fish species in the world. Using next-generation sequencing, we sequenced the genomes of 47 tilapia individuals. We identified a total of 1.43 million high-quality SNPs and found that LD block sizes ranged from 10 to 100 kb in tilapia. We detected over a hundred putative selective sweep regions in each line of tilapia. Most selection signatures were located in non-coding regions of the tilapia genome. The Wnt signaling, gonadotropin-releasing hormone receptor and integrin signaling pathways were under positive selection in all improved tilapia lines. Our study provides a genome-wide map of genetic variation and selection footprints in tilapia, which could be important for genetic studies and for accelerating the genetic improvement of tilapia.

  19. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric Todd; Eldred, Michael S; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  20. Medication non-adherence and uncertainty

    DEFF Research Database (Denmark)

    Kriegbaum, Margit; Lau, Sofie Rosenlund

    2017-01-01

    BACKGROUND: Statins are widely prescribed to lower cardiovascular morbidity and mortality. However, statin non-adherence is very high. PURPOSE: The aim of this paper was to investigate reasons for stopping statin treatment in the general population and to study how aspects of information seeking and processing are associated with statin non-adherence. METHODS: This study used a population survey of 3050 Danish residents aged 45-65 years. Reasons for statin discontinuation were studied among previous statin users, as was the association between information seeking and processing and statin non-adherence. ... from information disseminated by media outlets. Side effects and fear of side effects should be addressed in clinical practice. Health care professionals should pay attention to emotional aspects of how information is disseminated and perceived by statin users.

  1. Robustness for slope stability modelling under deep uncertainty

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.
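    One common robustness rule used in such analyses is maximin: prefer the option whose worst-case performance across scenarios is best, rather than the option that is optimal under a single assumed future. A toy sketch with hypothetical factor-of-safety numbers (not CHASM output):

    ```python
    def robust_choice(options, scenarios, performance):
        """Maximin rule: pick the option whose worst-case performance
        across all scenarios is the best."""
        return max(options, key=lambda o: min(performance(o, s) for s in scenarios))

    # Hypothetical slope-management example: factor of safety of each
    # drainage option under dry vs. wet future-climate scenarios.
    fos = {("do nothing", "dry"): 1.3, ("do nothing", "wet"): 0.9,
           ("install drains", "dry"): 1.5, ("install drains", "wet"): 1.1}
    best = robust_choice(["do nothing", "install drains"], ["dry", "wet"],
                         lambda o, s: fos[(o, s)])
    print(best)  # "install drains": its worst case (1.1) beats 0.9
    ```

    Under deep uncertainty the scenario probabilities are unknown, which is exactly why the worst case, rather than an expected value, drives the choice.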

  2. 76 FR 30542 - Adult Signature Services

    Science.gov (United States)

    2011-05-26

    ... POSTAL SERVICE 39 CFR Part 111 Adult Signature Services AGENCY: Postal Service™. ACTION: Final..., Domestic Mail Manual (DMM®) 503.8, to add a new extra service called Adult Signature. This new service has two available options: Adult Signature Required and Adult Signature Restricted Delivery. DATES...

  3. Traditional low-alcoholic and non-alcoholic fermented beverages consumed in European countries: a neglected food group.

    Science.gov (United States)

    Baschali, Aristea; Tsakalidou, Effie; Kyriacou, Adamantini; Karavasiloglou, Nena; Matalas, Antonia-Leda

    2017-06-01

    Fermented beverages have a long tradition of contributing to the nutrition of many societies and cultures worldwide. Traditional fermentation was empirically developed in ancient times as a process of raw food preservation and, at the same time, of producing new foods with different sensorial characteristics, such as texture, flavour and aroma, as well as nutritional value. Low-alcoholic fermented beverages (LAFB) and non-alcoholic fermented beverages (NAFB) represent a subgroup of fermented beverages that have received rather little attention from consumers and scientists alike, especially with regard to their types and traditional uses in European societies. A literature review was undertaken and research articles, review papers and textbooks were searched in order to retrieve data regarding the dietary role, nutrient composition, health benefits and other relevant aspects of diverse ethnic LAFB and NAFB consumed by European populations. A variety of traditional LAFB and NAFB consumed in European regions, such as kefir, kvass, kombucha and hardaliye, are presented. Milk-based LAFB and NAFB are also available on the market, often characterised as 'functional' foods on the basis of their probiotic culture content. Future research should focus on elucidating the dietary role and nutritional value of traditional and 'functional' LAFB and NAFB, their potential health benefits and consumption trends in European countries. Such data will allow for LAFB and NAFB to be included in national food composition tables.

  4. Different importance of the volatile and non-volatile fractions of an olfactory signature for individual social recognition in rats versus mice and short-term versus long-term memory.

    Science.gov (United States)

    Noack, Julia; Richter, Karin; Laube, Gregor; Haghgoo, Hojjat Allah; Veh, Rüdiger W; Engelmann, Mario

    2010-11-01

    When tested in the olfactory cued social recognition/discrimination test, rats and mice differ in their retention of recognition memory for a previously encountered conspecific juvenile: rats are able to recognize a given juvenile for approximately 45 min only, whereas mice show not only short-term but also long-term recognition memory (≥ 24 h). Here we modified the social recognition/social discrimination procedure to investigate the neurobiological mechanism(s) underlying these species differences. We presented a conspecific juvenile repeatedly to the experimental subjects and monitored investigation duration as a measure of recognition. Presentation of only the volatile fraction of the juvenile's olfactory signature was sufficient for both short- and long-term recognition in mice, but not in rats. Applying additional volatile, mono-molecular odours to the "to be recognized" juveniles failed to affect short-term memory in either species, but interfered with long-term recognition in mice. Finally, immunocytochemical analysis of c-Fos, used as a marker for cellular activation, revealed that juvenile exposure stimulated areas involved in the processing of olfactory signals in both the main and the accessory olfactory bulb in mice. In rats, we measured increased c-Fos synthesis almost exclusively in cells of the accessory olfactory bulb. Our data suggest that the species difference in the retention of social recognition memory is based on differences in the processing of the volatile versus non-volatile fractions of the individuals' olfactory signature. The non-volatile fraction is sufficient for retaining a short-term social memory only. Long-term social memory, as observed in mice, requires processing of both the volatile and non-volatile fractions of the olfactory signature. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. A host transcriptional signature for presymptomatic detection of infection in humans exposed to influenza H1N1 or H3N2.

    Directory of Open Access Journals (Sweden)

    Christopher W Woods

    There is great potential for host-based gene expression analysis to impact the early diagnosis of infectious diseases. In particular, the influenza pandemic of 2009 highlighted the challenges and limitations of traditional pathogen-based testing for suspected upper respiratory viral infection. We inoculated human volunteers with either influenza A/Brisbane/59/2007 (H1N1) or A/Wisconsin/67/2005 (H3N2), and assayed the peripheral blood transcriptome every 8 hours for 7 days. Of 41 inoculated volunteers, 18 (44%) developed symptomatic infection. Using unbiased sparse latent factor regression analysis, we generated a gene signature (or factor) for symptomatic influenza capable of detecting 94% of infected cases. This gene signature is detectable as early as 29 hours post-exposure and achieves maximal accuracy on average 43 hours (p = 0.003, H1N1) and 38 hours (p = 0.005, H3N2) before peak clinical symptoms. In order to test the relevance of these findings in naturally acquired disease, a composite influenza A signature built from these challenge studies was applied to Emergency Department patients, where it discriminates between swine-origin influenza A/H1N1 (2009) infected and non-infected individuals with 92% accuracy. The host genomic response to influenza infection is robust and may provide the means for detection before typical clinical symptoms are apparent.

  6. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    International Nuclear Information System (INIS)

    Perko, Z.; Gilli, L.; Lathouwers, D.; Kloosterman, J. L.

    2013-01-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative, providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a gas cooled fast reactor transient. A comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous from both an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss-of-flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large-scale problems. (authors)
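    The core idea behind polynomial chaos can be shown on a one-dimensional toy problem (this is a generic illustration, not the CATHARE/GFR2400 workflow): project a model of a standard normal input onto probabilists' Hermite polynomials, then read the output mean and variance directly off the coefficients and check them against Monte Carlo.

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He  # probabilists' Hermite basis

    def pce_coefficients(f, order, quad_points=40):
        """Project f(x), x ~ N(0,1), onto He_k: c_k = E[f(x) He_k(x)] / k!."""
        x, w = He.hermegauss(quad_points)
        w = w / math.sqrt(2.0 * math.pi)  # normalise weights to the N(0,1) density
        return np.array([np.sum(w * f(x) * He.hermeval(x, [0.0] * k + [1.0]))
                         / math.factorial(k) for k in range(order + 1)])

    c = pce_coefficients(np.exp, order=8)   # toy model: y = exp(x)
    mean_pce = c[0]                         # E[y] is the zeroth coefficient
    var_pce = sum(math.factorial(k) * c[k] ** 2 for k in range(1, len(c)))

    rng = np.random.default_rng(0)          # crude Monte Carlo for comparison
    samples = np.exp(rng.standard_normal(200_000))
    print(mean_pce, samples.mean())         # both approach exp(0.5) ~ 1.6487
    ```

    An order-8 expansion already matches the exact mean and variance to several digits, while the Monte Carlo estimate still carries sampling noise at 200,000 samples, which is the convergence advantage the abstract reports.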

  7. ALPs effective field theory and collider signatures

    Energy Technology Data Exchange (ETDEWEB)

    Brivio, I. [Universidad Autonoma de Madrid, Departamento de Fisica Teorica y Instituto de Fisica Teorica, IFT-UAM/CSIC, Madrid (Spain); University of Copenhagen, Niels Bohr International Academy, Copenhagen (Denmark); Gavela, M.B.; Merlo, L.; Rey, R. del [Universidad Autonoma de Madrid, Departamento de Fisica Teorica y Instituto de Fisica Teorica, IFT-UAM/CSIC, Madrid (Spain); Mimasu, K. [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Universite Catholique de Louvain, Centre for Cosmology, Particle Physics and Phenomenology (CP3), Louvain-la-Neuve (Belgium); No, J.M. [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); King' s College London, Department of Physics, London (United Kingdom); Sanz, V. [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom)

    2017-08-15

    We study the leading effective interactions between the Standard Model fields and a generic singlet CP-odd (pseudo-) Goldstone boson. Two possible frameworks for electroweak symmetry breaking are considered: linear and non-linear. For the latter case, the basis of leading effective operators is determined and compared with that for the linear expansion. Associated phenomenological signals at colliders are explored for both scenarios, deriving new bounds and analyzing future prospects, including LHC and High Luminosity LHC sensitivities. Mono-Z, mono-W, W-photon plus missing energy and on-shell top final states are the most promising signals expected in both frameworks. In addition, non-standard Higgs decays and mono-Higgs signatures are especially prominent and expected to be dominant in non-linear realisations. (orig.)

  8. Syrian Refugees: Are They a Non Traditional Threat to Water Supplies in Lebanon and Jordan

    Science.gov (United States)

    2016-09-01

    ...effects of Syrian refugees on the water supplies of each country as a non-traditional security threat. Political stability is the ultimate goal of each... against Syrians sets the stage for political instability because the Syrians represent an increasing portion of the population... of political instability could send shockwaves through the region and drastically alter U.S. foreign policy in the Middle East. Though the stakes

  9. Brazil’s fight against narcotraffic in the border with Colombia. An approach to the restrains of non-traditional threats over foreign policy

    Directory of Open Access Journals (Sweden)

    Emilse Calderón

    2014-05-01

    In the post-Cold War international scenario, the non-traditional nature of security threats conditions states' foreign policies. An example is the policy employed by Brazil on its border with Colombia in response to the growth of narcotrafficking since the end of the 20th century. This article therefore offers a brief analysis of the influence exercised by the non-traditional nature of the drug-trafficking threat on the design of Brazilian foreign policy between 1999 and 2010.

  10. New Inequalities and Uncertainty Relations on Linear Canonical Transform Revisit

    Directory of Open Access Journals (Sweden)

    Xu Guanlei

    2009-01-01

    The uncertainty principle plays an important role in mathematics, physics, signal processing, and so on. Firstly, based on the definition of the linear canonical transform (LCT) and the traditional Pitt's inequality, a novel Pitt's inequality in the LCT domains is obtained, which is connected with the LCT parameters a and b. A novel logarithmic uncertainty principle is then derived from this Pitt's inequality in the LCT domains, associated with the parameters of the two LCTs. Secondly, from the relation between the original function and its LCT, an entropic uncertainty principle and a Heisenberg uncertainty principle in the LCT domains are derived, both associated with the LCT parameters a and b. The reason why the three lower bounds are associated only with the LCT parameters a and b, and are independent of c and d, is presented. The results show that it is possible for the bounds to tend to zero.

  11. Identification of a Genomic Signature Predicting for Recurrence in Early Stage Ovarian Cancer

    Science.gov (United States)

    2015-12-01

    Award Number: W81XWH-12-1-0521. TITLE: Identification of a Genomic Signature Predicting for Recurrence in Early-Stage Ovarian Cancer. ...do it. Thus, instead of simply sequencing all the FFPE samples, we used 10 tumor samples (5 recurrent and 5 non-recurrent) to test sequencing and...

  12. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  13. 21 CFR 11.50 - Signature manifestations.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Signature manifestations. 11.50 Section 11.50 Food... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.50 Signature manifestations. (a) Signed electronic...: (1) The printed name of the signer; (2) The date and time when the signature was executed; and (3...

  14. The influence of out-of-institution environments on the university schooling project of non-traditional students in Uganda

    NARCIS (Netherlands)

    Tumuheki, Peace Buhwamatsiko; Zeelen, Jacobus; Openjuru, George L.

    2018-01-01

    Participation and integration of non-traditional students (NTS) in university education is influenced by factors within the institution and those external to the institution, including participants’ self-perceptions and dispositions. The objective of this qualitative study is to draw from the

  15. Prediction of disease-free survival by the PET/CT radiomic signature in non-small cell lung cancer patients undergoing surgery

    Energy Technology Data Exchange (ETDEWEB)

    Kirienko, Margarita; Fogliata, Antonella; Sollini, Martina [Humanitas University, Department of Biomedical Sciences, Pieve Emanuele, Milan (Italy); Cozzi, Luca [Humanitas Clinical and Research Center, Radiotherapy and Radiosurgery, Rozzano, Milan (Italy); Antunovic, Lidija [Humanitas Clinical and Research Center, Nuclear Medicine, Rozzano, Milan (Italy); Lozza, Lisa [Orobix Srl, Bergamo (Italy); Voulaz, Emanuele [Humanitas Clinical and Research Center, Thoracic Surgery, Rozzano, Milan (Italy); Rossi, Alexia [Humanitas University, Department of Biomedical Sciences, Pieve Emanuele, Milan (Italy); Humanitas Clinical and Research Center, Radiology, Rozzano, Milan (Italy); Chiti, Arturo [Humanitas University, Department of Biomedical Sciences, Pieve Emanuele, Milan (Italy); Humanitas Clinical and Research Center, Nuclear Medicine, Rozzano, Milan (Italy)

    2018-02-15

    Radiomic features derived from texture analysis of different imaging modalities show promise in lesion characterisation, response prediction, and prognostication in lung cancer patients. The present study aimed to identify an image-based radiomic signature capable of predicting disease-free survival (DFS) in non-small cell lung cancer (NSCLC) patients undergoing surgery. A cohort of 295 patients was selected. Clinical parameters (age, sex, histological type, tumour grade, and stage) were recorded for all patients. The endpoint of this study was DFS. Both computed tomography (CT) and fluorodeoxyglucose positron emission tomography (PET) images generated by the PET/CT scanner were analysed. Textural features were calculated using the LifeX package. Statistical analysis was performed using the R platform. The dataset was separated into two cohorts by random selection to perform training and validation of the statistical models. Predictors were fed into a multivariate Cox proportional hazards regression model, and the receiver operating characteristic (ROC) curve as well as the corresponding area under the curve (AUC) were computed for each model built. The Cox models that included radiomic features for the CT, the PET, and the PET+CT images resulted in an AUC of 0.75 (95%CI: 0.65-0.85), 0.68 (95%CI: 0.57-0.80), and 0.68 (95%CI: 0.58-0.74), respectively. The addition of clinical predictors to the Cox models resulted in an AUC of 0.61 (95%CI: 0.51-0.69), 0.64 (95%CI: 0.53-0.75), and 0.65 (95%CI: 0.50-0.72) for the CT, the PET, and the PET+CT images, respectively. A radiomic signature, for either CT, PET, or PET/CT images, has been identified and validated for the prediction of disease-free survival in patients with non-small cell lung cancer treated by surgery. (orig.)
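    The AUC values quoted above summarise ranking performance. As a generic reminder of what is being computed (not the study's R pipeline), the AUC equals the Mann-Whitney probability that a randomly chosen positive case is scored above a randomly chosen negative one:

    ```python
    def roc_auc(labels, scores):
        """AUC via the Mann-Whitney U statistic: the probability that a random
        positive scores higher than a random negative (ties count half)."""
        pos = [s for lab, s in zip(labels, scores) if lab == 1]
        neg = [s for lab, s in zip(labels, scores) if lab == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical risk scores for four patients (1 = event, e.g. recurrence):
    print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
    ```

    An AUC of 0.75, as for the CT radiomic model, therefore means a 75% chance that the model ranks a patient with an event above one without.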

  16. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, many sources of error in measurement processes can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined, and what they mean, is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties.
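    The "estimate of some true value plus uncertainty" view is usually made concrete by combining independent error components in quadrature (the GUM-style root-sum-of-squares). A minimal sketch with made-up numbers, not taken from the paper:

    ```python
    import math

    def combined_relative_uncertainty(*relative_components):
        """Combine independent relative standard uncertainties in quadrature:
        u_c = sqrt(u_1^2 + u_2^2 + ...)."""
        return math.sqrt(sum(u ** 2 for u in relative_components))

    # Density rho = m / V: for a product or quotient, the relative
    # uncertainties of the inputs combine in quadrature.
    u_mass, u_volume = 0.002, 0.015  # hypothetical 0.2% and 1.5%
    u_rho = combined_relative_uncertainty(u_mass, u_volume)
    print(f"u(rho)/rho = {u_rho:.4f}")  # dominated by the volume term
    ```

    The example also shows why identifying the dominant component matters: improving the 0.2% mass term would barely change the combined uncertainty.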

  17. Improving LiDAR Biomass Model Uncertainty through Non-Destructive Allometry and Plot-level 3D Reconstruction with Terrestrial Laser Scanning

    Science.gov (United States)

    Stovall, A. E.; Shugart, H. H., Jr.

    2017-12-01

    Future NASA and ESA satellite missions plan to better quantify global carbon through detailed observations of forest structure, but ultimately rely on uncertain ground measurement approaches for calibration and validation. A significant amount of the uncertainty in estimating plot-level biomass can be attributed to inadequate and unrepresentative allometric relationships used to convert plot-level tree measurements to estimates of aboveground biomass. These allometric equations are known to have high errors and biases, particularly in carbon-rich forests, because they were calibrated with small and often biased samples of destructively harvested trees. To overcome this issue, a non-destructive methodology for estimating tree and plot-level biomass has been proposed through the use of Terrestrial Laser Scanning (TLS). We investigated the potential for using TLS as a ground validation approach in LiDAR-based biomass mapping through virtual plot-level tree volume reconstruction and biomass estimation. Plot-level biomass estimates were compared at the Virginia-based Smithsonian Conservation Biology Institute's SIGEO forest using full 3D reconstruction, TLS allometry, and Jenkins et al. (2003) allometry. On average, full 3D reconstruction ultimately provided the lowest-uncertainty estimate of plot-level biomass (9.6%), followed by TLS allometry (16.9%) and the national equations (20.2%). TLS offered modest improvements to the airborne LiDAR empirical models, reducing RMSE from 16.2% to 14%. Our findings suggest TLS plot acquisitions and non-destructive allometry can play a vital role in reducing uncertainty in calibration and validation data for biomass mapping in the upcoming NASA and ESA missions.

  18. Uncertainty and the de Finetti tables

    OpenAIRE

    Baratgin , Jean; Over , David; Politzer , Guy

    2013-01-01

    International audience; The new paradigm in the psychology of reasoning adopts a Bayesian, or probabilistic, model for studying human reasoning. Contrary to the traditional binary approach based on truth-functional logic, with its binary values of truth and falsity, a third value that represents uncertainty can be introduced in the new paradigm. A variety of three-valued truth table systems are available in the formal literature, including one proposed by de Finetti. We examine the descripti...
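
    The de Finetti table for the conditional "if A then B" can be written down directly: the conditional takes the value of B when the antecedent A is true, and the third ("uncertain", or void) value when A is false. A minimal sketch:

```python
# de Finetti three-valued table for the conditional "if A then B".
T, F, U = "true", "false", "uncertain"

def definetti_conditional(a, b):
    if a:
        return T if b else F
    return U  # antecedent false: the conditional is void/uncertain

table = {(a, b): definetti_conditional(a, b)
         for a in (True, False) for b in (True, False)}
for (a, b), v in table.items():
    print(f"A={a!s:5} B={b!s:5} -> {v}")
```

    The two rows with a false antecedent are exactly where this table departs from the classical material conditional, which would assign "true" to both.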

  19. 21 CFR 11.70 - Signature/record linking.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Signature/record linking. 11.70 Section 11.70 Food... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed to electronic records shall be linked to their respective...

  20. Non-Traditional Security Threats in the Border Areas: Terrorism, Piracy, Environmental Degradation in Southeast Asian Maritime Domain

    Science.gov (United States)

    Dabova, E. L.

    2013-11-01

    In addition to facilitating peaceful trade and economic development, sovereign territory, territorial waters and international waters are being used by various criminal groups that pose threats to governments, businesses and civilian populations in Southeast Asia. Non-state criminal maritime activities were not receiving appropriate attention, as they were overshadowed by traditional military security challenges. Yet more and more frequently, non-traditional actors challenge lines of communication, jeopardize access to strategic resources, complicate traditional defence tasks, and harm the environment. Understanding the nature of non-traditional threats, and the ways to combat them, requires international legal, historical and political science analysis within a united problem-oriented approach. A fair critique of pure interest-, power- and knowledge-based theories of regime formation was developed by E. K. Leonard, who explained the evolution of the international system from the global governance perspective. The present study is based on the premise that pure nation-state approaches are incapable of providing a theoretical ground for addressing the growing influence of international criminal networks in Southeast Asia. From an international relations theory perspective, the author of this study agrees with D. Snidal that the hegemonic stability theory has "limits" and is insufficient in describing modern challenges to a sustainable international security regime, including non-traditional threats, where collective action is more efficient from an interest and capability standpoint. At the same time, the author does not share the viewpoint on the "marginalization" of international law in the current international order due to its fragmentation, regionalization and "global power shifts".
The United Nations, as a global institution at the top of the vertical hierarchy of international legal order, and the EU as an example of a "self-contained" regime along

  1. Optimal Wind Power Uncertainty Intervals for Electricity Market Operation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Ying; Zhou, Zhi; Botterud, Audun; Zhang, Kaifeng

    2018-01-01

    It is important to select an appropriate uncertainty level of the wind power forecast for power system scheduling and electricity market operation. Traditional methods hedge against a predefined level of wind power uncertainty, such as a specific confidence interval or uncertainty set, which leaves open the question of how best to select the appropriate uncertainty level. To bridge this gap, this paper proposes a model to optimize the forecast uncertainty intervals of wind power for power system scheduling problems, with the aim of achieving the best trade-off between economics and reliability. We then reformulate and linearize the model into a mixed integer linear programming (MILP) problem without strong assumptions on the shape of the probability distribution. In order to investigate the impacts on cost, reliability, and prices in an electricity market, we apply the proposed model to a two-settlement electricity market based on a six-bus test system and to a power system representing the U.S. state of Illinois. The results show that the proposed method can not only help to balance the economics and reliability of power system scheduling, but also help to stabilize energy prices in electricity market operation.
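
    A toy illustration of the trade-off the paper optimizes (the error distribution, prices, and search grid are all hypothetical, and the MILP is replaced by a brute-force grid search): instead of fixing a confidence level in advance, search over candidate interval widths for the one that minimizes reserve cost plus an expected-shortfall penalty:

```python
import random
import statistics

random.seed(1)
# Hypothetical wind forecast errors (MW), standing in for historical data.
errors = [random.gauss(0, 50) for _ in range(10000)]

def expected_cost(width_mw, reserve_price=10.0, shortfall_price=200.0):
    """Reserve cost grows with the hedged interval half-width;
    forecast errors falling outside the interval are penalized."""
    reserve_cost = reserve_price * width_mw
    shortfall = statistics.mean(max(abs(e) - width_mw, 0.0) for e in errors)
    return reserve_cost + shortfall_price * shortfall

# Grid search over candidate half-widths instead of a fixed confidence level.
best = min(range(0, 201, 5), key=expected_cost)
print("best interval half-width (MW):", best)
```

    The optimum moves with the price ratio: cheap reserves or expensive shortfalls push the interval wider, which is the economics-versus-reliability balance the abstract describes.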

  2. Uncertainty and learning in a strategic environment. Global climate change

    International Nuclear Information System (INIS)

    Baker, Erin

    2005-01-01

    Global climate change is rife with uncertainties. Yet, we can expect to resolve much of this uncertainty in the next 100 years or so. Therefore, current actions should reflect the value of flexibility. Nevertheless, most models of climate change, particularly game-theoretic models, abstract from uncertainty. A model of the impacts of uncertainty and learning in a non-cooperative game shows that the level of correlation of damages across countries is crucial for determining optimal policy

  3. Expressiveness considerations of XML signatures

    DEFF Research Database (Denmark)

    Jensen, Meiko; Meyer, Christopher

    2011-01-01

    XML Signatures are used to protect XML-based Web Service communication against a broad range of attacks related to man-in-the-middle scenarios. However, due to the complexity of the Web Services specification landscape, the task of applying XML Signatures in a robust and reliable manner becomes more and more challenging. In this paper, we investigate this issue, describing how an attacker can still interfere with Web Services communication even in the presence of XML Signatures. Additionally, we discuss the interrelation of XML Signatures and XML Encryption, focussing on their security...

  4. Policy environments matters: Access to higher education of non-traditional students in Denmark. Paper presented at the 56th CIES conference, San Juan, Puerto Rico, 22-27 April

    DEFF Research Database (Denmark)

    Milana, Marcella

    2012-01-01

    Despite the massification of higher education, which has brought about an increase in the enrollment rates of non-traditional students, and the internationalization of higher education, which has led towards cross-national homogenization in the typology of educational programs run by universities, access of non-traditional students is still a much debated issue. The scope of this paper is to critically examine the policy environment, and related practice, which supports (or hampers) access to higher education for non-traditional students, with special attention to adult and mature students ... from a common ideal that results from cross-national cooperation implemented through the Bologna process. The data sources include relevant scientific literature and policy documents, as well as interviews with policy makers, representatives of higher education institutions and non-traditional students...

  5. Implementation of Scientific Community Laboratories and Their Effect on Student Conceptual Learning, Attitudes, and Understanding of Uncertainty

    Science.gov (United States)

    Lark, Adam

    Scientific Community Laboratories, developed by the University of Maryland, have shown initial promise as laboratories meant to emulate the practice of doing physics. These laboratories have been re-created by incorporating their design elements with the University of Toledo course structure and resources, and have been titled the Scientific Learning Community (SLC) Laboratories. A comparative study between these SLC laboratories and the University of Toledo physics department's traditional laboratories was conducted during the fall 2012 semester on first-semester calculus-based physics students. Three instruments were administered as pre- and post-tests to capture the change in students' concept knowledge, attitudes, and understanding of uncertainty. The Force Concept Inventory (FCI) was used to evaluate students' conceptual changes through the semester, and average normalized gains were compared between the traditional and SLC laboratories. The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) was conducted to elucidate students' change in attitudes through the course of each laboratory. Finally, interviews regarding data analysis and uncertainty were transcribed and coded to track changes in the way students understand uncertainty and data analysis in experimental physics after their participation in each laboratory type. Students in the SLC laboratories showed a notable increase in conceptual knowledge and more positive attitudes when compared to traditional laboratories. SLC students' understanding of uncertainty showed the most improvement, diverging completely from students in the traditional laboratories, who declined throughout the semester.
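
    The average normalized gain compared between laboratory types is typically Hake's <g>, computed from class-average pre- and post-test percentages; a minimal sketch (the scores are hypothetical):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain <g> = (post - pre) / (100 - pre),
    i.e. the fraction of the possible improvement actually achieved."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages for two sections.
print(normalized_gain(40.0, 60.0))  # (60 - 40) / (100 - 40) = 1/3
```

    Because <g> normalizes by the room left for improvement, it allows fair comparison between sections that start at different pre-test levels.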

  6. Digital Signature Schemes with Complementary Functionality and Applications

    OpenAIRE

    S. N. Kyazhin

    2012-01-01

    Digital signature schemes with additional functionality (an undeniable signature, a designated-confirmer signature, a blind signature, a group signature, a signature with additional protection) and examples of their application are considered. These schemes are more practical, effective and useful than ordinary digital signature schemes.

  7. 17 CFR 12.12 - Signature.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Signature. 12.12 Section 12.12... General Information and Preliminary Consideration of Pleadings § 12.12 Signature. (a) By whom. All... document on behalf of another person. (b) Effect. The signature on any document of any person acting either...

  8. High-speed high-security signatures

    NARCIS (Netherlands)

    Bernstein, D.J.; Duif, N.; Lange, T.; Schwabe, P.; Yang, B.Y.

    2011-01-01

    This paper shows that a $390 mass-market quad-core 2.4 GHz Intel Westmere (Xeon E5620) CPU can create 108000 signatures per second and verify 71000 signatures per second on an elliptic curve at a 2^128 security level. Public keys are 32 bytes, and signatures are 64 bytes. These performance figures
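
    For readers unfamiliar with the sign/verify flow being benchmarked, here is a minimal sketch. It uses a toy textbook-RSA construction with tiny primes purely to illustrate signing and verification; it is not the Ed25519 elliptic-curve scheme of the paper, and keys of this size are utterly insecure:

```python
import hashlib

# Toy RSA-style sign/verify. Tiny primes for illustration only.
p, q, e = 61, 53, 17
n = p * q                          # public modulus (3233)
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent: inverse of e mod phi(n)

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)            # signature = h^d mod n

def verify(message: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h     # check h == sig^e mod n

sig = sign(b"hello")
print(verify(b"hello", sig), verify(b"tampered", sig))
```

    Real schemes such as Ed25519 replace modular exponentiation with elliptic-curve operations and add padding/encoding rules, which is where the speed and compact 32/64-byte sizes quoted above come from.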

  9. (Small) Resonant non-Gaussianities: Signatures of a Discrete Shift Symmetry in the Effective Field Theory of Inflation

    Energy Technology Data Exchange (ETDEWEB)

    Behbahani, Siavosh R.; /SLAC /Stanford U., Phys. Dept. /Boston U.; Dymarsky, Anatoly; /Princeton, Inst. Advanced Study; Mirbabayi, Mehrdad; /New York U., CCPP /New York U.; Senatore, Leonardo; /Stanford U., Phys. Dept. /KIPAC, Menlo Park

    2012-06-06

    We apply the Effective Field Theory of Inflation to study the case where the continuous shift symmetry of the Goldstone boson π is softly broken to a discrete subgroup. This case includes and generalizes recently proposed String Theory inspired models of Inflation based on Axion Monodromy. The models we study have the property that the 2-point function oscillates as a function of the wavenumber, leading to oscillations in the CMB power spectrum. The non-linear realization of time diffeomorphisms induces some self-interactions for the Goldstone boson that lead to a peculiar non-Gaussianity whose shape oscillates as a function of the wavenumber. We find that in the regime of validity of the effective theory, the oscillatory signal contained in the n-point correlation functions, with n > 2, is smaller than the one contained in the 2-point function, implying that the signature of oscillations, if ever detected, will be easier to find first in the 2-point function, and only then in the higher order correlation functions. Still the signal contained in higher-order correlation functions, that we study here in generality, could be detected at a subleading level, providing a very compelling consistency check for an approximate discrete shift symmetry being realized during inflation.

  10. Identification of uranium signatures in swipe samples on verification of nuclear activities for nuclear safeguards purposes; Identificacao de assinaturas de uranio em amostras de esfregacos (swipe samples) para verificacao de atividades nucleares para fins de salvaguardas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Pestana, Rafael Cardoso Baptistini

    2013-07-01

    The use of environmental sampling for safeguards purposes has been applied by the International Atomic Energy Agency (IAEA) since 1996 and is routinely used as a complementary measure to strengthen the traditional nuclear safeguards procedures. The aim is to verify that states signatory to the safeguards agreements are not diverting their peaceful nuclear activities to undeclared nuclear activities. This work describes a new protocol for the collection and analysis of swipe samples to identify nuclear signatures that may be related to the nuclear activities developed in the inspected facility. A real uranium conversion plant of the IPEN nuclear fuel cycle was used as a case study. The proposed strategy uses different analytical techniques, such as alpha radiation measurement, SEM-EDX and ICP-MS, to identify signatures of uranium adhered to the swipe samples. In the swipe sample analysis, it was possible to identify particles of UO₂F₂ and UF₄ through morphological comparison and the semi-quantitative analyses performed by the SEM-EDX technique. Methods were used that yield the average isotopic composition of the sample, in which the enrichment ranged from 1.453 % ± 0.023 % to 18.24 % ± 0.15 % in the ²³⁵U isotope. Through these external, non-intrusive collections, it was possible to identify the handling of enriched material with enrichments of 1.453 % ± 0.023 % to 6.331 % ± 0.055 % in the ²³⁵U isotope, as well as the use of reprocessed material, through the identification of the ²³⁶U isotope. The uncertainties obtained for the n(²³⁵U)/n(²³⁸U) ratio varied from 0.40 % to 0.86 % for the internal swipe samples. (author)
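
    Ratio uncertainties like the n(²³⁵U)/n(²³⁸U) figures quoted above follow from standard propagation for a quotient of independent measurements, u_r/r = sqrt((u_a/a)² + (u_b/b)²). A sketch with hypothetical signal values (not the paper's data):

```python
import math

# Relative-uncertainty propagation for an isotope ratio r = a / b,
# assuming the two inputs are independent. Values are illustrative.
a, u_a = 1.453, 0.020   # e.g. minor-isotope signal and its standard uncertainty
b, u_b = 98.0, 0.50     # e.g. major-isotope signal and its standard uncertainty

r = a / b
u_r = r * math.sqrt((u_a / a) ** 2 + (u_b / b) ** 2)
print(f"r = {r:.6f} +/- {u_r:.6f}")
```

    The relative uncertainty of the ratio is dominated by whichever input has the larger relative uncertainty, here the minor-isotope signal.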

  11. A systematic review and meta-analysis of traditional insect Chinese medicines combined chemotherapy for non-surgical hepatocellular carcinoma therapy.

    Science.gov (United States)

    Shi, Zhaofeng; Song, Tiebing; Wan, Yi; Xie, Juan; Yan, Yiquan; Shi, Kekai; Du, Yongping; Shang, Lei

    2017-06-28

    Against the background of the high morbidity and mortality of hepatocellular carcinoma (HCC) and the rapid development of traditional Chinese medicine (TCM), we conducted a systematic review and meta-analysis of randomized clinical trials (RCTs) according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement to assess the clinical effectiveness and safety of traditional insect Chinese medicines and related preparations for non-surgical HCC. RCTs were searched based on standardized searching rules in mainstream medical databases from inception up to May 2016. Ultimately, a total of 57 articles with 4,651 patients were enrolled in this meta-analysis. We found that traditional insect Chinese medicines and related preparations combined with chemotherapy showed significant effectiveness and safety in objective response rate, and could be recommended as auxiliary therapy combined with chemotherapy for HCC.

  12. Uncertainty visualisation in the Model Web

    Science.gov (United States)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and furthermore uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients, and lack functionality for handling data coming from web services as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool are probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool

  13. Waste receiving and processing drum weight measurement uncertainty review findings

    International Nuclear Information System (INIS)

    LANE, M.P.

    1999-01-01

    The purpose of reviewing the weight scale operation at the WRAP facility was to determine the uncertainty associated with weight measurements. Weight measurement uncertainty is needed to support WRAP Nondestructive Examination (NDE) and Nondestructive Assay (NDA) analyses.

  14. Present searches for Higgs signatures at the Tevatron

    International Nuclear Information System (INIS)

    Groer, L.

    1997-08-01

    We present results for various searches for signatures of standard and non-standard model Higgs boson decays conducted at the collider detectors CDF and D0, using ∼100 pb⁻¹ of integrated luminosity each from the Tevatron collider Run 1 (1992-96) at √s = 1.8 TeV. No evidence for a Higgs boson decay is found and various limits are set.

  15. Uncertainty in hydraulic tests in fractured rock

    International Nuclear Information System (INIS)

    Ji, Sung-Hoon; Koh, Yong-Kwon

    2014-01-01

    Interpretation of hydraulic tests in fractured rock involves uncertainty because the hydraulic properties of fractured rock differ from those of a porous medium. In this study, we reviewed several interesting phenomena which show uncertainty in a hydraulic test in fractured rock, and discussed their origins and how they should be considered during site characterisation. Our results show that the hydraulic parameters of a fractured rock estimated from a hydraulic test are associated with uncertainty due to aperture changes and non-linear groundwater flow during the test. Although the magnitude of these two uncertainties is site-dependent, the results suggest that a hydraulic test should be conducted with as little disturbance to the natural groundwater flow as possible in order to account for them. Other effects reported from laboratory and numerical experiments, such as the trapping zone effect (Boutt, 2006) and the slip condition effect (Lee, 2014), can also introduce uncertainty to a hydraulic test, and should be evaluated in a field test. It is necessary to consider how to evaluate the uncertainty in the hydraulic properties during site characterisation and how to apply it to the safety assessment of a subsurface repository. (authors)

  16. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    Science.gov (United States)

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation-of-state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of GUM and its supplements are a versatile toolbox that enable us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamical properties of fluids.
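
    A minimal GUM Supplement 1 style propagation, in the spirit of the paper's vapour-pressure example: draw the uncertain input from its assigned distribution and push each draw through the non-linear model. The Antoine coefficients and the temperature uncertainty below are illustrative placeholders, not reference data:

```python
import random
import statistics

random.seed(0)

# Monte Carlo propagation through an Antoine-type vapour-pressure model
# log10(p) = A - B / (C + T). Coefficients and u(T) are illustrative.
A, B, C = 7.0, 1650.0, 230.0
T, u_T = 300.0, 0.10   # nominal temperature and its standard uncertainty

samples = []
for _ in range(100_000):
    t = random.gauss(T, u_T)                 # draw the uncertain input
    samples.append(10 ** (A - B / (C + t)))  # propagate through the model

p_mean = statistics.mean(samples)
u_p = statistics.stdev(samples)
print(f"p = {p_mean:.2f} +/- {u_p:.2f}")
```

    Unlike the law of propagation of uncertainty, this sampling approach needs no linearization of the model and no iteration-specific derivatives, which is why it suits flash and equation-of-state calculations.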

  17. Computationally-generated nuclear forensic characteristics of early production reactors with an emphasis on sensitivity and uncertainty

    International Nuclear Information System (INIS)

    Redd, Evan M.; Sjoden, Glenn; Erickson, Anna

    2017-01-01

    Highlights: • The X-10 reactor is used as a case study for nuclear forensic signatures. • S/U analysis is conducted to derive statistically relevant markers. • Computationally-generated signatures aid with proliferation pathway identification. • The highest uncertainty in total plutonium production originates from ²³⁸Pu and ²⁴²Pu. -- Abstract: With nuclear technology and analysis advancements, site access restrictions, and the ban on nuclear testing, computationally-generated nuclear forensic signatures are becoming more important in gaining knowledge of a reclusive country's weapon material production capabilities. In particular, graphite-moderated reactors provide an appropriate case study for isotopics relevant to Pu production in a clandestine nuclear program due to their ease of design and low thermal output. We study the production characteristics of the X-10 reactor with the goal of developing statistically-relevant nuclear forensic signatures from early Pu production. In the X-10 reactor, a flat flux gradient and low burnup produce exceptionally pure Pu, as evident from the ²⁴⁰Pu/²³⁹Pu ratio. However, these design aspects also make determining reactor zone attribution, done with the ²⁴²Pu/²⁴⁰Pu ratio, uncertain. Alternatively, the same ratios produce statistically differentiable results between Manhattan Project and post-Manhattan Project reactor configurations, allowing for attribution conclusions.

  18. Exploring entropic uncertainty relation in the Heisenberg XX model with inhomogeneous magnetic field

    Science.gov (United States)

    Huang, Ai-Jun; Wang, Dong; Wang, Jia-Ming; Shi, Jia-Dong; Sun, Wen-Yang; Ye, Liu

    2017-08-01

    In this work, we investigate the quantum-memory-assisted entropic uncertainty relation in a two-qubit Heisenberg XX model with an inhomogeneous magnetic field. It has been found that a larger coupling strength J between the two spin-chain qubits can effectively reduce the entropic uncertainty. Besides, we observe the mechanics of how the inhomogeneous field parameter b influences the uncertainty. Intriguingly, the entropic uncertainty can shrink to zero when the coupling coefficients are relatively large, while it only reduces to 1 with increasing homogeneous magnetic field. Additionally, we examine the purity of the state and Bell non-locality, and find that the entropic uncertainty is anticorrelated with both the purity and the Bell non-locality of the evolution state.
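
    The memoryless special case of the entropic uncertainty relation studied in work like this can be checked numerically. The sketch below verifies the Maassen-Uffink bound H(X) + H(Z) ≥ 1 bit for Pauli X and Z measurements on a single-qubit pure state (the parametrization and sample angles are illustrative):

```python
import math

def h2(p):
    """Binary Shannon entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def uncertainty_sum(theta):
    """H(X) + H(Z) for the state |psi> = cos(theta)|0> + sin(theta)|1>."""
    p_z = math.cos(theta) ** 2            # P(Z outcome 0)
    p_x = (1 + math.sin(2 * theta)) / 2   # P(X outcome +)
    return h2(p_z) + h2(p_x)

# Maassen-Uffink: H(X) + H(Z) >= -log2(c) = 1 bit, since c = 1/2
# for the mutually unbiased Pauli X and Z bases.
for theta in (0.0, math.pi / 8, math.pi / 4):
    s = uncertainty_sum(theta)
    assert s >= 1.0 - 1e-12               # the bound holds
    print(f"theta={theta:.3f}: H(X)+H(Z) = {s:.3f}")
```

    The quantum-memory-assisted relation of the abstract tightens this bound by the conditional entropy S(A|B), which is what lets entanglement drive the uncertainty below one bit, and even to zero.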

  19. Novel transcriptional signatures for sputum-independent diagnostics of tuberculosis in children

    DEFF Research Database (Denmark)

    Gjøen, John Espen; Jenum, Synne; Sivakumaran, Dhanasekaran

    2017-01-01

    Pediatric tuberculosis (TB) is challenging to diagnose, being confirmed by growth of Mycobacterium tuberculosis in at best 40% of cases. The WHO has assigned high priority to the development of non-sputum diagnostic tools. We therefore sought to identify transcriptional signatures in whole blood...

  20. The effect of classroom instruction, attitudes towards science and motivation on students' views of uncertainty in science

    Science.gov (United States)

    Schroeder, Meadow

    This study examined developmental and gender differences in Grade 5 and 9 students' views of uncertainty in science, and the effect of classroom instruction on attitudes towards science and motivation. Study 1 examined views of uncertainty in science when students were taught science using constructivist pedagogy. A total of 33 Grade 5 (n = 17, 12 boys, 5 girls) and Grade 9 (n = 16, 8 boys, 8 girls) students were interviewed about the ideas they had about uncertainty in their own experiments (i.e., practical science) and in professional science activities (i.e., formal science). Analysis found an interaction between grade and gender in the number of categories of uncertainty identified for both practical and formal science. Additionally, in formal science, there was a developmental shift from dualism (i.e., science is a collection of basic facts that are the result of straightforward procedures) to multiplism (i.e., there is more than one answer or perspective on scientific knowledge) from Grade 5 to Grade 9. Finally, there was a positive correlation between the understanding of uncertainty in practical and formal science. Study 2 compared the attitudes towards science and motivation of students in constructivist and traditional classrooms. Scores on these measures were also compared to views of uncertainty for the constructivist-taught students. A total of 28 students in Grade 5 (n = 13, 11 boys, 2 girls) and Grade 9 (n = 15, 6 boys, 9 girls) from traditional science classrooms, and the 33 constructivist students from Study 1, participated. Regardless of classroom instruction, fifth graders reported more positive attitudes towards science than ninth graders. Students from the constructivist classrooms reported more intrinsic motivation than students from the traditional classrooms. Constructivist students' views of uncertainty in formal and practical science did not correlate with their attitudes towards science and motivation.

  1. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    Science.gov (United States)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior—that is, the amount of uncertainty will first inflate, and subsequently decrease, with the growth of decoherence strengths in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initial state shared beforehand. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.

  2. The effects of extrinsic motivation on signature authorship opinions in forensic signature blind trials.

    Science.gov (United States)

    Dewhurst, Tahnee N; Found, Bryan; Ballantyne, Kaye N; Rogers, Doug

    2014-03-01

    Expertise studies in forensic handwriting examination involve comparisons of Forensic Handwriting Examiners' (FHEs) opinions with lay-persons on blind tests. All published studies of this type have reported real and demonstrable skill differences between the specialist and lay groups. However, critics have proposed that any difference shown may be indicative of a lack of motivation on the part of lay participants, rather than a real difference in skill. It has been suggested that qualified FHEs would be inherently more motivated to succeed in blinded validation trials, as their professional reputations could be at risk, should they perform poorly on the task provided. Furthermore, critics suggest that lay-persons would be unlikely to be highly motivated to succeed, as they would have no fear of negative consequences should they perform badly. In an effort to investigate this concern, a blind signature trial was designed and administered to forty lay-persons. Participants were required to compare known (exemplar) signatures of an individual to questioned signatures and asked to express an opinion regarding whether the writer of the known signatures wrote each of the questioned signatures. The questioned signatures comprised a mixture of genuine, disguised and simulated signatures. The forty participants were divided into two separate groupings. Group 'A' were requested to complete the trial as directed and were advised that for each correct answer they would be financially rewarded, for each incorrect answer they would be financially penalized, and for each inconclusive opinion they would receive neither penalty nor reward. Group 'B' was requested to complete the trial as directed, with no mention of financial recompense or penalty. The results of this study do not support the proposition that motivation rather than skill difference is the source of the statistical difference in opinions between individuals' results in blinded signature proficiency trials.

  3. Resolving uncertainty in chemical speciation determinations

    Science.gov (United States)

    Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.

    1999-10-01

    Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties at high (>9-10) and low pH. The data were fit with discrete monoprotic ligands: least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.

  4. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  5. Non-Coding RNAs in Hodgkin Lymphoma

    Directory of Open Access Journals (Sweden)

    Anna Cordeiro

    2017-05-01

    Full Text Available MicroRNAs (miRNAs), small non-coding RNAs that regulate gene expression by binding to the 3’-UTR of their target genes, can act as oncogenes or tumor suppressors. Recently, other types of non-coding RNAs—piwiRNAs and long non-coding RNAs—have also been identified. Hodgkin lymphoma (HL) is a disease of B cell origin characterized by the presence of only 1% of tumor cells, known as Hodgkin and Reed-Sternberg (HRS) cells, which interact with the microenvironment to evade apoptosis. Several studies have reported specific miRNA signatures that can differentiate HL lymph nodes from reactive lymph nodes, identify histologic groups within classical HL, and distinguish HRS cells from germinal center B cells. Moreover, some signatures are associated with survival or response to chemotherapy. Most of the miRNAs in the signatures regulate genes related to apoptosis, cell cycle arrest, or signaling pathways. Here we review findings on miRNAs in HL, as well as on other non-coding RNAs.

  6. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  7. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to an investigation of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generation and verification of threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation are given, which could reduce the level of counterfeit electronic documents signed by a group of users.
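The Lagrange-interpolation construction underlying many threshold schemes is Shamir secret sharing: any t of n shares reconstruct the signing secret, fewer reveal nothing. A minimal sketch (toy modulus, no hardening against malicious shareholders, not any specific scheme from the paper):

```python
import random

P = 2**61 - 1  # Mersenne prime modulus for the toy field arithmetic

def share(secret, t, n, seed=0):
    # Shamir (t, n): random polynomial of degree t-1 with f(0) = secret
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation evaluated at x = 0 recovers the secret
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (Fermat's little theorem)
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = share(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 shares suffice
assert reconstruct(shares[1:4]) == 123456789
```

In a real threshold signature the interpolation happens "in the exponent" over an elliptic-curve group, so the secret key itself is never reassembled; this sketch only shows the interpolation core.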

  8. Exotic signatures from supersymmetry

    International Nuclear Information System (INIS)

    Hall, L.J.

    1989-08-01

    Minor changes to the standard supersymmetric model, such as soft flavor violation and R parity violation, cause large changes in the signatures. The origin of these changes and the resulting signatures are discussed. 15 refs., 7 figs., 2 tabs

  9. Parametric uncertainty modeling for robust control

    DEFF Research Database (Denmark)

    Rasmussen, K.H.; Jørgensen, Sten Bay

    1999-01-01

    The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics is modeled non-conservatively as parametric uncertainty in linear time invariant models. The obtained uncertainty description makes it possible...... to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed...... point changes. It is shown that a diagonal PI control structure provides robust performance towards variations in feed flow rate or feed concentrations. However, including both liquid and vapor flow delays, robust performance specifications cannot be satisfied with this simple diagonal control structure...

  10. Short-term effects of a nicotine-free e-cigarette compared to a traditional cigarette in smokers and non-smokers.

    Science.gov (United States)

    Ferrari, Marco; Zanasi, Alessandro; Nardi, Elena; Morselli Labate, Antonio Maria; Ceriana, Piero; Balestrino, Antonella; Pisani, Lara; Corcione, Nadia; Nava, Stefano

    2015-10-12

    A few studies have assessed the short-term effects of low-dose nicotine e-cigarettes, while data about nicotine-free e-cigarettes (NF e-cigarettes) are scanty. Concerns have been expressed about the use of NF e-cigarettes, because of the high concentrations of propylene glycol and other compounds in the e-cigarette vapor. This laboratory-based study aimed to compare the effects of ad libitum use of a NF e-cigarette or a traditional cigarette for 5 min in healthy adult smokers (n = 10) and non-smokers (n = 10). The main outcome measures were pulmonary function tests, fraction of exhaled nitric oxide (FeNO) and fractional concentration of carbon monoxide (FeCO) in exhaled breath. The traditional cigarette induced statistically significant increases in FeCO in both smokers and non-smokers, while no significant changes were observed in FeNO. In non-smokers, the traditional cigarette induced a significant decrease from baseline in FEF75 (81 % ± 35 % vs 70.2 % ± 28.2 %, P = 0.013), while in smokers significant decreases were observed in FEF25 (101.3 % ± 16.4 % vs 93.5 % ± 31.7 %, P = 0.037), FEV1 (102.2 % ± 9.5 % vs 98.3 % ± 10 %, P = 0.037) and PEF (109.5 % ± 14.6 % vs 99.2 % ± 17.5 %, P = 0.009). In contrast, the only statistically significant effects induced by the NF e-cigarette in smokers were reductions in FEV1 (102.2 % ± 9.5 % vs 99.5 ± 7.6 %, P = 0.041) and FEF25 (103.4 % ± 16.4 % vs 94.2 % ± 16.2 %, P = 0.014). The present study demonstrated that the specific brand of NF e-cigarette utilized did not induce any major acute effects. In contrast, several studies have shown that both traditional cigarettes and nicotine-containing e-cigarettes have acute effects on lung function. Our study expands on previous observations on the effects of NF e-cigarettes, but also for the first time describes the changes induced by smoking one traditional cigarette in a group of never smokers. The short-term use of the specific brand of NF e-cigarette assessed here therefore appears to be free of major acute effects.

  11. Research on a New Signature Scheme on Blockchain

    Directory of Open Access Journals (Sweden)

    Chao Yuan

    2017-01-01

    Full Text Available With the rise of Bitcoin, blockchain, which is the core technology of Bitcoin, has received increasing attention. Privacy preserving and performance on blockchain are two research points in academia and business, but there are still some unresolved issues in both respects. An aggregate signature scheme is a digital signature that supports making signatures on many different messages generated by many different users. Using aggregate signatures, the size of the signature can be shortened by compressing multiple signatures into a single signature. In this paper, a new signature scheme for transactions on blockchain based on the aggregate signature is proposed. It is worth noting that the elliptic curve discrete logarithm problem and bilinear maps play major roles in our signature scheme, and the security properties of the scheme are proved. In our signature scheme, the amount is hidden, especially in transactions which contain multiple inputs and outputs. Additionally, the size of the signature on a transaction is constant regardless of the number of inputs and outputs the transaction contains, which improves the performance of signing. Finally, we give an application scenario for our signature scheme which aims to achieve transactions of big data on blockchain.
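The constant-size property of aggregation can be illustrated with a toy Schnorr-style multi-signature over integers modulo a prime. This is NOT the pairing-based scheme of the paper and is NOT secure (real schemes use elliptic-curve groups and defenses against rogue-key attacks); it only shows how several signers' contributions compress into one value that is verified with a single check:

```python
import hashlib
import random

P = 2**127 - 1      # Mersenne prime modulus (toy group)
Q = P - 1           # exponents are reduced modulo P - 1 (Fermat)
G = 3               # fixed base (illustrative choice)

def h(*parts):
    # Hash-to-scalar challenge
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

rng = random.Random(42)
msg = "toy transaction: pay 5 coins to Bob"

keys = []                                   # (secret, public) per signer
for _ in range(3):
    x = rng.randrange(1, Q)
    keys.append((x, pow(G, x, P)))
nonces = [rng.randrange(1, Q) for _ in keys]

R = 1                                       # combined nonce commitment
for r in nonces:
    R = R * pow(G, r, P) % P
c = h(R, msg)                               # shared challenge

# Each signer contributes s_i = r_i + c*x_i; the aggregate is their sum,
# so the final signature (R, s) has constant size however many signers join.
s = sum(r + c * x for r, (x, _) in zip(nonces, keys)) % Q

pk = 1                                      # product of the public keys
for _, y in keys:
    pk = pk * y % P
assert pow(G, s, P) == R * pow(pk, c, P) % P   # one equation verifies everyone
```

The verification cost is independent of the number of signers, which is the performance point the abstract makes for multi-input transactions.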

  12. Quantum multi-signature protocol based on teleportation

    International Nuclear Information System (INIS)

    Wen Xiao-jun; Liu Yun; Sun Yu

    2007-01-01

    In this paper, a protocol which can be used in multi-user quantum signature is proposed. The scheme of signature and verification is based on the correlation of Greenberger-Horne-Zeilinger (GHZ) states and the controlled quantum teleportation. Different from the digital signatures, which are based on computational complexity, the proposed protocol has perfect security in the noiseless quantum channels. Compared to previous quantum signature schemes, this protocol can verify the signature independent of an arbitrator as well as realize multi-user signature together. (orig.)

  13. Data set on the bioprecipitation of sulfate and trivalent arsenic by acidophilic non-traditional sulfur reducing bacteria.

    Science.gov (United States)

    de Matos, Letícia Paiva; Costa, Patrícia Freitas; Moreira, Mariana; Gomes, Paula Cristine Silva; de Queiroz Silva, Silvana; Gurgel, Leandro Vinícius Alves; Teixeira, Mônica Cristina

    2018-04-01

    Data presented here are related to the original paper "Simultaneous removal of sulfate and arsenic using immobilized non-traditional sulfate reducing bacteria (SRB) mixed culture and alternative low-cost carbon sources" published by the same authors (Matos et al., 2018) [1]. The data set presented here aims to facilitate comprehension of that paper by giving readers some additional information. The data set includes a brief description of experimental conditions and the results obtained during both batch and semi-continuous reactor experiments. Data confirmed arsenic and sulfate were simultaneously removed under acidic pH by using a biological treatment based on the activity of a non-traditional sulfur reducing bacteria consortium. This microbial consortium was able to utilize glycerol and powdered chicken feathers as carbon donors, and proved to be resistant to arsenite up to 8.0 mg L-1. Data related to sulfate and arsenic removal efficiencies, residual arsenite and sulfate contents, pH and Eh measurements obtained under different experimental conditions were depicted in graphical format. Refers to https://doi.org/10.1016/j.cej.2017.11.035.

  14. A new NMIS characteristic signature acquisition method based on time-domain fission correlation spectrum

    International Nuclear Information System (INIS)

    Wei Biao; Feng Peng; Yang Fan; Ren Yong

    2014-01-01

    To deal with the disadvantages of the homogeneous signature of the nuclear material identification system (NMIS) and the limited methods to extract the characteristic parameters of nuclear materials, an enhanced method combining Time-of-Flight (TOF) and Pulse Shape Discrimination (PSD) was introduced into the traditional characteristic parameter extraction and recognition system of the NMIS. With the help of the PSD, the γ signal and the neutron signal can be discriminated. Further, based on the differences in the neutron-γ flight times of the detectors at various positions, a new time-domain signature reflecting the position information of unknown nuclear material was investigated. The simulation result showed that the algorithm is feasible and helpful to identify the relative position of unknown nuclear material. (authors)

  15. A Directed Signature Scheme and its Applications

    OpenAIRE

    Lal, Sunder; Kumar, Manoj

    2004-01-01

    This paper presents a directed signature scheme with the property that the signature can be verified only with the help of signer or signature receiver. We also propose its applications to share verification of signatures and to threshold cryptosystems.

  16. Assessing Changes in Medical Student Attitudes toward Non-Traditional Human Sexual Behaviors Using a Confidential Audience Response System

    Science.gov (United States)

    Tucker, Phebe; Candler, Chris; Hamm, Robert M.; Smith, E. Michael; Hudson, Joseph C.

    2010-01-01

    Medical students encountering patients with unfamiliar, unconventional sexual practices may have attitudes that can affect open communication during sexual history-taking. We measured changes in first-year US medical student attitudes toward 22 non-traditional sexual behaviors before and after exposure to human sexuality instruction. An…

  17. Validity of WTP measures under preference uncertainty

    OpenAIRE

    Kniebes, Carola; Rehdanz, Katrin; Schmidt, Ulrich

    2014-01-01

    This paper establishes a new method for eliciting Willingness to Pay (WTP) in contingent valuation (CV) studies with an open-ended elicitation format: the Range-WTP method. In contrast to the traditional approach for eliciting Point-WTP, Range-WTP explicitly allows for preference uncertainty in responses. Using data from two novel large-scale surveys on the perception of solar radiation management (SRM), a little-known technique for counteracting climate change, we compare the performance of ...

  18. On reliable discovery of molecular signatures

    Directory of Open Access Journals (Sweden)

    Björkegren Johan

    2009-01-01

    Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubts on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy are relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
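The paper develops its own FDR-controlling test for signature discovery; as a generic illustration of what "5% FDR" means operationally, the standard Benjamini-Hochberg step-up procedure can be sketched as follows (the p-values are made up):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    # Return indices of hypotheses rejected at FDR level alpha:
    # find the largest rank k with p_(k) <= k*alpha/m, reject the k smallest.
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

# Hypothetical per-gene p-values from a differential-expression screen
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
rejected = benjamini_hochberg(pvals, alpha=0.05)
print(rejected)   # -> [0, 1]
```

Note the abstract's point survives this sketch: a gene set can predict well while containing many false discoveries, so prediction accuracy is no substitute for an explicit FDR guarantee like the one above.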

  19. Non-traditional approaches to teaching GPS online

    Science.gov (United States)

    Matias, A.; Wolf, D. F., II

    2009-12-01

    Students are increasingly turning to the web for quality education that fits into their lives. Nonetheless, online learning brings challenges as well as a fresh opportunity for exploring pedagogical practices not present in traditional higher education programs, particularly in the sciences. A team of two dozen Empire State College-State University of New York instructional designers, faculty, and other staff is working on making science relevant to non-majors who may initially have anxiety about general education science courses. One of these courses, GPS and the New Geography, focuses on how Global Positioning System (GPS) technology provides a base for inquiry and scientific discovery across a range of environmental issues of local, regional, and global scope. GPS and the New Geography is an introductory level course developed under a grant supported by the Charitable Leadership Foundation. Taking advantage of the proliferation of tools currently available for online learning management systems, we explore current trends in Web 2.0 applications to aggregate and leverage data to create a nontraditional, interactive learning environment. Using our best practices to promote online discussion and interaction, these tools help engage students and foster deep learning. During the 15-week term students learn through case studies, problem-based exercises, and the use of scientific data, thus expanding their spatial literacy and gaining experience using real spatial technology tools to enhance their understanding of real-world issues. In particular, we present how the use of Mapblogs, an in-house developed blogging platform that links GIS with GPS units, interactive data presentations, intuitive visual working environments, RSS feeds, and other nontraditional Web 2.0 technologies, has successfully promoted active learning in the virtual learning environment.

  20. Uncertainty in projected climate change arising from uncertain fossil-fuel emission factors

    Science.gov (United States)

    Quilcaille, Y.; Gasser, T.; Ciais, P.; Lecocq, F.; Janssens-Maenhout, G.; Mohr, S.

    2018-04-01

    Emission inventories are widely used by the climate community, but their uncertainties are rarely accounted for. In this study, we evaluate the uncertainty in projected climate change induced by uncertainties in fossil-fuel emissions, accounting for non-CO2 species co-emitted with the combustion of fossil fuels and their use in industrial processes. Using consistent historical reconstructions and three contrasted future projections of fossil-fuel extraction from Mohr et al., we calculate CO2 emissions and their uncertainties stemming from estimates of fuel carbon content, net calorific value and oxidation fraction. Our historical reconstructions of fossil-fuel CO2 emissions are consistent with other inventories in terms of average and range. The uncertainties sum up to a ±15% relative uncertainty in cumulative CO2 emissions by 2300. Uncertainties in the emissions of non-CO2 species associated with the use of fossil fuels are estimated using co-emission ratios varying with time. Using these inputs, we use the compact Earth system model OSCAR v2.2 and a Monte Carlo setup, in order to attribute the uncertainty in projected global surface temperature change (ΔT) to three sources of uncertainty, namely the Earth system’s response, fossil-fuel CO2 emissions and non-CO2 co-emissions. Under the three future fuel extraction scenarios, we simulate the median ΔT to be 1.9, 2.7 or 4.0 °C in 2300, with an associated 90% confidence interval of about 65%, 52% and 42%. We show that virtually all of the total uncertainty is attributable to the uncertainty in the future Earth system’s response to the anthropogenic perturbation. We conclude that the uncertainty in emission estimates can be neglected for global temperature projections in the face of the large uncertainty in the Earth system response to the forcing of emissions. We show that this result does not hold for all variables of the climate system, such as the atmospheric partial pressure of CO2 and the
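The Monte Carlo propagation of emission-factor uncertainty into cumulative emissions can be sketched in a few lines. Every number below (energy use, carbon content, oxidation fraction and their spreads) is invented for illustration and is not taken from the paper:

```python
import random
import statistics

def cumulative_co2(fuel_pj, carbon_per_pj, oxidised_fraction):
    # Illustrative emission estimate: energy use x carbon content x oxidation
    return fuel_pj * carbon_per_pj * oxidised_fraction

def mc_emissions(n=20000, seed=7):
    # Resample the uncertain emission factors and re-evaluate the total
    rng = random.Random(seed)
    draws = [cumulative_co2(1000.0,                  # energy use, assumed known
                            rng.gauss(25.0, 1.2),    # carbon content (uncertain)
                            rng.gauss(0.98, 0.01))   # oxidation fraction
             for _ in range(n)]
    return statistics.mean(draws), statistics.stdev(draws)

mean_e, sd_e = mc_emissions()
print(f"cumulative CO2 ~ {mean_e:.0f} +/- {sd_e:.0f} (arbitrary units)")
```

In the paper the same resampled emission trajectories are then fed through the Earth system model, so the emission spread can be compared against the spread coming from the climate response itself.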

  1. Assessing the uncertainties of seismic velocity and anisotropy structure of Northern Great Plains using a transdimensional Bayesian approach

    Science.gov (United States)

    Gao, C.; Lekic, V.

    2017-12-01

    Seismic imaging utilizing complementary seismic data provides unique insight on the formation, evolution and current structure of continental lithosphere. While numerous efforts have improved the resolution of seismic structure, the quantification of uncertainties remains challenging due to the non-linearity and the non-uniqueness of geophysical inverse problem. In this project, we use a reverse jump Markov chain Monte Carlo (rjMcMC) algorithm to incorporate seismic observables including Rayleigh and Love wave dispersion, Ps and Sp receiver function to invert for shear velocity (Vs), compressional velocity (Vp), density, and radial anisotropy of the lithospheric structure. The Bayesian nature and the transdimensionality of this approach allow the quantification of the model parameter uncertainties while keeping the models parsimonious. Both synthetic test and inversion of actual data for Ps and Sp receiver functions are performed. We quantify the information gained in different inversions by calculating the Kullback-Leibler divergence. Furthermore, we explore the ability of Rayleigh and Love wave dispersion data to constrain radial anisotropy. We show that when multiple types of model parameters (Vsv, Vsh, and Vp) are inverted simultaneously, the constraints on radial anisotropy are limited by relatively large data uncertainties and trade-off strongly with Vp. We then perform joint inversion of the surface wave dispersion (SWD) and Ps, Sp receiver functions, and show that the constraints on both isotropic Vs and radial anisotropy are significantly improved. To achieve faster convergence of the rjMcMC, we propose a progressive inclusion scheme, and invert SWD measurements and receiver functions from about 400 USArray stations in the Northern Great Plains. We start by only using SWD data due to its fast convergence rate. We then use the average of the ensemble as a starting model for the joint inversion, which is able to resolve distinct seismic signatures of
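The Kullback-Leibler divergence used above to quantify the information gained by each inversion has a simple discrete form, D_KL(P||Q) = Σ p_i ln(p_i/q_i). A sketch with made-up prior and posterior bins (not the study's model ensembles):

```python
import math

def kl_divergence(p, q):
    # D_KL(P || Q) for discrete distributions, in nats; measures the
    # information gained when posterior P replaces prior Q
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

prior = [0.25, 0.25, 0.25, 0.25]        # broad prior over model bins
posterior = [0.70, 0.20, 0.05, 0.05]    # data-sharpened posterior (invented)
gain = kl_divergence(posterior, prior)
print(f"information gain = {gain:.3f} nats")
```

A larger divergence means the data (e.g. adding receiver functions to surface wave dispersion) moved the posterior further from the prior, which is exactly how the joint inversion's added constraint is scored.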

  2. Aequorin-based measurements of intracellular Ca2+-signatures in plant cells

    Directory of Open Access Journals (Sweden)

    Mithöfer Axel

    2002-01-01

    Full Text Available Due to the involvement of calcium as a main second messenger in plant signaling pathways, increasing interest has been focused on the calcium signatures supposed to be involved in the patterning of the specific response associated with a given stimulus. In order to follow these signatures we describe here a practical approach using the non-invasive method based on aequorin technology. Besides reviewing the advantages and disadvantages of this method, we report results showing the usefulness of aequorin to study the calcium response to biotic (elicitors) and abiotic stimuli (osmotic shocks) in various compartments of plant cells such as the cytosol and nucleus.

  3. Assessment of Uncertainty in the Determination of Activation Energy for Polymeric Materials

    Science.gov (United States)

    Darby, Stephania P.; Landrum, D. Brian; Coleman, Hugh W.

    1998-01-01

    An assessment of the experimental uncertainty in obtaining the kinetic activation energy from thermogravimetric analysis (TGA) data is presented. A neat phenolic resin, Borden SC1008, was heated at three heating rates to obtain weight loss vs temperature data. Activation energy was calculated by two methods: the traditional Flynn and Wall method based on the slope of log(q) versus 1/T, and a modification of this method where the ordinate and abscissa are reversed in the linear regression. The modified method produced a more accurate curve fit of the data, was more sensitive to data nonlinearity, and gave a value of activation energy 75 percent greater than the original method. An uncertainty analysis using the modified method yielded a 60 percent uncertainty in the average activation energy. Based on this result, the activation energy for a carbon-phenolic material was doubled and used to calculate the ablation rate in a typical solid rocket environment. Doubling the activation energy increased surface recession by 3 percent. Current TGA data reduction techniques that use the traditional Flynn and Wall approach to calculate activation energy should be changed to the modified method.
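The effect of swapping ordinate and abscissa in the regression can be reproduced on synthetic data: unless the points are perfectly collinear, the ordinary slope and the inverted reversed-regression slope differ by a factor of 1/r². The three data points below are illustrative, not the SC1008 measurements:

```python
def ols_slope(x, y):
    # Ordinary least-squares slope of y regressed on x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Synthetic Flynn-Wall-style data: log10(heating rate) vs 1/T, slight scatter
inv_T = [1.30e-3, 1.28e-3, 1.26e-3]   # 1/K
log_q = [0.70, 1.00, 1.32]            # log10(q)

slope_std = ols_slope(inv_T, log_q)        # traditional: log q on 1/T
slope_rev = 1.0 / ols_slope(log_q, inv_T)  # modified: invert the reversed fit
print(slope_std, slope_rev)   # close but not equal for scattered data
```

The activation energy is proportional to the magnitude of this slope, so the choice of regression orientation feeds straight through to the reported value, which is the point of the uncertainty assessment above.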

  4. COMPARATIVE STUDIES OF TRADITIONAL (NON-ENERGY ...

    African Journals Online (AJOL)

    2012-12-19

    Dec 19, 2012 ... more energy and utilities cost than the traditional energy technique. ... Keywords: ... An additional major advantage of the Pinch approach is that ... modification before embarking on actual implementation (Adefila, 1994).

  5. Aboriginal oral traditions of Australian impact craters

    Science.gov (United States)

    Hamacher, Duane W.; Goldsmith, John

    2013-11-01

    In this paper we explore Aboriginal oral traditions that relate to Australian meteorite craters. Using the literature, first-hand ethnographic records and field trip data, we identify oral traditions and artworks associated with four impact sites: Gosses Bluff, Henbury, Liverpool and Wolfe Creek. Oral traditions describe impact origins for Gosses Bluff, Henbury and Wolfe Creek Craters, and non-impact origins for Liverpool Crater, with Henbury and Wolfe Creek stories having both impact and non-impact origins. Three impact sites that are believed to have been formed during human habitation of Australia -- Dalgaranga, Veevers, and Boxhole -- do not have associated oral traditions that are reported in the literature.

  6. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the least restrictions on perturbation, but demands computing resources. ► The analytical method is limited to small perturbations on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
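The sampling-based method can be sketched with a toy stand-in for the transport calculation: perturb the uncertain nuclear data, re-evaluate k, and read the uncertainty off the spread. The infinite-medium formula and all cross-section values below are invented for illustration, not benchmark data:

```python
import random
import statistics

def k_inf(nu_sigma_f, sigma_a):
    # Infinite-medium toy model: k_inf = (nu*Sigma_f) / Sigma_a -- a cheap
    # stand-in for a full Monte Carlo transport calculation
    return nu_sigma_f / sigma_a

def sampled_k_uncertainty(n=50000, seed=3):
    # Sampling-based method: resample uncertain cross sections, re-evaluate k
    rng = random.Random(seed)
    ks = [k_inf(rng.gauss(0.0950, 0.0010),   # nu*Sigma_f (illustrative, 1/cm)
                rng.gauss(0.1000, 0.0008))   # Sigma_a (illustrative, 1/cm)
          for _ in range(n)]
    return statistics.mean(ks), statistics.stdev(ks)

k_mean, k_sd = sampled_k_uncertainty()
print(f"k = {k_mean:.4f} +/- {k_sd:.4f}")
```

The sampling approach places no restriction on how the perturbed parameters enter the model, at the cost of many re-evaluations; an analytical first-order method would instead differentiate k with respect to each parameter, which is only valid for small perturbations, as the highlights note.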

  7. Real time gamma-ray signature identifier

    Science.gov (United States)

    Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA

    2012-05-15

    A real time gamma-ray signature/source identification method and system using principal components analysis (PCA). One or more comprehensive spectral libraries of nuclear material types and configurations are transformed and substantially reduced into concise representations/signatures that represent and index each individual predetermined spectrum in principal component (PC) space. An unknown gamma-ray signature may then be compared against the representative signatures to find a match, or at least to characterize the unknown signature among all the entries in the library, with a single regression or simple projection into the PC space, so as to substantially reduce processing time and computing resources and enable real-time characterization and/or identification.
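The pipeline (library → PCA → single projection → nearest indexed signature) can be sketched as follows, with a randomly generated toy library standing in for real spectral data; the channel count, component count and noise level are all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "library": 20 reference spectra of 64 channels each, with labels
library = rng.random((20, 64))
labels = [f"source_{i}" for i in range(20)]

# Build the PC space once, offline, via SVD of the mean-centered library
mean = library.mean(axis=0)
_, _, vt = np.linalg.svd(library - mean, full_matrices=False)
pcs = vt[:5]                          # keep 5 principal components
index = (library - mean) @ pcs.T      # concise signature of each library entry

def identify(spectrum):
    # Single projection into PC space, then nearest indexed signature
    z = (spectrum - mean) @ pcs.T
    return labels[int(np.argmin(np.linalg.norm(index - z, axis=1)))]

# A noisy measurement of library entry 7 projects close to its signature
noisy = library[7] + rng.normal(0.0, 0.01, size=64)
print(identify(noisy))
```

The expensive SVD happens once when the library is compiled; each unknown spectrum then costs only one small matrix-vector product plus a nearest-neighbor search, which is what makes the real-time claim plausible.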

  8. Information-integration category learning and the human uncertainty response.

    Science.gov (United States)

    Paul, Erick J; Boomer, Joseph; Smith, J David; Ashby, F Gregory

    2011-04-01

    The human response to uncertainty has been well studied in tasks requiring attention and declarative memory systems. However, uncertainty monitoring and control have not been studied in multi-dimensional, information-integration categorization tasks that rely on non-declarative procedural memory. Three experiments are described that investigated the human uncertainty response in such tasks. Experiment 1 showed that following standard categorization training, uncertainty responding was similar in information-integration tasks and rule-based tasks requiring declarative memory. In Experiment 2, however, uncertainty responding in untrained information-integration tasks impaired the ability of many participants to master those tasks. Finally, Experiment 3 showed that the deficit observed in Experiment 2 was not because of the uncertainty response option per se, but rather because the uncertainty response provided participants a mechanism via which to eliminate stimuli that were inconsistent with a simple declarative response strategy. These results are considered in the light of recent models of category learning and metacognition.

  9. The informational system model of Ukrainian national transport workflow improvement based on electronic signature introduction management

    Directory of Open Access Journals (Sweden)

    Grigoriy NECHAEY

    2007-01-01

    Full Text Available The proposed model of an informational system supposes the introduction of a new conceptual method of working with e-signatures in transport informational systems. The problems and aims that may be solved with the help of this system, and the most important economical and technical advantages of the proposed system in comparison with traditional methods of e-signing, are marked out.

  10. Uncertainty propagation in nuclear forensics

    International Nuclear Information System (INIS)

    Pommé, S.; Jerome, S.M.; Venchiarutti, C.

    2014-01-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
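For the simplest chronometer, a pure parent with a stable daughter, the age follows from N_d/N_p = e^{λt} − 1, i.e. t = ln(1 + R)/λ with λ = ln2/T½, and first-order propagation combines the ratio and half-life uncertainties. A sketch with illustrative, non-nuclide-specific numbers:

```python
import math

def age_from_atom_ratio(ratio_d_p, half_life_y, u_ratio, u_half_life_y):
    # Pure-parent chronometer with a stable daughter:
    #   N_d/N_p = exp(lambda*t) - 1   =>   t = ln(1 + R) / lambda
    lam = math.log(2) / half_life_y
    t = math.log1p(ratio_d_p) / lam
    # First-order propagation:
    #   dt/dR = 1 / (lambda * (1 + R));  dt/dT = t / T  (since lambda = ln2/T)
    dt_dR = 1.0 / (lam * (1.0 + ratio_d_p))
    dt_dT = t / half_life_y
    u_t = math.hypot(dt_dR * u_ratio, dt_dT * u_half_life_y)
    return t, u_t

# Illustrative inputs: R = 0.0015 atom ratio, 24110 y half-life, with
# uncertainties of 1e-5 on R and 30 y on the half-life (all made up)
t, u_t = age_from_atom_ratio(0.0015, 24110.0, 1.0e-5, 30.0)
print(f"age = {t:.1f} +/- {u_t:.1f} years")
```

The dt/dT term shows directly why the abstract calls for more precise half-life data: its contribution to u_t scales with the age itself, so for old material the half-life uncertainty can dominate the budget.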

  11. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed, and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid for both "base case" model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  12. Latency and Criticality of Uncertainties in the Development of Product-Service Systems

    DEFF Research Database (Denmark)

    Ramirez Hernandez, Tabea; Kreye, Melanie; Pigosso, Daniela Cristina Antelmi

    2018-01-01

    Servitization requires manufacturers to develop new business models - compound offerings between products and services often referred to as Product-Service Systems (PSS). The development of PSS goes beyond traditional product-development practices, requiring new processes and capabilities due to the high levels of uncertainty caused by the novelty and complexity of developing the product and the service in parallel. Uncertainty is further increased by the typically long life cycles of PSS and by organisational complexity caused by a high degree of stakeholder involvement (Wolfenstetter et al., 2015...

  13. Uncertainty evaluation of a modified elimination weighing for source preparation

    Energy Technology Data Exchange (ETDEWEB)

    Cacais, F.L.; Loayza, V.M., E-mail: facacais@gmail.com [Instituto Nacional de Metrologia, Qualidade e Tecnologia, (INMETRO), Rio de Janeiro, RJ (Brazil); Delgado, J.U. [Instituto de Radioproteção e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Metrologia das Radiações Ionizantes

    2017-07-01

    Modifications to the elimination weighing method for radioactive source preparation allowed weighing results to be corrected without non-linearity problems, assigned an uncertainty contribution for the correction of the same order as the drop-mass uncertainty, and allowed the weighing variability in serial source preparation to be checked. The analysis focused on the achievable weighing accuracy; the uncertainty estimated by the Monte Carlo method for the mass of a 20 mg drop was at most 0.06%. (author)

  14. WE-B-19A-01: SRT II: Uncertainties in SRT

    International Nuclear Information System (INIS)

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-01-01

    SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments for a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS and protons). Sources of SRS planning and delivery uncertainty include dose calculation, image fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainties and allows for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the resulting uncertainties introduced in the image registration process further contribute to overall treatment planning uncertainties. Each of these uncertainties must be quantified and their impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques to make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes relying on Poisson statistics. In this session, we will discuss uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for such uncertainties. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS. To learn the various uncertainties introduced by image fusion, deformable image registration, and contouring.

  15. Origin, Development and Decline of Monolithic Pillars and the Continuity of the Tradition in Polylithic, Non-Lithic and Structural Forms

    Directory of Open Access Journals (Sweden)

    S. Krishnamurthy

    2016-02-01

    Full Text Available The present paper deals with one such creation of Man: the tradition of erecting free-standing monolithic pillars - its origin, growth and decline, and the continuity of the tradition of erecting such pillars in its changed polylithic (from Greek polloi = many + lithic = stone), non-lithic and structural forms. No single reason can be identified as the exact cause of the decline of the tradition of erecting monolithic pillars and its transformation. In this paper the authors analyse various phenomena, such as socio-political, economic and technical aspects, which may have led to their decline and subsequently to their continuity in a changed form in the Indian context.

  16. Combining Gene Signatures Improves Prediction of Breast Cancer Survival

    Science.gov (United States)

    Zhao, Xi; Naume, Bjørn; Langerød, Anita; Frigessi, Arnoldo; Kristensen, Vessela N.; Børresen-Dale, Anne-Lise; Lingjærde, Ole Christian

    2011-01-01

    Background Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. Principal Findings To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and the Adjuvant! Online for survival prediction. Conclusion Combining the predictive strength of multiple gene signatures improves prediction of breast
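The combination step described above can be illustrated with a toy numerical sketch. This is not the authors' pipeline (which fits ridge-penalized Cox models per signature); here, synthetic per-signature risk scores stand in for the fitted Cox predictions, and the first principal component of the standardized risk matrix serves as the combined predictor, with a median split into two risk groups. All sizes and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 81, 11                               # test-set size / no. of signatures
latent = rng.normal(size=n)                 # shared "true" risk (unobserved)
risks = latent[:, None] + 0.5 * rng.normal(size=(n, k))  # k noisy risk scores

Z = (risks - risks.mean(0)) / risks.std(0)  # standardize each signature's risk
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
combined = Z @ Vt[0]                        # first principal-component score
high_risk = combined > np.median(combined)  # split into two risk groups

# The combined score should track the shared latent risk more closely than
# any single noisy signature does (up to sign, which PCA leaves arbitrary).
r_combined = abs(np.corrcoef(combined, latent)[0, 1])
r_single = max(abs(np.corrcoef(risks[:, j], latent)[0, 1]) for j in range(k))
print(r_combined, r_single)
```

Averaging out the signature-specific noise is exactly why a combined predictor can outperform each individual signature, which is the paper's central finding.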

  17. Combining gene signatures improves prediction of breast cancer survival.

    Directory of Open Access Journals (Sweden)

    Xi Zhao

    Full Text Available BACKGROUND: Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. PRINCIPAL FINDINGS: To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and Adjuvant! Online for survival prediction. CONCLUSION: Combining the predictive strength of multiple gene signatures improves

  18. 42 CFR 424.36 - Signature requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Signature requirements. 424.36 Section 424.36... (CONTINUED) MEDICARE PROGRAM CONDITIONS FOR MEDICARE PAYMENT Claims for Payment § 424.36 Signature requirements. (a) General rule. The beneficiary's own signature is required on the claim unless the beneficiary...

  19. Unsupervised signature extraction from forensic logs

    NARCIS (Netherlands)

    Thaler, S.M.; Menkovski, V.; Petkovic, M.; Altun, Y.; Das, K.; Mielikäinen, T.; Malerba, D.; Stefanowski, J.; Read, J.; Žitnik, M.; Ceci, M.

    2017-01-01

    Signature extraction is a key part of forensic log analysis. It involves recognizing patterns in log lines such that log lines that originated from the same line of code are grouped together. A log signature consists of immutable parts and mutable parts. The immutable parts define the signature, and
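The mutable/immutable split can be made concrete with a deliberately simple sketch. This is not the paper's (unsupervised, learned) method: here log lines are grouped by token count, and a position is kept as immutable text only if every line in the group agrees on it, otherwise it becomes a wildcard. The toy log lines are invented for illustration.

```python
from collections import defaultdict

def extract_signatures(lines):
    """Group log lines by token count; wildcard the positions that vary."""
    clusters = defaultdict(list)
    for line in lines:
        tokens = line.split()
        clusters[len(tokens)].append(tokens)
    signatures = []
    for token_lists in clusters.values():
        signature = []
        for column in zip(*token_lists):          # column-wise comparison
            # immutable if all lines agree on this token, mutable otherwise
            signature.append(column[0] if len(set(column)) == 1 else "<*>")
        signatures.append(" ".join(signature))
    return sorted(signatures)

logs = [
    "accepted connection from 10.0.0.1 port 5050",
    "accepted connection from 10.0.0.7 port 6060",
    "session closed for user alice",
    "session closed for user bob",
]
for sig in extract_signatures(logs):
    print(sig)
```

Grouping purely by token count conflates distinct templates of equal length, which is one reason real signature extraction needs the more sophisticated approaches the paper studies.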

  20. Comparison of ISO-GUM and Monte Carlo Method for Evaluation of Measurement Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Young-Cheol; Her, Jae-Young; Lee, Seung-Jun; Lee, Kang-Jin [Korea Gas Corporation, Daegu (Korea, Republic of)

    2014-07-15

    To supplement the ISO-GUM method for the evaluation of measurement uncertainty, a simulation program using the Monte Carlo method (MCM) was developed, and the MCM and GUM methods were compared. The results are as follows: (1) Even under a non-normal probability distribution of the measurement, MCM provides an accurate coverage interval; (2) Even if a probability distribution that emerged from combining a few non-normal distributions appears normal, there are cases in which the actual distribution is not normal, and the non-normality can be determined from the probability distribution of the combined variance; and (3) If type-A standard uncertainties are involved in the evaluation of measurement uncertainty, GUM generally offers an undervalued coverage interval. However, this problem can be solved by the Bayesian evaluation of the type-A standard uncertainty. In this case, the effective degrees of freedom for the combined variance are not required in the evaluation of expanded uncertainty, and the appropriate coverage factor for a 95% level of confidence was determined to be 1.96.
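The GUM-versus-MCM contrast can be sketched on an illustrative measurement model (not the paper's). For y = x1/x2, GUM produces a symmetric interval ŷ ± 1.96·u_c from linearized sensitivity coefficients, while MCM reads the 95% coverage interval directly off the simulated output distribution, which remains valid even when y is visibly skewed. All input values and uncertainties below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
x1, u1 = 10.0, 0.5
x2, u2 = 2.0, 0.3          # 15 % relative uncertainty -> noticeably skewed y

# GUM: combined standard uncertainty from first-order sensitivity coefficients.
c1, c2 = 1.0 / x2, -x1 / x2**2
u_c = np.hypot(c1 * u1, c2 * u2)
y_hat = x1 / x2
gum_interval = (y_hat - 1.96 * u_c, y_hat + 1.96 * u_c)

# MCM: propagate the input distributions and take empirical percentiles.
y = rng.normal(x1, u1, 1_000_000) / rng.normal(x2, u2, 1_000_000)
mcm_interval = tuple(np.percentile(y, [2.5, 97.5]))

print(gum_interval, mcm_interval)
```

The MCM interval is asymmetric about the median (the quotient distribution has a heavier upper tail), which a symmetric GUM interval cannot represent; this is the paper's point (1).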

  1. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  2. Non-Traditional Systemic Treatments for Diabetic Retinopathy: An
Evidence-Based Review

    Science.gov (United States)

    Simó, Rafael; Ballarini, Stefania; Cunha-Vaz, José; Ji, Linong; Haller, Hermann; Zimmet, Paul; Wong, Tien Y.

    2015-01-01

    The rapid escalation in the global prevalence of diabetes, with more than 30% of patients afflicted with diabetic retinopathy (DR), means it is likely that associated vision-threatening conditions will also rise substantially. New therapeutic approaches therefore need to be found that go beyond the current standards of diabetic care and that are effective in the early stages of the disease. In recent decades several new pharmacological agents have been investigated for their effectiveness in preventing the appearance and progression of DR or in reversing DR; some with limited success while others appear promising. This up-to-date critical review of non-traditional systemic treatments for DR is based on the published evidence in MEDLINE spanning 1980-December 2014. It discusses a number of therapeutic options, paying particular attention to the mechanisms of action and the clinical evidence for the use of renin-angiotensin system blockade, fenofibrate and calcium dobesilate monohydrate in DR. PMID:25989912

  3. Application of Sensitivity and Uncertainty Analysis Methods to a Validation Study for Weapons-Grade Mixed-Oxide Fuel

    International Nuclear Information System (INIS)

    Dunn, M.E.

    2001-01-01

    At the Oak Ridge National Laboratory (ORNL), sensitivity and uncertainty (S/U) analysis methods and a Generalized Linear Least-Squares Methodology (GLLSM) have been developed to quantitatively determine the similarity or lack thereof between critical benchmark experiments and an application of interest. The S/U and GLLSM methods provide a mathematical approach, which is less judgment based relative to traditional validation procedures, to assess system similarity and estimate the calculational bias and uncertainty for an application of interest. The objective of this paper is to gain experience with the S/U and GLLSM methods by revisiting a criticality safety evaluation and associated traditional validation for the shipment of weapons-grade (WG) MOX fuel in the MO-1 transportation package. In the original validation, critical experiments were selected based on a qualitative assessment of the MO-1 and MOX contents relative to the available experiments. Subsequently, traditional trending analyses were used to estimate the Δk bias and associated uncertainty. In this paper, the S/U and GLLSM procedures are used to re-evaluate the suite of critical experiments associated with the original MO-1 evaluation. Using the S/U procedures developed at ORNL, critical experiments that are similar to the undamaged and damaged MO-1 package are identified based on sensitivity and uncertainty analyses of the criticals and the MO-1 package configurations. Based on the trending analyses developed for the S/U and GLLSM procedures, the Δk bias and uncertainty for the most reactive MO-1 package configurations are estimated and used to calculate an upper subcritical limit (USL) for the MO-1 evaluation. The calculated bias and uncertainty from the S/U and GLLSM analyses lead to a calculational USL that supports the original validation study for the MO-1

  4. "Too big to fail" or "Too non-traditional to fail"?: The determinants of banks' systemic importance

    OpenAIRE

    Moore, Kyle; Zhou, Chen

    2013-01-01

    This paper empirically analyzes the determinants of banks' systemic importance. In constructing a measure on the systemic importance of financial institutions we find that size is a leading determinant. This confirms the usual "Too big to fail'' argument. Nevertheless, banks with size above a sufficiently high level have equal systemic importance. In addition to size, we find that the extent to which banks engage in non-traditional banking activities is also positively related to ...

  5. Characterizing Sources of Uncertainty in Item Response Theory Scale Scores

    Science.gov (United States)

    Yang, Ji Seung; Hansen, Mark; Cai, Li

    2012-01-01

    Traditional estimators of item response theory scale scores ignore uncertainty carried over from the item calibration process, which can lead to incorrect estimates of the standard errors of measurement (SEMs). Here, the authors review a variety of approaches that have been applied to this problem and compare them on the basis of their statistical…

  6. 7 CFR 718.9 - Signature requirements.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Signature requirements. 718.9 Section 718.9... MULTIPLE PROGRAMS General Provisions § 718.9 Signature requirements. (a) When a program authorized by this chapter or Chapter XIV of this title requires the signature of a producer; landowner; landlord; or tenant...

  7. 27 CFR 17.6 - Signature authority.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Signature authority. 17.6... PRODUCTS General Provisions § 17.6 Signature authority. No claim, bond, tax return, or other required... other proper notification of signature authority has been filed with the TTB office where the required...

  8. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized only by percentage uncertainties or variances. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6Li(n,t) TEST1 data, together with the correlation coefficients, are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the 7Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6Li(n,t) reaction and for the 235U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the 6Li(n,t), 6Li(n,n) and 6Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and the EDA and RAC R-matrix results, respectively. Uncertainties of absolute and 252Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235U(n,f) cross sections in the neutron energy range 1

  9. A comparative study of traditional lecture methods and interactive lecture methods in introductory geology courses for non-science majors at the college level

    Science.gov (United States)

    Hundley, Stacey A.

    In recent years there has been a national call for reform in undergraduate science education. The goal of this reform movement in science education is to develop ways to improve undergraduate student learning with an emphasis on developing more effective teaching practices. Introductory science courses at the college level are generally taught using a traditional lecture format. Recent studies have shown incorporating active learning strategies within the traditional lecture classroom has positive effects on student outcomes. This study focuses on incorporating interactive teaching methods into the traditional lecture classroom to enhance student learning for non-science majors enrolled in introductory geology courses at a private university. Students' experience and instructional preferences regarding introductory geology courses were identified from survey data analysis. The information gained from responses to the questionnaire was utilized to develop an interactive lecture introductory geology course for non-science majors. Student outcomes were examined in introductory geology courses based on two teaching methods: interactive lecture and traditional lecture. There were no significant statistical differences between the groups based on the student outcomes and teaching methods. Incorporating interactive lecture methods did not statistically improve student outcomes when compared to traditional lecture teaching methods. However, the responses to the survey revealed students have a preference for introductory geology courses taught with lecture and instructor-led discussions and students prefer to work independently or in small groups. The results of this study are useful to individuals who teach introductory geology courses and individuals who teach introductory science courses for non-science majors at the college level.

  10. Online Signature Verification on MOBISIG Finger-Drawn Signature Corpus

    Directory of Open Access Journals (Sweden)

    Margit Antal

    2018-01-01

    Full Text Available We present MOBISIG, a pseudosignature dataset containing finger-drawn signatures from 83 users captured with a capacitive touchscreen-based mobile device. The database was captured in three sessions resulting in 45 genuine signatures and 20 skilled forgeries for each user. The database was evaluated by two state-of-the-art methods: a function-based system using local features and a feature-based system using global features. Two types of equal error rate computations are performed: one using a global threshold and the other using user-specific thresholds. The lowest equal error rate was 0.01% against random forgeries and 5.81% against skilled forgeries using user-specific thresholds that were computed a posteriori. However, these equal error rates were significantly raised to 1.68% (random forgeries) and 14.31% (skilled forgeries) using global thresholds. The same evaluation protocol was performed on the DooDB publicly available dataset. Besides verification performance evaluations conducted on the two finger-drawn datasets, we evaluated the quality of the samples and the users of the two datasets using basic quality measures. The results show that finger-drawn signatures can be used by biometric systems with reasonable accuracy.
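The two equal-error-rate protocols compared above can be sketched with synthetic match scores (this is not the MOBISIG evaluation code; all score distributions and the per-user offsets are invented). A single global threshold must serve every user, so user-to-user score shifts widen the pooled genuine and forgery distributions; per-user thresholds absorb those shifts and typically yield a lower EER, as the paper reports.

```python
import numpy as np

def eer(genuine, impostor):
    """Equal error rate: sweep a threshold, return the rate where FAR ~= FRR."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
    frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
    i = np.argmin(np.abs(far - frr))
    return (far[i] + frr[i]) / 2.0

rng = np.random.default_rng(3)
users = []
for offset in rng.uniform(-1.0, 1.0, 20):          # per-user score shift
    users.append((rng.normal(2.0 + offset, 0.7, 45),   # genuine match scores
                  rng.normal(0.0 + offset, 0.7, 20)))  # skilled-forgery scores

global_eer = eer(np.concatenate([g for g, _ in users]),
                 np.concatenate([f for _, f in users]))
user_eer = np.mean([eer(g, f) for g, f in users])
print(global_eer, user_eer)
```

With the invented per-user offsets, the mean of the user-specific EERs comes out below the global-threshold EER, mirroring the 5.81% vs 14.31% gap in the record above.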

  11. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. This paper is applicable to low-level radioactive waste disposal system performance assessment.
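The derivative-based idea can be sketched without the automated calculus compilers (GRESS/ADGEN) themselves. In this hedged illustration, a toy function stands in for an expensive simulation code, sensitivities dy/dx_i are estimated by central differences (a stand-in for the automated differentiation the paper describes), and the parameter distributions are then propagated through the first-order Taylor expansion: only 2p model runs are needed, versus thousands for direct Monte Carlo. The model, nominal values, and covariance are all invented.

```python
import numpy as np

def model(x):                    # stand-in for an expensive simulation code
    return x[0] ** 2 * np.exp(-x[1]) + 3.0 * x[2]

x0 = np.array([1.5, 0.8, 2.0])   # nominal parameter values (illustrative)
cov = np.diag([0.05, 0.02, 0.1]) ** 2

# Central-difference sensitivities: 2 model runs per parameter.
h = 1e-5
sens = np.array([
    (model(x0 + h * e) - model(x0 - h * e)) / (2 * h)
    for e in np.eye(len(x0))
])

# First-order propagated variance, and a cheap surrogate output distribution.
var_y = sens @ cov @ sens
rng = np.random.default_rng(4)
xs = rng.multivariate_normal(x0, cov, 50_000)
y_taylor = model(x0) + (xs - x0) @ sens       # derivative-based distribution
y_direct = np.array([model(x) for x in xs])   # brute-force reference
print(np.sqrt(var_y), y_taylor.std(), y_direct.std())
```

For the small parameter uncertainties assumed here, the derivative-based spread matches the brute-force Monte Carlo spread closely; the trade-off is that strong nonlinearity would require higher-order terms.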

  12. 48 CFR 804.101 - Contracting officer's signature.

    Science.gov (United States)

    2010-10-01

    ... signature. 804.101 Section 804.101 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL ADMINISTRATIVE MATTERS Contract Execution 804.101 Contracting officer's signature. (a) If a... signature. ...

  13. Data set on the bioprecipitation of sulfate and trivalent arsenic by acidophilic non-traditional sulfur reducing bacteria

    Directory of Open Access Journals (Sweden)

    Letícia Paiva de Matos

    2018-04-01

    Full Text Available The data presented here are related to the original paper "Simultaneous removal of sulfate and arsenic using immobilized non-traditional sulfate reducing bacteria (SRB) mixed culture and alternative low-cost carbon sources" published by the same authors (Matos et al., 2018) [1]. The data set presented here aims to facilitate the comprehension of that paper by giving readers some additional information. The data set includes a brief description of the experimental conditions and the results obtained during both batch and semi-continuous reactor experiments. The data confirmed that arsenic and sulfate were simultaneously removed under acidic pH by using a biological treatment based on the activity of a non-traditional sulfate-reducing bacterial consortium. This microbial consortium was able to utilize glycerol and powdered chicken feathers as carbon donors, and proved to be resistant to arsenite up to 8.0 mg L−1. Data related to sulfate and arsenic removal efficiencies, residual arsenite and sulfate contents, and pH and Eh measurements obtained under different experimental conditions are depicted in graphical format. Refers to https://doi.org/10.1016/j.cej.2017.11.035 Keywords: Arsenite, Sulfate reduction, Bioremediation, Immobilized cells, Acid pH

  14. Investment Decisions with Two-Factor Uncertainty

    NARCIS (Netherlands)

    Compernolle, T.; Huisman, Kuno; Kort, Peter; Lavrutich, Maria; Nunes, Claudia; Thijssen, J.J.J.

    2018-01-01

    This paper considers investment problems in real options with non-homogeneous two-factor uncertainty. It shows that, despite claims made in the literature, the method used to derive an analytical solution in one dimensional problems cannot be straightforwardly extended to problems with two

  15. Approaches to handling uncertainty when setting environmental exposure standards

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    2009-01-01

    attempts for the first time to cover the full range of issues related to model uncertainties, from the subjectivity of setting up a conceptual model of a given system, all the way to communicating the nature of model uncertainties to non-scientists and accounting for model uncertainties in policy decisions... Theoretical chapters, providing background information on specific steps in the modelling process and in the adoption of models by end-users, are complemented by illustrative case studies dealing with soils and global climate change. All the chapters are authored by recognized experts in their respective...

  16. A non-linear and stochastic response surface method for Bayesian estimation of uncertainty in soil moisture simulation from a land surface model

    Directory of Open Access Journals (Sweden)

    F. Hossain

    2004-01-01

    Full Text Available This study presents a simple and efficient scheme for Bayesian estimation of uncertainty in soil moisture simulation by a Land Surface Model (LSM). The scheme is assessed within a Monte Carlo (MC) simulation framework based on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. A primary limitation of using the GLUE method is the prohibitive computational burden imposed by uniform random sampling of the model's parameter distributions. Sampling is improved in the proposed scheme by stochastic modeling of the parameters' response surface that recognizes the non-linear deterministic behavior between soil moisture and land surface parameters. Uncertainty in soil moisture simulation (model output) is approximated through a Hermite polynomial chaos expansion of normal random variables that represent the model's parameter (model input) uncertainty. The unknown coefficients of the polynomial are calculated using a limited number of model simulation runs. The calibrated polynomial is then used as a fast-running proxy to the slower-running LSM to predict the degree of representativeness of a randomly sampled model parameter set. An evaluation of the scheme's efficiency in sampling is made through comparison with fully random MC sampling (the norm for GLUE) and the nearest-neighborhood sampling technique. The scheme was able to reduce the computational burden of random MC sampling for GLUE by 10%-70%. The scheme was also found to be about 10% more efficient than the nearest-neighborhood sampling method in predicting a sampled parameter set's degree of representativeness. The GLUE based on the proposed sampling scheme did not alter the essential features of the uncertainty structure in soil moisture simulation. The scheme can potentially make GLUE uncertainty estimation for any LSM more efficient, as it does not impose any additional structural or distributional assumptions.
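The surrogate idea can be sketched in one dimension. This is a hedged illustration, not the paper's LSM or GLUE loop: the model output is expanded in probabilists' Hermite polynomials of a standard normal variable ξ that encodes the uncertain parameter (θ = μ + σξ), the coefficients are fitted from a handful of model runs, and the cheap polynomial then stands in for the slow model when screening random parameter draws. The toy model and prior are invented.

```python
import numpy as np
from numpy.polynomial import hermite_e as He

def slow_model(theta):           # stand-in for a slow land-surface model run
    return np.tanh(theta) + 0.1 * theta ** 2

mu, sigma = 0.5, 0.3             # parameter prior: theta = mu + sigma * xi

# Calibrate the chaos coefficients from just 9 model runs.
xi_train = np.linspace(-3, 3, 9)
coeffs = He.hermefit(xi_train, slow_model(mu + sigma * xi_train), deg=5)

# Accuracy of the proxy across the calibrated parameter range...
xi_grid = np.linspace(-3, 3, 601)
err = np.max(np.abs(He.hermeval(xi_grid, coeffs)
                    - slow_model(mu + sigma * xi_grid)))

# ...and its use as a fast stand-in when screening random parameter draws
# (clipped to the calibrated range, where the polynomial is trustworthy).
rng = np.random.default_rng(5)
xi = rng.standard_normal(1000).clip(-3, 3)
proxy = He.hermeval(xi, coeffs)
print(err)
```

A low-degree expansion reproduces a smooth model almost exactly from a few runs, which is what lets the scheme prune unrepresentative parameter sets before spending a full model run on them.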

  17. Perancangan Aplikasi Undeniable Digital Signature Dengan Algoritma Chaum’s Blind Signature

    OpenAIRE

    Simanjuntak, Martin Dennain

    2012-01-01

    There is a desperate need for a security system in the exchange of information via computer media, so that information cannot be accessed by unauthorized parties. One such security system uses digital signatures as a means of authenticating the authenticity of the digital documents that are exchanged. By using an undeniable digital signature system, a secure digital document exchange can be achieved in which the signature is protected against repudiation...

  18. Practical quantum digital signature

    Science.gov (United States)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

    Guaranteeing nonrepudiation, unforgeability as well as transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols are impractical due to various challenging problems and, most importantly, the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km with currently mature technology as used in quantum key distribution.

  19. Hyperheat: a thermal signature model for super- and hypersonic missiles

    Science.gov (United States)

    van Binsbergen, S. A.; van Zelderen, B.; Veraar, R. G.; Bouquet, F.; Halswijk, W. H. C.; Schleijpen, H. M. A.

    2017-10-01

    In performance prediction of IR sensor systems for missile detection, target signatures are essential variables alongside the sensor specifications. For velocities up to Mach 2-2.5, a simple model based on the aerodynamic heating of a perfect gas has often been used to calculate the temperatures of missile targets. This typically overestimates the target temperature, with correspondingly large infrared signatures and detection ranges; at even higher velocities, this approach is no longer accurate at all. Alternatives like CFD calculations typically require more complex sets of inputs and significantly more computing power. The MATLAB code Hyperheat was developed to calculate the time-resolved skin temperature of axisymmetric high-speed missiles during flight, taking into account non-perfect gas behaviour and proper heat transfer to the missile surface. Allowing for variations in parameters like missile shape, altitude, atmospheric profile, angle of attack, flight duration and super- and hypersonic velocities up to Mach 30 enables more accurate calculation of the actual target temperature. The model calculates a map of the skin temperature of the missile, which is updated over the flight time of the missile. The sets of skin temperature maps are calculated within minutes, even for >100 km trajectories, and can easily be converted into thermal infrared signatures for further processing. This paper discusses the approach taken in Hyperheat. Then, the thermal signature of a set of typical missile threats is calculated using both the simple aerodynamic heating model and the Hyperheat code. The respective infrared signatures are compared, as well as the difference in the corresponding calculated detection ranges.
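    The "simple model based on the aerodynamic heating of a perfect gas" contrasted above is, in essence, the adiabatic-wall (recovery) temperature relation. A minimal sketch with illustrative constants, not the Hyperheat code:

```python
def recovery_temperature(t_inf_k, mach, gamma=1.4, recovery_factor=0.9):
    # Perfect-gas adiabatic-wall temperature:
    #   T_aw = T_inf * (1 + r * (gamma - 1) / 2 * M^2)
    # with r ~ 0.9 a typical turbulent recovery factor.
    return t_inf_k * (1.0 + recovery_factor * (gamma - 1.0) / 2.0 * mach ** 2)

# At Mach 2 in a 220 K atmosphere the skin approaches about 378 K, but the
# same formula at Mach 6 gives about 1646 K -- a regime where non-perfect-gas
# effects and surface heat transfer make the simple model unreliable.
```

    The quadratic Mach dependence is why the perfect-gas shortcut, tolerable up to Mach 2-2.5, diverges badly at hypersonic speeds.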

  20. 25 CFR 213.10 - Lessor's signature.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Lessor's signature. 213.10 Section 213.10 Indians BUREAU... MEMBERS OF FIVE CIVILIZED TRIBES, OKLAHOMA, FOR MINING How to Acquire Leases § 213.10 Lessor's signature... thumbprint which shall be designated as “right” or “left” thumbmark. Such signatures must be witnessed by two...

  1. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (the response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough estimates of errors due to SED uncertainties, based on integral SED sensitivities.
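    The way sensitivity profiles feed an uncertainty estimate can be illustrated with the standard first-order ("sandwich") propagation rule. This toy example with two invented SED parameters is not from the paper:

```python
def response_uncertainty(sensitivities, covariance):
    # "Sandwich rule": relative standard deviation of the response R given
    # relative sensitivity coefficients S_i = (dR/R)/(dx_i/x_i) and the
    # relative covariance matrix of the input data (here, SED parameters).
    n = len(sensitivities)
    var = 0.0
    for i in range(n):
        for j in range(n):
            var += sensitivities[i] * covariance[i][j] * sensitivities[j]
    return var ** 0.5

# Two hypothetical SED parameters with 10% and 20% relative standard
# deviation, assumed uncorrelated:
S = [0.8, -0.3]
C = [[0.10 ** 2, 0.0],
     [0.0, 0.20 ** 2]]
```

    With these numbers the response carries a 10% relative uncertainty; a full SED analysis simply extends `S` and `C` to the many table positions describing the secondary spectra.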

  2. Initial Semantics for Strengthened Signatures

    Directory of Open Access Journals (Sweden)

    André Hirschowitz

    2012-02-01

    We give a new general definition of arity, yielding the companion notions of signature and associated syntax. This setting is modular in the sense requested by Ghani and Uustalu: merging two extensions of syntax corresponds to building an amalgamated sum. These signatures are too general in the sense that we are not able to prove the existence of an associated syntax in this general context. So we have to select arities and signatures for which the desired initial monad exists. For this, we follow a track opened by Matthes and Uustalu: we introduce a notion of strengthened arity and prove that the corresponding signatures have initial semantics (i.e. associated syntax). Our strengthened arities admit colimits, which allows the treatment of the λ-calculus with explicit substitution.

  3. Spectral signature selection for mapping unvegetated soils

    Science.gov (United States)

    May, G. A.; Petersen, G. W.

    1975-01-01

    Airborne multispectral scanner data covering the wavelength interval from 0.40-2.60 microns were collected at an altitude of 1000 m above the terrain in southeastern Pennsylvania. Uniform training areas were selected within three sites from this flightline. Soil samples were collected from each site, and a procedure was developed to assign a scan line and element number from the multispectral scanner data to each sampling location. These soil samples were analyzed on a spectrophotometer and laboratory spectral signatures were derived. After correcting for solar radiation and atmospheric attenuation, the laboratory signatures were compared to the spectral signatures derived from these same soils using the multispectral scanner data. Both kinds of signatures were used in supervised and unsupervised classification routines. Computer-generated maps using the laboratory-derived and scanner-derived signatures were similar to maps resulting from field surveys, and approximately 90% agreement was obtained between the classification maps produced using the two kinds of signatures.

  4. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference, and test signatures comprising 5 signatures each from the original user, simple imposters and trained imposters. The final system was tested with 50 participants with 3 references. Without imposters, system accuracy is 90.44897959% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35102%; with imposters, system accuracy is 80.1361% at threshold 27, with a rejection error rate (FNMR) of 15.6% and an average acceptance error rate (FMR) of 4.263946%, with details as follows: acceptance error 0.391837%, acceptance error for simple imposters 3.2%, and acceptance error for trained imposters 9.2%.
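    The dynamic time warping match used above can be sketched in a few lines. The sequences here are invented one-dimensional feature traces, not the paper's data:

```python
def dtw_distance(a, b):
    # Dynamic time warping distance between two 1-D sequences: the minimum
    # accumulated |a_i - b_j| cost over all monotone alignments, so signatures
    # written at slightly different speeds still match.
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

reference = [0.0, 1.0, 2.0, 1.0, 0.0]
genuine = [0.0, 0.9, 2.1, 1.0, 0.1]   # similar pen dynamics
forgery = [2.0, 2.0, 0.0, 0.0, 2.0]   # different dynamics
# Verification accepts when the warped distance to the claimed reference
# falls below a tuned threshold (the "threshold 44"/"threshold 27" above).
```

    In a real verifier each signature would be a multi-dimensional time series (x, y, pressure, ...), but the recursion is the same per feature channel.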

  5. Quantification of Uncertainty in Predicting Building Energy Consumption

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    2012-01-01

    Traditional building energy consumption calculation methods are characterised by rough approaches providing approximate figures with high and unknown levels of uncertainty. Lack of reliable energy resources and increasing concerns about climate change call for improved predictive tools. A new...... approach for the prediction of building energy consumption is presented. The approach quantifies the uncertainty of building energy consumption by means of stochastic differential equations. The approach is applied to a general heat balance for an arbitrary number of loads and zones in a building...... for the dynamic thermal behaviour of buildings. However, for air flow and energy consumption it is found to be much more significant due to less “damping”. Probabilistic methods establish a new approach to the prediction of building energy consumption, enabling designers to include stochastic parameters like...
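    A stochastic-differential-equation heat balance of the kind described can be sketched with an Euler-Maruyama simulation of a single zone whose temperature relaxes toward a set point under random load fluctuations. The parameters and the mean-reverting form are illustrative assumptions, not the paper's model:

```python
import random

def euler_maruyama(t_end, dt, theta, mu, sigma, x0, rng):
    # Integrate dX = theta * (mu - X) dt + sigma dW: a mean-reverting state
    # (e.g. a zone temperature in degrees C) with stochastic forcing dW.
    x = x0
    for _ in range(int(t_end / dt)):
        x += theta * (mu - x) * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
    return x

rng = random.Random(42)
# Instead of one deterministic figure, the spread of end states across an
# ensemble of realizations quantifies the prediction uncertainty.
ensemble = [euler_maruyama(10.0, 0.01, 1.0, 20.0, 0.5, 15.0, rng) for _ in range(200)]
```

    The "damping" the abstract mentions shows up here as the mean-reversion rate `theta`: strongly damped thermal states keep the stochastic spread small, while weakly damped quantities such as air flows inherit much more of the input variance.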

  6. Genome signature analysis of thermal virus metagenomes reveals Archaea and thermophilic signatures.

    Science.gov (United States)

    Pride, David T; Schoenfeld, Thomas

    2008-09-17

    Metagenomic analysis provides a rich source of biological information for otherwise intractable viral communities. However, study of viral metagenomes has been hampered by its nearly complete reliance on BLAST algorithms for identification of DNA sequences. We sought to develop algorithms for examination of viral metagenomes to identify the origin of sequences independent of BLAST algorithms. We chose viral metagenomes obtained from two hot springs, Bear Paw and Octopus, in Yellowstone National Park, as they represent simple microbial populations where comparatively large contigs were obtained. Thermal spring metagenomes have high proportions of sequences without significant GenBank homology, which has hampered identification of viruses and their linkage with hosts. To analyze each metagenome, we developed a method to classify DNA fragments using genome signature-based phylogenetic classification (GSPC), where metagenomic fragments are compared to a database of oligonucleotide signatures for all previously sequenced Bacteria, Archaea, and viruses. From both Bear Paw and Octopus hot springs, each assembled contig had more similarity to other metagenome contigs than to any sequenced microbial genome based on GSPC analysis, suggesting a genome signature common to each of these extreme environments. While viral metagenomes from Bear Paw and Octopus share some similarity, the genome signatures from each locale are largely unique. GSPC using a microbial database predicts that most of the Octopus metagenome has archaeal signatures, while bacterial signatures predominate in Bear Paw, a finding consistent with that of GenBank BLAST. When using a viral database, the majority of the Octopus metagenome is predicted to belong to archaeal virus Families Globuloviridae and Fuselloviridae, while none of the Bear Paw metagenome is predicted to belong to archaeal viruses.
As expected, when microbial and viral databases are combined, each of the Octopus and Bear Paw metagenomic contigs
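    The genome-signature classification idea can be illustrated by comparing normalized k-mer frequency vectors. This sketch uses dinucleotides (k=2) and toy sequences rather than the authors' oligonucleotide signature database:

```python
from itertools import product

def oligo_signature(seq, k=2):
    # Normalized k-mer frequency vector -- a simple "genome signature".
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in counts:
            counts[km] += 1
    total = max(1, sum(counts.values()))
    return [counts[km] / total for km in kmers]

def classify(fragment, reference_signatures):
    # Assign a metagenomic fragment to the reference whose signature is
    # nearest in Euclidean distance -- no BLAST alignment required.
    sig = oligo_signature(fragment)
    def dist(name):
        ref = reference_signatures[name]
        return sum((a - b) ** 2 for a, b in zip(sig, ref)) ** 0.5
    return min(reference_signatures, key=dist)

# Toy reference "genomes" with strongly contrasting composition:
references = {
    "AT_rich_archaeon": oligo_signature("ATAT" * 20),
    "GC_rich_bacterium": oligo_signature("GCGC" * 20),
}
```

    Real GSPC-style methods use longer oligonucleotides and thousands of reference signatures, but the assignment step is the same nearest-signature comparison.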

  7. Signatures de l'invisible

    CERN Multimedia

    CERN Press Office. Geneva

    2000-01-01

    "Signatures of the Invisible" is a unique collaboration between contemporary artists and contemporary physicists which has the potential to help redefine the relationship between science and art. "Signatures of the Invisible" is jointly organised by the London Institute - the world's largest college of art and design - and CERN*, the world's leading particle physics laboratory. 12 leading visual artists:

  8. Negative branes, supergroups and the signature of spacetime

    Science.gov (United States)

    Dijkgraaf, Robbert; Heidenreich, Ben; Jefferson, Patrick; Vafa, Cumrun

    2018-02-01

    We study the realization of supergroup gauge theories using negative branes in string theory. We show that negative branes are intimately connected with the possibility of timelike compactification and exotic spacetime signatures previously studied by Hull. Isolated negative branes dynamically generate a change in spacetime signature near their worldvolumes, and are related by string dualities to a smooth M-theory geometry with closed timelike curves. Using negative D3-branes, we show that SU(0|N) supergroup theories are holographically dual to an exotic variant of type IIB string theory on dS_{3,2} × S̄^5, for which the emergent dimensions are timelike. Using branes, mirror symmetry and Nekrasov's instanton calculus, all of which agree, we derive the Seiberg-Witten curve for N=2 SU(N|M) gauge theories. Together with our exploration of holography and string dualities for negative branes, this suggests that supergroup gauge theories may be non-perturbatively well-defined objects, though several puzzles remain.

  9. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics
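    The grid-based error bars referred to in objective two rest on comparing solutions from systematically refined grids. A two-grid Grid Convergence Index sketch (Roache's GCI) with invented airflow speeds follows; the cited methodology uses three grids, but the two-grid form shows the idea:

```python
def grid_convergence_index(f_coarse, f_fine, refinement_ratio, order=2.0, fs=1.25):
    # Roache's Grid Convergence Index: a relative error band on the
    # fine-grid value, from two solutions and the grid refinement ratio.
    rel_diff = abs((f_coarse - f_fine) / f_fine)
    return fs * rel_diff / (refinement_ratio ** order - 1.0)

# Hypothetical airflow speeds (m/s) from a coarse and a fine grid:
gci = grid_convergence_index(10.8, 10.2, refinement_ratio=2.0)
# gci is a relative error band (about 2.5% here) to quote around the
# fine-grid speed when no test data are available.
```

    With a third grid one can also estimate the observed order of convergence instead of assuming `order=2.0`, which is how the three-grid procedure tightens the error bars.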

  11. Understanding Climate Uncertainty with an Ocean Focus

    Science.gov (United States)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation, and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in

  12. New Paradigms for the Study of Ocular Alphaherpesvirus Infections: Insights into the Use of Non-Traditional Host Model Systems

    Directory of Open Access Journals (Sweden)

    Matthew R. Pennington

    2017-11-01

    Ocular herpesviruses, most notably human alphaherpesvirus 1 (HSV-1), canid alphaherpesvirus 1 (CHV-1) and felid alphaherpesvirus 1 (FHV-1), infect and cause severe disease that may lead to blindness. CHV-1 and FHV-1 have a pathogenesis and induce clinical disease in their hosts that is similar to HSV-1 ocular infections in humans, suggesting that infection of dogs and cats with CHV-1 and FHV-1, respectively, can be used as a comparative natural host model of herpesvirus-induced ocular disease. In this review, we discuss both strengths and limitations of the various available model systems to study ocular herpesvirus infection, with a focus on the use of these non-traditional virus-natural host models. Recent work has demonstrated the robustness and reproducibility of experimental ocular herpesvirus infections in dogs and cats, and, therefore, these non-traditional models can provide additional insights into the pathogenesis of ocular herpesvirus infections.

  13. Improvement of engineering soil properties using non-traditional additives

    Directory of Open Access Journals (Sweden)

    Waheed Mohanned

    2018-01-01

    Laboratory experiments were conducted to evaluate the effect of some non-traditional additives on the engineering properties of a clayey soil, which shows problematic behaviour when used as a construction material. The tests covered the influence of these additives on parameters such as consistency limits, compaction characteristics and CBR value. Two non-traditional stabilizers were selected in this study, polymers and phosphoric acid, at three different percentages (1%, 3% and 5%) of the dry soil weight. It is concluded that addition of the polymer to the clayey soil results in a slight increase in plastic limit while the liquid limit is not affected, accompanied by a marginal decrease in the dry unit weight while the optimum moisture content remains unaffected. The addition of phosphoric acid to the clayey soil has no effect on its Atterberg limits. In general, the polymer was found to be ineffective as a stabilizer for improving clayey soils, especially in small amounts of about 3%. The phosphoric acid treated soil gained better improvement for all amounts of additive used. For the 3% acid-treated soil the CBR is about 360% of that of untreated soil; it can therefore be concluded that improvement using phosphoric acid in clay soils is a promising option and can be applied to solve geotechnical stabilization problems.

  14. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio

    residuals distribution. If residuals are not normally distributed, the uncertainty is over-estimated if the Box-Cox transformation is not applied or non-calibrated parameters are used.

  15. A signature-based method for indexing cell cycle phase distribution from microarray profiles

    Directory of Open Access Journals (Sweden)

    Mizuno Hideaki

    2009-03-01

    Abstract Background The cell cycle machinery interprets oncogenic signals and reflects the biology of cancers. To date, various methods for cell cycle phase estimation such as mitotic index, S phase fraction, and immunohistochemistry have provided valuable information on cancers (e.g. proliferation rate). However, those methods rely on one or a few measurements and the scope of the information is limited. There is a need for more systematic cell cycle analysis methods. Results We developed a signature-based method for indexing cell cycle phase distribution from microarray profiles under consideration of cycling and non-cycling cells. A cell cycle signature masterset, composed of genes which express preferentially in cycling cells and in a cell cycle-regulated manner, was created to index the proportion of cycling cells in the sample. Cell cycle signature subsets, composed of genes whose expressions peak at specific stages of the cell cycle, were also created to index the proportion of cells in the corresponding stages. The method was validated using cell cycle datasets and quiescence-induced cell datasets. Analyses of a mouse tumor model dataset and human breast cancer datasets revealed variations in the proportion of cycling cells. When the influence of non-cycling cells was taken into account, "buried" cell cycle phase distributions were depicted that were oncogenic-event specific in the mouse tumor model dataset and were associated with patients' prognosis in the human breast cancer datasets. Conclusion The signature-based cell cycle analysis method presented in this report would potentially be of value for cancer characterization and diagnostics.
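    At its core, indexing a phase from a profile reduces to scoring each signature gene set against the sample's expression values. A minimal sketch with hypothetical values and gene sets (the actual masterset and phase subsets are defined in the paper):

```python
def signature_score(expression, signature_genes):
    # Index a sample by the mean expression of a signature gene set --
    # a simplified stand-in for signature-based phase indexing.
    vals = [expression[g] for g in signature_genes if g in expression]
    return sum(vals) / len(vals) if vals else 0.0

# Hypothetical log-expression values for one sample; the gene sets below
# are illustrative, not the paper's masterset or subsets.
sample = {"CCNE1": 2.0, "CCNA2": 0.5, "CCNB1": 0.4, "GAPDH": 1.0}
g1_s_genes = ["CCNE1"]
g2_m_genes = ["CCNA2", "CCNB1"]
# A higher G1/S score than G2/M score suggests cells enriched around G1/S;
# the masterset score would additionally correct for non-cycling cells.
```

    The paper's contribution is precisely that correction: normalizing the phase-subset scores by the proportion of cycling cells so that "buried" distributions become visible.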

  16. A genomic biomarker signature can predict skin sensitizers using a cell-based in vitro alternative to animal tests

    Directory of Open Access Journals (Sweden)

    Albrekt Ann-Sofie

    2011-08-01

    Abstract Background Allergic contact dermatitis is an inflammatory skin disease that affects a significant proportion of the population. This disease is caused by an adverse immune response towards chemical haptens, and leads to a substantial economic burden for society. Current tests of sensitizing chemicals rely on animal experimentation. New legislation on the registration and use of chemicals within pharmaceutical and cosmetic industries has stimulated significant research efforts to develop alternative, human cell-based assays for the prediction of sensitization. The aim is to replace animal experiments with in vitro tests displaying a higher predictive power. Results We have developed a novel cell-based assay for the prediction of sensitizing chemicals. By analyzing the transcriptome of the human cell line MUTZ-3 after 24 h stimulation, using 20 different sensitizing chemicals, 20 non-sensitizing chemicals and vehicle controls, we have identified a biomarker signature of 200 genes with potent discriminatory ability. Using a Support Vector Machine for supervised classification, the prediction performance of the assay revealed an area under the ROC curve of 0.98. In addition, categorizing the chemicals according to the LLNA assay, this gene signature could also predict sensitizing potency. The identified markers are involved in biological pathways with immunologically relevant functions, which can shed light on the process of human sensitization. Conclusions A gene signature predicting sensitization, using a human cell line in vitro, has been identified. This simple and robust cell-based assay has the potential to completely replace or drastically reduce the utilization of test systems based on experimental animals. Being based on human biology, the assay is proposed to be more accurate for predicting sensitization in humans, than the traditional animal-based tests.
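    The reported prediction performance (area under the ROC curve of 0.98) can be computed from classifier scores with the rank-based (Mann-Whitney) identity. The scores and labels below are invented for illustration:

```python
def roc_auc(scores, labels):
    # Area under the ROC curve: the probability that a randomly chosen
    # positive (sensitizer) is scored above a randomly chosen negative,
    # counting ties as half.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical SVM decision scores and true labels (1 = sensitizer):
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
```

    An AUC of 1.0 means the signature ranks every sensitizer above every non-sensitizer; the toy data above yield 8/9, well below the 0.98 reported for the 200-gene signature.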

  17. A genomic biomarker signature can predict skin sensitizers using a cell-based in vitro alternative to animal tests

    Science.gov (United States)

    2011-01-01

    Background Allergic contact dermatitis is an inflammatory skin disease that affects a significant proportion of the population. This disease is caused by an adverse immune response towards chemical haptens, and leads to a substantial economic burden for society. Current tests of sensitizing chemicals rely on animal experimentation. New legislation on the registration and use of chemicals within pharmaceutical and cosmetic industries has stimulated significant research efforts to develop alternative, human cell-based assays for the prediction of sensitization. The aim is to replace animal experiments with in vitro tests displaying a higher predictive power. Results We have developed a novel cell-based assay for the prediction of sensitizing chemicals. By analyzing the transcriptome of the human cell line MUTZ-3 after 24 h stimulation, using 20 different sensitizing chemicals, 20 non-sensitizing chemicals and vehicle controls, we have identified a biomarker signature of 200 genes with potent discriminatory ability. Using a Support Vector Machine for supervised classification, the prediction performance of the assay revealed an area under the ROC curve of 0.98. In addition, categorizing the chemicals according to the LLNA assay, this gene signature could also predict sensitizing potency. The identified markers are involved in biological pathways with immunologically relevant functions, which can shed light on the process of human sensitization. Conclusions A gene signature predicting sensitization, using a human cell line in vitro, has been identified. This simple and robust cell-based assay has the potential to completely replace or drastically reduce the utilization of test systems based on experimental animals. Being based on human biology, the assay is proposed to be more accurate for predicting sensitization in humans, than the traditional animal-based tests. PMID:21824406

  18. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations play an influential role in global economic policies for developed as well as emerging countries. I investigate the role of international oil prices, disaggregated into structural (i) oil supply shocks, (ii) aggregate demand shocks and (iii) oil-market-specific demand shocks based on the work of Kilian (2009), using a structural VAR framework, on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime-switching framework with the disaggregated structural oil shocks. Our results highlight that Indian, Spanish and Japanese economic policy uncertainty responds to global oil price shocks, whereas aggregate demand shocks fail to induce any change. Oil-specific demand shocks are significant only for China and India in the high-volatility state.

  19. Maximizing biomarker discovery by minimizing gene signatures

    Directory of Open Access Journals (Sweden)

    Chang Chang

    2011-12-01

    Full Text Available Abstract Background The use of gene signatures can potentially be of considerable value in the field of clinical diagnosis. However, gene signatures defined with different methods can vary considerably even when applied to the same disease and the same endpoint. Previous studies have shown that the correct selection of subsets of genes from microarray data is key for the accurate classification of disease phenotypes, and a number of methods have been proposed for this purpose. However, these methods refine the subsets by considering each feature individually, and they do not confirm the association between the genes identified in each gene signature and the phenotype of the disease. We propose an innovative method termed Minimize Feature's Size (MFS), based on multiple-level similarity analyses and the association between genes and disease, for breast cancer endpoints, comparing classifier models generated in the second phase of the MicroArray Quality Control project (MAQC-II) and aiming to develop effective meta-analysis strategies that transform the MAQC-II signatures into a robust and reliable set of biomarkers for clinical applications. Results We analyzed the similarity of the multiple gene signatures within an endpoint and between the two breast cancer endpoints at the probe and gene levels. The results indicate that disease-related genes are preferentially selected as components of a gene signature, and that the gene signatures for the two endpoints could be interchangeable. Minimized signatures were then built at the probe level using MFS for each endpoint. By applying this approach, we generated much smaller gene signatures with predictive power similar to that of the gene signatures from MAQC-II. Conclusions Our results indicate that gene signatures of both large and small sizes can perform equally well in clinical applications.
In addition, consistency and biological significance can be detected among different gene signatures, reflecting the
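The abstract does not spell out the MFS algorithm, but the general idea it describes (rank genes by association with the phenotype, then prune redundant ones to shrink the signature) can be sketched as follows. This is an illustrative toy, not the published MFS method; the scoring (a Welch t-like statistic), the Pearson redundancy cutoff, and all gene names are assumptions made for the example.

```python
import math

def t_score(xs, ys):
    # Two-class separation score for one gene: |mean difference| over
    # the pooled standard error (a Welch t-like statistic).
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    vx = sum((v - mx) ** 2 for v in xs) / (len(xs) - 1)
    vy = sum((v - my) ** 2 for v in ys) / (len(ys) - 1)
    return abs(mx - my) / math.sqrt(vx / len(xs) + vy / len(ys) + 1e-12)

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb + 1e-12)

def minimize_signature(expr, labels, k=3, r_max=0.9):
    # expr: gene name -> expression values across samples;
    # labels: 0/1 class label per sample.  Rank genes by separation,
    # then greedily keep genes not redundant with those already picked.
    genes = sorted(expr, key=lambda g: -t_score(
        [v for v, l in zip(expr[g], labels) if l == 0],
        [v for v, l in zip(expr[g], labels) if l == 1]))
    picked = []
    for g in genes:
        if all(abs(pearson(expr[g], expr[p])) < r_max for p in picked):
            picked.append(g)
        if len(picked) == k:
            break
    return picked
```

With a gene that duplicates another's profile, the redundancy filter drops the duplicate and a weaker but independent gene enters the signature instead, which is the sense in which a smaller signature can retain predictive power.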

  20. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. In practice, however, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, because many measurements are made with different equipment at various Key Measurement Points (KMPs), the MUF uncertainty is affected by the errors of all of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. With this code, which is built on a graphical user interface for ease of use, MUF uncertainty can be calculated simply and quickly. The code is also expected to make sensitivity analyses of MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.

  1. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. In practice, however, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, because many measurements are made with different equipment at various Key Measurement Points (KMPs), the MUF uncertainty is affected by the errors of all of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. With this code, which is built on a graphical user interface for ease of use, MUF uncertainty can be calculated simply and quickly. The code is also expected to make sensitivity analyses of MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.
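The Monte Carlo style of MUF uncertainty evaluation described in this record can be sketched in a few lines. This is an illustrative toy, not the KAERI evaluation code: the 0.5% relative measurement error and all quantities are hypothetical, and each measurement is assumed independent.

```python
import random
import statistics

def simulate_muf(inputs, outputs, inv_begin, inv_end,
                 rel_sigma=0.005, n_trials=20000, seed=1):
    # MUF = inputs + beginning inventory - outputs - ending inventory.
    # Each measured value is perturbed by an independent relative error,
    # and the spread of the resulting MUF values estimates its uncertainty.
    rng = random.Random(seed)

    def measured(true_value):
        return true_value * (1 + rng.gauss(0, rel_sigma))

    mufs = []
    for _ in range(n_trials):
        muf = (sum(measured(v) for v in inputs) + measured(inv_begin)
               - sum(measured(v) for v in outputs) - measured(inv_end))
        mufs.append(muf)
    return statistics.mean(mufs), statistics.stdev(mufs)
```

For a consistent balance (no diversion), the simulated mean MUF is near zero while its standard deviation is non-zero, which is exactly the point made in the abstract: measurement uncertainty alone produces a non-zero observed MUF.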

  2. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated at different spatial resolutions in Sweden and the northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods of different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in the northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site-specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical load estimates at 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs yielded more constrained confidence intervals for a given percentile. Differentiation, as well as identification of grid cells subject to EX at 150 x 150 km resolution, was generally improved. Calculating the probability of EX was shown to preserve the possibility of differentiating between geographical units. Re-aggregation of the 95%-ile EX at 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations to account for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and the effects of data uncertainty. Data
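The core Monte Carlo quantity in this record, the probability of exceedance for a site, can be sketched as follows. This is a minimal illustration, not the study's method: it assumes, purely for the example, that both the critical load and the deposition are normally distributed with known means and standard deviations.

```python
import random

def prob_exceedance(cl_mean, cl_sd, dep_mean, dep_sd, n=50000, seed=7):
    # Fraction of Monte Carlo draws in which deposition exceeds the
    # critical load, with both treated as independent normal variables.
    rng = random.Random(seed)
    hits = sum(rng.gauss(dep_mean, dep_sd) > rng.gauss(cl_mean, cl_sd)
               for _ in range(n))
    return hits / n
```

Reporting this probability per grid cell, rather than a single best-estimate exceedance, is what preserves the ability to differentiate between geographical units despite data uncertainty.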

  3. Fuzzy randomness uncertainty in civil engineering and computational mechanics

    CERN Document Server

    Möller, Bernd

    2004-01-01

    This book, for the first time, provides a coherent, overall concept for taking account of uncertainty in the analysis, the safety assessment, and the design of structures. The reader is introduced to the problem of uncertainty modeling and familiarized with particular uncertainty models. For simultaneously considering stochastic and non-stochastic uncertainty the superordinated uncertainty model fuzzy randomness, which contains real valued random variables as well as fuzzy variables as special cases, is presented. For this purpose basic mathematical knowledge concerning the fuzzy set theory and the theory of fuzzy random variables is imparted. The body of the book comprises the appropriate quantification of uncertain structural parameters, the fuzzy and fuzzy probabilistic structural analysis, the fuzzy probabilistic safety assessment, and the fuzzy cluster structural design. The completely new algorithms are described in detail and illustrated by way of demonstrative examples.

  4. A Phenomenological Study of the Lived Experiences of Non-Traditional Students in Higher Level Mathematics at a Midwest University

    Science.gov (United States)

    Wood, Brian B.

    2017-01-01

    The current literature suggests that the use of Husserl's and Heidegger's approaches to phenomenology is still practiced. However, a clear gap exists on how these approaches are viewed in the context of constructivism, particularly with non-traditional female students' study of mathematics. The dissertation attempts to clarify the constructivist…

  5. Lattice-Based Revocable Certificateless Signature

    Directory of Open Access Journals (Sweden)

    Ying-Hao Hung

    2017-10-01

    Full Text Available Certificateless signatures (CLS) are attractive because they resolve the key-escrow problem of ID-based signatures and avoid the certificate-management problem of conventional signatures. However, the security of most previous CLS schemes relies on the difficulty of solving discrete logarithm or large-integer factorization problems. Both problems could be solved by quantum computers in the future, so signature schemes based on them would become insecure. For post-quantum cryptography, lattice-based cryptography is significant due to its efficiency and security. However, no existing study addresses the revocation problem in lattice-based CLS schemes. In this paper, we focus on the revocation issue and present the first revocable CLS (RCLS) scheme over lattices. Based on the short integer solution (SIS) assumption over lattices, the proposed lattice-based RCLS scheme is shown to be existentially unforgeable against adaptive chosen-message attacks. Performance analysis and comparisons show that the proposed lattice-based RCLS scheme outperforms the previously proposed lattice-based CLS scheme in terms of private key size, signature length and the revocation mechanism.

  6. Teaching Climate Science in Non-traditional Classrooms

    Science.gov (United States)

    Strybos, J.

    2015-12-01

    San Antonio College is the oldest, largest and most centrally located campus of the Alamo Colleges, a network of five community colleges based around San Antonio, Texas, with a headcount enrollment of approximately 20,000 students. The student population is diverse in ethnicity, age and income, and the Colleges understand that they play a salient role in educating their students on the foreseen impacts of climate change. This presentation will discuss the key investments the Alamo Colleges have made to incorporate sustainability and climate science into non-traditional classrooms. The established courses that cover climate-related material have historically had low enrollments. One of the most significant challenges is informing the student population of the value of such a class both in their academic careers and in their personal lives. By hosting these lessons as hands-on simulations and demonstrations that are accessible and understandable to students of any age, pursuing any major, we have found an exciting way to teach all students about climate change and identify solutions. San Antonio College (SAC) hosts the Bill R. Sinkin Eco Centro Community Center, completed in early 2014, which serves as an environmental hub for Alamo Colleges' staff and students as well as the San Antonio community. The center actively engages staff and faculty during training days by presenting information on Eco Centro and personal sustainability habits, and by inviting faculty to bring their classes for a tour and a sustainability primer for students. The Centro has hosted professors from disciplines as diverse as Architecture, Psychology, Engineering, Science, English, Fine Arts, and International Studies, who bring their classes to the center to learn about energy, water conservation, landscaping, and green building.
Additionally, Eco Centro encourages and assists students with research projects, including a solar-hydroponic project currently under development with the support

  7. Eating habits of a population undergoing a rapid dietary transition: portion sizes of traditional and non-traditional foods and beverages consumed by Inuit adults in Nunavut, Canada

    Science.gov (United States)

    2013-01-01

    Background To determine the portion sizes of traditional and non-traditional foods being consumed by Inuit adults in three remote communities in Nunavut, Canada. Methods A cross-sectional study was carried out between June and October, 2008. Trained field workers collected dietary data using a culturally appropriate, validated quantitative food frequency questionnaire (QFFQ) developed specifically for the study population. Results Caribou, muktuk (whale blubber and skin) and Arctic char (salmon family), were the most commonly consumed traditional foods; mean portion sizes for traditional foods ranged from 10 g for fermented seal fat to 424 g for fried caribou. Fried bannock and white bread were consumed by >85% of participants; mean portion sizes for these foods were 189 g and 70 g, respectively. Sugar-sweetened beverages and energy-dense, nutrient-poor foods were also widely consumed. Mean portion sizes for regular pop and sweetened juices with added sugar were 663 g and 572 g, respectively. Mean portion sizes for potato chips, pilot biscuits, cakes, chocolate and cookies were 59 g, 59 g, 106 g, 59 g, and 46 g, respectively. Conclusions The present study provides further evidence of the nutrition transition that is occurring among Inuit in the Canadian Arctic. It also highlights a number of foods and beverages that could be targeted in future nutritional intervention programs aimed at obesity and diet-related chronic disease prevention in these and other Inuit communities. PMID:23724920

  8. Observable Signatures of Energy Release in Braided Coronal Loops

    Energy Technology Data Exchange (ETDEWEB)

    Pontin, D. I. [University of Dundee, Nethergate, Dundee, DD1 4HN (United Kingdom); Janvier, M. [Institut d’Astrophysique Spatiale, CNRS, Univ. Paris-Sud, Université Paris-Saclay, Bât. 121, F-91405, Orsay Cedex (France); Tiwari, S. K.; Winebarger, A. R.; Cirtain, J. W. [NASA Marshall Space Flight Center, ZP 13, Huntsville, AL 35812 (United States); Galsgaard, K. [Niels Bohr Institute, Geological Museum Østervoldgade 5-7, DK-1350, Copenhagen K (Denmark)

    2017-03-10

    We examine the turbulent relaxation of solar coronal loops containing non-trivial field line braiding. Such field line tangling in the corona has long been postulated in the context of coronal heating models. We focus on the observational signatures of energy release in such braided magnetic structures using MHD simulations and forward modeling tools. The aim is to answer the following question: if energy release occurs in a coronal loop containing braided magnetic flux, should we expect a clearly observable signature in emissions? We demonstrate that the presence of braided magnetic field lines does not guarantee a braided appearance to the observed intensities. Observed intensities may—but need not necessarily—reveal the underlying braided nature of the magnetic field, depending on the degree and pattern of the field line tangling within the loop. However, in all cases considered, the evolution of the braided loop is accompanied by localized heating regions as the loop relaxes. Factors that may influence the observational signatures are discussed. Recent high-resolution observations from Hi-C have claimed the first direct evidence of braided magnetic fields in the corona. Here we show that both the Hi-C data and some of our simulations give the appearance of braiding at a range of scales.

  9. General Conversion for Obtaining Strongly Existentially Unforgeable Signatures

    Science.gov (United States)

    Teranishi, Isamu; Oyama, Takuro; Ogata, Wakaha

    We say that a signature scheme is strongly existentially unforgeable (SEU) if no adversary, given message/signature pairs adaptively, can generate a signature on a new message or a new signature on a previously signed message. We propose a general and efficient conversion in the standard model that transforms a secure signature scheme into an SEU signature scheme. To construct this conversion, we use a chameleon commitment scheme, a variant of a commitment scheme in which one can change the committed value after publishing the commitment if one knows the secret key. We define a chosen-message security notion for the chameleon commitment scheme, and show that a signature scheme transformed by our proposed conversion satisfies the SEU property if the chameleon commitment scheme is chosen-message secure. By modifying the proposed conversion, we also give a general and efficient conversion in the random oracle model that transforms a secure signature scheme into an SEU signature scheme. This second conversion also uses a chameleon commitment scheme but requires only key-only-attack security for it.
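The trapdoor property of a chameleon commitment, the ingredient the conversion relies on, can be shown concretely with a toy Pedersen-style construction over a small prime-order group. This is an assumption-laden sketch for intuition only (the paper's concrete scheme is not given in the abstract): the group parameters are deliberately tiny and insecure, and `pow(x, -1, q)` requires Python 3.8+.

```python
# Toy Pedersen-style chameleon commitment: commit(m, r) = g^m * h^r mod p,
# where h = g^x and x is the trapdoor.  Knowing x lets the committer open
# the same commitment to a different message.
p, q, g = 467, 233, 4       # p = 2q + 1; g generates the order-q subgroup
x = 57                      # trapdoor (secret key), chosen for the example
h = pow(g, x, p)            # public key

def commit(m, r):
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def collide(m, r, m_new):
    # g^m h^r = g^(m + x*r), so we need m + x*r = m_new + x*r_new (mod q),
    # i.e. r_new = r + (m - m_new) * x^(-1) (mod q).
    return (r + (m - m_new) * pow(x, -1, q)) % q
```

Without the trapdoor x, finding such a collision would require solving a discrete logarithm; with it, the signer of the converted scheme can "re-aim" a commitment, which is what makes the SEU conversion work.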

  10. Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions

    Science.gov (United States)

    Brumble, K. C.

    2014-12-01

    Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field in which to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods, such as reproducibility, independence, and straightforward observation, are complicated by the representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed to correlate diverse, large, and messy data sets necessitates the explicit quantification of the uncertainties that stem from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets and serve as intermediary steps for statistical experimentation.

  11. Entanglement criteria via the uncertainty relations in su(2) and su(1,1) algebras: Detection of non-Gaussian entangled states

    International Nuclear Information System (INIS)

    Nha, Hyunchul; Kim, Jaewan

    2006-01-01

    We derive a class of inequalities, from the uncertainty relations of the su(1,1) and su(2) algebras in conjunction with partial transposition, that must be satisfied by any separable two-mode state. These inequalities are expressed in terms of the su(2) operators J_x = (a†b + ab†)/2 and J_y = (a†b − ab†)/2i, and the total photon number ⟨N_a + N_b⟩. They include as special cases the inequality derived by Hillery and Zubairy [Phys. Rev. Lett. 96, 050503 (2006)] and the one by Agarwal and Biswas [New J. Phys. 7, 211 (2005)]. In particular, optimization over the whole set of inequalities leads to the criterion obtained by Agarwal and Biswas. We show that this optimal criterion can detect entanglement for a broad class of non-Gaussian entangled states, i.e., the su(2) minimum-uncertainty states. Experimental schemes to test the optimal criterion are also discussed, especially one using linear optical devices and photodetectors
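The operators and the Hillery-Zubairy special case mentioned in the abstract can be written out explicitly. This is a hedged paraphrase of the cited criterion [Phys. Rev. Lett. 96, 050503 (2006)], recalled here for readability, not a new result:

```latex
% su(2) operators built from the two mode operators a and b:
J_x = \tfrac{1}{2}\,(a^\dagger b + a b^\dagger), \qquad
J_y = \tfrac{1}{2i}\,(a^\dagger b - a b^\dagger), \qquad
N_a = a^\dagger a, \quad N_b = b^\dagger b .
% Hillery--Zubairy special case: every separable two-mode state obeys
\bigl|\langle a b^\dagger \rangle\bigr|^{2} \;\le\; \langle N_a N_b \rangle ,
% so a violation of this bound certifies entanglement.
```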

  12. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    Science.gov (United States)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

    With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for the associated metadata to be connected in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty for multi-part datasets, and provides no direct way of associating the uncertainty information - metadata in relation to a dataset - with dataset objects. We present a method that addresses both these issues by combining UncertML with PROV-O and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model, and integrates UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again, and its 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follow the UncertML model, they can be automatically interpreted and may also support automatic uncertainty propagation.
Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty
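The idea of propagating uncertainty along a provenance chain can be sketched without the RDF machinery. This is a minimal toy, not the UncertML/PROV-O vocabularies themselves: dataset names, values, and standard uncertainties are hypothetical, the inputs are assumed independent, and the derived dataset is assumed to be an unweighted mean of its parents.

```python
import math

# Each dataset records its value, a standard uncertainty, and the
# datasets it was derived from, mimicking a provenance chain.
datasets = {
    "rain_gauge_A":   {"value": 12.0, "std": 0.5, "derived_from": []},
    "rain_gauge_B":   {"value": 14.0, "std": 0.7, "derived_from": []},
    "catchment_mean": {"derived_from": ["rain_gauge_A", "rain_gauge_B"]},
}

def propagate_mean(name):
    # Walk the provenance chain; leaves carry measured uncertainty,
    # derived nodes get the uncertainty of a mean of independent inputs.
    node = datasets[name]
    parents = node["derived_from"]
    if not parents:
        return node["value"], node["std"]
    vals, stds = zip(*(propagate_mean(p) for p in parents))
    n = len(parents)
    return sum(vals) / n, math.sqrt(sum(s * s for s in stds)) / n
```

A machine-readable uncertainty vocabulary attached to each provenance node is what allows this kind of propagation to run automatically over published datasets.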

  13. Radar micro-doppler signatures processing and applications

    CERN Document Server

    Chen, Victor C; Miceli, William J

    2014-01-01

    Radar Micro-Doppler Signatures: Processing and Applications concentrates on the processing and application of radar micro-Doppler signatures in real-world situations, providing readers with a good working knowledge of a variety of applications of radar micro-Doppler signatures.

  14. Uncertainty Regarding Waste Handling in Everyday Life

    Directory of Open Access Journals (Sweden)

    Susanne Ewert

    2010-09-01

    Full Text Available According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories—people easily discriminate between certain categories (e.g., materials such as plastic and paper) but not between others (e.g., packaging and “non-packaging”). Thus a frequent cause of uncertainty is that the basic categories of the waste recycling system do not coincide with the basic categories used in everyday life. Secondly, challenged habits—source separation in everyday life is habitual, but when a habit is challenged by a particular element or feature of the waste system, uncertainty can arise. Thirdly, lacking fractions—some kinds of items cannot be left for recycling, which makes waste collection incomplete from the user’s point of view and in turn lowers the credibility of the system. Fourthly, missing or contradictory rules of thumb—the above causes seem to be particularly relevant if no motivating principle or rule of thumb (within the context of use) is successfully conveyed to the user. This paper discusses how reducing uncertainty can improve recycling.

  15. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of the geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site-scale models, with the emphasis placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and from conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of Aberg, Beberg and Ceberg are fracture zones of varying types, and no major differences in conceptualisation between the sites were noted. One source of uncertainty in the site models is the absence of fracture and zone information at scales from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly; at the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and the experience available at the time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analyses of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km{sup 3}. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configurations, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe sub-volume is an anomalously fractured, tectonised unit of its own.
This means that

  16. Geological-structural models used in SR 97. Uncertainty analysis

    International Nuclear Information System (INIS)

    Saksa, P.; Nummela, J.

    1998-10-01

    The uncertainty of the geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site-scale models, with the emphasis placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and from conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of Aberg, Beberg and Ceberg are fracture zones of varying types, and no major differences in conceptualisation between the sites were noted. One source of uncertainty in the site models is the absence of fracture and zone information at scales from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly; at the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and the experience available at the time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analyses of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configurations, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe sub-volume is an anomalously fractured, tectonised unit of its own.
This means that the

  17. Breaking Down the Door: A Nonprofit Model Creating Pathways for Non-Traditional STEM Student Engagement

    Science.gov (United States)

    Pelaez, C.; Pelaez, J.

    2015-12-01

    Blueprint Earth was created as a nonprofit scientific research organization dedicated to conducting micro-scale interdisciplinary environmental investigations to generate macroscopic, system-level environmental understanding. The field data collection and analysis process was conceived to be dependent on student participation and collaboration with more senior scientists, effecting knowledge transfer and emphasizing the critical nature of interdisciplinary research in investigating complex, macroscopic questions. Recruiting for student volunteer researchers is conducted in academic institutions, and to date has focused primarily on the Los Angeles area. Self-selecting student participation has run contrary to traditional STEM demographics. The vast majority of research participants in Blueprint Earth's work are female and/or from a minority (non-white) background, and most are first-generation college students or from low-income, Pell grant-eligible households. Traditional field research programs for students often come at a high cost, creating barriers to access for field-based STEM opportunities. The nonprofit model employed by Blueprint Earth provides zero-cost access to opportunity for students that the STEM world is currently targeting for future professional development.

  18. Physics Signatures at CLIC

    CERN Document Server

    Battaglia, Marco

    2001-01-01

    A set of signatures for physics processes of potential interest for the CLIC programme at √s = 1 - 5 TeV is discussed. These signatures, which may correspond to the manifestation of different scenarios of new physics as well as to Standard Model precision tests, are proposed as benchmarks for the optimisation of the CLIC accelerator parameters and for a first definition of the required detector response.

  19. Non-Traditional Security: The Case of Water Security in the Mekong Subregion

    Directory of Open Access Journals (Sweden)

    Haefner, Andrea

    2013-09-01

    Full Text Available In the first decade of the twenty-first century, Non-Traditional Security (NTS) challenges are of rising importance due to their increasing impact on daily life and broader national interests. This paper focuses on the Mekong Region as an important subregion due to its significance for the more than 70 million people living directly on the river banks and its importance for the economic development of the six riparian countries. The paper investigates NTS challenges in the Mekong Subregion with a focus on environmental challenges, and argues that NTS challenges are of increasing importance in the region and will become more so in the future. Whereas economic growth is crucial for improving livelihoods on the Mekong River and the overall economic performance of the riparian states, environmental protection cannot be disregarded, as doing so would have a devastating impact on the subregion and the wider region in the future.

  20. Signatures of discrete breathers in coherent state quantum dynamics

    International Nuclear Information System (INIS)

    Igumenshchev, Kirill; Ovchinnikov, Misha; Prezhdo, Oleg; Maniadis, Panagiotis

    2013-01-01

    In classical mechanics, discrete breathers (DBs) – a spatially localized, time-periodic concentration of energy – are predicted in a large variety of nonlinear systems. Motivated by a conceptual bridging of the DB phenomena in classical and quantum mechanical representations, we study their signatures in the dynamics of a quantum equivalent of a classical mechanical point in phase space – a coherent state. In contrast to the classical point, which exhibits either delocalized or localized motion, the coherent state shows signatures of both localized and delocalized behavior. The transition from normal to local modes has different characteristics in the quantum and classical perspectives. Here, we gain insight into the connection between the classical and quantum perspectives by analyzing the decomposition of the coherent state into the system's eigenstates, and by analyzing the spatial distribution of the wave-function density within these eigenstates. We find that the delocalized and localized eigenvalue components of the coherent state are separated by a mixed region, where both kinds of behavior can be observed. Further analysis leads to the following observations. Considered as a function of coupling, energy eigenstates go through avoided crossings between tunneling and non-tunneling modes. The dominance of tunneling modes in the high-nonlinearity region is compromised by the appearance of new types of modes – high-order tunneling modes – that are similar to the tunneling modes but have attributes of non-tunneling modes. Certain types of excitations preferentially excite higher-order tunneling modes, allowing one to study their properties. Since auto-correlation functions decrease quickly in highly nonlinear systems, short-time dynamics are sufficient for modeling quantum DBs. This work provides a foundation for implementing modern semi-classical methods to model quantum DBs, bridging classical and quantum mechanical signatures of DBs, and understanding spectroscopic experiments
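The starting point of such an analysis, decomposing a coherent state into eigenstates, is easy to illustrate for the harmonic oscillator, where the expansion coefficients are known in closed form (c_n = e^{-|α|²/2} α^n/√(n!), i.e. Poisson weights |c_n|²). This sketch covers only that textbook special case, not the nonlinear systems studied in the record:

```python
import math

def coherent_weights(alpha, n_max=40):
    # |c_n|^2 for a harmonic-oscillator coherent state |alpha> with
    # real alpha > 0, computed in log space for numerical stability:
    # |c_n|^2 = exp(-alpha^2) * alpha^(2n) / n!   (Poisson distribution).
    weights = []
    for n in range(n_max + 1):
        log_p = -alpha**2 + 2 * n * math.log(alpha) - math.lgamma(n + 1)
        weights.append(math.exp(log_p))
    return weights
```

The weights sum to one and peak near ⟨n⟩ = |α|²; in a nonlinear lattice the analogous decomposition must be done numerically, and it is the spatial structure of the dominant eigenstates that reveals the localized (breather-like) versus delocalized components.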

  1. Using lumped modelling for providing simple metrics and associated uncertainties of catchment response to agricultural-derived nitrates pollutions

    Science.gov (United States)

    RUIZ, L.; Fovet, O.; Faucheux, M.; Molenat, J.; Sekhar, M.; Aquilina, L.; Gascuel-odoux, C.

    2013-12-01

    The development of simple and easily accessible metrics is required for characterizing and comparing catchment response to external forcings (climate or anthropogenic) and for managing water resources. The hydrological and geochemical signatures in the stream represent the integration of the various processes controlling this response. The complexity of these signatures over several time scales, from sub-daily to several decades [Kirchner et al., 2001], makes their deconvolution very difficult. A large range of modeling approaches attempt to represent this complexity by accounting for the spatial and/or temporal variability of the processes involved. However, simple metrics are not easily retrieved from these approaches, mostly because of over-parametrization issues. We hypothesize that to obtain relevant metrics, we need to use models that are able to simulate the observed variability of river signatures at different time scales while being as parsimonious as possible. The lumped model ETNA (modified from [Ruiz et al., 2002]) is able to adequately simulate the seasonal and inter-annual patterns of stream NO3 concentration. Shallow groundwater is represented by two linear stores with double porosity, and riparian processes are represented by a constant nitrogen removal function. Our objective was to identify simple metrics of catchment response by calibrating this lumped model on two paired agricultural catchments where both N inputs and outputs were monitored for a period of 20 years. These catchments, belonging to ORE AgrHys, although underlain by the same granitic bedrock, display contrasting chemical signatures. The model was able to simulate the two contrasting observed patterns in stream and groundwater, for both hydrology and chemistry, at seasonal and pluri-annual scales. It was also compatible with the expected trends of nitrate concentration since 1960. The output variables of the model were used to compute the nitrate residence time in both the
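The two-linear-store structure described in this abstract can be sketched as a toy recession model. This is a minimal illustration only: the rate constants, the recharge split, and the function name are hypothetical, not the calibrated ETNA values, and the double porosity and nitrogen removal components are omitted.

```python
import numpy as np

def two_store_outflow(recharge, k_fast=0.05, k_slow=0.005, split=0.6):
    """Toy two-linear-store model: each recharge input is split between
    a fast and a slow reservoir; each reservoir drains in proportion to
    its storage (Q = k * S), and the stream receives the sum of the two
    outflows. All parameter values are illustrative."""
    s_fast, s_slow = 0.0, 0.0
    q = np.zeros(len(recharge))
    for t, r in enumerate(recharge):
        s_fast += split * r
        s_slow += (1.0 - split) * r
        q_fast, q_slow = k_fast * s_fast, k_slow * s_slow
        s_fast -= q_fast
        s_slow -= q_slow
        q[t] = q_fast + q_slow
    return q

# A single recharge pulse produces a quick peak followed by a long
# recession sustained by the slow store.
flow = two_store_outflow(np.array([10.0] + [0.0] * 99))
```

The fast store controls the seasonal response while the slow store carries the multi-year memory, which is the mechanism the abstract invokes for pluri-annual nitrate patterns.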

  2. Medication errors detected in non-traditional databases

    DEFF Research Database (Denmark)

    Perregaard, Helene; Aronson, Jeffrey K; Dalhoff, Kim

    2015-01-01

    AIMS: We have looked for medication errors involving the use of low-dose methotrexate, by extracting information from Danish sources other than traditional pharmacovigilance databases. We used the data to establish the relative frequencies of different types of errors. METHODS: We searched four...... errors, whereas knowledge-based errors more often resulted in near misses. CONCLUSIONS: The medication errors in this survey were most often action-based (50%) and knowledge-based (34%), suggesting that greater attention should be paid to education and surveillance of medical personnel who prescribe...

  3. Uncertainty Estimate in Resources Assessment: A Geostatistical Contribution

    International Nuclear Information System (INIS)

    Souza, Luis Eduardo de; Costa, Joao Felipe C. L.; Koppe, Jair C.

    2004-01-01

    For many decades the mining industry regarded resources/reserves estimation and classification as a mere calculation requiring basic mathematical and geological knowledge. Most methods were based on geometrical procedures and spatial data distribution. Therefore, uncertainty associated with tonnages and grades was either ignored or mishandled, although various mining codes require a measure of confidence in the values reported. Traditional methods fail to report the level of confidence in the quantities and grades. Conversely, kriging is known to provide the best estimate and its associated variance. Among kriging methods, Ordinary Kriging (OK) is probably the most widely used for mineral resource/reserve estimation, mainly because of its robustness and the ease with which uncertainty can be assessed using the kriging variance. It is also known that the OK variance is unable to recognize local data variability, an important issue when heterogeneous mineral deposits with higher- and poorer-grade zones are being evaluated. Alternatively, stochastic simulations are used to build local or global uncertainty models of a geological attribute while respecting its statistical moments. This study investigates methods capable of incorporating uncertainty into the estimates of resources and reserves via OK and sequential Gaussian and sequential indicator simulation. The results showed that for the type of mineralization studied all methods classified the tonnages similarly. The methods are illustrated using an exploration drill-hole data set from a large Brazilian coal deposit
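The OK estimate and kriging variance the abstract refers to can be sketched in one dimension. This is a hedged illustration, not the study's workflow: the exponential covariance, its sill and range, and the sample values are all made up.

```python
import numpy as np

def ok_estimate(xs, zs, x0, sill=1.0, rng=10.0):
    """Ordinary kriging at location x0 from samples (xs, zs) under an
    assumed exponential covariance C(h) = sill * exp(-|h| / rng).
    Solves [C 1; 1' 0][w; mu] = [c0; 1] and returns the estimate
    w @ zs and the kriging variance sill - w @ c0 - mu."""
    cov = lambda h: sill * np.exp(-np.abs(h) / rng)
    n = len(xs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(np.subtract.outer(xs, xs))
    A[n, n] = 0.0
    b = np.append(cov(xs - x0), 1.0)
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    return w @ zs, sill - w @ b[:n] - mu

xs = np.array([0.0, 5.0, 20.0])   # sample locations (illustrative)
zs = np.array([1.2, 1.5, 0.8])    # sample grades (illustrative)
est_near, var_near = ok_estimate(xs, zs, 1.0)   # close to the data
est_far, var_far = ok_estimate(xs, zs, 50.0)    # far from the data
```

Note that the kriging variance depends only on the sample geometry and covariance model, never on the grade values themselves; that is exactly the inability to "recognize local data variability" that motivates the simulation alternatives in the abstract.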

  4. Lot quality assurance sampling for monitoring coverage and quality of a targeted condom social marketing programme in traditional and non-traditional outlets in India.

    Science.gov (United States)

    Piot, Bram; Mukherjee, Amajit; Navin, Deepa; Krishnan, Nattu; Bhardwaj, Ashish; Sharma, Vivek; Marjara, Pritpal

    2010-02-01

    This study reports on the results of a large-scale targeted condom social marketing campaign in and around areas where female sex workers are present. The paper also describes the method that was used for the routine monitoring of condom availability in these sites. The lot quality assurance sampling (LQAS) method was used to assess the geographical coverage and the quality of condom coverage in target areas in four states and along selected national highways in India, as part of Avahan, the India AIDS initiative. A significant general increase in condom availability was observed in the intervention area between 2005 and 2008. High coverage rates were gradually achieved through an extensive network of pharmacies and particularly of non-traditional outlets, whereas traditional outlets were instrumental in providing large volumes of condoms. LQAS is seen as a valuable tool for the routine monitoring of the geographical coverage and of the quality of delivery systems of condoms, and of health products and services in general. With a relatively small sample size, easy data collection procedures and simple analytical methods, it was possible to inform decision-makers regularly on progress towards coverage targets.
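The "small sample size, simple analytical methods" property of LQAS comes from a binomial accept/reject rule per lot. The sketch below uses a common textbook plan (n = 19, decision threshold d = 6); these numbers are assumptions for illustration, not the thresholds used in the Avahan programme.

```python
from math import comb

def accept_prob(n, d, p):
    """Probability that a lot with true outlet coverage p is ACCEPTED
    under the LQAS rule 'accept if at most d of n sampled outlets lack
    the product'. The number of failures is Binomial(n, 1 - p)."""
    q = 1.0 - p
    return sum(comb(n, k) * q**k * (1 - q)**(n - k) for k in range(d + 1))

# Hypothetical plan: sample n = 19 outlets per lot, accept the lot if
# at most 6 of them lack condoms.
accept_if_good = accept_prob(19, 6, 0.80)  # lot with 80% coverage
accept_if_bad = accept_prob(19, 6, 0.50)   # lot with 50% coverage
```

With this plan a high-coverage lot is accepted with high probability while a low-coverage lot is usually flagged, which is what makes the rule usable by field supervisors without statistical software.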

  5. Exploring uncertainty in the Earth Sciences - the potential field perspective

    Science.gov (United States)

    Saltus, R. W.; Blakely, R. J.

    2013-12-01

    Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are possible. The mathematical label of 'non-uniqueness' can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this talk is to present a practical perspective on the theoretical non-uniqueness of potential field interpretation in geology. There are multiple ways to approach and constrain potential field studies to produce significant, robust, and definitive results. For example, a smooth, bell-shaped gravity profile, in theory, could be caused by an infinite set of physical density bodies, ranging from a deep, compact, circular source to a shallow, smoothly varying, inverted bell-shaped source. In practice, however, we can use independent geologic or geophysical information to limit the range of possible source densities and rule out many of the theoretical solutions. We can further reduce the theoretical uncertainty by careful attention to subtle anomaly details. For example, short-wavelength anomalies are a well-known and theoretically established characteristic of shallow geologic sources. The 'non-uniqueness' of potential field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.
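The amplitude/depth trade-off and its resolution by anomaly shape, as described in this abstract, can be shown with the point-mass (buried sphere) anomaly. This is a schematic sketch in arbitrary units with G = 1; the depths and masses are invented for illustration.

```python
import numpy as np

def sphere_anomaly(x, depth, mass):
    """Vertical gravity anomaly of a buried point mass (sphere):
    g(x) = G * m * z / (x^2 + z^2)^(3/2), with G = 1 (arbitrary units)."""
    return mass * depth / (x**2 + depth**2) ** 1.5

x = np.linspace(-50.0, 50.0, 1001)
# Two very different sources tuned to the SAME peak amplitude,
# since g_max = m / z^2: amplitude alone cannot separate them.
shallow = sphere_anomaly(x, depth=5.0, mass=25.0)
deep = sphere_anomaly(x, depth=20.0, mass=400.0)

def half_width(x, g):
    """Full width at half maximum; for a sphere, FWHM ~= 1.533 * depth,
    so the anomaly's wavelength constrains the source depth."""
    return float(np.ptp(x[g >= g.max() / 2]))
```

The two profiles have identical peak values, but the half-width separates them: the short-wavelength character of the shallow source is the kind of "subtle anomaly detail" the abstract says reduces the theoretical non-uniqueness.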

  6. Reaching Non-Traditional and Under-Served Communities through Global Astronomy Month Programs

    Science.gov (United States)

    Simmons, Michael

    2013-01-01

    Global Astronomy Month (GAM), organized each year by Astronomers Without Borders (AWB), has become the world's largest annual celebration of astronomy. Launched as a follow-up to the unprecedented success of the 100 Hours of Astronomy Cornerstone Project of IYA2009, GAM quickly attracted not only traditional partners in astronomy and space science outreach, but also unusual partners from very different fields. GAM's third annual edition, GAM2012, included worldwide programs for the sight-impaired, astronomy in the arts, and other non-traditional programs. The special planetarium program, OPTICKS, combined elements such as Moonbounce (sending images to the Moon and back) with artistic elements in a unique presentation of the heavens. Programs were also developed to present the heavens to the sight-impaired. The Cosmic Concert, in which a new musical piece is composed each year and presented during GAM against background images of celestial objects, has become an annual event. Several astronomy-themed art video projects were presented online. AWB's Astropoetry Blog held a very successful contest during GAM2012 that attracted more than 70 entries from 17 countries. Students were engaged through participation in special GAM campaigns of the International Asteroid Search Campaign. AWB and GAM have both developed into platforms where innovative programs can develop and interdisciplinary collaborations can flourish. As AWB's largest program, GAM brings the audience and resources that provide a boost for these new types of programs. Examples, lessons learned, new projects, and plans for the future of AWB and GAM will be presented.

  7. Genome signature analysis of thermal virus metagenomes reveals Archaea and thermophilic signatures

    Directory of Open Access Journals (Sweden)

    Pride David T

    2008-09-01

    Full Text Available Abstract Background Metagenomic analysis provides a rich source of biological information for otherwise intractable viral communities. However, study of viral metagenomes has been hampered by its nearly complete reliance on BLAST algorithms for identification of DNA sequences. We sought to develop algorithms for examination of viral metagenomes to identify the origin of sequences independent of BLAST algorithms. We chose viral metagenomes obtained from two hot springs, Bear Paw and Octopus, in Yellowstone National Park, as they represent simple microbial populations where comparatively large contigs were obtained. Thermal spring metagenomes have high proportions of sequences without significant GenBank homology, which has hampered identification of viruses and their linkage with hosts. To analyze each metagenome, we developed a method to classify DNA fragments using genome signature-based phylogenetic classification (GSPC), where metagenomic fragments are compared to a database of oligonucleotide signatures for all previously sequenced Bacteria, Archaea, and viruses. Results From both Bear Paw and Octopus hot springs, each assembled contig had more similarity to other metagenome contigs than to any sequenced microbial genome based on GSPC analysis, suggesting a genome signature common to each of these extreme environments. While the viral metagenomes from Bear Paw and Octopus share some similarity, the genome signatures from each locale are largely unique. GSPC using a microbial database predicts most of the Octopus metagenome has archaeal signatures, while bacterial signatures predominate in Bear Paw, a finding consistent with those of GenBank BLAST. When using a viral database, the majority of the Octopus metagenome is predicted to belong to the archaeal virus families Globuloviridae and Fuselloviridae, while none of the Bear Paw metagenome is predicted to belong to archaeal viruses. 
As expected, when microbial and viral databases are combined, each of
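A genome-signature classifier in the spirit of GSPC reduces each sequence to an oligonucleotide frequency vector and assigns fragments to the nearest reference. The sketch below is a simplified stand-in, not the authors' implementation: it uses plain tetranucleotide frequencies, Euclidean distance, and synthetic sequences.

```python
from collections import Counter
from math import sqrt

def signature(seq, k=4):
    """Normalized k-mer (here tetranucleotide) frequency vector of a
    DNA sequence - a simple oligonucleotide 'genome signature'."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def distance(sig_a, sig_b):
    """Euclidean distance between two signatures; a metagenomic
    fragment is assigned to the reference at the smallest distance."""
    kmers = set(sig_a) | set(sig_b)
    return sqrt(sum((sig_a.get(m, 0.0) - sig_b.get(m, 0.0)) ** 2
                    for m in kmers))

# Synthetic references and fragment, purely for illustration.
ref_at = signature("AT" * 500)           # AT-rich reference genome
ref_gc = signature("GC" * 500)           # GC-rich reference genome
fragment = signature("ATATATATGC" * 50)  # mostly AT-like fragment
```

Because the signature is a property of the sequence composition itself, the comparison needs no database homology, which is why such methods can classify the large fraction of thermal-spring sequences that lack GenBank hits.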

  8. Uncertainty in soil-structure interaction analysis arising from differences in analytical techniques

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Chen, J.C.; Johnson, J.J.

    1982-07-01

    This study addresses uncertainties arising from variations in different modeling approaches to soil-structure interaction of massive structures at a nuclear power plant. To perform a comprehensive systems analysis, it is necessary to quantify, for each phase of the traditional analysis procedure, both the realistic seismic response and the uncertainty associated with it. In this study two linear soil-structure interaction techniques were used to analyze the Zion, Illinois nuclear power plant: a direct method using the FLUSH computer program and a substructure approach using the CLASSI family of computer programs. In-structure response from two earthquakes, one real and one synthetic, was compared. Structure configurations from relatively simple to complicated multi-structure cases were analyzed. The resulting variations help quantify the uncertainty in structure response due to analysis procedures

  9. 48 CFR 204.101 - Contracting officer's signature.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Contracting officer's signature. 204.101 Section 204.101 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... officer's signature. Follow the procedures at PGI 204.101 for signature of contract documents. [71 FR 9268...

  10. Crawling up the value chain: domestic institutions and non-traditional foreign direct investment in Brazil, 1990-2010

    Directory of Open Access Journals (Sweden)

    PATRICK J. W. EGAN

    2015-03-01

    Full Text Available Brazil attracted relatively little innovation-intensive and export-oriented foreign investment during the liberalization period of 1990 to 2010, especially compared with competitors such as China and India. Adopting an institutionalist perspective, I argue that multinational firm investment profiles can be partly explained by the characteristics of investment promotion policies and bureaucracies charged with their implementation. Brazil's FDI policies were passive and non-discriminating in the second half of the 1990s, but became more selective under Lula. Investment promotion efforts have often been undercut by weakly coordinated and inconsistent institutions. The paper highlights the need for active, discriminating investment promotion policies if benefits from non-traditional FDI are to be realized.

  11. A Novel Non-Intrusive Method to Resolve the Thermal-Dome-Effect of Pyranometers: Radiometric Calibration and Implications

    Science.gov (United States)

    Ji, Qiang; Tsay, Si-Chee; Lau, K. M.; Hansell, R. A.; Butler, J. J.; Cooper, J. W.

    2011-01-01

    Traditionally the calibration equation for pyranometers assumes that the measured solar irradiance is solely proportional to the thermopile's output voltage; therefore only a single calibration factor is derived. This causes additional measurement uncertainties because it does not capture sufficient information to correctly account for a pyranometer's thermal effect. In our updated calibration equation, temperatures from the pyranometer's dome and case are incorporated to describe the instrument's thermal behavior, and a new set of calibration constants is determined, thereby reducing measurement uncertainties. In this paper, we demonstrate why a pyranometer's uncertainty under the traditional calibration equation is always larger than a few percent, but with the new approach can become much less than 1% once the thermal issue is resolved. The highlighted calibration results are based on NIST-traceable light sources under controlled laboratory conditions. The significance of the new approach lies not only in avoiding the uncertainty caused by a pyranometer's thermal effect but also in the opportunity to better isolate and characterize other instrumental artifacts, such as angular response and non-linearity of the thermopile, to further reduce additional uncertainties. We also discuss some of the implications, including an example of how the thermal issue can potentially impact climate studies, by evaluating aerosols' direct radiative effect using field measurements with and without considering the pyranometer's thermal effect. Radiative transfer model simulations show that a pyranometer's thermal effect on solar irradiance measurements at the surface can translate into a significant alteration of the calculated distribution of solar energy inside the column atmosphere.
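The single-factor versus thermal-aware calibration can be contrasted on synthetic data. Everything below is an assumption for illustration: the model form E = a·V + b·(T_dome^4 − T_case^4) is one common way to represent a dome/case thermal offset, and all coefficients, units, and noise levels are invented rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration run: reference irradiance E_ref, thermopile
# voltage V, and dome/case temperatures (K). The thermal offset enters
# V through a made-up coefficient 3e-10.
n = 200
E_ref = rng.uniform(0.0, 1000.0, n)
T_case = rng.uniform(280.0, 310.0, n)
T_dome = T_case + rng.uniform(-2.0, 2.0, n)
thermal = T_dome**4 - T_case**4
V = E_ref / 120.0 + 3e-10 * thermal + rng.normal(0.0, 0.002, n)

# Traditional calibration: a single factor, E = c * V (least squares).
c_single = (V @ E_ref) / (V @ V)
resid_single = E_ref - c_single * V

# Updated calibration: E = a * V + b * (T_dome^4 - T_case^4).
A = np.column_stack([V, thermal])
(a, b), *_ = np.linalg.lstsq(A, E_ref, rcond=None)
resid_thermal = E_ref - A @ np.array([a, b])
```

On this synthetic run the residual of the two-constant fit collapses to the sensor noise, while the single-factor fit keeps the full thermal offset as error, mirroring the few-percent versus sub-1% contrast the abstract reports.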

  12. Can specific transcriptional regulators assemble a universal cancer signature?

    Science.gov (United States)

    Roy, Janine; Isik, Zerrin; Pilarsky, Christian; Schroeder, Michael

    2013-10-01

    There has recently been much interest in using biomarker signatures derived from gene expression data to predict cancer progression. We assembled signatures from 25 published datasets covering 13 types of cancer. How do these signatures compare with each other? On the one hand, signatures answering the same biological question should overlap, whereas signatures predicting different cancer types should differ. On the other hand, there could also be a Universal Cancer Signature that is predictive independently of the cancer type. Initially, we generate signatures for all datasets using classical approaches such as the t-test and fold change; then we explore signatures resulting from a network-based method that applies the random surfer model of Google's PageRank algorithm. We show that the signatures as published by the authors and the signatures generated with classical methods do not overlap - not even for the same cancer type - whereas the network-based signatures strongly overlap. Selecting 10 out of 37 universal cancer genes gives the optimal prediction for all cancers, thus taking a first step towards a Universal Cancer Signature. We furthermore analyze and discuss the involved genes in terms of the Hallmarks of Cancer and in particular single out SP1, JUN/FOS and NFKB1, examining their specific role in cancer progression.
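The random-surfer idea behind the network-based signatures can be sketched as personalized PageRank on a gene interaction graph, with per-gene differential expression scores as the teleport distribution. This is a toy version of the approach, not the paper's code; the network, scores, and damping factor are all invented.

```python
import numpy as np

def personalized_pagerank(adj, scores, damping=0.85, iters=100):
    """Random-surfer ranking of genes on an interaction network.
    The teleport distribution is the normalized per-gene score, so
    differential expression evidence is smoothed over the network."""
    P = adj / adj.sum(axis=1, keepdims=True)  # row-stochastic walk matrix
    p = scores / scores.sum()                 # teleport vector
    r = np.full(len(scores), 1.0 / len(scores))
    for _ in range(iters):                    # power iteration
        r = damping * (P.T @ r) + (1 - damping) * p
    return r

# Toy network: gene 0 is a hub (think of a regulator like SP1)
# connected to three neighbors that each carry strong expression scores.
adj = np.array([[0, 1, 1, 1],
                [1, 0, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0]], dtype=float)
scores = np.array([0.1, 1.0, 1.0, 1.0])  # e.g. t-test scores
rank = personalized_pagerank(adj, scores)
```

The hub gene ends up ranked first even though its own score is the weakest, illustrating how a network-based method can consistently surface central regulators that classical per-gene statistics miss.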

  13. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  14. Enterprise strategic development under conditions of uncertainty

    Directory of Open Access Journals (Sweden)

    O.L. Truhan

    2016-09-01

    Full Text Available The author points out the need for research into enterprise strategic development under conditions of increased dynamism and uncertainty in the external environment. It is determined that under conditions of external uncertainty it is reasonable to conduct strategic planning of entities using organizational life-cycle models and planning on the basis of disclosure. Any organization has to react flexibly to external challenges, applying cognitive knowledge about its own business model of development and its ability to mobilize internal working reserves. The article notes that in long-term business planning managers traditionally rely on familiar facts and conditions, assuming that present tendencies will not be subject to essential changes in the future. When planning a new, risky business, however, one has to act when prerequisites and assumptions predominate over knowledge. The author argues that under such conditions a powerful tool of enterprise strategic development may be the well-known approach of "planning on the basis of disclosure". The suggested approach helps take into account the numerous uncertainty factors of the external environment, making the strategic planning process maximally adaptable to the conditions of venture business development.

  15. 36 CFR 1150.22 - Signature of documents.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Signature of documents. 1150.22 Section 1150.22 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS... Documents for Proceedings on Citations § 1150.22 Signature of documents. The signature of a party...

  16. Nature's chemical signatures in human olfaction: a foodborne perspective for future biotechnology.

    Science.gov (United States)

    Dunkel, Andreas; Steinhaus, Martin; Kotthoff, Matthias; Nowak, Bettina; Krautwurst, Dietmar; Schieberle, Peter; Hofmann, Thomas

    2014-07-07

    The biocatalytic production of flavor naturals that determine the chemosensory percepts of foods and beverages is an ever-challenging target for academic and industrial research. Advances in chemical trace analysis and post-genomic progress at the chemistry-biology interface have revealed that the odor qualities of nature's chemosensory entities are defined by odorant-induced olfactory receptor activity patterns. Beyond traditional views, this review and meta-analysis now shows characteristic ratios of only about 3 to 40 genuine key odorants for each food, from a group of about 230 out of circa 10 000 food volatiles. This suggests that the foodborne stimulus space has co-evolved with, and roughly matches, our circa 400 olfactory receptors as best natural agonists. This perspective gives insight into nature's chemical signatures of smell, provides the chemical odor codes of more than 220 food samples, and beyond that addresses industrial implications for producing recombinants that fully reconstruct the natural odor signatures for use in flavors and fragrances, fully immersive interactive virtual environments, or humanoid bioelectronic noses. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Automated Generation of Tabular Equations of State with Uncertainty Information

    Science.gov (United States)

    Carpenter, John H.; Robinson, Allen C.; Debusschere, Bert J.; Mattsson, Ann E.

    2015-06-01

    As computational science pushes toward higher fidelity prediction, understanding the uncertainty associated with closure models, such as the equation of state (EOS), has become a key focus. Traditional EOS development often involves a fair amount of art, where expert modelers may appear as magicians, providing what is felt to be the closest possible representation of the truth. Automation of the development process gives a means by which one may demystify the art of EOS, while simultaneously obtaining uncertainty information in a manner that is both quantifiable and reproducible. We describe our progress on the implementation of such a system to provide tabular EOS tables with uncertainty information to hydrocodes. Key challenges include encoding the artistic expert opinion into an algorithmic form and preserving the analytic models and uncertainty information in a manner that is both accurate and computationally efficient. Results are demonstrated on a multi-phase aluminum model. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  18. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety has generally been verified and confirmed through accident simulations using computer codes, because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety, owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by the sufficient safety margin provided by the conservative assumptions and models applied in those simulations. More recently, best-estimate analyses based on realistic assumptions and models, supported by accumulated insights, have become feasible, reducing the safety margin in the analysis results and increasing the need to evaluate the reliability or uncertainty of those results. This paper introduces an approach to evaluating the uncertainty of accident simulations and their results. (Note: This research was done not at the Japan Nuclear Energy Safety Organization but at the Tokyo Institute of Technology.) (author)

  19. Limited entropic uncertainty as new principle of quantum physics

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2001-01-01

    The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes the cornerstone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid this state-dependence, many authors have proposed using the information entropy as a measure of the uncertainty instead of the above standard quantitative formulation of the Heisenberg uncertainty principle. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle), as a new principle in quantum physics, is proved. Then, consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of pion-nucleus phase shifts, are presented for both the extensive (q=1) and nonextensive (q=0.5 and q=2.0) cases. Some results obtained by applying the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions of our paper can be summarized as follows: (i) We introduced a new principle in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle not only includes the old Heisenberg uncertainty principle in a more general and exact form but also introduces an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum systems'; (ii) Two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from our general results, in the limit y → +1 we recover in an exact form all the results previously reported. In our paper an
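The idea of two-sided bounds on an entropic uncertainty measure can be illustrated with the classical Shannon entropy, which for any probability distribution over N outcomes satisfies 0 ≤ H ≤ ln N. This is a generic textbook illustration of bounded information entropy, not the paper's specific LEU inequalities for phase shifts or diffraction.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon information entropy H = -sum_i p_i ln p_i (natural log),
    with the convention 0 * ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# For a distribution over N outcomes, 0 <= H <= ln N: a sharp
# (deterministic) state saturates the lower bound, while the uniform
# state saturates the upper bound.
N = 8
h_sharp = shannon_entropy([1.0] + [0.0] * (N - 1))
h_uniform = shannon_entropy([1.0 / N] * N)
```

The lower bound expresses that uncertainty can vanish entirely for a classical distribution, while the upper bound ln N caps how uncertain the outcome can be; the LEU-Principle described above imposes analogous two-sided limits in the quantum setting.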

  20. Implementing Signature Neural Networks with Spiking Neurons.

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to developing realistic Artificial Neural Networks (ANNs). Unlike traditional firing-rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings, providing novel elements of inspiration? This is an intriguing question for the research community, and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work we adapt the core concepts of the recently proposed Signature Neural Network paradigm - i.e., neural signatures to identify each unit in the network, local information contextualization during processing, and multicoding strategies for information propagation regarding the origin and the content of the data - to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have yet been used in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. 
As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence
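The precise spike timing that carries information in such models can be illustrated with a single leaky integrate-and-fire (LIF) unit. This is a minimal sketch under assumed parameters, not the paper's network: a real Signature Neural Network would add the per-unit signature burst and multicoding machinery on top of neurons like this one.

```python
import numpy as np

def lif_spike_times(current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron, dv/dt = (-v + I) / tau, with
    forward-Euler integration. Returns the precise spike times - the
    quantity that carries information in spiking (rather than
    rate-based) models. Parameter values are illustrative."""
    v, spikes = 0.0, []
    for t, i_t in enumerate(current):
        v += dt * (-v + i_t) / tau
        if v >= v_thresh:
            spikes.append(t * dt)
            v = v_reset
    return spikes

# A constant supra-threshold input produces a regular, exactly
# reproducible spike train; in a Signature Neural Network, the timing
# of a unit's initial spikes would serve as its identifying signature.
spikes = lif_spike_times(np.full(300, 1.5))
```

Because the spike times are fully determined by the input and parameters, a downstream unit can in principle recognize which neuron emitted a given pattern, which is the premise of signature-based identification.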

  1. 17 CFR 201.65 - Identity and signature.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false Identity and signature. 201.65... of 1934 § 201.65 Identity and signature. Applications pursuant to this subpart may omit the identity, mailing address, and signature of the applicant; provided, that such identity, mailing address and...

  2. CORONAL AND CHROMOSPHERIC SIGNATURES OF LARGE-SCALE DISTURBANCES ASSOCIATED WITH A MAJOR SOLAR ERUPTION

    International Nuclear Information System (INIS)

    Zong, Weiguo; Dai, Yu

    2015-01-01

    We present both coronal and chromospheric observations of large-scale disturbances associated with a major solar eruption on 2005 September 7. In the Geostationary Operational Environmental Satellites/Solar X-ray Imager (SXI), arclike coronal brightenings are recorded propagating in the southern hemisphere. The SXI front shows an initially constant speed of 730 km s −1 and decelerates later on, and its center is near the central position angle of the associated coronal mass ejection (CME) but away from the flare site. Chromospheric signatures of the disturbances are observed in both Mauna Loa Solar Observatory (MLSO)/Polarimeter for Inner Coronal Studies Hα and MLSO/Chromospheric Helium I Imaging Photometer He i λ10830 and can be divided into two parts. The southern signatures occur in regions where the SXI front sweeps over, with the Hα bright front coincident with the SXI front, while the He i dark front lags the SXI front but shows similar kinematics. Ahead of the path of the southern signatures, oscillations of a filament are observed. The northern signatures occur near the equator, with the Hα and He i fronts coincident with each other. They first propagate westward and then deflect to the north at the boundary of an equatorial coronal hole. Based on these observational facts, we suggest that the global disturbances are associated with the CME lift-off and show a hybrid nature: a mainly non-wave CME flank nature for the SXI signatures and the corresponding southern chromospheric signatures, and a shocked fast-mode coronal MHD wave nature for the northern chromospheric signatures

  3. CORONAL AND CHROMOSPHERIC SIGNATURES OF LARGE-SCALE DISTURBANCES ASSOCIATED WITH A MAJOR SOLAR ERUPTION

    Energy Technology Data Exchange (ETDEWEB)

    Zong, Weiguo [Key Laboratory of Space Weather, National Center for Space Weather, China Meteorological Administration, Beijing 100081 (China); Dai, Yu, E-mail: ydai@nju.edu.cn [Key Laboratory of Modern Astronomy and Astrophysics (Nanjing University), Ministry of Education, Nanjing 210023 (China)

    2015-08-20

    We present both coronal and chromospheric observations of large-scale disturbances associated with a major solar eruption on 2005 September 7. In the Geostationary Operational Environmental Satellites/Solar X-ray Imager (SXI), arclike coronal brightenings are recorded propagating in the southern hemisphere. The SXI front shows an initially constant speed of 730 km s⁻¹ and decelerates later on, and its center is near the central position angle of the associated coronal mass ejection (CME) but away from the flare site. Chromospheric signatures of the disturbances are observed in both Mauna Loa Solar Observatory (MLSO)/Polarimeter for Inner Coronal Studies Hα and MLSO/Chromospheric Helium I Imaging Photometer He i λ10830 and can be divided into two parts. The southern signatures occur in regions where the SXI front sweeps over, with the Hα bright front coincident with the SXI front, while the He i dark front lags the SXI front but shows similar kinematics. Ahead of the path of the southern signatures, oscillations of a filament are observed. The northern signatures occur near the equator, with the Hα and He i fronts coincident with each other. They first propagate westward and then deflect to the north at the boundary of an equatorial coronal hole. Based on these observational facts, we suggest that the global disturbances are associated with the CME lift-off and show a hybrid nature: a mainly non-wave CME flank nature for the SXI signatures and the corresponding southern chromospheric signatures, and a shocked fast-mode coronal MHD wave nature for the northern chromospheric signatures.

  4. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
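
As a hedged illustration of the Dempster–Shafer structures the paper builds on, the sketch below computes belief and plausibility of an output interval from a set of focal intervals with basic probability masses. The interval endpoints and masses are invented for illustration and are not taken from the study.

```python
# Hypothetical Dempster-Shafer structure for an epistemic input:
# focal elements are intervals, each carrying a basic probability mass.

def belief(focal_elements, a, b):
    """Bel([a, b]): total mass of focal intervals fully contained in [a, b]."""
    return sum(m for (lo, hi), m in focal_elements if a <= lo and hi <= b)

def plausibility(focal_elements, a, b):
    """Pl([a, b]): total mass of focal intervals that intersect [a, b]."""
    return sum(m for (lo, hi), m in focal_elements if hi >= a and lo <= b)

# Three illustrative expert-supplied intervals; masses sum to 1.
focal = [((0.0, 0.4), 0.3), ((0.2, 0.6), 0.5), ((0.5, 1.0), 0.2)]

print(belief(focal, 0.0, 0.6))        # 0.8: first two intervals lie inside
print(plausibility(focal, 0.0, 0.6))  # 1.0: all three intervals intersect
```

Belief and plausibility bracket the (unknown) probability of the interval, which is what makes the structure useful for QMU-style margin statements under epistemic uncertainty.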

  5. 15 CFR 908.16 - Signature.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Signature. 908.16 Section 908.16 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) NATIONAL OCEANIC... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National...

  6. Non-parametric order statistics method applied to uncertainty propagation in fuel rod calculations

    International Nuclear Information System (INIS)

    Arimescu, V.E.; Heins, L.

    2001-01-01

    Advances in modeling fuel rod behavior and the accumulation of adequate experimental data have made possible the introduction of quantitative methods to estimate the uncertainty of predictions made with best-estimate fuel rod codes. The uncertainty range of the input variables is characterized by a truncated distribution, typically a normal, lognormal, or uniform distribution. While the distribution for fabrication parameters is defined to cover the design or fabrication tolerances, the distribution of modeling parameters is inferred from the experimental database consisting of separate effects tests and global tests. The final step of the methodology uses Monte Carlo random sampling of all relevant input variables and performs best-estimate code calculations to propagate these uncertainties in order to evaluate the uncertainty range of outputs of interest for design analysis, such as internal rod pressure and fuel centerline temperature. The statistical method underlying this Monte Carlo sampling is non-parametric order statistics, which is perfectly suited to evaluate quantiles of populations with unknown distribution. The application of this method is straightforward in the case of one single fuel rod, when a 95/95 statement is applicable: 'with a probability of 95% and a confidence level of 95%, the values of the output of interest are below a certain value'. The 0.95-quantile is therefore estimated for the distribution of all possible values of one fuel rod with a statistical confidence of 95%. On the other hand, a more elaborate procedure is required if all the fuel rods in the core are being analyzed. In this case, the aim is to evaluate the following global statement: with a 95% confidence level, all the fuel rods in the core except only a few are expected not to exceed a certain value. In both cases, the thresholds determined by the analysis should be below the safety acceptable design limit. An indirect
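
The non-parametric order statistics behind such 95/95 statements can be sketched with Wilks' first-order formula for the required number of code runs; this is a generic textbook illustration, not the vendor's implementation.

```python
import math

def wilks_one_sided(gamma, beta):
    """Smallest sample size N such that the sample maximum bounds the
    gamma-quantile with confidence beta, i.e. 1 - gamma**N >= beta.
    The result is distribution-free (non-parametric)."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_one_sided(0.95, 0.95))  # 59 code runs for a 95/95 statement
print(wilks_one_sided(0.99, 0.95))  # 299 runs for a 99/95 statement
```

The classical value of 59 runs for a one-sided 95/95 tolerance bound is why Monte Carlo uncertainty methodologies of this kind remain affordable even for expensive fuel rod codes.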

  7. Uncertainty analysis of neural network based flood forecasting models: An ensemble based approach for constructing prediction interval

    Science.gov (United States)

    Kasiviswanathan, K.; Sudheer, K.

    2013-05-01

    Artificial neural network (ANN) based hydrologic models have gained a lot of attention among water resources engineers and scientists, owing to their potential for accurate prediction of flood flows as compared to conceptual or physics-based hydrologic models. The ANN approximates the non-linear functional relationship between the complex hydrologic variables in arriving at the river flow forecast values. Despite a large number of applications, there is still some criticism that ANN point predictions lack reliability since the uncertainty of the predictions is not quantified, which limits their use in practical applications. A major concern in applying traditional uncertainty analysis techniques in a neural network framework is its parallel computing architecture with large degrees of freedom, which makes the uncertainty assessment a challenging task. Very limited studies have considered assessment of the predictive uncertainty of ANN based hydrologic models. In this study, a novel method is proposed that helps construct the prediction interval of an ANN flood forecasting model during calibration itself. The method is designed to have two stages of optimization during calibration: at stage 1, the ANN model is trained with a genetic algorithm (GA) to obtain the optimal set of weights and biases, and during stage 2, the optimal variability of the ANN parameters (obtained in stage 1) is identified so as to create an ensemble of predictions. During the second stage, the optimization is performed with multiple objectives: (i) minimum residual variance for the ensemble mean, (ii) maximum number of measured data points falling within the estimated prediction interval, and (iii) minimum width of the prediction interval. The method is illustrated using a real-world case study of an Indian basin. The method was able to produce an ensemble that has an average prediction interval width of 23.03 m3/s, with 97.17% of the total validation data points (measured) lying within the interval. The derived
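
Two of the stage-2 objectives, coverage of measured points by the prediction interval and the interval's average width, can be sketched as simple scores; the observation and bound values below are made up for illustration.

```python
# Illustrative scores for an ensemble prediction interval: the percentage
# of observations falling inside [lower, upper] and the mean interval width.

def interval_scores(observed, lower, upper):
    inside = sum(1 for o, lo, hi in zip(observed, lower, upper) if lo <= o <= hi)
    coverage = 100.0 * inside / len(observed)
    avg_width = sum(hi - lo for lo, hi in zip(lower, upper)) / len(lower)
    return coverage, avg_width

obs  = [10.0, 12.5, 9.0, 15.0]   # made-up flows
low  = [8.0, 11.0, 9.5, 13.0]    # made-up lower bounds
high = [12.0, 14.0, 11.0, 16.0]  # made-up upper bounds
cov, width = interval_scores(obs, low, high)
print(cov, width)  # 75.0 percent coverage, mean width 2.875
```

In a multi-objective setting these two scores pull in opposite directions (wider intervals cover more points), which is exactly the trade-off the study's GA resolves.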

  8. 34 CFR 101.32 - Signature of documents.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Signature of documents. 101.32 Section 101.32 Education Regulations of the Offices of the Department of Education OFFICE FOR CIVIL RIGHTS, DEPARTMENT OF EDUCATION... Documents § 101.32 Signature of documents. The signature of a party, authorized officer, employee or...

  9. 29 CFR 102.116 - Signature of orders.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 2 2010-07-01 2010-07-01 false Signature of orders. 102.116 Section 102.116 Labor Regulations Relating to Labor NATIONAL LABOR RELATIONS BOARD RULES AND REGULATIONS, SERIES 8 Certification and Signature of Documents § 102.116 Signature of orders. The executive secretary or the associate executive...

  10. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  11. On signature change in p-adic space-times

    International Nuclear Information System (INIS)

    Dragovic, B.G.

    1991-01-01

    Change of signature by linear coordinate transformations in p-adic space-times is considered. In this paper it is shown that there exists an arbitrary change of trivial signature in Q p n for all n ≥ 1 if p ≡ 1 (mod 4). In other cases it is possible to change only an even number of the signs of the signature. The authors suggest a new concept of signature with respect to distinct quadratic extensions of Q p . If the space-time dimension is restricted to four there is no signature change

  12. Economic uncertainty and its impact on the Croatian economy

    Directory of Open Access Journals (Sweden)

    Petar Soric

    2017-12-01

    The aim of this paper is to quantify institutional (political and fiscal) and non-institutional uncertainty (economic policy uncertainty, the Economists' recession index, natural disaster-related uncertainty, and several disagreement measures). The stated indicators are based on articles from highly popular Croatian news portals, the repository of law amendments (Narodne novine), and Business and Consumer Surveys. We also introduce a composite uncertainty indicator, obtained by the principal components method. The analysis of a structural VAR model of the Croatian economy (both with fixed and time-varying parameters) has shown that most of the analysed indicators are significant predictors of economic activity. It is demonstrated that their impact on industrial production is strongest at the onset of a crisis. On the other hand, the influence of fiscal uncertainty exhibits just the opposite tendency: it strengthens with the intensification of economic activity, which partially vindicates the possible utilization of fiscal expansion as a counter-crisis tool.
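
A minimal sketch of a principal-components composite indicator for the special case of two standardized uncertainty series: with positive correlation, the first principal component of two standardized variables has equal loadings 1/√2, so the composite is their scaled sum. The series values are invented, and the paper's actual indicator combines more inputs.

```python
import math
import statistics

def standardize(xs):
    """Z-score a series (population standard deviation)."""
    mu, sd = statistics.mean(xs), statistics.pstdev(xs)
    return [(x - mu) / sd for x in xs]

def composite_index(a, b):
    """First-PC composite of two positively correlated standardized series."""
    za, zb = standardize(a), standardize(b)
    return [(x + y) / math.sqrt(2.0) for x, y in zip(za, zb)]

# Hypothetical quarterly uncertainty series
policy_uncertainty = [1.0, 2.0, 4.0, 3.0]
fiscal_uncertainty = [0.5, 1.5, 3.5, 2.5]
idx = composite_index(policy_uncertainty, fiscal_uncertainty)
print(idx)  # mean-zero composite, peaking in the third quarter
```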

  13. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties are possibly reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and effectivity of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
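
One way the "inherent and statistical (limited record length)" component can be sketched is a conjugate Bayesian update of an annual exceedance probability; the uniform Beta(1, 1) prior and the record numbers below are illustrative assumptions, not the project's actual model.

```python
# With a Beta(1, 1) prior on the annual exceedance probability p and
# k exceedance years observed in an n-year record, the posterior is
# Beta(1 + k, 1 + n - k); its mean is a simple closed form.

def posterior_mean_exceedance(k, n):
    """Posterior mean of the annual exceedance probability."""
    return (1.0 + k) / (2.0 + n)

# Hypothetical record: 2 exceedances of a design threshold in 48 years
print(posterior_mean_exceedance(2, 48))  # 0.06
```

As the record lengthens (or new extreme events are observed), the posterior narrows, which is the mechanism by which new evidence reduces the statistical uncertainty in the risk estimate.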

  14. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...
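
A hedged sketch of one numerical ingredient of such discrete signature curves: approximating the Euclidean curvature at each vertex of a closed planar polygon by the inverse circumradius of consecutive point triples, a standard discrete estimate. The sample points are synthetic, not DNA supercoil data.

```python
import math

def discrete_curvature(points):
    """Curvature estimate at each vertex of a closed planar polygon:
    kappa = 1/R, with R the circumradius of the triple (prev, this, next)."""
    n = len(points)
    curv = []
    for i in range(n):
        pa, pb, pc = points[i - 1], points[i], points[(i + 1) % n]
        a = math.dist(pb, pc)
        b = math.dist(pa, pc)
        c = math.dist(pa, pb)
        # twice the signed triangle area, taken unsigned
        area2 = abs((pb[0] - pa[0]) * (pc[1] - pa[1])
                    - (pc[0] - pa[0]) * (pb[1] - pa[1]))
        # kappa = 4*Area/(abc) = 2*area2/(abc)
        curv.append(2.0 * area2 / (a * b * c) if a * b * c else 0.0)
    return curv

# Points on a circle of radius 2: the estimate recovers curvature 1/2
circle = [(2 * math.cos(2 * math.pi * k / 12),
           2 * math.sin(2 * math.pi * k / 12)) for k in range(12)]
print(discrete_curvature(circle)[0])  # ~0.5
```

The full signature curve also needs the arc-length derivative of curvature, which can be estimated by finite differences along the same polygon.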

  15. A Traditional Turkish Fermented Non-Alcoholic Beverage, “Shalgam”

    Directory of Open Access Journals (Sweden)

    Fatma Coskun

    2017-10-01

    Shalgam is a traditional Turkish beverage produced by lactic acid fermentation. Shalgam is also sold in markets in some European cities. In shalgam production, bulgur flour (formed during the crushing process, it is the part that remains under the sieve after breaking the outer shells of boiled dried wheat for processing), salt, water, purple carrot, turnip, and sometimes red beet are used. The traditional method of production can take 10–12 days. Commercial production takes 4–5 days. Shalgam is a probiotic food and a good source of nutrients. It helps regulate the pH of the digestive system. It contains β-carotene, group B vitamins, calcium, potassium, and iron. People also use it as a medicine because of its antiseptic agents. Shalgam consumption should be increased and spread worldwide.

  16. Re-Entry Women Students in Higher Education: A Model for Non-Traditional Support Programs in Counseling and Career Advisement.

    Science.gov (United States)

    Karr-Kidwell, PJ

    A model program of support for non-traditional women students has been developed at Texas Woman's University (TWU). Based on a pilot study, several steps were taken to assist these re-entry students at TWU. For example, in spring semester of 1983, a committee for re-entry students was established, with a student organization--Women in…

  17. 45 CFR 81.32 - Signature of documents.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Signature of documents. 81.32 Section 81.32 Public... UNDER PART 80 OF THIS TITLE Form, Execution, Service and Filing of Documents § 81.32 Signature of documents. The signature of a party, authorized officer, employee or attorney constitutes a certificate that...

  18. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  19. Analysis of uncertainties of thermal hydraulic calculations

    International Nuclear Information System (INIS)

    Macek, J.; Vavrin, J.

    2002-12-01

    In 1993-1997 it was proposed, within OECD projects, that a common program should be set up for uncertainty analysis by a probabilistic method based on a non-parametric statistical approach for system computer codes such as RELAP, ATHLET and CATHARE, and that a method should be developed for statistical analysis of experimental databases for the preparation of the input deck and statistical analysis of the output calculation results. Software for such statistical analyses would then have to be developed as individual tools, independent of the computer codes used for the thermal hydraulic analysis and of the programs for uncertainty analysis. In this context, a method for estimation of a thermal hydraulic calculation is outlined and selected methods of statistical analysis of uncertainties are described, including methods for prediction accuracy assessment based on the discrete Fourier transformation principle. (author)

  20. An inexact mixed risk-aversion two-stage stochastic programming model for water resources management under uncertainty.

    Science.gov (United States)

    Li, W; Wang, B; Xie, Y L; Huang, G H; Liu, L

    2015-02-01

    Uncertainties exist in the water resources system, while traditional two-stage stochastic programming is risk-neutral and compares the random variables (e.g., total benefit) to identify the best decisions. To deal with the risk issues, a risk-aversion inexact two-stage stochastic programming model is developed for water resources management under uncertainty. The model was a hybrid methodology of interval-parameter programming, the conditional value-at-risk measure, and a general two-stage stochastic programming framework. The method extends the traditional two-stage stochastic programming method by enabling uncertainties presented as probability density functions and discrete intervals to be effectively incorporated within the optimization framework. It could not only provide information on the benefits of the allocation plan to the decision makers but also measure the extreme expected loss on the second-stage penalty cost. The developed model was applied to a hypothetical case of water resources management. Results showed that the model could help managers generate feasible and balanced risk-aversion allocation plans, and analyze the trade-offs between system stability and economy.
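
The conditional value-at-risk measure at the heart of the risk-aversion extension can be sketched as a sample-tail average; the penalty-cost values below are invented for illustration.

```python
# Sample estimate of conditional value-at-risk (CVaR): the mean loss in
# the worst (1 - alpha) fraction of a sample of second-stage penalty costs.

def cvar(losses, alpha):
    """Average of the worst (1 - alpha) fraction of the sampled losses."""
    xs = sorted(losses, reverse=True)
    k = max(1, int(round(len(xs) * (1.0 - alpha))))
    return sum(xs[:k]) / k

# Hypothetical second-stage penalty costs over ten scenarios
penalty_costs = [10, 12, 15, 20, 22, 25, 30, 40, 55, 80]
print(cvar(penalty_costs, 0.8))  # mean of the worst 20%: (80 + 55) / 2 = 67.5
```

Unlike the risk-neutral expectation, CVaR penalizes the tail of the cost distribution, which is what lets the model trade expected benefit against extreme second-stage losses.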